a protocol for a function \(f\) is a pair of functions \(A,B:\qty{0,1}^{*} \times \qty{0,1}^{*} \to \qty{0,1,STOP}\) whereby:
- on input \((x,y)\), we initialize round counter \(r=0\), and initial (empty) message \(b_0 = \epsilon\)
- while \(b_{r} \neq STOP\)
- \(r++\)
- if \(r\) is odd, then Alice sends \(b_{r} = A\qty(x, b_1 \cdots b_{r-1})\)
- else Bob sends \(b_{r} = B\qty(y, b_{1} \cdots b_{r-1})\)
- the protocol's output is \(b_{r-1}\), and we call \(r-1\) the number of rounds
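To make this loop concrete, here is a minimal Python sketch of the execution model; `run_protocol`, `STOP`, and the string-encoded transcript are illustrative choices, not part of the definition.

```python
# A minimal sketch of the execution loop defined above.
# A and B are callables taking (own_input, transcript_so_far) and
# returning "0", "1", or STOP.

STOP = "STOP"

def run_protocol(A, B, x, y):
    """Run protocol (A, B) on inputs (x, y); return (output, number of rounds)."""
    transcript = []    # the messages b_1, ..., b_r as "0"/"1" strings
    last = ""          # b_0 = empty message
    r = 0
    while last != STOP:
        r += 1
        if r % 2 == 1:                           # odd round: Alice speaks
            last = A(x, "".join(transcript))
        else:                                    # even round: Bob speaks
            last = B(y, "".join(transcript))
        transcript.append(last)
    # the output is b_{r-1}, and r - 1 is the number of rounds
    # (this assumes the protocol sends at least one bit before STOP)
    return transcript[-2], r - 1
```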
protocol cost
the cost of a protocol \(P\) for \(f\) on \(n\) bit strings is:
\begin{equation} \max_{x,y \in \qty {0,1}^{n}} \text{number of rounds in P required to compute $f(x,y)$} \end{equation}
i.e., which input forces the protocol to run for the most rounds? (remember that we call \(r-1\) the number of rounds)
Communication Complexity
the Communication Complexity of \(f\) on \(n\) bit strings is the minimum protocol cost over all possible protocols for \(f\) on n bit strings.
the range of communication complexity is \([2, 2n]\): at least \(2\) because each party has to say something to communicate, and at most \(2n\) because of the trivial protocol below.
trivial protocol
there is always a “trivial” protocol: Alice sends the bits of her \(x\) in the odd rounds, Bob sends the bits of his \(y\) in the even rounds, and after \(2n\) rounds the entire input has been communicated, so both parties know \((x,y)\) and can compute \(f(x,y)\).
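Here is a hypothetical rendering of the trivial protocol on top of the `run_protocol` sketch above; it spends one extra round announcing the answer before STOP, on top of the \(2n\) data rounds.

```python
# Sketch of the trivial protocol: Alice's bits go out in odd rounds,
# Bob's bits in even rounds; once all 2n data bits are exchanged,
# Alice announces f(x, y) and Bob replies STOP.

def make_trivial_protocol(f):
    """f(x, y) should return the answer bit as the string "0" or "1"."""
    def alice(x, transcript):
        n = len(x)
        if len(transcript) < 2 * n:
            return x[len(transcript) // 2]   # next unsent bit of x
        y = transcript[1::2][:n]             # Bob's bits sit at odd positions
        return f(x, y)                       # final message: the answer bit
    def bob(y, transcript):
        n = len(y)
        if len(transcript) < 2 * n:
            return y[len(transcript) // 2]   # next unsent bit of y
        return STOP                          # Alice already announced f(x, y)
    return alice, bob

# e.g. for EQUALS on 4-bit inputs:
# alice, bob = make_trivial_protocol(lambda x, y: "1" if x == y else "0")
# run_protocol(alice, bob, "1011", "1011")  ->  ("1", 9), i.e. 2n + 1 rounds
```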
examples
parity
Consider:
\begin{equation} PARITY(x,y) = \qty(\sum_{i}^{}x_{i} + \sum_{i}^{} y_{i}) \ \text{mod}\ 2 \end{equation}
- Alice sends \(b_1 = \qty(\sum_{i}^{}x_{i} \ \text{mod}\ 2)\)
- Bob sends \(b_2 = \qty(b_{1} + \sum_{i}^{}y_{i}\ \text{mod}\ 2)\)
the end! This works because the mod distributes over each part of the sum. Hence, the Communication Complexity of PARITY is 2, a 2-round protocol, which is as small as it gets.
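As a sketch, this protocol in the `run_protocol` framework from earlier (names are illustrative):

```python
# Sketch of the 2-round PARITY protocol using run_protocol and STOP above.

def parity_alice(x, transcript):
    if transcript == "":                            # round 1
        return str(sum(int(c) for c in x) % 2)      # b_1 = parity of x
    return STOP                                     # round 3: Bob already answered

def parity_bob(y, transcript):
    b1 = int(transcript[0])                         # Alice's message b_1
    return str((b1 + sum(int(c) for c in y)) % 2)   # b_2 = parity of the whole of xy

# e.g. run_protocol(parity_alice, parity_bob, "1101", "0110") -> ("1", 2)
```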
majority
Consider:
what is the most frequent bit in \(xy\)?
- Alice sends \(N_{x}\), the number of \(1\) in \(x\)
- Bob computes \(N_{y}\), the number of \(1\) in \(y\)
- Bob sends \(1\) IFF \(N_{x}+N_{y}\) is greater than \(\frac{\qty(|x|+|y|)}{2} = n\)
hence, Alice needs about \(\log_{2}(n)\) bits to send over the number of \(1\)s, so this protocol costs \(O(\log n)\) bits of communication.
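A message-level sketch of this protocol (the serialization of \(N_x\) into single-bit rounds is glossed over; names are illustrative):

```python
import math

# Message-level sketch of the MAJORITY protocol: only N_x (about
# log2(n + 1) bits) and Bob's one answer bit cross the channel.

def majority_protocol(x, y):
    n = len(x)                                  # |x| = |y| = n
    n_x = x.count("1")                          # Alice's message: number of 1s in x
    alice_bits = math.ceil(math.log2(n + 1))    # bits to encode a value in 0..n
    n_y = y.count("1")                          # Bob's local count
    answer = 1 if n_x + n_y > n else 0          # 1 iff 1 is the most frequent bit of xy
    return answer, alice_bits + 1               # plus Bob's single answer bit

# e.g. majority_protocol("1101", "0110") -> (1, 4): five of the eight bits are 1
```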
equals
\begin{equation} \text{EQUALS}(x,y) = 1 \text{ IFF } x =y \end{equation}
complexity of EQUALS
The communication complexity of EQUALS is \(\Theta(n)\): every protocol for EQUALS needs \(\geq n\) bits of communication. This also implies that every streaming algorithm for EQUALS needs \(cn\) bits of memory, for some constant \(c>0\).
Let’s define the communication pattern of a run to be the sequence of bits that are sent. Assume for the sake of contradiction that there is a protocol that computes EQUALS with \(\leq n-1\) bits of communication.
That means there are at most \(2^{n-1}\) possible communication patterns. But there are \(2^{n}\) pairs of the form \((x,x)\) with \(|x| = n\), so by pigeonhole there exist two strings \(x \neq y\) of length exactly \(n\) such that \((x,x)\) and \((y,y)\) produce the same communication pattern.
Notice! This means that the communication pattern on \((x,y)\) is that same pattern too: by induction on the rounds, if the messages so far match the common pattern, then Alice (who holds \(x\), just as in the run on \((x,x)\)) and Bob (who holds \(y\), just as in the run on \((y,y)\)) each send the same next bit as in those runs.
This is a contradiction: the protocol outputs the same bit on \((y,y)\), where the answer is \(1\), and on \((x,y)\), where the answer is \(0\) since \(x \neq y\).
getting better results with randomized protocols
general idea: after applying an error correcting code, strings that differ in even a single bit will differ in many, many bits of their encodings; we can then send just a few random bits of the encoding and be fairly sure whether the underlying strings are the same.
- Alice picks a random prime number \(p\)
- She sends \(p\) and \(x \ \text{mod}\ p\) to Bob, which takes \(O(\log n)\) bits
- Bob checks whether \(y\ \text{mod}\ p = x\ \text{mod}\ p\)
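A rough sketch of this protocol, assuming the strings are read as binary integers and the prime is drawn from \([2, n^2]\) (the notes above leave the range unspecified; this is one common choice):

```python
import random

# Sketch of the randomized fingerprinting protocol for EQUALS.
# Always accepts when x == y; when x != y it only errs if the random
# prime happens to divide the difference of the strings read as integers.

def _is_prime(m):
    """Trial-division primality test; fine for a small sketch."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

def _random_prime(upper):
    """Pick a uniformly random prime in [2, upper]."""
    while True:
        candidate = random.randint(2, upper)
        if _is_prime(candidate):
            return candidate

def randomized_equals(x, y):
    n = len(x)
    p = _random_prime(max(n * n, 4))     # Alice's random prime (assumed range)
    fingerprint = int(x, 2) % p          # Alice sends p and x mod p: O(log n) bits
    return int(y, 2) % p == fingerprint  # Bob's check
```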
protocols, DFA, Streaming Algorithms, Communication Complexity
Let \(L \subseteq \qty{0,1}^{*}\), let \(f_{L}: \qty {0,1}^{*} \times \qty {0,1}^{*} \to \qty {0,1}\)
be defined, for \(x,y\) with \(|x| = |y|\), as:
\begin{equation} f_{L}(x,y) = 1 \Leftrightarrow xy \in L \end{equation}
for instance, if we had a language:
\begin{equation} L = \qty { x x \mid x \in \qty{0,1}^{*}} \end{equation}
then \(f_{L}\) is exactly the function EQUALS above.
bounded Communication Complexity
if \(L\) has a streaming algorithm using \(\leq s\) bits of space, then the communication complexity of \(f_{L}\) is at most \(O(s)\).
Proof
Alice runs the streaming algorithm \(A\) on \(x\), then sends the memory contents of \(A\) after doing that: this message is at most \(s\) bits. Bob loads that memory and continues running \(A\) on \(y\).
If \(A\) accepts, Bob outputs \(1\); otherwise he outputs \(0\).
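A sketch of this reduction as code, with an invented streaming-algorithm interface (`feed`, `get_state`, `load_state`, `accepts`) standing in for whatever the real algorithm looks like:

```python
# Sketch of the reduction: a streaming algorithm with s bits of state gives
# an O(s)-bit protocol for f_L. Only get_state's output crosses the channel.

def protocol_from_streaming(make_algorithm, x, y):
    # Alice's side: run the streaming algorithm on x, then ship its memory.
    alice_alg = make_algorithm()
    for bit in x:
        alice_alg.feed(bit)
    message = alice_alg.get_state()         # the only communication: at most s bits

    # Bob's side: load Alice's memory and keep streaming on y.
    bob_alg = make_algorithm()
    bob_alg.load_state(message)
    for bit in y:
        bob_alg.feed(bit)
    return 1 if bob_alg.accepts() else 0    # 1 iff the algorithm accepts xy
```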
Corollary
For every regular language \(L\), the communication complexity of \(f_{L}\) is \(O(1)\): a DFA for \(L\) is a streaming algorithm whose whole memory is the current state, so Alice can just send her state ID over, which takes a constant number of bits (depending only on \(L\), not on \(n\)).
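A sketch of this corollary, assuming the DFA is handed over as a plain transition table (an assumed encoding, nothing standard):

```python
# Sketch of the corollary: the DFA's entire memory is its current state, so
# Alice runs it on x and sends only the state she ends in; that costs a
# constant number of bits (depending on the DFA, not on n).

def dfa_protocol(delta, start, accepting, x, y):
    state = start
    for bit in x:                       # Alice runs the DFA on her half
        state = delta[(state, bit)]
    # Alice sends `state`; its description depends only on the DFA, not on |x|.
    for bit in y:                       # Bob continues from Alice's state
        state = delta[(state, bit)]
    return 1 if state in accepting else 0

# Example: L = { w : w has an even number of 1s }, which is regular.
parity_delta = {("even", "0"): "even", ("even", "1"): "odd",
                ("odd", "0"): "odd",   ("odd", "1"): "even"}
# dfa_protocol(parity_delta, "even", {"even"}, "1101", "0110") -> 0 (five 1s total)
```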