The primary goal of this paper is to define and study the interactive information complexity of functions. Let $f(x,y)$ be a function, and suppose Alice is given $x$ and Bob is given $y$. Informally, the interactive information complexity $IC(f)$ of $f$ is the least amount of information Alice and Bob ...
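A standard way to make this informal notion precise (this is the usual definition in the information complexity literature, not a quotation from the truncated abstract above) is via the internal information cost of a protocol $\pi$ with transcript $\Pi$ on inputs $(X,Y)\sim\mu$:
$$IC_\mu(\pi) = I(\Pi; X \mid Y) + I(\Pi; Y \mid X),$$
that is, what Bob learns about Alice's input plus what Alice learns about Bob's; the information complexity of $f$ is then obtained by optimizing this cost over protocols that correctly compute $f$.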
We prove an unconditional lower bound: any linear program that achieves an $O(n^{1-\epsilon})$ approximation for clique has size $2^{\Omega(n^\epsilon)}$. There has been considerable recent interest in proving unconditional lower bounds against any linear program. Fiorini et al. proved that there is no polynomial-size linear program for the traveling salesman problem. ...
We develop a new local characterization of the zero-error information complexity function for two-party communication problems, and use it to compute the exact internal and external information complexity of the 2-bit AND function: $IC(AND,0) = C_{\wedge}\approx 1.4923$ bits, and $IC^{ext}(AND,0) = \log_2 3 \approx 1.5850$ bits. This leads to ...
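For contrast with the internal quantity defined earlier, the external information cost is standardly defined as $IC^{ext}_\mu(\pi) = I(\Pi; XY)$, the information the transcript reveals about both inputs to an outside observer; for communication protocols it always upper-bounds the internal cost, consistent with $\log_2 3 > C_{\wedge}$ above.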
We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform ...
Communication complexity is a central model of computation introduced by Yao in 1979, where two players, Alice and Bob, receive inputs $x$ and $y$ respectively and want to compute $f(x, y)$ for some fixed function $f$ with the least amount of communication. Recently people have revisited the question of the ...
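As a reference point (standard notation, not part of the truncated abstract): the deterministic communication complexity of $f$ is
$$D(f) = \min_{\pi \text{ computing } f}\ \max_{x,y}\ |\pi(x,y)|,$$
the worst-case number of bits exchanged by the best correct protocol; the randomized variant $R_\epsilon(f)$ allows the protocol to err with probability $\epsilon$ on each input.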
We obtain a strong direct product theorem for two-party bounded-round communication complexity. Let $suc_r(\mu,f,C)$ denote the maximum success probability of an $r$-round communication protocol that uses at most $C$ bits of communication in computing $f(x,y)$ when $(x,y)\sim\mu$. Jain et al. [JPY12] recently showed that if ...
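To fix terminology (a generic statement of the shape such theorems take, offered here as context rather than as the paper's precise result): a strong direct product theorem asserts that if $suc_r(\mu,f,C) \le 2/3$, then any $r$-round protocol for $n$ independent copies that uses only $o(Cn)$ bits of communication succeeds with exponentially small probability, i.e. $suc_r(\mu^n, f^n, o(Cn)) \le 2^{-\Omega(n)}$.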
We precisely characterize the role of private randomness in the ability of Alice to send a message to Bob while minimizing the amount of information revealed to him. We show that if using private randomness a message can be transmitted while revealing $I$ bits of information, the transmission can be ...
We introduce a novel technique which enables two players to maintain an estimate of the internal information cost of their conversation in an online fashion without revealing much extra information. We use this construction to obtain new results about communication complexity and information-theoretically secure computation.
As a first corollary, ...
We show an exponential gap between communication complexity and information complexity, by giving an explicit example for a communication task (relation), with information complexity $\leq O(k)$, and distributional communication complexity $\geq 2^k$. This shows that a communication protocol cannot always be compressed to its internal information. By a result of ...
We prove a version of the "Reversed Newman Theorem" in the context of information complexity: every private-coin communication protocol with information complexity $I$ and communication complexity $C$ can be replaced by a public-coin protocol with the same behavior whose information complexity does not exceed $O(\sqrt{I \cdot C})$. This result holds for unbounded-round communication ...
We show an exponential gap between communication complexity and information complexity for boolean functions, by giving an explicit example of a partial function with information complexity $\leq O(k)$, and distributional communication complexity $\geq 2^k$. This shows that a communication protocol for a partial boolean function cannot always be compressed to ...
We show that $T$ rounds of interaction over the binary symmetric channel $BSC_{1/2-\epsilon}$ with feedback can be simulated with $O(\epsilon^2 T)$ rounds of interaction over a noiseless channel. We also introduce a more general "energy cost" model of interaction over a noisy channel. We show energy cost to be equivalent ...
The information complexity of a function $f$ is the minimum amount of information Alice and Bob need to exchange to compute the function $f$. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function $f$ to within any additive error $\alpha>0$, thus resolving an ...
Does the information complexity of a function equal its communication complexity? We examine whether any currently known techniques might be used to show a separation between the two notions. Recently, Ganor et al. provided such a separation in the distributional setting for a specific input distribution. We show that ...
We study the complexity of parallelizing streaming algorithms (or equivalently, branching programs). If $M(f)$ denotes the minimum average memory required to compute a function $f(x_1,x_2, \dots, x_n)$, how much memory is required to compute $f$ on $k$ independent streams that arrive in parallel? We show that when the inputs (updates) ...
We study the relationship between communication and information in 2-party communication protocols when the information is asymmetric. If $I^A$ denotes the number of bits of information revealed by the first party, $I^B$ denotes the information revealed by the second party, and $C$ is the number of bits of communication in ...
Information complexity is the interactive analogue of Shannon's classical information theory. In recent years this field has emerged as a powerful tool for proving strong communication lower bounds, and for addressing some of the major open problems in communication complexity and circuit complexity. A notable achievement of information complexity is ...
A simulation of an interactive protocol entails the use of interactive communication to produce the output of the protocol to within a fixed statistical distance $\epsilon$. Recent works in the TCS community have advanced the view that the information complexity of the protocol plays a central role in characterizing the minimum ...
We study $k$-party set disjointness in the simultaneous message-passing model, and show that even if each element $i\in[n]$ is guaranteed to either belong to all $k$ parties or to at most $O(1)$ parties in expectation (and to at most $O(\log n)$ parties with high probability), then $\Omega(n \min(\log 1/\delta, \log ...
Motivated by the quest for a broader understanding of the communication complexity of simple functions, we introduce the class of "permutation-invariant" functions. A partial function $f:\{0,1\}^n \times \{0,1\}^n\to \{0,1,?\}$ is permutation-invariant if for every bijection $\pi:\{1,\ldots,n\} \to \{1,\ldots,n\}$ and every $\mathbf{x}, \mathbf{y} \in \{0,1\}^n$, it is the case that $f(\mathbf{x}, \mathbf{y})$ ...
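For completeness (the abstract above is cut off mid-formula; the following is the standard defining condition for this notion, reconstructed from the quantifiers already stated): $f(\mathbf{x}, \mathbf{y}) = f(\pi(\mathbf{x}), \pi(\mathbf{y}))$, where $\pi(\mathbf{x})$ denotes $(x_{\pi(1)},\ldots,x_{\pi(n)})$; that is, the value is unchanged when both inputs are permuted by the same $\pi$.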
We show an exponential gap between communication complexity and external information complexity, by analyzing a communication task suggested as a candidate by Braverman [Bra13]. Previously, only a separation of communication complexity and internal information complexity was known [GKR14,GKR15].
More precisely, we obtain an explicit example of a search problem with ...
In this paper, three complexity measures are studied: (i) internal information, (ii) external information, and (iii) a measure called here "output information". Internal information (i) measures the counter-party privacy loss inherent in a communication protocol. Similarly, the output information (iii) measures the reduction in input-privacy that is inherent when the output ...
We study the problem of compressing interactive communication to its information content $I$, defined as the amount of information that the participants learn about each other's inputs. We focus on the case when the participants' inputs are distributed independently and show how to compress the communication to $O(I\log^{2}I)$ bits, with ...
We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of $\epsilon$. For arbitrary functions, we obtain lower bounds and upper bounds indicating a gain that is of order $\Omega(h(\epsilon))$ and $O(h(\sqrt{\epsilon}))$. ...
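Here $h$ is, by the standard convention, the binary entropy function,
$$h(\epsilon) = \epsilon \log_2 \frac{1}{\epsilon} + (1-\epsilon)\log_2\frac{1}{1-\epsilon},$$
so that, for example, $h(1/2)=1$; the gap between the two bounds above thus lies entirely in the $\epsilon$-versus-$\sqrt{\epsilon}$ argument inside $h$.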
In the recent paper [BR16], the authors show that, for any constant $10^{-15} > \varepsilon > 0$, the communication complexity of $\varepsilon$-approximate Nash equilibria in $2$-player $n \times n$ games is $n^{\Omega(\varepsilon)}$, resolving the long-open problem of whether or not there exists a polylogarithmic communication protocol. In this paper ...
We study common randomness where two parties have access to i.i.d. samples from a known random source, and wish to generate a shared random key using limited (or no) communication with the largest possible probability of agreement. This problem is at the core of secret key generation in cryptography, with ...
We define the marginal information of a communication protocol, and use it to prove XOR lemmas for communication complexity. We show that if every $C$-bit protocol has bounded advantage for computing a Boolean function $f$, then every $\tilde \Omega(C \sqrt{n})$-bit protocol has advantage $\exp(-\Omega(n))$ for computing the $n$-fold XOR $f^{\oplus n}$ ...
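For reference, in the usual notation the $n$-fold XOR is the function
$$f^{\oplus n}\big((x_1,\ldots,x_n),(y_1,\ldots,y_n)\big) = f(x_1,y_1) \oplus \cdots \oplus f(x_n,y_n),$$
with each copy of $f$ evaluated on its own coordinate pair.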