ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




Reports tagged with information complexity:
TR11-123 | 15th September 2011
Mark Braverman

Interactive information complexity

The primary goal of this paper is to define and study the interactive information complexity of functions. Let $f(x,y)$ be a function, and suppose Alice is given $x$ and Bob is given $y$. Informally, the interactive information complexity $IC(f)$ of $f$ is the least amount of information Alice and Bob ... more >>>
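
For orientation: a standard formalization consistent with this description (though not necessarily the paper's exact definition) measures the internal information cost of a protocol $\pi$ on inputs $(X,Y)\sim\mu$ as $IC_\mu(\pi) = I(\Pi;X\mid Y) + I(\Pi;Y\mid X)$, where $\Pi$ denotes the transcript; the information complexity of $f$ is then the infimum of this quantity over protocols computing $f$.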


TR12-131 | 18th October 2012
Mark Braverman, Ankur Moitra

An Information Complexity Approach to Extended Formulations

Revisions: 1

We prove an unconditional lower bound that any linear program that achieves an $O(n^{1-\epsilon})$ approximation for clique has size $2^{\Omega(n^\epsilon)}$. There has been considerable recent interest in proving unconditional lower bounds against any linear program. Fiorini et al. proved that there is no polynomial-sized linear program for the traveling salesman problem. ... more >>>


TR12-171 | 3rd December 2012
Mark Braverman, Ankit Garg, Denis Pankratov, Omri Weinstein

From Information to Exact Communication

We develop a new local characterization of the zero-error information complexity function for two party communication problems, and use it to compute the exact internal and external information complexity of the 2-bit AND function: $IC(AND,0) = C_{\wedge}\approx 1.4923$ bits, and $IC^{ext}(AND,0) = \log_2 3 \approx 1.5839$ bits. This leads to ... more >>>
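
For reference: the external information cost of a protocol $\pi$ on inputs $(X,Y)\sim\mu$ is standardly $IC^{ext}_\mu(\pi) = I(\Pi;XY)$, the information the transcript reveals to an external observer, while the internal cost $I(\Pi;X\mid Y)+I(\Pi;Y\mid X)$ measures what the players learn about each other's inputs. The external cost is never smaller than the internal cost, consistent with $\log_2 3 > C_{\wedge}$ above.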


TR12-177 | 19th December 2012
Mark Braverman, Ankit Garg, Denis Pankratov, Omri Weinstein

Information lower bounds via self-reducibility

We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform ... more >>>
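
In the standard formulations (the paper's exact parameters may differ), Gap Hamming Distance is the promise problem in which Alice and Bob hold $x,y\in\{0,1\}^n$ and must decide whether the Hamming distance $\Delta(x,y)$ is at least $n/2+\sqrt{n}$ or at most $n/2-\sqrt{n}$, and Inner Product is $IP(x,y)=\sum_i x_i y_i \bmod 2$.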


TR13-015 | 18th January 2013
Iordanis Kerenidis, Mathieu Laurière, David Xiao

New lower bounds for privacy in communication protocols

Communication complexity is a central model of computation introduced by Yao in 1979, where two players, Alice and Bob, receive inputs $x$ and $y$, respectively, and want to compute $f(x,y)$ for some fixed function $f$ with the least amount of communication. Recently, people have revisited the question of the ... more >>>


TR13-035 | 6th March 2013
Mark Braverman, Anup Rao, Omri Weinstein, Amir Yehudayoff

Direct product via round-preserving compression

Revisions: 1

We obtain a strong direct product theorem for two-party bounded-round communication complexity. Let $\mathrm{suc}_r(\mu,f,C)$ denote the maximum success probability of an $r$-round communication protocol that uses at most $C$ bits of communication in computing $f(x,y)$ when $(x,y)\sim\mu$. Jain et al. [JPY12] have recently shown that if ... more >>>
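
Informally, a strong direct product theorem asserts that solving $n$ independent copies of $f$ on inputs drawn from $\mu^n$ with significantly less than $n$ times the single-copy resources succeeds with probability at most $2^{-\Omega(n)}$; the precise quantitative trade-off is the content of the truncated statement.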


TR13-130 | 17th September 2013
Mark Braverman, Ankit Garg

Public vs private coin in bounded-round information

Revisions: 1

We precisely characterize the role of private randomness in the ability of Alice to send a message to Bob while minimizing the amount of information revealed to him. We show that if using private randomness a message can be transmitted while revealing $I$ bits of information, the transmission can be ... more >>>


TR14-047 | 8th April 2014
Mark Braverman, Omri Weinstein

An Interactive Information Odometer with Applications

Revisions: 1

We introduce a novel technique which enables two players to maintain an estimate of the internal information cost of their conversation in an online fashion without revealing much extra information. We use this construction to obtain new results about communication complexity and information-theoretically secure computation.

As a first corollary, ... more >>>


TR14-049 | 11th April 2014
Anat Ganor, Gillat Kol, Ran Raz

Exponential Separation of Information and Communication

Revisions: 1

We show an exponential gap between communication complexity and information complexity, by giving an explicit example for a communication task (relation), with information complexity $\leq O(k)$, and distributional communication complexity $\geq 2^k$. This shows that a communication protocol cannot always be compressed to its internal information. By a result of ... more >>>


TR14-062 | 22nd March 2014
Alexander Kozachinsky

On the role of private coins in unbounded-round Information Complexity

We prove a version of the "Reversed Newman Theorem" in the context of information complexity: every private-coin communication protocol with information complexity $I$ and communication complexity $C$ can be replaced by a public-coin protocol with the same behavior so that its information complexity does not exceed $O\left(\sqrt{IC}\right)$. This result holds for unbounded-round communication ... more >>>


TR14-113 | 27th August 2014
Anat Ganor, Gillat Kol, Ran Raz

Exponential Separation of Information and Communication for Boolean Functions

We show an exponential gap between communication complexity and information complexity for boolean functions, by giving an explicit example of a partial function with information complexity $\leq O(k)$, and distributional communication complexity $\geq 2^k$. This shows that a communication protocol for a partial boolean function cannot always be compressed to ... more >>>


TR14-119 | 15th September 2014
Mark Braverman, Jieming Mao

Simulating Noisy Channel Interaction

We show that $T$ rounds of interaction over the binary symmetric channel $BSC_{1/2-\epsilon}$ with feedback can be simulated with $O(\epsilon^2 T)$ rounds of interaction over a noiseless channel. We also introduce a more general "energy cost'' model of interaction over a noisy channel. We show energy cost to be equivalent ... more >>>
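
The $O(\epsilon^2 T)$ scaling matches the Shannon capacity of the channel: under the usual convention, $BSC_{1/2-\epsilon}$ flips each transmitted bit with probability $1/2-\epsilon$, so its capacity is $1-h(1/2-\epsilon)$ with $h$ the binary entropy function, and a Taylor expansion gives $1-h(1/2-\epsilon) = \frac{2}{\ln 2}\epsilon^2 + O(\epsilon^4) = \Theta(\epsilon^2)$ bits per use.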


TR15-023 | 10th February 2015
Mark Braverman, Jon Schneider

Information complexity is computable

The information complexity of a function $f$ is the minimum amount of information Alice and Bob need to exchange to compute the function $f$. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function $f$ to within any additive error $\alpha>0$, thus resolving an ... more >>>


TR15-028 | 27th February 2015
Lila Fontes, Rahul Jain, Iordanis Kerenidis, Sophie Laplante, Mathieu Laurière, Jérémie Roland

Relative Discrepancy does not separate Information and Communication Complexity

Does the information complexity of a function equal its communication complexity? We examine whether any currently known techniques might be used to show a separation between the two notions. Recently, Ganor et al. provided such a separation in the distributional setting for a specific input distribution. We show that ... more >>>


TR15-039 | 16th March 2015
Anup Rao, Makrand Sinha

On Parallelizing Streaming Algorithms

We study the complexity of parallelizing streaming algorithms (or equivalently, branching programs). If $M(f)$ denotes the minimum average memory required to compute a function $f(x_1,x_2, \dots, x_n)$, how much memory is required to compute $f$ on $k$ independent streams that arrive in parallel? We show that when the inputs (updates) ... more >>>


TR15-055 | 13th April 2015
Sivaramakrishnan Natarajan Ramamoorthy, Anup Rao

How to Compress Asymmetric Communication

We study the relationship between communication and information in 2-party communication protocols when the information is asymmetric. If $I^A$ denotes the number of bits of information revealed by the first party, $I^B$ denotes the information revealed by the second party, and $C$ is the number of bits of communication in ... more >>>


TR15-060 | 14th April 2015
Omri Weinstein

Information Complexity and the Quest for Interactive Compression

Revisions: 1

Information complexity is the interactive analogue of Shannon's classical information theory. In recent years this field has emerged as a powerful tool for proving strong communication lower bounds, and for addressing some of the major open problems in communication complexity and circuit complexity. A notable achievement of information complexity is ... more >>>


TR15-070 | 22nd April 2015
Himanshu Tyagi, Shaileshh Venkatakrishnan, Pramod Viswanath, Shun Watanabe

Information Complexity Density and Simulation of Protocols

Revisions: 1

A simulation of an interactive protocol entails the use of an interactive communication to produce the output of the protocol to within a fixed statistical distance $\epsilon$. Recent works in the TCS community have propagated the view that the information complexity of the protocol plays a central role in characterizing the minimum ... more >>>


TR15-083 | 14th May 2015
Omri Weinstein, David Woodruff

The Simultaneous Communication of Disjointness with Applications to Data Streams

Revisions: 1

We study $k$-party set disjointness in the simultaneous message-passing model, and show that even if each element $i\in[n]$ is guaranteed to either belong to all $k$ parties or to at most $O(1)$ parties in expectation (and to at most $O(\log n)$ parties with high probability), then $\Omega(n \min(\log 1/\delta, \log ... more >>>


TR15-087 | 30th May 2015
Badih Ghazi, Pritish Kamath, Madhu Sudan

Communication Complexity of Permutation-Invariant Functions

Motivated by the quest for a broader understanding of communication complexity of simple functions, we introduce the class of ''permutation-invariant'' functions. A partial function $f:\{0,1\}^n \times \{0,1\}^n\to \{0,1,?\}$ is permutation-invariant if for every bijection $\pi:\{1,\ldots,n\} \to \{1,\ldots,n\}$ and every $\mathbf{x}, \mathbf{y} \in \{0,1\}^n$, it is the case that $f(\mathbf{x}, \mathbf{y}) ... more >>>
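
Presumably the truncated condition reads $f(\mathbf{x},\mathbf{y}) = f(\mathbf{x}\circ\pi,\mathbf{y}\circ\pi)$, i.e. the value is unchanged when the same coordinate permutation is applied to both inputs; standard functions such as Equality, Set Disjointness, Inner Product, and (Gap) Hamming Distance all have this property.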


TR15-088 | 31st May 2015
Anat Ganor, Gillat Kol, Ran Raz

Exponential Separation of Communication and External Information

We show an exponential gap between communication complexity and external information complexity, by analyzing a communication task suggested as a candidate by Braverman [Bra13]. Previously, only a separation of communication complexity and internal information complexity was known [GKR14,GKR15].

More precisely, we obtain an explicit example of a search problem with ... more >>>


TR15-209 | 29th December 2015
Eli Ben-Sasson, Gal Maor

On the information leakage of public-output protocols

In this paper three complexity measures are studied: (i) internal information, (ii) external information, and (iii) a measure called here "output information". Internal information (i) measures the counter-party privacy-loss inherent in a communication protocol. Similarly, the output information (iii) measures the reduction in input-privacy that is inherent when the output ... more >>>


TR16-081 | 20th May 2016
Alexander A. Sherstov

Compressing interactive communication under product distributions

We study the problem of compressing interactive communication to its information content $I$, defined as the amount of information that the participants learn about each other's inputs. We focus on the case when the participants' inputs are distributed independently and show how to compress the communication to $O(I\log^{2}I)$ bits, with ... more >>>


TR16-190 | 21st November 2016
Yuval Dagan, Yuval Filmus, Hamed Hatami, Yaqiao Li

Trading information complexity for error

We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of $\epsilon$.
For arbitrary functions, we obtain lower bounds and upper bounds indicating a gain that is of order $\Omega(h(\epsilon))$ and $O(h(\sqrt{\epsilon}))$. ... more >>>
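
Here $h$ is presumably the binary entropy function $h(\epsilon) = \epsilon\log_2(1/\epsilon) + (1-\epsilon)\log_2\big(1/(1-\epsilon)\big)$; note that $h(\sqrt{\epsilon}) \gg h(\epsilon)$ for small $\epsilon$, so these bounds leave a genuine gap between the guaranteed and the possible savings.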


TR17-071 | 14th April 2017
Young Kun Ko, Ariel Schvartzman

Bounds for the Communication Complexity of Two-Player Approximate Correlated Equilibria

Revisions: 1

In the recent paper [BR16], the authors show that, for any constant $10^{-15} > \varepsilon > 0$, the communication complexity of $\varepsilon$-approximate Nash equilibria in $2$-player $n \times n$ games is $n^{\Omega(\varepsilon)}$, resolving the long-standing open problem of whether or not there exists a polylogarithmic communication protocol. In this paper ... more >>>


TR17-119 | 25th July 2017
Badih Ghazi, T.S. Jayram

Resource-Efficient Common Randomness and Secret-Key Schemes

We study common randomness where two parties have access to i.i.d. samples from a known random source, and wish to generate a shared random key using limited (or no) communication with the largest possible probability of agreement. This problem is at the core of secret key generation in cryptography, with ... more >>>


TR23-194 | 5th December 2023
Siddharth Iyer, Anup Rao

XOR Lemmas for Communication via Marginal Information

Revisions: 2

We define the marginal information of a communication protocol, and use it to prove XOR lemmas for communication complexity. We show that if every $C$-bit protocol has bounded advantage for computing a Boolean function $f$, then every $\tilde \Omega(C \sqrt{n})$-bit protocol has advantage $\exp(-\Omega(n))$ for computing the $n$-fold xor $f^{\oplus ... more >>>
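
Here the $n$-fold XOR is standardly $f^{\oplus n}(x_1,\ldots,x_n,\,y_1,\ldots,y_n) = f(x_1,y_1)\oplus\cdots\oplus f(x_n,y_n)$, and the advantage of a protocol is typically its bias over random guessing, i.e. $2\Pr[\mathrm{correct}]-1$.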




ISSN 1433-8092