Under the auspices of the Computational Complexity Foundation (CCF)


### Revision(s):

Revision #1 to TR14-101 | 1st January 2015 09:56

#### Internal compression of protocols to entropy

Revision #1
Authors: Balthazar Bauer, Shay Moran, Amir Yehudayoff
Accepted on: 1st January 2015 09:56
Keywords:

Abstract:

We study internal compression of communication protocols to their internal entropy, which is the entropy of the transcript from the players' perspective. We first show that errorless compression to the internal entropy (and hence to the internal information) is impossible. We then provide two internal compression schemes with error. The first compresses a protocol of Feige et al. for finding the first difference between two strings. The second and main one is an internal compression with error $\epsilon > 0$ of a protocol with internal entropy $H^{int}$ and communication complexity $C$ to a protocol with communication at most of order $(H^{int}/\epsilon)^2 \log(\log(C))$.

This immediately implies a similar compression to the internal information of public-coin protocols, which exponentially improves on previously known public-coin compressions in the dependence on $C$. It further shows that in a recent protocol of Ganor, Kol and Raz, the private randomness cannot be made public without an exponential cost. To the best of our knowledge, no such example was previously known.

Changes to previous version:

In this version, we use a different definition of the term "simulation" and discuss how it differs from the previous one.

### Paper:

TR14-101 | 8th August 2014 11:58

#### Internal compression of protocols to entropy

TR14-101
Authors: Balthazar Bauer, Shay Moran, Amir Yehudayoff
Publication: 8th August 2014 14:52
Keywords:

Abstract:

We study internal compression of communication protocols to their internal entropy, which is the entropy of the transcript from the players' perspective. We first show that errorless compression to the internal entropy (and hence to the internal information) is impossible. We then provide two internal compression schemes with error. The first compresses a protocol of Feige et al. for finding the first difference between two strings. The second and main one is an internal compression with error $\epsilon > 0$ of a protocol with internal entropy $H^{int}$ and communication complexity $C$ to a protocol with communication at most of order $(H^{int}/\epsilon)^2 \log(\log(C))$.

This immediately implies a similar compression to the internal information of public-coin protocols, which exponentially improves on previously known public-coin compressions in the dependence on $C$. It further shows that in a recent protocol of Ganor, Kol and Raz, the private randomness cannot be made public without an exponential cost. To the best of our knowledge, no such example was previously known.
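For intuition about the first scheme's target task: finding the first index at which two $n$-bit strings differ can be solved with public randomness by a binary search on the length of the longest common prefix, where the parties compare prefixes only through short random hashes. The sketch below is an illustrative simplification, not the Feige et al. protocol itself; the hash family (random inner products over GF(2)) and the repetition parameter `k` are assumptions made for this example, and it communicates $O(k \log n)$ bits rather than the protocol's $O(\log n)$.

```python
import random

def first_difference(x, y, k=20, seed=0):
    """Find the first index where bit strings x and y differ, with the
    parties comparing prefixes only via k-bit random hashes drawn from
    shared (public) randomness.

    Returns (index, bits_communicated); index == len(x) when x == y.
    Binary search on the length of the longest common prefix: a hash
    mismatch proves the prefixes differ, while a match is correct up to
    a false-positive probability of 2**-k per comparison.
    """
    assert len(x) == len(y)
    rng = random.Random(seed)          # models the shared public coins
    bits = 0
    lo, hi = 0, len(x)                 # invariant (w.h.p.): x[:lo] == y[:lo]
    while lo < hi:
        mid = (lo + hi + 1) // 2
        # Alice sends k hash bits of x[:mid]; Bob compares with y[:mid].
        masks = [[rng.randrange(2) for _ in range(mid)] for _ in range(k)]
        ha = [sum(int(x[i]) & m[i] for i in range(mid)) % 2 for m in masks]
        hb = [sum(int(y[i]) & m[i] for i in range(mid)) % 2 for m in masks]
        bits += k + 1                  # k hash bits + 1 bit of feedback
        if ha == hb:
            lo = mid                   # prefixes look equal: go right
        else:
            hi = mid - 1               # prefixes provably differ: go left
    return lo, bits
```

Each round errs only toward a false "equal" (an inner-product collision), so repeating the hash `k` times drives the per-round error to $2^{-k}$; the full protocol of Feige et al. avoids the factor of $k$ by tolerating and correcting noisy comparisons inside the search.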

ISSN 1433-8092