ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #1 to TR15-165 | 9th April 2016 03:47

Towards Optimal Deterministic Coding for Interactive Communication





Revision #1
Authors: Ran Gelles, Bernhard Haeupler, Gillat Kol, Noga Ron-Zewi, Avi Wigderson
Accepted on: 9th April 2016 03:47
Downloads: 1119


Abstract:

We show an \emph{efficient, deterministic} interactive coding scheme that simulates any interactive protocol under random errors with nearly optimal parameters.
Specifically, our coding scheme achieves a communication rate of $1 - O(\sqrt{H({\epsilon})})$ and
a failure probability of~$\exp(-n/ \log n)$, where $n$ is the protocol length and each bit is flipped independently with constant probability $\epsilon$.
Prior to our work, all nontrivial deterministic schemes (either efficient or not) had a rate bounded away from~$1$. Furthermore, the best failure probability achievable by an efficient deterministic scheme with constant rate was only quasi-polynomial, i.e., of the form $\exp(- \log^{O(1)} n)$ (Braverman, ITCS 2012). A rate of $1 - \tilde \Theta(\sqrt{H({\epsilon})})$ is essentially optimal (up to poly-logarithmic factors) by a result of Kol and Raz (STOC 2013).
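
To make the rate bound concrete, the short Python sketch below evaluates the binary entropy $H(\epsilon)$ and the corresponding bound $1 - c\sqrt{H(\epsilon)}$ for a few error probabilities; the constant $c$ hidden by the $O(\cdot)$ is not specified in the abstract, so $c = 1$ here is an arbitrary illustrative choice.

    import math

    def binary_entropy(eps):
        """Binary entropy H(eps) in bits, with H(0) = H(1) = 0 by convention."""
        if eps in (0.0, 1.0):
            return 0.0
        return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

    # Illustrative rate bound 1 - c*sqrt(H(eps)); c = 1 is an arbitrary
    # stand-in for the unspecified constant in the O(.) notation.
    c = 1.0
    for eps in (0.001, 0.01, 0.05):
        h = binary_entropy(eps)
        print(f"eps = {eps:5.3f}  H(eps) = {h:.4f}  rate ~ 1 - c*sqrt(H) = {1 - c * math.sqrt(h):.4f}")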

A central contribution in deriving our coding scheme
is a novel code-concatenation scheme, a notion standard in coding theory that we adapt to the interactive setting.
Essential to our concatenation approach is an explicit, efficiently encodable and decodable {\em linear tree code} of length~$n$ that has relative distance $\Omega(1/\log n)$ and
rate approaching~$1$, defined over an $O(\log n)$-bit alphabet.
The fact that our tree code is linear, and in particular can be made {\em systematic}, turns out to play an important role in our concatenation scheme.
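
The following toy Python sketch is not the tree code constructed in the paper; it only illustrates the two structural properties invoked here: online encoding (the $i$-th output symbol depends only on the first $i$ input symbols, as in any tree code) and systematicity (each input symbol appears verbatim inside its output symbol). Its distance is far too weak for actual use.

    # Toy illustration only (not the paper's construction): each output
    # symbol is the pair (input bit, parity of the prefix so far).  The
    # code is linear, systematic, and online, but has very poor distance.
    def toy_systematic_online_encode(bits):
        out, parity = [], 0
        for b in bits:
            parity ^= b                 # depends only on the prefix read so far
            out.append((b, parity))     # systematic: the input bit appears verbatim
        return out

    print(toy_systematic_online_encode([1, 0, 1, 1]))
    # [(1, 1), (0, 1), (1, 0), (1, 1)]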

We use the above tree code as the ``outer code'' in the concatenation scheme. The necessary deterministic ``inner code'' is achieved by a nontrivial derandomization of the randomized interactive coding scheme of (Haeupler, STOC 2014). This deterministic coding scheme (with {\em exponential} running time,
but applied here to $O(\log n)$-bit blocks) can handle an $\epsilon$ fraction of adversarial errors with a communication rate of $1 - O(\sqrt{H({\epsilon})})$.
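
As a rough, non-interactive analogy for the concatenation step (the interactive version in the paper is substantially more involved), the sketch below shows classical code concatenation: an outer code over a large alphabet is applied first, and each outer symbol is then re-encoded by an inner code. The identity outer code and 3-fold repetition inner code are placeholders chosen only to make the structure visible.

    # Classical (non-interactive) code concatenation, as an analogy only:
    # an outer code over a large (here 8-bit, standing in for O(log n)-bit)
    # alphabet, followed by an inner code applied to each outer symbol.
    # The placeholder codes below are not the paper's interactive codes.

    def outer_encode(symbols):
        return list(symbols)            # placeholder outer code (identity)

    def inner_encode(symbol):
        return [symbol] * 3             # placeholder inner code (repetition)

    def concatenated_encode(symbols):
        return [s_in for s_out in outer_encode(symbols) for s_in in inner_encode(s_out)]

    print(concatenated_encode([0b10110001, 0b00001111]))
    # [177, 177, 177, 15, 15, 15]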



Changes to previous version:

Changes in abstract and introduction


Paper:

TR15-165 | 14th October 2015 03:56

Towards Optimal Deterministic Coding for Interactive Communication





TR15-165
Authors: Ran Gelles, Bernhard Haeupler, Gillat Kol, Noga Ron-Zewi, Avi Wigderson
Publication: 17th October 2015 02:02
Downloads: 1447


Abstract:

We study \emph{efficient, deterministic} interactive coding schemes that simulate any interactive protocol both under random and adversarial errors, and can achieve a constant communication rate independent of the protocol length.

For channels that flip bits independently with probability~$\epsilon<1/2$, our coding scheme achieves a communication rate of $1 - O(\sqrt{H({\epsilon})})$ and a failure probability of~$\exp(-n/ \log n)$ in length $n$ protocols. Prior to our work, all nontrivial deterministic schemes (either efficient or not) had a rate bounded away from~$1$. Furthermore, the best failure probability achievable by an efficient deterministic coding scheme with constant rate was only quasi-polynomial, i.e., of the form $\exp(- \log^{O(1)} n)$ (Braverman, ITCS 2012).
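
For a sense of scale of the failure-probability improvement, the sketch below compares the exponents $n/\log n$ (this work) and $\log^{O(1)} n$ (the prior quasi-polynomial bound, with the unspecified exponent set to $2$ purely for illustration) at an example protocol length.

    import math

    # Exponents of the two failure-probability bounds, exp(-n/log n) versus
    # the quasi-polynomial exp(-log^{O(1)} n); the exponent 2 below is an
    # arbitrary stand-in for the unspecified O(1).
    n = 10**6
    print("this work:  n / log n =", n / math.log(n))     # ~ 7.2e4
    print("prior:      log(n)**2 =", math.log(n) ** 2)    # ~ 1.9e2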

For channels in which an adversary controls the noise pattern, our coding scheme can tolerate an $\Omega(1/\log n)$ fraction of errors with rate approaching~$1$. Once more, all previously known nontrivial deterministic schemes (either efficient or not) in the adversarial setting had a rate bounded away from~$1$, and no nontrivial efficient deterministic coding scheme with any constant rate was known.

Essential to both results is an explicit, efficiently encodable and decodable {\em systematic tree code} of length~$n$ that has relative distance $\Omega(1/\log n)$ and rate approaching~$1$, defined over an $O(\log n)$-bit alphabet. No nontrivial tree code (either efficient or not) was known to approach rate~$1$, and no nontrivial distance bound was known for any efficient constant-rate tree code. The fact that our tree code is {\em systematic} turns out to play an important role in obtaining rate $1 - O(\sqrt{H({\epsilon})})$ in the random error model, and in approaching rate~$1$ in the adversarial error model.
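
Plugging an example protocol length into the stated tree-code parameters gives a feel for the asymptotics; the hidden constants in $\Omega(\cdot)$ and $O(\cdot)$ are set to $1$ below and are purely illustrative.

    import math

    # Example instantiation of the stated tree-code parameters; the hidden
    # constants are set to 1 purely for illustration.
    n = 2 ** 20                            # example protocol length
    log_n = math.log2(n)                   # = 20
    print("relative distance Omega(1/log n) ~", 1 / log_n)           # ~ 0.05
    print("alphabet:          O(log n)-bit symbols, here", int(log_n), "bits")
    print("rate:              approaches 1 as n grows")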

A central contribution in deriving our coding schemes for random and adversarial errors is a novel code-concatenation scheme, a notion standard in coding theory that we adapt to the interactive setting. We use the above tree code as the ``outer code'' in this concatenation. The necessary deterministic ``inner code'' is achieved by a nontrivial derandomization of the randomized interactive coding scheme of (Haeupler, STOC 2014). This deterministic coding scheme (with {\em exponential} running time, but applied here to $O(\log n)$-bit blocks) can handle an $\epsilon$ fraction of adversarial errors with a communication rate of $1 - O(\sqrt{H({\epsilon})})$.


