ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #2 to TR20-047 | 16th November 2020 10:39

Explicit Uniquely Decodable Codes for Space Bounded Channels That Achieve List-Decoding Capacity





Revision #2
Authors: Ronen Shaltiel, Jad Silbak
Accepted on: 16th November 2020 10:39
Downloads: 558
Keywords: 


Abstract:

We consider codes for space bounded channels. This is a model for communication under noise that was introduced by Guruswami and Smith (J. ACM 2016) and lies between the Shannon (random) and Hamming (adversarial) models. In this model, a channel is a space bounded procedure that reads the codeword in one pass, and modifies at most a $p$ fraction of the bits of the codeword.
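
To make the model concrete, the following is a minimal, hypothetical sketch of such a channel (not taken from the paper; the function name and the particular corruption strategy are illustrative assumptions): a single left-to-right pass over the codeword, a small amount of state, and at most a $p$ fraction of bit flips.

\begin{verbatim}
# Toy illustration of the channel model (not from the paper): one pass over
# the codeword, state that fits in `space` bits (here just a flip counter),
# and at most floor(p * n) bit flips.
def toy_space_bounded_channel(codeword, p, space):
    n = len(codeword)
    budget = int(p * n)                      # at most a p-fraction of flips
    assert budget.bit_length() <= space, "state must fit in the space bound"
    output, flips = [], 0
    for bit in codeword:                     # one pass, no random access
        if flips < budget and bit == 1:
            output.append(0)                 # naive strategy: zero out early 1s
            flips += 1
        else:
            output.append(bit)
    return output
\end{verbatim}

A code for this model must tolerate every corruption strategy computable within the stated space bound, not just the naive one sketched here.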

Explicit uniquely decodable codes for space bounded channels: Our main result is that for every $0 \le p < \frac{1}{4}$, there exists a constant $\delta > 0$ and a \emph{uniquely decodable} code that is \emph{explicit} (meaning that encoding and decoding are in poly-time) and has rate $1-H(p)$ for channels with space $n^{\delta}$.
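
For reference, the rate $1-H(p)$ above is governed by the binary entropy function $H$; the short snippet below (added purely for illustration) evaluates the rate at a few noise levels below $1/4$.

\begin{verbatim}
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Rate 1 - H(p) achieved in the stated range 0 <= p < 1/4.
for p in (0.05, 0.10, 0.20, 0.24):
    print(f"p = {p:.2f}:  rate 1 - H(p) = {1 - binary_entropy(p):.4f}")
\end{verbatim}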

This improves upon previous explicit codes by Guruswami and Smith, and Kopparty, Shaltiel and Silbak (FOCS 2019). Specifically, we obtain the same space and rate as earlier works, even though prior work gave only list-decodable codes (rather than uniquely decodable codes).

Complete characterization of the capacity of space bounded channels: Together with a result by Guruswami and Smith showing the impossibility of unique decoding for $p \ge \frac{1}{4}$, our techniques also give a complete characterization of the capacity $R(p)$ of space $n^{1-o(1)}$ channels, specifically: $R(p)=1-H(p)$ for $0 \le p < 1/4$ and $R(p)=0$ for $p \ge 1/4$.

In particular, $R(\cdot)$ is not continuous at $p=1/4$. This capacity is strictly larger than the capacity of Hamming channels for every $0 < p < \frac{1}{4}$, and matches the capacity of list decoding and of binary symmetric channels in this range.
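
The discontinuity can be checked directly from the characterization above (this one-line calculation is added for illustration, not quoted from the paper):

\[
\lim_{p \to (1/4)^{-}} R(p) = 1 - H(1/4) = 1 - \left(\tfrac{1}{4}\log_2 4 + \tfrac{3}{4}\log_2 \tfrac{4}{3}\right) \approx 0.189 > 0 = R(1/4).
\]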

Our results are incomparable to recent work on causal channels. Causal channels are stronger channels that read the codeword in one pass, but with no space restriction. The best known codes for causal channels, due to Chen, Jaggi and Langberg (STOC 2015), are shown to exist by the probabilistic method, and no explicit codes are known. Furthermore, our results imply that for $p \ge p_0 \approx 0.0804$, there is a separation between the capacities of space bounded channels and causal channels: the capacity of the former is strictly larger than that of the latter.

Our approach builds on previous explicit list-decodable codes for space bounded channels. We introduce and study a notion of ``\emph{evasiveness}'' of codes, which is concerned with whether a decoding algorithm rejects a word that is obtained when a channel introduces few errors into a \emph{uniformly chosen} string. We use evasiveness (as well as several additional new ideas related to coding theory and pseudorandomness) to identify the ``correct'' message in the list. Loosely speaking, this is achieved by arguing that on ``incorrect messages'' the decoding algorithm cannot distinguish the codeword from a uniform string.
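
The schematic below (an illustrative reading of the paragraph above; list_decode, encode and passes_evasive_test are hypothetical stand-ins, and this is not the authors' algorithm) shows where such an evasiveness test would sit in a decoder that first list-decodes and then tries to single out one message.

\begin{verbatim}
# Schematic only: the three helpers passed in are hypothetical stand-ins,
# not functions from the paper.
def unique_decode_sketch(received, list_decode, encode, passes_evasive_test):
    candidates = list_decode(received)   # short list from a list-decodable code
    survivors = []
    for message in candidates:
        # An evasive test is meant to reject words that a low-space channel
        # produces from a *uniformly chosen* string; the abstract argues that,
        # relative to an incorrect candidate, the received word looks like
        # that random case.
        if passes_evasive_test(encode(message), received):
            survivors.append(message)
    # Under the intended analysis only the transmitted message should survive.
    return survivors[0] if len(survivors) == 1 else None
\end{verbatim}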



Changes to previous version:

Minor changes


Revision #1 to TR20-047 | 16th November 2020 10:32

Explicit Uniquely Decodable Codes for Space Bounded Channels That Achieve List-Decoding Capacity





Revision #1
Authors: Ronen Shaltiel, Jad Silbak
Accepted on: 16th November 2020 10:32
Downloads: 470
Keywords: 


Abstract:

We consider codes for space bounded channels. This is a model for communication under noise that was introduced by Guruswami and Smith (J. ACM 2016) and lies between the Shannon (random) and Hamming (adversarial) models. In this model, a channel is a space bounded procedure that reads the codeword in one pass, and modifies at most a $p$ fraction of the bits of the codeword.

\begin{itemize}
\item \emph{Explicit uniquely decodable codes for space bounded channels:} Our main result is that for every $0 \le p < \frac{1}{4}$, there exists a constant $\delta > 0$ and a \emph{uniquely decodable} code with rate $1-H(p)$ for channels with space $n^{\delta}$. This code is \emph{explicit} (meaning that encoding and decoding are in poly-time).

This improves upon previous explicit codes by Guruswami and Smith, and Kopparty, Shaltiel and Silbak (FOCS 2019). Specifically, we obtain the same space and rate as earlier works, even though prior work gave only \emph{list-decodable} codes (rather than uniquely decodable codes).

\item \emph{Complete characterization of the capacity of space bounded channels:} Together with a result by Guruswami and Smith showing the impossibility of unique decoding for $p \ge \frac{1}{4}$, our techniques also give a complete characterization of the capacity $R(p)$ of space $n^{1-o(1)}$ channels, specifically:
\[R(p)= \begin{cases}
1-H(p) & 0 \le p < 1/4 \\
0 & p \ge 1/4. \\
\end{cases}\]

This capacity is strictly larger than the capacity of Hamming channels for every $0 < p < \frac{1}{4}$, and matches the capacity of list decoding and of binary symmetric channels in this range. Curiously, this shows that $R(\cdot)$ is not continuous at $p=1/4$.
\end{itemize}

Our results are incomparable to recent work on causal channels (these are stronger channels that read the codeword in one pass, but with no space restriction). The best known codes for causal channels, due to Chen, Jaggi and Langberg (STOC 2015), are shown to exist by the probabilistic method, and no explicit codes are known.

A key new ingredient in our construction is a new notion of ``\emph{evasiveness}'' of codes, which is concerned with whether a decoding algorithm rejects a word that is obtained when a channel introduces few errors into a \emph{uniformly chosen} (or \emph{pseudorandom}) string. We use evasiveness (as well as several additional new ideas related to coding theory and pseudorandomness) to identify the ``correct'' message in the list obtained by previous list-decoding algorithms.



Changes to previous version:

Improved presentation


Paper:

TR20-047 | 16th April 2020 13:14

Explicit Uniquely Decodable Codes for Space Bounded Channels That Achieve List-Decoding Capacity





TR20-047
Authors: Ronen Shaltiel, Jad Silbak
Publication: 16th April 2020 13:28
Downloads: 671
Keywords: 


Abstract:

We consider codes for space bounded channels. This is a model for communication under noise that was introduced by Guruswami and Smith (J. ACM 2016) and lies between the Shannon (random) and Hamming (adversarial) models. In this model, a channel is a space bounded procedure that reads the codeword in one pass, and modifies at most a $p$ fraction of the bits of the codeword.

Explicit uniquely decodable codes for space bounded channels: Our main result is that for every $0 \le p < \frac{1}{4}$, there exists a constant $\delta > 0$ and a \emph{uniquely decodable} code that is \emph{explicit} (meaning that encoding and decoding are in poly-time) and has rate $1-H(p)$ for channels with space $n^{\delta}$.

This improves upon previous explicit codes by Guruswami and Smith, and Kopparty, Shaltiel and Silbak (FOCS 2019). Specifically, we obtain the same space and rate as earlier works, even though prior work gave only list-decodable codes (rather than uniquely decodable codes).

Complete characterization of the capacity of space bounded channels: Together with a result by Guruswami and Smith showing the impossibility of unique decoding for $p \ge \frac{1}{4}$, our techniques also give a complete characterization of the capacity $R(p)$ of space $n^{1-o(1)}$ channels, specifically: $R(p)=1-H(p)$ for $0 \le p < 1/4$ and $R(p)=0$ for $p \ge 1/4$.

In particular, $R(\cdot)$ is not continuous at $p=1/4$. This capacity is strictly larger than the capacity of Hamming channels for every $0 < p < \frac{1}{4}$, and matches the capacity of list decoding and of binary symmetric channels in this range.

Our results are incomparable to recent work on causal channels. Causal channels are stronger channels that read the codeword in one pass, but with no space restriction. The best known codes for causal channels, due to Chen, Jaggi and Langberg (STOC 2015), are shown to exist by the probabilistic method, and no explicit codes are known. Furthermore, our results imply that for $p \ge p_0 \approx 0.0804$, there is a separation between the capacities of space bounded channels and causal channels: the capacity of the former is strictly larger than that of the latter.

Our approach builds on previous explicit list-decodable codes for space bounded channels. We introduce and study a notion of ``\emph{evasiveness}'' of codes, which is concerned with whether a decoding algorithm rejects a word that is obtained when a channel introduces few errors into a \emph{uniformly chosen} string. We use evasiveness (as well as several additional new ideas related to coding theory and pseudorandomness) to identify the ``correct'' message in the list. Loosely speaking, this is achieved by arguing that on ``incorrect messages'' the decoding algorithm cannot distinguish the codeword from a uniform string.



ISSN 1433-8092