The question of list decoding error-correcting codes over finite fields (under the Hamming metric) has been widely studied in recent years. Motivated by the similar discrete linear structure of linear codes and point lattices in $\mathbb{R}^{N}$, and their many shared applications across complexity theory, cryptography, and coding theory, we initiate the study of list decoding for lattices. Namely: for a lattice $L\subseteq \mathbb{R}^N$, given a target vector $r\in \mathbb{R}^N$ and a distance parameter $d$, output the set of all lattice points $w\in L$ that are within distance $d$ of $r$.
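To make the problem statement concrete, here is a minimal brute-force sketch in Python that simply enumerates a finite box of integer coefficient vectors and keeps the lattice points within distance $d$ of the target. It only illustrates the definition above; it is not the algorithm of this paper, its running time is exponential in the dimension, and the function name and pseudo-inverse-based search box are our own choices for this sketch.

import itertools
import numpy as np

def list_decode_bruteforce(B, r, d):
    """Return all lattice points B@x (x integral) with ||B@x - r||_2 <= d.

    B : (N, k) basis matrix with linearly independent columns
    r : target vector in R^N
    d : decoding radius
    """
    B = np.asarray(B, dtype=float)
    r = np.asarray(r, dtype=float)
    B_pinv = np.linalg.pinv(B)      # maps B@x back to x (left inverse of B)
    center = B_pinv @ r             # coefficients of the closest real (non-integer) solution
    # If ||B@x - r|| <= d, then x - center = B_pinv @ (B@x - r), so every
    # coefficient satisfies |x_i - center_i| <= ||B_pinv||_2 * d.  This gives a
    # finite (but exponentially large) search box that provably contains the list.
    slack = np.linalg.norm(B_pinv, 2) * d
    ranges = [range(int(np.ceil(c - slack)), int(np.floor(c + slack)) + 1)
              for c in center]
    points = []
    for x in itertools.product(*ranges):
        w = B @ np.array(x, dtype=float)
        if np.linalg.norm(w - r) <= d:
            points.append(w)
    return points

# Example: the integer lattice Z^2, target (0.4, 0.6), radius 1 -> four lattice points.
print(list_decode_bruteforce(np.eye(2), [0.4, 0.6], 1.0))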
In this work we focus on combinatorial and algorithmic questions related to list decoding for the well-studied family of Barnes-Wall lattices. Our main contributions are twofold:
1) We give tight (up to polynomials) combinatorial bounds on the worst-case list size, showing it to be polynomial in the lattice dimension for any error radius bounded away from the lattice's minimum distance (in the Euclidean norm).
2) Building on the unique decoding algorithm of Micciancio and Nicolosi (ISIT '08), we give a list-decoding algorithm that runs in time polynomial in the lattice dimension and worst-case list size, for any error radius. Moreover, our algorithm is highly parallelizable, and with sufficiently many processors can run in parallel time only poly-logarithmic in the lattice dimension.
In particular, our results imply a polynomial-time list-decoding algorithm for any error radius bounded away from the minimum distance, thus beating the typical barrier posed by the Johnson radius for natural error-correcting codes.
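For context (not stated in the abstract itself), the Barnes-Wall family is commonly defined by a simple recursion over the Gaussian integers $\mathbb{Z}[i]$, which is the structural reason a recursive, and hence highly parallel, decoder is natural. One standard form of the construction, assuming the usual convention, is
\[
BW_1 = \mathbb{Z}[i], \qquad BW_{2n} = \{ (u,\, u + \phi v) : u, v \in BW_n \}, \qquad \phi = 1 + i,
\]
so that $BW_n \subseteq \mathbb{Z}[i]^n$ is a lattice of real dimension $N = 2n$.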