Electronic Colloquium on Computational Complexity (ECCC)

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR25-147 | 16th October 2025 08:33

Decoding Balanced Linear Codes With Preprocessing


Abstract:

Prange's information set algorithm is a decoding algorithm for arbitrary linear codes. It decodes corrupted codewords of any $\mathbb{F}_2$-linear code $C$ of message length $n$ up to relative error rate $O(\log n / n)$ in $\mathrm{poly}(n)$ time. We show that the error rate can be improved to $O((\log n)^2 / n)$, provided: (1) the decoder has access to a polynomial-length advice string that depends only on $C$, and (2) $C$ is $n^{-\Omega(1)}$-balanced.
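For concreteness, the following is a minimal Python sketch of Prange-style information set decoding over $\mathbb{F}_2$: repeatedly pick a random candidate information set, solve for the message consistent with the received word on that set, and accept if the resulting codeword is close enough to the received word. The generator-matrix representation, the function names (gf2_solve, prange_decode), and the iteration budget are illustrative assumptions, not details taken from the paper.

# Minimal sketch of Prange-style information set decoding over F_2.
# Assumes a k x n generator matrix G and a received word y = mG + e (mod 2).
# Names and parameters here are illustrative, not from the paper.
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2) by Gauss-Jordan elimination; return None if A is singular."""
    A = A.copy() % 2
    b = b.copy() % 2
    k = A.shape[0]
    for col in range(k):
        pivot = next((r for r in range(col, k) if A[r, col]), None)
        if pivot is None:
            return None                               # no pivot: singular submatrix
        A[[col, pivot]] = A[[pivot, col]]             # swap rows into pivot position
        b[[col, pivot]] = b[[pivot, col]]
        for r in range(k):
            if r != col and A[r, col]:
                A[r] ^= A[col]                        # eliminate column from other rows
                b[r] ^= b[col]
    return b                                          # A reduced to identity, b is the solution

def prange_decode(G, y, t, max_iters=10000, rng=None):
    """Try to find a codeword of the code generated by G within Hamming distance t of y."""
    rng = rng or np.random.default_rng()
    k, n = G.shape
    for _ in range(max_iters):
        I = rng.choice(n, size=k, replace=False)      # candidate information set
        m = gf2_solve(G[:, I].T, y[I])                # solve m G_I = y_I over GF(2)
        if m is None:
            continue                                  # columns not invertible; resample
        c = (m @ G) % 2                               # candidate codeword
        if np.count_nonzero((c + y) % 2) <= t:        # accept if few enough disagreements
            return c
    return None

The iteration succeeds once the sampled information set avoids every error position, which is what limits the error rate Prange's algorithm can handle in polynomial time.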

As a consequence, we improve the error tolerance of decoding random linear codes when inefficient preprocessing of the code is allowed. This reveals potential vulnerabilities in cryptographic applications of Learning Noisy Parities with low noise rate.

Our main technical result is that the Hamming weight of $Hw$, where the rows of $H$ are a random sample of \emph{short} dual codewords, measures the proximity of a word $w$ to the code in the regime of interest. Given such an $H$ as advice, our algorithm corrects errors by locally minimizing this measure. We show that for most codes, the error rate tolerated by our decoder is asymptotically optimal among all algorithms whose decision is based on thresholding $Hw$ for an arbitrary polynomial-size advice matrix $H$.
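As an illustration of the advice-based decoding idea described above, the following Python sketch treats the rows of $H$ as sparse parity checks and greedily flips the bit of $w$ that most reduces the Hamming weight of $Hw$, stopping at a local minimum. The greedy schedule and the names (local_min_decode, max_rounds) are assumptions made for illustration; the paper's actual algorithm and its analysis are not reproduced here.

# Minimal sketch of decoding by locally minimizing wt(H w), where H is an
# advice matrix whose rows are assumed to be short dual codewords of C.
# Since H c = 0 for every codeword c, H w = H e depends only on the error e.
# Names and the greedy update rule are illustrative assumptions.
import numpy as np

def weight(v):
    return int(np.count_nonzero(v % 2))

def local_min_decode(H, w, max_rounds=None):
    """Greedy bit-flipping: reduce wt(H w) until no single flip helps."""
    w = w.copy() % 2
    syndrome = (H @ w) % 2
    max_rounds = max_rounds or len(w)
    for _ in range(max_rounds):
        if weight(syndrome) == 0:
            break                                     # w is now a codeword
        # Flipping w_i toggles exactly the checks in column i of H,
        # so it changes wt(H w) by wt(syndrome + H[:, i]) - wt(syndrome).
        gains = np.array([
            weight((syndrome + H[:, i]) % 2) - weight(syndrome)
            for i in range(len(w))
        ])
        i = int(np.argmin(gains))
        if gains[i] >= 0:
            break                                     # local minimum reached
        w[i] ^= 1
        syndrome = (syndrome + H[:, i]) % 2
    return w

The sketch only conveys why sparse dual codewords make wt(Hw) a usable proximity measure: each flipped error bit touches few checks, so the weight of the syndrome tracks the weight of the error in the sparse-error regime.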


