We construct binary linear codes that are efficiently list-decodable
up to a fraction $(1/2-\eps)$ of errors. The codes encode $k$ bits
into $n = {\rm poly}(k/\eps)$ bits and are constructible and
list-decodable in time polynomial in $k$ and $1/\eps$ (in
particular, in our results $\eps$ need not be constant and can even
be polynomially small in $n$). Our results give the best known polynomial
dependence of $n$ on $k$ and $1/\eps$ for such codes. Specifically, we are
able to achieve $n \le O(k^3/\eps^{3+\gamma})$ or, if a linear
dependence on $k$ is required, $n \le O(k/\eps^{5+\gamma})$, where
$\gamma > 0$ is an arbitrary constant. The best previously known
constructive bounds in this setting were $n \le O(k^2/\eps^4)$ and
$n \le O(k/\eps^6)$. Non-constructively, a random linear encoding of
length $n = O(k/\eps^2)$ suffices, but no sub-exponential algorithm
is known for list decoding random codes.
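For quick comparison, the blocklength bounds just stated line up as follows:
\[
\begin{array}{ll}
\mbox{this work (constructive):} & n \le O(k^3/\eps^{3+\gamma}) \ \mbox{ or } \ n \le O(k/\eps^{5+\gamma}),\\
\mbox{previously known (constructive):} & n \le O(k^2/\eps^4) \ \mbox{ or } \ n \le O(k/\eps^6),\\
\mbox{random linear code (non-constructive):} & n = O(k/\eps^2).
\end{array}
\]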
Our construction with a cubic dependence on $1/\eps$ is obtained by
concatenating the recent Parvaresh-Vardy (PV) codes with dual BCH codes,
and crucially exploits the soft decoding algorithm for PV codes. As a
consequence, we obtain improved hardness results for the problem of
approximating NP witnesses in the model of Kumar and Sivakumar. Our
result with the linear dependence on $k$ is based on concatenating the
PV code with an arbitrary inner code of good minimum distance.
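For illustration, the following is a minimal, self-contained Python sketch of the concatenation step, using toy stand-in codes (a Reed-Solomon outer code over ${\rm GF}(2^4)$ and a Hadamard inner code) rather than the actual PV and dual BCH codes; all parameter choices in it are illustrative only.
\begin{verbatim}
# Schematic of code concatenation (toy stand-ins, not the actual
# PV / dual BCH construction): a Reed-Solomon outer code over GF(16)
# whose 4-bit symbols are each re-encoded by a binary Hadamard code.

IRRED = 0b10011  # x^4 + x + 1, irreducible over GF(2); defines GF(16)

def gf_mul(a, b, m=4, irred=IRRED):
    """Multiply two elements of GF(2^m), represented as integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):      # reduce modulo the irreducible polynomial
            a ^= irred
    return r

def rs_encode(msg, n=15):
    """Outer code: evaluate the message polynomial (coefficients `msg`,
    elements of GF(16)) at the n distinct nonzero field elements."""
    codeword = []
    for x in range(1, n + 1):
        acc, xpow = 0, 1
        for coeff in msg:
            acc ^= gf_mul(coeff, xpow)   # addition in GF(16) is XOR
            xpow = gf_mul(xpow, x)
        codeword.append(acc)
    return codeword

def hadamard_encode(sym, m=4):
    """Inner code: map an m-bit symbol to the 2^m inner products
    <sym, y> over GF(2), one output bit per y."""
    return [bin(sym & y).count("1") & 1 for y in range(2 ** m)]

def concat_encode(msg):
    """Concatenation: outer encoding, then inner encoding of each symbol."""
    return [bit for sym in rs_encode(msg) for bit in hadamard_encode(sym)]

# A 3-symbol (12-bit) message becomes a 15 * 16 = 240-bit codeword.
print(len(concat_encode([0b0011, 0b0101, 0b1110])))   # -> 240
\end{verbatim}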
Besides being a basic object of study in coding theory, codes that are
list-decodable from a fraction $(1/2-\eps)$ of errors for $\eps \to 0$
have found many uses in complexity theory. Moreover, our codes have the
property that all nonzero codewords have relative Hamming weight in the
range $(1/2-\eps, 1/2+\eps)$; this {\em $\eps$-biased} property is a
fundamental notion in pseudorandomness.
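To make the $\eps$-biased property concrete, here is a small Python check that enumerates the nonzero codewords of a toy binary linear code (a $[16,4]$ Hadamard code, used purely as an illustration and not one of the codes constructed here) and reports the largest deviation of their relative weights from $1/2$:
\begin{verbatim}
# Toy check of the eps-biased property: enumerate all nonzero codewords
# of a small binary linear code and report the largest deviation of
# their relative Hamming weights from 1/2.  The generator matrix below
# is the [16, 4] Hadamard code, used purely as an illustration.
from itertools import product

def codeword(gen_rows, msg_bits):
    """XOR together the generator rows selected by the message bits."""
    word = [0] * len(gen_rows[0])
    for bit, row in zip(msg_bits, gen_rows):
        if bit:
            word = [w ^ r for w, r in zip(word, row)]
    return word

def bias(gen_rows):
    """max over nonzero codewords c of |wt(c)/n - 1/2|."""
    k, n = len(gen_rows), len(gen_rows[0])
    worst = 0.0
    for msg in product([0, 1], repeat=k):
        if any(msg):
            rel_weight = sum(codeword(gen_rows, msg)) / n
            worst = max(worst, abs(rel_weight - 0.5))
    return worst

# Row i lists the i-th bit of every y in {0, ..., 15}.
G = [[(y >> i) & 1 for y in range(16)] for i in range(4)]
print(bias(G))   # -> 0.0: every nonzero codeword has relative weight 1/2
\end{verbatim}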