We give explicit constructions of extractors which work for a source of
any min-entropy on strings of length n. These extractors can extract any
constant fraction of the min-entropy using O(log^2 n) additional random
bits, and can extract all the min-entropy using O(log^3 n) additional
random bits. Both of these constructions use fewer truly random bits than
any previous construction which works for all min-entropies and extracts a
constant fraction of the min-entropy. We then improve our second
construction and show that we can reduce the entropy loss to
2 log(1/epsilon) + O(1) bits, while still using O(log^3 n) truly random
bits (where the entropy loss is defined as (source min-entropy) +
(number of truly random bits used) - (number of output bits), and
epsilon is the statistical difference from the uniform distribution
achieved by the output). This entropy loss is optimal up to a constant
additive term.
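
As a worked restatement of the parenthetical definition (the symbols k, d, m, and Delta below are illustrative notation, not part of the abstract): for an extractor Ext : {0,1}^n x {0,1}^d -> {0,1}^m applied to a source of min-entropy k,

```latex
% Entropy loss Delta of an extractor Ext : {0,1}^n x {0,1}^d -> {0,1}^m
% on a source of min-entropy k; the symbols k, d, m, Delta are illustrative.
\[
  \Delta \;=\; k + d - m
  \;=\; (\text{source min-entropy}) + (\text{truly random bits used})
        - (\text{output bits}),
\]
\[
  \Delta \;=\; 2\log(1/\epsilon) + O(1) \quad
  \text{for the improved construction above.}
\]
```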
Our extractors are obtained by observing that a weaker notion of
"combinatorial design" suffices for the Nisan-Wigderson pseudorandom
generator, which underlies the recent extractor of Trevisan. We give
near-optimal constructions of such "weak designs" that achieve much
better parameters than is possible with the notion of designs used by
Nisan and Wigderson and by Trevisan.
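
To make the weak-design condition concrete, the following minimal Python sketch checks it for a given family of sets, using the definition from the body of the paper (recalled here; the abstract itself does not state it): sets S_1, ..., S_m over d seed positions, each of size l, such that for every i the overlap sum over earlier sets is at most rho * (m - 1). The function name, toy parameters, and example sets are illustrative, not taken from the paper.

```python
def is_weak_design(sets, d, rho):
    """Check the weak (l, rho)-design condition for a family of sets
    S_1, ..., S_m, each a subset of {0, ..., d-1} of the same size l:
    for every i,  sum_{j < i} 2^(|S_i intersect S_j|)  <=  rho * (m - 1).
    (Illustrative checker only; the paper gives explicit constructions.)"""
    m = len(sets)
    if len({len(s) for s in sets}) != 1:
        return False  # all sets must have the same size l
    if any(x < 0 or x >= d for s in sets for x in s):
        return False  # sets must live inside the seed positions {0, ..., d-1}
    for i in range(m):
        overlap_sum = sum(2 ** len(sets[i] & sets[j]) for j in range(i))
        if overlap_sum > rho * (m - 1):
            return False
    return True


# Toy example: pairwise-disjoint blocks satisfy the condition with rho = 1.
blocks = [frozenset(range(3 * i, 3 * i + 3)) for i in range(4)]
print(is_weak_design(blocks, d=12, rho=1))  # True
```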
We also show how to improve our constructions (and Trevisan's
construction) when the required statistical difference epsilon from the
uniform distribution is relatively small. This improvement is obtained by
using multilinear error-correcting codes over finite fields, rather than
the arbitrary error-correcting codes used by Trevisan.
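
As one natural instance of a multilinear code over a finite field, the Python sketch below encodes a Boolean string, viewed as a truth table, by evaluating its unique multilinear extension at every point of F_q^ell for a prime q. The function multilinear_encode and the toy parameters are illustrative assumptions for this sketch; the actual parameter choices and analysis are those of the paper.

```python
import itertools

def multilinear_encode(truth_table, q):
    """Encode a Boolean string, viewed as the truth table of a function
    f : {0,1}^ell -> {0,1}, by evaluating its unique multilinear extension
    over the prime field F_q at every point of F_q^ell.
    (Toy encoder for illustration; the paper's parameter choices differ.)"""
    n = len(truth_table)
    ell = n.bit_length() - 1
    assert 1 << ell == n, "truth table length must be a power of two"

    def extension(point):
        # Multilinear extension: sum over cube vertices a of
        # f(a) * prod_i (x_i if a_i = 1 else 1 - x_i), computed mod q.
        total = 0
        for idx, fa in enumerate(truth_table):
            if fa == 0:
                continue
            bits = [(idx >> i) & 1 for i in range(ell)]
            term = 1
            for xi, ai in zip(point, bits):
                term = term * (xi if ai else 1 - xi) % q
            total = (total + term) % q
        return total

    return [extension(p) for p in itertools.product(range(q), repeat=ell)]


# Toy usage: the 4-bit truth table of AND, encoded over F_5.
codeword = multilinear_encode([0, 0, 0, 1], q=5)
print(len(codeword))  # 25 field elements: one per point of F_5^2
```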