Given an input probability distribution with some (min-)entropy,
an {\em extractor} uses a short seed of truly random bits to produce an
output distribution of (near) maximal entropy rate
(namely, a distribution close to uniform).
A natural weakening of this concept is a {\em condenser}, whose
output distribution is only required to have a higher entropy rate than the
input distribution (without losing
much of the initial entropy).
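For concreteness, here is a minimal sketch of the standard formalization
(the parameter names $n,k,d,m,\epsilon$ are ours and are meant only as an
illustration, not as the exact conventions used in the body of the paper).
A distribution $X$ over $\{0,1\}^n$ has min-entropy $k$ if
$\Pr[X=x]\le 2^{-k}$ for every $x$. A function
\[
  E:\{0,1\}^n\times\{0,1\}^d\to\{0,1\}^m
\]
is a $(k,\epsilon)$-{\em extractor} if for every such $X$ the output
$E(X,U_d)$ is $\epsilon$-close (in statistical distance) to the uniform
distribution $U_m$; it is a {\em condenser} if $E(X,U_d)$ is only required
to be $\epsilon$-close to some distribution of min-entropy $k'$ whose
entropy rate $k'/m$ exceeds the rate $k/n$ of the source, while keeping
$k'$ close to $k$.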
In this paper we construct efficient explicit condensers.
The condenser constructions combine
(variants or more efficient versions of)
ideas from several works,
including the block extraction scheme of Nisan and Zuckerman,
the observation made by Nisan and Ta-Shma
that a failure of the block extraction scheme is also useful,
the recursive ``win-win'' case analysis of Impagliazzo, Shaltiel, and
Wigderson, and the error correction of random sources used by Trevisan.
As a natural byproduct (via repeated application of condensers),
we obtain new extractor constructions.
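To give a rough sense of why iteration yields extractors (the constants
here are illustrative, not the actual parameters of our constructions):
if a single condensing step roughly doubles the entropy rate while losing
only a small fraction of the entropy and using a short additional seed,
then starting from rate $k_0/n_0$ one obtains
\[
  \frac{k_0}{n_0}\;\longrightarrow\;\frac{k_1}{n_1}\approx 2\cdot\frac{k_0}{n_0}
  \;\longrightarrow\;\cdots\;\longrightarrow\;\frac{k_t}{n_t}=\Omega(1),
  \qquad t=O\!\left(\log\frac{n_0}{k_0}\right),
\]
with $k_t$ still close to $k_0$; at constant entropy rate, known extractors
can finish the job, and the total seed length and entropy loss are the sums
over the $t$ iterations.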
The new extractors
give significant qualitative improvements over
previous ones for sources of arbitrary min-entropy; they are nearly
optimal simultaneously in the two main parameters: seed length and
output length. Specifically, our extractors can make either of these
two parameters optimal (up to a constant factor) at only a
poly-logarithmic loss in the other.
Previous constructions require polynomial loss in both cases for
general sources.
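Concretely (in our illustrative notation, with the exact poly-logarithmic
factors deferred to the body of the paper), for a source of min-entropy $k$
over $n$ bits this means our extractors achieve either
\[
  d=O(\log n)\ \text{with}\ m=\frac{k}{\mathrm{polylog}(n)},
  \qquad\text{or}\qquad
  m=\Omega(k)\ \text{with}\ d=\mathrm{polylog}(n),
\]
whereas an optimal extractor would have $d=\Theta(\log n)$ and
$m\approx k$ simultaneously.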
We also give a simple reduction converting ``standard'' extractors
(which are good for an average seed) to
``strong'' ones (which are good for most seeds), with essentially the
same parameters.
With this reduction, all of the above improvements apply to strong extractors as well.
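In standard terms (again with our illustrative notation), $E$ is a
{\em strong} $(k,\epsilon)$-extractor if the seed may be revealed together
with the output, i.e.,
\[
  (U_d,\,E(X,U_d))\ \text{is $\epsilon$-close to}\ (U_d,\,U_m);
\]
equivalently, the statistical distance between $E(X,y)$ and $U_m$, averaged
over seeds $y$, is at most $\epsilon$, so by Markov's inequality all but a
$\sqrt{\epsilon}$ fraction of the seeds are individually good (up to error
$\sqrt{\epsilon}$).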