ECCC
Electronic Colloquium on Computational Complexity






TR18-066 | 8th April 2018 17:41

Two-Source Condensers with Low Error and Small Entropy Gap via Entropy-Resilient Functions


Abstract:

In their seminal work, Chattopadhyay and Zuckerman (STOC'16) constructed a two-source extractor with error $\varepsilon$ for $n$-bit sources having min-entropy $poly\log(n/\varepsilon)$. Unfortunately, the running time of their construction is $poly(n/\varepsilon)$, which means that polynomial-time constructions can only achieve polynomially small error. Our main result is a $poly(n,\log(1/\varepsilon))$-time computable two-source condenser. For any $k \ge poly\log(n/\varepsilon)$, our condenser transforms two independent $(n,k)$-sources into a distribution over $m = k-O(\log(1/\varepsilon))$ bits that is $\varepsilon$-close to having min-entropy $m - o(\log(1/\varepsilon))$, hence achieving an entropy gap of $o(\log(1/\varepsilon))$.
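Stated more explicitly (this is a paraphrase of the result above in the standard condenser terminology, with min-entropy denoted $H_\infty$, not a verbatim theorem statement from the paper), the object constructed is a function

$$\mathrm{Cond} \colon \{0,1\}^n \times \{0,1\}^n \to \{0,1\}^m, \qquad m = k - O(\log(1/\varepsilon)),$$

computable in time $poly(n,\log(1/\varepsilon))$, such that for every pair of independent $(n,k)$-sources $X,Y$ with $k \ge poly\log(n/\varepsilon)$, the output $\mathrm{Cond}(X,Y)$ is $\varepsilon$-close to some distribution $W$ over $m$ bits with $H_\infty(W) \ge m - o(\log(1/\varepsilon))$.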

The bottleneck for obtaining low error in recent constructions of two-source extractors lies in the use of resilient functions. Informally, a resilient function receives input bits from $r$ players and has the property that its output has small bias even if a bounded number of corrupted players choose their inputs adversarially after seeing the inputs of the honest players. The drawback of using resilient functions is that the error cannot be smaller than $1/r$, so achieving error $\varepsilon$ requires $r \ge 1/\varepsilon$ players. This, in turn, forces the running time of the construction to be polynomial in $1/\varepsilon$.
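As a toy illustration of this polynomial-in-$r$ barrier (majority is used here purely for concreteness; it is not the resilient function used in the actual constructions), the sketch below estimates the probability that a single corrupted player, choosing its bit after seeing the $r-1$ honest bits, decides the output of an $r$-player majority vote. The player is pivotal exactly when the honest votes are tied, which happens with probability $\Theta(1/\sqrt{r})$, so the bias one corrupted player can force decays only polynomially in $r$.

```python
import random

def majority(bits):
    # Output 1 iff strictly more than half of the bits are 1 (r is odd below).
    return int(sum(bits) > len(bits) / 2)

def pivot_probability(r, trials=50_000):
    # Estimate the probability that the last of r players, after seeing the
    # r - 1 honest uniform bits, single-handedly decides the majority output.
    # This event lower-bounds the bias one corrupted player can force.
    pivotal = 0
    for _ in range(trials):
        honest = [random.getrandbits(1) for _ in range(r - 1)]
        if majority(honest + [0]) != majority(honest + [1]):
            pivotal += 1
    return pivotal / trials

if __name__ == "__main__":
    for r in (11, 101, 1001):  # odd r, so the full vote is never tied
        print(f"r = {r:4d}   estimated pivot probability ~ {pivot_probability(r):.3f}")
```

The estimates decay roughly like $1/\sqrt{r}$ (about $0.25$, $0.08$, and $0.025$), in line with the polynomial decay discussed above: since the error cannot drop below $1/r$, driving it down to $\varepsilon$ requires at least $1/\varepsilon$ players, and the running time grows with $r$.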

A key component in our construction is a variant of resilient functions which we call entropy-resilient functions. This variant can be seen as playing the above game for several rounds, where the goal of the corrupted players is to reduce, with as high a probability as they can, the min-entropy accumulated throughout the rounds. We show that while the bias decreases only polynomially with the number of players in the one-round game, in the repeated game the players' success probability decreases exponentially in the entropy gap they attempt to incur.
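Schematically, and suppressing the exact parameters quantified in the paper, the contrast between the two settings is:

$$\text{one-round game:}\ \ \mathrm{bias} \ \ge\ \frac{1}{poly(r)}, \qquad \text{repeated game:}\ \ \Pr\big[\text{incurred entropy gap} \ \ge\ g\big] \ \le\ 2^{-\Omega(g)}.$$

It is this exponential decay in the target gap $g$, rather than a polynomial decay in $r$, that makes error $\varepsilon$ with entropy gap $o(\log(1/\varepsilon))$ attainable in time $poly(n,\log(1/\varepsilon))$.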


