ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #2 to TR19-079 | 2nd June 2019 10:05

Average Bias and Polynomial Sources





Revision #2
Authors: Arnab Bhattacharyya, Philips George John, Suprovat Ghoshal, Raghu Meka
Accepted on: 2nd June 2019 10:05
Downloads: 52


Abstract:

This paper has been withdrawn due to a mistake.



Changes to previous version:

This paper is being withdrawn.


Revision #1 to TR19-079 | 2nd June 2019 10:04

Average Bias and Polynomial Sources





Revision #1
Authors: Arnab Bhattacharyya, Philips George John, Suprovat Ghoshal, Raghu Meka
Accepted on: 2nd June 2019 10:04
Downloads: 36


Abstract:

This paper has been withdrawn due to a mistake.



Changes to previous version:

This paper is being withdrawn.


Paper:

TR19-079 | 28th May 2019 08:11

Average Bias and Polynomial Sources





TR19-079
Authors: Arnab Bhattacharyya, Philips George John, Suprovat Ghoshal, Raghu Meka
Publication: 1st June 2019 19:39
Downloads: 56


Abstract:

We identify a new notion of pseudorandomness for randomness sources, which we call the average bias. Given a distribution $Z$ over $\{0,1\}^n$, its average bias is $b_{\text{av}}(Z) = 2^{-n} \sum_{c \in \{0,1\}^n} |\mathbb{E}_{z \sim Z}(-1)^{\langle c, z\rangle}|$. A source with average bias at most $2^{-k}$ has min-entropy at least $k$, and so low average bias is a stronger condition than high min-entropy. We observe that the inner product function is an extractor for any source with average bias less than $2^{-n/2}$.
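The definition above can be computed directly for small, explicitly given distributions by summing the absolute Fourier coefficients over all $2^n$ characters $c$. The following is a minimal illustrative sketch (not code from the paper); the function name `average_bias` and the dictionary representation of the distribution are our own choices.

```python
from itertools import product

def average_bias(pmf, n):
    """b_av(Z) = 2^{-n} * sum over c in {0,1}^n of |E_{z~Z} (-1)^{<c,z>}|.

    pmf maps n-bit tuples to their probabilities under Z.
    """
    total = 0.0
    for c in product((0, 1), repeat=n):
        # Fourier coefficient of Z at character c: E_{z~Z} (-1)^{<c,z>}
        coeff = sum(p * (-1) ** sum(ci * zi for ci, zi in zip(c, z))
                    for z, p in pmf.items())
        total += abs(coeff)
    return total / 2 ** n

# Uniform distribution on {0,1}^2: only c = 0 has a nonzero coefficient,
# so the average bias is 2^{-n} = 1/4, matching min-entropy n = 2.
uniform = {z: 0.25 for z in product((0, 1), repeat=2)}
```

A point mass, the opposite extreme, has every coefficient equal to $\pm 1$ and hence average bias $1$, consistent with its min-entropy $0$.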

The notion of average bias is especially natural for polynomial sources, i.e., distributions sampled by low-degree $n$-variate polynomials over $\mathbb{F}_2$. For the well-studied case of affine sources, it is easy to see that min-entropy $k$ is exactly equivalent to average bias $2^{-k}$. We show that for quadratic sources, min-entropy $k$ implies that the average bias is at most $2^{-\Omega(\sqrt{k})}$. We use this relation to design dispersers for separable quadratic sources with a min-entropy guarantee.
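The affine equivalence is easy to check numerically: a source uniform on a $k$-dimensional subspace of $\{0,1\}^n$ has Fourier coefficient $\pm 1$ on the $2^{n-k}$ characters in the dual space and $0$ elsewhere, giving average bias exactly $2^{-k}$. A small self-contained Python check (illustrative, with names of our choosing), using the coordinate subspace spanned by $e_1, e_2$ in $\{0,1\}^4$:

```python
from itertools import product

def avg_bias(pmf, n):
    # b_av(Z) = 2^{-n} * sum over c of |E_{z~Z} (-1)^{<c,z>}|
    return sum(
        abs(sum(p * (-1) ** sum(ci * zi for ci, zi in zip(c, z))
                for z, p in pmf.items()))
        for c in product((0, 1), repeat=n)
    ) / 2 ** n

n, k = 4, 2
# Affine (in fact linear) source: uniform over vectors (x1, x2, 0, 0),
# a k-dimensional subspace of {0,1}^4 with min-entropy exactly k = 2.
support = [tuple(x) + (0,) * (n - k) for x in product((0, 1), repeat=k)]
pmf = {z: 1 / len(support) for z in support}
bias = avg_bias(pmf, n)  # equals 2^{-k} = 0.25
```

Here exactly the $2^{n-k} = 4$ characters supported on the last two coordinates contribute $1$ each, so the sum is $2^{n-k}$ and the average bias is $2^{-k}$.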



ISSN 1433-8092 | Imprint