ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #1 to TR14-112 | 13th July 2015 14:37

Entropy of weight distributions of small-bias spaces and pseudobinomiality





Revision #1
Authors: Louay Bazzi
Accepted on: 13th July 2015 14:37
Downloads: 903


Abstract:

A classical bound in information theory asserts that small $L_1$-distance between probability distributions implies small difference in Shannon entropy, but the converse need not be true. We show that if a probability distribution on $\{0,1\}^n$ has small bias, then the converse holds for its weight distribution in the proximity of the binomial distribution. Namely, we argue that if a probability distribution $\mu$ on $\{0,1\}^n$ is $\delta$-biased, then $\| \overline{\mu} - \operatorname{bin}_n \|_1^2 \leq (2\ln{2})(n \delta + H(\operatorname{bin}_n) - H(\overline{\mu}))$, where $\overline{\mu}$ is the weight distribution of $\mu$ and $\operatorname{bin}_n$ is the binomial distribution on $\{0,\ldots, n\}$. The key result behind this bound is a lemma which asserts the non-positivity of all the Fourier coefficients of the log-binomial function $L:\{0,1\}^n \rightarrow {\mathbb R}$ given by $L(x) = \lg{\operatorname{bin}_n(|x|)}$.
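As a minimal numeric sketch (our illustration, not part of the paper), the lemma can be verified by brute force for small $n$, using the normalization $\hat{L}(S) = \mathbb{E}_x[L(x)(-1)^{\langle x,S \rangle}]$ and $\operatorname{bin}_n(|x|) = \binom{n}{|x|}/2^n$:

    # Brute-force check that every Fourier coefficient of the log-binomial
    # function L(x) = lg(bin_n(|x|)) is non-positive, for a small n.
    from math import comb, log2

    n = 6
    L = [log2(comb(n, bin(x).count("1")) / 2**n) for x in range(2**n)]

    def lhat(S):  # Lhat(S) = E_x[ L(x) * (-1)^(<x,S>) ], S encoded as a bitmask
        parity = lambda x: bin(x & S).count("1") % 2
        return sum(L[x] * (-1.0) ** parity(x) for x in range(2**n)) / 2**n

    assert all(lhat(S) <= 1e-12 for S in range(2**n))
    print("all Fourier coefficients of L are non-positive for n =", n)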
The original question which motivated the work reported in this paper is the problem of explicitly constructing a small subset of $\{0,1\}^n$ which is $\epsilon$-pseudobinomial in the sense that the weight distribution of each of its restrictions and translations is $\epsilon$-close to the binomial distribution. We study the notion of pseudobinomiality and we conclude that, for spaces with $n^{-\Theta(1)}$-small bias, the pseudobinomiality error in the $L_1$-sense is equivalent to that in the entropy-difference sense, in the $n^{-\Theta(1)}$-error regime. We also study the notion of average-case pseudobinomiality, and we show that for spaces with $n^{-\Theta(1)}$-small bias, the average entropy of the weight distribution of a random translation of the space is $n^{-\Theta(1)}$-close to the entropy of the binomial distribution. We discuss resulting questions on the pseudobinomiality of sums of independent small-bias spaces. Using the above results, we show that the following conjectures are equivalent: (1) for all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the ${\mathbb F}_2$-sum $X+Y$ is $O((n\delta)^{\Theta(1)})$-pseudobinomial; (2) for all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the entropy of the weight of the sum satisfies $H(|X+Y|)\geq \min\{H(|X|),H(|Y|)\} - O((n\delta)^{\Theta(1)})$.
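To make the main inequality concrete, here is a hedged sanity check on a toy example (the test distribution is our own choice, not taken from the paper): the mixture $\mu = (1-\delta)\cdot\mathrm{uniform} + \delta\cdot(\text{point mass at } 0^n)$ is exactly $\delta$-biased, and its weight distribution $\overline{\mu}$ is the same mixture of $\operatorname{bin}_n$ with a point mass at weight $0$:

    # Numeric sanity check of
    #   ||mu_bar - bin_n||_1^2 <= (2 ln 2) * (n*delta + H(bin_n) - H(mu_bar))
    # for the delta-biased mixture mu = (1-delta)*uniform + delta*(point mass at 0^n).
    from math import comb, log, log2

    n, delta = 10, 0.05
    binom = [comb(n, k) / 2**n for k in range(n + 1)]  # bin_n on {0,...,n}
    mu_bar = [(1 - delta) * p for p in binom]          # weight distribution of mu
    mu_bar[0] += delta                                 # the point mass lands at weight 0

    def H(p):  # Shannon entropy in bits
        return -sum(q * log2(q) for q in p if q > 0)

    lhs = sum(abs(a - b) for a, b in zip(mu_bar, binom)) ** 2
    rhs = 2 * log(2) * (n * delta + H(binom) - H(mu_bar))
    print(f"{lhs:.6f} <= {rhs:.6f}: {lhs <= rhs}")

For this mixture, $\|\overline{\mu} - \operatorname{bin}_n\|_1 = 2\delta(1 - \operatorname{bin}_n(0)) \approx 2\delta$, so the left side is about $4\delta^2$ while the right side is at least roughly $(2\ln 2)\,n\delta$; the bound holds with plenty of room here.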



Changes to previous version:

Introduction expanded; references added


Paper:

TR14-112 | 23rd August 2014 12:18

Entropy of weight distributions of small-bias spaces and pseudobinomiality





TR14-112
Authors: Louay Bazzi
Publication: 25th August 2014 08:59
Downloads: 2547


Abstract:

A classical bound in information theory asserts that small $L_1$-distance between probability distributions implies small difference in Shannon entropy, but the converse need not be true. We show that if a probability distribution on $\{0,1\}^n$ has small bias, then the converse holds for its weight distribution in the proximity of the binomial distribution. Namely, we argue that if a probability distribution $\mu$ on $\{0,1\}^n$ is $\delta$-biased, then $\| \overline{\mu} - \operatorname{bin}_n \|_1^2 \leq (2\ln{2})(n \delta + H(\operatorname{bin}_n) - H(\overline{\mu}))$, where $\overline{\mu}$ is the weight distribution of $\mu$ and $\operatorname{bin}_n$ is the binomial distribution on $\{0,\ldots, n\}$. The key result behind this bound is a lemma which asserts the non-positivity of all the Fourier coefficients of the log-binomial function $L:\{0,1\}^n \rightarrow {\mathbb R}$ given by $L(x) = \lg{\operatorname{bin}_n(|x|)}$.

The original question which motivated the work reported in this paper is the problem of explicitly constructing a small subset of $\{0,1\}^n$ which is $\epsilon$-pseudobinomial in the sense that the weight distribution of each of its restrictions and translations is $\epsilon$-close to the binomial distribution. We study the notion of pseudobinomiality and we conclude that, for spaces with $n^{-\Theta(1)}$-small bias, the pseudobinomiality error in the $L_1$-sense is equivalent to that in the entropy-difference sense, in the $n^{-\Theta(1)}$-error regime. We also study the notion of average-case pseudobinomiality, and we show that for spaces with $n^{-\Theta(1)}$-small bias, the average entropy of the weight distribution of a random translation of the space is $n^{-\Theta(1)}$-close to the entropy of the binomial distribution. We discuss resulting questions on the pseudobinomiality of sums of independent small-bias spaces. Using the above results, we show that the following conjectures are equivalent: (1) for all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the ${\mathbb F}_2$-sum $X+Y$ is $O((n\delta)^{\Theta(1)})$-pseudobinomial; (2) for all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the entropy of the weight of the sum satisfies $H(|X+Y|)\geq \min\{H(|X|),H(|Y|)\} - O((n\delta)^{\Theta(1)})$.



ISSN 1433-8092