ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Revision(s):

Revision #1 to TR15-129 | 23rd June 2016 07:49

On the entropy of a noisy function

Revision #1
Authors: Alex Samorodnitsky
Accepted on: 23rd June 2016 07:49
Downloads: 695


Abstract:

Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of the image of $f$ under the noise operator with noise parameter $\epsilon$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables.
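
For reference, in one common normalization (these are standard definitions, not quoted from the paper), the noise operator and the entropy functional are

$$(T_\epsilon f)(x) \;=\; \mathbb{E}\big[f(Y) \,\big|\, X = x\big], \qquad \mathrm{Ent}[f] \;=\; \mathbb{E}\, f \log f \;-\; \mathbb{E} f \,\log \mathbb{E} f,$$

where $Y$ is obtained from $x$ by flipping each coordinate independently with probability $\epsilon$, and expectations are over the uniform measure on $\{0,1\}^n$. In this notation, the main bound controls $\mathrm{Ent}[T_\epsilon f]$ by the average of $\mathrm{Ent}\big[\mathbb{E}(f \mid x_S)\big]$ over sets $S$ of roughly $(1-2\epsilon)^2 \cdot n$ coordinates.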

As an application, we show that if a boolean function $f$ is close to the characteristic function $g$ of a subcube of dimension $n-1$, then the entropy of the noisy version of $f$ is at most that of the noisy version of $g$.

This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds at high noise ($\epsilon$ close to $1/2$).

Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$ for a sufficiently small positive constant $\delta$, every boolean function $f$ satisfies $I\big(f(X);Y\big) \le 1 - H(\epsilon)$.
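
The right-hand side $1 - H(\epsilon)$ is exactly what a dictator achieves; note that the characteristic function of a subcube of dimension $n-1$ is, up to negation, a dictator $x \mapsto x_i$. As a one-line check (a standard fact, included here for orientation), take $f(x) = x_1$; then

$$I\big(f(X);Y\big) \;=\; I(X_1;Y_1) \;=\; H(Y_1) - H(Y_1 \mid X_1) \;=\; 1 - H(\epsilon),$$

since $Y_1$ is the output of a binary symmetric channel with crossover probability $\epsilon$ fed the uniform bit $X_1$, while $Y_2,\dots,Y_n$ are independent of $X_1$. Here $H(\epsilon) = -\epsilon\log_2\epsilon - (1-\epsilon)\log_2(1-\epsilon)$ is the binary entropy function.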



Changes to previous version:

Significantly revised and merged with a follow-up paper.


Paper:

TR15-129 | 7th August 2015 23:50

On the entropy of a noisy function





TR15-129
Authors: Alex Samorodnitsky
Publication: 10th August 2015 15:52
Downloads: 1086


Abstract:

Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of the image of $f$ under the noise operator with noise parameter $\epsilon$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables.

As an application, we show that if a boolean function $f$ is close to the characteristic function $g$ of a subcube of dimension $n-1$, then the entropy of the noisy version of $f$ is at most that of the noisy version of $g$.

This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds for balanced boolean functions at high noise ($\epsilon$ close to $1/2$).

Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$ for a sufficiently small positive constant $\delta$, every balanced boolean function $f$ satisfies $I\big(f(X);Y\big) \le 1 - H(\epsilon)$.
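
For small $n$, the inequality can be checked numerically by brute force. The following Python sketch (function and parameter choices are illustrative, not from the paper) computes $I(f(X);Y)$ exactly from the joint distribution of $(f(X), Y)$ and compares the dictator with 5-bit majority, both balanced, at high noise:

import itertools
import math

def mutual_information(f, n, eps):
    # Exact I(f(X); Y): X uniform on {0,1}^n, Y an eps-noisy copy of X.
    points = list(itertools.product((0, 1), repeat=n))
    joint = {(b, y): 0.0 for b in (0, 1) for y in points}
    for x in points:
        for y in points:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            joint[(f(x), y)] += eps ** d * (1 - eps) ** (n - d) / 2 ** n
    p_b = {b: sum(joint[(b, y)] for y in points) for b in (0, 1)}
    p_y = {y: joint[(0, y)] + joint[(1, y)] for y in points}
    return sum(p * math.log2(p / (p_b[b] * p_y[y]))
               for (b, y), p in joint.items() if p > 0)

n, eps = 5, 0.45  # high noise: eps close to 1/2
H = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
print(mutual_information(lambda x: x[0], n, eps), 1 - H)       # dictator: equals 1 - H(eps)
print(mutual_information(lambda x: int(sum(x) >= 3), n, eps))  # majority: strictly smaller

For $n = 5$ and $\epsilon = 0.45$, the dictator matches $1 - H(0.45) \approx 0.0072$, while majority returns a strictly smaller value, consistent with the theorem.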



ISSN 1433-8092