__
Revision #1 to TR15-129 | 23rd June 2016 07:49
__
#### On the entropy of a noisy function

**Abstract:**
Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of the image of $f$ under the noise operator with noise parameter $\epsilon$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables.
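As a concrete illustration (not from the paper itself), the noise operator and the entropy of a nonnegative function can be computed by brute force on small cubes. The sketch below assumes the convention $\mathrm{Ent}(g) = \mathbb{E}[g\log g] - \mathbb{E}[g]\log\mathbb{E}[g]$, with expectations uniform over $\{0,1\}^n$; the function names are illustrative.

```python
import math
from itertools import product

def noise_operator(f, n, eps):
    """Apply T_eps: (T_eps f)(x) = E[f(Y)], where Y is obtained from x
    by flipping each coordinate independently with probability eps.
    f is a dict mapping points of {0,1}^n (as 0/1 tuples) to values."""
    points = list(product([0, 1], repeat=n))
    out = {}
    for x in points:
        total = 0.0
        for y in points:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            total += (eps ** d) * ((1 - eps) ** (n - d)) * f[y]
        out[x] = total
    return out

def ent(g):
    """Ent(g) = E[g log g] - E[g] log E[g], uniform expectations,
    with the usual convention 0 log 0 = 0."""
    vals = list(g.values())
    mean = sum(vals) / len(vals)
    e_glogg = sum(v * math.log(v) for v in vals if v > 0) / len(vals)
    return e_glogg - (mean * math.log(mean) if mean > 0 else 0.0)
```

At $\epsilon = 1/2$ the noise operator averages $f$ to a constant, so the entropy of the noisy function drops to $0$; for smaller $\epsilon$ it lies strictly between $0$ and $\mathrm{Ent}(f)$.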

As an application, we show that for a boolean function $f$ that is close to the characteristic function $g$ of a subcube of dimension $n-1$, the entropy of the noisy version of $f$ is at most that of the noisy version of $g$.


This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds for high noise (close to $1/2$).

Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$ for a sufficiently small positive constant $\delta$, every boolean function $f$ satisfies $I\big(f(X);Y\big) \le 1 - H(\epsilon)$.
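The tight case of this bound can be checked numerically: for a dictator function $f(x) = x_1$, only the first coordinate of $Y$ carries information about $f(X)$, so $I\big(f(X);Y\big) = I(X_1;Y_1) = 1 - H(\epsilon)$ exactly. The following brute-force sketch (illustrative only, not part of the paper) computes $I\big(f(X);Y\big)$ over small cubes, with $H$ the binary entropy in bits.

```python
import math
from itertools import product

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(f, n, eps):
    """I(f(X); Y) in bits, where X is uniform on {0,1}^n and Y flips
    each coordinate of X independently with probability eps."""
    points = list(product([0, 1], repeat=n))
    joint = {}  # (b, y) -> P[f(X) = b, Y = y]
    for x in points:
        b = f(x)
        for y in points:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            p = (2.0 ** -n) * (eps ** d) * ((1 - eps) ** (n - d))
            joint[(b, y)] = joint.get((b, y), 0.0) + p
    pb, py = {}, {}  # marginals of f(X) and Y
    for (b, y), p in joint.items():
        pb[b] = pb.get(b, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (pb[b] * py[y]))
               for (b, y), p in joint.items() if p > 0)
```

For the dictator on $n = 3$, the computed value agrees with $1 - H(\epsilon)$ to machine precision; a constant function gives $I = 0$, as expected.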

**Changes to previous version:**
Significantly revised and merged with a follow-up paper.

__
TR15-129 | 7th August 2015 23:50
__

#### On the entropy of a noisy function

**Abstract:**
Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of the image of $f$ under the noise operator with noise parameter $\epsilon$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables.

As an application, we show that for a boolean function $f$ that is close to the characteristic function $g$ of a subcube of dimension $n-1$, the entropy of the noisy version of $f$ is at most that of the noisy version of $g$.

This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds for balanced boolean functions and high noise (close to $1/2$).

Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$ for a sufficiently small positive constant $\delta$, every balanced boolean function $f$ satisfies $I\big(f(X);Y\big) \le 1 - H(\epsilon)$.