ECCC Report TR13-052
https://eccc.weizmann.ac.il/report/2013/052
Comments and Revisions published for TR13-052
Thu, 13 Sep 2018 14:49:15 +0300
Revision 2
| Upper Bounds on Fourier Entropy |
Sourav Chakraborty,
Raghav Kulkarni,
Satyanarayana V. Lokam,
Nitin Saurabh
https://eccc.weizmann.ac.il/report/2013/052#revision2
Given a function $f : \{0,1\}^n \to \{+1,-1\}$, its {\em Fourier Entropy} is defined
to be $-\sum_S \widehat{f}(S)^2 \log \widehat{f}(S)^2$, where $\widehat{f}$ denotes the
Fourier transform of $f$.
In the analysis of Boolean functions, an outstanding open question
is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy
Influence (FEI) Conjecture, asserting that the Fourier Entropy of any Boolean
function $f$ is bounded above, up to a constant factor, by the total influence (= average sensitivity) of $f$.
In this paper we give several upper bounds on the Fourier Entropy.
We first give upper bounds on the Fourier Entropy of Boolean functions in terms of several
complexity measures that are known to be bigger than the influence. These
complexity measures include, among others, the logarithm of the number of
leaves and the average depth of a parity decision tree. We then show that for
the class of Linear Threshold Functions (LTF), the Fourier Entropy is
$O(\sqrt{n})$. It is known that the average sensitivity for the class
of LTF is $\Theta(\sqrt{n})$. We also establish a bound of
$O_d(n^{1-\frac{1}{4d+6}})$ for general degree-$d$ polynomial threshold functions.
Our proof is based on a new upper bound on the
\emph{derivative of noise sensitivity}. Next we proceed to show that the
FEI Conjecture holds for read-once formulas that use $\mathsf{AND}$, $\mathsf{OR}$,
$\mathsf{XOR}$, and $\mathsf{NOT}$
gates. The last result is independent of a result due to
O'Donnell and Tan [OT'13] for read-once formulas with
\emph{arbitrary} gates of bounded fan-in, but our proof is completely
elementary and very different from theirs. Finally, we give a general bound
involving the first and second moments of sensitivities of a function
(average sensitivity being the first moment), which holds for real-valued
functions as well.
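As a sanity check of the definitions above, the Fourier Entropy and the total influence can both be computed by brute force for small $n$. The following Python sketch is illustrative only and not part of the paper; it verifies the FEI inequality for the 2-bit AND function (with the hypothetical convention that AND outputs $-1$ on the all-ones input).

```python
import itertools
import math

def fourier_coefficients(f, n):
    """Brute-force Fourier transform of f : {0,1}^n -> {+1,-1}.

    Returns a dict mapping each subset S (a 0/1 indicator tuple) to
    f_hat(S) = E_x[ f(x) * (-1)^{sum_{i in S} x_i} ].
    """
    coeffs = {}
    for S in itertools.product([0, 1], repeat=n):
        total = sum(f(x) * (-1) ** sum(s * xi for s, xi in zip(S, x))
                    for x in itertools.product([0, 1], repeat=n))
        coeffs[S] = total / 2 ** n
    return coeffs

def fourier_entropy(coeffs):
    # H(f) = -sum_S f_hat(S)^2 log2 f_hat(S)^2  (zero coefficients contribute 0)
    return -sum(c * c * math.log2(c * c) for c in coeffs.values() if c != 0)

def total_influence(coeffs):
    # I(f) = sum_S |S| f_hat(S)^2, i.e. the average sensitivity of f
    return sum(sum(S) * c * c for S, c in coeffs.items())

# 2-bit AND, outputting -1 iff both inputs are 1 (an assumed convention)
and2 = lambda x: -1 if x == (1, 1) else 1
c = fourier_coefficients(and2, 2)
# All four squared coefficients equal 1/4, so H(f) = 2 and I(f) = 1;
# the FEI inequality holds here with constant factor 2.
```

For this function every squared Fourier coefficient is $1/4$, so the entropy is exactly $2$ while the total influence is $1$, consistent with the conjecture.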
Thu, 13 Sep 2018 14:49:15 +0300
https://eccc.weizmann.ac.il/report/2013/052#revision2
Revision 1
| Upper Bounds on Fourier Entropy |
Sourav Chakraborty,
Raghav Kulkarni,
Satyanarayana V. Lokam,
Nitin Saurabh
https://eccc.weizmann.ac.il/report/2013/052#revision1
Given a function $f : \{0,1\}^n \to \mathbb{R}$, its {\em Fourier Entropy} is defined to be $-\sum_S \widehat{f}(S)^2 \log \widehat{f}(S)^2$, where $\widehat{f}$ denotes the Fourier transform of $f$. This quantity arises in a number of applications, especially in the study of Boolean functions. An outstanding open question is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy Influence (FEI) Conjecture, asserting that the Fourier Entropy of any Boolean function $f$ is bounded above,
up to a constant factor, by the total influence (= average sensitivity) of $f$.
In this paper we give several upper bounds on the Fourier Entropy of Boolean as well as
real-valued functions. We give a general bound involving the $(1+\delta)$-th moment of $|S|$ w.r.t. the distribution $\widehat{f}(S)^2$; the FEI conjecture needs the first moment of $|S|$. A variant of this bound uses the first and second moments of sensitivities (average sensitivity being the first moment). We also give upper bounds on the Fourier Entropy of Boolean functions in terms of several complexity measures that are known to be bigger than the influence. These complexity measures include, among others, the logarithm of the number of leaves and the average depth of a decision tree. Finally, we show that the FEI Conjecture holds for two special classes of functions, namely linear threshold functions and read-once formulas.
Thu, 04 Apr 2013 05:51:52 +0300
https://eccc.weizmann.ac.il/report/2013/052#revision1
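The first and second moments of sensitivities mentioned in the variant bound can likewise be computed directly for small $n$. A minimal illustrative Python sketch (not part of the paper), again using the 2-bit AND function as a hypothetical example:

```python
import itertools

def sensitivity_moments(f, n):
    """First and second moments of pointwise sensitivity of f : {0,1}^n -> {+1,-1}.

    s(f, x) counts the coordinates i whose flip changes f(x); the first
    moment E[s] is the average sensitivity (= total influence).
    """
    s_vals = []
    for x in itertools.product([0, 1], repeat=n):
        flips = (x[:i] + (1 - x[i],) + x[i + 1:] for i in range(n))
        s_vals.append(sum(1 for y in flips if f(y) != f(x)))
    m1 = sum(s_vals) / 2 ** n
    m2 = sum(s * s for s in s_vals) / 2 ** n
    return m1, m2

# 2-bit AND (assumed to output -1 iff both inputs are 1): the pointwise
# sensitivities over the four inputs are 0, 1, 1, 2, so the first moment
# is 1.0 and the second moment is 1.5.
and2 = lambda x: -1 if x == (1, 1) else 1
m1, m2 = sensitivity_moments(and2, 2)
```

Here the first moment agrees with the total influence computed via Fourier coefficients, as Parseval's identity guarantees.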
Paper TR13-052
| Upper Bounds on Fourier Entropy |
Sourav Chakraborty,
Raghav Kulkarni,
Satyanarayana V. Lokam,
Nitin Saurabh
https://eccc.weizmann.ac.il/report/2013/052
Given a function $f : \{0,1\}^n \to \mathbb{R}$, its {\em Fourier Entropy} is defined to be $-\sum_S \widehat{f}(S)^2 \log \widehat{f}(S)^2$, where $\widehat{f}$ denotes the Fourier transform of $f$. This quantity arises in a number of applications, especially in the study of Boolean functions. An outstanding open question is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy Influence (FEI) Conjecture, asserting that the Fourier Entropy of any Boolean function $f$ is bounded above,
up to a constant factor, by the total influence (= average sensitivity) of $f$.
In this paper we give several upper bounds on the Fourier Entropy of Boolean as well as
real-valued functions. We give a general bound involving the $(1+\delta)$-th moment of $|S|$ w.r.t. the distribution $\widehat{f}(S)^2$; the FEI conjecture needs the first moment of $|S|$. A variant of this bound uses the first and second moments of sensitivities (average sensitivity being the first moment). We also give upper bounds on the Fourier Entropy of Boolean functions in terms of several complexity measures that are known to be bigger than the influence. These complexity measures include, among others, the logarithm of the number of leaves and the average depth of a decision tree. Finally, we show that the FEI Conjecture holds for two special classes of functions, namely linear threshold functions and read-once formulas.
Wed, 03 Apr 2013 23:01:55 +0300
https://eccc.weizmann.ac.il/report/2013/052