We give a simpler proof, via query elimination, of a result due to O'Donnell, Saks, Schramm and Servedio, which shows a lower bound on the zero-error randomized query complexity of a function $f$ in terms of the maximum influence of any variable of $f$. Our lower bound also applies to ... more >>>
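For context, the bound of O'Donnell, Saks, Schramm and Servedio can be stated (in one standard normalization; this restatement is a paraphrase for orientation, not quoted from the paper) as
$$R_0(f) \;\geq\; \frac{\mathrm{Var}[f]}{\max_i \mathrm{Inf}_i(f)},$$
where $\mathrm{Var}[f]$ is the variance of $f$ under the uniform distribution and $\mathrm{Inf}_i(f)=\Pr_x[f(x)\neq f(x^{\oplus i})]$ is the influence of the $i$th variable.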
We show that for any (partial) query function $f:\{0,1\}^n\rightarrow \{0,1\}$, the randomized communication complexity of $f$ composed with $\mathrm{Index}^n_m$ (with $m= \mathrm{poly}(n)$) is at least the randomized query complexity of $f$ times $\log n$. Here $\mathrm{Index}_m : [m] \times \{0,1\}^m \rightarrow \{0,1\}$ is defined as $\mathrm{Index}_m(x,y)= y_x$ (the $x$th bit ... more >>>
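Spelled out, the composed function in question is (a sketch under the usual lifting conventions, with one party holding the pointers $x_i$ and the other the indexed strings $y_i$):
$$\bigl(f \circ \mathrm{Index}^n_m\bigr)(x_1,\dots,x_n,\;y_1,\dots,y_n) \;=\; f\bigl(\mathrm{Index}_m(x_1,y_1),\dots,\mathrm{Index}_m(x_n,y_n)\bigr),$$
where $x_1,\dots,x_n \in [m]$ and $y_1,\dots,y_n \in \{0,1\}^m$.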
We show that all known classical adversary lower bounds on randomized query complexity are equivalent for total functions, and are equal to the fractional block sensitivity $\text{fbs}(f)$. These include the Kolmogorov complexity bound of Laplante and Magniez and the earlier relational adversary bound of Aaronson. For partial functions, we show ... more >>>
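For reference, fractional block sensitivity is the linear-programming relaxation of block sensitivity; in its usual formulation (stated here as background, with $x^{B}$ denoting $x$ with the block $B$ of coordinates flipped),
$$\mathrm{fbs}(f) \;=\; \max_{x}\; \max\Bigl\{\sum_{B} w_B \;:\; w_B\geq 0,\ \sum_{B\ni i} w_B \leq 1 \text{ for every } i\Bigr\},$$
where the inner maximum ranges over weightings of the sensitive blocks of $f$ at $x$, i.e. the sets $B\subseteq[n]$ with $f(x^{B})\neq f(x)$.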
We prove that for every decision tree, the absolute values of the Fourier coefficients of a given order $\ell\geq 1$ sum to at most $c^{\ell}\sqrt{{d\choose\ell}(1+\log n)^{\ell-1}},$ where $n$ is the number of variables, $d$ is the tree depth, and $c>0$ is an absolute constant. This bound is essentially tight and settles a ... more >>>
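As a sanity check on the form of the bound, the case $\ell=1$ reads
$$\sum_{|S|=1}\bigl|\hat f(S)\bigr| \;\leq\; c\,\sqrt{\binom{d}{1}(1+\log n)^{0}} \;=\; c\sqrt{d},$$
matching the known $O(\sqrt{d})$ bound on the level-one Fourier weight of depth-$d$ decision trees.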