We solve an open problem of Kushilevitz and Nisan
(1997) in communication complexity. Let R_{eps}(f)
and D^{mu}_{eps}(f) denote the randomized and
mu-distributional communication complexities of
f, respectively (eps a small constant). Yao's
well-known Minimax Principle states that
R_{eps}(f) = max_{mu} { D^{mu}_{eps}(f) }.
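For reference, the two directions of this identity can be stated
as follows (a standard sketch, not specific to this paper).
Fixing the best random coins of an optimal randomized protocol
gives, for every mu, a deterministic protocol with error at most
eps under mu, so

    \[
      \max_{\mu} D^{\mu}_{\varepsilon}(f) \;\le\; R_{\varepsilon}(f),
    \]

and the reverse inequality (hence equality) follows from von
Neumann's minimax theorem.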
Kushilevitz and Nisan (1997) ask whether this equality
is approximately preserved if the maximization is
taken over product distributions only, rather than
all distributions mu:
R_{eps}(f) = (max_{mu product} { D^{mu}_{eps}(f) })^{O(1)} ?
We refute this hypothesis in the strongest terms.
Namely, we show the existence of a function
f : {-1,1}^n x {-1,1}^n -> {-1,1} for which
max_{mu product} { D^{mu}_{eps}(f) } = O(1)
but
R_{1/3}(f) = Omega(n).
Furthermore, f has discrepancy O(2^{-n(1/2-eps)}),
which is almost the smallest possible. In particular,
f is among the hardest functions in every major model
of communication. Yet the distributional method
restricted to product distributions can certify at best
a trivial Omega(1) communication lower bound for f.
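To make the discrepancy quantity concrete, here is a minimal
brute-force sketch (illustrative only; the inner-product example
and the helper discrepancy() are our toy assumptions, not the
construction in this paper):

    import itertools
    import numpy as np

    def discrepancy(M, mu):
        """Brute-force disc_mu(M): the maximum, over all combinatorial
        rectangles S x T, of |sum_{(i,j) in S x T} mu[i,j] * M[i,j]|,
        where M is a +/-1 sign matrix and mu a distribution on its
        entries. Exponential time; toy sizes only."""
        rows, cols = M.shape
        row_subsets = [s for r in range(1, rows + 1)
                       for s in itertools.combinations(range(rows), r)]
        col_subsets = [t for c in range(1, cols + 1)
                       for t in itertools.combinations(range(cols), c)]
        best = 0.0
        for S in row_subsets:
            for T in col_subsets:
                corr = sum(mu[i, j] * M[i, j] for i in S for j in T)
                best = max(best, abs(corr))
        return best

    # Toy example: inner product on n = 2 bits under the uniform
    # distribution, which is product (uniform x uniform).
    n = 2
    size = 2 ** n
    M = np.array([[(-1) ** bin(x & y).count("1") for y in range(size)]
                  for x in range(size)])
    mu = np.full((size, size), 1.0 / size ** 2)
    print(discrepancy(M, mu))

Here mu is a product distribution; the separation stated next
says that for the function f above, every product mu leaves the
discrepancy at Omega(1), while a suitable nonproduct mu drives it
down to O(2^{-n(1/2-eps)}).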
Our result also gives an essentially optimal separation,
Omega(1) vs. O(2^{-n(1/2-eps)}), between discrepancy
under product and nonproduct distributions, improving
on the author's recent result (Sherstov 2007).
Finally, we give an essentially optimal separation,
O(1) vs. Omega(N^{1-eps}), between the
statistical-query complexity and sign rank of an
N-by-N sign matrix. This settles an open question
recently posed by the author (Sherstov 2007) and
completes the taxonomy of the main complexity measures
of sign matrices.
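For reference, the sign rank of a sign matrix M in {-1,+1}^{N x N}
is the least rank of a real matrix with the same sign pattern
(a definitional reminder; the notation rk_pm below is ours):

    \[
      \mathrm{rk}_{\pm}(M) \;=\; \min\bigl\{ \operatorname{rank}(A) :
        A \in \mathbb{R}^{N \times N},\ A_{ij} M_{ij} > 0
        \text{ for all } i, j \bigr\}.
    \]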