We present a novel technique, based on the Jensen-Shannon divergence
from information theory, to prove lower bounds on the query complexity
of sampling algorithms that approximate functions over an arbitrary
domain and range. Unlike previous methods, our technique does not rely
on a reduction from a binary decision problem, but rather from
a multi-way decision problem. As a result, it gives
stronger bounds for functions that possess a large set of inputs,
any two of which exhibit a gap in the function value.
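
For reference, the standard two-distribution form of the divergence in
question is recalled below; this is the textbook definition, not the
paper's specific formulation, and the multi-way reduction presumably
uses its generalization to several distributions.
$$
\mathrm{JS}(P \,\|\, Q) \;=\; \tfrac{1}{2}\,\mathrm{KL}(P \,\|\, M) \;+\; \tfrac{1}{2}\,\mathrm{KL}(Q \,\|\, M),
\qquad M = \tfrac{1}{2}(P + Q).
$$
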
We demonstrate the technique with new query complexity lower bounds
for three fundamental problems: (1) the "election problem", for which
we obtain a quadratic improvement over previous bounds, (2) low-rank
matrix approximation, for which we prove the first lower bounds,
showing that existing algorithms for this problem are nearly optimal,
and (3) matrix reconstruction.

In addition, we introduce a new method for proving lower
bounds on the expected query complexity of functions,
using the Kullback-Leibler divergence. We demonstrate its use
with a simple query complexity lower bound for the mean.
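
For completeness, the Kullback-Leibler divergence referred to here is
the standard quantity (stated for discrete distributions; this is a
general reminder rather than the paper's notation):
$$
\mathrm{KL}(P \,\|\, Q) \;=\; \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}.
$$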