ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR03-037 | 30th April 2003 00:00

Sampling Lower Bounds via Information Theory





Authors: Ziv Bar-Yossef
Publication: 30th May 2003 21:19
Downloads: 4434


Abstract:

We present a novel technique, based on the Jensen-Shannon divergence
from information theory, for proving lower bounds on the query complexity
of sampling algorithms that approximate functions over an arbitrary
domain and range. Unlike previous methods, our technique does not
use a reduction from a binary decision problem, but rather from
a multi-way decision problem. As a result, it gives
stronger bounds for functions that possess a large set of inputs,
any two of which exhibit a gap in the function value.
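As a rough illustration (our own sketch, not code from the paper), the quantity at the heart of such a multi-way reduction is the generalized Jensen-Shannon divergence of the k answer distributions induced by the k hard inputs: if an algorithm makes too few queries, this divergence is too small for the algorithm to tell the inputs apart. The function names and toy distributions below are hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, skipping zero-probability entries."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def jensen_shannon(dists, weights=None):
    """Generalized Jensen-Shannon divergence of k distributions:
    JS(p_1, ..., p_k) = H(sum_i w_i p_i) - sum_i w_i H(p_i).
    It is 0 iff all distributions coincide, and large when they are
    mutually well-separated (i.e., easy to distinguish)."""
    dists = [np.asarray(p, dtype=float) for p in dists]
    k = len(dists)
    if weights is None:
        weights = np.full(k, 1.0 / k)
    mixture = sum(w * p for w, p in zip(weights, dists))
    return entropy(mixture) - sum(w * entropy(p) for w, p in zip(weights, dists))

# Well-separated answer distributions have high JS divergence;
# identical ones have zero, so no algorithm can distinguish them.
print(jensen_shannon([[0.9, 0.1], [0.1, 0.9]]))  # ~0.531 bits
print(jensen_shannon([[0.5, 0.5], [0.5, 0.5]]))  # 0.0
```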

We demonstrate the technique with new query complexity lower bounds
for three fundamental problems: (1) the "election problem", for which
we obtain a quadratic improvement over previous bounds; (2) low-rank matrix
approximation, for which we prove the first lower bounds, showing that
the known algorithms for this problem are almost optimal;
and (3) matrix reconstruction.

In addition, we introduce a new method for proving lower
bounds on the expected query complexity of functions,
using the Kullback-Leibler divergence. We demonstrate its use
with a simple query complexity lower bound for estimating the mean.
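For intuition, here is our own sketch of the standard KL-based argument for the mean (an illustration under stated assumptions, not the paper's derivation): an algorithm that approximates the mean of a 0/1 input to within epsilon must in particular distinguish samples from Bern(1/2) and Bern(1/2 + 2*epsilon), and since q independent queries yield a joint distribution whose KL divergence is exactly q times that of a single query, q must be Omega(1/epsilon^2).

```python
import math

def kl_bernoulli(a, b):
    """KL divergence KL(Bern(a) || Bern(b)) in nats."""
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

# KL divergence is additive over independent samples, so q queries give
# at most q * KL(Bern(1/2 + 2*eps) || Bern(1/2)) nats of divergence.
# Distinguishing with constant success probability needs a constant
# amount, and KL ~ 8*eps^2 here, so q = Omega(1 / eps^2).
for eps in [0.1, 0.01, 0.001]:
    kl = kl_bernoulli(0.5 + 2 * eps, 0.5)
    print(f"eps={eps}: KL ~ {kl:.2e} nats, so q = Omega({1 / kl:.0f}) queries")
```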



ISSN 1433-8092