ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR02-060 | 15th July 2002 00:00

New Lower Bounds for Statistical Query Learning





Authors: Ke Yang
Publication: 3rd November 2002 19:32
Downloads: 3427


Abstract:

We prove two lower bounds for the Statistical Query (SQ) learning
model. The first lower bound concerns weak learning. We prove that
for a concept class of SQ-dimension $d$, a running time of
$\Omega(d/\log d)$ is needed. The SQ-dimension of a concept class is
defined to be the maximum number of concepts that are ``uniformly
correlated'', in that every pair of them has nearly the same
correlation. This lower bound matches the upper bound
in~\cite{BFJ+94} up to a logarithmic factor. We prove this lower
bound against an ``honest SQ-oracle'', which yields a stronger
result than lower bounds against the more frequently used
``adversarial SQ-oracles''. The second lower bound is more general:
it gives a continuous trade-off between the ``advantage'' of an
algorithm in learning the target function and the number of queries
it needs to make, where the advantage of an algorithm is the
probability that it correctly predicts a label minus the probability
that it does not. Both lower bounds extend and/or strengthen
previous results, and they solve an open problem left open
in~\cite{Y01}.
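
To make the two central quantities concrete (our paraphrase of the
abstract; the symbols $D$, $h$, $\gamma$ and $\beta$ below are
illustrative and not taken from the paper), for $\pm 1$-valued
concepts and a hypothesis $h$ for the target $f$ under a
distribution $D$, the advantage can be written as
$$\mathrm{adv}(h) \;=\; \Pr_{x\sim D}[h(x)=f(x)] \;-\;
\Pr_{x\sim D}[h(x)\neq f(x)] \;=\; 2\Pr_{x\sim D}[h(x)=f(x)]-1,$$
and ``uniformly correlated'' concepts $f_1,\dots,f_d$ are, roughly,
those satisfying
$$\bigl|\langle f_i,f_j\rangle_D - \gamma\bigr| \;\le\; \beta
\qquad \text{for all } i\neq j,$$
where $\langle f_i,f_j\rangle_D = \mathbf{E}_{x\sim D}[f_i(x)f_j(x)]$
is the correlation, $\gamma$ stands for a common correlation value,
and $\beta$ for a small slack; the paper's formal definition pins
these parameters down.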

A preliminary version of this paper appeared in the Proceedings of
the Fifteenth Annual Conference on Computational Learning Theory
(COLT 2002).



ISSN 1433-8092