ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR17-116 | 5th July 2017 15:43

Mixing Implies Strong Lower Bounds for Space Bounded Learning





Authors: Michal Moshkovitz, Dana Moshkovitz
Publication: 5th July 2017 17:57
Downloads: 1045


Abstract:

With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses H on one side and all possible labeled examples X on the other side; a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses graph. We prove that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than 2^(Omega((log|H|)^2)) memory states unless the learner uses at least |H|^(Omega(1)) labeled examples. In contrast, there is a learner that uses 2^(Theta(log|X| log|H|)) memory states and only Theta(log|H|) labeled examples, and there is a learner that uses only |H| memory states but Theta(|H| log|H|) labeled examples. Our work builds on a combinatorial framework we suggested in previous work for proving lower bounds on space bounded learning. The strong lower bound is obtained by considering a new notion of pseudorandomness for a sequence of graphs that represents the learner.
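To make the hypotheses graph concrete, here is a minimal Python sketch (our illustration, not taken from the paper): it builds the bipartite consistency graph for a hypothetical toy class, namely all Boolean functions on a few points, and uses the second singular value of the degree-normalized biadjacency matrix as a standard spectral proxy for mixing. The toy class, the variable names, and the spectral test are all assumptions made for illustration.

import itertools
import numpy as np

n = 4  # toy domain: points 0, 1, ..., n-1 (an assumption for illustration)

# Hypothetical toy class H: all Boolean functions on n points, as bit tuples.
hypotheses = list(itertools.product([0, 1], repeat=n))

# X: all possible labeled examples, i.e. (point, label) pairs.
labeled_examples = [(x, b) for x in range(n) for b in (0, 1)]

# Biadjacency matrix of the hypotheses graph: A[i, j] = 1 iff hypothesis i
# is consistent with labeled example j, i.e. it assigns point x the label b.
A = np.zeros((len(hypotheses), len(labeled_examples)))
for i, h in enumerate(hypotheses):
    for j, (x, b) in enumerate(labeled_examples):
        if h[x] == b:
            A[i, j] = 1.0

# Spectral proxy for mixing: this toy graph is biregular (each hypothesis is
# consistent with n examples; each labeled example with |H|/2 hypotheses),
# so normalize by sqrt(d_left * d_right) and inspect the singular values.
d_left = A.sum(axis=1)[0]
d_right = A.sum(axis=0)[0]
M = A / np.sqrt(d_left * d_right)
s = np.linalg.svd(M, compute_uv=False)
print("top two singular values:", s[0], s[1])
# s[0] equals 1 by biregularity; a small s[1] is an expander-mixing-lemma
# style certificate that edges are spread nearly uniformly, i.e. mixing.

The spectral gap here is only one common way to certify mixing; the paper's own notion should be taken from its definitions, and this sketch is meant solely to show what the vertices and edges of a hypotheses graph look like.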


