ECCC Report TR17-116, https://eccc.weizmann.ac.il/report/2017/116 (published Wed, 05 Jul 2017 17:57:49 +0300)
Mixing Implies Strong Lower Bounds for Space Bounded Learning
Michal Moshkovitz, Dana Moshkovitz
With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses H on one side and all possible labeled examples X on the other side; a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses graph. We prove that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than 2^(Omega(log^2 |H|)) memory states unless the learner uses at least |H|^Omega(1) labeled examples. In contrast, there is a learner that uses 2^(Theta(log|X| log|H|)) memory states and only Theta(log|H|) labeled examples, and there is a learner that uses only |H| memory states but a large number, Theta(|H| log|H|), of labeled examples.

Our work builds on a combinatorial framework we suggested in a previous work for proving lower bounds on space bounded learning. The strong lower bound is obtained by considering a new notion of pseudorandomness for a sequence of graphs that represents the learner.
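As an illustrative sketch (not from the paper itself), the hypotheses graph can be built explicitly for any small finite class. The toy class below, point functions over a four-element domain, is an assumption chosen here only for concreteness; the construction is the same for any finite hypothesis class: connect each hypothesis h to every labeled example (x, b) with h(x) = b.

```python
from itertools import product

domain = range(4)  # toy instance space (an assumption for illustration)

# Hypothesis class H: point functions h_a(x) = 1 iff x == a.
hypotheses = {f"h_{a}": (lambda x, a=a: int(x == a)) for a in domain}

# X: all possible labeled examples (x, b).
labeled_examples = list(product(domain, [0, 1]))

# Bipartite edge set of the hypotheses graph:
# hypothesis <-> every labeled example consistent with it.
edges = {
    name: [(x, b) for (x, b) in labeled_examples if h(x) == b]
    for name, h in hypotheses.items()
}

for name, nbrs in sorted(edges.items()):
    print(name, nbrs)
```

Each hypothesis ends up adjacent to exactly one labeled example per domain point, so every hypothesis has degree |domain|; mixing, the property the paper's lower bound hinges on, is a pseudorandomness condition on how these edges are distributed.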