ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




All reports by author Michal Moshkovitz:

TR17-116 | 5th July 2017
Michal Moshkovitz, Dana Moshkovitz

Mixing Implies Strong Lower Bounds for Space Bounded Learning

With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses $H$ on one side and all possible labeled examples $X$ on the other side, and a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses ...
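
As a concrete illustration of the bipartite construction described in this abstract, the sketch below builds such a graph for a toy class of threshold hypotheses over a three-point domain. The class, the domain, and all names are assumptions chosen for illustration, not taken from the paper.

    # Sketch of the hypotheses graph for an assumed toy class of threshold functions.
    # Left side: hypotheses H; right side: all labeled examples X = domain x {0,1};
    # a hypothesis is joined to every labeled example it is consistent with.
    from itertools import product

    domain = [0, 1, 2]                                    # toy instance space
    hypotheses = {f"h_{t}": (lambda x, t=t: int(x >= t))  # H: thresholds x >= t
                  for t in range(len(domain) + 1)}
    labeled_examples = list(product(domain, [0, 1]))      # X: (point, label) pairs

    # Bipartite adjacency: hypothesis -> set of consistent labeled examples.
    edges = {name: {(x, y) for (x, y) in labeled_examples if h(x) == y}
             for name, h in hypotheses.items()}

    for name, consistent in sorted(edges.items()):
        print(name, sorted(consistent))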


TR17-017 | 5th February 2017
Michal Moshkovitz, Dana Moshkovitz

Mixing Implies Lower Bounds for Space Bounded Learning

One can learn any hypothesis class $H$ with $O(\log|H|)$ labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires $|X|^{O(\log|H|)}$ memory states, where $X$ is the set of all labeled examples. A question that arises is how many labeled examples are needed in ...
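
To make the counting in this abstract concrete, here is a small back-of-the-envelope calculation; the sizes of $H$ and $X$ below are arbitrary assumed values, not figures from the paper.

    # Rough illustration of the memory cost of memorizing examples: learning needs
    # about m = log2|H| labeled examples, and remembering which m elements of X
    # were seen takes on the order of |X|^m memory states.
    import math

    H_size = 1024      # assumed |H|: number of hypotheses
    X_size = 10_000    # assumed |X|: number of possible labeled examples

    m = math.ceil(math.log2(H_size))   # examples needed: O(log|H|)
    memory_states = X_size ** m        # states to store all m examples: |X|^m

    print(f"examples needed m = {m}")
    print(f"memory states |X|^m ~ {memory_states:.3e}")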




ISSN 1433-8092