ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #1 to TR15-126 | 23rd March 2016 04:11

Memory, Communication, and Statistical Queries





Authors: Jacob Steinhardt, Gregory Valiant, Stefan Wager
Accepted on: 23rd March 2016 04:11
Downloads: 1091


Abstract:

If a concept class can be represented with a certain amount of memory, can it be efficiently learned with the same amount of memory? What concepts can be efficiently learned by algorithms that extract only a few bits of information from each example? We introduce a formal framework for studying these questions, and investigate the relationship between the fundamental resources of memory or communication and the sample complexity of the learning task. We relate our memory-bounded and communication-bounded learning models to the well-studied statistical query model. This connection can be leveraged to obtain both upper and lower bounds: we show several strong lower bounds on learning parity functions with bounded communication (for example, that any multi-round multiparty protocol for learning parity functions over length-$n$ inputs, in which each party receives a list of $\le n/4$ examples but is limited to at most $n/16$ bits of communication, requires an exponential number of parties), as well as the first upper bounds on solving generic sparse linear regression problems with limited memory.
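
To make the parity-learning task in the lower bound concrete, here is a minimal sketch (not from the paper; the helper name solve_gf2 and all parameters are illustrative) of the memory- and communication-unconstrained baseline: a learner that sees $2n$ labelled examples recovers the hidden parity vector by Gaussian elimination over GF(2), at the cost of holding all examples, roughly $n^2$ bits, in memory at once. The abstract's lower bound concerns exactly what happens when this kind of unrestricted access is forbidden: with each party limited to $n/16$ bits of communication, exponentially many parties are needed. Python with NumPy:

import numpy as np

def solve_gf2(X, y):
    """Recover s with X s = y (mod 2) via Gaussian elimination over GF(2).
    Illustrative baseline: it keeps all examples in memory (~n^2 bits),
    precisely the resource the paper's memory/communication bounds restrict."""
    X, y = X.copy(), y.copy()
    m, n = X.shape
    row, pivots = 0, []
    for col in range(n):
        # find a pivot row for this column
        r = next((i for i in range(row, m) if X[i, col]), None)
        if r is None:
            continue  # no pivot: this coordinate stays 0 below
        X[[row, r]], y[[row, r]] = X[[r, row]], y[[r, row]]  # swap rows
        for i in range(m):
            if i != row and X[i, col]:
                X[i] ^= X[row]  # clear this column in every other row
                y[i] ^= y[row]
        pivots.append(col)
        row += 1
    s = np.zeros(n, dtype=np.uint8)
    for i, col in enumerate(pivots):
        s[col] = y[i]
    return s

rng = np.random.default_rng(0)
n = 32
s_true = rng.integers(0, 2, n, dtype=np.uint8)       # hidden parity vector
X = rng.integers(0, 2, (2 * n, n), dtype=np.uint8)   # 2n labelled examples
y = (X @ s_true) % 2                                 # labels <x, s> mod 2
assert np.array_equal(solve_gf2(X, y), s_true)       # full-memory recovery

With $2n$ random examples the system has full column rank over GF(2) with overwhelming probability, so the recovery is exact.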



Changes to previous version:

Substantial clean-up of writing.


Paper:

TR15-126 | 27th July 2015 09:49

Memory, Communication, and Statistical Queries





ISSN 1433-8092