ECCC Report TR10-022
https://eccc.weizmann.ac.il/report/2010/022
Comments and revisions published for TR10-022
Tue, 23 Feb 2010 20:32:38 +0200
Paper TR10-022
Lower Bounds and Hardness Amplification for Learning Shallow Monotone Formulas
Vitaly Feldman, Homin Lee, Rocco Servedio
Much work has been done on learning various classes of ``simple'' monotone functions under the uniform distribution. In this paper we give the first unconditional lower bounds for learning problems of this sort by showing that polynomial-time algorithms cannot learn constant-depth monotone Boolean formulas under the uniform distribution in the well-studied Statistical Query model.
Using a recent characterization of Strong Statistical Query learnability due to Feldman, we first show that depth-3 monotone formulas of size $n^{o(1)}$ cannot be learned by any polynomial-time Statistical Query algorithm to accuracy $1 - 1/(\log n)^{\Omega(1)}.$ We then build on this result to show that depth-4 monotone formulas of size $n^{o(1)}$ cannot be learned even to a certain ${\frac 1 2} + o(1)$ accuracy in polynomial time. This improved hardness is achieved using a general technique that we introduce for amplifying the hardness of ``mildly hard'' learning problems in either the PAC or Statistical Query framework. This hardness amplification for learning builds on the ideas in the work of O'Donnell on hardness amplification for approximating functions using small circuits, and may be of independent interest.
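The Statistical Query model referred to above replaces direct access to labeled examples with an oracle that answers statistical queries about the target function up to a tolerance. As a minimal illustration (our own sketch, not taken from the paper; all names and the sampling-based simulation are assumptions), such an oracle can be simulated in Python:

```python
import random

def make_sq_oracle(target, dist_sampler, tau, n_samples=200000):
    """Simulate a Statistical Query oracle STAT(f, D): given a query
    predicate psi(x, y) -> [0, 1] and tolerance tau, return an estimate
    of E_{x ~ D}[psi(x, f(x))] accurate to within +/- tau.
    Illustrative only: here the oracle is simulated by drawing enough
    samples that the empirical mean is within tau with high probability."""
    def oracle(psi):
        draws = (dist_sampler() for _ in range(n_samples))
        total = sum(psi(x, target(x)) for x in draws)
        return total / n_samples
    return oracle

# Toy example: the uniform distribution on {0,1}^5, with a small
# monotone target f(x) = x_0 AND x_1.
n = 5
random.seed(0)
sampler = lambda: tuple(random.randint(0, 1) for _ in range(n))
f = lambda x: x[0] & x[1]
oracle = make_sq_oracle(f, sampler, tau=0.01)

# Query the agreement of the first coordinate with the label;
# for this target, E[1{x_0 = f(x)}] = 3/4.
est = oracle(lambda x, y: 1.0 if x[0] == y else 0.0)
```

An SQ learning algorithm may interact with the target only through such queries; the paper's lower bounds show that polynomially many queries of inverse-polynomial tolerance do not suffice for the monotone formula classes above.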