Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




TR23-080 | 1st June 2023 03:06

Improved Learning from Kolmogorov Complexity


Authors: Halley Goldberg, Valentine Kabanets
Publication: 1st June 2023 03:07
Downloads: 523


Carmosino, Impagliazzo, Kabanets, and Kolokolova (CCC, 2016) showed that the existence of natural properties in the sense of Razborov and Rudich (JCSS, 1997) implies PAC learning algorithms in the sense of Valiant (Comm. ACM, 1984), for boolean functions in $\P/\poly$, under the uniform distribution and with membership queries. It remains an open problem to obtain, from natural properties, learning algorithms that do not rely on membership queries but instead use randomly drawn labeled examples.
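For orientation, the learning model in that result is the standard PAC model under the uniform distribution with membership queries; the formulation below is a sketch in our own notation, not taken from the abstract:

```latex
% PAC learning under the uniform distribution with membership queries
% (standard definitions; notation is illustrative).
% A class C of boolean functions f : \{0,1\}^n \to \{0,1\} is learnable in
% this model if there is a randomized algorithm A that, for every f \in C
% and every \epsilon, \delta > 0, given oracle (membership-query) access
% to f, runs in time \mathrm{poly}(n, 1/\epsilon, 1/\delta) and, with
% probability at least 1 - \delta, outputs a circuit h satisfying
\[
  \Pr_{x \sim U_n}\bigl[\, h(x) \neq f(x) \,\bigr] \;\le\; \epsilon .
\]
```

The open problem mentioned above asks for the same guarantee when $A$ sees only independent labeled samples $(x, f(x))$ with $x \sim U_n$, with no query access to $f$.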

Natural properties may be understood as an average-case version of MCSP, the problem of deciding the minimum size of a circuit computing a given truth-table. Problems related to MCSP include those concerning time-bounded Kolmogorov complexity. MKTP, for example, asks for the KT-complexity of a given string. KT-complexity is a relaxation of circuit size, as it does away with the requirement that a short description of a string be interpreted as a boolean circuit.
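For concreteness, KT-complexity (in the standard formulation of Allender et al., "Power from Random Strings", SICOMP 2006; the display below is a sketch from that literature, not from this abstract) charges the description length plus the time to decode any one bit:

```latex
% KT-complexity of a string x, relative to a fixed universal machine U
% given random access to the description p (standard definition, shown
% here for orientation):
\[
  \mathrm{KT}(x) \;=\; \min_{p,\,t}\,\Bigl\{\, |p| + t \;:\;
    \forall i \le |x|+1,\ \forall b \in \{0,1,\ast\},\
    U^{p}(i,b) \text{ accepts in} \le t \text{ steps iff } x_i = b \,\Bigr\},
\]
% where x_i = \ast for i = |x|+1, marking the end of the string.
```

Because the per-bit decoding time $t$ is charged additively, a truth-table computed by a small circuit has small KT-complexity; but a short description need not itself be a circuit, which is the sense in which KT relaxes circuit size.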
In this work, under the assumption that MKTP and the related problem MKtP are easy on average, we obtain learning algorithms for boolean functions in $\P/\poly$ that
- work over any distribution $D$ samplable by a family of polynomial-size circuits (given explicitly in the case of MKtP),
- use only randomly drawn labeled examples from $D$, and
- are agnostic (do not require the target function to belong to the hypothesis class).
Our results build upon the recent work of Hirahara and Nanashima (FOCS, 2021), who showed similar learning consequences under the stronger assumption that $\NP$ is easy on average.

ISSN 1433-8092