ECCC Report TR23-080
https://eccc.weizmann.ac.il/report/2023/080
Published: Thu, 01 Jun 2023 03:07:23 +0300

Improved Learning from Kolmogorov Complexity
Valentine Kabanets, Halley Goldberg
Carmosino, Impagliazzo, Kabanets, and Kolokolova (CCC, 2016) showed that the existence of natural properties in the sense of Razborov and Rudich (JCSS, 1997) implies PAC learning algorithms in the sense of Valiant (Comm. ACM, 1984) for boolean functions in $\P/\poly$, under the uniform distribution and with membership queries. It remains an open problem to obtain from natural properties learning algorithms that do not rely on membership queries but instead use only randomly drawn labeled examples.
Natural properties may be understood as an average-case version of MCSP, the problem of deciding the minimum size of a circuit computing a given truth-table. Problems related to MCSP include those concerning time-bounded Kolmogorov complexity. MKTP, for example, asks for the KT-complexity of a given string. KT-complexity is a relaxation of circuit size, as it does away with the requirement that a short description of a string be interpreted as a boolean circuit.
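To make the relaxation mentioned above concrete, one standard formulation of KT-complexity (following Allender et al.) can be stated as follows; here $U$ is a fixed universal machine with random access to the description $d$:

```latex
% One standard formulation of KT-complexity; U is a fixed universal Turing
% machine given random (oracle) access to the description d.
\[
  \mathrm{KT}(x) \;=\; \min\bigl\{\, |d| + t \;:\;
    \forall i \le |x|+1,\; U^{d}(i) \text{ outputs } x_i \text{ within } t \text{ steps} \,\bigr\},
\]
% where x_{|x|+1} is a special end-of-string symbol. Charging for the running
% time t as well as the description length |d| -- rather than requiring d to
% encode a boolean circuit -- is what makes KT a relaxation of circuit size.
```

A truth-table of small circuit complexity always has small KT-complexity, but not conversely, which is why average-case easiness of MKTP is a distinct assumption from the existence of natural properties.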
In this work, under the assumption that MKTP and the related problem $\mathrm{MK}^t\mathrm{P}$ are easy on average, we obtain learning algorithms for boolean functions in $\P/\poly$ that
\begin{itemize}
\item work over any distribution $D$ samplable by a family of polynomial-size circuits (given explicitly in the case of MKTP),
\item only use randomly drawn labeled examples from $D$, and
\item are agnostic (do not require the target function to belong to the hypothesis class).
\end{itemize}
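For clarity on the last item, the agnostic guarantee can be stated in its standard form: with respect to the hypothesis class $C$, the learner's output competes with the best function in $C$ even when no member of $C$ fits the labels exactly:

```latex
% Standard agnostic-PAC guarantee: for accuracy parameter \varepsilon and
% confidence \delta, the learner outputs a hypothesis h such that, with
% probability at least 1 - \delta over its samples and randomness,
\[
  \Pr_{(x,y) \sim D}\bigl[\, h(x) \neq y \,\bigr]
  \;\le\;
  \min_{c \in C} \Pr_{(x,y) \sim D}\bigl[\, c(x) \neq y \,\bigr] \;+\; \varepsilon .
\]
% In the realizable (non-agnostic) setting the minimum on the right is 0,
% so the agnostic requirement strictly generalizes it.
```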
Our results build upon the recent work of Hirahara and Nanashima (FOCS, 2021), who showed similar learning consequences under the stronger assumption that $\NP$ is easy on average.