ECCC
Electronic Colloquium on Computational Complexity





Paper:

TR25-092 | 12th July 2025 05:33

Complexity-Theoretic Inductive Inference


Abstract:

Inductive inference, introduced by Solomonoff (Information and Control, 1964), is a foundational concept in knowledge acquisition, formulated as the task of extrapolating a sequence of symbols. In his seminal work, Solomonoff established a fundamental theorem for recursion-theoretic universal inductive inference, applicable to sequences generated by all Turing machines, based on the (uncomputable) task of computing Kolmogorov complexity.

In this work, we present a complexity-theoretic counterpart to Solomonoff’s theorem: we construct an efficient universal inductive inference algorithm, applicable to sequences efficiently generated by all randomized Turing machines, assuming that time-bounded Kolmogorov complexity can be efficiently approximated. This assumption is known to follow from the average-case easiness of NP. As corollaries, we obtain various worst-case learning algorithms, such as distributional learning and learning adaptively changing distributions, under the assumption that NP is easy on average.

Our inductive inference algorithm and its analysis build on techniques developed in meta-complexity theory. At a high level, the algorithm divides a given sequence of symbols into two parts, "advice" and "context", and extrapolates the next symbol of the context according to the time-bounded universal distribution given the advice. We prove that this algorithm avoids computationally deep strings, which are hard instances for every average-case algorithm.
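To illustrate the high-level description above, the following is a minimal sketch, not the paper's actual algorithm: it assumes hypothetical access to an efficient approximator of time-bounded Kolmogorov complexity (here crudely stood in for by compression length), splits the sequence into "advice" and "context", and extrapolates the symbol with the largest weight under the induced conditional distribution. All names and the split point are illustrative choices, not taken from the paper.

    # Illustrative sketch only. The real algorithm relies on an efficient approximator
    # of time-bounded Kolmogorov complexity (available if NP is easy on average);
    # zlib compression length is used below purely as a crude stand-in.
    import zlib

    def approx_Kt(s: bytes) -> float:
        """Stand-in for a time-bounded Kolmogorov complexity approximator."""
        return float(len(zlib.compress(s)))

    def predict_next(sequence: str, split: int, alphabet: str = "01") -> str:
        """Split the sequence into advice and context, then extrapolate the next
        symbol of the context according to a heuristic time-bounded universal
        distribution conditioned on the advice."""
        advice, context = sequence[:split], sequence[split:]

        # Weight of candidate b is roughly 2^{-Kt(context + b | advice)}; the
        # conditional complexity is approximated by a difference of compressed sizes.
        def weight(b: str) -> float:
            return 2.0 ** -(approx_Kt((advice + context + b).encode())
                            - approx_Kt(advice.encode()))

        weights = {b: weight(b) for b in alphabet}
        # Return the candidate with the largest conditional weight (normalization
        # over the alphabet would give the full predictive distribution).
        return max(weights, key=weights.get)

    if __name__ == "__main__":
        print(predict_next("0101010101" * 4, split=10))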


