Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




TR22-072 | 15th May 2022 12:21

Probabilistic Kolmogorov Complexity with Applications to Average-Case Complexity



Understanding the relationship between the worst-case and average-case complexities of $\mathrm{NP}$ and of other subclasses of $\mathrm{PH}$ is a long-standing problem in complexity theory. Over the last few years, much progress has been made on this front through the investigation of meta-complexity: the complexity of problems that refer to the complexity of the input string $x$ (e.g., given a string $x$, estimate its time-bounded Kolmogorov complexity). In particular, [Hir21] employed techniques from meta-complexity to show that if $\mathrm{DistNP} \subseteq \mathrm{AvgP}$ then $\mathrm{UP} \subseteq \mathrm{DTIME}[2^{O(n/\log n)}]$. While this and related results [HN21, CHV22] offer exciting progress after a long gap, they do not survive in the setting of randomized computations: roughly speaking, "randomness" is the opposite of "structure", and upper bounding the amount of structure (time-bounded Kolmogorov complexity) of different objects is crucial in recent applications of meta-complexity. This limitation is significant, since randomized computations are ubiquitous in algorithm design and give rise to a more robust theory of average-case complexity [IL90].

In this work, we develop a probabilistic theory of meta-complexity, by incorporating randomness into the notion of complexity of a string $x$. This is achieved through a new probabilistic variant of time-bounded Kolmogorov complexity that we call $\mathrm{pK}^t$ complexity. Informally, $\mathrm{pK}^t(x)$ measures the complexity of $x$ when shared randomness is available to all parties involved in a computation. By porting key results from meta-complexity to the probabilistic domain of $\mathrm{pK}^t$ complexity and its variants, we are able to establish new connections between worst-case and average-case complexity in the important setting of probabilistic computations:
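The informal description of $\mathrm{pK}^t$ above can be made precise along the following lines. This is only a sketch: the acceptance threshold $2/3$ and the length of the shared random string $w$ are illustrative choices, and the exact parameters used in the paper may differ.

```latex
% Sketch of a probabilistic time-bounded Kolmogorov complexity measure.
% U is a fixed universal machine, t = t(|x|) is a time bound, and w plays
% the role of the random string shared by all parties in the computation.
\[
  \mathrm{pK}^{t}(x) \;=\; \min\Bigl\{\, k \;\Bigm|\;
    \Pr_{w \sim \{0,1\}^{t(|x|)}}\Bigl[\, \exists\, p \in \{0,1\}^{\le k}
      \text{ such that } U(p, w) \text{ outputs } x
      \text{ within } t(|x|) \text{ steps} \,\Bigr]
    \;\ge\; \tfrac{2}{3} \,\Bigr\}.
\]
```

Intuitively, $x$ has small $\mathrm{pK}^t$ complexity if, for most choices of the shared random string $w$, there is a short program that produces $x$ quickly given access to $w$.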

- If $\mathrm{DistNP} \subseteq \mathrm{AvgBPP}$, then $\mathrm{UP}\subseteq\mathrm{RTIME}\!\left[2^{O(n/\log n)}\right]$.

- If $\mathrm{Dist\Sigma}^{\mathrm{P}}_{2} \subseteq \mathrm{AvgBPP}$, then $\mathrm{AM} \subseteq \mathrm{BPTIME}\!\left[2^{O(n/\log n)}\right]$.

- In the fine-grained setting [CHV22], we get $\mathrm{UTIME}[2^{O(\sqrt{n\log n})}]\subseteq \mathrm{RTIME}[2^{O(\sqrt{n\log n})}]$ and $\mathrm{AMTIME}[2^{O(\sqrt{n\log n})}] \subseteq \mathrm{BPTIME}[2^{O(\sqrt{n\log n})}]$ from stronger average-case assumptions.

- If $\mathrm{DistPH}\subseteq\mathrm{AvgBPP}$, then $\mathrm{PH}\subseteq\mathrm{BPTIME}\!\left[2^{O(n/\log n)}\right]$. More generally, for any $\ell \geq 0$, if $\mathrm{Dist\Sigma}_{\ell+2}^{\mathrm{P}}\subseteq\mathrm{AvgBPP}$ then $\mathrm{\Sigma}_{\ell}^{\mathrm{P}}\subseteq\mathrm{BPTIME}\!\left[2^{O(n/\log n)}\right]$.

- Strengthening a result from [HN21], we show that if $\mathrm{DistNP}\subseteq\mathrm{AvgBPP}$ then polynomial size Boolean circuits can be agnostically PAC learned under any unknown $\mathrm{P}/\mathrm{poly}$-samplable distribution in polynomial time.

In some cases, our framework allows us to significantly simplify existing proofs, or to extend results to the more challenging probabilistic setting with little to no extra effort.

ISSN 1433-8092