A central open question in meta-complexity is whether problems such as MCSP and MK^tP are NP-hard. Despite a large body of work establishing consequences of, and barriers to, NP-hardness of these problems under (restricted) deterministic reductions, very little is known in the setting of randomized reductions. In this work, we give consequences of randomized NP-hardness reductions for both approximating and exactly computing time-bounded and time-unbounded Kolmogorov complexity.
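For reference, the results below concern the following standard notions (this definition is supplied here for the reader and is not part of the original statement of results): fixing a universal Turing machine $U$ and a time bound $t$,

```latex
K^{t}(x) \;=\; \min\bigl\{\, |p| \;:\; U(p) \text{ outputs } x \text{ within } t(|x|) \text{ steps} \,\bigr\},
```

and the (time-unbounded) Kolmogorov complexity $K(x)$ is the same minimum with no bound on the running time of $U(p)$.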
In the setting of approximate K^{poly} complexity, our results are as follows.
1. Under a derandomization assumption, for any constant \delta > 0, if approximating K^t complexity within n^{\delta} additive error is hard for SAT under an honest randomized non-adaptive Turing reduction running in time polynomially less than t, then NP = coNP.
2. Under the same assumptions, the worst-case hardness of NP is equivalent to the existence of one-way functions.
Item 1 above may be compared with a recent work of Saks and Santhanam (CCC'22), which makes the same assumptions but with \omega(\log n) additive error, obtaining the conclusion NE = coNE.
In the setting of exact K^{poly} complexity, where the barriers of Item 1 and [Saks, Santhanam; CCC'22] do not apply, we show:
3. If computing K^t complexity is hard for SAT under reductions as in Item 1, then the average-case hardness of NP is equivalent to the existence of one-way functions. That is, ``Pessiland'' is excluded.
Finally, we give consequences of NP-hardness of exact time-unbounded Kolmogorov complexity under randomized reductions.
4. If computing Kolmogorov complexity is hard for SAT under a randomized many-one reduction running in time t_R and with failure probability at most 1/(t_R)^{16}, then coNP is contained in non-interactive statistical zero-knowledge (NISZK); thus NP \subseteq coAM. Moreover, the worst-case hardness of NP is equivalent to the existence of one-way functions.
We further exploit the connection to NISZK along with a previous work of Allender et al. (ITCS'23) to show that hardness of K complexity under randomized many-one reductions is highly robust with respect to failure probability, approximation error, output length, and threshold parameter.