Symmetry of Information (SoI) is a fundamental result in Kolmogorov complexity stating that for all $n$-bit strings $x$ and $y$, $K(x, y) = K(y) + K(x \mid y)$ up to an additive error of $O(\log n)$ [ZL70]. In contrast, whether SoI holds for time-bounded Kolmogorov complexity measures is closely related to longstanding open problems in complexity theory and cryptography, such as the P versus NP question [LW95, Hir22] and the existence of one-way functions [HILNO23, HLO24, HLN24].
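For concreteness, the classical statement decomposes into two inequalities: the first follows by running a shortest program for $y$ and then a shortest program for $x$ given $y$, while the second is the nontrivial Kolmogorov–Levin direction:

```latex
% Easy direction: combine a shortest program for y
% with a shortest program for x given y.
K(x, y) \le K(y) + K(x \mid y) + O(\log n)
% Nontrivial (Kolmogorov--Levin) direction:
K(x \mid y) \le K(x, y) - K(y) + O(\log n)
```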
In this paper, we prove that SoI fails for $rKt$ complexity, the randomized analogue of Levin's $Kt$ complexity [Lev84]. This is the first unconditional result of this kind for a randomized notion of time-bounded Kolmogorov complexity. More generally, we establish a close relationship between the validity of SoI for $rKt$ and the existence of randomized algorithms that approximate $rKt(x)$. Motivated by applications in cryptography, we also establish the failure of SoI for a related notion called $pKt$ complexity [HLO24], and we extend our results to the average-case setting. Finally, we prove a near-optimal lower bound on the complexity of estimating conditional $rKt$, a result that may be of independent interest.
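As a reminder of the measures involved (a sketch of the standard definitions; $U$ denotes a fixed universal machine, and the success-probability threshold $2/3$ follows the usual convention for $rKt$ [Oli19]):

```latex
% Levin's Kt: program length plus the logarithm of the running time.
Kt(x)  = \min_{p,\,t} \bigl\{\, |p| + \lceil \log t \rceil :
         U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\bigr\}
% Randomized analogue rKt: U is randomized and must output x
% with probability at least 2/3 within t steps.
rKt(x) = \min_{p,\,t} \bigl\{\, |p| + \lceil \log t \rceil :
         \Pr[\,U(p) \text{ outputs } x \text{ within } t \text{ steps}\,] \ge 2/3 \,\bigr\}
```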
Our findings complement those of [Ron04], who demonstrated the failure of SoI for $Kt$ complexity. The randomized setting, however, poses significant additional challenges, which we overcome using insights from [KK25], structural results about $rKt$ implied by SoI, and techniques from meta-complexity [Oli19] and the theory of computational pseudorandomness [TV07].