The information contained in a string $x$ about a string $y$
is defined as the difference between the Kolmogorov complexity
of $y$ and the conditional Kolmogorov complexity of $y$ given $x$,
i.e., $I(x:y)=C(y)-C(y|x)$. From the well-known Kolmogorov--Levin Theorem it follows that $I(x:y)$ is symmetric up to a small ...
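For reference, the Kolmogorov--Levin Theorem alluded to above can be stated as follows (a standard formulation; the precise logarithmic error term depends on the presentation):
\[
  C(x,y) \;=\; C(x) + C(y|x) + O(\log C(x,y)),
\]
from which it follows that
\[
  I(x:y) \;=\; C(y) - C(y|x) \;=\; C(x) - C(x|y) \pm O(\log C(x,y)) \;=\; I(y:x) \pm O(\log C(x,y)).
\]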
We give a simplified proof of Hirahara's STOC'21 result showing that $DistPH \subseteq AvgP$ would imply $PH \subseteq DTIME[2^{O(n/\log n)}]$. The argument relies on a new result: Symmetry of Information for time-bounded Kolmogorov complexity under the assumption that $NP$ is easy on average, which is interesting in ...
Symmetry of Information (SoI) is a fundamental property of Kolmogorov complexity that relates the complexity of a pair of strings and their conditional complexities. Understanding whether this property holds in the time-bounded setting is a longstanding open problem. In the nineties, Longpré and Mocas (1993) and Longpré and Watanabe (1995) ...
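To make the time-bounded question concrete, one natural formulation (only a sketch; the placement of the time bounds and the exact error term vary across the literature) asks whether for every polynomial $t$ there is a polynomial $t'$ such that for all $x,y$ of length $n$:
\[
  K^{t'(n)}(y|x) \;\le\; K^{t(n)}(x,y) - K^{t'(n)}(x) + O(\log n),
\]
while the converse direction, bounding $K^{t'}(x,y)$ by $K^{t}(x) + K^{t}(y|x) + O(\log n)$, holds with only polynomial overhead.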
We introduce $\mathrm{pKt}$ complexity, a new notion of time-bounded Kolmogorov complexity that can be seen as a probabilistic analogue of Levin's $\mathrm{Kt}$ complexity. Using $\mathrm{pKt}$ complexity, we upgrade two recent frameworks that characterize one-way functions ($\mathrm{OWFs}$) via symmetry of information and meta-complexity, respectively. Among other contributions, we establish the following ...
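As background, Levin's $\mathrm{Kt}$ measure charges for both description length and running time (standard definition; the precise form of its probabilistic analogue $\mathrm{pKt}$ is given in the report itself):
\[
  \mathrm{Kt}(x) \;=\; \min\{\, |p| + \lceil \log t \rceil \;:\; U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\},
\]
where $U$ is a fixed universal machine.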
The coding theorem for Kolmogorov complexity states that any string sampled from a computable distribution has a description length close to its information content. A coding theorem for resource-bounded Kolmogorov complexity is the key to obtaining fundamental results in average-case complexity, yet whether any samplable distribution admits a coding theorem ...
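For reference, one standard statement of the classical coding theorem is the following (the additive constant depends on the distribution, e.g. on the length of a program computing it):
\[
  K(x) \;\le\; \log \frac{1}{Q(x)} + O(1) \qquad \text{for every computable distribution } Q \text{ and every } x \text{ with } Q(x) > 0.
\]
The question raised above is whether an analogous bound, with a time-bounded complexity measure on the left, holds for samplable distributions.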