ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR04-031 | 22nd March 2004 00:00

On Polynomially Time Bounded Symmetry of Information


Abstract:

The information contained in a string $x$ about a string $y$
is defined as the difference between the Kolmogorov complexity
of $y$ and the conditional Kolmogorov complexity of $y$ given $x$,
i.e., $I(x:y)=\C(y)-\C(y|x)$. From the well-known Kolmogorov--Levin Theorem it follows that $I(x:y)$ is symmetric up to a small additive term $O(\log \C(x,y))$. We are interested in whether this property can hold for several versions of polynomial time bounded Kolmogorov complexity.
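For reference, the symmetry follows from the Kolmogorov--Levin Theorem as follows, writing $\C(x,y)$ for the complexity of the pair (the theorem states $\C(x,y)=\C(x)+\C(y|x)+O(\log \C(x,y))$):
\begin{align*}
I(x:y) &= \C(y)-\C(y|x)\\
&= \C(y)+\C(x)-\C(x,y)+O(\log \C(x,y))\\
&= \C(x)-\C(x|y)+O(\log \C(x,y))\\
&= I(y:x)+O(\log \C(x,y)).
\end{align*}
The second and third lines each apply the theorem, once as stated and once with the roles of $x$ and $y$ exchanged.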
It was shown by L. Longpr\'e and S. Mocas, and by L. Longpr\'e and O. Watanabe, that, under a natural assumption, symmetry of information does not hold for the polynomial time bounded printing version of Kolmogorov complexity. In this paper we investigate symmetry of information for some variants of distinguishing complexity $\CD$, where $\CD(x)$ is the length of a shortest program which accepts $x$ and only $x$. We exhibit relativized worlds in which symmetry of information does not hold for the deterministic and nondeterministic polynomial time distinguishing complexities $\CD^{\poly}$ and $\CND^{\poly}$. For nondeterministic polynomial time distinguishing complexity with randomness, $\CAM^{\poly}$, we prove that symmetry of information holds for most pairs of strings in any set in $\NP$. In proving this last statement we extend a recent result of Buhrman et al., which may be of independent interest.
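A standard way to formalize the time bounded distinguishing measures mentioned above, assuming a fixed universal machine $U$ and a time bound $t$ (the exact conventions vary by author and are not fixed by this abstract), is
\[
\CD^{t}(x) \;=\; \min\bigl\{\,|p| \;:\; U(p,x) \text{ accepts},\ U(p,y) \text{ rejects for all } y\neq x,\ \text{and } U(p,y) \text{ runs in at most } t(|y|) \text{ steps for every } y \,\bigr\},
\]
with $\CND^{t}$ defined analogously for a nondeterministic universal machine, and $\CAM^{t}$ for an Arthur--Merlin (randomized, nondeterministic) verifier; $\CD^{\poly}$, $\CND^{\poly}$, and $\CAM^{\poly}$ denote these measures with a polynomial time bound.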


