ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

Paper:

TR08-106 | 12th November 2008

A Divergence Formula for Randomness and Dimension

Abstract:

If S is an infinite sequence over a finite alphabet \Sigma and \beta is a probability measure on \Sigma, then the {\it dimension} of S with respect to \beta, written \dim^\beta(S), is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension \dim(S) when \beta is the uniform probability measure. This paper shows that \dim^\beta(S) and its dual \mathrm{Dim}^\beta(S), the {\it strong dimension} of S with respect to \beta, can be used in conjunction with randomness to measure the similarity of two probability measures \alpha and \beta on \Sigma. Specifically, we prove that the {\it divergence formula}

\dim^\beta(R) = \mathrm{Dim}^\beta(R) = \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha) + \mathcal{D}(\alpha \| \beta)}

holds whenever \alpha and \beta are computable, positive probability measures on \Sigma and R \in \Sigma^\infty is random with respect to \alpha. In this formula, \mathcal{H}(\alpha) is the Shannon entropy of \alpha, and \mathcal{D}(\alpha \| \beta) is the Kullback-Leibler divergence between \alpha and \beta. We also show that the above formula holds for all sequences R that are \alpha-normal (in the sense of Borel) when \dim^\beta(R) and \mathrm{Dim}^\beta(R) are replaced by the more effective finite-state dimensions \dim_{\mathrm{FS}}^\beta(R) and \mathrm{Dim}_{\mathrm{FS}}^\beta(R). In the course of proving this, we also prove finite-state compression characterizations of \dim_{\mathrm{FS}}^\beta(S) and \mathrm{Dim}_{\mathrm{FS}}^\beta(S).
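As a quick numerical illustration (not from the paper), the right-hand side of the divergence formula is directly computable for any two positive probability measures on a finite alphabet. The following minimal Python sketch evaluates \mathcal{H}(\alpha), \mathcal{D}(\alpha \| \beta), and the predicted dimension; the helper names and the example measures are illustrative assumptions, and logarithms are taken base 2 (the base cancels in the ratio as long as it is used consistently).

import math

def shannon_entropy(alpha):
    # Shannon entropy H(alpha) in bits; alpha is a list of positive
    # probabilities summing to 1 over a finite alphabet.
    return -sum(p * math.log2(p) for p in alpha)

def kl_divergence(alpha, beta):
    # Kullback-Leibler divergence D(alpha || beta) in bits.
    # Both measures are assumed positive, so every term is finite.
    return sum(p * math.log2(p / q) for p, q in zip(alpha, beta))

def predicted_dimension(alpha, beta):
    # Right-hand side of the divergence formula:
    # H(alpha) / (H(alpha) + D(alpha || beta)).
    h = shannon_entropy(alpha)
    return h / (h + kl_divergence(alpha, beta))

# Illustrative measures on the binary alphabet {0, 1} (an assumption).
alpha = [0.5, 0.5]   # uniform measure
beta = [0.25, 0.75]  # a biased measure to compare against

print(shannon_entropy(alpha))            # 1.0
print(kl_divergence(alpha, beta))        # about 0.2075
print(predicted_dimension(alpha, beta))  # about 0.8282

For these example measures the formula predicts that every \alpha-random sequence R has \dim^\beta(R) = \mathrm{Dim}^\beta(R) \approx 0.83. When \alpha = \beta the divergence vanishes and the predicted dimension is 1, consistent with the fact that a sequence random with respect to \beta has \beta-dimension 1.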



ISSN 1433-8092