TR01-043 Authors: Mikhail V. Vyugin, Vladimir Vyugin

Publication: 18th June 2001 15:38

A new notion of predictive complexity and a corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions which generalize the square-loss function. Relations between the unconditional predictive complexity $KG(x)$ and the conditional predictive complexity $KG(x|y)$ are studied. We define an algorithm with an ``expanding property'': with positive probability it transforms sequences of a given predictive complexity into sequences of essentially larger predictive complexity. A notion of the amount of predictive information $IG(y:x)$ is studied. We show that this information is non-commutative in a very strong sense and present asymptotic relations between the values $IG(y:x)$, $IG(x:y)$, $KG(x)$ and $KG(y)$.
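
The prediction setting behind these definitions can be illustrated concretely. The sketch below (an assumption for illustration; the `laplace_predict` strategy and the sample sequence are not from the report) computes the cumulative square loss of an online prediction strategy on a binary sequence. Predictive complexity, as studied in the report, lower-bounds such cumulative loss, up to an additive constant, for every computable prediction algorithm.

```python
def square_loss(p, outcome):
    # The square-loss function lambda(p, omega) = (p - omega)^2.
    return (p - outcome) ** 2

def cumulative_loss(predict, outcomes):
    # Total loss of a strategy `predict`, which maps each prefix of the
    # outcome sequence to a prediction in [0, 1].
    total = 0.0
    for i, omega in enumerate(outcomes):
        total += square_loss(predict(outcomes[:i]), omega)
    return total

def laplace_predict(prefix):
    # Laplace's rule of succession: predict (ones + 1) / (length + 2).
    return (sum(prefix) + 1) / (len(prefix) + 2)

x = [0, 1, 0, 0, 1, 0, 0, 0]
print(cumulative_loss(laplace_predict, x))
```

No single strategy can make this total loss small on all sequences; the predictive complexity of $x$ measures the loss that the best (idealized) strategy must incur on $x$.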