We consider the task of compressing information when the source and the destination do not agree on the prior, i.e., the distribution from which the information is generated. This setting was considered previously by Kalai et al. (ICS 2011), who suggested that this was a ...
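The cost of such a prior mismatch can be illustrated numerically: if a receiver decodes with an ideal code built for a prior $Q$ while the data is actually drawn from $P$, the expected description length exceeds the entropy $H(P)$ by exactly the divergence $D(P\|Q)$. The sketch below checks this standard identity on made-up distributions `P` and `Q`; it is not the compression scheme studied in the paper, only an illustration of why disagreement on the prior matters.

```python
import math

# Hypothetical priors for illustration: P is the sender's (true) prior,
# Q is the receiver's (mismatched) prior.
P = {"a": 0.70, "b": 0.20, "c": 0.10}
Q = {"a": 0.40, "b": 0.40, "c": 0.20}

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def expected_length(p, q):
    """Expected length (bits) of an ideal code for q, used on data drawn from p."""
    return sum(px * math.log2(1.0 / q[x]) for x, px in p.items() if px > 0)

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

print(f"H(P)               = {entropy(P):.4f} bits")
print(f"E[length, code for Q] = {expected_length(P, Q):.4f} bits")
print(f"H(P) + D(P||Q)      = {entropy(P) + kl(P, Q):.4f} bits  (matches the line above)")
```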
We prove a lower estimate on the increase in entropy when two copies of a conditional random variable $X | Y$, with $X$ supported on $\mathbb{Z}_q=\{0,1,\dots,q-1\}$ for prime $q$, are summed modulo $q$. Specifically, given two i.i.d. copies $(X_1,Y_1)$ and $(X_2,Y_2)$ of a pair of random variables $(X,Y)$, with $X$ ...
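To make the setup concrete, the sketch below computes $H(X \mid Y)$ and $H(X_1 + X_2 \bmod q \mid Y_1, Y_2)$ for a made-up conditional distribution on $\mathbb{Z}_5$ and two i.i.d. copies, confirming numerically that the conditional entropy increases under the modular sum. The joint distribution (`P_Y`, `P_X_given_Y`) is a toy assumption, and the quantitative lower bound of the abstract's theorem is not reproduced here.

```python
import itertools
import math

q = 5  # a small prime alphabet Z_q = {0, ..., q-1}

# Hypothetical joint distribution of (X, Y): Y uniform on {0, 1},
# and X | Y = y a made-up non-uniform distribution on Z_q.
P_Y = {0: 0.5, 1: 0.5}
P_X_given_Y = {
    0: [0.6, 0.1, 0.1, 0.1, 0.1],
    1: [0.1, 0.1, 0.6, 0.1, 0.1],
}

def cond_entropy_X_given_Y():
    """H(X | Y) in bits."""
    h = 0.0
    for y, py in P_Y.items():
        h -= py * sum(p * math.log2(p) for p in P_X_given_Y[y] if p > 0)
    return h

def cond_entropy_sum():
    """H(X1 + X2 mod q | Y1, Y2) for i.i.d. copies (X1, Y1), (X2, Y2)."""
    h = 0.0
    for (y1, p1), (y2, p2) in itertools.product(P_Y.items(), repeat=2):
        # Distribution of X1 + X2 (mod q) given (Y1, Y2) = (y1, y2):
        # cyclic convolution of the two conditional distributions.
        conv = [0.0] * q
        for a, b in itertools.product(range(q), repeat=2):
            conv[(a + b) % q] += P_X_given_Y[y1][a] * P_X_given_Y[y2][b]
        h -= p1 * p2 * sum(p * math.log2(p) for p in conv if p > 0)
    return h

print(f"H(X | Y)                = {cond_entropy_X_given_Y():.4f} bits")
print(f"H(X1+X2 mod q | Y1,Y2)  = {cond_entropy_sum():.4f} bits  (strictly larger here)")
```

Since conditioning cannot increase entropy, $H(X_1+X_2 \bmod q \mid Y_1,Y_2) \ge H(X_1+X_2 \mid X_2, Y_1, Y_2) = H(X \mid Y)$ always holds; the interesting question addressed by the abstract is how large the gain must be.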