We study the problem of non-interactive correlation distillation
(NICD). Suppose Alice and Bob each hold a string, denoted by
$A=a_0a_1\cdots a_{n-1}$ and $B=b_0b_1\cdots b_{n-1}$,
respectively, where for every $k=0,1,\ldots,n-1$, the pair $(a_k,b_k)$ is
drawn independently from a distribution $\noise$, known as the ``noise
model''. Alice and Bob wish to ``distill'' the correlation
non-interactively, i.e., they each apply a function to their own
string and output a single bit, denoted by $X$ and $Y$, respectively,
with the goal of making $\prob[X=Y]$ as close to 1 as possible. The question is:
for which noise models can they succeed? This problem is related to
various topics in computer science, including information
reconciliation and random beacons. In fact, if NICD were possible
for some general class of noise models, then some results in these areas
would follow as straightforward corollaries.
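A canonical example of a noise model (used here purely as an
illustration) is the binary symmetric one, where each $a_k$ is a
uniform random bit and, independently for each $k$, $b_k=a_k$ with
probability $1-\epsilon$ for a fixed $\epsilon\in(0,1/2)$; simply
outputting the first bits, $X=a_0$ and $Y=b_0$, already achieves
$\prob[X=Y]=1-\epsilon$, and NICD asks whether functions of the entire
strings can push this probability substantially closer to 1.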
We prove two negative results on NICD, showing that for various noise
models it is impossible to distill the correlation so that $\prob[X=Y]$
becomes arbitrarily close to 1. We also give an example in
which Alice and Bob can increase their correlation with one bit of
communication (in this case they each output two bits). This
example, which may be of independent interest, demonstrates that even the
smallest amount of communication is provably more powerful than no
communication.
An extended abstract of this paper is to appear in the proceedings of
Latin American Theoretical INformatics (LATIN 2004), Buenos Aires,
Argentina, 2004.