We study the problem of common randomness generation, in which two parties have access to i.i.d. samples from a known random source and wish to generate a shared random key, using limited (or no) communication, with the largest possible probability of agreement. This problem is at the core of secret key generation in cryptography, with connections to communication under uncertainty and locality-sensitive hashing. We take the approach of treating correlated sources as a critical resource, and ask whether common randomness can be generated resource-efficiently.
We consider two notable sources in this setup, arising from correlated bits and correlated Gaussians. We design the first explicit schemes that use only a polynomial number of samples (in the key length), so that the players can generate shared keys that agree with constant probability using optimal communication. The best previously known schemes were both non-constructive and used an exponential number of samples. In the amortized setting, we characterize the largest achievable ratio of key length to communication in terms of the external and internal information costs, two well-studied quantities in theoretical computer science. In the relaxed setting where the two parties merely wish to improve the correlation between their generated keys of length $k$, we show that no interactive protocol using $o(k)$ bits of communication can achieve agreement probability even as small as $2^{-o(k)}$. For the related communication problem in which the players wish to compute a joint function $f$ of their inputs using i.i.d. samples from a known source, we give a zero-communication protocol using $2^{O(c)}$ bits, where $c$ is the interactive randomized public-coin communication complexity of $f$. This matches the previously known lower bound, whereas the best previously known upper bound was doubly exponential in $c$.
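For concreteness, the two sources can be read as the standard $\rho$-correlated models; the following is a minimal sketch under that assumption (the precise parameterizations used in the body of the paper may differ):
\[
\text{(correlated bits)}\qquad \Pr[(X,Y)=(x,y)] \;=\; \frac{1+\rho\,(-1)^{x\oplus y}}{4}, \qquad x,y \in \{0,1\},
\]
\[
\text{(correlated Gaussians)}\qquad (X,Y) \;\sim\; \mathcal{N}\!\left(\mathbf{0},\, \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\right),
\]
where $\rho \in (0,1)$ is the correlation parameter and each party observes i.i.d. samples of its own coordinate.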