Communication complexity is a central model of computation introduced by Yao in 1979, in which
two players, Alice and Bob, receive inputs $x$ and $y$ respectively and wish to compute $f(x, y)$ for some fixed
function $f$ with the least amount of communication. Recently, the question of the privacy
of such protocols has been revisited: can Alice and Bob compute $f(x, y)$ without revealing too much information
about their inputs? Two types of privacy for communication protocols have been proposed:
first, an information-theoretic definition (by Bar-Yehuda et al. and Klauck), which for Boolean functions is equivalent to the
notion of information cost introduced by Chakrabarti et al. in 2001 and which has since found many important applications;
second, a combinatorial definition introduced by Feigenbaum et al. in 2010 and further developed by Ada et al. in 2012.
We provide new results for both notions of privacy, as well as for the relation between them. Our new lower
bound techniques, for both the combinatorial and the information-theoretic definitions, enable us to give tight
bounds for the privacy of several functions, including Equality, Disjointness, Inner Product, and Greater Than. In
the process we also prove tight bounds (up to 1 or 2 additive bits) for the external information complexity of
these functions.
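As a toy illustration of the two-party model (not taken from the paper; the protocol below is an illustrative assumption), consider the trivial protocol for Equality in which Alice sends her entire $n$-bit input to Bob. It computes the function correctly at a cost of $n$ bits, but reveals $x$ completely; quantifying exactly this kind of leakage is what the privacy notions above are designed to do:

```python
# Toy sketch of Yao's two-party model: the trivial protocol for Equality.
# Alice holds bitstring x, Bob holds bitstring y; Alice sends all of x,
# so the protocol is correct but has zero privacy for Alice's input.

def trivial_equality_protocol(x: str, y: str) -> tuple[bool, int]:
    """Return (EQ(x, y), number of bits communicated)."""
    transcript = x                # Alice sends her whole input to Bob
    result = (transcript == y)    # Bob compares locally and outputs EQ
    bits_communicated = len(x)    # cost: n bits, and x is fully revealed
    return result, bits_communicated

print(trivial_equality_protocol("0110", "0110"))  # (True, 4)
print(trivial_equality_protocol("0110", "0111"))  # (False, 4)
```

Lower bounds for privacy, such as those in this paper, show how far any protocol can improve on this naive trade-off between communication and leakage.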
We also extend the definitions of privacy to bounded-error randomized protocols and relate the two
notions to communication complexity. Again, we are able to prove tight bounds for the
above-mentioned functions, as well as for the Vector in Subspace and Gap Hamming Distance problems.