We develop a new local characterization of the zero-error information complexity function for two-party communication problems, and use it to compute the exact internal and external information complexity of the 2-bit AND function: $IC(AND,0) = C_{\wedge}\approx 1.4923$ bits, and $IC^{ext}(AND,0) = \log_2 3 \approx 1.585$ bits. This leads to a tight (upper and lower bound) characterization of the communication complexity of the set intersection problem on subsets of $\{1,\ldots,n\}$, whose randomized communication complexity tends to $C_{\wedge}\cdot n\pm o(n)$ as the error tends to zero.
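As a point of reference (and not part of the optimal construction), the following minimal Python sketch evaluates the two quantities in question for one explicit zero-error protocol for AND, namely Alice announces $x$ and Bob replies with $x\wedge y$, under the uniform prior on $\{0,1\}^2$. It assumes only the standard definitions of internal information cost, $I(X;\Pi\mid Y)+I(Y;\Pi\mid X)$, and external information cost, $I(XY;\Pi)$; the protocol and prior are illustrative choices, not those achieving $C_{\wedge}$ or $\log_2 3$.
\begin{verbatim}
from math import log2
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy, in bits, of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint pmf (keyed by tuples) onto the coordinates in `keep`."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in keep)] += p
    return dict(out)

def cmi(joint, a, b, c):
    """Conditional mutual information I(A;B|C) = H(AC)+H(BC)-H(ABC)-H(C)."""
    return (entropy(marginal(joint, a + c)) + entropy(marginal(joint, b + c))
            - entropy(marginal(joint, a + b + c)) - entropy(marginal(joint, c)))

# Illustrative zero-error protocol for AND: Alice sends x, Bob replies x AND y.
# Prior: uniform on {0,1}^2 (not the worst-case prior attaining C_AND).
joint = {}
for x in (0, 1):
    for y in (0, 1):
        transcript = (x, x & y)               # the two bits exchanged
        joint[(x, y, transcript)] = 0.25

X, Y, T = [0], [1], [2]
internal = cmi(joint, X, T, Y) + cmi(joint, Y, T, X)   # I(X;T|Y) + I(Y;T|X)
external = cmi(joint, X + Y, T, [])                    # I(XY;T)
print(f"internal information cost: {internal:.4f} bits")   # -> 1.5000
print(f"external information cost: {external:.4f} bits")   # -> 1.5000
\end{verbatim}
Both quantities evaluate to $1.5$ bits for this particular protocol and prior; the constants $C_{\wedge}\approx 1.4923$ and $\log_2 3\approx 1.585$ arise only after optimizing over all zero-error protocols rather than fixing one.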
The information-optimal protocol we present has an infinite number of rounds. We show that this is necessary by proving that the $r$-round information complexity of AND is $C_\wedge+\Theta(1/r^2)$, i.e., that it converges to $IC(AND,0)=C_\wedge$ at a rate of $\Theta(1/r^2)$.
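In other words, writing $IC_r(AND,0)$ for the $r$-round information complexity (notation used here only for brevity), the relation $IC_r(AND,0)=C_\wedge+\Theta(1/r^2)$ means that $r=\Theta(1/\sqrt{\varepsilon})$ rounds are necessary and sufficient to get within an additive $\varepsilon>0$ of the optimal $C_\wedge$.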
We leverage the tight analysis of the information complexity of AND to calculate and prove the exact communication complexity of the set disjointness function $Disj_n(X,Y) = \neg \bigvee_{i=1}^{n} (x_i\wedge y_i)$, where $X=(x_1,\ldots,x_n)$ and $Y=(y_1,\ldots,y_n)$, with error tending to 0, which turns out to be $C_{DISJ}\cdot n \pm o(n)$, where $C_{DISJ}\approx 0.4827$. Our rate-of-convergence results imply that an optimal protocol for set disjointness must use $\omega(1)$ rounds of communication, since every $r$-round protocol will be suboptimal by at least $\Omega(n/r^2)$ bits of communication.
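Note that $C_{DISJ}\approx 0.4827 < 1$: as the error tends to zero, set disjointness can be computed with less than half a bit of communication per universe element, well below the roughly $n$ bits used by the trivial protocol in which one party sends its entire characteristic vector.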
We also obtain the tight bound of $\frac{2}{\ln 2}k\pm o(k)$ on the communication complexity of disjointness of sets of size $\le k$. An asymptotic bound of $\Theta(k)$ was previously shown by Håstad and Wigderson.
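Numerically, $\frac{2}{\ln 2}=2\log_2 e\approx 2.885$, so the sparse version of the problem costs roughly $2.885$ bits of communication per unit of the size bound $k$.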