We study the problem of compressing interactive communication to its
information content $I$, defined as the amount of information that the
participants learn about each other's inputs. We focus on the case when
the participants' inputs are distributed independently and show how to
compress the communication to $O(I\log^{2}I)$ bits, with no dependence
on the original communication cost. This result improves quadratically on
the previous bound of Kol (STOC 2016) and matches, up to polylogarithmic
factors, the known lower bound of $\Omega(I)$.