We study the relationship between communication and information in 2-party communication protocols when the information is asymmetric. If $I^A$ denotes the number of bits of information revealed by the first party, $I^B$ denotes the information revealed by the second party, and $C$ is the number of bits of communication in the protocol, we show that
- one can simulate the protocol using order $I^A + \sqrt[4]{C^3 \cdot I^B} \cdot \log C + \sqrt{C \cdot I^B} \cdot \log C$ bits of communication,
- one can simulate the protocol using order $I^A \cdot 2^{O(I^B)}$ bits of communication.
The first result gives the best known bound on the complexity of a simulation when $I^A \gg I^B, C^{3/4}$. The second gives the best known bound when $I^B \ll \log C$. In addition, we show that if a function is computed by a protocol with asymmetric information complexity, then the inputs must have a large, nearly monochromatic rectangle of the right dimensions, a fact that is useful for proving lower bounds on lopsided communication problems.
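For intuition only, consider the illustrative parameter setting $I^B = O(1)$ and $I^A = C^{3/4+\epsilon}$ for a small constant $0 < \epsilon < 1/4$ (an assumption chosen here for illustration, not taken from the results above). The first bound then evaluates, up to constants, to
$$
I^A + \sqrt[4]{C^3 \cdot I^B} \cdot \log C + \sqrt{C \cdot I^B} \cdot \log C \;\approx\; C^{3/4+\epsilon} + C^{3/4} \log C + C^{1/2} \log C \;\approx\; C^{3/4+\epsilon},
$$
whereas the joint-information benchmark of order $\sqrt{C \cdot (I^A + I^B)} \cdot \log C \approx C^{7/8+\epsilon/2} \log C$ is polynomially larger. This is the kind of regime, with $I^A \gg I^B, C^{3/4}$, in which the first result improves on symmetric compression.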