We show that \emph{randomized} communication complexity can be superlogarithmic in the partition number of the associated communication matrix, and we obtain near-optimal \emph{randomized} lower bounds for the Clique vs.\ Independent Set problem. These results strengthen the deterministic lower bounds obtained in prior work (G\"o\"os, Pitassi, and Watson, {\small FOCS~2015}). One of our main technical contributions states that information complexity when the cost is measured with respect to only 1-inputs (or only 0-inputs) is essentially equivalent to information complexity with respect to all inputs.