Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

TR15-028 | 27th February 2015 14:28

Relative Discrepancy does not separate Information and Communication Complexity



Does the information complexity of a function equal its communication complexity? We examine whether any currently known techniques might be used to show a separation between the two notions. Recently, Ganor et al. provided such a separation in the distributional setting for a specific input distribution μ. We show that in the non-distributional setting, the relative discrepancy bound they defined is, in fact, smaller than the information complexity, and hence it cannot be used to separate information and communication complexity. In addition, in the distributional case, we provide an equivalent linear-program formulation for relative discrepancy and relate it to variants of the partition bound, also resolving an open question regarding the relation between the partition bound and information complexity. Finally, we prove the equivalence between the adaptive relative discrepancy and the public-coin partition bound, which implies that the logarithm of the adaptive relative discrepancy bound is quadratically tight with respect to communication.

ISSN 1433-8092 | Imprint