ECCC
Electronic Colloquium on Computational Complexity






Revision(s):

Revision #1 to TR12-131 | 13th February 2014 17:25

An Information Complexity Approach to Extended Formulations





Revision #1
Authors: Mark Braverman, Ankur Moitra
Accepted on: 13th February 2014 17:25
Downloads: 2538


Abstract:

We prove an unconditional lower bound showing that any extended formulation that achieves an $O(n^{1-\epsilon})$ approximation for clique has size $2^{\Omega(n^\epsilon)}$. There has been considerable recent interest in proving lower bounds for extended formulations. Fiorini et al. proved that there is no polynomial-sized extended formulation for the traveling salesman problem. Braun et al. proved that there is no polynomial-sized $O(n^{1/2 - \epsilon})$-approximate extended formulation for clique. Here we prove an optimal, unconditional lower bound against extended formulations for clique that matches Håstad's celebrated hardness result. Interestingly, the techniques used to prove such lower bounds have closely followed the progression of techniques used in communication complexity. Here we develop an information-theoretic framework for approaching these questions, and we use it to prove our main result.
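
As background (standard in this line of work, though not stated in the abstract), the bridge between extended formulations and communication-style arguments is Yannakakis' factorization theorem. For a polytope $P = \{x : Ax \le b\} = \mathrm{conv}\{v_1, \dots, v_m\}$ with slack matrix $S_{ij} = b_i - A_i^{\top} v_j \ge 0$, the minimum size of an extended formulation satisfies
\[
\mathrm{xc}(P) \;=\; \mathrm{rank}_+(S),
\]
where $\mathrm{rank}_+$ denotes nonnegative rank. Lower bounds on the nonnegative rank are then obtained by communication (or, as here, information-theoretic) arguments; the approximation results discussed above rely on a generalization of this theorem to pairs of polytopes.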

We also resolve a related question: how many bits of communication are needed to obtain an $\epsilon$-advantage over random guessing for disjointness? Kalyanasundaram and Schnitger proved that a protocol achieving constant advantage requires $\Omega(n)$ bits of communication. This result, in conjunction with amplification, implies that any protocol achieving an $\epsilon$-advantage requires $\Omega(\epsilon^2 n)$ bits of communication. Here we improve this bound to $\Omega(\epsilon n)$, which is optimal for any $\epsilon > 0$.
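
For context, the standard amplification argument giving the weaker $\Omega(\epsilon^2 n)$ bound runs as follows (a sketch with illustrative constants, not taken verbatim from the paper). If a protocol $\pi$ uses $c$ bits and decides disjointness with advantage $\epsilon$, i.e., with success probability $\tfrac{1}{2} + \epsilon$, then running $k$ independent copies and taking a majority vote succeeds with probability
\[
\Pr\big[\mathrm{Maj}(\pi_1, \dots, \pi_k) \text{ correct}\big] \;\ge\; 1 - e^{-2\epsilon^2 k} \;\ge\; \tfrac{2}{3}
\qquad \text{for } k = \Theta(1/\epsilon^2),
\]
by a Hoeffding bound. The amplified protocol uses $k \cdot c = \Theta(c/\epsilon^2)$ bits and has constant advantage, so the $\Omega(n)$ bound of Kalyanasundaram and Schnitger gives $c = \Omega(\epsilon^2 n)$. Improving this to $\Omega(\epsilon n)$ requires a direct information-theoretic argument rather than amplification.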


Paper:

TR12-131 | 18th October 2012 03:15

An Information Complexity Approach to Extended Formulations





TR12-131
Authors: Mark Braverman, Ankur Moitra
Publication: 18th October 2012 03:18
Downloads: 5189


Abstract:

We prove an unconditional lower bound showing that any linear program that achieves an $O(n^{1-\epsilon})$ approximation for clique has size $2^{\Omega(n^\epsilon)}$. There has been considerable recent interest in proving unconditional lower bounds against any linear program. Fiorini et al. proved that there is no polynomial-sized linear program for the traveling salesman problem. Braun et al. proved that there is no polynomial-sized $O(n^{1/2 - \epsilon})$-approximate linear program for clique. Here we prove an optimal, unconditional lower bound against linear programs for clique that matches Håstad's celebrated hardness result. Interestingly, the techniques used to prove such lower bounds have closely followed the progression of techniques used in communication complexity. Here we develop an information-theoretic framework for approaching these questions, and we use it to prove our main result.

We also resolve a related question: how many bits of communication are needed to obtain an $\epsilon$-advantage over random guessing for disjointness? Kalyanasundaram and Schnitger proved that a protocol achieving constant advantage requires $\Omega(n)$ bits of communication. This result, in conjunction with amplification, implies that any protocol achieving an $\epsilon$-advantage requires $\Omega(\epsilon^2 n)$ bits of communication. Here we improve this bound to $\Omega(\epsilon n)$, which is optimal for any $\epsilon > 0$.



ISSN 1433-8092