We provide a new framework for establishing strong lower bounds on the nonnegative rank of matrices by means of common information, a notion previously introduced in Wyner [1975]. Common information is a natural lower bound for the nonnegative rank of a matrix, and by combining it with Hellinger distance estimations we can compute the (almost) exact common information of the UDISJ partial matrix. The bounds are obtained very naturally and improve previous results by Braverman and Moitra [2012] in terms of being (almost) optimal. We also establish robustness of this estimation under various perturbations of the UDISJ partial matrix, where rows and columns are randomly or adversarially removed or where entries are randomly or adversarially altered. This robustness translates, via a variant of Yannakakis’ Factorization Theorem, to lower bounds on the average-case and adversarial approximate extension complexity. We present the first family of polytopes, the hard pair introduced in Braun et al. [2012] related to the CLIQUE problem, with high average-case and adversarial approximate extension complexity. The framework relies on a strengthened version of the link between information theory and Hellinger distance from Bar-Yossef et al. [2004]. We also provide an information-theoretic variant of the fooling set method that allows us to extend fooling set lower bounds from extension complexity to approximate extension complexity.
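To make the two central objects concrete, the following is a minimal sketch of the Hellinger distance between discrete distributions and of the UDISJ partial matrix. It assumes the common convention from the extension-complexity literature that rows and columns are indexed by 0/1 vectors a, b of length n, with entry (1 - aᵀb)², defined only where |a ∩ b| ≤ 1; the function names are illustrative, not from the paper.

```python
import numpy as np
from itertools import product

def hellinger(p, q):
    # Hellinger distance between two discrete distributions:
    # H(p, q) = sqrt( (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2 ).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def udisj_partial_matrix(n):
    """Build the UDISJ partial matrix over ground set {1, ..., n}.

    Rows/columns are indexed by all 0/1 vectors of length n; the entry
    for (a, b) is (1 - a.b)^2, i.e. 1 if a and b are disjoint and 0 if
    they intersect in exactly one element.  Entries with |a ∩ b| >= 2
    are undefined (the matrix is partial) and stored as None.
    """
    vecs = list(product([0, 1], repeat=n))
    M = [[None] * len(vecs) for _ in vecs]
    for i, a in enumerate(vecs):
        for j, b in enumerate(vecs):
            inter = sum(x * y for x, y in zip(a, b))
            if inter <= 1:
                M[i][j] = (1 - inter) ** 2
    return M
```

A nonnegative factorization of rank r would express every defined entry as an inner product of nonnegative r-dimensional vectors attached to the rows and columns; the abstract's lower bounds show that for UDISJ no small r suffices.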
Revision notes:
- Clarification thanks to Ola Svensson.
- Typos, shortening, streamlining.
- Corrected typos, updated references, improved discussion, improved an example.
- Mention of recent works, including an application to the stable set problem.
- Correction of typos and computational errors; e.g., the nonnegative rank of UDISJ is now at least 0.3113n.