Electronic Colloquium on Computational Complexity (ECCC)

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #1 to TR11-120 | 15th December 2012 06:34

Advice Lower Bounds for the Dense Model Theorem





Revision #1
Authors: Thomas Watson
Accepted on: 15th December 2012 06:34
Downloads: 2580


Abstract:

We prove a lower bound on the amount of nonuniform advice needed by black-box reductions for the Dense Model Theorem of Green, Tao, and Ziegler, and of Reingold, Trevisan, Tulsiani, and Vadhan. The latter theorem roughly says that for every distribution $D$ that is $\delta$-dense in a distribution that is $\epsilon'$-indistinguishable from uniform, there exists a ``dense model'' for $D$, that is, a distribution that is $\delta$-dense in the uniform distribution and is $\epsilon$-indistinguishable from $D$. This $\epsilon$-indistinguishability is with respect to an arbitrary small class of functions $F$. For the natural case where $\epsilon'\ge\Omega(\epsilon\delta)$ and $\epsilon\ge\delta^{O(1)}$, our lower bound implies that $\Omega\big(\sqrt{(1/\epsilon)\log(1/\delta)}\cdot\log|F|\big)$ advice bits are necessary. There is only a polynomial gap between our lower bound and the best upper bound for this case (due to Zhang), which is $O\big((1/\epsilon^2)\log(1/\delta)\cdot\log|F|\big)$. Our lower bound can be viewed as an analog of list size lower bounds for list-decoding of error-correcting codes, but for ``dense model decoding'' instead.
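For reference, the two notions used in the abstract can be stated precisely. The following standard formulation (in the style of Reingold, Trevisan, Tulsiani, and Vadhan) is a paraphrase for the reader's convenience, not text from the paper: a distribution $D$ is $\delta$-dense in a distribution $M$ if $\Pr_D[x]\le\frac{1}{\delta}\Pr_M[x]$ for every $x$ (equivalently, $M=\delta D+(1-\delta)D'$ for some distribution $D'$), and distributions $A,B$ are $\epsilon$-indistinguishable with respect to $F$ if $\big|\mathrm{E}_{x\sim A}[f(x)]-\mathrm{E}_{x\sim B}[f(x)]\big|\le\epsilon$ for every $f\in F$.

To make the stated polynomial gap concrete, the following small Python sketch plugs illustrative parameter values (hypothetical, chosen here for illustration only, not taken from the paper) into the two asymptotic bounds, suppressing the hidden constants:

import math

# Illustrative parameters (hypothetical, not from the paper):
eps = 0.01     # indistinguishability parameter epsilon
delta = 0.01   # density parameter delta
log_F = 20.0   # log_2 |F|, i.e. a class of 2^20 distinguishers

# Lower bound from this paper: Omega( sqrt((1/eps) * log(1/delta)) * log|F| )
lower = math.sqrt((1 / eps) * math.log2(1 / delta)) * log_F

# Upper bound for this case, due to Zhang: O( (1/eps^2) * log(1/delta) * log|F| )
upper = (1 / eps ** 2) * math.log2(1 / delta) * log_F

print(f"lower bound ~ {lower:,.0f} advice bits")
print(f"upper bound ~ {upper:,.0f} advice bits")
print(f"gap factor  ~ {upper / lower:,.0f}")

With these values the lower bound is on the order of a few hundred advice bits while the upper bound is on the order of a million, reflecting the polynomial gap noted in the abstract.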


Paper:

TR11-120 | 6th September 2011 03:49

Advice Lower Bounds for the Dense Model Theorem





TR11-120
Authors: Thomas Watson
Publication: 11th September 2011 16:55
Downloads: 3035



ISSN 1433-8092