ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #3 to TR12-050 | 19th November 2012 11:45

Gradual Small-Bias Sample Spaces





Revision #3
Authors: Avraham Ben-Aroya, Gil Cohen
Accepted on: 19th November 2012 11:45
Downloads: 4115


Abstract:

A $(k,\epsilon)$-biased sample space is a distribution over $\{0,1\}^n$ that $\epsilon$-fools every nonempty linear test of size at most $k$. Since they were introduced by Naor and Naor (SIAM J. Computing, 1993), these sample spaces have become a central notion in theoretical computer science with a variety of applications. When constructing such spaces, one usually attempts to minimize the seed length as a function of $n$, $k$ and $\epsilon$. Given such a construction, if we reverse the roles and consider a fixed seed length, then the smaller we pick $k$, the better the bound on the bias $\epsilon$ becomes. However, once the space is constructed we have a single bound on the bias of all tests of size at most $k$.

In this work we initiate the study of a new pseudorandom object, which we call a gradual $(k,\epsilon)$-biased sample space. Roughly speaking, this is a sample space that $\epsilon$-fools linear tests of size exactly $k$ and moreover, the bound on the bias for linear tests of size $i < k$ decays as $i$ gets smaller. We show how to construct gradual $(k,\epsilon)$-biased sample spaces of size comparable to the (non-gradual) spaces constructed by Alon et al. (Random Structures and Algorithms, 1992), and prove a lower bound on their size. Our construction is based on the lossless expanders of Guruswami et al. (J. ACM, 2009), combined with the Quadratic Character Construction of Alon et al. (Random Structures and Algorithms, 1992).



Changes to previous version:

Motivation added.


Revision #2 to TR12-050 | 13th August 2012 22:19

Gradual Small-Bias Sample Spaces





Revision #2
Authors: Avraham Ben-Aroya, Gil Cohen
Accepted on: 13th August 2012 22:19
Downloads: 2542


Abstract:

A $(k,\epsilon)$-biased sample space is a distribution over $\{0,1\}^n$ that $\epsilon$-fools every nonempty linear test of size at most $k$. Since they were introduced by Naor and Naor [SIAM J. Computing, 1993], these sample spaces have become a central notion in theoretical computer science with a variety of applications.
When constructing such spaces, one usually attempts to minimize the seed length as a function of $n, k$ and $\epsilon$. Given such a construction, if we reverse the roles and consider a fixed seed length, then the smaller we pick $k$, the better the bound on the bias $\epsilon$ becomes. However, once the space is constructed we have a \emph{single} bound on the bias of all tests of size at most $k$.

In this work we initiate the study of a new pseudorandom object, which we call a \emph{gradual} $(k,\epsilon)$-biased sample space. Roughly speaking, this is a sample space
that $\epsilon$-fools linear tests of size \emph{exactly} $k$ and moreover, the bound on the bias for linear tests of size $i \le k$ decays as $i$ gets smaller.
We show how to construct gradual $(k,\epsilon)$-biased sample spaces of size comparable to the (non-gradual) spaces constructed by Alon et al. [FOCS 1990], and prove a lower bound on their size. Our construction is based on the lossless expanders of Guruswami et al. [J. ACM, 2009], combined with the Quadratic Character Construction of Alon et al. [FOCS 1990].



Changes to previous version:

Elaborated the section on non-linear decay.


Revision #1 to TR12-050 | 8th May 2012 22:23

Gradual Small-Bias Sample Spaces





Revision #1
Authors: Avraham Ben-Aroya, Gil Cohen
Accepted on: 8th May 2012 22:23
Downloads: 2678


Abstract:

A $(k,\epsilon)$-biased sample space is a distribution over $\{0,1\}^n$ that $\epsilon$-fools every nonempty linear test of size at most $k$. Since they were introduced by Naor and Naor [SIAM J. Computing, 1993], these sample spaces have become a central notion in theoretical computer science with a variety of applications.

When constructing such spaces, one usually attempts to minimize the seed length as a function of $n, k$ and $\epsilon$. Given such a construction, if we reverse the roles and consider a fixed seed length, then the smaller we pick $k$, the better the bound on the bias $\epsilon$ becomes. However, once the space is constructed we have a \emph{single} bound on the bias of all tests of size at most $k$.

In this work we consider the problem of getting ``more mileage'' out of $(k,\epsilon)$-biased sample spaces. Namely, we study a generalization of these sample spaces which we call \emph{gradual} $(k,\epsilon)$-biased sample spaces. Roughly speaking, these are sample spaces that $\epsilon$-fool linear tests of size \emph{exactly} $k$ and moreover, the bound on the bias of linear tests of size $i \le k$ decays as $i$ gets smaller.

We show how to construct gradual $(k,\epsilon)$-biased sample spaces of size comparable to the (non-gradual) spaces constructed by Alon et al. [FOCS 1990]. Our construction is based on the lossless expanders of Guruswami et al. [J. ACM, 2009], combined with the Quadratic Character Construction of Alon et al. [FOCS 1990].



Changes to previous version:

Fixed some typos.


Paper:

TR12-050 | 25th April 2012 15:45

Gradual Small-Bias Sample Spaces





TR12-050
Authors: Avraham Ben-Aroya, Gil Cohen
Publication: 27th April 2012 16:59
Downloads: 3175


Abstract:

A $(k,\epsilon)$-biased sample space is a distribution over $\{0,1\}^n$ that $\epsilon$-fools every nonempty linear test of size at most $k$. Since they were introduced by Naor and Naor [SIAM J. Computing, 1993], these sample spaces have become a central notion in theoretical computer science with a variety of applications.

When constructing such spaces, one usually attempts to minimize the seed length as a function of $n, k$ and $\epsilon$. Given such a construction, if we reverse the roles and consider a fixed seed length, then the smaller we pick $k$, the better the bound on the bias $\epsilon$ becomes. However, once the space is constructed we have a \emph{single} bound on the bias of all tests of size at most $k$.

In this work we consider the problem of getting ``more mileage'' out of $(k,\epsilon)$-biased sample spaces. Namely, we study a generalization of these sample spaces which we call \emph{gradual} $(k,\epsilon)$-biased sample spaces. Roughly speaking, these are sample spaces that $\epsilon$-fool linear tests of size \emph{exactly} $k$ and moreover, the bound on the bias of linear tests of size $i \le k$ decays as $i$ gets smaller.

We show how to construct gradual $(k,\epsilon)$-biased sample spaces of size comparable to the (non-gradual) spaces constructed by Alon et al. [FOCS 1990]. Our construction is based on the lossless expanders of Guruswami et al. [J. ACM, 2009], combined with the Quadratic Character Construction of Alon et al. [FOCS 1990].



ISSN 1433-8092