ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Revision(s):

Revision #5 to TR18-194 | 4th March 2019 21:08

Anti-concentration in most directions





Revision #5
Authors: Amir Yehudayoff
Accepted on: 4th March 2019 21:08
Downloads: 1007
Keywords: 


Abstract:

We prove anti-concentration bounds for the inner product of two independent random vectors. For example, we show that if $A,B$ are subsets of the cube $\{\pm 1\}^n$ with $|A| \cdot |B| \geq 2^{1.01 n}$, and $X \in A$ and $Y \in B$ are sampled independently and uniformly, then the inner product $\langle X, Y\rangle$ takes on any fixed value with probability at most $O(\tfrac{1}{\sqrt{n}})$. Extending Halász's work, we prove stronger bounds when the choices for $x$ are unstructured. We also describe applications to communication complexity, randomness extraction and additive combinatorics.
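
As a quick illustration (not from the paper), the following Monte Carlo sketch estimates the largest point probability of $\langle X, Y\rangle$ for two hypothetical dense subsets of $\{\pm 1\}^n$ and compares it to $1/\sqrt{n}$; the particular sets used here are arbitrary examples chosen only so that $|A| \cdot |B| \geq 2^{1.01 n}$ holds.

# Monte Carlo sketch (illustrative only): estimate max_k Pr[<X,Y> = k]
# for X uniform over A = {+1,-1}^n and Y uniform over a hypothetical
# dense subset B (first coordinate fixed to +1, so |A|*|B| = 2^(2n-1)).
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n, trials = 400, 200_000

X = rng.choice([-1, 1], size=(trials, n))      # uniform over the full cube A
Y = rng.choice([-1, 1], size=(trials, n))
Y[:, 0] = 1                                    # restrict to B: first coordinate is +1

inner = np.einsum('ij,ij->i', X, Y)            # <X, Y> for each independent trial
max_point_prob = Counter(inner.tolist()).most_common(1)[0][1] / trials
print(f"n={n}: estimated max point probability {max_point_prob:.4f}, "
      f"1/sqrt(n) = {1/np.sqrt(n):.4f}")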



Changes to previous version:

More general scenarios.


Revision #4 to TR18-194 | 19th January 2019 02:50

Anti-concentration in most directions





Revision #4
Authors: Anup Rao, Amir Yehudayoff
Accepted on: 19th January 2019 02:50
Downloads: 719
Keywords: 


Abstract:

We prove an anti-concentration bound for the inner product of two independent random vectors in the discrete cube. If, for example, $A,B$ are subsets of the cube $\{\pm 1\}^n$ with $|A| \cdot |B| \geq 2^{1.01 n}$, and $X \in A$ and $Y \in B$ are sampled independently and uniformly, then $\sum_i X_i Y_i$ takes on any fixed value with probability at most $O(\frac{1}{\sqrt{n}})$. The proof provides a general framework for establishing anti-concentration in discrete domains. We describe applications to communication complexity, randomness extraction and additive combinatorics.


Revision #3 to TR18-194 | 18th January 2019 23:20

Anti-concentration in most directions





Revision #3
Authors: Amir Yehudayoff, Anup Rao
Accepted on: 18th January 2019 23:20
Downloads: 605
Keywords: 


Abstract:

We prove an anti-concentration bound for the inner product of two independent random vectors in the discrete cube. If, for example, $A,B$ are subsets of the cube $\{\pm 1\}^n$ with $|A| \cdot |B| \geq 2^{1.01 n}$, and $X \in A$ and $Y \in B$ are sampled independently and uniformly, then $\sum_i X_i Y_i$ takes on any fixed value with probability at most $O(1/\sqrt{n})$. The proof provides a general framework for establishing anti-concentration in discrete domains. We describe applications to communication complexity, randomness extraction and additive combinatorics.



Changes to previous version:

A sharp bound on the number of directions where anti-concentration does not hold.


Revision #2 to TR18-194 | 18th January 2019 23:15

Anti-concentration in most directions





Revision #2
Authors: Anup Rao, Amir Yehudayoff
Accepted on: 18th January 2019 23:15
Downloads: 668
Keywords: 


Abstract:

We prove an anti-concentration bound for the inner product of two independent random vectors in the discrete cube. If, for example, $A,B$ are subsets of the cube $\{\pm 1\}^n$ with $|A| \cdot |B| \geq 2^{1.01 n}$, and $X \in A$ and $Y \in B$ are sampled independently and uniformly, then $\langle X, Y\rangle$ takes on any fixed value with probability at most $O(\tfrac{1}{\sqrt{n}})$. The proof provides a general framework for establishing anti-concentration in discrete domains. We describe applications to communication complexity, randomness extraction and additive combinatorics.



Changes to previous version:

A sharp bound on the number of directions where anti-concentration does not hold.


Revision #1 to TR18-194 | 28th November 2018 01:37

Anti-concentration in most directions





Revision #1
Authors: Amir Yehudayoff
Accepted on: 28th November 2018 01:37
Downloads: 831
Keywords: 


Abstract:

We prove anti-concentration for the inner product of two independent random vectors in the discrete cube. Our results imply Chakrabarti and Regev's lower bound on the randomized communication complexity of the gap-Hamming problem. They are also meaningful in the context of randomness extraction. The proof provides a framework for establishing anti-concentration in discrete domains. The argument has two components: a local component that uses harmonic analysis, and a global ("information-theoretic") component.



Changes to previous version:

Conversations with Anup Rao led to a sharp bound with a simpler proof. Some calculations from the previous version may be useful elsewhere.


Paper:

TR18-194 | 15th November 2018 20:15

Anti-concentration in most directions





TR18-194
Authors: Amir Yehudayoff
Publication: 18th November 2018 18:01
Downloads: 1305
Keywords: 


Abstract:

We prove anti-concentration for the inner product of two independent random vectors in the discrete cube. Our results imply Chakrabarti and Regev's lower bound on the randomized communication complexity of the gap-Hamming problem. They are also meaningful in the context of randomness extraction. The proof provides a framework for establishing anti-concentration in discrete domains. The argument has two components: a local component that uses harmonic analysis, and a global ("information-theoretic") component.
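
For orientation only (a standard textbook computation, not the paper's proof), the local, harmonic-analytic component can be illustrated in the simplest case where $x \in \{\pm 1\}^n$ is a fixed direction and $Y$ is uniform over the whole cube: by Fourier inversion for integer-valued random variables,
\[
\Pr\big[\langle x, Y\rangle = k\big]
= \frac{1}{2\pi}\int_{-\pi}^{\pi} \mathbb{E}\, e^{i\theta \langle x, Y\rangle}\, e^{-i\theta k}\, d\theta
\le \frac{1}{2\pi}\int_{-\pi}^{\pi} \prod_{j=1}^{n} \big|\cos(\theta x_j)\big|\, d\theta
= \frac{1}{2\pi}\int_{-\pi}^{\pi} |\cos\theta|^{n}\, d\theta
= O\Big(\tfrac{1}{\sqrt{n}}\Big),
\]
whereas the paper addresses the harder setting in which $Y$ is only known to come from a large subset of the cube and the bound must hold for most directions $x$.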



ISSN 1433-8092