
We investigate the sample complexity of mutual information and conditional mutual information testing. For conditional mutual information testing, given access to independent samples of a triple of random variables $(A, B, C)$ with unknown distribution, we want to distinguish between two cases: (i) $A$ and $C$ are conditionally independent, i.e., ...
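As a point of reference for the quantity being tested, the following is a minimal plug-in estimate of $I(A;C\mid B)$ from i.i.d. samples of $(A,B,C)$. It is only an illustrative sketch (the function name `plugin_cmi` is ours), not the tester or the sample-complexity bounds studied in the paper.

```python
from collections import Counter
from math import log2

def plugin_cmi(samples):
    """Plug-in estimate of I(A; C | B) from a list of (a, b, c) samples.

    Illustrative only: this is the naive empirical estimator of the
    conditional mutual information, not the paper's testing algorithm.
    """
    n = len(samples)
    p_abc = Counter(samples)
    p_ab = Counter((a, b) for a, b, _ in samples)
    p_bc = Counter((b, c) for _, b, c in samples)
    p_b = Counter(b for _, b, _ in samples)

    cmi = 0.0
    for (a, b, c), cnt in p_abc.items():
        # Empirical version of p(a,b,c) * log[ p(a,b,c) p(b) / (p(a,b) p(b,c)) ];
        # the factors of 1/n cancel, leaving raw counts.
        cmi += (cnt / n) * log2(cnt * p_b[b] / (p_ab[(a, b)] * p_bc[(b, c)]))
    return cmi
```

Under case (i), where $A$ and $C$ are conditionally independent, this estimate converges to $0$ as the number of samples grows.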
Given a sequence of $N$ independent sources $\mathbf{X}_1,\mathbf{X}_2,\dots,\mathbf{X}_N\sim\{0,1\}^n$, how many of them must be good (i.e., contain some min-entropy) in order to extract a uniformly random string? This question was first raised by Chattopadhyay, Goodman, Goyal and Li (STOC '20), motivated by applications in cryptography, distributed computing, and the unreliable ...
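For concreteness, a source $\mathbf{X}$ over $\{0,1\}^n$ has min-entropy $H_\infty(\mathbf{X}) = -\log_2 \max_x \Pr[\mathbf{X} = x]$, and a "good" source is one with some guaranteed lower bound on this quantity. A small sketch, where the threshold $k$ and function names are illustrative rather than taken from the abstract:

```python
from math import log2

def min_entropy(dist):
    """Min-entropy H_inf(X) = -log2(max_x Pr[X = x]) of a distribution
    given as a dict mapping outcomes in {0,1}^n to probabilities."""
    return -log2(max(dist.values()))

def is_good(dist, k):
    """Illustrative notion of a 'good' source: at least k bits of min-entropy."""
    return min_entropy(dist) >= k
```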
A generalized polymorphism of a predicate $P \subseteq \{0,1\}^m$ is a tuple of functions $f_1,\dots,f_m\colon \{0,1\}^n \to \{0,1\}$ satisfying the following property: If $x^{(1)},\dots,x^{(m)} \in \{0,1\}^n$ are such that $(x^{(1)}_i,\dots,x^{(m)}_i) \in P$ for all $i$, then also $(f_1(x^{(1)}),\dots,f_m(x^{(m)})) \in P$.
We show that if $f_1,\dots,f_m$ satisfy this property for most ...
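To make the definition concrete, a brute-force check of the generalized-polymorphism property for small $n$ and $m$ is sketched below; the function name and the not-all-equal example are illustrative, not taken from the paper.

```python
from itertools import product

def is_generalized_polymorphism(P, fs, n):
    """Brute-force check (feasible only for small n, m) that the tuple
    fs = (f_1, ..., f_m) is a generalized polymorphism of P subset {0,1}^m.

    P is a set of m-tuples over {0,1}; each f_i maps an n-tuple over {0,1}
    to {0,1}.
    """
    m = len(fs)
    # Enumerate all choices of x^(1), ..., x^(m) in ({0,1}^n)^m.
    for xs in product(product((0, 1), repeat=n), repeat=m):
        # Premise: every column (x^(1)_i, ..., x^(m)_i) lies in P.
        if all(tuple(x[i] for x in xs) in P for i in range(n)):
            # Conclusion: (f_1(x^(1)), ..., f_m(x^(m))) must also lie in P.
            if tuple(f(x) for f, x in zip(fs, xs)) not in P:
                return False
    return True

# Example: P = "not all equal" on 3 bits. Dictator functions are always
# generalized polymorphisms; majority is not (it can map NAE columns to 000).
NAE = {p for p in product((0, 1), repeat=3) if len(set(p)) > 1}
dictator = lambda x: x[0]
maj = lambda x: int(sum(x) >= 2)
print(is_generalized_polymorphism(NAE, (dictator,) * 3, n=3))  # True
print(is_generalized_polymorphism(NAE, (maj,) * 3, n=3))       # False
```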