We present the first efficient averaging sampler that achieves asymptotically optimal randomness complexity and near-optimal sample complexity. For any $\delta, \varepsilon > 0$ and any constant $\alpha > 0$, our sampler uses $O(m + \log (1 / \delta))$ random bits to output $t = O\left(\left(\frac{1}{\varepsilon^2} \log \frac{1}{\delta}\right)^{1 + \alpha}\right)$ samples $Z_1, \dots, Z_t \in \{0, 1\}^m$ such that for any function $f: \{0, 1\}^m \to [0, 1]$,
\[
\Pr\left[\left|\frac{1}{t}\sum_{i=1}^t f(Z_i) - \mathbb{E}[f]\right| \leq \varepsilon\right] \geq 1 - \delta.
\]
The sample complexity is optimal up to the $O\left(\left(\frac{1}{\varepsilon^2} \log \frac{1}{\delta}\right)^{\alpha}\right)$ factor, and the randomness complexity is optimal up to a constant factor.
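As a point of reference for the guarantee above, the following minimal sketch (in Python; the function names and parameter values are illustrative, not taken from the paper) shows the naive baseline: plain i.i.d. uniform sampling already attains the $\Theta\left(\frac{1}{\varepsilon^2} \log \frac{1}{\delta}\right)$ sample complexity up to constants, by Hoeffding's inequality, but spends $t \cdot m$ random bits rather than $O(m + \log(1/\delta))$.

```python
import math
import random

def iid_averaging_sampler(m, t, rng):
    """Baseline sampler: t i.i.d. uniform samples from {0,1}^m.

    Uses t * m random bits in total, in contrast to the
    O(m + log(1/delta)) bits of the sampler in the abstract.
    """
    return [tuple(rng.randint(0, 1) for _ in range(m)) for _ in range(t)]

def estimate_mean(f, samples):
    """Empirical average (1/t) * sum_i f(Z_i)."""
    return sum(f(z) for z in samples) / len(samples)

# By Hoeffding's inequality, t = ceil(log(2/delta) / (2 * eps^2)) i.i.d.
# samples give |estimate - E[f]| <= eps with probability >= 1 - delta.
eps, delta, m = 0.05, 0.01, 16
t = math.ceil(math.log(2 / delta) / (2 * eps ** 2))

def f(z):
    # A test function f: {0,1}^m -> [0,1]: the fraction of ones, so E[f] = 1/2.
    return sum(z) / len(z)

est = estimate_mean(f, iid_averaging_sampler(m, t, random.Random(0)))
```

With these parameters $t = 1060$, and the empirical average lands within $\varepsilon$ of $\mathbb{E}[f] = 1/2$ except with probability well below $\delta$.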
We use known connections with randomness extractors and list-decodable codes to give applications to these objects. Specifically, we give the first extractor construction with optimal seed length up to an arbitrarily small constant factor bigger than 1, when the min-entropy $k = \beta n$ for a large enough constant $\beta < 1$.
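The connection to extractors is classical (due to Zuckerman): averaging samplers and extractors are essentially the same object. Roughly, and omitting small parameter losses in each direction, a function $\mathrm{Ext}: \{0,1\}^n \times \{0,1\}^d \to \{0,1\}^m$ is a $(k, \varepsilon)$-extractor with $k \approx n - \log(1/\delta)$ if and only if the map sending $x \in \{0,1\}^n$ to the $t = 2^d$ samples
\[
Z_i = \mathrm{Ext}(x, i), \qquad i \in \{0,1\}^d,
\]
is an averaging sampler with accuracy $\varepsilon$ and confidence error $\delta$. Under this translation, a sampler with low randomness complexity $n$ and few samples $t$ yields an extractor with high min-entropy rate and short seed length $d = \log t$, and vice versa.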