ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





TR25-164 | 2nd November 2025 18:42

Constant-time source decoding





Authors: Jordan Horacsek, Chin Ho Lee, Igor Shinkar, Emanuele Viola, Renfei Zhou
Publication: 2nd November 2025 21:11
Downloads: 78


Abstract:

We give versions of Shannon's coding theorem where the decoder runs in constant time:

- Let $D=(D_1,D_2,\ldots,D_n)$ be a product distribution where each $D_i$ has constant support and dyadic probability masses (i.e., of the form $a/2^b$ for integers $a,b$). Then $D$ can be sampled in constant time in the bit-probe model (equivalently, in NC$^0$) with randomness complexity $(h(D)+\varepsilon)n$, up to an exponentially small statistical error. The dyadic requirement is necessary.

- Every $p$-biased distribution can be sampled in constant time in the cell-probe model with randomness complexity $h(p)n + \sqrt{n} \cdot \mathrm{polylog}(n)$, up to a polynomially small statistical distance.

- We determine the tradeoffs between locality and statistical distance for sampling the $1/4$-biased distribution using non-trivial randomness complexity (e.g., $1.99n$). For 2 bit probes, essentially no non-trivial approximation is possible; for 3 bit probes, we give a sampler with $1/\mathrm{poly}(n)$ statistical distance and show that this is best possible; finally, 4 bit probes suffice for exponentially small distance.
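To make the first item concrete, here is a minimal sketch (not the paper's construction) of why dyadic masses admit constant-locality sampling: a mass $a/2^b$ with constant $b$ lets each output coordinate be a fixed lookup on its own $b$ random bits, so every coordinate probes only $b = O(1)$ bits. Note this simple scheme spends $bn$ random bits, not the near-entropy $(h(D)+\varepsilon)n$ achieved in the paper; the function names below are illustrative.

```python
import itertools
from collections import Counter

def make_table(numerators, b):
    """Map each b-bit pattern to a symbol: symbol i receives numerators[i]
    of the 2^b patterns, so Pr[symbol i] = numerators[i] / 2^b exactly."""
    table = [sym for sym, a in enumerate(numerators) for _ in range(a)]
    assert len(table) == 2 ** b, "masses must sum to 1"
    return table

def sample_coordinate(table, bits):
    """One output coordinate probes only its own b random bits."""
    idx = 0
    for bit in bits:
        idx = (idx << 1) | bit
    return table[idx]

# Example D_i with masses (1/4, 1/4, 1/2): numerators (1, 1, 2), b = 2.
table = make_table([1, 1, 2], b=2)

# Exhaustive check over all 2^b bit patterns recovers the masses exactly.
counts = Counter(sample_coordinate(table, bits)
                 for bits in itertools.product([0, 1], repeat=2))
# counts == Counter({0: 1, 1: 1, 2: 2})
```

Because $b$ is a constant, each output symbol is a function of $O(1)$ input bits, which is exactly the NC$^0$ / bit-probe locality requirement in the statement above.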
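For the third item, the trivial baseline explains why the non-trivial randomness bound (e.g., $1.99n$) matters: with $2n$ random bits, the $1/4$-biased distribution is sampled exactly by 2 bit probes per output, since the AND of two fresh fair coins is $1$ with probability exactly $1/4$. The lower bounds above therefore only apply once the randomness complexity drops below $2n$. A hypothetical sketch:

```python
import itertools

def quarter_biased(random_bits):
    """Output bit j is the AND of random bits 2j and 2j+1, so each output
    probes exactly 2 bits and equals 1 with probability 1/4.
    Randomness complexity is 2n -- the trivial bound, above the regime
    (e.g., 1.99n) where the paper's lower bounds apply."""
    n = len(random_bits) // 2
    return [random_bits[2 * j] & random_bits[2 * j + 1] for j in range(n)]

# Exhaustive check for n = 1: exactly 1 of the 4 seed patterns yields output 1.
ones = sum(quarter_biased(list(bits))[0]
           for bits in itertools.product([0, 1], repeat=2))
# ones == 1
```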



ISSN 1433-8092