Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Reports tagged with Shannon entropy:
TR08-106 | 12th November 2008
Jack H. Lutz

A Divergence Formula for Randomness and Dimension

If $S$ is an infinite sequence over a finite alphabet $\Sigma$ and $\beta$ is a probability measure on $\Sigma$, then the {\it dimension} of $S$ with respect to $\beta$, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension $\dim(S)$ when $\beta$ is ... more >>>
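The basic ingredient in dimension formulas of this kind is the Shannon entropy of the measure $\beta$ on the finite alphabet $\Sigma$. A minimal sketch (function name ours, not from the paper), computing it for a measure given as a dictionary:

```python
import math

def shannon_entropy(beta):
    """Shannon entropy, in bits, of a probability measure on a finite alphabet.

    Terms with probability 0 contribute nothing, by the usual 0*log(0) = 0
    convention.
    """
    return -sum(p * math.log2(p) for p in beta.values() if p > 0)

# Example: a biased measure on the binary alphabet {0, 1}.
beta = {"0": 0.25, "1": 0.75}
print(shannon_entropy(beta))  # ~0.8113 bits
```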

TR13-001 | 2nd January 2013
Gillat Kol, Ran Raz

Interactive Channel Capacity

Revisions: 1

We study the interactive channel capacity of an $\epsilon$-noisy channel. The interactive channel capacity $C(\epsilon)$ is defined as the minimal ratio between the communication complexity of a problem over a noiseless channel and the communication complexity of the same problem over the binary symmetric channel with noise rate $\epsilon$, where ... more >>>
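The natural baseline for comparison is the classical (non-interactive) Shannon capacity of the binary symmetric channel with noise rate $\epsilon$, which is $1 - H(\epsilon)$ for the binary entropy function $H$. A short sketch (function names ours) computing that baseline:

```python
import math

def binary_entropy(eps):
    """Binary entropy H(eps) in bits, with H(0) = H(1) = 0 by convention."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def bsc_capacity(eps):
    """Classical Shannon capacity 1 - H(eps) of the binary symmetric
    channel with crossover probability eps (non-interactive setting)."""
    return 1.0 - binary_entropy(eps)

# Capacity shrinks as the noise rate grows toward 1/2.
for eps in (0.01, 0.1, 0.25):
    print(eps, bsc_capacity(eps))
```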

TR14-112 | 23rd August 2014
Louay Bazzi

Entropy of weight distributions of small-bias spaces and pseudobinomiality

Revisions: 1

A classical bound in information theory asserts that small $L_1$-distance between probability distributions implies a small difference in Shannon entropy, but the converse need not be true. We show that if a probability distribution on $\{0,1\}^n$ has small bias, then the converse holds for its weight distribution in the proximity of the ... more >>>
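Why the converse fails in general can be seen from a two-line example: two distributions can have identical Shannon entropy while their $L_1$-distance is maximal. A minimal sketch (function names ours) exhibiting such a pair:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict."""
    return -sum(x * math.log2(x) for x in p.values() if x > 0)

def l1_distance(p, q):
    """L1 distance sum_k |p(k) - q(k)| over the union of the supports."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

# P and Q are each uniform on two points, but on disjoint supports:
# their entropies agree (1 bit each) while the L1 distance is the
# maximum possible value 2. Zero entropy gap does not imply small L1.
P = {"a": 0.5, "b": 0.5}
Q = {"c": 0.5, "d": 0.5}
print(entropy(P), entropy(Q))  # 1.0 1.0
print(l1_distance(P, Q))       # 2.0
```

The paper's point is that under a small-bias assumption this pathology is ruled out for weight distributions, so the converse direction becomes available.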

ISSN 1433-8092