
#### Optimal Coding for Randomized Kolmogorov Complexity and Its Applications

TR24-155 | 11th October 2024

Shuichi Hirahara, Zhenjian Lu, Mikito Nanashima

#### Improved Condensers for Chor-Goldreich Sources

TR24-154 | 10th October 2024

Jesse Goodman, Xin Li, David Zuckerman

#### The Communication Complexity of Approximating Matrix Rank

TR24-152 | 5th October 2024

Alexander A. Sherstov, Andrey Storozhenko


Shuichi Hirahara, Zhenjian Lu, Mikito Nanashima

The coding theorem for Kolmogorov complexity states that any string sampled from a computable distribution has a description length close to its information content. A coding theorem for resource-bounded Kolmogorov complexity is the key to obtaining fundamental results in average-case complexity, yet whether any samplable distribution admits a coding theorem ...
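For background, the classical (time-unbounded) coding theorem the abstract refers to can be stated as follows; this formulation is standard textbook material, not taken from the report itself:

```latex
% Coding theorem (standard background): for every computable probability
% distribution D and every string x with D(x) > 0,
\[
  K(x) \;\le\; \log_2 \frac{1}{D(x)} + c_D ,
\]
% where K denotes (prefix-free) Kolmogorov complexity and the constant
% c_D depends on the distribution D but not on the string x.
```

The report's subject is the resource-bounded analogue: whether such a bound survives when the description must be decompressed efficiently and the distribution is merely samplable.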

Jesse Goodman, Xin Li, David Zuckerman

One of the earliest models of weak randomness is the Chor-Goldreich (CG) source. A $(t,n,k)$-CG source is a sequence of random variables $\mathbf{X}=(\mathbf{X}_1,\dots,\mathbf{X}_t) \sim (\{0,1\}^n)^t$, where each $\mathbf{X}_i$ has min-entropy $k$ conditioned on any fixing of $\mathbf{X}_1,\dots,\mathbf{X}_{i-1}$. Chor and Goldreich proved that there is no deterministic way to extract randomness ...
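The definition above can be made concrete with a small sketch. The sampler below is a toy illustration (the subset-selection rule and all function names are my own, not from the paper): each block is uniform over some $2^k$-element subset of $\{0,1\}^n$, where the subset may depend adversarially on the previously fixed blocks, so every conditional block distribution has min-entropy exactly $k$.

```python
import math
import random

def min_entropy(dist):
    """Min-entropy H_inf(D) = -log2 max_x D(x), for D given as {outcome: prob}."""
    return -math.log2(max(dist.values()))

def cg_block(prefix, n, k, seed=0):
    """Toy conditional block of a (t, n, k)-CG source: uniform over a
    2^k-element subset of {0,1}^n.  The subset may depend on the prefix
    of earlier blocks (here, just via a hash); requires k <= n."""
    rng = random.Random(hash((prefix, seed)))
    support = rng.sample(range(2 ** n), 2 ** k)
    return {format(x, f"0{n}b"): 2.0 ** (-k) for x in support}

def sample_cg_source(t, n, k):
    """Sample X = (X_1, ..., X_t) block by block; each conditional
    distribution has min-entropy exactly k."""
    blocks = []
    for _ in range(t):
        dist = cg_block(tuple(blocks), n, k)
        blocks.append(random.choices(list(dist), weights=dist.values())[0])
    return blocks

print(min_entropy(cg_block((), n=4, k=2)))  # -> 2.0
```

Note that the *joint* source has min-entropy up to $tk$, but it is spread across blocks; the impossibility result quoted in the abstract is about extracting from this conditional structure deterministically.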

Alexander A. Sherstov, Andrey Storozhenko

We fully determine the communication complexity of approximating matrix rank, over any finite field $\mathbb{F}$. We study the most general version of this problem, where $0\leq r < R\leq n$ are given integers, Alice and Bob's inputs are matrices $A,B\in\mathbb{F}^{n\times n}$, respectively, and they need to distinguish between the cases ...
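To make the gap problem concrete, the sketch below computes rank over $\mathbb{F} = \mathrm{GF}(2)$ by Gaussian elimination and decides "rank $\leq r$" versus "rank $\geq R$" for a single matrix. The single-matrix referee and its name are illustrative assumptions for exposition (in the paper, the rank being approximated is determined by both players' inputs); only the GF(2) rank routine itself is standard.

```python
def rank_gf2(rows):
    """Rank of a matrix over GF(2); each row is encoded as an int bitmask."""
    pivots = {}  # position of highest set bit -> basis row with that pivot
    rank = 0
    for r in rows:
        while r:
            hb = r.bit_length() - 1
            if hb in pivots:
                r ^= pivots[hb]   # eliminate the leading bit of r
            else:
                pivots[hb] = r    # r contributes a new pivot
                rank += 1
                break
    return rank

def matrix_to_rows(M):
    """Encode a 0/1 matrix (list of lists) as row bitmasks."""
    return [int("".join(map(str, row)), 2) for row in M]

def approx_rank_referee(M, r, R):
    """Hypothetical referee for the gap problem on one matrix M:
    'low' if rank(M) <= r, 'high' if rank(M) >= R; matrices with
    r < rank(M) < R violate the promise."""
    rk = rank_gf2(matrix_to_rows(M))
    if rk <= r:
        return "low"
    if rk >= R:
        return "high"
    return "promise violated"

# A rank-1 matrix vs the identity (rank 3), with gap parameters r=1, R=3:
ones = [[1, 1, 1]] * 3
eye = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(approx_rank_referee(ones, 1, 3))  # -> low
print(approx_rank_referee(eye, 1, 3))   # -> high
```

The communication question is how few bits Alice and Bob must exchange to decide this promise when the matrix is split between them, rather than held by one party as above.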
