ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR22-097 | 3rd July 2022 17:51

Unstructured Hardness to Average-Case Randomness





Authors: Lijie Chen, Ron D. Rothblum, Roei Tell
Publication: 8th July 2022 23:29
Downloads: 1149


Abstract:

The leading technical approach to uniform hardness-to-randomness over the last two decades has faced several well-known barriers, which caused results to rely on overly strong hardness assumptions while still yielding suboptimal conclusions.

In this work we show uniform hardness-to-randomness results that *simultaneously break through all of the known barriers*. Specifically, consider any one of the following three assumptions:

1. For some $\epsilon>0$ there exists a function $f$ computable by uniform circuits of size $2^{O(n)}$ and depth $2^{o(n)}$ such that $f$ is hard for probabilistic time $2^{\epsilon\cdot n}$.

2. For every $c\in\mathbb{N}$ there exists a function $f$ computable by logspace-uniform circuits of polynomial size and depth $n^2$ such that every probabilistic algorithm running in time $n^{c}$ fails to compute $f$ on a $(1/n)$-fraction of the inputs.

3. For every $c\in\mathbb{N}$ there exists a logspace-uniform family of arithmetic formulas of degree $n^2$ over a field of size $\mathrm{poly}(n)$ such that no probabilistic algorithm running in time $n^{c}$ can evaluate the family on worst-case inputs.

Assuming any one of these hypotheses, where the hardness holds for every sufficiently large input length $n\in\mathbb{N}$, we deduce that $\mathcal{RP}$ can be derandomized in *polynomial time* and on *all input lengths*, on average. Furthermore, under the first assumption we also show that $\mathcal{BPP}$ can be derandomized in polynomial time, on average and on all input lengths, with logarithmically many advice bits.
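In schematic notation (the "avg" subscript and advice notation below are informal shorthand for the average-case, all-input-lengths conclusions stated above, not notation from the paper), the main implications can be summarized as:

```latex
% Schematic summary of the main results; notation is informal shorthand.
% "avg" marks average-case derandomization on all input lengths,
% and "/O(\log n)" marks logarithmically many bits of advice.
\[
  \text{Hypothesis 1} \;\lor\; \text{Hypothesis 2} \;\lor\; \text{Hypothesis 3}
  \;\Longrightarrow\;
  \mathcal{RP} \subseteq \mathcal{P}_{\mathrm{avg}},
\]
\[
  \text{Hypothesis 1}
  \;\Longrightarrow\;
  \mathcal{BPP} \subseteq \mathcal{P}_{\mathrm{avg}}/O(\log n).
\]
```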

On the way to these results we also resolve two related open problems. First, we obtain an *optimal worst-case to average-case reduction* for computing problems in linear space by uniform probabilistic algorithms; this result builds on a new instance checker based on the doubly efficient proof system of Goldwasser, Kalai, and Rothblum (J. ACM, 2015). Second, we resolve the main open problem in the work of Carmosino, Impagliazzo, and Sabin (ICALP 2018), by deducing derandomization from weak and general fine-grained hardness hypotheses.
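In schematic form (the time bound $T(n)$ and the hardness fraction below are illustrative placeholders; the abstract does not specify the quantitative parameters), such a worst-case to average-case reduction asserts an implication of the shape:

```latex
% Schematic shape of a worst-case to average-case reduction for linear space.
% T(n) and the exponent \Omega(1) are illustrative placeholders.
\[
  \exists\, f \in \mathcal{SPACE}[O(n)]
  \text{ worst-case hard for probabilistic time } T(n)
\]
\[
  \;\Longrightarrow\;
  \exists\, f' \in \mathcal{SPACE}[O(n)]
  \text{ average-case hard for probabilistic time } T(n)^{\Omega(1)}.
\]
```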



ISSN 1433-8092