ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR24-115 | 14th July 2024 00:39

On the Complexity of Avoiding Heavy Elements





Authors: Zhenjian Lu, Igor Oliveira, Hanlin Ren, Rahul Santhanam
Publication: 14th July 2024 00:39
Downloads: 376


Abstract:

We introduce and study the following natural total search problem, which we call the {\it heavy element avoidance} (Heavy Avoid) problem: given a distribution over N-bit strings, specified by a Boolean circuit that samples it, and a parameter \delta(N) \ge 1/\poly(N) fixed in advance, output an N-bit string whose probability under the distribution is less than \delta(N). We show that the complexity of Heavy Avoid is closely tied to frontier open questions in complexity theory about uniform randomized lower bounds and derandomization. Among other results, we show:
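To make the search problem concrete, here is a minimal empirical sketch (not from the paper; the helper names and the sampling-based heuristic are illustrative assumptions). Since at most 1/\delta(N) strings can be \delta(N)-heavy while there are 2^N strings in total, almost every string is a valid answer; the sketch estimates the heavy set from samples and returns any string outside it.

```python
import random
from collections import Counter

def heavy_avoid_empirical(sampler, n, delta, trials=None):
    """Illustrative sketch of the Heavy Avoid search problem: given
    black-box access to a sampler over n-bit strings and a threshold
    delta, output a string whose probability appears to be below delta.

    Hypothetical heuristic, not the paper's algorithm: at most 1/delta
    strings can be delta-heavy, so we estimate the heavy set by
    sampling and return any string outside it.
    """
    if trials is None:
        trials = int(10 / delta)  # enough samples to see heavy strings often
    counts = Counter(sampler() for _ in range(trials))
    # Strings whose empirical frequency suggests probability >= delta.
    heavy = {s for s, c in counts.items() if c / trials >= delta / 2}
    # The heavy set is small, so some candidate must fall outside it.
    for i in range(2 ** n):
        candidate = format(i, f"0{n}b")
        if candidate not in heavy:
            return candidate
    return None  # unreachable when 2**n exceeds the heavy-set size

# Example sampler: outputs "0000" with probability about 1/2,
# otherwise a uniformly random 4-bit string.
def skewed_sampler():
    if random.random() < 0.5:
        return "0000"
    return format(random.randrange(16), "04b")
```

For instance, `heavy_avoid_empirical(skewed_sampler, 4, 0.25)` should avoid the heavy string "0000", since every other string has probability about 1/32 < 0.25. Note that the paper's focus is on {\it deterministic} and {\it pseudodeterministic} algorithms for circuit-specified samplers, which this sampling-based sketch is not.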

1. For a wide range of circuit classes C, including ACC^0, TC^0, NC^1 and general Boolean circuits, EXP does not have uniform randomized C-circuits if and only if Heavy Avoid for uniform implicit C-samplers has efficient deterministic algorithms infinitely often. This gives the first {\it algorithmic characterization} of lower bounds for EXP against uniform randomized low-depth circuits. We show similar algorithmic characterizations for lower bounds in PSPACE, NP and EXP^{NP}.

2. Unconditionally, there are polynomial-time {\it pseudodeterministic} algorithms that work infinitely often for several variants of Heavy Avoid, such as for uniform samplers of small randomness complexity. In contrast, the existence of a similar algorithm that solves Heavy Avoid for arbitrary polynomial-time samplers would solve a long-standing problem about hierarchies for probabilistic time.

3. If there is a time and depth efficient deterministic algorithm for Heavy Avoid, then BPP = P. Without the depth-efficiency requirement in the assumption, we still obtain a non-trivial form of infinitely-often deterministic simulation of randomized algorithms. These results are shown using {\it non-black-box} reductions, and we argue that the use of non-black-box reductions is essential here.



ISSN 1433-8092