ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

Revision(s):

Revision #3 to TR22-025 | 13th May 2022 07:35

Derandomization from Time-Space Tradeoffs





Revision #3
Authors: Oliver Korten
Accepted on: 13th May 2022 07:35
Downloads: 438
Keywords: 


Abstract:

A recurring challenge in the theory of pseudorandomness and circuit complexity is the explicit construction of ``incompressible strings,'' i.e., finite objects which lack a specific type of structure or simplicity. In most cases, there is an associated NP search problem which we call the ``compression problem,'' where we are given a candidate object and must either find a compressed/structured representation of it or determine that none exists. For a particular notion of compressibility, a natural question is whether an efficient algorithm for the compression problem would aid us in the construction of incompressible objects. Consider the following two instances of this question:

(1) Does an efficient algorithm for circuit minimization imply efficient constructions of hard truth tables?

(2) Does an efficient algorithm for factoring integers imply efficient constructions of large prime numbers?

In this work, we connect these kinds of questions to the long-standing challenge of proving space/time tradeoffs for Turing machines, and proving stronger separations between the RAM and 1-tape computation models. In particular, one of our main theorems shows that modest space/time tradeoffs for deterministic exponential time, or separations between basic Turing machine memory models, would imply a positive answer to both (1) and (2).
These results apply to the derandomization of a wider class of explicit construction problems, where we have some efficient compression scheme that encodes $n$-bit strings using $<n$ bits, and we aim to construct an $n$-bit string which cannot be recovered from its encoding.
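This explicit construction problem can be made concrete with a small sketch (not from the paper; the scheme below is a hypothetical toy compressor used purely for illustration). Given a compression map from $n$-bit strings to $(n-1)$-bit strings and a corresponding decompressor, the pigeonhole principle guarantees some string that does not survive the round trip; a brute-force search finds one:

```python
from itertools import product

def find_incompressible(n, compress, decompress):
    """Return an n-bit string x with decompress(compress(x)) != x.
    Since compress maps 2^n strings into at most 2^(n-1) codewords,
    the pigeonhole principle guarantees such a witness exists."""
    for bits in product("01", repeat=n):
        x = "".join(bits)
        if decompress(compress(x)) != x:
            return x
    raise AssertionError("unreachable: pigeonhole guarantees a witness")

# Toy scheme (hypothetical): compress by dropping the last bit,
# decompress by appending a '0'.
compress = lambda x: x[:-1]      # n bits -> n-1 bits
decompress = lambda y: y + "0"

x = find_incompressible(4, compress, decompress)
assert decompress(compress(x)) != x  # x cannot be recovered from its encoding
```

Note the search above takes exponential time in $n$; the point of the abstract's question is whether an *efficient* algorithm for the compression problem would yield an efficient (derandomized) construction of such a witness.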



Changes to previous version:

Corrected an error in the proof of Theorem 8.


Revision #2 to TR22-025 | 11th May 2022 06:32

Derandomization from Time-Space Tradeoffs





Revision #2
Authors: Oliver Korten
Accepted on: 11th May 2022 06:32
Downloads: 147
Keywords: 


Abstract:

A recurring challenge in the theory of pseudorandomness and circuit complexity is the explicit construction of ``incompressible strings,'' i.e., finite objects which lack a specific type of structure or simplicity. In most cases, there is an associated NP search problem which we call the ``compression problem,'' where we are given a candidate object and must either find a compressed/structured representation of it or determine that none exists. For a particular notion of compressibility, a natural question is whether an efficient algorithm for the compression problem would aid us in the construction of incompressible objects. Consider the following two instances of this question:

(1) Does an efficient algorithm for circuit minimization imply efficient constructions of hard truth tables?

(2) Does an efficient algorithm for factoring integers imply efficient constructions of large prime numbers?

In this work, we connect these kinds of questions to the long-standing challenge of proving space/time tradeoffs for Turing machines, and proving stronger separations between the RAM and 1-tape computation models. In particular, one of our main theorems shows that modest space/time tradeoffs for deterministic exponential time, or separations between basic Turing machine memory models, would imply a positive answer to both (1) and (2).
These results apply to the derandomization of a wider class of explicit construction problems, where we have some efficient compression scheme that encodes $n$-bit strings using $<n$ bits, and we aim to construct an $n$-bit string which cannot be recovered from its encoding.



Changes to previous version:

Title change.


Revision #1 to TR22-025 | 11th May 2022 06:30

Efficient Low-Space Simulations From the Failure of the Weak Pigeonhole Principle





Revision #1
Authors: Oliver Korten
Accepted on: 11th May 2022 06:30
Downloads: 129
Keywords: 


Abstract:

A recurring challenge in the theory of pseudorandomness and circuit complexity is the explicit construction of ``incompressible strings,'' i.e., finite objects which lack a specific type of structure or simplicity. In most cases, there is an associated NP search problem which we call the ``compression problem,'' where we are given a candidate object and must either find a compressed/structured representation of it or determine that none exists. For a particular notion of compressibility, a natural question is whether an efficient algorithm for the compression problem would aid us in the construction of incompressible objects. Consider the following two instances of this question:

(1) Does an efficient algorithm for circuit minimization imply efficient constructions of hard truth tables?

(2) Does an efficient algorithm for factoring integers imply efficient constructions of large prime numbers?

In this work, we connect these kinds of questions to the long-standing challenge of proving space/time tradeoffs for Turing machines, and proving stronger separations between the RAM and 1-tape computation models. In particular, one of our main theorems shows that modest space/time tradeoffs for deterministic exponential time, or separations between basic Turing machine memory models, would imply a positive answer to both (1) and (2).
These results apply to the derandomization of a wider class of explicit construction problems, where we have some efficient compression scheme that encodes $n$-bit strings using $<n$ bits, and we aim to construct an $n$-bit string which cannot be recovered from its encoding.



Changes to previous version:

Changed title, modified some terminology, fixed typos, and added Section 6.


Paper:

TR22-025 | 15th February 2022 00:23

Efficient Low-Space Simulations From the Failure of the Weak Pigeonhole Principle





TR22-025
Authors: Oliver Korten
Publication: 20th February 2022 14:24
Downloads: 391
Keywords: 


Abstract:

A recurring challenge in the theory of pseudorandomness and circuit complexity is the explicit construction of ``incompressible strings,'' i.e., finite objects which lack a specific type of structure or simplicity. In most cases, there is an associated NP search problem which we call the ``compression problem,'' where we are given a candidate object and must either find a compressed/structured representation of it or determine that none exists. For a particular notion of compressibility, a natural question is whether an efficient algorithm for the compression problem would aid us in the construction of incompressible objects. Consider the following two instances of this question:

(1) Does an efficient algorithm for circuit minimization imply efficient constructions of hard truth tables?

(2) Does an efficient algorithm for factoring integers imply efficient constructions of large prime numbers?

In this work, we connect these kinds of questions to the long-standing challenge of proving space/time tradeoffs for Turing machines, and proving stronger separations between the RAM and 1-tape computation models. In particular, one of our main theorems shows that modest space/time tradeoffs for deterministic exponential time, or separations between basic Turing machine memory models, would imply a positive answer to both (1) and (2).
These results apply to a more general class of explicit construction problems, where we have some efficient compression scheme that encodes $n$-bit strings using $<n$ bits, and we aim to construct an $n$-bit string which cannot be recovered from its encoding.



ISSN 1433-8092 | Imprint