ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)




Thomas Weir Watson

The Computational Complexity of Randomness


University of California, Berkeley, 2013

Abstract

This dissertation explores the multifaceted interplay between efficient computation and probability distributions. We organize the aspects of this interplay according to whether the randomness occurs primarily at the level of the problem or the level of the algorithm, and orthogonally according to whether the output is random or the input is random.

Part I concerns settings where the problem's output is random. A sampling problem associates to each input x a probability distribution D(x), and the goal is to output a sample from D(x) (or at least get statistically close) when given x. Although sampling algorithms are fundamental tools in statistical physics, combinatorial optimization, and cryptography, and algorithms for a wide variety of sampling problems have been discovered, there has been comparatively little research viewing sampling through the lens of computational complexity. We contribute to the understanding of the power and limitations of efficient sampling by proving a time hierarchy theorem which shows, roughly, that "a little more time gives a lot more power to sampling algorithms."
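To make the sampling framework concrete, the following Python sketch spells out a toy sampling problem and the statistical (total variation) distance by which a sampler's output is compared to the target distribution D(x). It is a hypothetical illustration of the definitions only, not an example or construction from the dissertation.

    # Toy illustration (hypothetical, not from the dissertation) of a sampling
    # problem: the input x is an integer n, and D(x) is the uniform distribution
    # on {0, ..., n-1}.  A sampler is judged by the statistical (total variation)
    # distance between its output distribution and D(x).
    import random
    from collections import Counter

    def target_distribution(n):
        # D(x): uniform over {0, ..., n-1}, as a dict of probabilities.
        return {i: 1.0 / n for i in range(n)}

    def toy_sampler(n, rng=random):
        # A sampling algorithm for this toy problem.
        return rng.randrange(n)

    def statistical_distance(p, q):
        # Total variation distance: (1/2) * sum over outcomes of |p(o) - q(o)|.
        support = set(p) | set(q)
        return 0.5 * sum(abs(p.get(o, 0.0) - q.get(o, 0.0)) for o in support)

    def empirical_distribution(sampler, n, trials=100000):
        counts = Counter(sampler(n) for _ in range(trials))
        return {o: c / trials for o, c in counts.items()}

    if __name__ == "__main__":
        n = 8
        d = statistical_distance(target_distribution(n),
                                 empirical_distribution(toy_sampler, n))
        print("empirical statistical distance from D(x):", round(d, 4))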

Part II concerns settings where the algorithm's output is random. Even when the specification of a computational problem involves no randomness, one can still consider randomized algorithms that produce a correct output with high probability. A basic question is whether one can derandomize algorithms, i.e., reduce or eliminate their dependence on randomness without incurring much loss in efficiency. Although derandomizing arbitrary time-efficient algorithms can only be done assuming unproven complexity-theoretic conjectures, it is possible to unconditionally construct derandomization tools called pseudorandom generators for restricted classes of algorithms. We give an improved pseudorandom generator for a new class, which we call combinatorial checkerboards. The notion of pseudorandomness shows up in many contexts besides derandomization. The so-called Dense Model Theorem, which has its roots in the famous theorem of Green and Tao that the primes contain arbitrarily long arithmetic progressions, is a result about pseudorandomness that has turned out to be a useful tool in computational complexity and cryptography. At the heart of this theorem is a certain type of reduction, and in this dissertation we prove a strong lower bound on the advice complexity of such reductions, which is analogous to a list size lower bound for list-decoding of error-correcting codes.
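The pseudorandomness requirement can be made concrete with a small sketch. The following Python snippet, a deliberately naive hypothetical generator rather than the checkerboard generator of Chapter 3, measures the distinguishing advantage of a single statistical test; a pseudorandom generator for a class of tests must keep this advantage small for every test in the class.

    # Minimal sketch of the pseudorandomness requirement: for every test f in a
    # restricted class, |Pr[f(G(seed)) = 1] - Pr[f(uniform) = 1]| should be small.
    # The generator below is a deliberately naive (hypothetical) one; the parity
    # test distinguishes its output from uniform, so it fails to fool even this
    # simple class of tests.
    import itertools

    def parity_test(bits):
        # A simple statistical test: the parity of the bits.
        return sum(bits) % 2

    def toy_generator(seed_bits, n):
        # Hypothetical generator: repeat the seed cyclically to length n.
        return [seed_bits[i % len(seed_bits)] for i in range(n)]

    def advantage(test, generator, seed_len, n):
        # Exact distinguishing advantage, computed by enumerating all seeds and
        # all n-bit strings (feasible only for tiny parameters).
        p_gen = sum(test(generator(list(s), n))
                    for s in itertools.product([0, 1], repeat=seed_len)) / 2 ** seed_len
        p_uni = sum(test(list(x))
                    for x in itertools.product([0, 1], repeat=n)) / 2 ** n
        return abs(p_gen - p_uni)

    if __name__ == "__main__":
        # With seed_len = 3 and n = 12 the seed is repeated four times, so the
        # output always has even parity and the parity test achieves advantage 1/2.
        print("advantage of the parity test:",
              advantage(parity_test, toy_generator, seed_len=3, n=12))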

Part III concerns settings where the problem's input is random. We focus on the topic of randomness extraction, which is the problem of transforming a sample from a high-entropy but non-uniform probability distribution (that represents an imperfect physical source of randomness) into a uniformly distributed sample (which would then be suitable for use by a randomized algorithm). It is natural to model the input distribution as being sampled by an efficient algorithm (since randomness is generated by efficient processes in nature), and we give a construction of an extractor for the case where the input distribution's sampler is extremely efficient in parallel time. A related problem is to estimate the min-entropy ("amount of randomness") of a given parallel-time-efficient sampler, since this dictates how many uniformly random output bits one could hope to extract from it. We characterize the complexity of this problem, showing that it is "slightly harder than NP-complete."
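For reference, min-entropy has a simple definition: H_min(D) = -log2(max_x Pr[D = x]), and roughly H_min(D) nearly uniform bits is the most one can hope to extract from one sample of D. The short Python sketch below computes this standard quantity for two toy distributions; it is not the estimation procedure studied in Chapter 6.

    # Min-entropy of a distribution: H_min(D) = -log2(max_x Pr[D = x]).
    # Roughly, an extractor can hope to produce at most about H_min(D)
    # nearly uniform bits from one sample of D.
    import math

    def min_entropy(dist):
        # dist maps outcomes to probabilities (summing to 1).
        return -math.log2(max(dist.values()))

    if __name__ == "__main__":
        uniform_4bit = {i: 1 / 16 for i in range(16)}               # H_min = 4
        biased = {0: 0.5, **{i: 0.5 / 15 for i in range(1, 16)}}    # H_min = 1
        print(min_entropy(uniform_4bit), min_entropy(biased))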

Part IV concerns settings where the algorithm's input is random. Average-case complexity is the study of the power of algorithms that are allowed to fail with small probability over a randomly chosen input. This topic is motivated by cryptography and by modeling heuristics. A fundamental open question is whether the average-case hardness of NP is implied by the worst-case hardness of NP. We exhibit a new barrier to showing this implication, by proving that a certain general technique (namely, "relativizing proofs by reduction") cannot be used. We also prove results on hardness amplification, clarifying the quantitative relationship between the running time of an algorithm and the probability of failure over a random input (both of which are desirable to minimize).
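To give a feel for the quantitative tradeoff in that last sentence, the following toy calculation uses the classic error-based XOR intuition (not the errorless setting studied in Chapter 8): if each of k independently computed answers is wrong with some small probability, the XOR of the answers is wrong with probability tending to 1/2 as k grows, while the work grows by a factor of k.

    # Toy calculation illustrating the flavor of XOR-based hardness amplification:
    # if each of k independently computed answers is wrong with probability delta,
    # their XOR is wrong with probability (1 - (1 - 2*delta)**k) / 2, which tends
    # to 1/2 as k grows.  (Standard independent-error calculation, shown only to
    # illustrate the quantitative tradeoff; not a result from the dissertation.)

    def xor_failure_probability(delta, k):
        return (1 - (1 - 2 * delta) ** k) / 2

    if __name__ == "__main__":
        for k in (1, 10, 100, 1000):
            print(k, round(xor_failure_probability(0.01, k), 4))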


Table of Contents
1 Introduction
  1.1 Computational Complexity
  1.2 Randomness
  1.3 Our Contributions
    1.3.1 Part I
    1.3.2 Part II
    1.3.3 Part III
    1.3.4 Part IV

I The Problem's Output is Random

2 Time Hierarchies for Sampling Distributions
  2.1 Introduction
    2.1.1 Results
    2.1.2 Discussion
  2.2 Intuition for Theorem 2.1
  2.3 Proof of Theorem 2.1
  2.4 List-Decoding from Ubiquitous Informative Errors
    2.4.1 Discussion of the Model of Error-Correction
    2.4.2 Proof of Theorem 2.3
      2.4.2.1 A Combinatorial Lemma
      2.4.2.2 Code Construction

II The Algorithm's Output is Random

3 Pseudorandom Generators for Combinatorial Checkerboards
  3.1 Introduction
    3.1.1 Combinatorial Checkerboards
    3.1.2 Our Result
    3.1.3 Overview of the Proof
    3.1.4 Preliminaries
  3.2 Deriving Theorem 3.3 from Lemma 3.7 and Lemma 3.8
  3.3 The High-Weight Case
    3.3.1 Proof of Lemma 3.7
    3.3.2 Proof of Lemma 3.14
    3.3.3 Proof of the Tree Labeling Lemma
      3.3.3.1 Intuition
      3.3.3.2 Formal Argument
  3.4 The Low-Weight Case
    3.4.1 Proof of Lemma 3.8
    3.4.2 Dimension Reduction
    3.4.3 Proof of Lemma 3.36

4 Advice Lower Bounds for the Dense Model Theorem
  4.1 Introduction
    4.1.1 Related Work
  4.2 Intuition
  4.3 Formal Proof
    4.3.1 Binomial Distribution Tail
    4.3.2 Combinatorial Designs
    4.3.3 Notational Preliminaries
    4.3.4 Proof of Theorem 4.6
      4.3.4.1 Distributions Without Dense Models
      4.3.4.2 Distributions That Cannot Be Covered
      4.3.4.3 Setting the Parameters
      4.3.4.4 The Majority of Majorities
      4.3.4.5 Putting It All Together

III The Problem's Input is Random

5 Extractors and Lower Bounds for Locally Samplable Sources
  5.1 Introduction
    5.1.1 Results
    5.1.2 Techniques
    5.1.3 Concurrent Work
    5.1.4 Previous Work on the Power of Locally Computable Functions
  5.2 Preliminaries
  5.3 1-Local Sources
  5.4 d-Local Sources
    5.4.1 Superindependent Matchings
    5.4.2 Proof of Theorem 5.13
  5.5 Increasing the Output Length
    5.5.1 The General Method
    5.5.2 Applying Theorem 5.19
  5.6 Improved Lower Bounds for Sampling Input-Output Pairs
  5.7 Improved Extractors for Low-Weight Affine Sources
    5.7.1 A Linear Strong Seeded Extractor with Seed Length log n + O(log k)
    5.7.2 Proof of Theorem 5.10

6 The Complexity of Estimating Min-Entropy
  6.1 Introduction
    6.1.1 Definitions
    6.1.2 Results
    6.1.3 Related Work
  6.2 Proof of Theorem 6.3 and Theorem 6.4
  6.3 Proof of Theorem 6.5
  6.4 SBP is Closed Under Nondeterminism

IV The Algorithm's Input is Random

7 Relativized Worlds Without Worst-Case to Average-Case Reductions for NP
  7.1 Introduction
    7.1.1 Notions of Reductions and Relationship to Previous Work
    7.1.2 Results
    7.1.3 Independent Work
    7.1.4 Organization
  7.2 Preliminaries
    7.2.1 Complexity Classes
    7.2.2 Reductions
    7.2.3 Relativization
    7.2.4 Uniform Complexity Classes
  7.3 Intuition
    7.3.1 Intuition for Theorem 7.1
      7.3.1.1 Intuition for Strengthening HeurBPP to AvgZPP
      7.3.1.2 Intuition for Strengthening U to PSamp
      7.3.1.3 Intuition for Strengthening NP to BPP^NP_||
    7.3.2 Intuition for Theorem 7.2 and Theorem 7.3
  7.4 Generic Setup for the Formal Proofs
  7.5 Proof of Theorem 7.1
    7.5.1 Main Construction
    7.5.2 Proof of Lemma 7.10
  7.6 Proof of Theorem 7.2
    7.6.1 Main Construction
  7.7 Proof of Theorem 7.3
    7.7.1 Main Construction

8 Query Complexity in Errorless Hardness Amplification
  8.1 Introduction
    8.1.1 The Errorless XOR Lemma
    8.1.2 Monotone Errorless Amplification
    8.1.3 Black-Box Lower Bounds
    8.1.4 Preliminaries
  8.2 Proof of Theorem 8.2
  8.3 Proof of Theorem 8.5
  8.4 Proof of Theorem 8.10

Bibliography

