We show that "a little more time gives a lot more power to sampling algorithms." We prove that for every constant $k\ge 2$, every polynomial time bound $t$, and every polynomially small $\epsilon$, there exists a family of distributions on $k$ elements that can be sampled exactly in polynomial time but cannot be sampled within statistical distance $1-1/k-\epsilon$ in time $t$. This implies the following general time hierarchy for sampling distributions on arbitrary-size domains such as $\{0,1\}^n$: For every polynomial time bound $t$ and every constant $\epsilon>0$, there exists a family of distributions that can be sampled exactly in polynomial time but cannot be sampled within statistical distance $1-\epsilon$ in time $t$.
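For reference, a minimal rendering of the distance measure assumed throughout; the notation $\Delta$, $D_1$, $D_2$, and $\Omega$ is illustrative rather than taken from the paper, and "statistical distance" is read as the standard total variation distance between distributions on a finite domain $\Omega$:
\[
  \Delta(D_1,D_2) \;=\; \max_{S \subseteq \Omega}\,\bigl|D_1(S)-D_2(S)\bigr| \;=\; \tfrac{1}{2}\sum_{x \in \Omega}\bigl|D_1(x)-D_2(x)\bigr|.
\]
In particular, any distribution on $k$ elements places probability at least $1/k$ on some element, so a sampler that always outputs that element is already within statistical distance $1-1/k$ of the target; the $1-1/k-\epsilon$ bound above therefore matches this trivial guarantee up to $\epsilon$.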
Our proof involves reducing the problem to a communication problem over a certain type of noisy channel. To solve the latter problem we use a type of list-decodable code for a setting where there is no bound on the number of errors but each error gives more information than an erasure. This type of code can be constructed using certain known traditional list-decodable codes, but we give a new construction that is elementary, self-contained, and tailored to this setting.
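For context, a sketch of the standard (Hamming-error) notion of list-decodability satisfied by the known constructions alluded to above; the symbols $C$, $\Sigma$, $n$, $\rho$, and $L$ are generic placeholders rather than the paper's notation, and the variant used here differs in that the number of errors is unbounded while each error, unlike an erasure, still conveys partial information about the transmitted symbol:
\[
  C \subseteq \Sigma^n \text{ is } (\rho,L)\text{-list-decodable if } \bigl|\{\, c \in C : \mathrm{dist}_H(c,y) \le \rho n \,\}\bigr| \le L \quad \text{for every } y \in \Sigma^n,
\]
where $\mathrm{dist}_H$ denotes Hamming distance, so any received word determines the transmitted codeword up to a list of at most $L$ candidates.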