We give two time- and space-efficient simulations of quantum computations with
intermediate measurements, one by classical randomized computations with
unbounded error and the other by quantum computations that use an arbitrary
fixed universal set of gates. Specifically, our simulations show that every
language solvable by a bounded-error quantum algorithm running in time $t$ and
space $s$ is also solvable by an unbounded-error randomized algorithm running
in time $O(t\cdot\log{t})$ and space $O(s+\log{t})$, as well as by a
bounded-error quantum algorithm restricted to use an arbitrary universal set
and running in time $O(t\cdot\polylog{t})$ and space $O(s+\log{t})$, provided
the universal set is closed under adjoint. We also develop a quantum model
that is particularly suitable for the study of general computations with
simultaneous time and space bounds.
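In complexity-class shorthand (the class names $\mathsf{BQTISP}$ and $\mathsf{PrTISP}$ below are our own notation for bounded-error quantum and unbounded-error randomized algorithms with simultaneous time and space bounds, and are not defined in the text), the two simulations can be summarized as
\[
\mathsf{BQTISP}(t,s) \subseteq \mathsf{PrTISP}\bigl(t\cdot\log{t},\, s+\log{t}\bigr)
\quad\text{and}\quad
\mathsf{BQTISP}(t,s) \subseteq \mathsf{BQTISP}_{\mathcal{G}}\bigl(t\cdot\polylog{t},\, s+\log{t}\bigr),
\]
where $\mathcal{G}$ denotes an arbitrary fixed universal gate set closed under adjoint.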
As an application of our randomized simulation, we obtain the first nontrivial
lower bound for general quantum algorithms solving problems related to
satisfiability. Our bound applies to MajSAT and MajMajSAT, which are
the problems of determining the truth value of a given Boolean formula whose
variables are fully quantified by one or two majority quantifiers,
respectively. We prove that for every real $d$ and every positive real
$\delta$ there exists a real $c>1$ such that either:
(i) MajMajSAT does not have a bounded-error quantum algorithm running in
time $O(n^c)$, or
(ii) MajSAT does not have a bounded-error quantum algorithm running in
time $O(n^d)$ and space $O(n^{1-\delta})$.
In particular, MajMajSAT does not have a bounded-error quantum algorithm
running in time $O(n^{1+o(1)})$ and space $O(n^{1-\delta})$ for any
$\delta>0$. Our lower bounds hold for any reasonable uniform model of quantum
computation, in particular for the model we develop.
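In the same shorthand notation as above, the unconditional consequence reads: for every $\delta>0$,
\[
\text{MajMajSAT} \notin \mathsf{BQTISP}\bigl(n^{1+o(1)},\, n^{1-\delta}\bigr).
\]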