We prove the first time-space tradeoff lower bounds for randomized
computation of decision problems. The bounds hold even when the
computation is allowed arbitrary probability of error on a small
fraction of the inputs. Our techniques are an
extension of those used by Ajtai in his time-space tradeoffs for
deterministic RAM algorithms computing element distinctness and for
Boolean branching programs computing a natural quadratic form.
Ajtai's bounds were of the following form: if the machine uses at most
$kn$ time for some constant $k$, then it requires space at least
$\epsilon_k n$ for some constant $\epsilon_k > 0$. In this paper we
provide an explicit relationship between $\epsilon_k$ and $k$ that
achieves larger lower bounds than those proven by Ajtai.
In particular, we obtain time-space tradeoff lower
bounds of the form $T=\Omega\bigl(n\sqrt{\log(n/S)/\log\log(n/S)}\bigr)$, which
implies that if the space is $n^{1-\epsilon}$ for a constant $\epsilon > 0$, then
the time used is $\Omega\bigl(n\sqrt{\log n/\log\log n}\bigr)$.
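As a brief sketch of this implication (assuming $\epsilon > 0$ is a fixed
constant; this calculation is not part of the formal statement), substituting
$S = n^{1-\epsilon}$ into the general bound gives
\begin{align*}
  \log(n/S) &= \log n^{\epsilon} = \epsilon \log n, &
  \log\log(n/S) &= \log\log n + \log\epsilon = \Theta(\log\log n),
\end{align*}
so $T = \Omega\bigl(n\sqrt{\epsilon\log n/\Theta(\log\log n)}\bigr)
     = \Omega\bigl(n\sqrt{\log n/\log\log n}\bigr)$,
with the constant $\epsilon$ absorbed into the $\Omega$.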