We prove a time-space tradeoff lower bound of $T =
\Omega\left(n\log(\frac{n}{S}) \log \log(\frac{n}{S})\right)$ for
randomized oblivious branching programs to compute $1GAP$, also
known as the pointer jumping problem, a problem for which there is a
simple deterministic time $n$ and space $O(\log n)$ RAM (random
access machine) algorithm.
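As an illustration (not taken from the paper), here is a minimal sketch of the linear-time, logarithmic-space pointer-following algorithm for $1GAP$; the 0-indexed array convention and the function name are assumptions made for this example.

```python
def one_gap(p):
    """Decide 1GAP (pointer jumping): is node n-1 reachable from node 0?

    p is a list of length n; p[i] is the unique out-neighbor of node i
    (a 0-indexed convention assumed for this sketch).  The loop runs at
    most n steps, and the only state carried between steps is the
    current node, i.e. O(log n) bits of space.
    """
    n = len(p)
    v = 0
    for _ in range(n):          # any repeat-free path has length < n
        if v == n - 1:
            return True
        v = p[v]
    return v == n - 1
```

On a RAM this is the simple time-$n$, space-$O(\log n)$ upper bound that the abstract contrasts with the lower bound for randomized oblivious branching programs.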
We give a similar time-space tradeoff of $T =
\Omega\left(n\log(\frac{n}{S}) \log \log(\frac{n}{S})\right)$ for
Boolean randomized oblivious branching programs computing
$\GIPshuffle$, a variation of the generalized inner product problem that
can be computed in time $n$ and space $O(\log^2 n)$ by a
deterministic Boolean branching program.
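For context, the generalized inner product $GIP$ on a $k \times m$ Boolean matrix is the parity of the number of all-ones columns. A one-pass sketch (the function name and input layout are assumptions of this example, and the permutation component that distinguishes the paper's variant is not reproduced here):

```python
def gip(rows):
    """Generalized inner product: XOR over columns of the AND of each column.

    rows is a k x m 0/1 matrix given as a list of k equal-length rows.
    A single left-to-right pass keeps only the running parity, so a
    streaming evaluation needs O(1) state beyond a column index.
    """
    parity = 0
    for col in zip(*rows):           # iterate over the m columns
        parity ^= int(all(col))      # AND of the column, folded into the parity
    return parity
```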
These are also the first lower bounds for randomized oblivious
branching programs computing explicit functions that apply for
$T=\omega(n\log n)$.
They also show that any simulation of
general branching programs by randomized oblivious ones requires either
a superlogarithmic increase in time
or a very substantial increase in space.
In a STOC 2010 paper, Ajtai (and independently, Damg{\aa}rd,
Meldgaard, and Nielsen) derived simulations of general RAMs by
randomized oblivious RAMs with only a polylogarithmic factor
increase in time and space. Our lower bound implies that a
superlogarithmic factor increase is indeed necessary in any such
simulation.
In contrast to the claim in the original version of our paper, the
model considered by Ajtai is incomparable to the one that we
consider here. (The same applies to Damg{\aa}rd, Meldgaard, and
Nielsen's simulation, to the extent that we understand it.)
Russell Impagliazzo brought to our attention that the randomized
oblivious branching programs we are considering are less general
than the oblivious randomized algorithms that Ajtai's simulation
uses. The main difference is that, in our model, for each fixing of
the random choices the sequence of locations accessed must be
independent of the input (though that sequence may depend on the
random choices), whereas in Ajtai's simulation it is only necessary
that the {\em probability distribution} of the sequence of locations
accessed be input-independent. Upon further investigation, we found
that Ajtai's simulation assumes that the original RAM algorithm has
only sequential access to its input, in which case the small-space
upper bound for $1GAP_n$ does not apply (alternatively, if the
original RAM algorithm has random access, it also uses at least
linear space, which a deterministic oblivious branching program can
use to compute any function in linear time). Therefore, the upper
bound derived by Ajtai no longer applies in our model.
In our revised version of the paper, we derive the same
quantitative separation between general and randomized oblivious
algorithms in the case of Boolean inputs, a result that does not
follow from the separation for $1GAP_n$. We obtain this by
considering a variation of the $GIP$ problem, $GIP$-$MAP$, which
includes a pseudorandom permutation on the input bits of $GIP$ as
part of its input. To represent our results more accurately and to
avoid any confusion about their applicability, we have changed the
title of our paper slightly.