TR22-145

Authors: Alexander Golovnev, Siyao Guo, Spencer Peters, Noah Stephens-Davidowitz

Publication: 9th November 2022 19:55

We study the black-box function inversion problem, which is the problem of finding $x \in [N]$ such that $f(x) = y$, given as input some challenge point $y$ in the image of a function $f : [N] \to [N]$, using $T$ oracle queries to $f$ and preprocessed advice $\sigma \in \{0,1\}^S$ depending on $f$. We prove a number of new results about this problem, as follows.
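To fix the model, it may help to see the two trivial extremes of the advice/query tradeoff. The following is our own illustrative sketch (not an algorithm from the paper): a full inverse table uses roughly $N \log N$ bits of advice and no online queries, while brute force uses no advice and up to $N$ queries.

```python
import random

N = 1 << 10

def preprocess_table(f):
    """Trivial advice: a full inverse table (~N log N bits), keeping the
    first preimage seen for each image point."""
    table = {}
    for x in range(N):
        table.setdefault(f(x), x)
    return table

def invert_with_table(table, y):
    return table[y]  # no oracle queries at all in the online phase

def invert_brute_force(f, y):
    """No advice (S = 0); up to T = N oracle queries."""
    for x in range(N):
        if f(x) == y:
            return x
    return None

# A random function f : [N] -> [N].
rng = random.Random(0)
vals = [rng.randrange(N) for _ in range(N)]
f = lambda x: vals[x]

y = f(123)
table = preprocess_table(f)
assert f(invert_with_table(table, y)) == y
assert f(invert_brute_force(f, y)) == y
```

The interesting question, addressed by the results below, is what happens strictly between these two extremes.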

1. We show an algorithm that works for any $T$ and $S$ satisfying $T S^2 \cdot \max\{S,T\} = \widetilde{\Theta}(N^3)$. In the important setting when $S < T$, this improves on the celebrated algorithm of Fiat and Naor [STOC, 1991], which requires $T S^3 \geq N^3$. E.g., Fiat and Naor's algorithm is only non-trivial for $S \gg N^{2/3}$, while our algorithm gives a non-trivial tradeoff for any $S \gg N^{1/2}$. (Our algorithm and analysis are quite simple. As a consequence, we also give a self-contained and simple proof of Fiat and Naor's original result, with certain optimizations left out for simplicity.)
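The two tradeoffs can be compared numerically. The short script below (our own illustration, ignoring polylogarithmic factors) evaluates the query count each bound implies for a few advice sizes $S = N^c$: in the regime $S < T$, the new tradeoff $T S^2 \cdot \max\{S,T\} = N^3$ reads $T^2 S^2 = N^3$, i.e. $T = N^{3/2}/S$, while Fiat-Naor's $T S^3 = N^3$ gives $T = N^3/S^3$.

```python
# Compare the query counts implied by the two tradeoffs, ignoring
# polylogarithmic factors. A tradeoff is non-trivial only when T < N.
N = 2 ** 30
for exp in (0.5, 0.6, 2 / 3, 0.7):
    S = N ** exp
    t_fiat_naor = N ** 3 / S ** 3  # trivial (T >= N) until S >> N^{2/3}
    t_new = N ** 1.5 / S           # non-trivial as soon as S >> N^{1/2}
    print(f"S = N^{exp:.2f}: Fiat-Naor T/N = {t_fiat_naor / N:.3g}, "
          f"new T/N = {t_new / N:.3g}")
```

At $S = N^{0.6}$, for instance, the new bound gives $T = N^{0.9} < N$ while Fiat-Naor's bound still requires $T = N^{1.2} > N$, matching the thresholds stated above.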

2. We show a non-adaptive algorithm (i.e., an algorithm whose $i$th query $x_i$ is chosen based entirely on $\sigma$ and $y$, and not on $f(x_1),\ldots, f(x_{i-1})$) that works for any $T$ and $S$ satisfying $S = \Theta(N \log(N/T))$, giving the first non-trivial non-adaptive algorithm for this problem. E.g., setting $T = N/ \mathrm{poly\,log}(N)$ gives $S = \Theta(N \log \log N)$. This answers a question due to Corrigan-Gibbs and Kogan [TCC, 2019], who asked whether it was possible for a non-adaptive algorithm to work with parameters $T$ and $S$ satisfying $T + S/\log N = o(N)$. We also observe that our non-adaptive algorithm is what we call a guess-and-check algorithm; that is, it is non-adaptive and its final output is always one of the oracle queries $x_1,\ldots, x_T$. For guess-and-check algorithms, we prove a matching lower bound, thereby completely characterizing the achievable parameters $(S,T)$ for this natural class of algorithms. (Corrigan-Gibbs and Kogan showed that any such lower bound for arbitrary non-adaptive algorithms would imply new circuit lower bounds.)
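To illustrate the structure of a guess-and-check algorithm (this is a toy of our own, not the paper's construction, and its advice is far larger than the bounds above), the sketch below stores a fixed short candidate list per image point; the online phase queries exactly those candidates, independently of the answers, and outputs one of them.

```python
import random

N = 1 << 8
rng = random.Random(1)
vals = [rng.randrange(N) for _ in range(N)]
f = lambda x: vals[x]

def preprocess(f, T):
    """Toy advice: for every image point y, store up to T preimages.
    (Illustrative only; this uses far more than S bits of advice.)"""
    candidates = {}
    for x in range(N):
        candidates.setdefault(f(x), []).append(x)
    return {y: xs[:T] for y, xs in candidates.items()}

def invert(advice, f, y):
    queries = advice.get(y, [])  # fixed in advance: non-adaptive
    for x in queries:            # check each guess with one query
        if f(x) == y:
            return x             # output is always one of the queries
    return None

y = f(42)
x = invert(preprocess(f, T=4), f, y)
assert x is not None and f(x) == y
```

The two defining properties are visible in `invert`: the query list depends only on the advice and $y$, and the returned value is always one of the queried points.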

3. We show equivalence between function inversion and a natural decision version of the problem in both the worst case and the average case, and similarly for functions $f : [N] \to [M]$ with different ranges.
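One standard flavor of search-to-decision reduction (our own illustration; the paper's decision version and reduction may differ) takes a decision oracle answering "does $y$ have a preimage in $[\mathit{lo}, \mathit{hi})$?" and recovers a preimage by binary search with $O(\log N)$ decision queries.

```python
import random

N = 1 << 10
rng = random.Random(2)
vals = [rng.randrange(N) for _ in range(N)]
f = lambda x: vals[x]

def has_preimage(y, lo, hi):
    """Stand-in decision oracle: does y have a preimage in [lo, hi)?"""
    return any(f(x) == y for x in range(lo, hi))

def invert_via_decision(y):
    """Binary search: O(log N) decision queries recover a preimage."""
    lo, hi = 0, N
    if not has_preimage(y, lo, hi):
        return None
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if has_preimage(y, lo, mid):
            hi = mid  # a preimage lies in the left half
        else:
            lo = mid  # otherwise one must lie in the right half
    return lo

y = f(7)
assert f(invert_via_decision(y)) == y
```

Here the decision oracle is implemented by brute force purely for demonstration; the point is only that search reduces to logarithmically many decision queries.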

All of the above results are most naturally described in a model with shared randomness (i.e., random coins shared between the preprocessing algorithm and the online algorithm). However, as an additional contribution, we show (using a technique from communication complexity due to Newman [IPL, 1991]) how to generically convert any algorithm that uses shared randomness into one that does not.
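The shape of a Newman-style argument can be sketched as follows (a toy of our own under simplified assumptions, not the paper's construction): both phases share only a short public list of seeds fixed in advance; preprocessing finds a seed under which the randomized online procedure succeeds for this particular $f$, and stores just its $O(\log K)$-bit index as extra advice in place of the shared coins.

```python
import random

# Public parameters: a small function domain, a query budget, and a
# short list of K public seeds known to both phases.
N, T, K = 32, 500, 64
master = random.Random(123)
public_seeds = [master.randrange(2 ** 31) for _ in range(K)]
vals = [master.randrange(N) for _ in range(N)]
f = lambda x: vals[x]

def online(seed, y):
    """Toy randomized inverter: query T points chosen by the seed."""
    r = random.Random(seed)
    for _ in range(T):
        x = r.randrange(N)
        if f(x) == y:
            return x
    return None

def preprocess():
    """Find a seed that inverts a few sample challenges for this f;
    its O(log K)-bit index replaces the shared randomness."""
    tests = [f(x) for x in range(8)]
    for i, seed in enumerate(public_seeds):
        if all(online(seed, y) is not None for y in tests):
            return i
    return 0  # (negligibly likely fallback in this toy setting)

idx = preprocess()
y = f(3)
x = online(public_seeds[idx], y)
```

The point of the sketch is purely the mechanics: replacing unbounded shared coins by an index into a small fixed set of seeds, chosen during preprocessing.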