Several works have shown unconditional hardness (via integrality gaps) of computing equilibria using strong hierarchies of convex relaxations. Such results, however, apply only to the problem of computing equilibria that optimize a certain objective function, not to the (arguably more fundamental) task of finding \emph{any} equilibrium.
We present an algorithmic model based on the sum-of-squares (SoS) hierarchy that allows escaping this inherent limitation of integrality gaps. In this model, algorithms access the input game only through a relaxed solution to the natural SoS relaxation for computing equilibria. They can then adaptively construct a list of candidate solutions and invoke a verification oracle to check if any candidate on the list is a solution. This model captures most well-studied approximation algorithms such as those for Max-Cut, Sparsest Cut, and Unique-Games.
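For illustration only, the following is a minimal Python sketch of the kind of algorithm this model captures, under assumed placeholder interfaces (\texttt{pseudo\_expectation}, \texttt{generate\_candidate}, \texttt{verify}, \texttt{list\_size}); these names are hypothetical and the model is defined formally in the body of the paper.

\begin{verbatim}
# A minimal, illustrative sketch of the oracle-algorithm model described above.
# All names here are hypothetical placeholders, not interfaces from the paper.

from typing import Callable, List, Optional, Tuple

Candidate = Tuple[List[float], List[float]]  # a pair of mixed strategies (x, y)


def oracle_algorithm(
    pseudo_expectation: Callable[[str], float],  # access to the relaxed SoS solution
    generate_candidate: Callable[
        [Callable[[str], float], List[Candidate]], Candidate
    ],
    verify: Callable[[Candidate], bool],  # verification oracle: is this a solution?
    list_size: int,
) -> Optional[Candidate]:
    """Adaptively build a list of candidates from the relaxed SoS solution and
    return the first one accepted by the verification oracle (if any)."""
    candidates: List[Candidate] = []
    for _ in range(list_size):
        # The next candidate may depend on the relaxed solution and on all
        # previously generated candidates (adaptivity).
        cand = generate_candidate(pseudo_expectation, candidates)
        candidates.append(cand)
        if verify(cand):
            return cand
    return None  # no candidate on the list was a solution
\end{verbatim}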
The state-of-the-art algorithms for computing exact and approximate equilibria in two-player, $n$-strategy games are captured in this model; they require that either i) the size ($\approx$ running time) of the SoS relaxation be at least $2^{\Omega(n)}$, or ii) the size of the list of candidates be at least $n^{\Omega(\log n)}$. Our main result is a lower bound that matches these upper bounds up to constant factors in the exponent.
This can be interpreted as an unconditional confirmation, in our restricted algorithmic framework, of Rubinstein's recent \emph{conditional} hardness result for computing approximate equilibria.
Our proof strategy involves constructing a family of games that all share a common sum-of-squares solution, yet every (approximate) equilibrium of any one game is far from every (approximate) equilibrium of every other game in the family. Along the way, we strengthen the unconditional lower bound against enumerative algorithms for finding approximate equilibria due to Daskalakis and Papadimitriou, and the classical hardness result for finding equilibria that maximize welfare due to Gilboa and Zemel.