Suppose we want to minimize a polynomial $p(x) = p(x_1, \dots, x_n)$, subject to some polynomial constraints $q_1(x), \dots, q_m(x) \geq 0$, using the Sum-of-Squares (SOS) SDP hierarchy. Assume we are in the "explicitly bounded" ("Archimedean") case where the constraints include $x_i^2 \leq 1$ for all $1 \leq i \leq n$. It is often stated that the degree-$d$ version of the SOS hierarchy can be solved, to high accuracy, in time $n^{O(d)}$. Indeed, I myself have stated this in several previous works.
The point of this note is to state (or remind the reader) that this is not obviously true. The difficulty comes not from the "$r$" in the Ellipsoid Algorithm, but from the "$R$"; a priori, we only know an exponential upper bound on the number of bits needed to write down the SOS solution. An explicit example is given of a degree-$2$ SOS program illustrating the difficulty.
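To convey the flavor of the bit-complexity issue, here is a hedged sketch (an illustration in the spirit of repeated-squaring constructions, not necessarily the paper's exact degree-$2$ example): low-degree constraints can force feasible points to have doubly-exponentially small coordinates, whose binary representations are exponentially long.

```latex
x_1 = \tfrac{1}{2}, \qquad x_{i+1} = x_i^2 \;\; (1 \le i < n)
\quad\Longrightarrow\quad x_n = 2^{-2^{\,n-1}},
```

Each equality is a pair of degree-$2$ polynomial inequalities, and all variables satisfy $x_i^2 \leq 1$, yet writing down $x_n$ exactly requires $2^{n-1}$ bits. This is the sense in which only an exponential upper bound on solution bit-length is known a priori, so the "$R$" (outer-ball radius, in bit-size terms) fed to the Ellipsoid Algorithm cannot obviously be taken polynomial.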
Slightly improved an argument in Section 4, based on a suggestion of David Steurer.