Under the auspices of the Computational Complexity Foundation (CCF)


### Paper:

TR19-054 | 9th April 2019 20:18

#### Bridging between 0/1 and Linear Programming via Random Walks

Authors: Joshua Brakensiek, Venkatesan Guruswami
Under the Strong Exponential Time Hypothesis, an integer linear program with $n$ Boolean-valued variables and $m$ equations cannot be solved in $c^n$ time for any constant $c < 2$. If the domain of the variables is relaxed to $[0,1]$, the associated linear program can of course be solved in polynomial time. In this work, we give a natural algorithmic bridging between these extremes of $0$-$1$ and linear programming. Specifically, for any subset (finite union of intervals) $E \subset [0,1]$ containing $\{0,1\}$, we give a random-walk based algorithm with runtime $O_E((2-\text{measure}(E))^n \text{poly}(n,m))$ that finds a solution in $E^n$ to any $n$-variable linear program with $m$ constraints that is feasible over $\{0,1\}^n$. Note that as $E$ expands from $\{0,1\}$ to $[0,1]$, the runtime improves smoothly from $2^n$ to polynomial.
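To make the interpolation concrete, here is a minimal sketch (not from the paper) that computes the base $2-\text{measure}(E)$ of the exponential runtime for a subset $E \subset [0,1]$ given as a union of disjoint intervals; the helper name `runtime_base` is illustrative only:

```python
def runtime_base(intervals):
    """Base of the exponential runtime, 2 - measure(E), for a subset E of
    [0,1] given as a list of disjoint (lo, hi) intervals.

    Illustrative sketch only: it just evaluates the runtime bound from the
    statement of the result, not the algorithm itself.
    """
    measure = sum(hi - lo for lo, hi in intervals)
    return 2 - measure

# E shrunk to (essentially) {0,1}: measure 0, base 2 -- the 0/1 regime.
# E = [0, 1/k) u (1 - 1/k, 1]: measure 2/k, base 2 - 2/k -- the k-SAT regime.
# E = [0, 1]: measure 1, base 1 -- the polynomial-time LP regime.
for k in (3, 4, 5):
    E = [(0.0, 1.0 / k), (1.0 - 1.0 / k, 1.0)]
    print(k, runtime_base(E))
```

As the intervals around $0$ and $1$ widen, the base decreases smoothly from $2$ toward $1$, matching the claimed interpolation between $2^n$ and polynomial time.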
Taking $E = [0,1/k) \cup (1-1/k,1]$ in our result yields as a corollary a randomized $(2-2/k)^{n}\text{poly}(n)$ time algorithm for $k$-SAT, matching the best known runtime for larger $k$. While our approach has some high-level resemblance to Schöning's beautiful algorithm, our general algorithm is based on a more sophisticated random walk that incorporates several new ingredients, such as a multiplicative potential to measure progress, a judicious choice of starting distribution, and a time-varying distribution for the evolution of the random walk that is itself computed via an LP at each step (a solution to which is guaranteed based on the minimax theorem). Plugging the LP algorithm into our earlier polymorphic framework yields fast exponential algorithms for any CSP (like $k$-SAT, $1$-in-$3$-SAT, NAE $k$-SAT) that admits so-called threshold partial polymorphisms.
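For readers unfamiliar with the baseline being generalized, a minimal sketch of Schöning-style local search for $k$-SAT follows. This is only the classical walk the abstract compares against, not the paper's algorithm (which replaces the uniform start and uniform flips with a tailored starting distribution and LP-computed, time-varying flip distributions); the function names are ours:

```python
import random

def schoening_walk(clauses, n, max_flips=None):
    """One restart of a Schoening-style random walk for k-SAT.

    clauses: list of tuples of nonzero ints; literal v means variable v is
    true, -v means it is false (1-indexed variables).
    Returns a satisfying assignment (list of bools) or None.
    """
    if max_flips is None:
        max_flips = 3 * n  # O(n) flips per restart, as in Schoening's analysis
    # Uniform random start (the paper instead uses a judicious starting
    # distribution over E^n).
    assign = [random.random() < 0.5 for _ in range(n)]
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any(assign[abs(l) - 1] == (l > 0) for l in c)]
        if not unsat:
            return assign
        # Flip a uniformly random literal of a random unsatisfied clause
        # (the paper's walk uses a time-varying distribution computed by an LP).
        lit = random.choice(random.choice(unsat))
        assign[abs(lit) - 1] = not assign[abs(lit) - 1]
    return None

def solve(clauses, n, restarts=200):
    """Independent restarts of the walk; illustrative driver only."""
    for _ in range(restarts):
        a = schoening_walk(clauses, n)
        if a is not None:
            return a
    return None
```

With exponentially many restarts this achieves the $(2-2/k)^n$-type bound for $k$-SAT that the corollary above recovers as the special case $E = [0,1/k) \cup (1-1/k,1]$.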