Electronic Colloquium on Computational Complexity (ECCC)

Paper:

TR23-171 | 15th November 2023 04:22

Beating Brute Force for Compression Problems


Abstract:

A compression problem is defined with respect to an efficient encoding function $f$; given a string $x$, our task is to find the shortest $y$ such that $f(y) = x$. The obvious brute-force algorithm for solving this compression task on $n$-bit strings runs in time $O(2^{\ell} \cdot t(n))$, where $\ell$ is the length of the shortest description $y$ and $t(n)$ is the time complexity of $f$ when it prints $n$-bit output.
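To make the baseline concrete, here is a minimal Python sketch of the brute-force search, with $f$ treated as a black-box decompressor on bit strings; the function and parameter names are illustrative, not from the paper.

from itertools import product

def brute_force_compress(f, x, max_len):
    # Try descriptions y in order of increasing length, so the first hit
    # is a shortest f-description of x.  Total work is O(2^ell * t(n)),
    # where ell is the length of that shortest description and t(n) is
    # the cost of one evaluation of f.
    for length in range(max_len + 1):
        for bits in product('01', repeat=length):
            y = ''.join(bits)
            if f(y) == x:
                return y
    return None  # x has no description of length <= max_len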

We prove that every compression problem has a Boolean circuit family that finds short descriptions more efficiently than brute force. In particular, our circuits have size $2^{4 \ell / 5} \cdot poly(t(n))$, which is significantly more efficient for all $\ell \gg \log(t(n))$. Our construction builds on the Fiat-Naor data structure for function inversion [SICOMP 1999]: we show how to carefully modify their data structure so that it can be nontrivially implemented with Boolean circuits, and how to utilize hashing so that the circuit size is only exponential in the description length.
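The Fiat-Naor data structure itself is beyond an abstract-length sketch, but the chain-based time/space tradeoff it builds on (Hellman's tables) can be illustrated briefly. The sketch below is that simpler classical idea for a length-preserving $f$ on $k$-bit inputs, not the paper's circuit construction; all names and parameter choices are illustrative.

import random

def build_chain_table(f, k, num_chains, chain_len, seed=0):
    # Hellman-style precomputation for inverting f on the domain [0, 2^k):
    # walk num_chains random chains of length chain_len under f and store
    # only their (endpoint -> start) pairs; intermediate points are
    # recomputed during inversion, trading time for space.
    rng = random.Random(seed)
    table = {}
    for _ in range(num_chains):
        start = rng.randrange(2 ** k)
        point = start
        for _ in range(chain_len):
            point = f(point)
        table.setdefault(point, []).append(start)
    return table

def invert_with_table(f, table, chain_len, target):
    # Walk forward from target; whenever we hit a stored endpoint, re-walk
    # that chain from its start to look for a preimage of target.  May
    # return None when target is not covered by any chain.
    point = target
    for _ in range(chain_len):
        for start in table.get(point, []):
            x = start
            for _ in range(chain_len):
                if f(x) == target:
                    return x  # verified preimage, so false alarms are discarded
                x = f(x)
        point = f(point)
    return None

Fiat and Naor made this kind of tradeoff rigorous for arbitrary functions; the paper's contribution, per the paragraph above, is a modification that can be realized by Boolean circuits and combined with hashing so that the size depends only exponentially on the description length $\ell$.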

As a consequence, the Minimum Circuit Size Problem for generic fan-in-two circuits of size $s(n)$ on truth tables of size $2^n$ can be solved by circuits of size $2^{\frac{4}{5} \cdot w + o(w)} \cdot poly(2^n)$, where $w = s(n) \log_2(s(n) + n)$. This improves over the brute-force approach of trying all possible size-$s(n)$ circuits, for all $s(n) \geq n$. Similarly, the task of computing a short description of a string $x$ whose $K^t$-complexity is at most $\ell$ can be solved by circuits of size $2^{\frac{4}{5} \ell} \cdot poly(t)$. We also give nontrivial circuits for computing $Kt$ complexity on average, and for solving NP relations with "compressible" instance-witness pairs.
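For a rough sense of scale, the snippet below compares the exponents $w$ and $\frac{4}{5} w$ from the statement above for one illustrative choice of $n$ and $s(n)$; the concrete numbers are made up for illustration, and the $o(w)$ and $poly(2^n)$ factors are ignored.

import math

def w(s, n):
    # The description-length parameter from the statement above:
    # w = s(n) * log2(s(n) + n).
    return s * math.log2(s + n)

n, s = 16, 1000  # illustrative values only
exponent = w(s, n)
print(f"w (brute-force exponent) ~ {exponent:.0f}")
print(f"(4/5) * w (this work)    ~ {0.8 * exponent:.0f}")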



ISSN 1433-8092