We consider the problem of compression for ``easy'' Boolean functions: given the truth table of an $n$-variate Boolean function $f$ computable by some \emph{unknown small circuit} from a \emph{known class} of circuits, find in deterministic time $\poly(2^n)$ a circuit $C$ (no restriction on the type of $C$) computing $f$ so that the size of $C$ is less than the trivial circuit size $2^n/n$.
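In symbols (a restatement of the task just described, with $\mathcal{C}$ denoting the known class and $s(n)$ the promised size bound, notation introduced here only for concreteness):
\begin{quote}
given the $2^n$-bit truth table of $f\colon\{0,1\}^n\to\{0,1\}$ computable by a $\mathcal{C}$-circuit of size at most $s(n)$, output in deterministic time $\poly(2^n)$ a circuit $C$ (of arbitrary type) with $C(x)=f(x)$ for all $x\in\{0,1\}^n$ and $\mathrm{size}(C)<2^n/n$.
\end{quote}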
We obtain both positive and negative results. On the positive side, we show that several circuit classes for which lower bounds are proved by the method of random restrictions:
\begin{itemize}
\item $\AC^0$,
\item (de Morgan) formulas, and
\item (read-once) branching programs,
\end{itemize}
allow non-trivial compression of functions computable by circuits of size up to that for which lower bounds are known. On the negative side, we show that compressing functions from any class $\mathcal{C}\subseteq\P/\poly$ implies superpolynomial lower bounds against $\mathcal{C}$ for a function in $\NEXP$; we also observe that compressing monotone functions of polynomial circuit complexity, or functions computable by large $\AC^0$ circuits, would imply new superpolynomial circuit lower bounds.
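Schematically (this display merely abbreviates the first negative statement above, reading the superpolynomial lower bound for a function in $\NEXP$ as non-containment):
\[
\text{compression for } \mathcal{C}\subseteq\P/\poly \;\Longrightarrow\; \NEXP\not\subseteq\mathcal{C}.
\]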
Finally, we apply the ideas used for compression to obtain zero-error randomized \#SAT algorithms for de Morgan formulas, complete-basis formulas, and branching programs on $n$ variables of about quadratic size, running in expected time $2^n/2^{n^{\epsilon}}$, for some $\epsilon>0$ (depending on the size of the formula or branching program).
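Equivalently, the expected running time can be written as
\[
2^{n}/2^{n^{\epsilon}} \;=\; 2^{\,n-n^{\epsilon}},
\]
i.e., the algorithms save a factor of $2^{n^{\epsilon}}$ over exhaustive search through all $2^n$ assignments.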