Valiant (2007) proposed a computational model of evolution and suggested that evolvability be studied in the framework of computational learning theory. Feldman (2008) showed that Valiant's evolution model is equivalent to the correlational statistical query (CSQ) learning model, a restricted setting of the statistical query (SQ) model. Evolvability in Valiant's model has been shown to be quite restricted: Valiant observed that the class of parities is not evolvable, even under the uniform distribution. Subsequently, Feldman (2008, 2011) showed that, in the distribution-independent setting, linear separators and decision lists are not even weakly evolvable, and that the class of conjunctions is not evolvable. All of these results rest on information-theoretic arguments.
In this paper, we show that computational bottlenecks for evolvability in Valiant's model exist even when information-theoretic bottlenecks are absent. In particular, assuming that polynomial-size circuits are not PAC learnable, we construct a concept class that is not efficiently evolvable; in contrast, when unbounded computation is allowed, this concept class can be evolved in an information-theoretically efficient manner.