TR98-013 | 3rd March 1998 00:00

#### A New Composition Theorem for Learning Algorithms

**Abstract:**

We present a new approach to the composition of learning algorithms (in various models) for classes of constant VC-dimension into learning algorithms for more complicated classes. We prove that if a class $\mathcal{C}$ is learnable in time $t$ from a hypothesis class $\mathcal{H}$ of constant VC-dimension, then the class $\mathcal{C}^\star$ of all functions $F$ of the form $F=f(g_1,\ldots,g_m)$, where $f$ is any function and $g_1,\ldots,g_m\in \mathcal{C}$, is learnable in time polynomial in $t$ and $m$.
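As an illustration of the composed class $\mathcal{C}^\star$ (a sketch not taken from the paper; all names are hypothetical), take the base concepts $g_i$ to be halfspaces in the plane, a class of constant VC-dimension, and let $f$ be an arbitrary Boolean combining function:

```python
# Sketch of a composed concept F = f(g_1, ..., g_m): each g_i is a halfspace
# indicator in R^2 (a class of constant VC-dimension) and f is an arbitrary
# Boolean function of the g_i outputs. Illustrative only, not from the paper.

def halfspace(w, b):
    """Return the indicator g(x) = 1 iff <w, x> >= b."""
    return lambda x: int(sum(wi * xi for wi, xi in zip(w, x)) >= b)

def compose(f, gs):
    """Build F(x) = f(g_1(x), ..., g_m(x))."""
    return lambda x: f(tuple(g(x) for g in gs))

# Example: a triangle as the intersection (f = AND) of m = 3 halfspaces.
g1 = halfspace((1, 0), 0.0)     # x >= 0
g2 = halfspace((0, 1), 0.0)     # y >= 0
g3 = halfspace((-1, -1), -1.0)  # x + y <= 1
F = compose(lambda bits: int(all(bits)), [g1, g2, g3])

print(F((0.2, 0.2)))  # inside the triangle -> 1
print(F((0.9, 0.9)))  # outside -> 0
```

Here each $g_i$ comes from the simple base class, while $F$ can cut out a far more complicated region; the composition theorem says learnability of the base class transfers to such $F$ at only polynomial cost in $t$ and $m$.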

We also use a simple argument to prove that the composition theorem cannot be extended to classes of nonconstant VC-dimension.

A composition theorem for the exact learning model (with equivalence queries only) is proven in [BBK97] *only* for classes $\mathcal{C}$ of constant VC-dimension that have *constant space* learning algorithms. Constant space algorithms are hard to find and have large complexity. Our algorithm is simple and has lower complexity than the algorithm in [BBK97].

We then show how to convert a PAC-learning algorithm for $\mathcal{C}$ from $\mathcal{H}$ into an SQ-learning algorithm and into a PAC-learning algorithm for $\mathcal{C}^\star$ with malicious noise that achieves the optimal error rate $\eta/(1-\eta)+\beta$ for any $\beta$.

This, in particular, shows that if a class of constant VC-dimension is PAC-learnable from a class of constant VC-dimension, then it is SQ-learnable and PAC-learnable with malicious noise.
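To make the malicious-noise bound concrete (a purely illustrative computation, not from the paper), the achieved error $\eta/(1-\eta)+\beta$ can be evaluated for a sample noise rate:

```python
# Evaluate the optimal malicious-noise error bound eta/(1 - eta) + beta.
# Illustrative arithmetic only; the function name is hypothetical.
def error_bound(eta, beta):
    """Error achieved under malicious noise rate eta, with slack beta."""
    return eta / (1.0 - eta) + beta

# E.g., a 10% malicious noise rate with slack beta = 0.01:
print(round(error_bound(0.10, 0.01), 4))  # -> 0.1211
```

Note that the bound degrades gracefully: for small $\eta$ it is close to $\eta+\beta$, and it is known that no algorithm can do better than $\eta/(1-\eta)$ under malicious noise.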

We apply this result to SQ-learning, and to PAC-learning with malicious noise, of a general class of geometric objects. This class includes all geometric objects in constant-dimensional space that are bounded by $m$ algebraic surfaces of constant degree (for example, hyperplanes, spheres, etc.). This result generalizes all results known from the literature on learning geometric objects in the SQ-learning model and in the PAC-learning model with malicious noise.
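As a concrete instance of this geometric class (an illustrative sketch with hypothetical names, not code from the paper), membership in a region bounded by algebraic surfaces of constant degree is a composition of sign conditions on low-degree polynomials:

```python
# Illustrative sketch: a geometric object bounded by algebraic surfaces of
# constant degree, written as F = f(g_1, ..., g_m) where each g_i is the
# sign of a low-degree polynomial. Names are hypothetical.

def sign_condition(p):
    """g(x) = 1 iff the polynomial p is non-negative at x."""
    return lambda x: int(p(x) >= 0)

# A unit disk: one degree-2 surface, 1 - (x^2 + y^2) >= 0.
disk = sign_condition(lambda x: 1.0 - (x[0] ** 2 + x[1] ** 2))

# A halfplane: one degree-1 surface (a hyperplane), x >= 0.
halfplane = sign_condition(lambda x: x[0])

# A half-disk as the composition F = AND(disk, halfplane): an object
# bounded by m = 2 algebraic surfaces of constant degree.
half_disk = lambda x: int(disk(x) == 1 and halfplane(x) == 1)

print(half_disk((0.5, 0.0)))   # inside -> 1
print(half_disk((-0.5, 0.0)))  # wrong side of the hyperplane -> 0
```

Each sign condition comes from a class of constant VC-dimension, so by the composition theorem the whole object class inherits SQ-learnability and PAC-learnability with malicious noise.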