We introduce a new technique that enables a learner without access to
hidden information to learn nearly as well as a learner with access to
hidden information. We apply our technique to solve an open problem
of Maass and Turán, showing that for any concept class F, the least
number of queries sufficient for learning F by an algorithm that has
access only to arbitrary equivalence queries is at most
1/ld(4/3) ≈ 2.41 times the least number of queries sufficient for
learning F by an algorithm that has access to both arbitrary
equivalence queries and membership queries (here ld denotes the
base-2 logarithm). Previously known results imply that the factor
1/ld(4/3) in our bound is best possible.
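In symbols, writing LC_EQ(F) and LC_EQ+MQ(F) as shorthand for the two
optimal worst-case query complexities (notation we introduce here for
this sketch, not necessarily that of the paper), the bound reads

\[
  \mathrm{LC_{EQ}}(F) \;\le\; \frac{1}{\mathrm{ld}(4/3)}\,\mathrm{LC_{EQ+MQ}}(F),
  \qquad
  \frac{1}{\mathrm{ld}(4/3)} \;=\; \frac{1}{2 - \log_2 3} \;\approx\; 2.41 .
\]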
We describe analogous results for two generalizations of this model
to function learning, and apply these results to bound the difficulty
of learning in the harder of the two models in terms of the difficulty
of learning in the easier model. We bound the difficulty of learning
unions of k concepts from a class F in terms of the difficulty of
learning F. For deterministic algorithms, we bound the difficulty of
learning in a noisy environment in terms of the difficulty of learning
in a noise-free environment. We apply a variant of our technique to
develop an algorithm transformation that allows probabilistic learning
algorithms to cope nearly optimally with noise. A second variant
enables us to improve a general lower bound of Turán for the
PAC-learning model with queries. Finally, we show that logarithmically
many membership queries never help to obtain computationally efficient
learning algorithms.