Lower bound methods and separation results for on-line learning models
We consider the complexity of concept learning in various common models for
on-line learning, focusing on methods for proving lower bounds to the
learning complexity of a concept class. Among others, we consider the model
for learning with equivalence and membership queries. For this model we give
lower bounds on the number of queries that are needed to learn a concept
class, both in terms of the Vapnik-Chervonenkis dimension of the class and
in terms of the complexity of learning the class with arbitrary equivalence
queries.
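The Vapnik-Chervonenkis dimension used in these bounds can be computed by brute force for small finite classes: it is the size of the largest set that the class shatters, i.e. on which every labeling is realized. A minimal Python sketch (the interval class below is an illustrative example of my own, not one from the paper):

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    `concepts` is a collection of frozensets over `domain`. A set S is
    shattered if every subset of S arises as S & c for some concept c.
    Since subsets of shattered sets are shattered, the first shattered
    set found while scanning sizes in decreasing order gives the answer.
    """
    for d in range(len(domain), 0, -1):
        for S in combinations(domain, d):
            S = frozenset(S)
            traces = {S & c for c in concepts}
            if len(traces) == 2 ** d:
                return d
    return 0

# Example: intervals over {1,...,5} (plus the empty concept) shatter
# pairs but no triple, since a trace omitting a middle point is
# impossible for an interval; hence the VC dimension is 2.
domain = list(range(1, 6))
intervals = [frozenset(range(a, b + 1))
             for a in domain for b in domain if a <= b] + [frozenset()]
print(vc_dimension(domain, intervals))
```

This exhaustive check runs in time exponential in the domain size, so it is useful only as a definitional illustration, not as an efficient procedure.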
Furthermore, we survey other known lower bound methods and we exhibit all
known relationships between learning complexities in the models considered
and some relevant combinatorial parameters. As it turns out, the picture is
almost complete. This paper has been written so that it can be read without
previous knowledge of Computational Learning Theory.
Keywords: formal models for learning, learning algorithms, lower bound
arguments, VC-dimension, machine learning
Reference: W. Maass and G. Turan.
Lower bound methods and separation results for on-line learning models.
Machine Learning, 9:107-145, 1992.
Invited paper for a special issue of Machine Learning.