One of the most exciting recent developments in machine learning is the discovery
and elaboration of kernel methods for classification and regression. These algorithms
combine three important ideas into a very successful whole. From mathematical
programming, they exploit quadratic programming algorithms for convex
optimization; from mathematical analysis, they borrow the idea of kernel representations;
and from machine learning theory, they adopt the objective of finding
the maximum-margin classifier. After the initial development of support vector
machines, there has been an explosion of kernel-based methods. Ralf Herbrich’s
Learning Kernel Classifiers is an authoritative treatment of support vector machines
and related kernel classification and regression methods. The book examines
these methods both from an algorithmic perspective and from the point of view of
learning theory. The book’s extensive appendices provide pseudo-code for all of the
algorithms and proofs for all of the theoretical results. The outcome is a volume
that will be a valuable classroom textbook as well as a reference for researchers in
this exciting area.
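As one concrete illustration of the three ingredients named above (this sketch is not drawn from the book itself), scikit-learn's SVC trains a support vector machine by solving a convex quadratic program over a kernel representation to find the maximum-margin classifier. The dataset, kernel choice, and parameter values below are illustrative assumptions only.

```python
# Minimal illustrative sketch (not from the book): an SVM combines
# (1) convex quadratic programming, (2) a kernel representation, and
# (3) the maximum-margin objective. scikit-learn's SVC implements all three.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# A small nonlinearly separable toy dataset (arbitrary choice for demonstration).
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# kernel="rbf" supplies the kernel representation; fitting solves a convex
# QP whose solution is the maximum-margin separator in the induced feature space.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

print(f"training accuracy: {clf.score(X, y):.3f}")
print(f"number of support vectors: {clf.n_support_.sum()}")
```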
The goal of building systems that can adapt to their environment and learn from
their experience has attracted researchers from many fields, including computer
science, engineering, mathematics, physics, neuroscience, and cognitive science.
Out of this research has come a wide variety of learning techniques that have the
potential to transform many scientific and industrial fields. Recently, several research
communities have begun to converge on a common set of issues surrounding
supervised, unsupervised, and reinforcement learning problems. The MIT Press
series on Adaptive Computation and Machine Learning seeks to unify the many diverse
strands of machine learning research and to foster high-quality research and
innovative applications.
Thomas Dietterich