A geometric approach to support vector regression

2003, Neurocomputing

Abstract

We develop an intuitive geometric framework for support vector regression (SVR). By examining when ε-tubes exist, we show that SVR can be regarded as a classification problem in the dual space. Hard and soft ε-tubes are constructed by separating the convex or reduced convex hulls, respectively, of the training data with the response variable shifted up and down by ε. A novel SVR model is proposed based on choosing the max-margin plane between the two shifted data sets. Maximizing the margin corresponds to shrinking the effective ε-tube. In the proposed approach, the effects of the choices of all parameters become clear geometrically. The kernelized model corresponds to separating the convex or reduced convex hulls in feature space. Generalization bounds for classification can be extended to characterize the generalization performance of the proposed approach. We propose a simple iterative nearest-point algorithm that can be directly applied to the reduced convex hull case in order to construct soft ε-tubes. Computational comparisons with other SVR formulations are also included.
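As a minimal illustration of the ε-shift construction described above (the synthetic data, the line y = 2x + 1, and the separation check are our own toy sketch, not the paper's algorithm or experiments): shifting the responses up and down by ε turns the regression data into two labeled point sets, and a hard ε-tube exists exactly when some plane separates them.

```python
import numpy as np

# Toy sketch (our assumption): data generated near the line y = 2x + 1.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.uniform(-0.1, 0.1, 20)

eps = 0.2  # tube half-width, chosen larger than the noise level

# Shift responses up and down by eps -> two labeled sets in (x, y) space.
upper = np.column_stack([x, y + eps])  # class +1
lower = np.column_stack([x, y - eps])  # class -1

# A hard eps-tube exists iff some plane separates the two sets.  The true
# regression line corresponds to the plane f(x, y) = y - (2x + 1) = 0.
def f(pts):
    return pts[:, 1] - (2.0 * pts[:, 0] + 1.0)

separated = bool(np.all(f(upper) > 0) and np.all(f(lower) < 0))
print("hard eps-tube exists for this plane:", separated)  # prints: True
```

Because the noise magnitude (at most 0.1) is smaller than ε = 0.2, the two shifted sets are strictly separated by the true line; the paper's max-margin model would instead search over all separating planes for the one with the largest margin.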