Support Vector Machines (SVM) present a robust framework for empirical data modeling, uniquely suited for both classification and regression tasks. By utilizing the Structural Risk Minimization principle, SVMs surpass traditional neural networks' performance by effectively addressing issues of generalization and overfitting. This paper elucidates the foundational concepts of statistical learning theory, detailing the formulation and operational mechanics of SVMs, as well as illustrating their application with theoretical examples.
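The soft-margin classification idea summarized above can be sketched numerically. The following is a minimal illustration, not any paper's implementation: a linear SVM trained by full-batch subgradient descent on the regularized hinge loss, on illustrative toy data (the clusters, learning rate, and regularization constant are arbitrary choices).

```python
import numpy as np

# Minimal linear soft-margin SVM: subgradient descent on the regularized
# hinge loss  (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i*(w.x_i + b)).
# Toy data and hyperparameters are illustrative only.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)),   # class -1 cluster
               rng.normal(+2, 0.5, (20, 2))])  # class +1 cluster
y = np.array([-1.0] * 20 + [1.0] * 20)

w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for epoch in range(200):
    margins = y * (X @ w + b)
    active = margins < 1                       # points violating the margin
    # subgradient: only margin violators contribute to the hinge term
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
    grad_b = -y[active].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

pred = np.sign(X @ w + b)
print((pred == y).mean())   # well-separated toy clusters -> accuracy 1.0
```

On this clearly separable toy set, the final decision function classifies every training point correctly while the weight penalty keeps the margin wide.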
1999
In this report we show some consequences of the work done by Pontil et al. in [1]. In particular, we show that, under the same hypotheses as the theorem proved in their paper, the optimal approximating hyperplane f_R found by SVM regression classifies the data. This means that y_i f_R(x_i) > 0 for points which lie outside the margin between the two classes, as well as for points which lie inside the margin but are correctly classified by SVM classification. Moreover, y_i f_R(x_i) < 0 for incorrectly classified points. Finally, the zero-level curve of the optimal approximating hyperplane determined by SVM regression and the optimal separating hyperplane determined by SVM classification coincide.
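The claim that the regression solution sign-classifies the data can be checked numerically. The sketch below is an illustration of the statement, not the report's proof: it uses scikit-learn's SVR as a stand-in SVM-regression implementation, fits it to ±1 labels on an arbitrary separable toy set, and checks y_i f_R(x_i) > 0 for every point.

```python
import numpy as np
from sklearn.svm import SVR

# Numerical illustration (not a proof): fit SVM regression to binary +/-1
# labels on linearly separable data and check that the sign of the
# regression function f_R classifies every point, i.e. y_i * f_R(x_i) > 0.
# scikit-learn's SVR is a stand-in; data, C and epsilon are arbitrary.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.4, (15, 2)),   # class -1 cluster
               rng.normal(+2, 0.4, (15, 2))])  # class +1 cluster
y = np.array([-1.0] * 15 + [1.0] * 15)

f_R = SVR(kernel="linear", C=10.0, epsilon=0.5).fit(X, y)
scores = f_R.predict(X)             # f_R(x_i)
print(np.all(y * scores > 0))       # every point correctly sign-classified
```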
1999
In this report we show that the ε-tube size in Support Vector Machine (SVM) regression is 2ε/√(1 + ||w||²). Using this result, we show that, in the case where all the data points are inside the ε-tube, minimizing ||w||² in SVM regression is equivalent to maximizing the distance between the approximating hyperplane and the farthest points in the training set. Moreover, in the most general setting, in which data points may also lie outside the ε-tube, we show that, for a fixed value of ε, minimizing ||w||² is equivalent to maximizing the sparsity of the representation of the optimal approximating hyperplane, that is, to minimizing the number of coefficients different from zero in the expression of the optimal w. The solution found by SVM regression is therefore a trade-off between sparsity of the representation and closeness to the data. We also include a complete derivation of SVM regression in the case of linear approximation.
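The tube-width formula has a direct geometric reading: in the joint (x, y) space R^(d+1), the tube boundaries y = w·x + b ± ε are parallel hyperplanes with common normal (w, −1), so their perpendicular distance is 2ε/√(1 + ||w||²). A quick numerical check, with arbitrary illustrative values of w, b, and ε:

```python
import numpy as np

# Check the epsilon-tube width formula 2*eps / sqrt(1 + ||w||^2):
# the tube boundaries y = w.x + b +/- eps, seen as hyperplanes
# w.x - y + (b +/- eps) = 0 in R^(d+1), share the normal n = (w, -1).
# w, b, eps are arbitrary illustrative values.
w, b, eps = np.array([3.0, -1.5]), 0.7, 0.25
n = np.append(w, -1.0)                        # common normal in R^(d+1)

x0 = np.array([1.0, 2.0])                     # arbitrary point in x-space
p_upper = np.append(x0, w @ x0 + b + eps)     # lies on the upper boundary
# perpendicular distance from p_upper to the lower plane w.x - y + b - eps = 0
dist = abs(w @ x0 - p_upper[-1] + b - eps) / np.linalg.norm(n)

print(np.isclose(dist, 2 * eps / np.sqrt(1 + w @ w)))  # True
```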
Kernel methods and support vector machines have become the most popular learning-from-examples paradigms. Several application areas make use of SVM approaches, for instance handwritten character recognition, text categorization, face detection, pharmaceutical data analysis, and drug design. Adapted SVMs have also been proposed for time-series forecasting and, in computational neuroscience, as a tool for detecting symmetry when eye movement is connected with attention and visual perception. The aim of the paper is to investigate the potential of SVMs in solving classification and regression tasks, and to analyze the computational complexity of the different methodologies proposed for solving the various sub-problems that arise.
Neurocomputing, 2003
Support vector machines (SVMs) are rarely benchmarked against other classification or regression methods. We compare a popular SVM implementation (libsvm) to 16 classification methods and 9 regression methods, all accessible through the software R, by means of standard performance measures (classification error and mean squared error), which are also analyzed by means of bias-variance decompositions. SVMs showed mostly good performance on both classification and regression tasks, but other methods proved to be very competitive.
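The bias-variance decomposition this benchmark relies on can be sketched in a few lines. The target function, noise level, and estimator below (a deliberately underfitting degree-1 polynomial) are illustrative choices, not those of the benchmark: at a fixed test point, the mean squared error over resampled training sets splits exactly into squared bias plus variance.

```python
import numpy as np

# Sketch of a bias-variance decomposition of mean squared error:
# for a fixed test point, MSE over resampled training sets equals
# bias^2 + variance (plus irreducible noise if the test label is noisy).
# Target function, noise level, and estimator are illustrative only.
rng = np.random.default_rng(2)
f = lambda x: np.sin(x)                       # true regression function
x_test, n_train, sigma = 1.0, 30, 0.3

preds = []
for _ in range(500):                          # resample training sets
    x = rng.uniform(0, 2 * np.pi, n_train)
    y = f(x) + rng.normal(0, sigma, n_train)
    coef = np.polyfit(x, y, 1)                # underfitting linear model
    preds.append(np.polyval(coef, x_test))
preds = np.array(preds)

bias2 = (preds.mean() - f(x_test)) ** 2       # systematic error, squared
var = preds.var()                             # spread across training sets
mse = ((preds - f(x_test)) ** 2).mean()
print(np.isclose(mse, bias2 + var))           # decomposition holds exactly
```

For a high-capacity method like an SVM with a flexible kernel, the bias term typically shrinks while the variance term grows; the benchmark's decompositions make this trade-off visible per method.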
2013
Support vector-based learning methods are an important part of computational intelligence techniques. Recent efforts have dealt with the problem of learning from very large datasets. This paper reviews the most commonly used formulations of support vector machines for regression (SVRs), aiming to emphasize their usability in large-scale applications. We review the general concept of support vector machines (SVMs), address the state of the art in SVM training methods, and explain the fundamental principle of SVRs. The most common learning methods for SVRs are introduced, and linear programming-based SVR formulations are explained, emphasizing their suitability for large-scale learning. Finally, the paper discusses some open problems and current trends.
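One way to see why linear-programming SVR formulations matter for large-scale learning is to write one out. The sketch below is a generic primal L1-regularized linear SVR, not a specific formulation from the paper: minimize ||w||₁ + C·Σξᵢ subject to |yᵢ − w·xᵢ − b| ≤ ε + ξᵢ, made into an LP by splitting w = u − v and b = p − q with all parts nonnegative. Data, C, and ε are illustrative, and scipy's `linprog` is an assumed off-the-shelf solver.

```python
import numpy as np
from scipy.optimize import linprog

# Sketch of a primal linear-programming SVR (L1-regularized):
#   min  ||w||_1 + C * sum(xi)
#   s.t. |y_i - w.x_i - b| <= eps + xi_i,  xi_i >= 0
# Split w = u - v, b = p - q (u, v, p, q >= 0) to obtain a standard LP.
# Data, C, and eps are illustrative; not a formulation from the paper.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (25, 2))
y = X @ np.array([2.0, -1.0]) + 0.5 + rng.normal(0, 0.05, 25)

n, d = X.shape
C, eps = 10.0, 0.1
# variable order: u (d), v (d), p, q, xi (n)
c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])

I = np.eye(n)
ones = np.ones((n, 1))
#  y - Xw - b <= eps + xi   ->  -X u + X v - p + q - xi <= eps - y
#  Xw + b - y <= eps + xi   ->   X u - X v + p - q - xi <= eps + y
A_ub = np.block([[-X,  X, -ones,  ones, -I],
                 [ X, -X,  ones, -ones, -I]])
b_ub = np.concatenate([eps - y, eps + y])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * (2 * d + 2 + n))
u, v = res.x[:d], res.x[d:2 * d]
w, b = u - v, res.x[2 * d] - res.x[2 * d + 1]
print(res.status, np.round(w, 2), round(b, 2))
```

Because both the objective and the constraints are linear, off-the-shelf LP solvers (and column-generation or decomposition schemes) scale to problem sizes where the quadratic program of classical SVR becomes costly, and the L1 penalty tends to produce sparse solutions.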
Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2014
This paper provides an overview of the support vector machine (SVM) methodology and its applicability to real-world engineering problems. Specifically, the aim is to review the current state of the SVM technique and to show some of its latest successful results in real problems across different engineering fields. The paper starts by reviewing the main basic concepts of SVMs and kernel methods. Kernel theory, SVMs, support vector regression (SVR), SVMs in signal processing, and the hybridization of SVMs with meta-heuristics are fully described in the first part of this paper. The adoption of SVMs in engineering is nowadays a fact. As we illustrate in this paper, SVMs can handle high-dimensional, heterogeneous, and scarcely labeled datasets very efficiently, and they can also be successfully tailored to particular applications. The second part of this review is devoted to presenting different case studies in real engineering problems where the application of the SVM methodology has obtained excellent results. First, we discuss the application of SVR algorithms in two renewable energy problems: wind speed prediction from measurements at neighboring stations, and wind speed reconstruction using synoptic-pressure data. The application of SVMs to noninvasive cardiac index estimation is described next, and real results on this topic are presented. The application of SVMs to problems of functional magnetic resonance imaging (fMRI) data processing is further discussed in the paper: brain decoding and mental disorder characterization. The following application deals with antenna array processing: SVMs for spatial nonlinear beamforming, and the SVM application in a problem of arrival angle detection. Finally, the application of SVMs to remote sensing image classification and target detection problems closes this review.
Wiley Encyclopedia of Operations Research and Management Science, 2010