Support Vector Machine: Mr. A. Suresh Kumar

Support Vector Machine (SVM) is a supervised machine learning algorithm used for classification and regression, focusing on finding the optimal hyperplane that separates different classes. Key concepts include hyperplanes, support vectors, and margins, with techniques like soft margin allowing for some misclassification to achieve better performance. SVM can be linear or non-linear, utilizing kernel functions to transform data into higher dimensions for improved separation.

Support Vector Machine

Mr. A. Suresh Kumar
Department of Electronics Engineering (VLSIDT)

[Link] COLLEGE OF TECHNOLOGY
(Autonomous)
SUPPORT VECTOR MACHINE (SVM)

• Support Vector Machine (SVM) is a supervised machine learning
algorithm used for classification and regression tasks.
• It tries to find the best boundary, known as a hyperplane, that
separates the different classes in the data.
• It is useful for binary classification tasks such as spam vs.
not spam or cat vs. dog (see the sketch below).
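
A minimal sketch of the idea in code (assuming scikit-learn is available; the toy data, parameter values, and variable names are illustrative choices, not taken from the slides): fit a linear SVM on a two-class dataset and read back the learned hyperplane.

# Minimal linear SVM sketch (assumes scikit-learn; illustrative only).
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Toy, (nearly) linearly separable two-class data.
X, y = make_blobs(n_samples=100, centers=2, random_state=42)

# Fit a linear-kernel SVM; C controls how soft the margin is.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# The learned hyperplane is w . x + b = 0.
print("w =", clf.coef_[0], "b =", clf.intercept_[0])
print("number of support vectors:", clf.support_vectors_.shape[0])
print("prediction for a new point:", clf.predict([[0.0, 0.0]]))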

SUPPORT VECTORS

MAXIMUM MARGIN CLASSIFIER

Key Concepts of Support Vector Machine

• Hyperplane: A decision boundary separating the different classes.
• Support Vectors: The closest data points to the hyperplane,
crucial for determining the hyperplane and margin in SVM.
• Margin: The distance between the hyperplane and the support
vectors. SVM aims to maximize this margin for better classification
performance (see the formulation below).
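
In symbols (a standard formulation stated here for reference, not copied from the slides), the hyperplane is the set of points x with

w \cdot x + b = 0,

and the hard-margin SVM maximizes the margin by solving

\min_{w,\,b} \; \tfrac{1}{2}\lVert w \rVert^2 \quad \text{subject to} \quad y_i \, (w \cdot x_i + b) \ge 1 \;\; \text{for all } i,

where y_i \in \{-1, +1\}. The constraints hold with equality exactly at the support vectors, and the resulting margin width is 2 / \lVert w \rVert.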

Drawback of Maximum margin classifier

Key Terms

• Soft margin is a technique used in Support Vector Machine (SVM)
classification that allows for some misclassification of data points in
order to achieve a wider margin and a more flexible decision boundary.
• In traditional SVM classification with a hard margin, the goal is to find a
hyperplane that completely separates the two classes of data with no
misclassification.
• However, in real-world datasets, this is often not possible due to noise,
outliers, or other factors that make the data non-separable.
• The objective is to find the optimal hyperplane that maximizes the
margin while also minimizing the amount of misclassification.
• Cross-validation is used to determine the best soft margin (see the
sketch below).
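
A small sketch of how the soft margin is tuned in practice (assuming scikit-learn; C is that library's knob for margin softness, and the candidate values are arbitrary illustrative choices): cross-validation picks how much misclassification to tolerate.

# Soft-margin sketch: C trades margin width against misclassification
# (small C = softer margin, large C = harder margin). Assumes scikit-learn.
from sklearn.datasets import make_blobs
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Noisy, overlapping two-class data, so a perfect hard margin is impossible.
X, y = make_blobs(n_samples=200, centers=2, cluster_std=3.0, random_state=0)

# Cross-validation over candidate C values selects the best soft margin.
search = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10, 100]}, cv=5)
search.fit(X, y)

print("best C:", search.best_params_["C"])
print("cross-validated accuracy:", round(search.best_score_, 3))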

Support Vector Classifier

Types of SVM

Linear SVM
• This is the simplest type.
• Used when your data is linearly separable, meaning you can draw
a straight line (or flat plane) that separates the classes without
mistakes.
Non-Linear SVM
• Used for real-world data that cannot be split by a simple straight
line (see the comparison sketch below).
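
A short comparison sketch (assuming scikit-learn; the concentric-circles dataset is one convenient non-separable example, not one mentioned in the slides) shows why a non-linear SVM is needed:

# Linear vs non-linear SVM on data that no straight line can split.
# Assumes scikit-learn; illustrative only.
from sklearn.datasets import make_circles
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Concentric circles: the two classes cannot be separated by a line.
X, y = make_circles(n_samples=300, noise=0.1, factor=0.4, random_state=0)

linear_acc = cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()
rbf_acc = cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean()

print("linear SVM accuracy:", round(linear_acc, 3))   # typically near chance
print("RBF-kernel SVM accuracy:", round(rbf_acc, 3))  # typically close to 1.0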

Non Linear SVM

Kernel Function

• Kernel function: In SVM, a kernel function is used to transform the input
data into a higher-dimensional space where it can be more easily
separated. Popular kernel functions include linear, polynomial, and radial
basis function (RBF) kernels.
• Kernel trick is a technique used in machine learning, specifically in
Support Vector Machine (SVM) algorithms, to transform a low-dimensional
input space into a higher-dimensional feature space without actually
computing the coordinates of the data in the higher-dimensional space.
• The kernel trick is based on the observation that the computation of the
dot product between two vectors in a high-dimensional space can be
done implicitly by defining a kernel function that measures the similarity
between two data points in the original input space (see the sketch below).
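
A small sketch of that idea (assuming scikit-learn; gamma and the dataset are illustrative choices): the same classifier is fitted once with the built-in RBF kernel and once from a kernel (similarity) matrix that we precompute ourselves, which is exactly what the kernel function supplies.

# Fitting an SVM with the RBF kernel directly, and again from a
# precomputed kernel (similarity) matrix. Assumes scikit-learn.
from sklearn.datasets import make_circles
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, noise=0.1, factor=0.4, random_state=0)
gamma = 0.5

# 1) Let the library evaluate the kernel internally.
clf_builtin = SVC(kernel="rbf", gamma=gamma).fit(X, y)

# 2) Precompute K[i, j] = exp(-gamma * ||x_i - x_j||^2) ourselves.
K = rbf_kernel(X, X, gamma=gamma)
clf_precomputed = SVC(kernel="precomputed").fit(K, y)

# The two models should make the same predictions on the training set.
agreement = (clf_builtin.predict(X) == clf_precomputed.predict(K)).mean()
print("fraction of matching predictions:", agreement)   # expected: 1.0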

Kernel Function

• By using the kernel function to compute the dot product, the data
points are effectively mapped to a higher-dimensional space without
explicitly computing their coordinates.
• The kernel trick is a powerful technique that enables SVM to handle
complex, non-linear classification problems while avoiding the
computational complexity and memory requirements of explicitly
computing the coordinates in a high-dimensional feature space (a small
numerical check follows).
Thank You !

[Link] COLLEGE OF TECHNOLOGY


(Autonomous)
Tiruchengode – 637 215, Namakkal Dt., Tamil Nadu, India

[Link] /ksrct1994
