
Support Vector Machine

Support Vector Machine is a Supervised Learning algorithm.

  1. What is Support Vector Machine
  2. Types of SVM
  3. SVM Terminologies
  4. Applications of SVM
  5. Pros & Cons
  6. Conclusion

What is Support Vector Machine (SVM)

                SVM is a Supervised Learning algorithm used for both classification and regression problems. The main objective of SVM is to find the optimal hyperplane in N-dimensional space (where N is the number of features) that clearly separates the data points into their classes.
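Before going further, here is a minimal sketch of training a linear SVM classifier with scikit-learn (the library this blog refers to in the Kernel section); the blob dataset is made up purely for illustration.

    # Minimal sketch: fit a linear SVM on a made-up, well-separated dataset.
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Two clusters of points, one per class (illustrative data only)
    X, y = make_blobs(n_samples=100, centers=2, random_state=42)

    clf = SVC(kernel="linear")   # look for the optimal separating hyperplane
    clf.fit(X, y)

    print(clf.predict([[0.0, 0.0]]))   # predicted class for a new point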

The graph below represents the optimal hyperplane in SVM.

Types of SVM

Broadly, there are two types of SVM: Linear SVM, where the classes can be separated by a straight hyperplane, and Non-Linear SVM, where a kernel function is used to separate classes that are not linearly separable.

Figure i: SVM Types and Optimal Hyperplanes

SVM Terminologies

Hyperplanes

Figure ii: Optimal Hyperplane

Here there are two classes, represented by different shapes; let's understand the terminology on the basis of this example.

  1. Hyperplane: The hyperplane is the decision boundary that helps classify the data points into their respective classes. Considering figure ii, the hyperplane segregates the two classes, circles and squares (a short code sketch of inspecting this boundary follows below).
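As a sketch of what this boundary looks like numerically (again on made-up blob data), a fitted linear SVC exposes the hyperplane's normal vector w and bias b, so the decision boundary is the set of points x where w . x + b = 0.

    # Sketch: inspect the separating hyperplane w . x + b = 0 of a linear SVM.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=100, centers=2, random_state=42)
    clf = SVC(kernel="linear").fit(X, y)

    w = clf.coef_[0]        # normal vector of the hyperplane
    b = clf.intercept_[0]   # bias term
    print("Hyperplane: %.2f*x1 + %.2f*x2 + %.2f = 0" % (w[0], w[1], b))

    # The sign of w . x + b tells us which side of the hyperplane a point lies on
    point = np.array([0.0, 0.0])
    print("Side of the hyperplane:", np.sign(w @ point + b))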

The graph below represents the positive and negative hyperplanes.

Figure iii: Positive, Negative Hyperplanes

Now consider figure iii (for terms 2 and 3 below).

  2. Positive Hyperplane: The hyperplane passing through the data point(s) closest to the optimal hyperplane on the positive (upper/right) side.
  3. Negative Hyperplane: The hyperplane passing through the data point(s) closest to the optimal hyperplane on the negative (lower/left) side.
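A small sketch of these two hyperplanes, assuming a nearly separable toy dataset: for the support vectors, the value of w . x + b is close to +1 (positive hyperplane) or -1 (negative hyperplane).

    # Sketch: support vectors lie on the positive (+1) or negative (-1) hyperplane.
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=100, centers=2, random_state=42)
    clf = SVC(kernel="linear", C=1000).fit(X, y)   # large C ~ hard margin

    # decision_function returns w . x + b; for support vectors it is roughly +/-1
    print(clf.decision_function(clf.support_vectors_))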

Margin

  1. Margin: The distance between the positive hyperplane and the negative hyperplane is called the margin.
  2. Maximum Margin: The largest achievable distance between the positive and negative hyperplanes is called the maximum margin.

(For a visual representation, see figures iv and v.)

The graphs below represent a small margin and a large margin.

Figure iv: Small Margin
Figure v: Large Margin

Note: From the two graphs above,

  1. SVM's objective is to achieve the largest possible margin (the maximum margin).
  2. The larger the margin, the better the model tends to generalize and the more stable it is (a small computation of the margin width follows below).
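As a quick numerical check (toy data assumed), the margin width of a linear SVM equals 2 / ||w||, which is why maximizing the margin is the same as minimizing ||w||.

    # Sketch: compute the margin width 2 / ||w|| from a fitted linear SVM.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=100, centers=2, random_state=42)
    clf = SVC(kernel="linear").fit(X, y)

    w = clf.coef_[0]
    print("Margin width:", 2.0 / np.linalg.norm(w))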

Support Vectors

Support vectors are the data points that lie closest to the optimal hyperplane; they determine the position and orientation of the hyperplane, as the graphs below show with two different orientations.

Figure vi: Support Vectors

Figure vii: Hyperplane orientation 1
Figure viii: Hyperplane orientation 2
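In scikit-learn, the support vectors of a fitted SVC can be read off directly; the sketch below again uses made-up blob data.

    # Sketch: inspect the support vectors of a fitted SVC.
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=100, centers=2, random_state=42)
    clf = SVC(kernel="linear").fit(X, y)

    print("Support vectors:\n", clf.support_vectors_)   # points closest to the hyperplane
    print("Their indices in X:", clf.support_)
    print("Count per class:", clf.n_support_)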

Hyper-parameter

                Hyper-parameters are parameters whose values control the learning process of a model.

Example:

Regularization (C)

                Regularization (C) is a hyper-parameter in SVM that controls how much misclassification of training points is tolerated.

Figure ix: High Regularization

Note: A higher value of C penalizes misclassification more heavily and yields a narrower margin, while a lower value of C tolerates more misclassification and yields a wider margin (compare figures ix, x, and xi).

Figure x: Medium Regularization
Figure xi: Low Regularization
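A small sketch of this effect, using deliberately overlapping toy clusters: a large C tries hard not to misclassify (narrower margin, fewer support vectors), while a small C tolerates errors (wider margin, more support vectors).

    # Sketch: compare a strict (large C) and a tolerant (small C) linear SVM.
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Overlapping clusters, so some misclassification is unavoidable
    X, y = make_blobs(n_samples=200, centers=2, cluster_std=2.5, random_state=0)

    for C in (100.0, 0.01):
        clf = SVC(kernel="linear", C=C).fit(X, y)
        print("C=%-6s support vectors=%3d train accuracy=%.2f"
              % (C, len(clf.support_vectors_), clf.score(X, y)))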

Gamma-Γ

                Gamma is a hyper-parameter (used with non-linear kernels such as RBF) that defines the curvature of the decision boundary: a high gamma bends the boundary tightly around the training points, while a low gamma keeps it smoother.

Figure xii: High Gamma

Figure xiii: Low Gamma
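A sketch of the effect of gamma with the RBF kernel, using the make_moons toy dataset purely for illustration: a very high gamma overfits (high train accuracy, lower test accuracy), while a very low gamma underfits.

    # Sketch: high gamma overfits, low gamma underfits, a moderate gamma balances both.
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for gamma in (100.0, 1.0, 0.01):
        clf = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)
        print("gamma=%-6s train=%.2f test=%.2f"
              % (gamma, clf.score(X_tr, y_tr), clf.score(X_te, y_te)))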

Overview of Regularization and Gamma: C controls how strongly misclassified points are penalized (how much error is tolerated), while gamma controls how tightly the decision boundary curves around individual training points.
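Because C and gamma interact, they are usually tuned together; below is a minimal grid-search sketch (the parameter values are illustrative, not recommendations).

    # Sketch: tune C and gamma together with cross-validated grid search.
    from sklearn.datasets import make_moons
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=300, noise=0.25, random_state=0)

    param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1, 10]}
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    search.fit(X, y)

    print("Best parameters:", search.best_params_)
    print("Best cross-validation accuracy: %.2f" % search.best_score_)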


Kernel (sklearn)

                The kernel is a hyper-parameter in SVM that specifies the function used to transform the data and separate the classes; in sklearn's SVC, the built-in options include 'linear', 'poly', 'rbf' (the default), and 'sigmoid'.
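A small sketch comparing the built-in kernels of sklearn's SVC on a toy dataset (the dataset and scores are illustrative only).

    # Sketch: compare the built-in SVC kernels with cross-validation.
    from sklearn.datasets import make_moons
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=300, noise=0.25, random_state=0)

    for kernel in ("linear", "poly", "rbf", "sigmoid"):
        scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
        print("kernel=%-8s mean accuracy=%.2f" % (kernel, scores.mean()))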

Applications of SVM

Some common applications of SVM include:

  1. Text classification, such as spam detection.
  2. Image classification and face detection.
  3. Handwriting and character recognition.
  4. Bioinformatics, such as gene and protein classification.

Pros & Cons

Advantages

  1. Works well in high-dimensional spaces.
  2. Memory efficient, since only the support vectors are needed to define the decision boundary.
  3. Effective when there is a clear margin of separation between the classes.
  4. Versatile, thanks to the different kernel functions available.

Disadvantages

  1. Training can be slow on very large datasets.
  2. Performance degrades when the classes overlap heavily or the data is noisy.
  3. Does not directly provide probability estimates.
  4. Requires careful tuning of the kernel and hyper-parameters such as C and gamma.

Conclusion

Figure xiv: SVM Complete Overview

This concludes the basic theory of Support Vector Machines. There are many more scenarios in which SVM can be applied.

We hope this blog was helpful. Good luck, and thank you.

If there is any technology you would like to learn, let us know below and we will publish it on our site http://tossolution.com/

Like our page on Facebook and follow us for new technical updates.
