A Support Vector Machine (SVM) is a very powerful and versatile supervised machine learning model, capable of performing linear or non-linear classification, regression, and even outlier detection.
Suppose we are given some data points, each belonging to one of two classes, and the goal is to separate the two classes based on a set of examples.
In SVM, a data point is viewed as a p-dimensional vector (a list of p numbers), and we want to know whether we can separate such points with a (p-1)-dimensional hyperplane. A classifier that does this is called a linear classifier.
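To make the idea concrete, here is a minimal sketch of a linear classifier: it assigns a class by checking which side of a hyperplane a point falls on. The vector w and scalar b below are illustrative values, not taken from the text.

```python
import numpy as np

# Hyperplane: w . x - b = 0 (w and b chosen for illustration).
w = np.array([1.0, 1.0])   # normal vector of the hyperplane
b = 1.0                    # offset term

def classify(x):
    # A linear classifier: return +1 or -1 depending on
    # which side of the hyperplane the point x lies on.
    return 1 if np.dot(w, x) - b > 0 else -1

print(classify(np.array([2.0, 2.0])))    # → 1
print(classify(np.array([-1.0, -1.0])))  # → -1
```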
There are many hyperplanes that could classify the data. The best choice is the hyperplane that represents the largest separation, or margin, between the two classes.
If such a hyperplane exists, it is known as the maximum-margin hyperplane, and the linear classifier it defines is known as a maximum-margin classifier. In the figure, H3 is the hyperplane that best divides the data.
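A maximum-margin classifier can be sketched with scikit-learn's `SVC` (an assumption of this example; the text does not prescribe a library). With a linear kernel and a very large penalty `C`, the fit approximates the hard-margin case on linearly separable toy data:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable toy classes (illustrative data).
X = np.array([[0, 0], [1, 0], [0, 1],
              [3, 3], [4, 3], [3, 4]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])

# A large C approximates the hard-margin (maximum-margin) classifier.
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

print(clf.predict([[0.5, 0.5], [3.5, 3.5]]))  # → [-1  1]
```

The fitted `clf.coef_` and `clf.intercept_` correspond to w and -b in the hyperplane equation below.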
We are given training data (x1, y1), …, (xn, yn), where each xi = (xi1, …, xip) is a p-dimensional feature vector and each yi is either 1 or -1, indicating the class to which xi belongs.
The hyperplane H3 is the set of points x satisfying:
w · x - b = 0
where w is the normal vector to the hyperplane. The parameter b/||w|| determines the offset of the hyperplane from the origin along the normal vector w.
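These quantities are easy to compute directly. The sketch below, using illustrative values of w and b, evaluates the offset b/||w|| and the signed distance (w · x - b)/||w|| of a point from the hyperplane:

```python
import numpy as np

# Illustrative hyperplane w . x - b = 0.
w = np.array([3.0, 4.0])
b = 10.0

norm_w = np.linalg.norm(w)   # ||w|| = 5
offset = b / norm_w          # offset from the origin along w

def signed_distance(x):
    # Signed distance of x from the hyperplane.
    return (np.dot(w, x) - b) / norm_w

print(offset)                                  # → 2.0
print(signed_distance(np.array([2.0, 1.0])))   # (6 + 4 - 10)/5 → 0.0
```

The point (2, 1) gives w · x = 10 = b, so it lies exactly on the hyperplane.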
For linearly separable data, we can pick two parallel hyperplanes, one on each side of H3, that touch the nearest points of each class. A point xi lying on one of these margin boundaries satisfies:
w · xi - b = 1 or w · xi - b = -1
and every training point must lie on or outside the boundary for its own class, which can be written compactly as yi (w · xi - b) ≥ 1 for all i.
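The combined constraint can be checked numerically. Below, a hypothetical hyperplane and toy points (chosen for illustration, not from the text) show that points on the margin boundaries give exactly 1, while points farther away give larger values:

```python
import numpy as np

# Hypothetical separating hyperplane.
w = np.array([1.0, 0.0])
b = 0.0

X = np.array([[1.0, 0.5],    # on the boundary w . x - b = 1
              [2.0, 1.0],    # beyond it
              [-1.0, 0.0],   # on the boundary w . x - b = -1
              [-3.0, 2.0]])  # beyond it
y = np.array([1, 1, -1, -1])

# Both conditions combine into one inequality: yi * (w . xi - b) >= 1.
margins = y * (X @ w - b)
print(margins)                 # → [1. 2. 1. 3.]
print(np.all(margins >= 1))    # → True
```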