Support Vector Machine and its Applications in Classification Problems
A slecture by Xing Liu
Partially based on the ECE662 Spring 2014 lecture material of Prof. Mireille Boutin.
Outline of the slecture
- Background in Linear Classification Problem
- Support vector machine
- Summary
- References
Background in Linear Classification Problem
In a linear classification problem, the feature space can be divided into different regions by hyperplanes. In this lecture, we take a two-category case to illustrate. Given training samples $ \vec{y}_1,\vec{y}_2,...,\vec{y}_n \in \mathbb{R}^p $, each $ \vec{y}_i $ is a p-dimensional vector belonging to either class $ w_1 $ or class $ w_2 $. The goal is to find the maximum-margin hyperplane that separates the points in the feature space belonging to class $ w_1 $ from those belonging to class $ w_2 $. Using augmented feature vectors (each $ \vec{y}_i $ is prepended with a constant 1), the discriminant function can be written as

$ g(\vec{y}) = \vec{c}\cdot\vec{y}. $
We want to find $ \vec{c}\in\mathbb{R}^{p+1} $ so that a testing data point $ \vec{y}_i $ is labelled

$ w_1 $ if $ \vec{c}\cdot\vec{y}_i > 0 $, and $ w_2 $ if $ \vec{c}\cdot\vec{y}_i < 0 $.
We can apply a trick here and replace every $ \vec{y} $ in class $ w_2 $ by $ -\vec{y} $; the task then becomes looking for $ \vec{c} $ so that

$ \vec{c}\cdot\vec{y}_i > 0, \quad \forall i = 1,\ldots,n. $
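A minimal numerical sketch of this setup, assuming NumPy; the toy samples and the candidate $ \vec{c} $ below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical 2-D toy samples (not from the lecture).
y_class1 = np.array([[2.0, 1.0], [3.0, 2.5]])     # class w1
y_class2 = np.array([[-1.0, -2.0], [-2.5, 0.5]])  # class w2

def augment(Y):
    """Prepend a constant 1 to each sample (augmented feature space)."""
    return np.hstack([np.ones((Y.shape[0], 1)), Y])

# The trick: negate the augmented class-w2 samples, so that a separating
# vector c must satisfy c . y_i > 0 for ALL samples.
Y = np.vstack([augment(y_class1), -augment(y_class2)])

def separates(c, Y):
    """True if c . y_i > 0 for every sign-normalized sample y_i."""
    return bool(np.all(Y @ c > 0))

c = np.array([0.0, 1.0, 1.0])  # an assumed candidate weight vector
print(separates(c, Y))         # True; note separates(2 * c, Y) is also True
```

Note that scaling $ \vec{c} $ leaves the answer unchanged, which is exactly the ambiguity discussed below.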
In the feature space, the decision hyperplane through the origin is defined by $ \vec{c}\cdot\vec{y} = 0 $, with $ \vec{c} $ as its normal. Dually, each sample $ \vec{y}_i $ defines a hyperplane $ \vec{c}\cdot\vec{y}_i = 0 $ in the space of weight vectors, and a solution $ \vec{c} $ must lie on the positive side of every such hyperplane.
You might have already observed the ambiguity of $ \vec{c} $ in the above discussion: if $ \vec{c} $ separates the data, then $ \lambda\vec{c} $ also separates the data for any $ \lambda > 0 $. One solution is to impose the normalization $ \|\vec{c}\| = 1 $. Another is to introduce the concept of a "margin", which we denote by $ b $, and ask that

$ \vec{c}\cdot\vec{y}_i \geq b > 0, \quad \forall i = 1,\ldots,n. $
In this scenario, the constraint keeps $ \vec{c} $ at a distance of at least $ \frac{b}{\|\vec{y}_i\|} $ from each hyperplane $ \vec{c}\cdot\vec{y}_i = 0 $, shrinking the solution region. However, it is not always possible to find such a $ \vec{c} $ directly. An alternative approach is to find the $ \vec{c} $ that minimizes a criterion function $ J(\vec{c}) $ chosen so that its minimizer satisfies $ \vec{c}\cdot\vec{y}_i > 0 $ for all $ i $.
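The lecture does not fix a particular $ J $; one classical choice is the perceptron criterion $ J_p(\vec{c}) = \sum_{\vec{y}_i \text{ misclassified}} (-\vec{c}\cdot\vec{y}_i) $, which is zero exactly when $ \vec{c} $ separates the data and can be minimized by gradient descent. A sketch under that assumption, reusing Y and separates from the snippet above:

```python
def perceptron_descent(Y, lr=0.1, max_iter=1000):
    """Batch gradient descent on the perceptron criterion
    J(c) = sum of -c . y_i over misclassified samples (c . y_i <= 0).
    Returns a separating c if the data are linearly separable and
    max_iter is large enough."""
    c = np.zeros(Y.shape[1])
    for _ in range(max_iter):
        misclassified = Y[Y @ c <= 0]        # samples violating c . y_i > 0
        if len(misclassified) == 0:
            break                            # J(c) = 0, so c separates the data
        c += lr * misclassified.sum(axis=0)  # step along the negative gradient of J
    return c

c = perceptron_descent(Y)
print(separates(c, Y))  # True on the toy data above
```

Because the criterion only penalizes misclassified samples, the returned $ \vec{c} $ is merely *a* separating vector, not the maximum-margin one; finding the latter is what the support vector machine addresses in the next section.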