Support Vector Machine and its Applications in Classification Problems
A slecture by Xing Liu
Partially based on the ECE662 Spring 2014 lecture material of Prof. Mireille Boutin.
Outline of the slecture
- Background in Linear Classification Problem
- Support vector machine
- Summary
- References
Background in Linear Classification Problem
In this section, we introduce the framework and basic ideas of the linear classification problem.
In a linear classification problem, the feature space can be divided into different regions by hyperplanes. In this lecture, we take a two-category case to illustrate. Given training samples $ \textbf{y}_1,\textbf{y}_2,...\textbf{y}_n \in \mathbb{R}^p $, each $ \textbf{y}_i $ is a p-dimensional vector and belongs to either class $ w_1 $ or $ w_2 $. The goal is to find the maximum-margin hyperplane that separates the points in the feature space belonging to class $ w_1 $ from those belonging to class $ w_2 $. The discriminant function can be written as

$ g(\textbf{y}) = \textbf{c}\cdot\textbf{y} $
We want to find $ \textbf{c}\in\mathbb{R}^{p} $ so that a testing data point $ \textbf{y}_i $ is labelled

$ \textbf{y}_i \in w_1 $ if $ \textbf{c}\cdot\textbf{y}_i > 0 $, and $ \textbf{y}_i \in w_2 $ if $ \textbf{c}\cdot\textbf{y}_i < 0 $.
We can apply a trick here and replace all $ \textbf{y} $'s in class $ w_2 $ by $ -\textbf{y} $; the task then becomes looking for $ \textbf{c} $ so that

$ \textbf{c}\cdot\textbf{y}_i > 0, \quad i = 1,...,n. $
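As a quick illustration of this sign-normalization trick, the sketch below (a minimal example assuming NumPy, with made-up 2D samples and a hand-picked separating vector) checks that after negating the class $ w_2 $ samples, separation is expressed by the single condition $ \textbf{c}\cdot\textbf{y}_i > 0 $ for every sample.

```python
import numpy as np

# Hypothetical 2D toy samples: class w1 lies above the line y2 = y1, class w2 below it.
w1_samples = np.array([[0.0, 1.0], [1.0, 2.5], [-1.0, 0.5]])
w2_samples = np.array([[0.0, -1.0], [1.5, 0.5], [-0.5, -2.0]])
c = np.array([-1.0, 1.0])            # a separating vector chosen by hand

# Before the trick: the two classes satisfy conditions of opposite sign.
print(w1_samples @ c)                # all entries positive
print(w2_samples @ c)                # all entries negative

# After the trick: negate the w2 samples, so one condition c.y > 0 covers everything.
normalized = np.vstack([w1_samples, -w2_samples])
print(np.all(normalized @ c > 0))    # True
```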
You might have already observed an ambiguity of $ \textbf{c} $ in the above discussion: if $ \textbf{c} $ separates the data, then $ \lambda \textbf{c} $ with $ \lambda>0 $ also separates the data. One solution is to require $ ||\textbf{c}||=1 $. Another is to introduce the concept of a "margin", denoted $ b_i>0 $ for each sample, and ask that

$ \textbf{c}\cdot\textbf{y}_i \geq b_i, \quad i = 1,...,n. $
In this scenario, the hyperplane is defined by $ \textbf{c}\cdot \textbf{y} = b $. The vector $ \textbf{c} $ is the normal of the plane and lies on the positive side of every hyperplane $ \textbf{c}\cdot\textbf{y}_i = b_i $, and $ \frac{b_i}{||\textbf{c}||} $ is the distance from each point $ \textbf{y}_i $ to the hyperplane.
The above approach is equivalent to finding a solution for

$ \textbf{Y}\textbf{c} = \textbf{b}, $
where $ \textbf{Y} =\begin{bmatrix} \textbf{y}_1^T \\ \textbf{y}_2^T \\ \vdots \\ \textbf{y}_n^T \end{bmatrix} $ and $ \textbf{b} = (b_1, b_2, ..., b_n)^T $.
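To make the matrix notation concrete, here is a minimal sketch (assuming NumPy, reusing the hypothetical toy samples above, and picking a constant margin of 1 purely for illustration) that stacks the sign-normalized samples into $ \textbf{Y} $ and builds the margin vector $ \textbf{b} $.

```python
import numpy as np

def build_system(w1_samples, w2_samples, margin=1.0):
    """Stack sign-normalized samples into Y and build the margin vector b.

    Rows of Y are the sample vectors y_i^T, with class-w2 samples negated,
    so that separation with margin reads Y @ c >= b componentwise.
    """
    Y = np.vstack([w1_samples, -w2_samples])
    b = np.full(Y.shape[0], margin)
    return Y, b

w1_samples = np.array([[0.0, 1.0], [1.0, 2.5], [-1.0, 0.5]])
w2_samples = np.array([[0.0, -1.0], [1.5, 0.5], [-0.5, -2.0]])
Y, b = build_system(w1_samples, w2_samples)
print(Y.shape, b.shape)              # (6, 2) (6,)
```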
In most cases when $ n>p $, it is impossible to find an exact solution for $ \textbf{c} $. An alternative approach is to find the $ \textbf{c} $ that minimizes a criterion function $ J(\textbf{c}) $. There are various forms of criterion functions. For example, we can try to minimize the error vector between $ \textbf{Y}\textbf{c} $ and $ \textbf{b} $, hence the criterion function can be defined as

$ J(\textbf{c}) = ||\textbf{Y}\textbf{c}-\textbf{b}||^2. $
The solution to the above problem is

$ \textbf{c} = (\textbf{Y}^T\textbf{Y})^{-1}\textbf{Y}^T\textbf{b} $
if $ det(\textbf{Y}^T\textbf{Y})\neq 0 $. More generally, one can take

$ \textbf{c} = \lim_{\epsilon \to 0} (\textbf{Y}^T\textbf{Y}+\epsilon\textbf{I})^{-1}\textbf{Y}^T\textbf{b}, $

which exists even when $ det(\textbf{Y}^T\textbf{Y}) = 0 $ and coincides with the pseudoinverse solution $ \textbf{c} = \textbf{Y}^{\dagger}\textbf{b} $.
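As a sketch of how this least-squares solution might be computed (assuming NumPy and continuing the hypothetical toy example; `np.linalg.pinv` computes the Moore-Penrose pseudoinverse, which agrees with the $ \epsilon $-limit above):

```python
import numpy as np

# Continuing the toy example: rows of Y are the sign-normalized samples, b is the margin vector.
w1_samples = np.array([[0.0, 1.0], [1.0, 2.5], [-1.0, 0.5]])
w2_samples = np.array([[0.0, -1.0], [1.5, 0.5], [-0.5, -2.0]])
Y = np.vstack([w1_samples, -w2_samples])
b = np.ones(Y.shape[0])

# MSE solution c = (Y^T Y)^{-1} Y^T b when Y^T Y is invertible; the pseudoinverse
# also covers the singular case det(Y^T Y) = 0.
c = np.linalg.pinv(Y) @ b
print(c)

# Equivalent route: minimize ||Yc - b||^2 directly with a least-squares solver.
c_lstsq, *_ = np.linalg.lstsq(Y, b, rcond=None)
print(np.allclose(c, c_lstsq))       # True

# Check whether the MSE solution also separates the training samples.
print(np.all(Y @ c > 0))             # True for this toy set
```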