Support Vector Machine and its Applications in Classification Problems
A slecture by Xing Liu, partially based on the ECE662 Spring 2014 lecture material of Prof. Mireille Boutin.



Outline of the slecture

  • Linear discriminant functions
  • Summary
  • References


Linear Classification: Problem Statement

In a linear classification problem, the feature space can be divided into different regions by hyperplanes. In this slecture, we use a two-category case to illustrate. Given training samples $ \vec{y}_1,\vec{y}_2,...\vec{y}_n \in \mathbb{R}^p $, each $ \vec{y}_i $ is a p-dimensional vector and belongs to either class $ w_1 $ or $ w_2 $. The goal is to find the maximum-margin hyperplane that separates the points in the feature space that belong to class $ w_1 $ from those that belong to class $ w_2 $. The discriminant function can be written as

$ g(\vec{y}) = c\cdot\vec{y} $

We want to find $ c\in\mathbb{R}^{p+1} $ so that a test data point $ \vec{y} $ is labeled $ w_1 $ if $ c\cdot\vec{y}>0 $ and labeled $ w_2 $ if $ c\cdot\vec{y}<0 $. (The vector $ c $ lives in $ \mathbb{R}^{p+1} $ because each feature vector is augmented with a constant component equal to 1, which absorbs the offset of the hyperplane into $ c $.) In the original feature space, the separating hyperplane can be written as $ c\cdot\vec{y}=b $, where $ \cdot $ denotes the dot product, $ c $ determines the orientation of the hyperplane, and $ b $ determines its offset from the origin.
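To make the decision rule concrete, here is a minimal sketch in Python (not part of the original slecture; the toy training points, the choice of scikit-learn's SVC, and the test point are all illustrative assumptions). It fits a maximum-margin linear classifier to separable two-class data and then labels a test point by the sign rule above.

    import numpy as np
    from sklearn.svm import SVC

    # Toy two-class training data (made-up values): class w1 clusters
    # around (2, 2) and class w2 around (-2, -2), so they are separable.
    X = np.array([[2.0, 2.0], [2.5, 1.5], [1.5, 2.5],
                  [-2.0, -2.0], [-2.5, -1.5], [-1.5, -2.5]])
    labels = np.array([1, 1, 1, -1, -1, -1])  # +1 stands for w1, -1 for w2

    # A large C approximates the hard-margin (maximum-margin) classifier.
    clf = SVC(kernel="linear", C=1e6)
    clf.fit(X, labels)

    c = clf.coef_[0]        # orientation of the hyperplane
    b = -clf.intercept_[0]  # the hyperplane is then  c . y = b

    # Decision rule from the text: w1 if c . y > b, w2 if c . y < b.
    y_test = np.array([1.0, 0.5])
    print("w1" if np.dot(c, y_test) > b else "w2")  # prints: w1

Equivalently, one could append a constant 1 to every feature vector and learn a single augmented weight vector, which is the $ \mathbb{R}^{p+1} $ convention used above.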
