= HW assignment 2, [[ECE662]] Spring 2008, [[User:mboutin|Prof. Boutin]]=
  
 
=== Assignment Description ===
[Official version: [http://cobweb.ecn.purdue.edu/~mboutin/ECE662/hw2.htm HTML] [http://cobweb.ecn.purdue.edu/~mboutin/ECE662/hw2.pdf PDF]]
  
Due Tuesday April 1, 2008
  
 
Guidelines:
* Write a short report to present your results.
* Be sure to include all the relevant graphs as well as a copy of your code.
* Teamwork is encouraged, but the write-up of your report must be your own.
* Please write the names of ALL your collaborators on the cover page of your report.
  
 
==== Question 1====
In the Parametric Method section of the course, we learned how to draw a separating hyperplane between two classes by obtaining <math>w_0</math>, the argmax of the cost function <math>J(w)=\frac{w^T S_B w}{w^T S_w w}</math>. The solution was found to be <math>w_0 = S_w^{-1}(m_1-m_2)</math>, where <math>m_1</math> and <math>m_2</math> are the sample means of the two classes, respectively.
  
 
Some students raised the question: can one simply use <math>J(w)= w^T S_B w</math> instead (i.e., setting <math>S_w</math> to the identity matrix in the solution <math>w_0</math>)? Investigate this question by numerical experimentation.
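Below is a minimal Matlab sketch of one possible experiment (not the official solution): it draws two synthetic Gaussian classes with a shared, anisotropic covariance, computes both the Fisher direction <math>S_w^{-1}(m_1-m_2)</math> and the naive direction <math>m_1-m_2</math>, and compares the resulting error rates. All data parameters below are arbitrary illustrative choices.

<pre>
% Compare the Fisher direction w0 = Sw^{-1}(m1 - m2) with the "naive"
% direction m1 - m2 (i.e. Sw replaced by the identity) on synthetic data.
randn('state',0);                         % make the experiment repeatable
n  = 500;                                 % samples per class
A  = [2 1.5; 0 0.5];                      % shaping matrix -> anisotropic covariance
X1 = randn(n,2)*A';                       % class 1, mean [0 0]
X2 = randn(n,2)*A' + repmat([2 1],n,1);   % class 2, mean [2 1]

m1 = mean(X1)';  m2 = mean(X2)';
Z1 = X1 - repmat(m1',n,1);  Z2 = X2 - repmat(m2',n,1);
Sw = Z1'*Z1 + Z2'*Z2;                     % within-class scatter matrix

dirs  = {Sw\(m1-m2), m1-m2};              % Fisher vs. naive direction
names = {'Fisher', 'naive (Sw = I)'};
for k = 1:2
    w   = dirs{k}/norm(dirs{k});
    t   = (m1'*w + m2'*w)/2;              % threshold halfway between projected means
    err = (sum(X1*w < t) + sum(X2*w >= t)) / (2*n);
    fprintf('%s direction: training error = %.3f\n', names{k}, err);
end
</pre>

With a strongly anisotropic within-class scatter, the Fisher direction typically gives a noticeably lower error, while for (near-)identity class covariances the two directions nearly coincide; varying the shaping matrix <code>A</code> lets you probe when the simplification is harmless.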
  
  
==== Question 2====
Obtain a set of training data. Divide the training data into two sets. Use the first set as training data and the second set as test data.
  
a) Experiment with designing a classifier using the neural network approach.
  
b) Experiment with designing a classifier using the support vector machine approach.
  
c) Compare the two approaches.
  
 
Note: you may use code downloaded from the web, but if you do so, please be sure to explain what the code does in your report and give the reference.
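As a starting point, here is a hedged Matlab sketch of the train/test setup. It assumes a feature matrix <code>X</code> and label vector <code>y</code> are already loaded, and that the Neural Network Toolbox (<code>newff</code>/<code>train</code>/<code>sim</code>) and the LIBSVM Matlab interface (<code>svmtrain</code>/<code>svmpredict</code>, linked in the Links section below) are installed; exact function signatures may differ between versions.

<pre>
% Sketch: split the data, train a small neural network and an SVM,
% and compare test errors.  Assumes X (N x d features) and y (N x 1
% labels in {0,1}) are already loaded.
N   = size(X,1);
idx = randperm(N);
ntr = round(0.7*N);                          % 70/30 train/test split
Xtr = X(idx(1:ntr),:);     ytr = y(idx(1:ntr));
Xte = X(idx(ntr+1:end),:); yte = y(idx(ntr+1:end));

% --- (a) neural network (Neural Network Toolbox; signatures per the ~2008 docs)
net  = newff(minmax(Xtr'), [10 1], {'tansig','logsig'});  % 10 hidden units
net  = train(net, Xtr', ytr');               % inputs/targets are columns
yhat = sim(net, Xte') > 0.5;                 % threshold the network output
err_nn = mean(yhat(:) ~= yte(:));

% --- (b) support vector machine (LIBSVM Matlab interface, linked below)
model = svmtrain(ytr, Xtr, '-t 2 -c 1');     % RBF kernel, C = 1
pred  = svmpredict(yte, Xte, model);
err_svm = mean(pred ~= yte);

fprintf('NN test error: %.3f   SVM test error: %.3f\n', err_nn, err_svm);
</pre>

Reporting the two test errors over several random splits (and for a few network sizes and kernel/C choices) gives the comparison asked for in part (c).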
  
  
==== Question 3 ====
Using the same data as for question 2 (perhaps projected to one or two dimensions for better visualization),
  
 
a) Design a classifier using the Parzen window technique.

b) Design a classifier using the K-nearest neighbor technique.

c) Design a classifier using the nearest neighbor technique.

d) Compare the three approaches.
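A minimal base-Matlab sketch of the three decision rules for a single test point is given below. It assumes training matrices <code>X1</code> and <code>X2</code> (one row per sample, one matrix per class) and a test row vector <code>x</code>; the window width <code>h</code> and neighborhood size <code>K</code> are arbitrary illustrative choices.

<pre>
% Parzen window, K-nearest-neighbor and nearest-neighbor decision rules
% in base Matlab.  X1, X2: training samples (one row each) of class 1
% and class 2; x: a 1-by-d test point.  h and K are illustrative choices.
h = 1.0;  K = 5;  d = size(X1,2);

% squared Euclidean distances from x to every training sample
d1 = sum((X1 - repmat(x,size(X1,1),1)).^2, 2);
d2 = sum((X2 - repmat(x,size(X2,1),1)).^2, 2);

% (a) Parzen window: Gaussian-kernel density estimate per class, then
%     pick the class with the larger density times estimated prior
p1 = mean(exp(-d1/(2*h^2))) / ((2*pi)^(d/2)*h^d);
p2 = mean(exp(-d2/(2*h^2))) / ((2*pi)^(d/2)*h^d);
class_parzen = 1 + (p2*size(X2,1) > p1*size(X1,1));

% (b) K-nearest neighbor: majority vote among the K closest samples
alld   = [d1; d2];
labels = [ones(size(d1)); 2*ones(size(d2))];
[sorted, order] = sort(alld);
class_knn = mode(labels(order(1:K)));

% (c) nearest neighbor: the special case K = 1
class_nn = labels(order(1));
</pre>

Looping this over all test points, and over a few values of <code>h</code> and <code>K</code>, produces the error-rate comparison asked for in part (d).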
  
 
=== Data ===
* Publicly available data sources are listed [[Data_Sources_OldKiwi|here]].
* A simple Perl script that converts data from the LIBSVM format to the FANN format; it lets you quickly convert data if you are using '''FANN''' and '''LIBSVM''' (please follow the link under [[Tools_OldKiwi]]).
  
== LINK : Matlab Code ==

1. [http://homepages.cae.wisc.edu/~ece539/matlab/ A collection of Matlab code] (its contents are listed below):
  
 
a). KNN classifier

b). Classification using SVM

c). Demonstration of the Parzen window

d). Several Matlab codes related to learning, clustering, and pattern classification

2. [http://my.fit.edu/~rperalta/RECENT_PROJECTS/K-NearestNeighbor.html KNN Classifier Matlab code]

3. [http://isp.imm.dtu.dk/toolbox/ann/ Toolbox holding a collection of Artificial Neural Network (ANN) algorithms implemented for Matlab]

a). Neural classifier for multiple-class data

b). Neural classifier for binary-class data

4. [http://www.mathworks.com/access/helpdesk/help/techdoc/index.html?/access/helpdesk/help/techdoc/ref/delaunay.html&http://www.google.com/search?hl=ko&q=voronoi+matlab+nearest+neighbor&lr= Voronoi Diagram]
: The Voronoi diagram is related to the nearest neighbor technique: each Voronoi cell is exactly the region that the nearest neighbor rule assigns to the class of the training point inside it.
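For intuition, the short base-Matlab snippet below (with made-up 2-D training points) plots the Voronoi diagram of a training set; each cell is the 1-nearest-neighbor decision region of the point it contains.

<pre>
% Voronoi cells of a training set coincide with the decision regions of
% the 1-nearest-neighbor rule; plot them for some made-up 2-D points.
randn('state',1);
X = [randn(20,2); randn(20,2)+2];     % 40 illustrative training points, two classes
voronoi(X(:,1), X(:,2));              % draw the Voronoi tessellation
hold on
plot(X(1:20,1),  X(1:20,2),  'bo');   % class 1 training points
plot(X(21:40,1), X(21:40,2), 'r+');   % class 2 training points
hold off
</pre>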
5. The [http://www.ncrg.aston.ac.uk/netlab/ Netlab] library includes software implementations of a wide range of data analysis techniques, many of which are not yet available in standard neural network simulation packages (in Matlab).
== LINK : C Code ==
1. [http://www.codeproject.com/KB/cpp/MLP.aspx Neural network classifier]
== LINK : Documentation ==
1. [http://www.ir.iit.edu/~nazli/cs422/CS422-Slides/DM-NeuralNetwork.pdf Neural Network Classifier]

2. [http://www.yom-tov.info/ComputerManualAppendix.pdf Manual in MATLAB to accompany "Pattern Classification"]
: It contains many pattern recognition algorithms and gives descriptions and pseudocode for them.
== Links ==
Links to many SVM software packages, tutorials, etc. Most of these sites are compilations of several links to code on the web.

1. SVM and Kernel Methods Matlab Toolbox
http://asi.insa-rouen.fr/enseignants/~arakotom/toolbox/index.html

2. SVM - Support Vector Machines Software
http://www.support-vector-machines.org/SVM_soft.html

3. Some SVM sample data
http://www.cs.iastate.edu/~dcaragea/SVMVis/data_sets.htm

4. LIBSVM - A library of SVM software, including both C and Matlab code. Various interfaces for several platforms are available as well.
http://www.csie.ntu.edu.tw/~cjlin/libsvm/

Links to Matlab Toolbox tutorials:

1. SVM Matlab Bioinformatics Toolbox http://www.mathworks.com/access/helpdesk/help/toolbox/bioinfo/index.html?/access/helpdesk/help/toolbox/bioinfo/ref/svmclassify.html&http://www.mathworks.com/cgi-bin/texis/webinator/search/

2. Neural network Matlab Toolbox
http://www.mathworks.com/access/helpdesk/help/toolbox/nnet/
== Peer review of the homework==
[[Peer review_OldKiwi]]
----
[[ECE662:BoutinSpring08_OldKiwi|Back to ECE662 Spring 2008 Prof. Boutin]]

[[Category:ECE662]]
