Revision as of 23:40, 9 March 2008 by Park200 (Talk)


Class Lecture Notes

    • **Nearest Neighbor Classification Rule**

- useful when there are many possible labels, e.g. fingerprint-based recognition

.. |X_space_d| image:: tex
   :alt: tex: \vec{X_1}, \vec{X_2}, \ldots, \vec{X_d}

.. |inRn| image:: tex
   :alt: tex: \in \mathbb{R}^n

.. |X_0| image:: tex
   :alt: tex: \vec{X_0}

.. |X| image:: tex
   :alt: tex: \vec{X}

Problem: Given the labeled training samples: |X_space_d| |inRn| (or some other feature space) and an unlabeled test point |X_0| |inRn|.

Classification: Let |X| be the training point closest to |X_0|; we then assign the class of |X| to |X_0|.
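The rule above can be sketched in a few lines. This is a minimal illustration (the function and variable names are ours, not from the notes), using the Euclidean metric defined further below:

```python
import math

# 1-NN sketch: assign the test point the label of its nearest
# training point under the Euclidean distance.
def euclidean(x1, x2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)))

def nearest_neighbor(training, x0):
    # training: list of (feature_vector, label) pairs
    closest = min(training, key=lambda pair: euclidean(pair[0], x0))
    return closest[1]

training = [((0.0, 0.0), "A"), ((1.0, 1.0), "A"), ((5.0, 5.0), "B")]
print(nearest_neighbor(training, (4.0, 4.5)))  # nearest point is (5, 5), so "B"
```

Any of the metrics discussed below could be substituted for `euclidean` without changing the rest of the rule.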

    • **What do we mean by closest?**

.. |D_from_SxS_to_R| image:: tex
   :alt: tex: D: S\times S\rightarrow \mathbb{R}

There are many meanings, depending on the metric we choose for the feature space.

    • **Definition** A "metric" on a space S is a function

|D_from_SxS_to_R|

.. |nonnegat| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})\geq 0, \forall \vec{x_1},\vec{x_2}\in S

.. |symmetry| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})=D(\vec{x_2},\vec{x_1}), \forall \vec{x_1},\vec{x_2}\in S

.. |reflex| image:: tex
   :alt: tex: D(\vec{x},\vec{x})=0, \forall \vec{x}\in S

.. |tri-ineq| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})+D(\vec{x_2},\vec{x_3})\geq D(\vec{x_1},\vec{x_3}), \forall \vec{x_1}, \vec{x_2}, \vec{x_3}\in S

that satisfies the following four properties:

- Non-negativity: |nonnegat|
- Symmetry: |symmetry|
- Reflexivity: |reflex|
- Triangle inequality: |tri-ineq|
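The four axioms can be checked numerically for a candidate distance function. A small sketch (our own helper names), verifying them for the Manhattan distance on a random sample of points:

```python
import itertools
import random

def manhattan(x1, x2):
    return sum(abs(a - b) for a, b in zip(x1, x2))

random.seed(0)
points = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(15)]

for x1, x2, x3 in itertools.product(points, repeat=3):
    assert manhattan(x1, x2) >= 0                      # non-negativity
    assert manhattan(x1, x2) == manhattan(x2, x1)      # symmetry
    assert manhattan(x1, x1) == 0                      # reflexivity
    # small tolerance to absorb floating-point rounding
    assert manhattan(x1, x2) + manhattan(x2, x3) >= manhattan(x1, x3) - 1e-9
print("all four axioms hold on the sample")
```

Of course, passing on a finite sample is evidence, not a proof; the axioms must be verified analytically for a new metric.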

.. |euclid| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})=||\vec{x_1}-\vec{x_2}||_{L_2}=\sqrt{\sum_{i=1}^n ({x_1}^i-{x_2}^i)^2}

.. |manhattan| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})=||\vec{x_1}-\vec{x_2}||_{L_1}=\sum_{i=1}^n |{x_1}^i-{x_2}^i|

.. |minkowski| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})=||\vec{x_1}-\vec{x_2}||_{L_p}=(\sum_{i=1}^n |{x_1}^i-{x_2}^i|^p)^{\frac{1}{p}}

.. |riemannian| image:: tex
   :alt: tex: D(\vec{x_1},\vec{x_2})=\sqrt{(\vec{x_1}-\vec{x_2})^\top \mathbb{M}(\vec{x_1}-\vec{x_2})}

    • **Examples of metrics**

Euclidean distance: |euclid|

Manhattan distance: |manhattan|

Minkowski metric: |minkowski|

Riemannian metric: |riemannian|

.. |nxn| image:: tex
   :alt: tex: n\times n

where M is a symmetric positive-definite |nxn| matrix. Different choices of M assign different weights to the different feature components.
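To make the role of M concrete, here is a small sketch (our own helper names) of the quadratic-form distance with a diagonal M; taking M to be the identity recovers the ordinary Euclidean distance:

```python
import math

# D(x1, x2) = sqrt((x1 - x2)^T M (x1 - x2)) for a positive-definite M.
def weighted_distance(x1, x2, M):
    d = [a - b for a, b in zip(x1, x2)]
    # quadratic form d^T M d
    q = sum(d[i] * M[i][j] * d[j]
            for i in range(len(d)) for j in range(len(d)))
    return math.sqrt(q)

identity = [[1, 0], [0, 1]]
weights  = [[4, 0], [0, 1]]  # penalize differences in the first component more

print(weighted_distance((0, 0), (1, 1), identity))  # sqrt(2), the Euclidean case
print(weighted_distance((0, 0), (1, 1), weights))   # sqrt(4 + 1) = sqrt(5)
```

With `weights`, a unit disagreement in the first component contributes four times as much to the squared distance as one in the second, which is exactly the weighting the text describes.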

.. |Rn| image:: tex
   :alt: tex: \mathbb{R}^n

.. |Zn| image:: tex
   :alt: tex: \mathbb{Z}^n

.. |Cn| image:: tex
   :alt: tex: \mathbb{C}^n

In this way, we see that |Rn|, |Zn|, and |Cn| have many natural metrics, but the features could lie in some other set, e.g. a discrete set.
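For discrete feature sets the vector norms above do not apply directly, but a metric can still be defined. A standard example (our illustration, not from the notes) is the Hamming distance, the number of positions in which two equal-length sequences differ; it satisfies all four metric axioms:

```python
# Hamming distance on equal-length sequences of symbols.
def hamming(x1, x2):
    assert len(x1) == len(x2), "sequences must have equal length"
    return sum(a != b for a, b in zip(x1, x2))

print(hamming("ACGT", "ACGA"))       # 1 (last symbol differs)
print(hamming((1, 0, 1), (0, 0, 1))) # 1 (first component differs)
```

With such a metric in hand, the nearest-neighbor rule carries over unchanged to discrete feature spaces.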

