Revision as of 08:50, 22 April 2010 by Mboutin (Talk | contribs)



Details of Lecture 26, ECE662 Spring 2010

April 22, 2010

In Lecture 26, we finished our discussion of artificial neural networks. The "highlight" of the lecture was a somewhat involved computation (mostly because of the many indices) of the update rule for gradient descent to find the least-squares solution for the parameters of an ANN using an online training approach. We came to appreciate the fact that the parameters to be optimized occur within linear functions in the expression for the cost function, as well as the fact that three layers are typically sufficient to accurately approximate the k-class decision function. Otherwise, the computations would have been much worse.
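As a rough sketch of the kind of update rule derived in class (not the lecture's exact notation), the following trains a 3-layer network (one hidden layer) by online gradient descent on a squared-error cost, updating the weights after each training sample. The layer sizes, learning rate, and toy data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy 2-class problem (assumed for illustration): inputs in R^2, target in {0, 1}.
X = rng.normal(size=(20, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

n_hidden = 5
W1 = rng.normal(scale=0.5, size=(2, n_hidden))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))   # hidden -> output weights
eta = 0.5                                        # learning rate (assumed)

def mean_squared_error():
    out = sigmoid(sigmoid(X @ W1) @ W2).ravel()
    return np.mean((out - y) ** 2)

loss_before = mean_squared_error()
for epoch in range(50):
    for xk, tk in zip(X, y):            # online: one sample at a time
        h = sigmoid(xk @ W1)            # hidden-layer activations
        o = sigmoid(h @ W2).item()      # network output
        # Chain rule (back-propagation): error signal at the output unit...
        delta_o = (o - tk) * o * (1 - o)
        # ...propagated back through W2 to the hidden units.
        delta_h = delta_o * W2.ravel() * h * (1 - h)
        # Gradient-descent updates after this single sample.
        W2 -= eta * delta_o * h[:, None]
        W1 -= eta * np.outer(xk, delta_h)
loss_after = mean_squared_error()
```

The inner loop is what makes the scheme "online": each sample immediately adjusts the weights, rather than accumulating the gradient over the whole training set as a batch method would.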

Reminder: we are taking a poll regarding your favorite decision method. Please make sure to answer before the end of the semester. (Hint: stars stars!!)

It was announced that next Thursday's lecture (4-29-10) is canceled.

Note that there is a make-up class tomorrow, Friday (4-23-10), in EE117.

Previous: Lecture 25 Next: Lecture 27


