Networks for Learning: Regression and Classification

Syllabus

Course Description

The course focuses on the problem of supervised learning within the framework of Statistical Learning Theory. It starts with a review of classical statistical techniques, including Regularization Theory in RKHS for multivariate function approximation from sparse data. Next, VC theory is discussed in detail and used to justify classification and regression techniques such as Regularization Networks and Support Vector Machines. Selected topics such as boosting, feature selection, and multiclass classification will complete the theory part of the course. During the course we will examine applications of several learning techniques in areas such as computer vision, computer graphics, database search, and time-series analysis and prediction. We will briefly discuss implications of learning theories for how the brain may learn from experience, focusing on the neurobiology of object recognition. We plan to emphasize hands-on applications and exercises, paralleling the rapidly increasing practical uses of the techniques described in the course.
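
To give a concrete flavor of one central technique named above, here is a minimal sketch (not part of the original syllabus) of a Regularization Network: regularized least squares in an RKHS with a Gaussian kernel. The kernel width sigma, the regularization parameter lam, and the toy sine data are made-up illustration choices, not values specified by the course.

    # Sketch of a Regularization Network: regularized least squares in an
    # RKHS with a Gaussian kernel. Parameters and data are illustrative.
    import numpy as np

    def gaussian_kernel(X1, X2, sigma=1.0):
        # K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def fit(X, y, lam=0.1, sigma=1.0):
        # Solve (K + lam * n * I) c = y, giving the coefficients of
        # f(x) = sum_i c_i K(x, x_i), the minimizer of
        # (1/n) sum_i (y_i - f(x_i))^2 + lam * ||f||_K^2.
        n = len(X)
        K = gaussian_kernel(X, X, sigma)
        return np.linalg.solve(K + lam * n * np.eye(n), y)

    def predict(X_train, c, X_new, sigma=1.0):
        return gaussian_kernel(X_new, X_train, sigma) @ c

    # Toy example: approximate a noisy sine from sparse data.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 2.0 * np.pi, size=(20, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
    c = fit(X, y, lam=0.01)
    print(predict(X, c, np.array([[np.pi / 2]])))  # close to sin(pi/2) = 1

The linear system comes from the representer theorem: the minimizer of the regularized empirical risk is a kernel expansion over the training points, which is exactly the "network" in Regularization Networks.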

Requirements
18.02 (calculus) or permission of the instructor.

Rules of the Game
  • Attendance
  • Problem sets
  • Term paper (final project)
  • Interest
  • Effort

Grades will be based on a nonlinear function:

f = f(Problem set 1, Problem set 2, Problem set 3, Final Project, attendance, interest, effort).

We may post a data-based regression of the function after the fact, that is, after grades have been assigned. As a guideline, the final grade may be approximated by:

f ~ 0.3 (Problem set 1) + 0.3 (Problem set 2) + max [(Final Project), (Final Project + Problem set 3)]

Since scores are nonnegative, the max term resolves to Final Project + Problem set 3, so Problem set 3 can only improve the grade. But in this advanced graduate class, effort, interest, and attendance will also be taken into account!
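
As a sanity check on the guideline, the approximation can be written out directly. This sketch assumes, purely for illustration, that each component is scored on a 0-to-1 scale; the syllabus does not specify a scale, and the example values are made up.

    # Illustrative only: the linear guideline above, not the actual
    # (nonlinear, unpublished) grading function f.
    def approx_grade(ps1, ps2, ps3, final_project):
        return 0.3 * ps1 + 0.3 * ps2 + max(final_project, final_project + ps3)

    print(approx_grade(ps1=0.9, ps2=0.8, ps3=0.7, final_project=0.85))  # 2.06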
