Tik-122.102 Special Course in Information Science V
Regularization and sparse approximations
Course description

This course has two parts, each covering a different viewpoint on sparse function approximations. The first part takes a regularization point of view, and the second uses the framework of support vector machines. It is also possible to attend only one of the parts.

Part I: Regularization and sparse basis function approximations

Regularization is a technique for constraining the complexity of a regression mapping: the conventional error function (for instance, the squared error between the target values and the model outputs) is extended with a penalty term that penalizes undesired parameter values, e.g. those producing wavy (high-curvature) mappings. Regularization techniques are covered from the classical ridge regression techniques to more recent sparse regression algorithms. Function approximation is approached using a linear superposition of basis functions, with sparsity of the linear weights in the superposition serving as a criterion in learning from data. Variations on this theme are covered in the light of scientific journal articles.

Part II: Support vector machines

The second part provides insight into sparse approximations from the viewpoint of support vector machines.

Course material

Selected scientific articles, textbooks and lectures form the material for the course.

Homework 1
Exercise paper:
exercise1.pdf
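To make the Part I ideas concrete, the following is a minimal sketch (not course material — the data, basis centres, and penalty weights are all illustrative) that fits a noisy sine with a linear superposition of Gaussian basis functions, comparing ridge regression (an L2 penalty, closed form) with a sparse L1-penalized fit solved by the ISTA proximal-gradient method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine wave (illustrative, not the course data)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)

# Design matrix of Gaussian basis functions centred on a grid
centres = np.linspace(0.0, 1.0, 20)
width = 0.1
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

# Ridge regression: closed form w = (Phi^T Phi + lam I)^{-1} Phi^T y
lam_ridge = 1e-3
w_ridge = np.linalg.solve(Phi.T @ Phi + lam_ridge * np.eye(len(centres)),
                          Phi.T @ y)

def ista(Phi, y, lam, n_iter=5000):
    """Minimise 0.5*||y - Phi w||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # 1 / Lipschitz constant of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = w - step * (Phi.T @ (Phi @ w - y))                    # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return w

w_sparse = ista(Phi, y, lam=1.0)

print("ridge  nonzero weights:", int(np.sum(np.abs(w_ridge) > 1e-6)))
print("sparse nonzero weights:", int(np.sum(np.abs(w_sparse) > 1e-6)))
```

The soft-thresholding step sets small weights exactly to zero, which is what makes the L1 penalty produce a sparse superposition, whereas the ridge solution keeps all basis functions active with shrunken weights.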
Project work of Part I

The instructions and the description of the data are found in the exercise paper.
Exercise paper:
projectwork.pdf
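The support vector machine viewpoint of Part II can also be sketched briefly. In an SVM the fitted function depends only on the training points on or inside the margin (the support vectors), which is where its sparsity comes from. The sketch below (illustrative data and parameters, not course material; a Pegasos-style subgradient method stands in for a standard SVM solver, and the bias term is omitted for brevity) trains a linear soft-margin SVM and counts the margin-active points:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated Gaussian clusters with labels +1 / -1 (illustrative data)
n = 40
X = np.vstack([rng.normal(loc=[2.0, 2.0], scale=0.4, size=(n, 2)),
               rng.normal(loc=[-2.0, -2.0], scale=0.4, size=(n, 2))])
y = np.hstack([np.ones(n), -np.ones(n)])

# Full-batch Pegasos-style subgradient descent on the soft-margin objective
#   lam/2 * ||w||^2 + (1/N) * sum_i max(0, 1 - y_i w.x_i)
lam = 0.01
w = np.zeros(2)
for t in range(1, 5001):
    eta = 1.0 / (lam * t)                  # standard Pegasos step size
    viol = y * (X @ w) < 1                 # current margin violators
    w = (1.0 - eta * lam) * w + (eta / len(y)) * (y[viol] @ X[viol])

# "Support vectors": points on or inside the margin; only they shape the solution
n_support = int(np.sum(y * (X @ w) < 1.0 + 0.05))
print("training accuracy:", float(np.mean(np.sign(X @ w) == y)))
print("support vectors  :", n_support, "of", len(y))
```

Only a small fraction of the points end up on the margin, so the decision function is a sparse combination of the training data — the same theme as the sparse basis-function weights of Part I, reached from a different direction.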
Timetable

The lecture topics can be found in the timetable at http://www.cis.hut.fi/Opinnot/T-61.6060/k2005/index.shtml

webmaster@www.cis.hut.fi
Tuesday, 08-Mar-2005 12:04:06 EET