

Courses in previous years: [ 2001 | 2002 | 2003 | 2004 ]

Tik-122.102 Special Course in Information Science V L

Regularization and sparse approximations

Lecturers: Prof. (pro tem) Jaakko Hollmén, PhD (Eng.) Amaury Lendasse
Course Assistant: M.Sc. (Tech.) Jarkko Tikka, e-mail: tikka(at)mail.cis.hut.fi
Semester: Spring 2005
Credit points: 3-5 cr
Place: Lecture hall T5 in the computer science building
Time: Tuesdays, 14:15 - 16:00, starting on January 18th
Language: English
Homepage: http://www.cis.hut.fi/Opinnot/T-122.102/


Course description

This course has two parts, each covering a different viewpoint on sparse function approximation. The first part takes a regularization point of view; the second uses the framework of support vector machines. It is also possible to attend only one of the parts.

Part I: Regularization and sparse basis function approximations

Regularization is a technique for constraining the complexity of a regression mapping: the conventional error function (for instance, the squared error between the target values and the model outputs) is extended with an additional penalty term that penalizes undesired parameter values, e.g. those producing wavy mappings with high curvature.
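To make the penalty idea concrete, the sketch below fits ridge regression, where the squared error is extended with a squared-norm penalty on the weights. The data, the true weights, and the penalty weight lam are invented purely for illustration and are not part of the course material:

```python
import numpy as np

# Synthetic regression data (invented for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, 0.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=50)

lam = 1.0  # penalty weight (regularization strength)

# Ridge regression: minimize ||y - Xw||^2 + lam * ||w||^2.
# The closed-form solution is (X^T X + lam*I)^{-1} X^T y.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Ordinary least squares for comparison (no penalty term)
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# The penalty shrinks the weight vector toward zero
print("ridge norm:", np.linalg.norm(w_ridge))
print("OLS norm:  ", np.linalg.norm(w_ols))
```

Because the ridge solution minimizes the penalized objective, its norm can never exceed that of the unpenalized least-squares solution; increasing lam shrinks the weights further.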

Regularization techniques will be covered, from classical ridge regression to more recent sparse regression algorithms. Function approximation will be approached using a linear superposition of basis functions, with sparsity of the linear weights in the superposition serving as a criterion in learning from data. Variations on this theme will be covered in the light of scientific journal articles.
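As one illustration of this theme, the sketch below fits a linear superposition of Gaussian basis functions under an L1 (sparsity-inducing) penalty using iterative soft-thresholding (ISTA). The data, basis centers, and penalty strength are invented for illustration and are not taken from the course:

```python
import numpy as np

# Gaussian basis expansion of a 1-D input (centers chosen arbitrarily)
def design_matrix(x, centers, width=0.3):
    return np.exp(-(x[:, None] - centers[None, :])**2 / (2 * width**2))

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 80)
y = np.sin(np.pi * x) + 0.05 * rng.normal(size=80)

centers = np.linspace(-1, 1, 20)
Phi = design_matrix(x, centers)

# ISTA: iterative soft-thresholding for L1-penalized least squares,
# minimizing 0.5*||Phi w - y||^2 + lam * ||w||_1
lam = 0.1
L = np.linalg.norm(Phi, 2)**2          # Lipschitz constant of the gradient
w = np.zeros(len(centers))
for _ in range(2000):
    grad = Phi.T @ (Phi @ w - y)       # gradient of the squared-error term
    z = w - grad / L                   # gradient step
    w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

print("nonzero weights:", np.count_nonzero(w), "of", len(w))
```

The soft-thresholding step sets small coefficients exactly to zero, which is how the L1 penalty produces a sparse superposition rather than merely shrinking all weights as ridge regression does.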

Part II: Support vector machines

The second part provides insight into sparse approximations from the viewpoint of support vector machines.
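A toy preview of that viewpoint: the sketch below trains a linear SVM (without a bias term, for simplicity) by dual coordinate descent on invented 2-D data. In the dual formulation, only a subset of training points, the support vectors, end up with nonzero coefficients; this is the sparsity the course refers to:

```python
import numpy as np

# Toy linearly separable 2-D data (invented for illustration)
X = np.array([[1.0, 1.0], [2.0, 3.0], [3.0, 2.0],
              [-1.0, -1.0], [-2.0, -3.0], [-3.0, -2.0]])
y = np.array([1, 1, 1, -1, -1, -1], dtype=float)

C = 10.0                      # upper bound on the dual coefficients
alpha = np.zeros(len(X))      # dual coefficients, one per training point
w = np.zeros(2)               # primal weight vector, w = sum_i alpha_i y_i x_i
sq = np.sum(X**2, axis=1)     # per-point squared norms

# Dual coordinate descent for a linear SVM without bias:
# maximize sum_i alpha_i - 0.5*||sum_i alpha_i y_i x_i||^2, 0 <= alpha_i <= C
for _ in range(200):
    for i in range(len(X)):
        g = y[i] * (w @ X[i]) - 1.0                     # negative dual gradient
        a_new = min(max(alpha[i] - g / sq[i], 0.0), C)  # clipped Newton step
        w += (a_new - alpha[i]) * y[i] * X[i]           # keep w consistent
        alpha[i] = a_new

support = np.flatnonzero(alpha > 1e-8)
print("support vectors:", support.size, "of", len(X))
```

Points lying well outside the margin receive alpha exactly zero, so the decision function depends only on the few points closest to the separating hyperplane.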

Course material

Selected scientific articles, textbooks and lectures will form the materials for the course.

Homework 1

Exercise paper: exercise1.pdf
Data for the first homework: polydata.dat
The first column of the data matrix polydata contains the variable x and the second column the variable y.

Project work for Part I

The instructions and the description of the data can be found in the exercise paper.

Exercise paper: projectwork.pdf
Data for the project work: Systemdata.dat

Timetable

The lecture topics can be found in the timetable.



http://www.cis.hut.fi/Opinnot/T-61.6060/k2005/index.shtml
webmaster@www.cis.hut.fi
Tuesday, 08-Mar-2005 12:04:06 EET