Lab of Computer and Information Science

T-61.3030 Requirements for the examination, spring 2007

The course is based on the textbook S. Haykin, "Neural Networks - A Comprehensive Foundation", 2nd edition, Prentice-Hall, 1998. In addition to the above, visualization of the Self-Organizing Map has been covered. The requirements for the examination cover the material discussed in the lectures and exercises. The problems and their solutions are in English, and they are available through Edita well before the first exam on Monday, May 14th, 13-16, lecture hall C. You may bring a programmable calculator, but not the textbook, lecture slides, or exercises with their solutions.

The lectures have covered Chapters 1, 2, 3, 4, 5, and 9 of Haykin's book, excluding the following sections or parts, which therefore do not belong to the exam requirements:

Chapter 1: Sections 1.2, 1.8, and 1.9;

Chapter 2: Sections 2.6, 2.11, 2.13, 2.14, 2.15, and 2.16;

Chapter 3: Section 3.6; from Section 3.9, the proof of the Perceptron convergence theorem on pp. 139-141; from Section 3.10, the derivation of the Bayes classifier on p. 144;

Chapter 4: Sections 4.9, 4.10, and 4.11; from Section 4.13, the parts Bounds on Approximation Errors and Curse of Dimensionality on pp. 209-212; the subsections of Section 4.14; Sections 4.15 and 4.17; most of Section 4.18, starting from the subsection Conjugate-Gradient Method on p. 236; Sections 4.19 and 4.20;

Chapter 5: from Section 5.2, the derivations on p. 259 and pp. 261-262; from Section 5.5, the part from the end of p. 268 up to Eq. (5.41) and p. 276 up to Eq. (5.65); Section 5.9; most of Section 5.10, starting from the subsection Curse of Dimensionality to the end; Section 5.12; from Section 5.13, subsections 3 and 4 on pp. 302-305; Section 5.14;

Chapter 9: Sections 9.9 and 9.10.

Last update 25.4.2006.