- Let the error function be

where $w_1$ and $w_2$ are the components of the two-dimensional parameter vector $\mathbf{w}$. Find the minimum value of $E(\mathbf{w})$ by applying the steepest descent method. Use the given initial value $\mathbf{w}(0)$ for the parameter vector and the given constant values for the learning rate $\eta$.
- What is the condition for the convergence of this method?
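The steepest-descent iteration can be sketched as follows. The quadratic error $E(\mathbf{w}) = w_1^2 + 2w_2^2$, the initial point, and the learning rates below are illustrative assumptions, not the exercise's data:

```python
import numpy as np

def grad(w):
    # Gradient of the illustrative error E(w) = w1^2 + 2*w2^2
    return np.array([2.0 * w[0], 4.0 * w[1]])

def steepest_descent(w0, eta, n_steps=100):
    w = np.array(w0, dtype=float)
    for _ in range(n_steps):
        w = w - eta * grad(w)   # w(n+1) = w(n) - eta * grad E(w(n))
    return w

# Small learning rate: converges to the minimum at the origin
print(steepest_descent([1.0, 1.0], eta=0.1))
# Too large a learning rate (here eta > 2/lambda_max = 0.5): diverges
print(steepest_descent([1.0, 1.0], eta=0.6))
```

For a quadratic error, convergence requires $0 < \eta < 2/\lambda_{\max}$, where $\lambda_{\max}$ is the largest eigenvalue of the Hessian of $E$.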

- Show that the application of the Gauss-Newton method to the
error function

$$E(\mathbf{w}) = \frac{1}{2}\|\mathbf{e}(n)\|^2 + \frac{1}{2}\delta\,\|\mathbf{w} - \mathbf{w}(n)\|^2$$

yields the following update rule for the weights:

$$\mathbf{w}(n+1) = \mathbf{w}(n) - \left(\mathbf{J}^T(n)\,\mathbf{J}(n) + \delta\,\mathbf{I}\right)^{-1}\mathbf{J}^T(n)\,\mathbf{e}(n)$$

where $\mathbf{J}(n)$ is the Jacobian of the error vector $\mathbf{e}(n)$ with respect to $\mathbf{w}$. All quantities are evaluated at iteration step $n$. (Haykin 3.3)
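The regularized Gauss-Newton step can be sketched numerically. The linear residual function below (for which the Jacobian is constant) is an illustrative assumption:

```python
import numpy as np

def gauss_newton_step(w, e, J, delta):
    """One regularized Gauss-Newton step:
    w(n+1) = w(n) - (J^T J + delta*I)^(-1) J^T e."""
    A = J.T @ J + delta * np.eye(len(w))
    return w - np.linalg.solve(A, J.T @ e)

# Illustrative least-squares problem with residual e(w) = X w - d,
# whose Jacobian with respect to w is simply X.
X = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
d = np.array([1.0, 2.0, 2.0])
w = np.zeros(2)
for _ in range(50):
    e = X @ w - d
    w = gauss_newton_step(w, e, J=X, delta=0.1)
print(w)  # approaches the least-squares solution
```

The $\delta\mathbf{I}$ term keeps the matrix invertible even when $\mathbf{J}^T\mathbf{J}$ is rank-deficient; for $\delta > 0$ the fixed point of the iteration is still the least-squares solution.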

- The normalized LMS algorithm is described by the following
recursion for the weight vector:

$$\hat{\mathbf{w}}(n+1) = \hat{\mathbf{w}}(n) + \frac{\eta}{\|\mathbf{x}(n)\|^2}\,\mathbf{x}(n)\,e(n)$$

where $\eta$ is a positive constant and $\|\mathbf{x}(n)\|$ is the Euclidean norm of the input vector $\mathbf{x}(n)$. The error signal is defined by

$$e(n) = d(n) - \hat{\mathbf{w}}^T(n)\,\mathbf{x}(n)$$

where $d(n)$ is the desired response. For the normalized LMS algorithm to be convergent in the mean square, show that $0 < \eta < 2$. (Haykin 3.5)
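The recursion can be sketched on synthetic data; the system-identification setup below (true weights, noise-free desired response) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.8])   # unknown system to identify (illustrative)
w_hat = np.zeros(3)
eta = 1.0                              # must satisfy 0 < eta < 2

for n in range(2000):
    x = rng.standard_normal(3)                 # input vector x(n)
    d = w_true @ x                             # desired response d(n), noise-free here
    e = d - w_hat @ x                          # error signal e(n)
    w_hat = w_hat + (eta / (x @ x)) * x * e    # normalized LMS update

print(w_hat)  # close to w_true
```

Normalizing by $\|\mathbf{x}(n)\|^2$ makes the step size invariant to the scale of the input, which is why the convergence condition on $\eta$ no longer depends on the input power.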

- The ensemble-averaged counterpart to the sum of error squares
viewed as a cost function is the mean-square value of the error
signal:

$$J(\mathbf{w}) = \frac{1}{2}\,E\!\left[e^2(n)\right] = \frac{1}{2}\,E\!\left[\left(d(n) - \mathbf{x}^T(n)\,\mathbf{w}\right)^2\right]$$
- Assuming that the input vector $\mathbf{x}(n)$
and desired response $d(n)$ are drawn from a stationary environment,
show that

$$J(\mathbf{w}) = \frac{1}{2}\sigma_d^2 - \mathbf{r}_{dx}^T\mathbf{w} + \frac{1}{2}\mathbf{w}^T\mathbf{R}_x\mathbf{w}$$

where $\sigma_d^2 = E[d^2(n)]$, $\mathbf{r}_{dx} = E[d(n)\,\mathbf{x}(n)]$, and $\mathbf{R}_x = E[\mathbf{x}(n)\,\mathbf{x}^T(n)]$.
- For this cost function, show that the gradient vector $\mathbf{g}$ and
Hessian matrix $\mathbf{H}$ of $J(\mathbf{w})$
are as follows, respectively:

$$\mathbf{g} = -\mathbf{r}_{dx} + \mathbf{R}_x\mathbf{w} \quad\text{and}\quad \mathbf{H} = \mathbf{R}_x$$
- In the LMS/Newton algorithm, the gradient vector $\mathbf{g}$ is
replaced by its instantaneous value. Show that this algorithm,
incorporating a learning-rate parameter $\eta$, is described by

$$\hat{\mathbf{w}}(n+1) = \hat{\mathbf{w}}(n) + \eta\,\mathbf{R}_x^{-1}\,\mathbf{x}(n)\left[d(n) - \mathbf{x}^T(n)\,\hat{\mathbf{w}}(n)\right]$$

The inverse of the correlation matrix $\mathbf{R}_x$, assumed to be positive definite, is calculated ahead of time. (Haykin 3.8)
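The LMS/Newton recursion can be sketched as follows; the correlation matrix, true weights, and noise-free desired response are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0])
# Correlated inputs with correlation matrix R_x (illustrative)
R = np.array([[1.0, 0.5], [0.5, 1.0]])
L = np.linalg.cholesky(R)
R_inv = np.linalg.inv(R)    # computed ahead of time, as the exercise states

w_hat = np.zeros(2)
eta = 0.05
for n in range(5000):
    x = L @ rng.standard_normal(2)          # input from a stationary environment
    d = w_true @ x                          # desired response (noise-free here)
    e = d - w_hat @ x                       # instantaneous error e(n)
    w_hat = w_hat + eta * (R_inv @ x) * e   # LMS/Newton update

print(w_hat)  # close to w_true
```

Premultiplying by $\mathbf{R}_x^{-1}$ whitens the update directions, so convergence no longer depends on the eigenvalue spread of the input correlation matrix.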

- A linear classifier separates $d$-dimensional
space into two classes using a $(d-1)$-dimensional hyperplane. Points
are classified into one of two classes, $\mathcal{C}_1$ or $\mathcal{C}_2$, depending on
which side of the hyperplane they lie.
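The decision rule can be sketched as: assign $\mathbf{x}$ to $\mathcal{C}_1$ if $\mathbf{w}^T\mathbf{x} + b > 0$ and to $\mathcal{C}_2$ otherwise. The weights, bias, and test points below are illustrative assumptions:

```python
import numpy as np

def classify(x, w, b):
    """Assign x to class C1 if w.x + b > 0, else to class C2."""
    return "C1" if np.dot(w, x) + b > 0 else "C2"

# Illustrative hyperplane in 2-D: the line x1 + x2 - 1 = 0
w = np.array([1.0, 1.0])
b = -1.0
print(classify(np.array([2.0, 2.0]), w, b))  # C1: above the line
print(classify(np.array([0.0, 0.0]), w, b))  # C2: below the line
```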
- Construct a linear classifier which is able to
separate the following two-dimensional samples correctly:

- Is it possible to construct a linear classifier which is able to
separate the following samples correctly?

Justify your answer.
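For intuition on how to justify such an answer: the classic configuration that no linear classifier can separate is XOR, where diagonally opposite corners share a class. The four points below are this standard example, not necessarily the exercise's samples; a brute-force scan over candidate hyperplanes finds no separator:

```python
import numpy as np
from itertools import product

# XOR configuration: diagonal corners belong to the same class
points = np.array([[0, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
labels = np.array([1, 1, -1, -1])  # +1 = C1, -1 = C2

def separates(w, b):
    # A hyperplane separates the classes if every point lies strictly
    # on the side given by its label.
    return np.all(labels * (points @ w + b) > 0)

# Scan a grid of candidate (w1, w2, b); none separates the XOR points.
found = any(
    separates(np.array([w1, w2]), b)
    for w1, w2, b in product(np.linspace(-2, 2, 21), repeat=3)
)
print(found)  # False
```

A proper justification argues geometrically (e.g., the convex hulls of the two classes intersect) rather than by grid search, but the scan illustrates the claim.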

- Construct a linear classifier which is able to
separate the following two-dimensional samples correctly:

Jarkko Venna 2005-04-13