- The McCulloch-Pitts perceptrons can be used to perform numerous
logical tasks. Neurons are assumed to have two binary input
signals, $x_1$ and $x_2$, and a constant bias signal $x_0 = 1$, which are combined
into an input vector as follows:
$\mathbf{x} = [x_0, x_1, x_2]^T$,
$x_1, x_2 \in \{0, 1\}$. The output of the neuron is given by

$y = \operatorname{sgn}(\mathbf{w}^T \mathbf{x}),$

where $\mathbf{w} = [w_0, w_1, w_2]^T$ is an adjustable weight vector. Demonstrate the implementation of the following binary logic functions with a single neuron:
- not
- or
- and
- nor
- nand
- xor
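A minimal sketch of such single-neuron implementations, assuming inputs in $\{0,1\}$, bias input $x_0 = 1$, and output $1$ iff $\mathbf{w}^T\mathbf{x} > 0$ (conventions not fixed by the text above). Note that xor is not linearly separable, so no single weight vector can realize it:

```python
import numpy as np

def neuron(w, x1, x2):
    """Single McCulloch-Pitts neuron: output 1 if w . [1, x1, x2] > 0, else 0."""
    x = np.array([1, x1, x2])  # bias input x0 = 1 prepended
    return int(np.dot(w, x) > 0)

# Hand-picked weight vectors [w0, w1, w2]; the bias weight w0 shifts the threshold.
gates = {
    "not":  np.array([ 0.5, -1.0,  0.0]),   # ignores x2: fires unless x1 = 1
    "or":   np.array([-0.5,  1.0,  1.0]),
    "and":  np.array([-1.5,  1.0,  1.0]),
    "nor":  np.array([ 0.5, -1.0, -1.0]),
    "nand": np.array([ 1.5, -1.0, -1.0]),
    # "xor" is missing on purpose: its truth table is not linearly separable.
}

for name, w in gates.items():
    table = [(x1, x2, neuron(w, x1, x2)) for x1 in (0, 1) for x2 in (0, 1)]
    print(name, table)
```

Each weight vector just places a line (in the $x_1,x_2$ plane) that separates the inputs mapped to 1 from those mapped to 0.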

- A single perceptron is used for a classification task, and its
weight vector $\mathbf{w}(t)$
is updated iteratively in the following way:

$\mathbf{w}(t+1) = \mathbf{w}(t) + \eta \left[ d(t) - \operatorname{sgn}(\mathbf{w}^T(t)\mathbf{x}(t)) \right] \mathbf{x}(t),$

where $\mathbf{x}(t)$ is the input signal, $\operatorname{sgn}(\mathbf{w}^T(t)\mathbf{x}(t))$ is the output of the neuron, and $d(t)$ is the correct class. Parameter $\eta$ is a positive learning rate. How does the weight vector evolve from its initial value $\mathbf{w}(0)$, when the above updating rule is applied with a fixed $\eta$, and we have the following samples from classes $\mathcal{C}_1$ and $\mathcal{C}_2$:

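The update rule can be sketched as follows; the exercise's sample list is not reproduced in this copy, so the inputs below (bias as the leading component of each $\mathbf{x}$, targets $d \in \{-1, +1\}$) are hypothetical stand-ins:

```python
import numpy as np

def sgn(v):
    return 1 if v >= 0 else -1  # sign convention sgn(0) = +1 is an assumption

def train_perceptron(samples, eta=1.0, w0=None, epochs=10):
    """Apply w(t+1) = w(t) + eta * [d(t) - sgn(w(t)^T x(t))] * x(t)
    cyclically over the samples; stop after a full pass with no change."""
    w = np.zeros(len(samples[0][0])) if w0 is None else np.asarray(w0, float)
    for _ in range(epochs):
        changed = False
        for x, d in samples:
            x = np.asarray(x, float)
            y = sgn(w @ x)
            if y != d:                      # d - y is +-2 on a mistake, 0 otherwise
                w = w + eta * (d - y) * x
                changed = True
        if not changed:
            break
    return w

# Hypothetical samples: class C1 (d = +1) and class C2 (d = -1),
# each x written as [bias, x1, x2].
samples = [([1, 2, 1], +1), ([1, 1, 2], +1), ([1, -1, -1], -1), ([1, -2, 0], -1)]
w = train_perceptron(samples, eta=1.0)
print(w)
```

Tracing the updates by hand, as the exercise asks, amounts to running this loop on paper: the weight vector only moves when a sample is misclassified.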
- Suppose that in the signal-flow graph of the perceptron
illustrated in Figure 1 the hard limiter is replaced by the sigmoidal
nonlinearity:

$\varphi(v) = \frac{1}{1 + \exp(-v)},$

where $v$ is the induced local field. The classification decisions made by the perceptron are defined as follows:

*Observation vector $\mathbf{x}$ belongs to class $\mathcal{C}_1$ if the output $y > \xi$, where $\xi$ is a threshold;*

otherwise, $\mathbf{x}$ belongs to class $\mathcal{C}_2$.

Show that the decision boundary so constructed is a hyperplane.
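A sketch of the argument, assuming the sigmoid above and a threshold $0 < \xi < 1$: since $\varphi$ is strictly increasing, thresholding the output is equivalent to thresholding the induced local field,

```latex
y = \varphi(v) = \frac{1}{1 + \exp(-v)} > \xi
\iff \exp(-v) < \frac{1 - \xi}{\xi}
\iff v > \ln\frac{\xi}{1 - \xi}.
```

Because $v$ is an affine function of the observation vector, $v = \sum_i w_i x_i + b$, the boundary $v = \ln\bigl(\xi/(1-\xi)\bigr)$ is a hyperplane, just as with the hard limiter; only the effective bias shifts.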

- Two pattern classes,
$\mathcal{C}_1$ and
$\mathcal{C}_2$, are assumed to have Gaussian
distributions which are centered around points
$\boldsymbol{\mu}_1$ and $\boldsymbol{\mu}_2$
and have the covariance matrices $\mathbf{C}_1$ and $\mathbf{C}_2$, respectively.

Plot the distributions and determine the optimal Bayesian decision surface for the two cases $\mathbf{C}_1 = \mathbf{C}_2$ and $\mathbf{C}_1 \neq \mathbf{C}_2$. In both cases, assume that the prior probabilities of the classes are equal, the costs associated with correct classifications are zero, and the costs associated with misclassifications are equal.
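A minimal numerical sketch of the resulting decision rule: under equal priors, zero cost for correct decisions, and equal misclassification costs, Bayes' rule reduces to picking the class with the larger likelihood. The means and covariances below are hypothetical stand-ins, not the exercise's values:

```python
import numpy as np

def log_gauss(x, mu, C):
    """Log-density of a 2-D Gaussian N(mu, C) at point x."""
    d = x - mu
    return (-0.5 * d @ np.linalg.inv(C) @ d
            - 0.5 * np.log(np.linalg.det(C)) - np.log(2 * np.pi))

def bayes_class(x, mu1, C1, mu2, C2):
    """Equal priors, symmetric 0/1 costs: choose the larger likelihood."""
    return 1 if log_gauss(x, mu1, C1) >= log_gauss(x, mu2, C2) else 2

# Hypothetical example values.
mu1, mu2 = np.array([-2.0, -2.0]), np.array([2.0, 2.0])
C_eq = np.eye(2)

# With C1 = C2 the log-likelihood difference is linear in x, so the decision
# surface is a hyperplane (here: the line x1 + x2 = 0 between the means).
print(bayes_class(np.array([-1.0, 0.5]), mu1, C_eq, mu2, C_eq))  # nearer mu1 -> 1

# With C1 != C2 the difference stays quadratic in x, so the boundary is a
# quadratic (conic) surface rather than a hyperplane.
C_neq = np.array([[2.0, 0.0], [0.0, 0.5]])
print(bayes_class(np.array([-1.0, 0.5]), mu1, C_eq, mu2, C_neq))
```

Plotting the two densities and the zero level set of the log-likelihood difference over a grid reproduces the requested decision surfaces.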

Jarkko Venna 2005-04-13