Neural Networks

Neurons



Models for Neurons



Artificial Neural Networks (ANN)

Advantages of Using Neural Networks



Single Layer Perceptron



Training of a Single Layer Perceptron

Step 1: Initialize Weights and Threshold

Set wi(0) (0 ≤ i ≤ N−1) and θ to small random values. Here wi(t) is the weight from input i at time t and θ is the threshold in the output node.

Step 2: Present New Input and Desired Output

Present the new continuous-valued inputs x0, x1, ..., xN-1 along with the desired output d(t).

Step 3: Calculate Actual Output

y(t) = fh[ Σ wi(t) xi(t) − θ ], where fh is the hard-limiting nonlinearity that outputs +1 when its argument is non-negative and −1 otherwise, and the sum runs over i = 0, ..., N−1.

Step 4: Adapt Weights

wi(t+1) = wi(t) + η[ d(t) − y(t) ] xi(t),  0 ≤ i ≤ N−1
d(t) = +1 if the input is from class A
       −1 if the input is from class B
In these equations, η is a positive gain fraction less than 1 and d(t) is the desired correct output for the current input. Note that the weights are unchanged if the net makes the correct decision.

Step 5: Repeat by going to step 2.
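
The five steps above map directly onto a few lines of code. The sketch below is a minimal Python illustration, assuming a two-dimensional, linearly separable toy training set, a gain of η = 0.1, and a ±1 hard-limiting output; it also adapts the threshold as if it were a bias weight, a common convention not spelled out in Step 4.

import random

# Step 1: initialize weights wi(0) and threshold theta to small random values
N = 2
w = [random.uniform(-0.5, 0.5) for _ in range(N)]
theta = random.uniform(-0.5, 0.5)
eta = 0.1   # positive gain fraction, 0 < eta < 1

# Toy training set (assumed): class A -> d = +1, class B -> d = -1
training_set = [([2.0, 1.0], +1), ([1.5, 2.0], +1),      # class A
                ([-1.0, -0.5], -1), ([-2.0, -1.5], -1)]  # class B

def output(x):
    # Step 3: actual output y(t) = +1 if sum(wi * xi) - theta >= 0, else -1
    s = sum(wi * xi for wi, xi in zip(w, x)) - theta
    return 1 if s >= 0 else -1

for epoch in range(20):                       # Step 5: repeat
    for x, d in training_set:                 # Step 2: present input and desired output
        y = output(x)
        for i in range(N):                    # Step 4: adapt weights
            w[i] += eta * (d - y) * x[i]      # unchanged when the decision is correct
        theta -= eta * (d - y)                # threshold treated as a bias weight (convention)

print(w, theta, [output(x) for x, _ in training_set])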



Multi-Layer Perceptron



Example: (A network to implement the XOR function)



θ is the threshold value
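
One common realization uses two hidden hard-threshold units (an OR unit and an AND unit) feeding a single output unit. The weights and θ values in the sketch below are illustrative assumptions; the original figure may use a different but equivalent assignment.

def step(s, theta):
    # Hard-limiting unit: fires (1) when the weighted sum reaches the threshold theta
    return 1 if s >= theta else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2, 0.5)      # hidden unit 1: OR  (weights 1, 1; theta = 0.5)
    h2 = step(x1 + x2, 1.5)      # hidden unit 2: AND (weights 1, 1; theta = 1.5)
    return step(h1 - h2, 0.5)    # output unit: OR AND NOT AND = XOR (weights 1, -1; theta = 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))   # prints the XOR truth table: 0, 1, 1, 0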

Example: (A network to implement the AND function)



where 0 < δ < 1
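
AND needs only a single threshold unit. A minimal sketch, assuming unit weights and a threshold of 2 − δ with 0 < δ < 1 (one plausible reading of the figure): only the input (1, 1) reaches the threshold.

def and_net(x1, x2, delta=0.5):
    # One unit with weights 1, 1 and threshold 2 - delta, where 0 < delta < 1 (assumed values)
    theta = 2 - delta
    return 1 if x1 + x2 >= theta else 0

print([and_net(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 0, 0, 1]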



Training a Multi-layer Perceptron by Backpropagation




Training Algorithm

  1. Select the next training pair from the training set; apply the input vector to the network input.

  2. Calculate the output of the network.

  3. Calculate the error between the network output and the desired output (the target vector from the training pair).

  4. Adjust the weights of the network in a way that minimizes the error.

  5. If an acceptable recognition result is not obtained, go to step 1. (A complete sketch of this loop is given after the weight-update rules below.)



Adjust the weights of the output layer



  1. Calculate the δ value for each output neuron q:
    δq = OUTq (1 − OUTq)(Targetq − OUTq)

  2. The weight between node p in the hidden layer and node q in the output layer is updated by Δwpq = η δq OUTp, where
    η is a training rate coefficient (0.01 - 1.0) and
    OUTp is the output of neuron p.
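
As a quick numeric check of the two formulas above (the OUT, Target, and η values are made up for illustration):

eta = 0.5          # training rate coefficient (0.01 - 1.0), assumed here
out_p = 0.80       # OUTp: output of hidden neuron p
out_q = 0.60       # OUTq: output of output neuron q
target_q = 1.0     # desired output of neuron q

# 1. delta value of the output neuron: OUT(1 - OUT)(Target - OUT)
delta_q = out_q * (1 - out_q) * (target_q - out_q)   # 0.60 * 0.40 * 0.40 = 0.096

# 2. weight change between hidden node p and output node q: eta * delta_q * OUTp
dw_pq = eta * delta_q * out_p                        # 0.5 * 0.096 * 0.80 = 0.0384
print(delta_q, dw_pq)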


Adjust the weights of the hidden layer
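
For the hidden layer, the standard backpropagation rule propagates the output-layer δ values backwards, δp = OUTp (1 − OUTp) Σq δq wpq, and the incoming weights are again changed by Δw = η δp · (output of the sending node). The sketch below ties both update rules to the five-step Training Algorithm above, training a 2-2-1 sigmoid network on XOR; the architecture, η = 0.5, the bias handling, and the epoch count are illustrative assumptions.

import math, random

random.seed(0)
eta = 0.5                                   # training rate coefficient (assumed)

def sig(s):
    return 1.0 / (1.0 + math.exp(-s))       # sigmoid activation: OUT = sig(NET)

# XOR training set: (input vector, target output)
training_set = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# 2 inputs -> 2 hidden -> 1 output; the last entry of each weight list is a bias
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    hid = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
    out = sig(w_out[0] * hid[0] + w_out[1] * hid[1] + w_out[2])
    return hid, out

for epoch in range(10000):                      # step 5: repeat until acceptable
    for x, target in training_set:              # step 1: next training pair
        hid, out = forward(x)                   # step 2: network output

        # step 3: the error (target - out) is folded into the delta values below
        # output layer: delta = OUT(1 - OUT)(Target - OUT)
        delta_out = out * (1 - out) * (target - out)
        # hidden layer: delta_p = OUTp(1 - OUTp) * sum_q delta_q * w_pq
        delta_hid = [hid[p] * (1 - hid[p]) * delta_out * w_out[p] for p in range(2)]

        # step 4: adjust weights, dw = eta * delta * (output of the sending node)
        for p in range(2):
            w_out[p] += eta * delta_out * hid[p]
        w_out[2] += eta * delta_out                 # bias input fixed at 1
        for p in range(2):
            for i in range(2):
                w_hid[p][i] += eta * delta_hid[p] * x[i]
            w_hid[p][2] += eta * delta_hid[p]       # bias input fixed at 1

print([round(forward(x)[1], 2) for x, _ in training_set])   # typically close to [0, 1, 1, 0]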

Other Popular Networks

Neural Network Software