3.3. Neural Networks
3.3.1. Background
A neural network is, in basic terms, a computational device whose design draws on the workings of the biological brain. It comprises a number of nodes (or neurons) interconnected by variable coefficients known as weights (or synaptic weights). It is in these weights that the computational power of the network resides, and their values are set by a process known as 'training'.
3.4. Multi-Layer Perceptron (MLP)
The basic structure of the MLP is given in figure 3. An input vector representing the pattern to be learned or identified is presented to the input nodes. Each node calculates its output by forming the weighted sum of its inputs and passing this value through an activation function (in the case of the input layer, the input vector is simply transferred to the connections). An illustrative sketch of this layer-by-layer computation is given below.
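
The following Python sketch (not part of the original text) illustrates a single forward pass through a small MLP. The sigmoid activation, the layer sizes, and the random weights are illustrative assumptions rather than details specified here; a trained network would use weights produced by the training process.

# A minimal sketch of a forward pass through a one-hidden-layer MLP,
# assuming a sigmoid activation and illustrative (untrained) weights.
import numpy as np

def sigmoid(x):
    # Standard logistic activation function.
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, w_hidden, w_output):
    # Input layer: the input vector is transferred directly to the connections.
    # Hidden layer: each node forms the weighted sum of its inputs and
    # passes the result through the activation function.
    hidden = sigmoid(w_hidden @ x)
    # Output layer: the same weighted-sum-plus-activation computation.
    return sigmoid(w_output @ hidden)

# Example: a 3-input, 4-hidden-node, 2-output network with random weights.
rng = np.random.default_rng(0)
w_hidden = rng.standard_normal((4, 3))
w_output = rng.standard_normal((2, 4))
print(mlp_forward(np.array([0.5, -1.0, 0.25]), w_hidden, w_output))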