Design a Network Architecture
Now it’s time to actually design the neural network. For the backpropagation feed-forward neural network we
have designed, this means making the following choices (a configuration sketch follows the list):
1. The number of hidden layers.
2. The size of hidden layers.
3. The learning constant, beta (β).
4. The momentum parameter, alpha (α).
5. The form of the squashing function (does not have to be the sigmoid).
6. The starting point, that is, the initial weight matrix.
7. The addition of noise.
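The seven choices above can be collected into a single configuration object. The following is a minimal sketch, not the text's own implementation; all names (NetworkConfig, sigmoid, weight_scale, noise, and so on) are illustrative assumptions.

```python
import math
import random
from dataclasses import dataclass, field
from typing import Callable, List


def sigmoid(x: float) -> float:
    """Choice 5: a squashing function; any bounded, differentiable function works."""
    return 1.0 / (1.0 + math.exp(-x))


@dataclass
class NetworkConfig:
    # Choices 1 and 2: number and size of hidden layers (here, two layers of 8 and 4 units).
    hidden_layer_sizes: List[int] = field(default_factory=lambda: [8, 4])
    beta: float = 0.5          # choice 3: learning constant
    alpha: float = 0.9         # choice 4: momentum parameter
    squash: Callable[[float], float] = sigmoid  # choice 5: squashing function
    weight_scale: float = 0.1  # choice 6: range of the initial random weights
    noise: float = 0.05        # choice 7: amplitude of noise added during training


def initial_weights(n_in: int, n_out: int, cfg: NetworkConfig) -> List[List[float]]:
    """Choice 6: one possible starting point, a matrix of small random weights."""
    return [[random.uniform(-cfg.weight_scale, cfg.weight_scale)
             for _ in range(n_in)] for _ in range(n_out)]
```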
Some of the parameters listed can be made to vary with the number of cycles executed, similar to the current
implementation of noise. For example, you can start with a learning constant β that is large and reduce
this constant as learning progresses. This allows rapid learning at the beginning of the process and may
speed up the overall simulation time.
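One way to vary the learning constant with the cycle count is an exponential decay. This is a minimal sketch of that idea, not the implementation described in the text; the function name and the decay constants are assumptions.

```python
import math


def beta_schedule(cycle: int, beta_start: float = 1.0,
                  beta_end: float = 0.05, decay: float = 0.001) -> float:
    """Learning constant beta that starts large and shrinks as cycles accumulate."""
    return beta_end + (beta_start - beta_end) * math.exp(-decay * cycle)


# Example: beta is about 1.0 early on and approaches 0.05 after many cycles.
for cycle in (0, 1000, 10000):
    print(cycle, round(beta_schedule(cycle), 3))
```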