Variations of the Backpropagation Algorithm
Backpropagation is a versatile neural network algorithm that very often leads to success. Its Achilles heel is
its slow convergence on certain problems. Many variations of the algorithm exist in the
literature that try to improve convergence speed and robustness. Variations have been proposed in the following
portions of the algorithm:
•  Adaptive parameters. You can set rules that modify alpha, the momentum parameter, and beta,
the learning parameter, as the simulation progresses. For example, you can reduce beta whenever a
weight change fails to reduce the error. You can then undo that particular weight change,
set alpha to zero, and redo the weight change with the new value of beta.