Clamping Probabilities
In some applications of simulated annealing, one subset of the neurons in the network is first associated with the inputs and another subset with the outputs. These units are clamped to fixed probabilities that are not changed during the learning process; only the rest of the network is subjected to adjustments, and no updating is performed on the clamped units. This training procedure of Geoffrey Hinton and Terrence Sejnowski extends the Boltzmann technique to more general networks.
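The sketch below illustrates the idea of skipping clamped units during stochastic updating; it is a minimal illustration, not Hinton and Sejnowski's procedure itself, and the network size, weights, temperature schedule, and choice of clamped units are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 8
W = rng.normal(scale=0.1, size=(n_units, n_units))
W = (W + W.T) / 2           # symmetric weights, as in a Boltzmann machine
np.fill_diagonal(W, 0.0)    # no self-connections

states = rng.integers(0, 2, size=n_units).astype(float)

# Illustrative clamping: units 0-1 hold an input pattern, units 6-7 an output pattern.
clamped = {0: 1.0, 1: 0.0, 6: 1.0, 7: 0.0}
for i, v in clamped.items():
    states[i] = v

def sweep(states, W, T, clamped):
    """Stochastically update every unclamped unit once at temperature T."""
    for i in range(len(states)):
        if i in clamped:
            continue                          # clamped units keep their values
        net_input = W[i] @ states
        p_on = 1.0 / (1.0 + np.exp(-net_input / T))
        states[i] = 1.0 if rng.random() < p_on else 0.0
    return states

# Simulated annealing: lower the temperature while only the free units settle.
for T in (4.0, 2.0, 1.0, 0.5):
    for _ in range(10):
        states = sweep(states, W, T, clamped)

print(states)
```

Because the clamped units never change, the free units settle into a low-energy configuration consistent with the imposed input and output patterns.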