•	Use other minimum-search routines besides steepest descent. For example, you could use Newton's method for finding a minimum, although this would be fairly slow because it requires computing second derivatives. Other examples include conjugate gradient methods and Levenberg-Marquardt optimization, both of which would result in very rapid training (see the first sketch after this list).
•	Use different cost functions. Instead of the usual error measure (expected output minus actual output), you could choose a different cost function to minimize (see the second sketch after this list).
•	Modify the architecture. You could use partially connected layers instead of fully connected layers. You could also use a recurrent network, that is, one in which some outputs feed back as inputs (see the third sketch after this list).
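As a rough illustration of the first point, the sketch below trains a tiny two-input network on XOR data with SciPy's conjugate-gradient routine in place of steepest descent. The network size, the data, and the use of scipy.optimize.minimize are assumptions made for this example, not details from the text.

# Sketch: training a tiny network with SciPy's conjugate-gradient optimizer
# instead of plain steepest descent. Sizes and data are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

def unpack(w):
    # First 8 values form the 2x4 hidden weights, the remaining 4 the output weights.
    return w[:8].reshape(2, 4), w[8:].reshape(4, 1)

def loss(w):
    W1, W2 = unpack(w)
    hidden = np.tanh(X @ W1)
    out = 1.0 / (1.0 + np.exp(-hidden @ W2))            # sigmoid output unit
    return float(np.mean((y - out) ** 2))               # sum-of-squares cost

w0 = rng.normal(scale=0.5, size=12)
result = minimize(loss, w0, method="CG")                # conjugate gradient search
# scipy.optimize.least_squares(..., method="lm") offers Levenberg-Marquardt similarly.
print("final cost:", result.fun)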
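The second point can be made concrete by comparing two candidate cost functions. The sketch below uses made-up target and output vectors to compute the usual sum-of-squares error alongside a cross-entropy alternative, together with their derivatives with respect to the output; the choice of cross-entropy here is only an illustrative assumption.

# Sketch: two candidate cost functions and their gradients with respect to a
# sigmoid unit's output. Values are illustrative only.
import numpy as np

target = np.array([1.0, 0.0, 1.0])
output = np.array([0.8, 0.3, 0.6])       # sigmoid activations

# Sum-of-squares cost (expected minus actual), as in standard backpropagation.
sse = 0.5 * np.sum((target - output) ** 2)
sse_grad = -(target - output)            # d(cost)/d(output)

# Cross-entropy cost, an alternative that penalizes confident wrong answers heavily.
xent = -np.sum(target * np.log(output) + (1 - target) * np.log(1 - output))
xent_grad = -(target / output) + (1 - target) / (1 - output)

print(sse, sse_grad)
print(xent, xent_grad)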
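For the third point, the sketch below shows one way a partially connected layer and a simple recurrent feedback loop might look. The masking trick, the layer sizes, and the values are illustrative assumptions rather than anything prescribed by the text.

# Sketch: a partially connected layer expressed as a weight mask, plus a
# simple recurrent step in which the previous output is fed back as input.
import numpy as np

rng = np.random.default_rng(1)

# Partially connected layer: zero entries in the mask remove connections.
W = rng.normal(size=(3, 2))
mask = np.array([[1, 0],
                 [1, 1],
                 [0, 1]])
x = np.array([0.5, -0.2, 0.9])
partial_out = np.tanh(x @ (W * mask))

# Recurrent step: concatenate the previous output with the current input.
W_rec = rng.normal(size=(3 + 2, 2))
prev_out = np.zeros(2)
for t in range(3):
    inp = np.concatenate([x, prev_out])
    prev_out = np.tanh(inp @ W_rec)

print(partial_out, prev_out)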
Applications
Backpropagation remains the king of neural network architectures because of its ease of use and wide
applicability. A few of the notable applications in the literature will be cited as examples.