C++ Neural Networks and Fuzzy Logic: Preface



•  Try a different number of layers and layer sizes for a given problem.

•  Try different learning rate parameters and see their effect on convergence and training time.

•  Try a very large learning rate parameter (it should normally be between 0 and 1); then try a value over 1 and note the result.


Copyright © IDG Books Worldwide, Inc.


C++ Classes and Class Hierarchy


C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95




Summary

In this chapter, you learned about one of the most powerful neural network algorithms, backpropagation. The network has no feedback connections; errors are propagated backward only, to the connections of the hidden and input layers. The algorithm uses the so-called generalized delta rule and trains the network with exemplar pairs of input and output patterns. It is difficult to determine how many hidden-layer neurons should be provided, and there may be more than one hidden layer. In general, the size of the hidden layer(s) is related to the features or distinguishing characteristics that should be discerned from the data. Our example in this chapter is a simple case with a single hidden layer. The outputs of the output neurons, and therefore of the network, are vectors with components between 0 and 1, since the thresholding function is the sigmoid function. These values can be scaled, if necessary, to obtain values in another interval.

Our example does not compute any particular function; the inputs and outputs were chosen randomly. What this tells you is that, even if you do not know the functional relationship between two sets of vectors, the feed-forward backpropagation network can find the mapping for any vector in the domain, even though the functional equation itself is never found. For all we know, that function could be nonlinear as well.



There is one important fact you need to remember about the backpropagation algorithm: its steepest-descent training procedure does not guarantee finding a global, or overall, minimum; it can find only a local minimum of the energy surface.



