
C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95


The top row of the table gives the column headings. They are Item; I-1, I-2, I-3 (I-k being input layer neuron k); H-1, H-2 (the hidden layer neurons); and O-1, O-2, O-3 (the output layer neurons). In the first column of the table, M-1 and M-2 refer to the weight matrices described above. Where an entry is appended with -H, as in Output -H, the information refers to the hidden layer. Similarly, -O refers to the output layer, as in Activation + threshold -O.

The next iteration uses the following information from the previous iteration, which you can identify from Table 7.1. The input pattern is (0.52, 0.75, 0.97), and the desired output pattern is (0.24, 0.17, 0.65). The current weight matrices are as follows:



M-1 Matrix of weights from input layer to hidden layer:

              0.6004    -0.4
              0.2006     0.8001
             -0.4992     0.3002

M-2 Matrix of weights from hidden layer to output layer:

             -0.910      0.412      0.262
              0.096     -0.694     -0.734

The threshold (or bias) values for the neurons in the hidden layer are 0.2008 and 0.3002, while those for the output neurons are 0.1404, 0.2336, and 0.0616, respectively.
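
For concreteness, the following is a minimal sketch, not the book's simulator code, of the forward pass for this iteration in C++. It uses the input pattern, weight matrices, and threshold values just listed, with the sigmoid squashing function and the activation-plus-threshold convention discussed earlier in the chapter; all variable names are illustrative.

#include <cstdio>
#include <cmath>

// sigmoid squashing function
double squash(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    const double input[3]  = { 0.52, 0.75, 0.97 };
    // M-1: weights from input layer (3 neurons) to hidden layer (2 neurons)
    const double m1[3][2]  = { {  0.6004, -0.4    },
                               {  0.2006,  0.8001 },
                               { -0.4992,  0.3002 } };
    // M-2: weights from hidden layer (2 neurons) to output layer (3 neurons)
    const double m2[2][3]  = { { -0.910,  0.412,  0.262 },
                               {  0.096, -0.694, -0.734 } };
    const double thetaH[2] = { 0.2008, 0.3002 };           // hidden layer thresholds
    const double thetaO[3] = { 0.1404, 0.2336, 0.0616 };   // output layer thresholds

    double hidden[2], output[3];

    // hidden layer outputs: squash(activation + threshold)
    for (int h = 0; h < 2; ++h) {
        double act = 0.0;
        for (int i = 0; i < 3; ++i) act += input[i] * m1[i][h];
        hidden[h] = squash(act + thetaH[h]);
    }

    // output layer outputs, printed as O-1, O-2, O-3
    for (int o = 0; o < 3; ++o) {
        double act = 0.0;
        for (int h = 0; h < 2; ++h) act += hidden[h] * m2[h][o];
        output[o] = squash(act + thetaO[o]);
        std::printf("O-%d = %f\n", o + 1, output[o]);
    }
    return 0;
}

Comparing the printed outputs with the desired pattern (0.24, 0.17, 0.65) gives the output errors that drive the next weight adjustment.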

You can keep the learning parameters at 0.15 for connections between the input and hidden layer neurons and 0.2 for connections between the hidden layer neurons and the output neurons, or you can modify them slightly. Whether or not to change these two parameters is a decision perhaps best made at a later iteration, once you have obtained a sense of how the process is converging.

If you are satisfied with the rate at which the computed output pattern is approaching the target output pattern, you need not change these learning rates. If you feel the convergence is much slower than you would like, the learning rate parameters can be adjusted slightly upward. It is a subjective decision, both in terms of when (if at all) and to what new levels these parameters should be revised.
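
To show where these two learning rates enter the computation, here is a sketch of one weight-update step using the standard backpropagation delta rule with sigmoid units; the array names mirror the forward-pass sketch above and are illustrative rather than taken from the book's simulator, and threshold updates are omitted for brevity.

const double betaIH = 0.15;  // learning rate for input-to-hidden weights
const double betaHO = 0.20;  // learning rate for hidden-to-output weights

// One weight-update step, given the current activations and the target pattern.
void updateWeights(const double input[3], const double hidden[2],
                   const double output[3], const double desired[3],
                   double m1[3][2], double m2[2][3]) {
    // output layer error terms: (target - output) times the sigmoid derivative
    double errO[3];
    for (int o = 0; o < 3; ++o)
        errO[o] = (desired[o] - output[o]) * output[o] * (1.0 - output[o]);

    // hidden layer error terms: back-propagated output errors times the sigmoid derivative
    double errH[2];
    for (int h = 0; h < 2; ++h) {
        double sum = 0.0;
        for (int o = 0; o < 3; ++o) sum += errO[o] * m2[h][o];
        errH[h] = sum * hidden[h] * (1.0 - hidden[h]);
    }

    // hidden-to-output weights change by betaHO * output error * hidden activation
    for (int h = 0; h < 2; ++h)
        for (int o = 0; o < 3; ++o)
            m2[h][o] += betaHO * errO[o] * hidden[h];

    // input-to-hidden weights change by betaIH * hidden error * input value
    for (int i = 0; i < 3; ++i)
        for (int h = 0; h < 2; ++h)
            m1[i][h] += betaIH * errH[h] * input[i];
}

Raising betaIH or betaHO makes each weight change larger, which is what speeds up (or, if set too high, destabilizes) the convergence described above.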

