C++ Neural Networks and Fuzzy Logic: Preface



Other Training Issues

Besides knowing the applications for which a neural network is intended, you need to know certain aspects of the model, and these depend on the applications. The length of encoding time and the length of learning time are among the important considerations. These times can be long, but they should not be prohibitive. It is important to understand how the network behaves with new inputs; some networks may need to be trained all over again, but some tolerance for distortion in input patterns is desirable, where relevant. Restrictions on the format of inputs should also be known.

An advantage of neural networks is that they can deal with nonlinear functions better than traditional algorithms can. The ability to store a given number of patterns, or the need for more and more neurons in the output field as the number of input patterns grows, are the kinds of aspects that describe both the capabilities of a network and its limitations.

Adaptation

Sometimes neural networks are used as adaptive filters, the motivation for such an architecture being selectivity. You want the neural network to classify each input pattern into its appropriate category. Adaptive models change their connection weights during all of their operations, while nonadaptive ones do not alter the weights after the phase of learning with exemplars. The Hopfield network is often used in modeling a neural network for optimization problems, and the backpropagation model is a popular choice in most other applications. Neural network models are distinguishable sometimes by their architecture, sometimes by their adaptive methods, and sometimes by both. Where adaptation is incorporated, the methods for adaptation assume great significance in the description and utility of a neural network model.
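
To make the distinction concrete, here is a minimal sketch of an adaptive model: a single-neuron least-mean-squares (LMS) adaptive filter whose weights are adjusted on every presentation, even while the filter is operating. The class, names, and data are illustrative assumptions, not a listing from this book.

// lms_filter.cpp -- illustrative sketch of an adaptive model:
// the weights keep changing with every input/target pair presented,
// even after an initial training phase.
#include <cstddef>
#include <iostream>
#include <vector>

class LMSFilter {          // hypothetical name, not from the book's listings
    std::vector<double> weights;
    double rate;           // adaptation (learning) rate
public:
    LMSFilter(std::size_t inputs, double rate)
        : weights(inputs, 0.0), rate(rate) {}

    double output(const std::vector<double>& x) const {
        double sum = 0.0;
        for (std::size_t i = 0; i < weights.size(); ++i)
            sum += weights[i] * x[i];
        return sum;
    }

    // Adaptive step: weights are adjusted on every presentation.
    // A nonadaptive model would stop calling this after training.
    void adapt(const std::vector<double>& x, double target) {
        double error = target - output(x);
        for (std::size_t i = 0; i < weights.size(); ++i)
            weights[i] += rate * error * x[i];
    }
};

int main() {
    LMSFilter filter(2, 0.1);
    // Keep adapting while the filter operates on a (noisy) signal.
    std::vector<std::vector<double>> signal = {{1.0, 0.5}, {0.8, 0.2}, {0.3, 0.9}};
    std::vector<double> desired = {1.0, 0.6, 0.7};
    for (std::size_t t = 0; t < signal.size(); ++t) {
        std::cout << "output before adapting: " << filter.output(signal[t]) << '\n';
        filter.adapt(signal[t], desired[t]);
    }
    return 0;
}

A nonadaptive model would simply stop calling adapt() once learning with exemplars is complete and use the frozen weights thereafter.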

For adaptation, you can modify parameters in an architecture during training, such as the learning rate in the backpropagation training method. A more radical approach is to modify the architecture itself during training. New neural network paradigms change the number of layers and the number of neurons in a layer during training. These node-adding or node-pruning algorithms are termed constructive algorithms. (See Gallant for more details.)
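
As a small illustration of parameter adaptation, the sketch below adjusts the learning rate between epochs according to whether the total error fell or rose. This is one common heuristic, and train_one_epoch() is only a placeholder standing in for a full backpropagation pass over the training set; both are assumptions for illustration, not code from the book.

// adaptive_rate.cpp -- sketch of adapting the learning rate during training.
#include <iostream>

double train_one_epoch(double learning_rate) {
    // Placeholder: a real implementation would run backpropagation over
    // the training set here and return the summed output error.
    static double error = 1.0;
    error *= (1.0 - 0.5 * learning_rate);   // pretend the error shrinks
    return error;
}

int main() {
    double rate = 0.1;
    double previous_error = 1e30;

    for (int epoch = 0; epoch < 20; ++epoch) {
        double error = train_one_epoch(rate);

        if (error < previous_error)
            rate *= 1.05;      // error fell: cautiously raise the rate
        else
            rate *= 0.5;       // error rose: cut the rate back sharply

        previous_error = error;
        std::cout << "epoch " << epoch << "  error " << error
                  << "  rate " << rate << '\n';
    }
    return 0;
}

Raising the rate cautiously and cutting it back sharply when the error climbs is a deliberate asymmetry: overshooting with too large a rate usually does more damage than converging a little slowly.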



