C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95




Generalization versus Memorization

If your overall goal goes beyond pattern classification, you need to track your network’s ability to generalize. Not only should you look at the overall error on the training set that you define, but you should also set aside some examples as a test set (and not train with them), which you can use to check whether or not the network predicts correctly. If the network responds poorly to your test set, you know that you have overtrained, or you can say the network has “memorized” the training patterns. If you look at the arbitrary curve-fitting analogy in Figure 14.2, you see curves for a generalized fit, labeled G, and an overfit, labeled O. In the case of the overfit, any data point outside the training data results in a highly erroneous prediction. Your test data will certainly show you a large error in the case of an overfitted model.

Figure 14.2  Generalized fit (G) versus overfit (O) of data.
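The fragment below is a minimal sketch of the train/test split just described; it is not the simulator code developed in this book, and the Sample structure, splitData(), and meanSquaredError() helpers are assumed names used only for illustration. Hold out a fraction of the examples, train only on the rest, and compare the error on the two sets: a training error that keeps shrinking while the held-out error stays large (or grows) is the sign of memorization rather than generalization.

// A minimal sketch (not the book's simulator classes): hold out part of the
// data as a test set and compare the error on each portion. The Sample type
// and the prediction callable are hypothetical stand-ins.
#include <cstddef>
#include <iostream>
#include <vector>

struct Sample {
    std::vector<double> inputs;
    std::vector<double> targets;
};

// Put the first (1 - testFraction) of the samples in the training set and
// hold out the rest; the held-out samples are never used for weight updates.
void splitData(const std::vector<Sample>& all, double testFraction,
               std::vector<Sample>& train, std::vector<Sample>& test) {
    std::size_t nTrain =
        static_cast<std::size_t>(all.size() * (1.0 - testFraction));
    train.assign(all.begin(), all.begin() + nTrain);
    test.assign(all.begin() + nTrain, all.end());
}

// Mean squared error of any prediction callable over a data set.
template <typename PredictFn>
double meanSquaredError(const std::vector<Sample>& data, PredictFn predict) {
    double sum = 0.0;
    std::size_t count = 0;
    for (const Sample& s : data) {
        std::vector<double> out = predict(s.inputs);
        for (std::size_t i = 0; i < out.size() && i < s.targets.size(); ++i) {
            double e = out[i] - s.targets[i];
            sum += e * e;
            ++count;
        }
    }
    return count > 0 ? sum / static_cast<double>(count) : 0.0;
}

int main() {
    std::vector<Sample> all = {
        {{0.0}, {0.0}}, {{1.0}, {1.0}}, {{2.0}, {4.0}}, {{3.0}, {9.0}}
    };
    std::vector<Sample> train, test;
    splitData(all, 0.25, train, test);  // hold out the last 25 percent

    // Stand-in predictor; a trained network's feed-forward pass would go here.
    auto predict = [](const std::vector<double>& x) {
        return std::vector<double>{x[0] * x[0]};
    };
    std::cout << "training MSE: " << meanSquaredError(train, predict) << "\n";
    std::cout << "held-out MSE: " << meanSquaredError(test, predict)  << "\n";
    return 0;
}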

Another way to consider this issue is in terms of Degrees Of Freedom (DOF). For the polynomial:

y = a0 + a1x + a2x^2 + ... + anx^n

the DOF equals the number of coefficients a0, a1, ..., an, which is n + 1. So for the equation of a line (y = a0 + a1x), the DOF would be 2; for a parabola it would be 3, and so on. The objective of not overfitting the data can be restated as the objective of obtaining the function with the fewest DOF that fits the data adequately. For neural network models, the larger the number of trainable weights (which is a function of the number of inputs and the architecture), the larger the DOF. Be careful about having too many (unimportant) inputs: you may find terrific results with your training data, but extremely poor results with your test data.
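To make the DOF count concrete, here is a small illustrative fragment (an assumption for this discussion, not code from the book). polynomialDOF() returns the n + 1 coefficient count used above, and networkDOF() counts the trainable weights of a fully connected feedforward layout given its layer sizes, adding one bias per receiving neuron; drop that term if the network has no bias units.

// Illustrative only: degrees of freedom as a count of adjustable parameters.
#include <cstddef>
#include <iostream>
#include <vector>

// An n-th degree polynomial has the coefficients a0 ... an, i.e. n + 1 of them.
std::size_t polynomialDOF(std::size_t degree) {
    return degree + 1;
}

// Trainable parameters of a fully connected feedforward net described by its
// layer sizes; one bias per receiving neuron is assumed (omit if not used).
std::size_t networkDOF(const std::vector<std::size_t>& layerSizes) {
    std::size_t dof = 0;
    for (std::size_t i = 0; i + 1 < layerSizes.size(); ++i) {
        dof += layerSizes[i] * layerSizes[i + 1];  // connection weights
        dof += layerSizes[i + 1];                  // bias terms
    }
    return dof;
}

int main() {
    std::cout << "Line (degree 1): " << polynomialDOF(1) << " DOF\n";      // 2
    std::cout << "Parabola (degree 2): " << polynomialDOF(2) << " DOF\n";  // 3
    // A hypothetical 10-8-1 network: 10*8 + 8 + 8*1 + 1 = 97 parameters.
    std::cout << "10-8-1 network: " << networkDOF({10, 8, 1}) << " DOF\n";
    return 0;
}

Even this small hypothetical 10-8-1 layout carries 97 adjustable parameters, so each extra (unimportant) input adds measurably to the DOF and to the risk of overfitting.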

