C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
MTBooks, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95




Details of the Backpropagation Header File

At the top of the file there are two #define statements: one sets the maximum number of layers that can be used (currently five), and the other sets the maximum number of training or test vectors that can be read into an I/O buffer (currently 100). You can increase the size of the buffer for better speed at the cost of increased memory usage.

The following are definitions in the layer base class. Note that the numbers of inputs and outputs are protected data members, which means that they can be accessed freely by descendants of the class.

    int num_inputs;
    int num_outputs;
    float *outputs;      // pointer to array of outputs
    float *inputs;       // pointer to array of inputs, which
                         // are outputs of some other layer
    friend class network;

There are also two pointers to arrays of floats in this class: the pointer to the outputs of a given layer and the pointer to its inputs. To get a better idea of what a layer encompasses, Figure 7.3 shows you a small feedforward backpropagation network, with a dotted line that marks the three layers of that network. A layer contains neurons and weights. The layer is responsible for calculating its output (calc_out()), stored in the float *outputs array, and its errors (calc_error()) for each of its respective neurons. The errors are stored in another array, float *output_errors, defined in the output class. Note that the input class does not have any weights associated with it and is therefore a special case: it does not need to provide any data members or member functions related to errors or backpropagation. The only purpose of the input layer is to store data to be forward propagated to the next layer.



Figure 7.3  Organization of layers for backpropagation program.

With the output layer, there are a few more arrays present. First, for storing backpropagated errors, there is an array called float *back_errors. There is a weights array called float *weights, and finally, for storing the expected values that initiate the error calculation process, there is an array called float *expected_values. Note that the middle layer needs almost all of these arrays and inherits them by being a derived class of the output layer.

