C++ Neural Networks and Fuzzy Logic: Preface




Figure 7.1  Layout of a feedforward backpropagation network.


Chapter 7 Backpropagation




The network has three fields of neurons: one for the input neurons, one for the hidden processing elements, and one for the output neurons. As already stated, connections run only in the feedforward direction. There are connections from every neuron in field A to every neuron in field B and, in turn, from every neuron in field B to every neuron in field C. Thus there are two sets of weights: those figuring in the activations of the hidden-layer neurons, and those that help determine the output-neuron activations. In training, all of these weights are adjusted by considering what can be called a cost function, defined in terms of the error between the computed output pattern and the desired output pattern.
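The layer-to-layer activation just described can be sketched in C++ as a single reusable routine. This is a minimal illustration, not the book's own code; the sigmoid threshold function and the name layer are our own choices:

```cpp
#include <cmath>
#include <vector>

// Sigmoid threshold function, a common choice for the squashing function.
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Activation of one field of neurons from the previous field's outputs:
//   out[j] = f(bias[j] + sum_i w[j][i] * in[i])
std::vector<double> layer(const std::vector<double>& in,
                          const std::vector<std::vector<double>>& w,
                          const std::vector<double>& bias) {
    std::vector<double> out(w.size());
    for (std::size_t j = 0; j < w.size(); ++j) {
        double act = bias[j];
        for (std::size_t i = 0; i < in.size(); ++i)
            act += w[j][i] * in[i];   // weighted sum of inputs
        out[j] = sigmoid(act);        // threshold function
    }
    return out;
}
```

With two weight sets wAB and wBC, the full network output is then layer(layer(input, wAB, biasB), wBC, biasC), mirroring the field A to B to C flow described above.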

Training

The feedforward backpropagation network undergoes supervised training, with a finite number of pattern pairs, each consisting of an input pattern and a desired or target output pattern. An input pattern is presented at the input layer. The neurons there pass the pattern activations to the neurons of the next layer, the hidden layer. The outputs of the hidden-layer neurons are obtained by applying a threshold function, and perhaps a bias, to the activations determined by the weights and the inputs. These hidden-layer outputs become inputs to the output neurons, which process them using an optional bias and a threshold function. The activations of the output layer constitute the final output of the network.

The computed output pattern and the target output pattern are compared, a function of the error in each component of the pattern is determined, and an adjustment to the weights of the connections between the hidden layer and the output layer is computed. A similar computation, still based on the error in the output, is made for the connection weights between the input and hidden layers. The procedure is repeated with each pattern pair assigned for training the network. Each pass through all the training patterns is called a cycle or an epoch. The process is repeated for as many cycles as needed until the error is within a prescribed tolerance.

There can be more than one learning rate parameter used in training a feedforward backpropagation network: you can use one with each set of weights between consecutive layers.
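The training step described above can be sketched for a hypothetical 2-2-1 network, with a separate learning rate for each weight set. This is an illustration under stated assumptions, not the book's implementation: biases are omitted for brevity, the names train_step, beta_hid, and beta_out are our own, and the squared-error cost with the sigmoid derivative o(1 - o) is a common choice the text does not pin down:

```cpp
#include <cmath>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One supervised training step for a tiny 2-2-1 sigmoid network.
// w_hid (input->hidden) and w_out (hidden->output) are updated in place,
// each set with its own learning rate. Returns the output computed
// before the update, so the caller can track the error.
double train_step(const double in[2], double target,
                  double w_hid[2][2], double w_out[2],
                  double beta_hid, double beta_out) {
    // Forward pass: input -> hidden -> output.
    double h[2];
    for (int j = 0; j < 2; ++j)
        h[j] = sigmoid(w_hid[j][0] * in[0] + w_hid[j][1] * in[1]);
    double o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1]);

    // Output-layer delta: error times the sigmoid derivative o(1 - o).
    double delta_o = (target - o) * o * (1.0 - o);

    // Hidden-layer deltas: the output delta propagated back through w_out.
    double delta_h[2];
    for (int j = 0; j < 2; ++j)
        delta_h[j] = delta_o * w_out[j] * h[j] * (1.0 - h[j]);

    // Adjust both weight sets, each with its own learning rate.
    for (int j = 0; j < 2; ++j) {
        w_out[j] += beta_out * delta_o * h[j];
        for (int i = 0; i < 2; ++i)
            w_hid[j][i] += beta_hid * delta_h[j] * in[i];
    }
    return o;
}
```

Calling train_step repeatedly with the same pattern pair drives the output toward the target; a full epoch would loop this over every pattern pair in the training set.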


Copyright © IDG Books Worldwide, Inc.
