C++ Neural Networks and Fuzzy Logic: Preface




Notation and Equations

You have just seen an example of the process of training in the feedforward backpropagation network, described in relation to one hidden layer neuron and one input neuron. A few vectors were shown and used along the way, but perhaps not made easily identifiable. We therefore introduce some notation and describe the equations that were implicitly used in the example.


Notation

Let us talk about two matrices whose elements are the weights on connections. One matrix refers to the interface between the input and hidden layers, and the second refers to that between the hidden layer and the output layer. Since connections exist from each neuron in one layer to every neuron in the next layer, there is a vector of weights on the connections going out from any one neuron. Putting this vector into a row of the matrix, we get as many rows as there are neurons from which connections are established.

Let M1 and M2 be these matrices of weights. Then what does M1[i][j] represent? It is the weight on the connection from the ith input neuron to the jth neuron in the hidden layer. Similarly, M2[i][j] denotes the weight on the connection from the ith neuron in the hidden layer to the jth output neuron.

Next, we will use x, y, z for the outputs of neurons in the input layer, hidden layer, and output layer, respectively, with a subscript attached to denote which neuron in a given layer we are referring to. Let P denote the desired output pattern, with pi as the components. Let m be the number of input neurons, so that according to our notation, (x1, x2, …, xm) will denote the input pattern. If P has, say, r components, the output layer needs r neurons. Let the number of hidden layer neurons be n. Let βh be the learning rate parameter for the hidden layer, and βo that for the output layer. Let θ with the appropriate subscript represent the threshold value or bias for a hidden layer neuron, and τ with an appropriate subscript refer to the threshold value of an output neuron.

Let the errors in output at the output layer be denoted by the ej's and those at the hidden layer by the ti's. If we use a Δ prefix on any parameter, then we are looking at the change in, or adjustment to, that parameter. Also, the thresholding function we use is the sigmoid function, f(x) = 1 / (1 + exp(–x)).


Copyright © IDG Books Worldwide, Inc.
