C++ Neural Networks and Fuzzy Logic: Preface


Pattern’s Contribution to Weight




Next, we work with the bipolar versions of the input patterns. You take each pattern to be recalled, one at a time, and determine its contribution to the weight matrix of the network. The contribution of each pattern is itself a matrix, of the same size as the weight matrix of the network. You then add these contributions, in the way matrices are added, and you end up with the weight matrix for the network, which is also referred to as the correlation matrix.

Let us find the contribution of the pattern A = (1, 0, 1, 0). First, notice that the binary to bipolar mapping of A = (1, 0, 1, 0) gives the vector (1, −1, 1, −1). Then we take the transpose of this vector (a column vector) and multiply it by the row vector, the way matrices are multiplied, and we see the following:

      1                          1   −1    1   −1
     −1   [1  −1   1  −1]  =    −1    1   −1    1
      1                          1   −1    1   −1
     −1                         −1    1   −1    1
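To check this arithmetic in code, here is a minimal C++ sketch; the helper names toBipolar and outerProduct are ours for illustration and are not classes from this book.

    #include <array>
    #include <cstdio>

    constexpr int N = 4;

    // Binary (0/1) to bipolar (-1/+1) mapping: b maps to 2*b - 1.
    std::array<int, N> toBipolar(const std::array<int, N>& binary) {
        std::array<int, N> bipolar{};
        for (int i = 0; i < N; ++i)
            bipolar[i] = 2 * binary[i] - 1;
        return bipolar;
    }

    // Outer product v * v^T of the bipolar column vector with its own transpose.
    std::array<std::array<int, N>, N> outerProduct(const std::array<int, N>& v) {
        std::array<std::array<int, N>, N> m{};
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                m[i][j] = v[i] * v[j];
        return m;
    }

    int main() {
        const std::array<int, N> a = {1, 0, 1, 0};   // pattern A in binary form
        const auto m = outerProduct(toBipolar(a));   // should reproduce the matrix shown above
        for (int i = 0; i < N; ++i) {
            for (int j = 0; j < N; ++j) std::printf("%3d ", m[i][j]);
            std::printf("\n");
        }
        return 0;
    }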

Now subtract 1 from each element in the main diagonal (the diagonal that runs from top left to bottom right). Since every diagonal entry of this outer product is the square of +1 or −1, and therefore equal to 1, this operation gives the same result as subtracting the identity matrix from the given matrix, leaving 0's on the main diagonal. The resulting matrix, which is given next, is the contribution of the pattern (1, 0, 1, 0) to the weight matrix.

      0      −1      1     −1
     −1       0     −1      1
      1      −1      0     −1
     −1       1     −1      0
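The diagonal-zeroing step can be written just as directly; the following fragment is a sketch under the same assumptions (our own Matrix alias, not the book's classes).

    #include <array>

    constexpr int N = 4;
    using Matrix = std::array<std::array<int, N>, N>;

    // Zero the main diagonal in place. Each diagonal entry of v * v^T is (+/-1)^2 = 1,
    // so this is the same as subtracting the identity matrix from the outer product.
    void zeroDiagonal(Matrix& m) {
        for (int i = 0; i < N; ++i)
            m[i][i] = 0;
    }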

Similarly, we can calculate the contribution from the pattern B = (0, 1, 0, 1). Its bipolar version is (−1, 1, −1, 1), which is the negative of pattern A's bipolar vector, so its outer product, and therefore its contribution, is the same matrix as pattern A's. Adding the two contributions gives the matrix of weights for this exercise, the matrix W shown here.

            0     −2      2     −2
  W  =     −2      0     −2      2
            2     −2      0     −2
           −2      2     −2      0

You can now optionally apply an arbitrary scalar multiplier to all the entries of the matrix. This is how we had previously obtained the ±3 values instead of the ±2 values shown above.
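Putting the steps together, here is a minimal sketch, again with our own helper names rather than the book's classes, that sums the contributions of patterns A and B and prints the ±2 weight matrix W shown above.

    #include <array>
    #include <cstdio>
    #include <vector>

    constexpr int N = 4;
    using Vec    = std::array<int, N>;
    using Matrix = std::array<std::array<int, N>, N>;

    // Contribution of one binary pattern: map it to bipolar form, take the outer
    // product with its transpose, and zero the main diagonal.
    Matrix contribution(const Vec& binary) {
        Vec v{};
        for (int i = 0; i < N; ++i) v[i] = 2 * binary[i] - 1;
        Matrix m{};
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                m[i][j] = (i == j) ? 0 : v[i] * v[j];
        return m;
    }

    // Weight (correlation) matrix: the sum of the contributions of all the patterns.
    // An arbitrary scalar multiplier may then be applied to every entry, as noted above.
    Matrix weightMatrix(const std::vector<Vec>& patterns) {
        Matrix w{};                                  // value-initialized to all zeros
        for (const Vec& p : patterns) {
            const Matrix c = contribution(p);
            for (int i = 0; i < N; ++i)
                for (int j = 0; j < N; ++j)
                    w[i][j] += c[i][j];
        }
        return w;
    }

    int main() {
        const Matrix w = weightMatrix({{1, 0, 1, 0}, {0, 1, 0, 1}});  // patterns A and B
        for (int i = 0; i < N; ++i) {                                  // prints the +/-2 matrix W
            for (int j = 0; j < N; ++j) std::printf("%3d ", w[i][j]);
            std::printf("\n");
        }
        return 0;
    }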
