Training layered neural networks

Training a neural network is an automatic search for a regularity between a set of training data and a result known in advance. The training data form a training set, each row of which contains one training example. A training example consists of a vector of input data, whose dimension equals the number of network inputs, and a vector of output data corresponding to the network outputs. The task of training is for the network, in response to a vector of input data, to produce an output vector that is as close as possible to the output data of the example.

To compare how much the network's answer differs from the answer expected from the example, various norms can be used:

    H = max_{1 <= i <= n} |z_i - ~z_i|,                  (3.1)

    H = (1/2) * sum_{i=1}^{n} (z_i - ~z_i)^2,            (3.2)

where n is the dimension of the output data vector; z_i is the mark of the i-th output given by the example; ~z_i is the value of the i-th output returned by the network.

Thus, the task of training a neural network reduces to minimizing the error H as a function of the network weights w1, w2, ..., wn. This means that the whole powerful arsenal of optimization methods can be tried for training.

There is, however, a number of specific restrictions. They are connected with the enormous dimensionality of the training task. The number of parameters can reach 10^8 and even more. Already in the simplest software simulators on personal computers, 10^3-10^4 parameters are adjusted.
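The two norms (3.1) and (3.2) can be illustrated with a short Python sketch. The vectors below are purely illustrative stand-ins for an example's marks and the network's answers:

```python
# Error norms (3.1) and (3.2) for one training example.
# z holds the marks (reference outputs) from the example,
# z_tilde holds the outputs returned by the network.

def h_max(z, z_tilde):
    # (3.1): H = max_i |z_i - ~z_i|
    return max(abs(a - b) for a, b in zip(z, z_tilde))

def h_quad(z, z_tilde):
    # (3.2): H = 1/2 * sum_i (z_i - ~z_i)^2
    return 0.5 * sum((a - b) ** 2 for a, b in zip(z, z_tilde))

z = [1.0, 0.0, 1.0]        # marks (hypothetical values)
z_tilde = [0.8, 0.1, 0.7]  # network answers (hypothetical values)
print(h_max(z, z_tilde))   # ~0.3  (largest per-output deviation)
print(h_quad(z, z_tilde))  # ~0.07 (half the summed squared error)
```

The quadratic norm (3.2) is the one usually minimized in practice, since it is differentiable with respect to the weights.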
Because of the high dimensionality, two requirements arise for the algorithm.

1) Memory restriction. Let n be the number of parameters. If an algorithm requires memory of order n^2, it is hardly applicable to training. Generally speaking, it is desirable to have algorithms that require memory of order Kn, K = const. If some especially attractive algorithm does require memory of order n^2, it can still be used, provided that sensitivity analysis is applied to reduce the number of trained parameters to reasonable limits.

2) The possibility of parallel execution of the most labor-intensive stages of the algorithm, preferably by the neural network itself.

Two more circumstances are connected with the specifics of neurocomputers.

1) A trained neurocomputer must solve all test problems with acceptable accuracy (or, perhaps, almost all, with a very small share of exceptions). Therefore the training task becomes, in essence, a multicriteria optimization problem: a point of common minimum of a large number of functions must be found. Training a neurocomputer proceeds from the hypothesis that such a point exists. The basis of the hypothesis is the very large number of variables and the similarity between the functions. "Similarity" is hard to formalize here, but experience shows that the assumption that a common minimum exists, or, more precisely, that there are points where the values of all the estimates differ little from their minimal values, is often justified.

2) A trained neurocomputer must be able to acquire new skills without losing old ones. A weaker requirement is possible: new skills may be accompanied by some loss of accuracy on the old ones, but the loss must not be especially significant, and qualitative changes must be excluded. This means that in a sufficiently large neighborhood of the found point of common minimum of the estimates, their values are close to the minimal ones. So it is not enough that a point of common minimum be found; it must also lie in a sufficiently wide area in which the values of all the minimized functions differ insignificantly from the minimal ones. In all other respects, this is simply an optimization problem, and many classical and modern methods fit quite naturally onto the structure of a neural network.

So, there are four specific restrictions that distinguish the training of a neurocomputer from general optimization problems: the astronomical number of parameters, the need for high parallelism during training, the multicriteria nature of the problems being solved, and the need to find a sufficiently wide lowland in which the values of all the minimized functionals are close to the minimum. To solve this task, one can use optimization methods built on top of the simplest gradient descent. To compute the total gradient, it is necessary to sum the gradient vectors computed for each example of the problem book (N vectors in total).
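The summation of per-example gradients over the N examples of the problem book can be sketched as follows. The "network" here is deliberately reduced to a single linear output ~z = w1*x1 + w2*x2, and all names and data are illustrative, not part of the original text:

```python
# Total gradient of the quadratic error (3.2) over a problem book:
# grad H_total = sum over the N examples of the per-example gradient.
# Toy model: one linear output ~z = w1*x1 + w2*x2.

def example_gradient(w, x, z):
    # Gradient of H = 1/2 * (z - ~z)^2 with respect to the weights:
    # dH/dw_j = -(z - ~z) * x_j
    z_tilde = sum(wj * xj for wj, xj in zip(w, x))
    return [-(z - z_tilde) * xj for xj in x]

def total_gradient(w, examples):
    # Sum the N per-example gradient vectors component-wise.
    grad = [0.0] * len(w)
    for x, z in examples:
        for j, g in enumerate(example_gradient(w, x, z)):
            grad[j] += g
    return grad

examples = [([1.0, 0.0], 1.0), ([0.0, 1.0], 2.0)]  # (inputs, mark)
print(total_gradient([0.0, 0.0], examples))        # [-1.0, -2.0]
```

Because the total is a plain sum of independent per-example terms, the N gradient computations can run in parallel, which matches the parallelism requirement stated above.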
All step-wise optimization methods consist of two most important parts: the choice of a direction and the choice of a step in the given direction. Then finding the minimum of the function H reduces to the iterative process

    w^{k+1} = w^k + h*S,                                 (3.3)

where w^k, w^{k+1} are the weight vectors at the k-th and (k+1)-th iterations; h is the step; S is the direction of movement.

Thus, the direction and the step must be chosen so that at each iteration the objective function decreases, that is, provided that

    H(w^{k+1}) < H(w^k).                                 (3.4)

One-dimensional optimization methods give an effective way of choosing the step. When the decrease of the function stops, or when the discrepancies between the known output values and the answers of the network become sufficiently small, the training process can be considered finished.

Solving the multicriteria optimization problem is impossible if the criteria are considered "one at a time": first the value of one is improved, then of another, and so on. A synthesis of a generalized (integral) criterion is needed. The simplest variant is summation of all of them; a slightly more complicated one is summation with weights.

Under memory restrictions, and also in those cases when obtaining the value of each criterion requires noticeable expense, page-wise organization is possible: a sufficiently rich set of criteria (a page of examples) is selected, an integral criterion (the page estimate) is formed from them, and its optimization is carried out. With example-by-example organization, one-dimensional optimization methods are inapplicable: they are too effective, and, while giving a noticeable improvement on one training example, they often destroy the skill of solving the previous ones. With page-wise training this usually does not happen.
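The iterative process (3.3)-(3.4) can be sketched in a few lines of Python. This is a minimal illustration on a toy quadratic objective, not the text's algorithm: the direction is taken as S = -grad H (simplest gradient descent), and the step h is halved until condition (3.4) holds:

```python
# Sketch of the iterative process (3.3)-(3.4) on a toy objective.
# Direction of movement: S = -grad H (simplest gradient descent).
# The step h is halved until the objective actually decreases (3.4).

def H(w):
    # Toy quadratic error with marks z1 = 1, z2 = 2 and "network
    # answers" ~z1 = w[0], ~z2 = w[1] (illustrative only).
    return 0.5 * ((1.0 - w[0]) ** 2 + (2.0 - w[1]) ** 2)

def grad_H(w):
    return [-(1.0 - w[0]), -(2.0 - w[1])]

w = [0.0, 0.0]
h = 1.0
for k in range(100):
    S = [-g for g in grad_H(w)]                        # direction
    while True:                                        # choose the step
        w_new = [wi + h * si for wi, si in zip(w, S)]  # (3.3)
        if H(w_new) < H(w):                            # (3.4) holds
            break
        h *= 0.5
        if h < 1e-12:                                  # decrease stopped
            break
    if H(w_new) >= H(w):
        break                                          # training finished
    w = w_new

print([round(wi, 3) for wi in w])  # close to the minimum [1.0, 2.0]
```

The outer loop terminates exactly as the text prescribes: when no step in the chosen direction decreases the objective any further.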