Two classes of problems frequently solved by trainable neural networks can be singled out. These are the problems of prediction and classification.

Prediction (forecasting) problems are, in essence, problems of constructing a regression dependence of the output data on the input data. Neural networks can efficiently build strongly nonlinear regression dependences. The specific feature here is that, since the problems being solved are mostly non-formalized ones, the user is interested first of all not in constructing an understandable and theoretically justified dependence, but in obtaining a predictor device. The forecast of such a device is not put to use directly: the user evaluates the output signal of the network on the basis of his own knowledge and forms his own expert conclusion. The exceptions are situations in which a trained neural network is used to build a control device for a technical system.

One neural network can predict several numbers at once, or solve prediction and classification problems simultaneously. The need for the latter arises extremely rarely, however, and it is better to solve problems of different types with separate neural networks.

A network can also be asked to predict, for missing data, their conditional mathematical expectation (the condition being the known values of the other features) together with a measure of spread, a confidence interval. This, of course, requires either an enormously large volume of known data or very strong assumptions about the form of the distribution functions. Instead of statistically reliable regression equations, one therefore has to use plausible neural-network regression dependences.

When solving classification problems, a neural network constructs a separating surface in the feature space, while the decision on whether a situation belongs to one class or another is made by a separate device that does not depend on the network: the interpreter of the network's answer. The simplest interpreter arises in the problem of binary classification (classification into two classes). In this case one output signal of the network is sufficient, and the interpreter assigns a situation, for example, to the first class if the output signal is less than zero, and to the second if it is greater than or equal to zero. Classification into several classes requires a more elaborate interpreter. A widely used one is the "winner takes all" interpreter, in which the number of output signals of the network equals the number of classes, and the class number corresponds to the number of the maximum output signal.

Among the many aspects of training neural networks, it is worth comparing random and directed optimization methods. Both kinds of methods are used in applied mathematics, including in optimization theory. Two extremely simplified statements about the properties of these approaches can be formulated:
1. Directed methods guarantee reaching a minimum in a finite number of steps (in finite time), but only a local minimum.
2. Random methods guarantee reaching the global minimum, but not in finite time.
Unfortunately, a guaranteed global minimum in finite time cannot be obtained. Combined methods exist; combining the two kinds improves the quality of the search, but still does not guarantee the global minimum in finite time. Among the well-known and frequently used approaches to such combination one should name genetic algorithms and the algorithms based on them.
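The two simplified statements about directed and random methods can be illustrated with a toy one-dimensional search (a sketch only; the objective function, learning rate, and sample counts are illustrative assumptions, not from the original text):

```python
import random

# Toy multimodal objective: a shallow local minimum near x = 3 and a deeper,
# global minimum near x = -2 (all constants here are illustrative).
def f(x):
    return (x - 3)**2 * (x + 2)**2 / 20 + 0.1 * x

def grad(x, h=1e-6):
    # Numerical derivative, sufficient for a demonstration.
    return (f(x + h) - f(x - h)) / (2 * h)

def directed(x0, lr=0.01, steps=2000):
    """Directed (gradient) method: converges in a finite number of steps,
    but only to the minimum of the basin it starts in."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def random_search(lo=-10.0, hi=10.0, trials=20000, seed=0):
    """Random method: samples the whole interval, so it approaches the
    global minimum, but with no finite-time guarantee."""
    rng = random.Random(seed)
    best = lo
    for _ in range(trials):
        x = rng.uniform(lo, hi)
        if f(x) < f(best):
            best = x
    return best

x_dir = directed(x0=5.0)   # starts in the basin of the local minimum near x = 3
x_rnd = random_search()    # tends to land near the global minimum at x = -2
```

Started inside the wrong basin, the directed method settles near the local minimum at x ≈ 3, while the random search locates the deeper minimum near x = −2; combined schemes such as random restarts followed by gradient steps, the idea behind genetic and other hybrid algorithms, typically do better than either alone.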
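The binary and "winner takes all" interpreters described earlier can be sketched as follows (a minimal illustration; the function names and the 1-based class numbering are assumptions):

```python
def interpret_binary(output: float) -> int:
    """Binary interpreter: class 1 if the single output signal is below zero,
    class 2 if it is greater than or equal to zero."""
    return 1 if output < 0 else 2

def interpret_winner_takes_all(outputs: list) -> int:
    """'Winner takes all': one output signal per class; the class number is
    the (1-based) index of the maximum output signal."""
    return max(range(len(outputs)), key=lambda i: outputs[i]) + 1

print(interpret_binary(-0.7))                       # class 1
print(interpret_winner_takes_all([0.1, 0.9, 0.3]))  # class 2
```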
Functioning of layered neural networks

We will examine the functioning of layered neural networks using the example of the three-layer network shown in Figure 2.1. The network consists of five neurons, has two inputs x1 and x2, and one output z. The neurons are drawn as circles and the synapses as directed arrows; f1, f2, …, f5 are the nonlinear transformation functions, in general different for each neuron, and w1, w2, …, w10 are the weights of the synapses. The black circles mark the points where a signal branches into several synapses.

The network is considered trained if all of its weights w1, w2, …, w10 have specific values chosen so that, when input data are fed to the inputs x1 and x2, the output z yields a result consistent with our problem.

Fig. 2.1. Model of a three-layer neural network
Let us introduce additional notation: ti is the sum of the inputs of the i-th neuron multiplied by the weights of the corresponding synapses (the value at the output of the adaptive summator); yi is the output value of the i-th neuron (the value at the output of the nonlinear transformer), i = 1, 2, …, 5. Then, numbering the weights layer by layer:

t1 = w1 x1 + w2 x2,   y1 = f1(t1),
t2 = w3 x1 + w4 x2,   y2 = f2(t2),
t3 = w5 y1 + w6 y2,   y3 = f3(t3),
t4 = w7 y1 + w8 y2,   y4 = f4(t4),
t5 = w9 y3 + w10 y4,  z = y5 = f5(t5).                                (2.1)

For our example one can write an explicit expression for z as a function of the two variables x1 and x2:

z = f5( w9 f3( w5 f1(w1 x1 + w2 x2) + w6 f2(w3 x1 + w4 x2) )
      + w10 f4( w7 f1(w1 x1 + w2 x2) + w8 f2(w3 x1 + w4 x2) ) ).     (2.2)

Thus the computation of the function z by the network proceeds from left to right, from layer to layer. If the intermediate results in each neuron (the values ti and yi) are stored, then the computation in each neuron reduces to summing the input signals multiplied by the weights and then evaluating the function fi of the obtained result.

If the network has m outputs, the computations for all outputs are carried out in the same way. The labor of the computations then grows not m-fold, as an analysis of formula (2.2) might suggest, but by the amount (m − 1)H compared with the labor for the same network with a single output.

The entire set of synapses is usually subject to requirements of belonging
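The left-to-right computation of z for the five-neuron network of Figure 2.1 can be sketched as follows (the assignment of the ten weights to particular synapses follows an assumed layer-by-layer numbering, and the tanh transformations and weight values are illustrative):

```python
import math

def forward(x1, x2, w, f):
    """Compute z for the five-neuron three-layer network, layer by layer.
    w: the ten synapse weights w1..w10 (assumed numbered layer by layer);
    f: the five nonlinear functions f1..f5.
    Each intermediate t_i and y_i is stored, so every neuron costs one
    weighted sum plus one evaluation of its function f_i."""
    w1, w2, w3, w4, w5, w6, w7, w8, w9, w10 = w
    t1 = w1*x1 + w2*x2; y1 = f[0](t1)   # layer 1
    t2 = w3*x1 + w4*x2; y2 = f[1](t2)
    t3 = w5*y1 + w6*y2; y3 = f[2](t3)   # layer 2
    t4 = w7*y1 + w8*y2; y4 = f[3](t4)
    t5 = w9*y3 + w10*y4                 # layer 3 (output neuron)
    return f[4](t5)

# Illustrative choice: every transformation is tanh, arbitrary small weights.
f = [math.tanh] * 5
w = [0.5, -0.3, 0.8, 0.1, 1.0, -1.0, 0.2, 0.7, 1.5, -0.5]
z = forward(1.0, 2.0, w, f)
```

Because y1…y4 are stored, a second output neuron would reuse them and add only one extra weighted sum and one extra function evaluation, which is the point behind the (m − 1)H estimate for a network with m outputs.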