Figure 4-58.  The variational encoder neural network

The parameters of the model are trained via two loss functions: a reconstruction loss forcing the decoded samples to match the initial inputs (just like in the previous autoencoders), and the KL divergence between the learned latent distribution and the prior distribution, acting as a regularization term. You can actually get rid of this latter term entirely, although it does help in learning well-formed latent spaces and reducing overfitting to the training data.
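To make this concrete, the combined objective might be sketched in Keras as follows. This is an illustration rather than the book's exact listing, and it assumes the encoder outputs two vectors, z_mean and z_log_var, that parameterize the latent Gaussian.

from tensorflow import keras
from tensorflow.keras import backend as K

# Hypothetical VAE objective: reconstruction term plus KL-divergence regularizer.
# z_mean and z_log_var are assumed to be the encoder outputs parameterizing the
# latent Gaussian; original_dim is the flattened input size.
def vae_loss(inputs, outputs, z_mean, z_log_var, original_dim):
    # Reconstruction loss: force the decoded samples to match the initial inputs.
    reconstruction_loss = original_dim * keras.losses.binary_crossentropy(inputs, outputs)
    # KL divergence between the learned latent Gaussian and a standard normal prior.
    kl_loss = -0.5 * K.sum(1 + z_log_var - K.square(z_mean) - K.exp(z_log_var), axis=-1)
    return K.mean(reconstruction_loss + kl_loss)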

The distribution you are learning is not too far removed from a normal distribution, so you try to force the latent distribution to be relatively close to a mean of zero and a standard deviation of one. Before you can train your variational autoencoder, however, you must deal with a sampling problem. Because the latent code is drawn as a random sample from the mean and standard deviation vectors, gradients cannot flow through that sampling step directly, which makes backpropagation harder to realize. You are sampling, so how do you get back through the sampling step during backpropagation?
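The usual answer is the reparameterization trick: instead of sampling z directly, you sample a noise term epsilon from a standard normal and write z as a deterministic function of the encoder outputs, so the gradient can still reach the mean and standard deviation. Here is a minimal sketch, assuming encoder outputs named z_mean and z_log_var (the names are illustrative, not the book's listing):

from tensorflow.keras import backend as K

# Reparameterization trick: z = mean + sigma * epsilon, where epsilon ~ N(0, 1).
# The randomness is isolated in epsilon, so backpropagation can flow through
# z_mean and z_log_var even though z is a random sample.
def sampling(args):
    z_mean, z_log_var = args
    batch = K.shape(z_mean)[0]
    dim = K.int_shape(z_mean)[1]
    epsilon = K.random_normal(shape=(batch, dim))
    return z_mean + K.exp(0.5 * z_log_var) * epsilon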

A variational autoencoder is a kind of mix of neural networks and graphical models: the first paper on variational autoencoders set out to create a graphical model and then turned that graphical model into a neural network. The variational autoencoder is based on variational inference.

Assume that there are two different distributions, p and q. You can use KL divergence to show how much the two distributions diverge from each other; thus, KL divergence serves as a measure of the dissimilarity between the two distributions, p and q.
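For example, for two discrete distributions p and q you can compute KL(p || q) as the expected log-ratio of their probabilities. A quick illustration with made-up values:

import numpy as np

# KL(p || q) = sum_i p_i * log(p_i / q_i). It is zero when p and q are identical
# and grows as they move apart; note that it is not symmetric in p and q.
def kl_divergence(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

print(kl_divergence([0.4, 0.6], [0.4, 0.6]))  # 0.0 -> identical distributions
print(kl_divergence([0.4, 0.6], [0.1, 0.9]))  # > 0 -> the distributions differ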

The best way to understand the need for a variational autoencoder is that in a general autoencoder, the bottleneck is too dependent on the inputs and captures no understanding of the underlying nature of the data. Because the variational autoencoder samples from a learned distribution instead, the model can better accommodate new types of data.

Figure 4-59 shows the basic code to import all the necessary packages in Jupyter. Also note the versions of the various necessary packages.
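The exact listing and version numbers are in Figure 4-59; a representative set of imports for a Keras-based variational autoencoder might look like the following (the specific packages are an assumption about a typical setup, not the figure itself).

# Representative imports for a Keras/TensorFlow variational autoencoder setup.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow import keras
from sklearn.metrics import confusion_matrix
from sklearn.preprocessing import MinMaxScaler

# Print the versions of the key packages, as the figure suggests.
print(tf.__version__)
print(np.__version__)
print(pd.__version__)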

Figure 4-60 shows the code to visualize the results via a confusion matrix, a chart for the anomalies, and a chart for the errors (difference between predicted and truth) while training.
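The actual listing is in Figure 4-60; as a rough sketch of what such helpers could look like (the function names and plotting choices here are illustrative, not the book's code):

import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix

def plot_confusion_matrix(y_true, y_pred):
    # Confusion matrix of predicted versus actual anomaly labels.
    cm = confusion_matrix(y_true, y_pred)
    sns.heatmap(cm, annot=True, fmt="d", cmap="Blues")
    plt.xlabel("Predicted")
    plt.ylabel("Actual")
    plt.show()

def plot_anomalies(scores, threshold):
    # Chart of per-sample reconstruction errors with the anomaly threshold drawn in.
    plt.plot(scores, label="reconstruction error")
    plt.axhline(threshold, color="r", linestyle="--", label="threshold")
    plt.legend()
    plt.show()

def plot_training_errors(history):
    # Chart of training and validation loss (difference between predicted and truth).
    plt.plot(history.history["loss"], label="training loss")
    plt.plot(history.history["val_loss"], label="validation loss")
    plt.legend()
    plt.show()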



