Beginning Anomaly Detection Using Python-Based Deep Learning
Appendix B: Intro to PyTorch

Figure B-23.  The softmax layer in the forward function of a model

However, this doesn't work well if you're using NLL (negative log likelihood) loss, in which case you should use log_softmax instead.
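As a quick illustration of why the two go together (the tensor shapes and values below are made up for the example), NLLLoss consumes log-probabilities, which is exactly what LogSoftmax produces:

import torch
import torch.nn as nn

# Illustrative batch: 4 samples, 3 classes
logits = torch.randn(4, 3)
targets = torch.tensor([0, 2, 1, 0])

# NLLLoss expects log-probabilities, so apply LogSoftmax first
log_probs = nn.LogSoftmax(dim=1)(logits)
loss = nn.NLLLoss()(log_probs, targets)
print(loss.item())

Note that NLLLoss takes no log internally; feeding it raw softmax probabilities would silently compute the wrong loss.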

Log_Softmax

torch.nn.LogSoftmax()

This performs a softmax activation on the given dimension, but passes the result through a log function.

The general formula for log_softmax is shown in Figure B-24 (K is the number of samples).

Figure B-24.  The general formula for log_softmax. The value i goes up until the total number of samples, K.
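The figure itself does not survive in this copy; the standard formula it depicts is:

\[
\operatorname{log\_softmax}(x_i) \;=\; \log\!\left(\frac{\exp(x_i)}{\sum_{j=1}^{K} \exp(x_j)}\right) \;=\; x_i - \log \sum_{j=1}^{K} \exp(x_j), \qquad i = 1, \dots, K
\]

The right-hand form is the numerically stabler way to compute it, which is why log_softmax is preferred over computing softmax and then taking the log separately.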

Here is the parameter:

•  dim: The dimension along which to compute softmax, given as some integer n, so that every slice along that dimension sums to 1. Default = None.

You can define this as a layer within the model itself, or apply log_softmax in the forward function like so:

torch.nn.functional.log_softmax(input, dim=None, _stacklevel=3)

Here, input is the output of the previous layer.


Figure B-25 shows an example of how you can use this layer in the forward function.

Figure B-25.  The log softmax layer in the forward function of a model
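Since the figure is not reproduced here, the following is a minimal sketch of what such a forward function might look like (the layer names and sizes are illustrative, not from the book):

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)  # illustrative sizes
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        # log_softmax over the class dimension; pairs with NLLLoss
        return F.log_softmax(self.fc2(x), dim=1)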



Sigmoid

torch.nn.Sigmoid()

This performs a sigmoid activation.

The sigmoid function does have its uses, primarily because it forces its output to fall between 0 and 1, but it is prone to the vanishing gradient problem, and so it is seldom used in hidden layers.

There are no parameters, so it's a simple function to call.

To get an idea of what the equation looks like when graphed, refer to Figure B-26.

Figure B-26.  The general graph of a sigmoid function
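The graph itself does not survive in this copy; the S-shaped curve it shows is the standard sigmoid function:

\[
\sigma(x) = \frac{1}{1 + e^{-x}}
\]

which maps any real input into the range (0, 1).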




You can define this as a layer within the model itself, or apply sigmoid in the forward function like so:

torch.nn.functional.sigmoid(input)

Here, input is the output of the previous layer.

Figure B-27 shows an example of how you can use this layer in the forward function.

Figure B-27.  The sigmoid layer in the forward function of a model
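Since the figure is not reproduced here, the following is a minimal sketch of a forward function that ends in a sigmoid (layer names and sizes are illustrative; note also that newer PyTorch versions prefer torch.sigmoid over the deprecated torch.nn.functional.sigmoid):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)  # illustrative sizes
        self.fc2 = nn.Linear(128, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        # Squash the output into the range (0, 1)
        return torch.sigmoid(self.fc2(x))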

