Beginning Anomaly Detection Using Python-Based Deep Learning

n means a dilation factor of n. A tuple of two integers allows you to specify (vertical_dilation, horizontal_dilation). Default = 1.

• return_indices: If set to True, it will return the indices of the max values along with the outputs. Default = False.

• ceil_mode: If set to True, it will use ceil instead of floor to compute the output shape. This comes into play because of the dimensionality reduction involved (a kernel size of n will reduce dimensionality by a factor of n).
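To make these parameters concrete, here is a quick sketch (the tensor shapes are illustrative, not from the book) showing return_indices and ceil_mode together on a 5x5 input:

```python
import torch
import torch.nn as nn

# A 2x2 max pool that also returns the indices of the max values
# (those indices can later feed nn.MaxUnpool2d) and rounds the
# output size up instead of down.
pool = nn.MaxPool2d(kernel_size=2, return_indices=True, ceil_mode=True)

x = torch.randn(1, 1, 5, 5)   # (batch, channels, height, width)
out, indices = pool(x)

# With ceil_mode=True the 5x5 input maps to ceil((5 - 2) / 2) + 1 = 3
print(out.shape)      # torch.Size([1, 1, 3, 3])
print(indices.shape)  # torch.Size([1, 1, 3, 3])
```

With ceil_mode left at its default of False, the same input would produce a 2x2 output, silently dropping the last row and column from the pooling windows.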

 ZeroPadding2D

torch.nn.ZeroPad2d()

Depending on the input, it pads the image tensor with rows and columns of zeroes at the top, bottom, left, and right.

Appendix B: Intro to PyTorch

Here is the parameter:

• padding: An integer or a tuple of four integers. The integer tells it to add n rows of zeroes on the top and bottom of the image tensor, and n columns of zeroes on the left and right. The tuple of four integers is formatted as (padding_left, padding_right, padding_top, padding_bottom), so you can customize even more how you want the layer to add rows or columns of zeroes.
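A short sketch of both forms of the padding argument (the 2x2 input tensor is just an illustration):

```python
import torch
import torch.nn as nn

x = torch.ones(1, 1, 2, 2)   # (batch, channels, height, width)

# An integer pads every side equally: 1 row/column of zeroes all around
pad_all = nn.ZeroPad2d(1)
print(pad_all(x).shape)      # torch.Size([1, 1, 4, 4])

# A 4-tuple is (padding_left, padding_right, padding_top, padding_bottom)
pad_custom = nn.ZeroPad2d((2, 0, 1, 0))   # 2 left, 0 right, 1 top, 0 bottom
print(pad_custom(x).shape)   # torch.Size([1, 1, 3, 4])
```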



 Dropout

torch.nn.Dropout()

The dropout layer in PyTorch takes the input and randomly zeroes its elements according to some probability p, using samples from a Bernoulli distribution. This process is random, so with every forward pass through the model, different elements will be chosen to be zeroed. This helps regularize layer outputs and combat overfitting.

Here are the parameters:

• p: The probability of an element to be zeroed. Default = 0.5.

• inplace: If set to True, it will perform the operation in place. Default = False.

You can define this as a layer within the model itself, or apply dropout in the forward function like so:

torch.nn.functional.dropout(input, p=0.5, training=True, inplace=False)
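Both forms are sketched below. Note two behaviors worth confirming for yourself: nn.Dropout is only active in training mode (it becomes the identity after .eval()), and surviving elements are scaled up by 1 / (1 - p) so the expected activation stays the same:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.ones(8)

# Layer form: active by default (training mode)
drop = nn.Dropout(p=0.5)
out = drop(x)
# Roughly half the elements are zeroed; survivors are scaled
# by 1 / (1 - p) = 2, so each element is either 0.0 or 2.0
print(out)

# Eval mode: dropout is disabled and the input passes through unchanged
drop.eval()
print(drop(x))

# Functional form, as used inside a forward() method; pass the
# training flag explicitly (e.g. training=self.training)
out_f = F.dropout(x, p=0.5, training=True)
```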

