A gentle introduction to deep learning in medical image processing




Network training


Having gained basic insights into neural networks and their topology, we still need to discuss how their parameters θ are actually determined. The answer is fairly easy: gradient descent. In order to compute a gradient, we need to define a function that measures the quality of our parameter set θ, the so-called loss function L(θ). In the following, we will work with simple examples for loss functions to introduce the concept of back-propagation, which is the algorithm that is commonly used to efficiently compute gradients for neural network training.


Figure 3. A decision tree allows us to describe any partition of space and can thus model any decision boundary. Mapping the tree into a one-layer network is possible. Yet, there is still significant residual error in the resulting function; in the center example, the error is approximately 0.7. In order to reduce this error further, a higher number of neurons would be required. If we construct a network with one node for every inner node in the first layer and one node for every leaf node in the second layer, we are able to construct a network that results in zero error.
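To make the idea of gradient descent on a loss function concrete before turning to networks, the following minimal sketch (the toy loss, learning rate, and iteration count are illustrative choices, not taken from the text) minimizes a simple one-dimensional L(θ) by repeatedly stepping against its gradient:

```python
# Toy loss L(θ) = (θ − 3)², whose minimizer is θ = 3.
def loss(theta):
    return (theta - 3.0) ** 2

def grad(theta):
    # Analytic derivative dL/dθ = 2(θ − 3).
    return 2.0 * (theta - 3.0)

theta = 0.0   # initial parameter guess
eta = 0.1     # learning rate η
for _ in range(100):
    theta -= eta * grad(theta)   # update rule: θ ← θ − η dL/dθ

print(theta)  # converges to ≈ 3.0, the minimizer
```

The same loop structure carries over to neural networks; back-propagation is simply the efficient way to obtain the gradient when L(θ) is a composition of many layers.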
We can represent a single-layer fully connected network with linear activations simply as ŷ = f̂(x) = Wx, i.e., a matrix multiplication. Note that the network's output is now multidimensional with ŷ, y ∈ ℝᵐ. Using an L2-loss, we end up with the following objective function:

L(θ) = ½ ||f̂(x) − y||₂² = ½ ||Wx − y||₂².  (4)

In order to update the parameters θ = W in this example, we need to compute

W^(j+1) = W^(j) − η ∂L/∂W,  (5)

where η is the so-called learning rate and j is used to index the iteration number. Now, let us consider a slightly more complicated network structure with three layers ŷ = f̂₃(f̂₂(f̂₁(x))).
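As a concrete illustration of the single-layer case, the sketch below (assuming NumPy; the dimensions, learning rate, and variable names are my own choices) evaluates the L2-loss of Eq. (4) and applies the update of Eq. (5), using the closed-form gradient ∂L/∂W = (Wx − y)xᵀ for one training sample:

```python
import numpy as np

def l2_loss(W, x, y):
    # L(θ) = ½ ||Wx − y||₂²  (Eq. 4)
    r = W @ x - y
    return 0.5 * float(r @ r)

def gradient_step(W, x, y, eta):
    # ∂L/∂W = (Wx − y) xᵀ, so W ← W − η (Wx − y) xᵀ  (Eq. 5)
    grad = np.outer(W @ x - y, x)
    return W - eta * grad

rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))   # parameters θ = W, mapping ℝ³ → ℝ²
x = np.array([1.0, 2.0, -1.0])    # input vector
y = np.array([0.5, -0.5])         # target output

for j in range(200):
    W = gradient_step(W, x, y, eta=0.05)

print(l2_loss(W, x, y))  # the loss shrinks toward 0
```

With a single sample the residual contracts by a fixed factor per step, so the loss decays geometrically; in a real training setting the same update is averaged over many samples or mini-batches.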
