Data Analysis From Scratch With Python: Step By Step Guide



by Peters Morgan

Potential & Constraints
The idea behind artificial neural networks is actually old. But recently they have undergone such a massive resurgence that many people (whether they understand them or not) talk about them.
Why did it become popular again? Because of data availability and technological developments, especially the massive increase in computational power. Back then, creating and implementing an ANN could be impractical in terms of time and other resources.
But all that changed with more data and increased computational power. It's very likely that you can implement an artificial neural network right on your desktop or laptop computer. Behind the scenes, ANNs are already working to give you the most relevant search results, the products you're most likely to purchase, or the ads you're most likely to click. ANNs are also used to recognize the content of audio, images, and video.
Many experts say that we're only scratching the surface and that artificial neural networks still have a lot of potential. It's like when Michael Faraday performed his early experiments on electricity and no one had any idea what use would come of them. As the story goes, Faraday told the UK Prime Minister that one day the government would be able to tax it. Today, almost every aspect of our lives directly or indirectly depends on electricity.
This might also be the case with artificial neural networks and the exciting field
of Deep Learning (a subfield of machine learning that is more focused on
ANNs).
Here’s an Example
With TensorFlow Playground we can get a quick idea of how it all works. Go to the website (https://playground.tensorflow.org/) and take note of the different terms there, such as Learning Rate, Activation, Regularization, Features, and Hidden Layers. At the beginning (before you click anything), the network is untrained and the Output shows no clear pattern.
Click the "Play" button (upper left corner) and watch the animation, paying close attention to the Output at the far right. After some time, a clear pattern emerges.


The connections among the Features, Hidden Layers, and Output become clearer. Also notice that the Output has a clear Blue region (while the rest falls in the Orange region). This could be a Classification task wherein the blue dots belong to Class A while the orange ones belong to Class B.
As the ANN runs, notice that the division between Class A and Class B becomes clearer. That's because the system is continuously learning from the training examples. As the learning becomes more solid (or as the rules are inferred more accurately), the classification also becomes more accurate.
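The learning process the Playground animates can be sketched in plain Python. The code below is a hypothetical, minimal stand-in for the Playground's classifier (the data, the single neuron, and the parameter values are all made up for illustration): a logistic-regression "neuron" is trained by gradient descent on two clusters of points, and the decision boundary separates the two classes as training proceeds.

```python
import numpy as np

# Made-up two-class data, mirroring the blue and orange dots in the Playground
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[-2, -2], scale=1.0, size=(100, 2))  # "blue" dots
class_b = rng.normal(loc=[2, 2], scale=1.0, size=(100, 2))    # "orange" dots
X = np.vstack([class_a, class_b])
y = np.array([0] * 100 + [1] * 100)

# A single neuron (logistic regression) trained with gradient descent
w = np.zeros(2)
b = 0.0
learning_rate = 0.1

def accuracy():
    # Classify each point by whether the neuron's output exceeds 0.5
    preds = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)
    return (preds == y).mean()

for epoch in range(100):
    p = 1 / (1 + np.exp(-(X @ w + b)))  # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)     # gradient of the log loss
    grad_b = (p - y).mean()
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(accuracy())  # the boundary now separates the clusters almost perfectly
```

This is the same idea the Playground animates: each pass over the training examples nudges the weights a little, and the classification gets more accurate as the rules are inferred.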
Exploring the TensorFlow Playground is a quick way to get an idea of how
neural networks operate. It’s a quick visualization (although not a 100% accurate
representation) so we can see the Features, Hidden Layers, and Output. We can
even do some tweaking like changing the Learning Rate, the ratio of training to
test data, and the number of Hidden Layers.
For instance, we can set the number of hidden layers to 3 and change the Learning Rate to 1 (instead of the earlier 0.03). When we click the Play button and let it run for a while, the picture barely improves.
Pay attention to the Output. Notice that the Classification seems worse: instead of enclosing most of the orange points within the Orange region, there are a lot of misses (many orange points fall in the Blue region instead). This happened because of the parameter changes we made.
For instance, the Learning Rate has a huge effect on accuracy and on achieving convergence. If the Learning Rate is too low, convergence might take a very long time. And if the Learning Rate is too high (as in our example), we might never converge at all because each step overshoots the minimum and misses it.
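The effect of the Learning Rate is easy to see on a toy problem. The sketch below (the function and the rate values are illustrative choices, not from the book) runs gradient descent on f(x) = x², whose minimum is at 0 and whose gradient is 2x:

```python
# Minimizing f(x) = x**2 (gradient: 2x) with different learning rates
def gradient_descent(learning_rate, steps=50, x=5.0):
    for _ in range(steps):
        x = x - learning_rate * 2 * x  # step against the gradient
    return x

print(gradient_descent(0.01))  # too low: after 50 steps, still far from 0
print(gradient_descent(0.4))   # about right: converges very close to 0
print(gradient_descent(1.1))   # too high: each step overshoots, so x diverges
```

With a rate of 0.01 the steps are tiny and 50 of them barely get anywhere; with 1.1 each step jumps past the minimum to a point even farther away, so the iterates blow up instead of converging.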
There are several ways to achieve convergence within a reasonable time (e.g., getting the Learning Rate just right, adding more hidden layers, including fewer or more Features, applying Regularization). But "overly optimizing" everything might not make economic sense. It's good to set a clear objective at the start and stick to it. If other interesting or promising opportunities pop up, you might then want to tune the parameters further and improve the model's performance.
Anyway, if you want to get an idea of what an ANN might look like in Python, the general pattern is to define the network's layers, pick a learning rate and a loss, and repeatedly adjust the weights against the training examples.
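As one possible sketch (written in plain NumPy so it runs anywhere; the architecture, seed, and learning rate are illustrative assumptions, not the book's original sample), here is a tiny network with one hidden layer. It learns XOR, a problem a single neuron cannot solve, which is exactly why the Playground's Hidden Layers matter:

```python
import numpy as np

# A tiny ANN: 2 inputs -> 8 hidden units -> 1 output, learning XOR
rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(scale=0.5, size=(2, 8))  # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))  # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

learning_rate = 0.5
for epoch in range(5000):
    # Forward pass
    hidden = np.tanh(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    # Backward pass (gradients of the cross-entropy loss)
    d_out = output - y
    d_hid = (d_out @ W2.T) * (1 - hidden ** 2)  # backpropagate through tanh
    W2 -= learning_rate * hidden.T @ d_out
    b2 -= learning_rate * d_out.sum(axis=0)
    W1 -= learning_rate * X.T @ d_hid
    b1 -= learning_rate * d_hid.sum(axis=0)

print(np.round(output.ravel(), 2))  # compare against the XOR targets 0, 1, 1, 0
```

The same structure (forward pass, loss, backward pass, weight update) is what libraries like TensorFlow automate for you at a much larger scale.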
