Challenge Question Solution
We reviewed the exact answer to this question during Thursday's lesson. It was easy to see that dictionaries are the more efficient way to store and retrieve data by key. It's always important to keep in mind the proper data structure to use when working with large sets of data, and you can be sure that similar questions will come up in an interview process.
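To see the difference for yourself, the sketch below times a lookup against a list versus a dictionary built from the same records. The record names and sizes here are made up for illustration.

```python
import timeit

# Build a list of (key, value) tuples and a dict with the same 100,000 records.
n = 100_000
pairs = [(f"user{i}", i) for i in range(n)]
as_list = list(pairs)
as_dict = dict(pairs)

key = f"user{n - 1}"  # worst case for the list: the last element

# A list lookup scans every element until it finds a match (O(n));
# a dict uses a hash table, so lookups take constant time on average.
list_time = timeit.timeit(
    lambda: next(v for k, v in as_list if k == key), number=100
)
dict_time = timeit.timeit(lambda: as_dict[key], number=100)

print(f"list lookup: {list_time:.4f}s, dict lookup: {dict_time:.6f}s")
```

On any machine, the dictionary lookup should be dramatically faster, and the gap grows with the size of the data.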
Weekly Challenges
To test out your skills, try these challenges:
Understanding the Market: Go to a job application web site like Indeed or Monster, and look up potential jobs that you're interested in. Make notes of the qualifications and technologies they're looking for. After looking at several job descriptions, what are the top three technologies? These should be your focus going forward.
Shopping Cart Module: Take the code from our Shopping Cart program that we wrote a few weeks back, and put it into a module. In Jupyter Notebook, run the module, and get the program to work properly.
Enhanced Shopping Cart: Add a new feature into the program that allows the user to save the cart. Upon running the program, the saved cart should load. The method should be written within the module. Hint: Use a CSV or text file.
Code Wars: Make an account on www.codewars.com and try to solve some problems. Code Wars is used for interview practice, improving your algorithm and problem-solving skills, and much more. It will help reinforce the skills taught in this book. Try to solve a problem a day, and you'll notice your Python programming skills improve.
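For the Enhanced Shopping Cart challenge, one way to approach the hint is to write save and load helpers around a CSV file. The function names, file name, and cart format below are illustrative assumptions, not the book's solution:

```python
import csv
import os

# Hypothetical helpers for the "Enhanced Shopping Cart" challenge.
CART_FILE = "cart.csv"

def save_cart(cart, path=CART_FILE):
    """Write each cart item on its own row of a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for item in cart:
            writer.writerow([item])

def load_cart(path=CART_FILE):
    """Return the previously saved cart, or an empty list if none exists."""
    if not os.path.exists(path):
        return []
    with open(path, newline="") as f:
        return [row[0] for row in csv.reader(f) if row]

cart = load_cart()       # the saved cart loads when the program starts
cart.append("milk")
save_cart(cart)
print(load_cart())
```

If these functions live inside your shopping cart module, the main program only needs to call `load_cart()` at startup and `save_cart()` whenever the cart changes.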
CHAPTER 10
Introduction to Data Analysis
Up to this point, we've covered enough Python basics and programming concepts to move on toward bigger and better things. This week will be a full introduction to the data analysis libraries that Python has to offer. We won't go as in depth as books that focus solely on this subject; instead, we'll cover enough to get you well on your way to analyzing and parsing information.
We'll learn about the Pandas library and how to work with tabular data structures, web scraping with BeautifulSoup and how to parse data, and data visualization libraries like matplotlib. At the end of the week, we'll use all these libraries together to create a small project that scrapes and analyzes web sites.
Overview
Working with Anaconda environments and sending requests
Learning how to analyze tabular data structures with Pandas
Understanding how to present data using matplotlib
Using the BeautifulSoup library to scrape the Web for data
Creating a web site analysis tool
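As a small offline preview of how these pieces fit together, the sketch below has BeautifulSoup parse a hard-coded HTML snippet and Pandas tabulate the result. The HTML, class names, and column names are invented for illustration; the chapter itself will work with real web pages.

```python
from bs4 import BeautifulSoup
import pandas as pd

# A made-up fragment standing in for a scraped job-listings page.
html = """
<ul>
  <li class="job"><span class="title">Data Analyst</span>
      <span class="skill">Pandas</span></li>
  <li class="job"><span class="title">Web Developer</span>
      <span class="skill">BeautifulSoup</span></li>
</ul>
"""

# Parse the HTML, then pull each listing's title and skill into a row.
soup = BeautifulSoup(html, "html.parser")
rows = [
    {"title": li.find(class_="title").text,
     "skill": li.find(class_="skill").text}
    for li in soup.find_all("li", class_="job")
]

# Pandas turns the list of row dicts into a tabular DataFrame.
df = pd.DataFrame(rows)
print(df)
```

Both libraries ship with the Anaconda distribution used throughout this week, so this pattern of parse-then-tabulate will carry directly into the web site analysis project.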
© Connor P. Milliken 2020
C. P. Milliken, Python Projects for Beginners, https://doi.org/10.1007/978-1-4842-5355-7_10