External Libraries & HTTP Basics
What You'll Build Today
Up until now, every program you have written has lived entirely inside your computer. You type code, Python runs it, and the result pops up on your screen. It is a closed loop.
But the modern world—and especially the world of Generative AI—lives on the internet. Large Language Models (LLMs) like GPT-4 or Claude are too large to run on a typical laptop; they run on powerful servers in the cloud. To build AI applications, your Python script needs to leave your computer, travel across the internet, knock on a server's door, and ask for information.
Today, you are going to break out of the closed loop. You will build a Weather Fetcher.
This program will connect to a real server, request the current weather data for a specific location, and display it to you.
Here is what you will learn and why it matters:
* Libraries and PyPI: Python comes with "batteries included," but it doesn't have everything. You will learn how to download code other people have written so you don't have to reinvent the wheel.
* The pip workflow: You will learn the standard way to install these external tools.
* HTTP Requests: This is the language of the web. You will learn how to send a "GET" request to ask for data.
* APIs and JSON: Servers don't send back pretty websites with images; they send back raw data. You will learn how to parse this data so your Python program can understand it.
* Status Codes: You will learn how to tell if your request succeeded (200 OK) or failed (404 Not Found).
The Problem
Imagine you want to know the weather, so you decide to use Python to download data from a weather site.
Python actually has a built-in tool for this called urllib. It comes installed with Python, so you might think, "Great, I'll just use that."
Here is what it looks like to simply fetch data from a website using the built-in tools:
```python
import urllib.request
import urllib.error
import json

# The URL we want to access
url = "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&current=temperature_2m"

print("Attempting to fetch weather data...")

# We have to open the connection manually
try:
    with urllib.request.urlopen(url) as response:
        # The data comes back as raw bytes, not readable text
        data_bytes = response.read()

        # We have to manually decode the bytes into a string
        data_string = data_bytes.decode('utf-8')

        # Then we have to turn that string into a dictionary
        data_dict = json.loads(data_string)

        print("Success!")
        print(data_dict)
except urllib.error.URLError as e:
    print(f"Network error: {e}")
except Exception as e:
    print(f"Something else went wrong: {e}")
```
If you run this, it works. But look at the code. You have to manage opening connections, reading raw bytes, decoding character sets (like 'utf-8'), and manually parsing JSON.
Now imagine doing this for a complex AI application where you need to send headers, authentication keys, and massive blocks of text. Using urllib becomes painful, verbose, and hard to read. It feels like building a car from scratch just to drive to the grocery store.
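To see how quickly this escalates, here is a sketch of what merely adding headers and an API key looks like with urllib. The endpoint and header values below are made up purely for illustration:

```python
import urllib.request
import json

# Hypothetical endpoint and key -- for illustration only
url = "https://api.example.com/v1/data"

# With urllib, sending headers means building a Request object by hand
request = urllib.request.Request(
    url,
    headers={
        "Authorization": "Bearer YOUR_API_KEY",  # placeholder, not a real key
        "Accept": "application/json",
    },
)

# ...and you still have to open, read, decode, and parse manually
with urllib.request.urlopen(request) as response:
    data = json.loads(response.read().decode("utf-8"))
```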
There has to be a better way.
Let's Build It
The "better way" is using a third-party library.
In the Python world, a Library (or Package) is a bundle of code someone else wrote to solve a specific problem. The Python community has a massive repository called PyPI (The Python Package Index), which is like an App Store for code.
One of the most popular Python libraries in the world is called requests. Its tagline is "HTTP for Humans." It takes all that complex code above and makes it simple.
Step 1: Installing the Library
requests does not come with Python. It lives on PyPI. To get it, we use a tool called pip (Pip Installs Packages).
You run this command in your terminal or command prompt (not inside the Python script itself).
```bash
pip install requests
```
Note: If you are on Mac/Linux and that doesn't work, try pip3 install requests.
You should see text scrolling by saying "Downloading" and finally "Successfully installed requests".
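To double-check the install, you can import the library and print its version number. If this runs without an error, you are ready to go:

```python
# If this import fails, the install didn't work
import requests

# Prints the installed version (the exact number will vary)
print(requests.__version__)
```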
Step 2: Your First Request
Now that we have the library, let's use it. We are going to fetch weather data from Open-Meteo, a fantastic free API (Application Programming Interface) that doesn't require passwords or keys.
Create a new file called weather_app.py:
```python
# We import the library we just installed
import requests

# This URL points to a specific "endpoint" that returns weather data
# Latitude 51.50, Longitude -0.12 is London, UK
url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"

print("Sending request to the internet...")

# requests.get() does all the heavy lifting:
# connecting, sending, receiving, and decoding.
response = requests.get(url)

# Print the HTTP Status Code
# 200 means "OK" (Success)
# 404 means "Not Found"
# 500 means "Server Error"
print(f"Status Code: {response.status_code}")
```
Run this code.
You should see Status Code: 200. This is the internet's way of giving you a thumbs up. If you mistyped the URL, you might get a 404.
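The response object carries more than just the status code. A few other attributes from the requests library are worth peeking at:

```python
import requests

url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"
response = requests.get(url)

print(response.ok)                        # True for any status code below 400
print(response.headers["Content-Type"])   # the format the server says it returned
print(response.text[:200])                # the first 200 characters of the raw body
```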
Step 3: Parsing the Data
Getting a 200 OK is great, but we want the actual weather.
APIs usually return data in a format called JSON (JavaScript Object Notation). It looks almost exactly like a Python Dictionary (curly braces, key-value pairs).
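To make the resemblance concrete, here is a tiny, made-up JSON string parsed with Python's built-in json module:

```python
import json

# A JSON string (note: JSON requires double quotes and uses lowercase true/false)
json_text = '{"city": "London", "temperature": 14.2, "raining": false}'

# json.loads() turns the string into a plain Python dictionary
data = json.loads(json_text)

print(data["city"])         # London
print(data["raining"])      # False -- JSON's false becomes Python's False
```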
The requests library has a built-in tool to convert the server's JSON directly into a Python dictionary.
Update your code:
```python
import requests

url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"

response = requests.get(url)

if response.status_code == 200:
    print("Connection successful!")

    # .json() converts the response text into a Python Dictionary
    data = response.json()

    # Let's print the whole dictionary to see what we got
    print("Raw Data:")
    print(data)
else:
    print(f"Failed to retrieve data. Status code: {response.status_code}")
```
Run this code.
You will see a big dictionary printed out. It might look messy, but look closely. You'll see keys like current_weather, and inside that, temperature.
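Trimmed down, the shape of that dictionary looks roughly like this. This is an illustrative sketch, not real output; the exact values (and some of the keys) will differ when you run it:

```python
# Illustrative sketch of the response structure -- not real data
data = {
    "latitude": 51.5,
    "longitude": -0.12,
    "current_weather": {
        "temperature": 14.2,  # the value we are after
        "windspeed": 11.0,
        # ...more keys...
    },
    # ...more keys...
}
```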
Step 4: Extracting Specific Information
We don't want to dump the raw dictionary to the user. We want to tell them the temperature.
Because data is just a standard Python dictionary (which you learned in Phase 1), you know how to access values using keys.
```python
import requests

url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"

response = requests.get(url)

if response.status_code == 200:
    data = response.json()

    # Access the 'current_weather' dictionary inside the main dictionary
    current_weather = data["current_weather"]

    # Extract specific values
    temp = current_weather["temperature"]
    wind_speed = current_weather["windspeed"]

    print("----------------------------")
    print("Weather Report for London:")
    print(f"Temperature: {temp} degrees")
    print(f"Wind Speed: {wind_speed} km/h")
    print("----------------------------")
else:
    print("Error connecting to server.")
```
Run this code.
Now you have a clean, readable output derived from live internet data.
Step 5: Handling Errors (The "What If")
The internet is unreliable. WiFi drops, servers crash, and URLs change. If your internet is off, requests.get(url) will crash your program with a nasty error message.
We need to wrap our dangerous internet call in a try/except block.
```python
import requests

url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"

print("Contacting Weather Satellite...")

try:
    # This is the line that might fail if there is no internet
    response = requests.get(url)

    # Check if the server said "OK"
    if response.status_code == 200:
        data = response.json()
        current_weather = data["current_weather"]
        temp = current_weather["temperature"]
        print(f"Success! The temperature is {temp} degrees.")
    else:
        # This handles cases where the internet works, but the page doesn't exist (404)
        print(f"Server returned an error: {response.status_code}")
except requests.exceptions.RequestException as e:
    # This handles cases where the connection failed entirely (no wifi, DNS error)
    print("CRITICAL ERROR: Could not connect to the internet.")
    print(f"Details: {e}")
```
This is robust code. It handles success, server rejection (404), and connection failure (Exception).
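Two optional refinements from the requests library are worth knowing: a timeout stops a request from hanging forever, and response.raise_for_status() converts any 4xx/5xx status code into an exception, so a single except block catches everything. A minimal sketch:

```python
import requests

url = "https://api.open-meteo.com/v1/forecast?latitude=51.50&longitude=-0.12&current_weather=true"

try:
    # Give up if the server takes more than 10 seconds to respond
    response = requests.get(url, timeout=10)

    # Raises requests.exceptions.HTTPError for 4xx/5xx responses
    response.raise_for_status()

    temp = response.json()["current_weather"]["temperature"]
    print(f"Success! The temperature is {temp} degrees.")
except requests.exceptions.RequestException as e:
    # Catches connection failures, timeouts, and bad status codes alike
    print(f"Request failed: {e}")
```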
Now You Try
You have a working weather fetcher for London. Let's make it better.
* Change the Location: Look up the latitude and longitude of your own city (search "lat long [your city]" on Google). Update the url variable to use your coordinates.
* Add Units: The API returns a current_weather_units key inside the main dictionary (print the raw data again to see it). Modify your print statements to include the correct unit (like "°C" or "km/h") fetched dynamically from the data, rather than hardcoding it.
* User Input: Ask the user to input a latitude and longitude using input(), then plug those variables into the URL. Hint: you will need an f-string to build the URL; see the sketch below.
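If you get stuck on that last exercise, here is a hedged sketch of the URL-building part:

```python
import requests

# input() always returns strings, which is fine here since
# the values go straight back into the URL string
latitude = input("Enter latitude: ")
longitude = input("Enter longitude: ")

# Build the URL with an f-string
url = (
    f"https://api.open-meteo.com/v1/forecast"
    f"?latitude={latitude}&longitude={longitude}&current_weather=true"
)

response = requests.get(url)
print(response.json())
```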
Challenge Project: The Joke Saver
Your challenge is to build a script that fetches a random joke from the internet and saves it to a text file on your computer.
The API: Use this URL: https://official-joke-api.appspot.com/random_joke
This API returns a JSON object with a setup (the question) and a punchline (the answer).
Requirements:
* Use requests to fetch the data.
* Check the status code to ensure the request was successful.
* Parse the JSON to extract the setup and punchline.
* Open a file named joke_of_the_day.txt in "write" mode ("w").
* Write the joke into the file in a nice format.
* Wrap the whole thing in a try/except block to handle network errors.
Example Output (inside the text file):
```
JOKE OF THE DAY
===============
Setup: Why do programmers prefer dark mode?
Punchline: Because light attracts bugs.
```
Hints:
* Remember that response.json() returns a dictionary. You access parts of it using ['key_name'].
* File I/O uses with open("filename.txt", "w") as file: (recalled from Day 10).
* You can write multiple lines to a file using file.write(). Don't forget \n for new lines.
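As a refresher on the file-writing half, here is a minimal sketch with placeholder strings standing in for the values you will pull out of the API response:

```python
# Placeholders -- in your script these come from the parsed JSON
setup = "Why do programmers prefer dark mode?"
punchline = "Because light attracts bugs."

# "w" mode creates the file (or overwrites it if it already exists)
with open("joke_of_the_day.txt", "w") as file:
    file.write("JOKE OF THE DAY\n")
    file.write("===============\n")
    file.write(f"Setup: {setup}\n")
    file.write(f"Punchline: {punchline}\n")
```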
What You Learned
Today you stepped into the larger world of Python development. You learned:
* Libraries: How to use pip install to add new superpowers to Python.
* The Request/Response Cycle: You send a request, the server sends a response.
* HTTP Verbs: We used GET today (to get data). Later we will use POST (to send data).
* Status Codes: 200 is good, 404 is missing, 500 is broken.
* JSON: The universal language of data exchange on the web.
Why This Matters:
When you build an AI application, you aren't writing the AI yourself. You are using requests to send a prompt to OpenAI or Anthropic, and they send back the answer in JSON format. The code structure you wrote today—Connect, Check Status, Parse JSON—is the exact same structure used in professional AI engineering.
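As a preview, here is a hedged sketch of that same Connect, Check Status, Parse JSON pattern using a POST request. requests.post() is real, but the URL, header, and payload fields below are entirely made up for illustration; every AI provider defines its own:

```python
import requests

# Hypothetical endpoint and key -- purely illustrative
url = "https://api.example-ai.com/v1/chat"
headers = {"Authorization": "Bearer YOUR_API_KEY"}
payload = {"prompt": "Write a haiku about the weather."}

try:
    # Connect: POST sends data (the json= argument is encoded for you)
    response = requests.post(url, headers=headers, json=payload, timeout=30)

    # Check Status
    if response.status_code == 200:
        # Parse JSON
        data = response.json()
        print(data)
    else:
        print(f"Server returned an error: {response.status_code}")
except requests.exceptions.RequestException as e:
    print(f"Connection failed: {e}")
```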
Tomorrow:
Now that we can fetch data, what happens if we need to fetch data from 50 different websites at once? Doing them one by one is slow. Tomorrow, we will look at Async Programming—how to do many things at the same time.