SQL Fundamentals with SQLite
What You'll Build Today
Welcome to Day 17! Today, we are crossing a massive bridge in your journey as a developer. Up until now, your Python programs have had short memories. When you stop the script, all your variables, lists, and dictionaries disappear.
Today, we give your programs long-term memory.
We are building a User Management System. Imagine you are building an AI application that generates marketing copy. You need to know who your users are, when they signed up, and how many times they have used your tool so you can bill them later.
You cannot do this with simple variables. You need a database.
Here is what you will master today:
* SQLite Setup: How to create and connect to a database file directly within Python (no server installation required).
* Tables and Schema: Defining the structure of your data so it is organized and consistent.
* CRUD Operations: The four pillars of all data interaction—Create, Read, Update, and Delete.
* SQL Queries: Writing text commands that tell the database exactly what information to fetch.
* Data Persistence: Ensuring data survives even after you turn off your computer.
The Problem
Let's look at why we need SQL. Imagine you have a list of users and their API usage counts. You want to save this so it is there tomorrow.
Without a database, your best option is a text file or a CSV (Comma Separated Values) file.
Here is a script that tries to update a user's usage count using a CSV file. Read through this code and notice how painful the logic is.
import os

# The data file
filename = "users.csv"

# 1. We have to create the file if it doesn't exist
if not os.path.exists(filename):
    with open(filename, "w") as f:
        f.write("user_id,username,api_calls\n")
        f.write("1,alice,5\n")
        f.write("2,bob,10\n")

# THE GOAL: Increase Bob's api_calls by 1.
updated_lines = []
found = False

# 2. We have to read the WHOLE file into memory
with open(filename, "r") as f:
    lines = f.readlines()

# 3. We have to loop through every single line manually
for line in lines:
    line = line.strip()
    parts = line.split(",")

    # Handle the header row
    if parts[0] == "user_id":
        updated_lines.append(line)
        continue

    # Check if this is Bob (User ID 2)
    if parts[0] == "2":
        current_calls = int(parts[2])  # Convert string to int
        new_calls = current_calls + 1
        # Reconstruct the string... painful!
        new_line = f"{parts[0]},{parts[1]},{new_calls}"
        updated_lines.append(new_line)
        found = True
    else:
        updated_lines.append(line)

# 4. We have to delete the old file content and write EVERYTHING back
with open(filename, "w") as f:
    for line in updated_lines:
        f.write(line + "\n")

print("Updated Bob's count.")
Why this is painful:
* We have to read the entire file into memory just to change one number.
* We have to loop through every line and parse the strings by hand, converting text to integers ourselves.
* To save a single change, we have to wipe the file and write everything back.
* As the data grows, this gets slower and easier to break.
There has to be a better way. That way is SQL.
Let's Build It
We are going to use SQLite. It is a lightweight database engine that comes built-in with Python. You do not need to install anything. It saves your database as a simple file (e.g., data.db), but allows you to talk to it using powerful SQL commands.
Step 1: The Connection and The Cursor
Think of the database as a warehouse.
* The Connection is the road to the warehouse.
* The Cursor is the worker inside the warehouse who executes your orders.
import sqlite3
# 1. Connect to the database
# If 'app_data.db' doesn't exist, Python creates it for you automatically.
connection = sqlite3.connect("app_data.db")
# 2. Create a cursor object
# This allows us to send SQL commands to the database
cursor = connection.cursor()
print("Database connected successfully!")
# Always close the connection when done (we will keep it open for the next steps)
# connection.close()
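If you want to guarantee the connection gets closed even when something goes wrong partway through, one common pattern is try/finally. This is just a standalone sketch of that pattern; for the rest of today's steps we keep the simple connect-then-close flow shown above.

import sqlite3

connection = sqlite3.connect("app_data.db")
try:
    cursor = connection.cursor()
    cursor.execute("SELECT 1")    # stand-in for whatever work you need to do
    print(cursor.fetchone())      # prints (1,)
finally:
    # Runs whether the work above succeeded or raised an error
    connection.close()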
Step 2: Creating a Table (The Schema)
Excel has sheets; databases have Tables. A table needs a name and a set of columns. Each column has a specific data type.
We will create a table named users.
* id: A unique number for each user (Integer).
* username: The user's name (Text).
* email: Their email address (Text).
* api_calls: How many times they used our AI (Integer).
# SQL command to create a table
# PRIMARY KEY means this column is the unique identifier
create_table_query = """
CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY,
    username TEXT NOT NULL,
    email TEXT,
    api_calls INTEGER DEFAULT 0
);
"""
# Execute the command
cursor.execute(create_table_query)
# Commit the changes (Save the file)
connection.commit()
print("Table 'users' created.")
Note: IF NOT EXISTS prevents the program from crashing if you run this script twice.
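If you want to confirm what the table actually looks like, SQLite can describe its own schema. A quick check using the cursor from above; each row is one column's metadata (index, name, type, not-null flag, default value, and whether it is part of the primary key):

cursor.execute("PRAGMA table_info(users)")
for column in cursor.fetchall():
    print(column)   # e.g. (0, 'id', 'INTEGER', 0, None, 1)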
Step 3: INSERT (Adding Data)
Let's add users. In SQL, we use the INSERT INTO command.
To pass Python values into the query safely, we never build the SQL string by hand. Instead, we use ? placeholders and let sqlite3 fill in the values for us.
# Data to insert
new_user = ("alice_ai", "alice@example.com", 0)
# The SQL command with ? placeholders
insert_query = "INSERT INTO users (username, email, api_calls) VALUES (?, ?, ?)"
# Execute with the data tuple
cursor.execute(insert_query, new_user)
# Let's add another user
cursor.execute(insert_query, ("bob_builder", "bob@example.com", 15))
# SAVE the changes!
connection.commit()
print("Users added successfully.")
Step 4: SELECT (Reading Data)
Now we need to retrieve the data. We use SELECT. You can select specific columns or use * to get everything.
print("--- Fetching All Users ---")
# Select everything from the users table
cursor.execute("SELECT * FROM users")
# fetchall() gives us a list of tuples
all_users = cursor.fetchall()
for user in all_users:
    # user is a tuple: (id, username, email, api_calls)
    print(f"ID: {user[0]} | Name: {user[1]} | Usage: {user[3]}")
print("\n--- Fetching High Usage Users ---")
# We can filter using the WHERE clause
cursor.execute("SELECT username, api_calls FROM users WHERE api_calls > 10")
high_usage = cursor.fetchall()
for row in high_usage:
    print(f"High User: {row[0]} ({row[1]} calls)")
Step 5: UPDATE and DELETE
This is where SQL shines compared to text files. We can target specific rows using logic.
Scenario: Alice just used the API. We need to increment her count.
Scenario: Bob canceled his account. We need to remove him.
# 1. UPDATE Alice's count
# logic: Set api_calls to whatever it currently is + 1, BUT ONLY for Alice
update_query = "UPDATE users SET api_calls = api_calls + 1 WHERE username = ?"
cursor.execute(update_query, ("alice_ai",))
# 2. DELETE Bob
delete_query = "DELETE FROM users WHERE username = ?"
cursor.execute(delete_query, ("bob_builder",))
# Commit changes
connection.commit()
# Verify results
cursor.execute("SELECT * FROM users")
print("\n--- After Update and Delete ---")
for user in cursor.fetchall():
    print(user)
# Clean up
connection.close()
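One useful check before committing: after an UPDATE or DELETE, cursor.rowcount reports how many rows were actually touched, which quickly exposes a WHERE clause that matched nothing. A sketch of how it would look if added just before the connection.close() call above (the username no_such_user is made up for the demo):

cursor.execute(update_query, ("alice_ai",))
print(f"Rows updated: {cursor.rowcount}")   # 1 if Alice exists, 0 if nothing matched

cursor.execute(delete_query, ("no_such_user",))
print(f"Rows deleted: {cursor.rowcount}")   # 0 -- a WHERE that matches nobody is not an error

connection.commit()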
Step 6: Putting It Together (Runnable Script)
Here is the complete, runnable code block incorporating everything. Running this multiple times will insert duplicate copies of the same users because we aren't checking whether they already exist; that is fine for today (see the INSERT OR IGNORE sketch after the script if you want to avoid it).
import sqlite3

def run_database_demo():
    # Connect
    conn = sqlite3.connect("app_data.db")
    cursor = conn.cursor()

    # 1. Setup Table
    cursor.execute("""
        CREATE TABLE IF NOT EXISTS users (
            id INTEGER PRIMARY KEY,
            username TEXT,
            api_calls INTEGER
        )
    """)
    conn.commit()

    # 2. Insert Data
    print("Inserting data...")
    users_data = [
        ("dev_dave", 42),
        ("pro_pam", 105),
        ("new_ned", 0)
    ]
    # executemany is a shortcut to run the same command for a list of data
    cursor.executemany("INSERT INTO users (username, api_calls) VALUES (?, ?)", users_data)
    conn.commit()

    # 3. Read Data
    print("\nCurrent Users:")
    cursor.execute("SELECT * FROM users")
    for row in cursor.fetchall():
        print(row)

    # 4. Update Data (Give everyone 10 free credits)
    print("\nGranting 10 bonus calls to everyone...")
    cursor.execute("UPDATE users SET api_calls = api_calls + 10")
    conn.commit()

    # 5. Read Specific Data
    print("\nUsers with more than 100 calls:")
    cursor.execute("SELECT username, api_calls FROM users WHERE api_calls > 100")
    for row in cursor.fetchall():
        print(f"{row[0]} is a power user with {row[1]} calls.")

    conn.close()

if __name__ == "__main__":
    run_database_demo()
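As noted before the script, re-running it keeps inserting the same usernames. If you ever want to prevent that, one option (not required today) is to mark the username column UNIQUE and use INSERT OR IGNORE, which silently skips rows that would violate the constraint. A sketch against a separate, hypothetical database file so it does not interfere with app_data.db:

import sqlite3

conn = sqlite3.connect("unique_demo.db")   # hypothetical file just for this experiment
cursor = conn.cursor()

cursor.execute("""
    CREATE TABLE IF NOT EXISTS users (
        id INTEGER PRIMARY KEY,
        username TEXT UNIQUE,
        api_calls INTEGER DEFAULT 0
    )
""")

# Run this as many times as you like -- repeats are silently skipped
cursor.execute("INSERT OR IGNORE INTO users (username, api_calls) VALUES (?, ?)", ("dev_dave", 42))
conn.commit()

cursor.execute("SELECT COUNT(*) FROM users WHERE username = 'dev_dave'")
print(cursor.fetchone()[0])   # always 1, no matter how many times you ran the insert
conn.close()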
Now You Try
You have the foundation. Now, create a new file called my_db_practice.py and try these extensions.
1. Modify the table creation code to add a new column called last_login_date (type TEXT). Insert a user with today's date (e.g., "2023-10-27"). Hint: You might need to delete your old .db file first so the script can recreate the table with the new column (or see the ALTER TABLE sketch after this list).
2. Write a function called reset_usage(user_id) that takes a user's ID as an argument and sets their api_calls back to 0.
3. Write a script that deletes all users who have api_calls equal to 0. Use DELETE FROM users WHERE...
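For the first exercise, deleting the .db file works, but SQLite can also add a column to an existing table in place with ALTER TABLE. A minimal sketch, assuming the users table in app_data.db already exists (the username test_tina is made up):

import sqlite3

conn = sqlite3.connect("app_data.db")
cursor = conn.cursor()

# Add the new column; run this only once, or catch sqlite3.OperationalError,
# because SQLite refuses to add a column that already exists
cursor.execute("ALTER TABLE users ADD COLUMN last_login_date TEXT")

cursor.execute(
    "INSERT INTO users (username, api_calls, last_login_date) VALUES (?, ?, ?)",
    ("test_tina", 0, "2023-10-27"),
)
conn.commit()
conn.close()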
Challenge Project: The Credits System
In real AI apps, users often buy "credits" (e.g., 50 credits for $10). Every time they generate an image, it costs 1 credit. If they have 0 credits, they cannot generate images.
Your Goal: Build a system that manages this logic.
Requirements:
* A table called credits_ledger with columns: username (Text) and balance (Integer).
* A function attempt_generation(username) that:
  * Checks the user's balance.
  * If balance > 0: Decrement balance by 1, save the change, and print "Generation successful! Remaining: X".
  * If balance == 0: Print "Insufficient funds. Please reload."
Expected output (for a user who starts with 5 credits):
Generation successful! Remaining: 4
Generation successful! Remaining: 3
...
Generation successful! Remaining: 0
Insufficient funds. Please reload.
Hints:
* You will need to SELECT the balance first.
* Use fetchone() to get the single result. The result will be a tuple, like (5,), so you need to access index [0] to get the number (see the small sketch after these hints).
* Don't forget connection.commit() after the update!
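A small sketch of just the balance lookup from the hints, assuming the credits_ledger table and an open cursor exist; the rest of attempt_generation is left to you:

cursor.execute("SELECT balance FROM credits_ledger WHERE username = ?", (username,))
row = cursor.fetchone()

if row is None:
    print("No such user.")        # fetchone() returns None when nothing matches
else:
    balance = row[0]              # row is a tuple like (5,), so index 0 holds the number
    print(f"Current balance: {balance}")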
What You Learned
Today you moved from temporary script variables to permanent data storage. This is essential for any production-grade application.
* sqlite3: The Python module for database interaction.
* CREATE: Defining the structure of your data.
* INSERT: Adding new records.
* SELECT / WHERE: Finding exactly the data you need.
* UPDATE / DELETE: Modifying or removing data based on logic.
* Commit: The save button for databases.
Why This Matters: When you build your final GenAI capstone, you will likely need to store the history of prompts users have sent, or cache the responses from the AI to save money. SQL is the standard tool for this.
Tomorrow: SQL is great for structured data (rows and columns), but what if your data is messy, like a complex JSON object returned by an AI model? Tomorrow we explore NoSQL concepts for flexible document storage.