Day 13 of 80

Async Programming Basics

Phase 1: Python Foundation


What You'll Build Today

Today, we are going to break the laws of sequential time—at least, as far as Python is concerned.

Up until now, your code has been strictly "synchronous." This means line 1 runs, then line 2, then line 3. If line 2 takes five seconds to finish, line 3 sits there twiddling its thumbs, waiting. In the world of AI, where generating a response from a large language model (LLM) can take several seconds, this creates a terrible user experience. It makes your application look frozen.

Today, you will build a high-performance script that simulates downloading multiple large files simultaneously. Instead of waiting for one to finish before starting the next, you will handle them all at once.

Here is what you will master today:

* Synchronous vs. Asynchronous Execution: You will learn why doing one thing at a time is safe but slow, and why doing multiple things at once is efficient but tricky.

* The async and await Keywords: You will learn the special syntax that tells Python, "This might take a while, go do something else while you wait."

* asyncio Basics: You will use Python's built-in library for managing these concurrent tasks.

* Running Concurrent Tasks: You will learn how to fire off three different requests and gather the results only when they are all done.

Let's make your code faster.

The Problem

Imagine you are running a coffee shop. You are the only employee.

The Synchronous Way (The Pain):

A customer walks in and orders a latte. You grind the beans, pull the shot, steam the milk, and pour the art. This takes 3 minutes. During those 3 minutes, a line of ten people forms out the door. You ignore them completely until the first latte is handed over. Only then do you ask the second person what they want.

This is how standard Python code works. It is "blocking."

Let's look at code that simulates this. We will use time.sleep() to simulate a slow operation, like waiting for a server to send us a file.

import time

def download_file(filename):
    print(f"Starting download: {filename}...")
    # Simulate a 2-second delay
    time.sleep(2)
    print(f"Finished download: {filename}!")

def main():
    start_time = time.time()

    # We download 3 files, one after another
    download_file("Photo_1.jpg")
    download_file("Photo_2.jpg")
    download_file("Photo_3.jpg")

    end_time = time.time()
    total_time = end_time - start_time
    print(f"\nTotal time taken: {total_time:.2f} seconds")

if __name__ == "__main__":
    main()

Run this code. You will see it takes about 6 seconds.

Why this hurts:
  • Wasted Time: While the computer is "sleeping" (waiting for the download), the processor is doing absolutely nothing. It is idle.
  • Frozen UI: If this were a graphical app or a chatbot, the user couldn't click buttons or type messages while the download was happening. The screen would just hang.

In the world of GenAI, waiting 6 seconds for three separate API calls is unacceptable. We need a way to take the order, start the coffee machine, and take the next order while the coffee brews.

Let's Build It

We are going to switch from "Synchronous" (one after another) to "Asynchronous" (start one, pause, start another).

Step 1: Defining an Async Function

To tell Python that a function can be paused (to let other code run), we change def to async def. This creates a "coroutine."

However, we also need to change how we sleep. time.sleep() is a bully: it blocks the entire program. We need a polite sleeper that yields control. That is asyncio.sleep().

Note: You cannot just run an async function like a normal function. If you simply call download_file_async(), Python will just hand you a "coroutine object" and nothing will actually happen. We need an event loop to run it.
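Here is a tiny sketch of what that looks like (the coroutine body is just a stand-in for the downloader we will write in a moment):

import asyncio

async def download_file_async(filename):
    await asyncio.sleep(2)
    return f"{filename} data"

# Calling the coroutine does NOT run it; it only creates a coroutine object.
coro = download_file_async("Photo_1.jpg")
print(coro)  # <coroutine object download_file_async at 0x...>

# If a coroutine object is never awaited, Python warns:
# "RuntimeWarning: coroutine 'download_file_async' was never awaited"

# Handing it to an event loop is what actually runs it:
print(asyncio.run(coro))  # prints "Photo_1.jpg data" after 2 seconds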

Step 2: The Event Loop

Think of the Event Loop as the manager of the coffee shop. It keeps a list of tasks. It says, "Task A is waiting for water to boil? Okay, pause Task A. Task B, you run now."

We use asyncio.run() to start this manager.

Let's rewrite our single file downloader using async syntax.

import asyncio
import time

# Notice the 'async' keyword before def
async def download_file_async(filename):
    print(f"Starting download: {filename}...")
    # We use 'await' to pause this function without blocking the whole app.
    # We MUST use asyncio.sleep, not time.sleep
    await asyncio.sleep(2)
    print(f"Finished download: {filename}!")

# The main function must also be async to use 'await' inside it
async def main():
    start_time = time.time()

    # We await the function to run it
    await download_file_async("Photo_1.jpg")

    end_time = time.time()
    print(f"Total time: {end_time - start_time:.2f} seconds")

if __name__ == "__main__":
    # This starts the event loop manager
    asyncio.run(main())

Run this.

It still takes 2 seconds. Why? Because we awaited the download immediately. await means "Pause here until this is done." We haven't achieved concurrency yet; we just changed the syntax.

Step 3: Running Things Concurrently

To get the speed boost, we need to schedule all three downloads before we wait for them. We want to say, "Start A, Start B, Start C," and then "Wait for them all to finish."

We use asyncio.gather() to bundle multiple tasks together.

import asyncio
import time

async def download_file_async(filename):
    print(f"Starting download: {filename}...")
    await asyncio.sleep(2)  # Represents a slow network call
    print(f"Finished download: {filename}!")
    return f"{filename} data"

async def main():
    start_time = time.time()
    print("Batch start!")

    # asyncio.gather schedules these 3 coroutines on the event loop.
    # It waits until ALL of them are complete.
    await asyncio.gather(
        download_file_async("Photo_1.jpg"),
        download_file_async("Photo_2.jpg"),
        download_file_async("Photo_3.jpg")
    )

    end_time = time.time()
    print(f"\nTotal time taken: {end_time - start_time:.2f} seconds")

if __name__ == "__main__":
    asyncio.run(main())

Run this code.

Look at the output.

* It starts all three downloads almost instantly.

* It finishes all three around the same time.

* Total time: ~2 seconds (instead of 6).

You just tripled the speed of your program.

Step 4: Understanding the "Await" Magic

The keyword await is the secret sauce. When Python sees await asyncio.sleep(2), it essentially places a bookmark in that function. It says, "Okay, I can't do anything here for 2 seconds. Does anyone else need the CPU?"

It hops over to Photo_2.jpg, runs until it hits await, places a bookmark, and hops to Photo_3.jpg.

When the 2 seconds are up, the Event Loop sees that the sleep is over, goes back to the bookmark, and finishes the function.
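If you want to watch those bookmarks in action, here is a small sketch (the coffee-shop names and delays are made up for illustration) that stamps each print with the elapsed time:

import asyncio
import time

START = time.perf_counter()

async def brew(order, seconds):
    print(f"[{time.perf_counter() - START:.1f}s] start {order}")
    await asyncio.sleep(seconds)  # bookmark: hand control back to the event loop
    print(f"[{time.perf_counter() - START:.1f}s] finish {order}")

async def main():
    await asyncio.gather(brew("latte", 2), brew("espresso", 1))

asyncio.run(main())

# Typical output:
# [0.0s] start latte
# [0.0s] start espresso      <- the loop hopped here while the latte "brews"
# [1.0s] finish espresso
# [2.0s] finish latte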

Step 5: Handling Return Values

Usually, you want to get data back from your tasks (like the text from an LLM or weather data). asyncio.gather returns a list of results in the same order you passed the tasks in.

Here is the final, complete script showing how to capture the data.

import asyncio
import time

async def fetch_data(data_id, delay):
    print(f"--> Fetching request {data_id} (will take {delay}s)")
    # Simulate variable network speeds
    await asyncio.sleep(delay)
    print(f"<-- Received data for {data_id}")
    return {"id": data_id, "status": "complete", "time": delay}

async def main():
    start_time = time.time()
    print("--- Starting Concurrent Requests ---")

    # We create the tasks, but we don't wait for them individually.
    # We wait for the 'group' to finish.
    results = await asyncio.gather(
        fetch_data("User_Profile", 2),
        fetch_data("Recent_Posts", 3),
        fetch_data("Friend_List", 1)
    )

    print("--- All Requests Finished ---")

    # results is now a list containing the return value of each function
    for result in results:
        print(f"Processed: {result}")

    end_time = time.time()
    print(f"Total time: {end_time - start_time:.2f} seconds")

if __name__ == "__main__":
    asyncio.run(main())

Output Analysis:

Notice that Friend_List (1 second) finished before User_Profile (2 seconds), even though it was requested last. However, the results list prints them in the order they were requested (User, Posts, Friends). asyncio.gather is smart enough to tidy up the results for you.

Now You Try

Take the script from Step 5 and modify it to solidify your understanding:

  • Scale it up: Create a list of 10 tasks using a list comprehension or a for loop, then pass that list to asyncio.gather. (Hint: You invoke it like await asyncio.gather(*my_task_list); the asterisk unpacks the list. A minimal sketch follows after this list.)
  • The Race: Change the delays so they are random numbers between 1 and 5 using random.randint(). Observe how the order of "Received data" print statements changes every time you run it, but the final results list remains in order.
  • The Timeout: Look up asyncio.wait_for. Try to wrap the gather call in a timeout of 2 seconds. Since some tasks take 3 seconds, the program should crash with a TimeoutError. This is useful for ensuring your AI app doesn't hang forever if the server is down.
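For the first exercise, the unpacking pattern looks roughly like this (download_file_async is the Step 3 coroutine; the range of 10 files is just the exercise's number):

import asyncio

async def download_file_async(filename):
    await asyncio.sleep(2)  # stand-in for the slow download
    return f"{filename} data"

async def main():
    # Build the list of coroutines first, then unpack it into gather with *
    tasks = [download_file_async(f"Photo_{i}.jpg") for i in range(1, 11)]
    results = await asyncio.gather(*tasks)
    print(results)  # 10 results, in the same order as the tasks

asyncio.run(main())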
Challenge Project: Concurrent Weather Fetcher

Yesterday (Day 12), you built a script to fetch weather data for a single city. If you wanted to check New York, London, and Tokyo, you had to wait for New York to finish before asking about London.

The Challenge:

Convert your Day 12 weather script to be asynchronous. You need to fetch the weather for three different cities at the same time.

Requirements:
  • Use aiohttp (an async library) instead of requests. Note: requests is synchronous and will block your code. Install aiohttp by running pip install aiohttp in your terminal. Hint: the syntax for aiohttp is slightly different; you need to use async with session.get(url) as response:.
  • Define a function get_weather_async(city) that returns the temperature.
  • In your main function, use asyncio.gather to fetch London, New York, and Tokyo simultaneously.
  • Print the results as they come in, or all at the end.
  • Measure the total time taken.

Example Expected Output:
Fetching weather for London...
Fetching weather for New York...
Fetching weather for Tokyo...
Got London: 15°C
Got Tokyo: 22°C
Got New York: 10°C

All finished in 0.45 seconds.

Hints:

* You will need to import aiohttp and asyncio.

* You need to create a ClientSession inside your main async function: async with aiohttp.ClientSession() as session:. Pass this session to your weather fetching function.

* Don't forget to await the response: data = await response.json(). A rough skeleton putting these hints together follows below.
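One possible skeleton is sketched here. It is only a sketch: the URL, the query parameter, and the temperature field are placeholders you will need to adapt to whichever weather API you used on Day 12.

import asyncio
import time
import aiohttp

# Placeholder endpoint; swap in the weather API you used on Day 12
BASE_URL = "https://example.com/weather"

async def get_weather_async(session, city):
    print(f"Fetching weather for {city}...")
    async with session.get(BASE_URL, params={"q": city}) as response:
        data = await response.json()
    temperature = data.get("temp_c")  # adjust to the real API's JSON structure
    print(f"Got {city}: {temperature}°C")
    return city, temperature

async def main():
    start = time.time()
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(
            get_weather_async(session, "London"),
            get_weather_async(session, "New York"),
            get_weather_async(session, "Tokyo"),
        )
    print(results)
    print(f"All finished in {time.time() - start:.2f} seconds.")

asyncio.run(main())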

What You Learned

Today you stepped into the world of non-blocking code. This is a massive leap in your capability as a developer.

* Synchronous means waiting in line. Asynchronous means taking a number and sitting down until your number is called.

* async def defines a coroutine (a pausable function).

* await pauses the function to let other work happen.

* asyncio.gather runs multiple tasks at the same time and collects the results.

Why This Matters for GenAI:

When you build a chatbot, you might want to:

  • Send the user's prompt to GPT-4.
  • Search a vector database for relevant documents.
  • Log the user's request to a database.

If you do this synchronously, the user waits for the sum of all those times. If you do it asynchronously, the user only waits for the slowest single task. This makes your AI applications feel snappy and professional. A rough sketch of that pattern follows below.
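As an illustration, here is how that chatbot flow could be wired up with what you learned today. The three coroutines only sleep to stand in for real network calls, and their names and timings are invented:

import asyncio

async def call_llm(prompt):
    await asyncio.sleep(2.0)  # stand-in for the LLM request
    return f"LLM answer to: {prompt}"

async def search_vector_db(prompt):
    await asyncio.sleep(0.5)  # stand-in for the vector search
    return ["doc_1", "doc_2"]

async def log_request(prompt):
    await asyncio.sleep(0.1)  # stand-in for the database write
    return "logged"

async def handle_message(prompt):
    # All three run concurrently, so the user waits ~2.0s (the slowest), not 2.6s
    answer, docs, log_status = await asyncio.gather(
        call_llm(prompt),
        search_vector_db(prompt),
        log_request(prompt),
    )
    print(answer, docs, log_status)

asyncio.run(handle_message("What is async?"))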

Tomorrow: Now that your app is fast, we need to make it safe. We will cover Security Basics, specifically how to hide your API keys so you don't accidentally publish credentials tied to your credit card to the entire internet.