If you’ve ever written a script that makes a bunch of HTTP requests one at a time and thought “this is painfully slow,” this post is for you.
We’re going to take a slow, synchronous script and make it fast using Python’s asyncio.
## Scenario
Let’s say you need to fetch data from 10 different URLs. One naive approach might look like this:
```python
import requests

# ten URLs to fetch; the exact URLs are illustrative, but assume each
# endpoint takes about 1 second to respond
urls = [f"https://example.com/api/items/{i}" for i in range(10)]

results = []
for url in urls:
    response = requests.get(url)   # blocks until the full response arrives
    results.append(response.json())

print(f"Fetched {len(results)} responses")
```
Each request takes about 1 second because the endpoint simulates a 1-second delay. We have 10 requests, so this takes roughly 10 seconds.
While we’re waiting for a response from the server, the program does absolutely nothing until bytes arrive over the network. Then it moves on to the next request and waits all over again.
## Concurrency to Save the Day
Concurrency doesn’t mean doing things “at the same time” on multiple CPU cores (that’s parallelism). Concurrency means managing multiple tasks that are in progress at once.
Concurrency is great for network requests because the “slow part” is waiting for a remote server to respond — not doing computation on your CPU.
## The Fix: asyncio + aiohttp
Python’s asyncio module lets us write concurrent code using async/await. Combined with aiohttp (an async HTTP library), we can fire off all 10 requests without waiting for each one to finish:
```python
import asyncio
import aiohttp

# the same ten illustrative URLs as before
urls = [f"https://example.com/api/items/{i}" for i in range(10)]

async def fetch(session, url):
    # awaiting here hands control back to the event loop until the response arrives
    async with session.get(url) as response:
        return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
        print(f"Fetched {len(results)} responses")

asyncio.run(main())
```
This finishes in roughly 1 second instead of 10. Same 10 requests, ~10x faster.
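If you want to verify the timing yourself, a quick harness like the one below works. This is just a sketch: it swaps the real HTTP calls for `asyncio.sleep(1)` so it runs without any network access.

```python
import asyncio
import time

async def main():
    # stand-in for the aiohttp version above: ten ~1-second waits, run concurrently
    await asyncio.gather(*(asyncio.sleep(1) for _ in range(10)))

start = time.perf_counter()
asyncio.run(main())
print(f"Took {time.perf_counter() - start:.2f} seconds")  # roughly 1.0, not 10
```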
## What’s Happening Under the Hood
Let’s walk through this step by step:
1. `asyncio.run(main())` starts the event loop — this is the “waiter” from our restaurant analogy.
2. We create a list of tasks, one for each URL. Each task is a call to `fetch()` that hasn’t started yet.
3. `asyncio.gather(*tasks)` tells the event loop: “start all of these and let me know when they’re all done.”
4. The event loop starts the first request. When it hits `session.get(url)`, instead of blocking, it says “I’ll come back when the network responds” and moves on to start the next request.
5. It does this for all 10 requests. Now all 10 are “in flight” simultaneously.
6. As responses come back, the event loop picks up each one right where it left off (the line after `await`) and finishes processing it.
The key insight: await is not time.sleep(). When you await something, you’re telling the event loop “I’m waiting on I/O, go do something else in the meantime.” That’s what makes it concurrent.
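To watch this happen, here’s a tiny self-contained sketch (again using `asyncio.sleep` as a stand-in for a network call) that prints when each task starts and finishes:

```python
import asyncio

async def fake_fetch(i):
    print(f"start {i}")
    await asyncio.sleep(1)   # "waiting on I/O"; the event loop is free to run other tasks
    print(f"done {i}")
    return i

async def main():
    results = await asyncio.gather(*(fake_fetch(i) for i in range(3)))
    print(results)

asyncio.run(main())
```

All the “start” lines print immediately, then about a second later all the “done” lines appear together: the waits overlap instead of adding up.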
## When to Use This (and When Not To)
Use asyncio for I/O-bound work:
- HTTP requests (APIs, web scraping)
- Database queries
- File reads/writes
- Anything where your code spends most of its time waiting
Don’t use asyncio for CPU-bound work:
- Number crunching, image processing, data transformation
- If your code is slow because the CPU is busy computing, concurrency won’t help — you need `multiprocessing` for true parallelism across cores (see the sketch at the end of this section)
A quick rule of thumb: if you can make your program faster by getting a faster internet connection, it’s I/O-bound. If you need a faster CPU, it’s CPU-bound.
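For contrast, here’s a minimal sketch of the CPU-bound case using a process pool; the `crunch` function and its workload are made up for illustration.

```python
from multiprocessing import Pool

def crunch(n):
    # stand-in for real CPU-bound work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool() as pool:   # one worker process per CPU core by default
        results = pool.map(crunch, [5_000_000] * 8)
    print(len(results), "results computed in parallel")
```

Each call to `crunch` runs in its own process, so the work genuinely spreads across cores; asyncio would just run them one after another on a single thread.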
## Gotchas
A few things that trip people up:
- You can’t mix `requests` with `asyncio` easily. The `requests` library is synchronous and will block the event loop. Use `aiohttp` instead, or wrap synchronous calls with `asyncio.to_thread()` (see the sketch after this list).
- `asyncio.run()` can’t be called from inside an already-running event loop; it creates and destroys its own loop. If you’re inside a framework that already has a loop (like FastAPI), just `await` directly.
- Error handling works normally. Wrap your `await` calls in try/except just like synchronous code.
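Putting the first and last gotchas together, here’s a sketch of the `asyncio.to_thread()` escape hatch (Python 3.9+) with try/except around the awaits; the URLs and the `fetch_with_thread` helper are made up for illustration.

```python
import asyncio
import requests

async def fetch_with_thread(url):
    try:
        # requests.get would block the event loop if called directly;
        # asyncio.to_thread runs it in a worker thread so the loop stays free
        response = await asyncio.to_thread(requests.get, url, timeout=10)
        response.raise_for_status()
        return response.json()
    except requests.RequestException as exc:
        print(f"Request failed: {exc}")
        return None

async def main():
    urls = [f"https://example.com/api/items/{i}" for i in range(10)]  # illustrative
    results = await asyncio.gather(*(fetch_with_thread(u) for u in urls))
    print(f"{sum(r is not None for r in results)} requests succeeded")

asyncio.run(main())
```

Each `requests.get` call runs in a worker thread, so the event loop stays free to keep the other nine requests in flight.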
## Summary
|  | Synchronous | Concurrent (asyncio) |
|---|---|---|
| 10 requests @ 1s each | ~10 seconds | ~1 second |
| How it works | One at a time, blocking | All in flight, non-blocking |
| Good for | Simple scripts | Many I/O operations |