I still remember the first time I tried async Python. I added async to my function, used await everywhere, and expected magic. The code ran exactly as slowly as before. That was the day I realized async programming isn’t just about keywords. It’s about understanding how Python actually handles concurrency.
Three years later, after building several high-traffic services with asyncio, I’ve learned that the difference between broken async code and performant async code often comes down to a handful of patterns and pitfalls. This guide covers what actually matters in production.
Why Asyncio Matters Now More Than Ever
The Python web ecosystem looks very different from even two years ago. Modern backends power AI agents holding long-running conversations, coordinate multiple LLM calls, query vector databases, and stream partial results in real time. These workloads expose weaknesses that were easy to ignore when we were just serving simple CRUD endpoints.
Frameworks like FastAPI have made async the default for new Python web projects. But knowing the syntax isn’t the same as knowing how to use it well. I’ve seen teams claim they were using async while their code actually ran sequentially, bottlenecks hidden behind async def keywords.
The performance difference is real. A service making 1000 API calls that takes 3 minutes synchronously can drop to under 10 seconds with proper async patterns. But only if you understand what you’re doing.
Core Concepts You Need to Understand
The Event Loop
Think of the event loop as a traffic controller for your program. It manages and schedules asynchronous tasks, monitoring which ones are ready to proceed and executing them one at a time. Unlike multi-threaded code where tasks run simultaneously, async uses cooperative multitasking. Your tasks voluntarily yield control back to the loop when waiting.
This matters because one misbehaving task can block everything. If you call blocking code inside an async function, you freeze the entire loop.
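You can see this directly by timing a blocking call against its cooperative equivalent. The sketch below uses time.sleep as a stand-in for any blocking call (a sync HTTP request, a blocking DB driver):

```python
import asyncio
import time

def blocking_io():
    time.sleep(0.5)  # synchronous sleep: freezes the event loop while it runs

async def wrapped_blocking():
    blocking_io()

async def demo():
    start = time.monotonic()
    # Two "concurrent" tasks, but each time.sleep() blocks the loop,
    # so they run back to back: ~1.0 s total
    await asyncio.gather(wrapped_blocking(), wrapped_blocking())
    blocked = time.monotonic() - start

    start = time.monotonic()
    # asyncio.sleep() yields control back to the loop, so these overlap: ~0.5 s
    await asyncio.gather(asyncio.sleep(0.5), asyncio.sleep(0.5))
    cooperative = time.monotonic() - start
    return blocked, cooperative

blocked, cooperative = asyncio.run(demo())
print(f"blocking: {blocked:.2f}s, cooperative: {cooperative:.2f}s")
```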
Coroutines and Tasks
A coroutine is a function defined with async def that can pause and resume. When you call an async function, you get a coroutine object back, not the result. You need to await it to run.
async def fetch_data():
    return "data"

# Calling the function creates a coroutine object but doesn't run the body
coro = fetch_data()

# Awaiting it actually runs it (await is only valid inside another coroutine)
result = await coro
Tasks wrap coroutines and schedule them on the event loop. They’re how you run multiple coroutines concurrently.
import asyncio

async def fetch_user(user_id):
    # Simulate an API call
    await asyncio.sleep(1)
    return {"id": user_id, "name": f"User {user_id}"}

async def main():
    # This runs both fetches concurrently
    results = await asyncio.gather(
        fetch_user(1),
        fetch_user(2)
    )
    print(results)  # Takes ~1 second, not 2

asyncio.run(main())
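asyncio.gather creates the tasks for you, but you can also create them explicitly with asyncio.create_task, which schedules the coroutine on the loop immediately. A minimal sketch (fetch_user here is a simplified stand-in):

```python
import asyncio

async def fetch_user(user_id):
    await asyncio.sleep(0.1)  # stand-in for an API call
    return {"id": user_id}

async def main():
    # create_task starts the coroutine running in the background right away
    task1 = asyncio.create_task(fetch_user(1))
    task2 = asyncio.create_task(fetch_user(2))
    # Both tasks are already in flight; awaiting just collects the results
    return [await task1, await task2]

users = asyncio.run(main())
print(users)
```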
The Five Pitfalls That Will Burn You
After years of debugging async code in production, these are the issues I see most often.
Pitfall 1: Forgetting to Await
This one’s sneaky. Call an async function without await and Python creates the coroutine object but never runs it. Your code looks like it works but does nothing.
# WRONG - creates a coroutine object but never runs it
async def main():
    fetch_data()  # Nothing happens (Python warns: "coroutine was never awaited")

# CORRECT
async def main():
    await fetch_data()  # This runs the function
Pitfall 2: Blocking the Event Loop
Running synchronous blocking code inside an async function freezes everything. I once saw a team use requests.get() inside their async endpoints and wonder why their “async” service was slower than their old synchronous one.
import aiohttp
import requests

# WRONG - blocks the event loop
async def main():
    response = requests.get(url)  # This blocks!

# CORRECT - use an async library
async def main():
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()
For code that has no async version, use thread pools:
import asyncio
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor()

async def main():
    loop = asyncio.get_running_loop()
    # Run blocking code in a thread pool
    result = await loop.run_in_executor(
        executor,
        blocking_function
    )
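On Python 3.9+, asyncio.to_thread wraps the same executor machinery in a single call. A small sketch, with blocking_function as a placeholder for any sync call:

```python
import asyncio
import time

def blocking_function():
    time.sleep(0.2)  # stand-in for a blocking call with no async equivalent
    return "done"

async def main():
    # to_thread runs the blocking call in the default thread pool
    # and returns control to the loop while it waits
    return await asyncio.to_thread(blocking_function)

result = asyncio.run(main())
print(result)
```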
Pitfall 3: Sequential Instead of Concurrent
It’s easy to write code that looks async but actually runs sequentially:
# Looks async, but runs sequentially
async def main():
    result1 = await fetch_data(1)
    result2 = await fetch_data(2)
    result3 = await fetch_data(3)
This takes 3 seconds if each call takes 1 second. Fix it with asyncio.gather:
# Actually concurrent
async def main():
    results = await asyncio.gather(
        fetch_data(1),
        fetch_data(2),
        fetch_data(3)
    )
This takes 1 second total.
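You can verify the difference yourself with asyncio.sleep standing in for real I/O:

```python
import asyncio
import time

async def fetch_data(n):
    await asyncio.sleep(0.3)  # stand-in for a 0.3 s I/O call
    return n

async def sequential():
    return [await fetch_data(1), await fetch_data(2), await fetch_data(3)]

async def concurrent():
    return await asyncio.gather(fetch_data(1), fetch_data(2), fetch_data(3))

start = time.monotonic()
asyncio.run(sequential())
seq_time = time.monotonic() - start  # ~0.9 s: the awaits run one after another

start = time.monotonic()
asyncio.run(concurrent())
conc_time = time.monotonic() - start  # ~0.3 s: the sleeps overlap

print(f"sequential: {seq_time:.2f}s, concurrent: {conc_time:.2f}s")
```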
Pitfall 4: Not Handling Exceptions
In async code, exceptions don’t automatically bubble up the way you expect. By default, asyncio.gather raises the first exception it encounters while the remaining tasks keep running in the background, and any failures they hit later are effectively swallowed.
# RISKY - exceptions might be lost
results = await asyncio.gather(
    risky_task(),
    another_task()
)

# BETTER - use return_exceptions
results = await asyncio.gather(
    risky_task(),
    another_task(),
    return_exceptions=True
)
for result in results:
    if isinstance(result, Exception):
        print(f"Task failed: {result}")
Pitfall 5: Mutable Default Arguments
This classic Python gotcha applies to async too:
# DANGER - mutable default argument
async def process_items(items=[]):
    items.append("processed")
    return items

# Calling it multiple times shares the same list!
result1 = await process_items()
result2 = await process_items()
print(result2)  # ['processed', 'processed'] - WRONG!

# CORRECT
async def process_items(items=None):
    if items is None:
        items = []
    items.append("processed")
    return items
Patterns That Actually Work
Fan-Out, Fan-In
Launch multiple tasks concurrently, then wait for all to complete:
async def fetch_all_users(user_ids):
    # Fan-out: launch all tasks
    tasks = [fetch_user(uid) for uid in user_ids]
    # Fan-in: wait for all to complete
    results = await asyncio.gather(*tasks)
    return results
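When you want each result as soon as it arrives instead of waiting for the slowest task, asyncio.as_completed is a useful variant. A runnable sketch, with fetch_user as a stand-in whose later IDs finish sooner:

```python
import asyncio

async def fetch_user(uid):
    await asyncio.sleep(0.3 - uid * 0.1)  # uid 2 finishes first, uid 0 last
    return uid

async def fetch_all_users(user_ids):
    tasks = [asyncio.create_task(fetch_user(uid)) for uid in user_ids]
    results = []
    # as_completed yields awaitables in completion order, not submission order
    for finished in asyncio.as_completed(tasks):
        results.append(await finished)
    return results

order = asyncio.run(fetch_all_users([0, 1, 2]))
print(order)
```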
Producer-Consumer with Queues
When you need to coordinate work between coroutines:
async def producer(queue):
    for item in generate_items():
        await queue.put(item)
    await queue.put(None)  # Signal completion

async def consumer(queue):
    while True:
        item = await queue.get()
        if item is None:
            break
        await process(item)

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(
        producer(queue),
        consumer(queue)
    )
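With several consumers, a None sentinel per consumer gets clumsy; queue.join() with task_done() is the usual alternative. A self-contained sketch where range and item * 2 stand in for generate_items and process:

```python
import asyncio

async def producer(queue, n):
    for item in range(n):  # stand-in for generate_items()
        await queue.put(item)

async def consumer(queue, results):
    while True:
        item = await queue.get()
        results.append(item * 2)  # stand-in for process(item)
        queue.task_done()

async def main():
    # maxsize bounds the queue, so a fast producer can't outrun the consumers
    queue = asyncio.Queue(maxsize=5)
    results = []
    workers = [asyncio.create_task(consumer(queue, results)) for _ in range(3)]
    await producer(queue, 10)
    await queue.join()  # wait until every item has been processed
    for w in workers:
        w.cancel()
    return results

results = asyncio.run(main())
print(sorted(results))
```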
Task Groups (Python 3.11+)
Structured concurrency makes error handling much cleaner:
async def main():
    async with asyncio.TaskGroup() as tg:
        task1 = tg.create_task(fetch_data(1))
        task2 = tg.create_task(fetch_data(2))
    # If either raises, the other is cancelled automatically
    # No more orphaned tasks
Performance Optimization That Matters
Use the Right Library
Not all libraries are async-compatible. Using synchronous libraries inside async code defeats the entire purpose:
| Task | Synchronous | Async |
|---|---|---|
| HTTP requests | requests | aiohttp, httpx |
| Database | psycopg2 | asyncpg, aiomysql |
| File I/O | open() | aiofiles |
| Redis | redis-py (sync API) | redis.asyncio (built into redis-py; replaces the archived aioredis) |
Run CPU-Bound Work in Process Pools
Async doesn’t help with CPU-bound tasks. For those, use process pools:
import asyncio
from concurrent.futures import ProcessPoolExecutor

async def main():
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(
        ProcessPoolExecutor(),
        heavy_computation
    )
Implement Caching Wisely
For expensive operations that get called repeatedly:
Beware: functools.lru_cache does not work on coroutines. It caches the coroutine object rather than the result, and a coroutine can only be awaited once, so the second cache hit fails with “cannot reuse already awaited coroutine.” Cache the awaited result instead:

_user_cache = {}

async def get_user_cached(user_id):
    if user_id not in _user_cache:
        _user_cache[user_id] = await fetch_user(user_id)
    return _user_cache[user_id]

(Third-party packages like async-lru offer a decorator-based equivalent.)
Minimize Object Creation
Creating objects inside hot paths adds overhead:
# Less efficient: a new dict per item, each carrying its own key storage
async def process_items(items):
    results = []
    for item in items:
        result = {"id": item["id"], "value": item["value"] * 2}
        results.append(result)
    return results

# More memory-efficient: slots=True avoids a per-instance __dict__ (Python 3.10+)
from dataclasses import dataclass

@dataclass(slots=True)
class Result:
    id: int
    value: int

async def process_items(items):
    return [Result(id=item["id"], value=item["value"] * 2)
            for item in items]
When to Use Async (and When Not To)
Here’s a practical guide:
Use Async when:
- Building HTTP APIs that call external services
- Handling WebSocket connections
- Processing multiple I/O-bound operations concurrently
- Building real-time applications
- Working with streaming data
Don’t Bother when:
- Simple CRUD operations with no external calls
- CPU-bound computational tasks
- Scripts that run once and exit
- When simplicity matters more than performance
Common Mistakes with Solutions
Mixing Sync and Async Code
# PROBLEM: requests blocks the event loop
import requests

@app.get("/users/{user_id}")
async def get_user(user_id):
    response = requests.get(f"/users/{user_id}")  # BAD
    return response.json()

# SOLUTION: use httpx
import httpx

@app.get("/users/{user_id}")
async def get_user(user_id):
    async with httpx.AsyncClient() as client:
        response = await client.get(f"/users/{user_id}")
    return response.json()
Not Setting Timeouts
Always set timeouts for operations that could hang:
import asyncio

async def fetch_with_timeout():
    try:
        async with asyncio.timeout(5):  # 5 second timeout (Python 3.11+)
            return await slow_operation()
    except asyncio.TimeoutError:
        return None  # Handle timeout
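On Python versions before 3.11, asyncio.wait_for does the same job. A runnable sketch, with slow_operation as a stand-in for a call that might hang:

```python
import asyncio

async def slow_operation():
    await asyncio.sleep(1)  # stand-in for a call that might hang
    return "ok"

async def fetch_with_timeout():
    try:
        # wait_for cancels the inner coroutine if the deadline passes
        return await asyncio.wait_for(slow_operation(), timeout=0.2)
    except asyncio.TimeoutError:
        return None  # Handle timeout

result = asyncio.run(fetch_with_timeout())
print(result)
```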
Ignoring Backpressure
When producing faster than consuming:
async def bounded_process(items):
    semaphore = asyncio.Semaphore(10)  # Limit concurrency

    async def process_one(item):
        async with semaphore:
            return await process(item)

    return await asyncio.gather(*[process_one(i) for i in items])
The uvloop Advantage
For additional performance, consider uvloop, a drop-in replacement for the default event loop:
import asyncio
import uvloop

async def main():
    # Your async code here
    pass

uvloop.install()
asyncio.run(main())
In benchmarks, uvloop delivers roughly 2-4x faster network I/O than the default event loop. Uvicorn, the server most commonly used with FastAPI, picks it up automatically when it’s installed.
Summary
Async Python in 2026 is about understanding the event loop, avoiding blocking operations, running tasks concurrently when appropriate, and handling errors properly. The syntax is simple. The patterns take time to learn.
Start with one change: find a place in your code where you’re making multiple I/O calls sequentially and refactor it to use asyncio.gather. See the performance difference yourself. Then build from there.
The tools keep improving. But the fundamentals haven’t changed: async works when you understand what it’s actually doing under the hood.