Python's asyncio module unlocks concurrent programming without threads. Here's how to use it effectively.
## The Core Idea
Async code lets you do other work while waiting for I/O (network requests, file reads, database queries). Instead of blocking, you yield control back to an event loop that can run other tasks.
```python
import asyncio

async def fetch_data():
    print("Starting fetch...")
    await asyncio.sleep(1)  # Simulate network delay
    print("Fetch complete!")
    return {"data": "results"}

# Run it
result = asyncio.run(fetch_data())
```

## async and await
Two keywords make it work:
- `async def` declares a coroutine function
- `await` pauses execution until the awaited thing completes
```python
async def process_item(item):
    # await can only be used inside async functions
    result = await some_async_operation(item)
    return result
```

## Running Multiple Tasks
The power comes from running tasks concurrently:
```python
import asyncio

async def fetch_url(url):
    print(f"Fetching {url}")
    await asyncio.sleep(1)  # Simulate request
    return f"Data from {url}"

async def main():
    # Run three fetches concurrently
    results = await asyncio.gather(
        fetch_url("https://api.example.com/users"),
        fetch_url("https://api.example.com/posts"),
        fetch_url("https://api.example.com/comments"),
    )
    print(results)

asyncio.run(main())
```

All three "requests" complete in ~1 second total, not 3 seconds.
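That timing claim is easy to check: wrap the `gather` call in a timer. A small self-contained sketch (reusing the simulated `fetch_url` so it runs standalone):

```python
import asyncio
import time

async def fetch_url(url):
    # Simulated one-second request, as in the example above
    await asyncio.sleep(1)
    return f"Data from {url}"

async def main():
    start = time.perf_counter()
    await asyncio.gather(
        fetch_url("https://api.example.com/users"),
        fetch_url("https://api.example.com/posts"),
        fetch_url("https://api.example.com/comments"),
    )
    elapsed = time.perf_counter() - start
    # The three sleeps overlap, so this prints roughly 1.00, not 3.00
    print(f"Elapsed: {elapsed:.2f}s")

asyncio.run(main())
```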
## Creating Tasks
For more control, create tasks explicitly:
```python
async def main():
    # Create tasks (they're scheduled to run as soon as the loop gets a chance)
    task1 = asyncio.create_task(fetch_url("url1"))
    task2 = asyncio.create_task(fetch_url("url2"))
    # Do other work while tasks run...
    print("Tasks are running in background")
    # Wait for results when you need them
    result1 = await task1
    result2 = await task2
```

## Timeouts
Don't let slow operations hang forever:
```python
async def main():
    try:
        # Cancel if it takes more than 5 seconds
        result = await asyncio.wait_for(
            slow_operation(),
            timeout=5.0,
        )
    except asyncio.TimeoutError:
        print("Operation timed out!")
```

## Real-World Pattern: Async HTTP
Using aiohttp for actual HTTP requests:
```python
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [
            fetch(session, f"https://api.example.com/item/{i}")
            for i in range(10)
        ]
        results = await asyncio.gather(*tasks)
        print(f"Fetched {len(results)} items")

asyncio.run(main())
```

## Common Gotchas
### 1. Forgetting to await
```python
# Wrong - returns a coroutine object, doesn't run it
result = fetch_data()

# Right (inside an async function)
result = await fetch_data()
```

### 2. Blocking the event loop
```python
# Wrong - blocks everything
import time
time.sleep(5)

# Right - yields control
await asyncio.sleep(5)
```

### 3. Running async from sync code
```python
# Use asyncio.run() at the top level
asyncio.run(main())

# Or in Jupyter notebooks, where a loop is already running
await main()
```

## When to Use asyncio
Good fit:
- Many I/O operations (HTTP requests, database queries)
- WebSockets and real-time connections
- Concurrent file operations
Not ideal for:
- CPU-bound work (use `multiprocessing` instead)
- Simple scripts with few I/O operations
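If a mostly-async app does need a bit of CPU-bound work, one option is to hand it to a process pool via `loop.run_in_executor` so the event loop stays responsive. A minimal sketch (the `crunch` function and its input are made up for illustration):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    # CPU-bound work: runs in a worker process, not on the event loop
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # Offload the computation; other tasks can run while we wait
        result = await loop.run_in_executor(pool, crunch, 1_000_000)
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```

The `if __name__ == "__main__":` guard matters here: on platforms that spawn worker processes by re-importing the module, omitting it can crash the pool.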
## Quick Reference
```python
import asyncio

# Define an async function
async def my_func():
    await something()

# Run from sync code
asyncio.run(my_func())

# Run multiple concurrently (inside an async function)
await asyncio.gather(task1(), task2())

# Create a background task
task = asyncio.create_task(my_func())

# With a timeout
await asyncio.wait_for(my_func(), timeout=10)

# Sleep without blocking
await asyncio.sleep(1)
```

Async Python takes some getting used to, but once it clicks, you'll find it invaluable for I/O-heavy applications.