python concurrency is fine actually
The GIL discourse exhausts me. Python concurrency is fine for 90% of workloads if you pick the right model.
the three models
threading — one OS thread per threading.Thread. The GIL prevents parallel execution of Python bytecode, but it's released during blocking calls, so threads run concurrently while waiting on I/O. Fine for I/O-bound work where you don't need thousands of tasks.
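To make that concrete, here's a minimal sketch (the fetch helper is hypothetical; time.sleep stands in for a blocking network call):

```python
import threading
import time

def fetch(results: list, i: int) -> None:
    # time.sleep stands in for a blocking network call;
    # the GIL is released while the thread waits, so others proceed
    time.sleep(0.2)
    results.append(i)

results: list = []
threads = [threading.Thread(target=fetch, args=(results, i)) for i in range(5)]

start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

# five 0.2 s "requests" overlap: total wall time is ~0.2 s, not ~1 s
print(f"{len(results)} results in {elapsed:.2f}s")
```

The waits overlap, which is the whole point: threading buys you concurrent I/O, not parallel computation.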
asyncio — a single thread, event loop, cooperative multitasking. Extremely efficient for I/O-bound work at scale. The catch: everything in your call stack that touches I/O needs to be async-aware. Mixing sync and async code is where people get burned.
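The scale argument is easy to demonstrate. A sketch with a hypothetical fetch coroutine, using asyncio.sleep in place of a real network call:

```python
import asyncio
import time

async def fetch(i: int) -> int:
    # asyncio.sleep stands in for an awaitable network call
    await asyncio.sleep(0.2)
    return i

async def main() -> list:
    # gather schedules every coroutine on the one event loop;
    # they overlap while each is suspended at its await point
    return await asyncio.gather(*(fetch(i) for i in range(100)))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(f"{len(results)} tasks in {elapsed:.2f}s")
```

A hundred concurrent waits on one thread, no thread-per-task overhead. But note the caveat above: if fetch called a synchronous HTTP library instead of awaiting, all hundred would serialize.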
multiprocessing — separate processes, no GIL. The right choice for CPU-bound work. Overhead is higher (process startup, IPC), but if you're doing actual computation this is what you want.
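A sketch of the CPU-bound case with multiprocessing.Pool (cpu_task is a hypothetical stand-in for real computation):

```python
from multiprocessing import Pool

def cpu_task(n: int) -> int:
    # pure computation: worker processes run in parallel, no GIL contention
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # the __main__ guard matters: on spawn-based platforms (macOS, Windows)
    # each worker re-imports this module
    with Pool(processes=4) as pool:
        results = pool.map(cpu_task, [200_000] * 8)
    print(len(results))
```

Arguments and return values cross process boundaries via pickling, which is the IPC overhead mentioned above: fine for chunky tasks, wasteful for many tiny ones.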
the rule
- Network I/O at scale → asyncio
- CPU-bound parallelism → multiprocessing
- "I have a few threads that mostly wait" → threading, keep it simple
Most of the "Python is slow" complaints I see are people using threading for CPU work or doing synchronous I/O inside an async function. The model isn't broken; it's just being used wrong.
asyncio in practice
The main footgun: blocking the event loop. Any synchronous call that takes more than a few milliseconds will stall every other coroutine. Use loop.run_in_executor to push blocking calls to a thread pool:
import asyncio
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)

def read_sync(path: str) -> str:
    with open(path) as f:  # blocking is fine here: this runs in a worker thread
        return f.read()

async def read_file(path: str) -> str:
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(executor, read_sync, path)
It's verbose, but it's explicit about what's happening.
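On Python 3.9+ there's a shorter spelling: asyncio.to_thread runs a callable in the loop's default thread pool, so you can drop the executor bookkeeping. Same pattern, sketched with the same hypothetical read helper:

```python
import asyncio

def read_sync(path: str) -> str:
    # synchronous file read; safe because it runs off the event loop
    with open(path) as f:
        return f.read()

async def read_file(path: str) -> str:
    # asyncio.to_thread (3.9+) pushes the call to the default executor
    return await asyncio.to_thread(read_sync, path)

# usage sketch: read this very script without blocking the loop
content = asyncio.run(read_file(__file__))
print(len(content))
```

Use the explicit executor when you need to bound or isolate the pool; use to_thread when you just need "not on the event loop."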