
The Vocabulary Problem in Concurrent Programming

Source: isocpp

Concurrency terminology gets muddled fast. Threads, coroutines, async/await, actors, fibers: the words appear interchangeably in documentation and conversation, and the confusion compounds when you’re trying to pick the right tool for a specific problem.

Lucian Radu Teodorescu published a clarifying piece on this at isocpp.org back in December, and it’s worth reading even if you’ve been writing concurrent code for years. The core argument is that concurrency is not one thing; it’s a family of approaches, each with different semantics, different performance characteristics, and different failure modes.

Most engineers reach for concurrency primitives without a clear map of the territory. When I started writing Discord bots, I used async/await because the library did. It worked well enough, but I didn’t understand what I was trading off until I hit backpressure issues and had to think carefully about what “concurrent” even meant in that context.
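The backpressure issue, in miniature, looks something like this. This is my own illustration rather than anything from the article: a bounded `asyncio.Queue` forces a fast producer to suspend until a slower consumer catches up, instead of buffering without limit.

```python
import asyncio

# Backpressure sketch: a bounded queue makes the producer wait for
# the consumer rather than growing an unbounded buffer.

async def producer(queue, n):
    for i in range(n):
        await queue.put(i)      # suspends whenever the queue is full
    await queue.put(None)       # sentinel: no more items

async def consumer(queue, seen):
    while True:
        item = await queue.get()
        if item is None:
            return
        seen.append(item)
        await asyncio.sleep(0)  # stand-in for per-item work

async def main():
    queue = asyncio.Queue(maxsize=2)  # the bound is the backpressure
    seen = []
    await asyncio.gather(producer(queue, 5), consumer(queue, seen))
    return seen

seen = asyncio.run(main())
```

With an unbounded queue the producer would never suspend, and memory growth would be the only signal that the consumer can’t keep up.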

The article draws useful distinctions. There’s a difference between concurrency and parallelism, and there’s a difference between cooperative and preemptive multitasking. Coroutines give you concurrency without parallelism; threads give you both, at the cost of synchronization complexity. Async/await in most languages is syntactic sugar over cooperative coroutines, which means control is yielded only at explicit suspension points. A blocking call in a hot path therefore stalls everything else scheduled on the same loop, and that failure mode is difficult to debug after the fact.
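The suspension-point distinction is easy to demonstrate. In this sketch (mine, using Python’s asyncio as a stand-in for cooperative coroutines generally), two tasks each “sleep” for the same duration, but the one using a blocking `time.sleep` freezes the event loop and finishes first, because it never yields:

```python
import asyncio
import time

# Cooperative scheduling: control transfers only at explicit
# suspension points (await). A blocking call never yields, so it
# stalls every other task on the loop.

async def cooperative(results):
    await asyncio.sleep(0.1)  # suspension point: the loop may run others
    results.append("cooperative")

async def blocking(results):
    time.sleep(0.1)           # no suspension point: the loop is frozen
    results.append("blocking")

async def main():
    results = []
    # cooperative() is scheduled first, yet blocking() finishes first:
    # once it starts, nothing else runs until its sleep completes.
    await asyncio.gather(cooperative(results), blocking(results))
    return results

order = asyncio.run(main())
```

Scale the sleeps up and add a few dozen tasks, and this is exactly the “blocking call in a hot path” pathology: one synchronous call silently serializes the whole loop.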

The actor model takes a different angle. Rather than sharing memory and coordinating access, actors communicate through message passing. This shifts the complexity from lock management to message ordering and mailbox design. Both approaches carry complexity; they distribute it differently.
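A minimal actor can be sketched in a few lines. This is an assumed design for illustration, not anything from the article: the actor owns private state and a mailbox (here an `asyncio.Queue`), and the only way to affect that state is to enqueue a message.

```python
import asyncio

# Minimal actor sketch: private state, a mailbox, and a loop that
# processes one message at a time. No locks, because no memory is
# shared; the mailbox serializes all access.

class CounterActor:
    def __init__(self):
        self._count = 0                  # private state, never shared
        self._mailbox = asyncio.Queue()  # all interaction goes through here

    async def run(self):
        while True:
            msg, reply = await self._mailbox.get()
            if msg == "incr":
                self._count += 1
            elif msg == "get":
                reply.set_result(self._count)
            elif msg == "stop":
                return

    async def send(self, msg):
        reply = asyncio.get_running_loop().create_future()
        await self._mailbox.put((msg, reply))
        return reply

async def main():
    actor = CounterActor()
    runner = asyncio.create_task(actor.run())
    for _ in range(3):
        await actor.send("incr")
    count = await (await actor.send("get"))
    await actor.send("stop")
    await runner
    return count

count = asyncio.run(main())
```

Notice where the complexity went: there is no lock anywhere, but the sentinel protocol, the reply future, and the shutdown message are all mailbox-design decisions that a shared-memory version wouldn’t need.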

What the article gets right is that understanding these distinctions changes how you reason about bugs. A deadlock in a thread-based system has a different shape than starvation in an actor system. Conflating the models leads to applying the wrong diagnostic lens and spending time looking in the wrong place.

For systems programming work, these distinctions are non-negotiable. Choosing between kernel threads, green threads, or async I/O has measurable consequences for throughput and latency. At the application layer, async/await often hides enough complexity that you can ship working software without understanding it deeply, but that debt shows up eventually.

The piece was originally published in December 2025, so it reads somewhat as a retrospective on where concurrency tooling has landed after years of async proliferation in mainstream languages. That context makes the terminology clarification feel timely rather than academic. The vocabulary has stabilized enough that it’s worth nailing down what each term actually means.
