Three threads execute concurrently, synchronizing at crossing points. No thread waits for another to finish — they coordinate, exchange, and continue. The diagram is the architecture; the architecture is the program.
Concurrency is not about doing many things at once. It is about structuring a program so that multiple things can happen at once. The distinction is everything. Parallelism is a runtime property — a function of hardware, of cores, of buses. Concurrency is a design property — a function of structure, of decomposition, of the architecture of communication.
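The distinction can be made concrete in Go. In the sketch below (the function names `interleaved` and the constants are illustrative, not from any particular codebase), the scheduler is pinned to a single OS thread, so nothing ever runs in parallel, and yet the program's structure is still concurrent: the goroutines interleave and the result is the same.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// interleaved demonstrates concurrency without parallelism: with the
// scheduler pinned to one OS thread, the goroutines cannot run
// simultaneously, but they still interleave. The concurrent structure
// is a property of the design, not of the hardware.
func interleaved() int {
	runtime.GOMAXPROCS(1) // one thread: no parallelism, still concurrency
	var (
		wg    sync.WaitGroup
		mu    sync.Mutex
		total int
	)
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(n int) {
			defer wg.Done()
			runtime.Gosched() // yield, letting the others interleave
			mu.Lock()
			total += n
			mu.Unlock()
		}(i)
	}
	wg.Wait()
	return total
}

func main() {
	fmt.Println(interleaved()) // 0+1+2+3 = 6, on one core or many
}
```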
When you write a concurrent program, you are not writing a faster program. You are writing a program whose structure more accurately reflects the structure of the problem it solves. The web server that handles ten thousand connections does not run ten thousand processes in parallel — it interleaves them, switches context, shares time on a handful of cores. The concurrency is in the design; the parallelism is incidental.
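A minimal sketch of that shape, with network I/O stubbed out (the names `serveAll` and `handle` are invented for illustration): ten thousand goroutines, each standing in for one connection, multiplexed by the runtime onto however many cores happen to exist.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
	"sync/atomic"
)

// handle stands in for servicing one connection. In a real server this
// would read a request and write a response; here it just records that
// the work happened.
func handle(id int, done *int64) {
	atomic.AddInt64(done, 1)
}

// serveAll launches one goroutine per "connection". The goroutines are
// cheap because they are a design construct: the runtime interleaves
// all n of them across a handful of OS threads.
func serveAll(n int) int64 {
	var done int64
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			handle(id, &done)
		}(i)
	}
	wg.Wait()
	return done
}

func main() {
	fmt.Println("cores:", runtime.NumCPU())
	fmt.Println("handled:", serveAll(10000))
}
```

Ten thousand goroutines is unremarkable; ten thousand OS threads would not be. The parallelism is whatever `runtime.NumCPU()` reports; the concurrency is the ten thousand.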
This is why concurrency is hard: it requires you to think about time. Not clock time, but logical time — the ordering of events, the visibility of state changes, the guarantees (or lack thereof) about what happens before what. A sequential program is a story told in order. A concurrent program is a conversation among strangers who may or may not be speaking the same language, in the same room, at the same time.
The great insight of Communicating Sequential Processes — Hoare's 1978 paper that planted the seed for Go's goroutines, Erlang's actors, Clojure's core.async — is that communication is synchronization. When two processes exchange a message, they implicitly agree on a moment in time. The message is the handshake. The channel is the contract. No locks, no shared mutable state, no prayer that the scheduler will be kind.
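An unbuffered Go channel makes the CSP handshake literal. In this sketch (the function name `rendezvous` is mine), the send cannot complete until the receive is ready; the two goroutines agree on a moment in time without a lock in sight.

```go
package main

import "fmt"

// rendezvous shows communication as synchronization. On an unbuffered
// channel, a send blocks until a receiver arrives: the exchange of the
// message *is* the synchronization point.
func rendezvous() string {
	ch := make(chan string) // unbuffered: sender and receiver must meet
	go func() {
		ch <- "hello" // blocks until main is ready to receive
	}()
	return <-ch // the receive completes the handshake
}

func main() {
	fmt.Println(rendezvous())
}
```

Note that the guarantee disappears with a buffered channel: `make(chan string, 1)` lets the sender continue without meeting anyone, which is sometimes what you want, but is no longer a rendezvous.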
We build systems that are concurrent because the problems they solve are concurrent. Users click simultaneously. Sensors fire asynchronously. Networks partition without warning. The question is never whether to deal with concurrency — it is whether to acknowledge it in your design or pretend it away and suffer the consequences in production at 3 AM.
When two threads reach for the same memory without coordination, the result is not an error — it is chaos wearing the mask of correctness. The race condition does not crash your program. It corrupts your data silently, intermittently, and only when the production load is high enough to make the timing just wrong enough. It is the hardest bug to find because it is the bug that only exists sometimes.
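The classic specimen, sketched in Go (the function name `racyCount` is illustrative): many goroutines incrementing a shared counter with no coordination. Because `counter++` is a load, an add, and a store, two goroutines can read the same value and one increment vanishes. The program compiles, runs, and usually prints something plausible, which is precisely the problem. No expected output is given below because there isn't one.

```go
package main

import (
	"fmt"
	"sync"
)

// racyCount increments a shared counter from several goroutines with no
// synchronization. Increments are silently lost when two goroutines
// interleave inside the read-modify-write -- intermittently, and more
// often under load.
func racyCount(goroutines, perGoroutine int) int {
	counter := 0
	var wg sync.WaitGroup
	for g := 0; g < goroutines; g++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < perGoroutine; i++ {
				counter++ // data race: load, add, store
			}
		}()
	}
	wg.Wait()
	return counter
}

func main() {
	want := 8 * 100000
	got := racyCount(8, 100000)
	fmt.Printf("expected %d, got %d\n", want, got)
}
```

Run it with `go run -race` and the race detector names the exact line; run it without and you get a different wrong answer each time.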
The mutex is not a workaround. It is a design primitive — a declaration that this region of code demands exclusive attention. Coordination is not overhead; it is architecture. Every lock acquired is a contract honored. Every channel message sent is a handshake completed. The concurrent system that works is the one that was designed to be concurrent from the first line.
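The same counter, redesigned rather than patched (again, `safeCount` is an illustrative name): the mutex marks the read-modify-write as a region demanding exclusive attention, and the lost updates disappear.

```go
package main

import (
	"fmt"
	"sync"
)

// safeCount guards the shared counter with a mutex. The Lock/Unlock
// pair declares the critical section: every goroutine that honors the
// contract sees a consistent counter, and no increment is lost.
func safeCount(goroutines, perGoroutine int) int {
	var (
		mu      sync.Mutex
		counter int
		wg      sync.WaitGroup
	)
	for g := 0; g < goroutines; g++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < perGoroutine; i++ {
				mu.Lock()
				counter++ // exclusive: no interleaving inside the section
				mu.Unlock()
			}
		}()
	}
	wg.Wait()
	return counter
}

func main() {
	fmt.Println(safeCount(8, 100000)) // always 800000
}
```

For a single integer, `sync/atomic` would do the same job more cheaply; the mutex earns its keep when the invariant spans more than one word of memory.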