01
CONCURRENT .QUEST
Parallelism is not concurrency. Here is why the distinction matters.
02

THREADS DON'T WAIT

fig. 1 — threads: T1, T2, T3 crossing at synchronization points

Three threads execute concurrently, synchronizing at crossing points. No thread waits for another to finish — they coordinate, exchange, and continue. The diagram is the architecture; the architecture is the program.
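
A minimal sketch of that shape in Go, with hypothetical names: three goroutines pass a token around a ring of channels, each one synchronizing at its crossing point and then continuing independently.

package main

import "fmt"

func main() {
    // Three channels form the crossing points of a ring.
    chans := []chan int{make(chan int), make(chan int), make(chan int)}
    done := make(chan struct{})

    for i := 0; i < 3; i++ {
        go func(id int, in, out chan int) {
            for token := range in {
                fmt.Printf("T%d holds token %d\n", id, token)
                if token == 0 {
                    close(done) // last hop: signal main and stop
                    return
                }
                out <- token - 1 // exchange, then continue
            }
        }(i+1, chans[i], chans[(i+1)%3])
    }

    chans[0] <- 5 // inject the token
    <-done        // wait only for the final exchange, not for each thread
}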

03

Concurrency is not about doing many things at once. It is about structuring a program so that multiple things can happen at once. The distinction is everything. Parallelism is a runtime property — a function of hardware, of cores, of buses. Concurrency is a design property — a function of structure, of decomposition, of the architecture of communication.
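
The distinction is visible in a few lines of Go. A minimal sketch, assuming only the standard library: pinned to a single core, the program below remains fully concurrent while executing nothing in parallel.

package main

import (
    "fmt"
    "runtime"
    "sync"
)

func main() {
    // One OS-level execution context: concurrency survives,
    // parallelism is impossible by construction.
    runtime.GOMAXPROCS(1)

    var wg sync.WaitGroup
    for _, name := range []string{"A", "B"} {
        wg.Add(1)
        go func(name string) {
            defer wg.Done()
            for i := 0; i < 3; i++ {
                fmt.Printf("worker %s: step %d\n", name, i)
                runtime.Gosched() // yield so the other worker can interleave
            }
        }(name)
    }
    wg.Wait()
}

The structure, two independent workers, is identical whether GOMAXPROCS is 1 or 64. Only the runtime property changes.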

"THE WORLD IS CONCURRENT. THINGS IN THE WORLD DON'T SHARE DATA. THEY COMMUNICATE WITH MESSAGES."

When you write a concurrent program, you are not writing a faster program. You are writing a program whose structure more accurately reflects the structure of the problem it solves. The web server that handles ten thousand connections does not run ten thousand processes in parallel — it interleaves them, switches context, shares time on a handful of cores. The concurrency is in the design; the parallelism is incidental.
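
In Go the design is nearly literal. A sketch, not a production server, and the port and echo behavior are illustrative assumptions: one goroutine per connection, multiplexed by the runtime onto a handful of OS threads.

package main

import (
    "bufio"
    "fmt"
    "net"
)

func handle(conn net.Conn) {
    defer conn.Close()
    line, err := bufio.NewReader(conn).ReadString('\n')
    if err != nil {
        return
    }
    fmt.Fprintf(conn, "echo: %s", line)
}

func main() {
    ln, err := net.Listen("tcp", ":8080")
    if err != nil {
        panic(err)
    }
    for {
        conn, err := ln.Accept()
        if err != nil {
            continue
        }
        go handle(conn) // ten thousand of these interleave on a few cores
    }
}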

This is why concurrency is hard: it requires you to think about time. Not clock time, but logical time — the ordering of events, the visibility of state changes, the guarantees (or lack thereof) about what happens before what. A sequential program is a story told in order. A concurrent program is a conversation among strangers who may or may not be speaking the same language, in the same room, at the same time.
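
That logical time is precisely what a memory model pins down. A minimal Go sketch of a happens-before guarantee: closing the channel orders the write before the receive, so the read below can never observe the empty string.

package main

import "fmt"

func main() {
    var data string
    done := make(chan struct{})

    go func() {
        data = "ready" // (1) the write...
        close(done)    // (2) ...happens before the close
    }()

    <-done            // (3) the receive completes after the close,
    fmt.Println(data) //     so the write is guaranteed to be visible
}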

"DO NOT COMMUNICATE BY SHARING MEMORY; SHARE MEMORY BY COMMUNICATING."

The great insight of Communicating Sequential Processes — Hoare's 1978 paper that planted the seed for Go's goroutines, Erlang's actors, Clojure's core.async — is that communication is synchronization. When two processes exchange a message, they implicitly agree on a moment in time. The message is the handshake. The channel is the contract. No locks, no shared mutable state, no prayer that the scheduler will be kind.
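
An unbuffered Go channel makes the handshake concrete. A small sketch: the send blocks until a receiver arrives, so exchanging the message is itself the moment of synchronization.

package main

import (
    "fmt"
    "time"
)

func main() {
    ch := make(chan string) // unbuffered: a rendezvous, not a mailbox
    done := make(chan struct{})

    go func() {
        ch <- "hello" // blocks here until main is ready to receive
        fmt.Println("sender: handshake complete")
        close(done)
    }()

    time.Sleep(100 * time.Millisecond) // main dawdles; the sender waits
    fmt.Println("received:", <-ch)     // both sides agree on this moment
    <-done
}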

We build systems that are concurrent because the problems they solve are concurrent. Users click simultaneously. Sensors fire asynchronously. Networks partition without warning. The question is never whether to deal with concurrency — it is whether to acknowledge it in your design or pretend it away and suffer the consequences in production at 3 AM.

04

THE PROBLEM OF SHARED STATE

fig. 3 — race: Thread A and Thread B write the same STATE; both views corrupted

When two threads reach for the same memory without coordination, the result is not an error — it is chaos wearing the mask of correctness. The race condition does not crash your program. It corrupts your data silently, intermittently, and only when the production load is high enough to make the timing just wrong enough. It is the hardest bug to find because it is the bug that only exists sometimes.
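
A contrived Go sketch of that silence, assuming only the standard library: two goroutines increment a shared counter without coordination. It compiles, runs, and returns a number; the number is simply wrong, and differently wrong on each run. Running it under go run -race will name the crime.

package main

import (
    "fmt"
    "sync"
)

func main() {
    counter := 0
    var wg sync.WaitGroup

    for i := 0; i < 2; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for j := 0; j < 100000; j++ {
                counter++ // unsynchronized read-modify-write: updates are lost
            }
        }()
    }

    wg.Wait()
    fmt.Println("expected 200000, got", counter) // usually less, never an error
}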

05

COORDINATION IS DESIGN

fig. 4 — lock: P1 holds the MUTEX and proceeds as P1'; P2 waits, then proceeds as P2'

The mutex is not a workaround. It is a design primitive — a declaration that this region of code demands exclusive attention. Coordination is not overhead; it is architecture. Every lock acquired is a contract honored. Every channel message sent is a handshake completed. The concurrent system that works is the one that was designed to be concurrent from the first line.
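
The same counter as the sketch above, redesigned rather than patched: a sync.Mutex declares the critical section, and the result becomes deterministic.

package main

import (
    "fmt"
    "sync"
)

func main() {
    var (
        mu      sync.Mutex
        counter int
        wg      sync.WaitGroup
    )

    for i := 0; i < 2; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for j := 0; j < 100000; j++ {
                mu.Lock()
                counter++ // exclusive attention: the contract is honored
                mu.Unlock()
            }
        }()
    }

    wg.Wait()
    fmt.Println("got", counter) // always 200000
}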

06
GO.
Start building concurrent systems.