DISTRIBUTED SYSTEMS — VOL. 01 / ISSUE 04 2026 · STOCKHOLM EDITION

ncbd.dev

A field manual for engineers who refuse the monolith — written in the spare, honest language of Nordic engineering.

[ MESH ] [ RING ] [ TREE ] [ STAR ] [ ALIVE ]
CONTINUE READING

The shape of a network is its first promise.

Before any protocol is chosen, before any byte is serialized, the topology of a distributed system has already decided what it can survive.

A centralized graph is a single point of agreement and a single point of failure. A distributed graph trades elegance for endurance: every node carries part of the truth, and no single node carries all of it. The distinction is not academic -- it is the difference between a system that breaks and one that bends.

We choose mesh when latency is local, ring when ordering matters more than speed, tree when delegation must be obvious, and star only when we have already accepted that the center will fall.
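The trade each shape makes can be counted. A minimal sketch (the helper names are ours, not from any library) of how many links each topology needs to connect n nodes — the mesh pays for its endurance in wiring:

```rust
/// Number of links each topology needs for n nodes.
/// Hypothetical helpers for illustration; a sketch, not a library API.
fn mesh_links(n: usize) -> usize { n * (n - 1) / 2 } // every node knows every node
fn ring_links(n: usize) -> usize { n }               // order preserved by direction
fn tree_links(n: usize) -> usize { n - 1 }           // delegation by depth
fn star_links(n: usize) -> usize { n - 1 }           // efficient until the hub yields

fn main() {
    // A 5-node cluster: the mesh buys survival with quadratic wiring.
    println!("mesh: {}", mesh_links(5)); // 10 links
    println!("ring: {}", ring_links(5)); //  5 links
    println!("tree: {}", tree_links(5)); //  4 links
    println!("star: {}", star_links(5)); //  4 links
}
```

The tree and the star cost the same; the difference is what each is willing to lose.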

Code blocks throughout this developer documentation site are set in Fira Code with ligatures, on code-gray backgrounds, with neon-electric syntax highlighting on language keywords -- a deliberate challenge to the convention that tech docs must be sans-serif throughout.

“Architecture is the politics of who is allowed to fail.”

Mesh — every node knows every node.
Ring — order is preserved by direction.
Tree — delegation by depth.
Star — efficient until the hub yields.
consensus/raft.rs RUST · 1.78
use std::collections::HashMap;

enum Role { Follower, Candidate, Leader }

// Minimal stand-ins so the sketch compiles; the real types carry more.
struct Entry { term: u64, command: Vec<u8> }
struct Peer { addr: String }

struct Node {
    id: u64,
    term: u64,
    role: Role,
    log: Vec<Entry>,
    peers: HashMap<u64, Peer>,
}

impl Node {
    fn tick(&mut self) {
        match self.role {
            Role::Follower  => self.await_heartbeat(),
            Role::Candidate => self.campaign(),
            Role::Leader    => self.heartbeat("i am here"),
        }
    }

    // Stubs for the spread; the real methods carry the RPCs.
    fn await_heartbeat(&mut self) { /* reset election timer on AppendEntries */ }
    fn campaign(&mut self) { self.term += 1; /* solicit votes; majority wins */ }
    fn heartbeat(&mut self, _msg: &str) { /* empty AppendEntries to followers */ }
}

Consensus is the polite name for a fight that ends in a vote. Raft formalizes the fight; Paxos refuses to.

No leader is permanent. That is the entire point.

Consensus protocols are written so that the network can lose anyone — including the one currently in charge — and still agree on what happened next.

A Raft term is a small, finite contract. A node enters the term as a follower, becomes a candidate when it stops hearing from the leader, and is permitted to lead only if a majority signs off. The arithmetic of ⌊n/2⌋ + 1 is unforgiving and merciful in equal measure: it forbids tyranny, and it forbids deadlock.

The code above draws its syntax highlighting from the same palette that runs as accent lines through every spread of this developer documentation site: language keywords in neon electric, strings tinted neon green.

  • Election timeout: 150–300 ms
  • Heartbeat interval: 50 ms
  • Quorum size: ⌊n / 2⌋ + 1
  • Log replication: append-only
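The numbers above fit in a few lines. A sketch with helper names of our own choosing; the randomized timeout is the standard trick for breaking split votes, and a deterministic hash of the node id stands in for a real RNG so the example carries no dependencies:

```rust
use std::time::Duration;

/// Quorum: ⌊n / 2⌋ + 1 — forbids tyranny, forbids deadlock.
fn quorum(n: usize) -> usize { n / 2 + 1 }

/// A randomized election timeout in the 150–300 ms band, so followers
/// don't all become candidates at the same instant.
/// The multiply-and-mod is a stand-in for a random draw.
fn election_timeout(node_id: u64) -> Duration {
    let jitter = node_id.wrapping_mul(2654435761) % 151; // 0..=150
    Duration::from_millis(150 + jitter)
}

fn main() {
    // A 5-node cluster tolerates 2 silent members.
    assert_eq!(quorum(5), 3);
    let t = election_timeout(42);
    assert!(t >= Duration::from_millis(150) && t <= Duration::from_millis(300));
    println!("timeout for node 42: {:?}", t);
}
```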
— INTERLUDE —

A forest is a database that has been running for ten thousand years.

Distributed systems borrow their best ideas from biology. Long before Paxos, the rhizome had already solved partition tolerance.

Rhizome

Lateral roots without a center. Cut any segment — the network keeps growing in every other direction.

// peer-to-peer · gossip protocols

Mycelium

An underground mesh routing nutrients between trees. Bandwidth without billing. Discovery without DNS.

// content-addressable · multicast

Banyan

Aerial roots descending until each one becomes its own trunk. A single tree that becomes a forest in place.

// horizontal scaling · sharding
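The gossip protocols named under the rhizome spread a rumor the way lateral roots spread growth: each informed node pushes to a few peers, round after round. A sketch under our own names; the peer choice is deterministic here so the example stays reproducible, where real implementations pick peers at random:

```rust
use std::collections::HashSet;

/// One round of push gossip: every node that holds the rumor forwards it
/// to `fanout` peers. The `(node + k) % n` choice is a stand-in for a
/// random peer selection.
fn gossip_round(n: usize, fanout: usize, informed: &HashSet<usize>) -> HashSet<usize> {
    let mut next = informed.clone();
    for &node in informed {
        for k in 1..=fanout {
            next.insert((node + k) % n);
        }
    }
    next
}

fn main() {
    // Node 0 starts the rumor; rounds run until all 16 nodes hold it.
    let mut informed: HashSet<usize> = [0].into_iter().collect();
    let mut rounds = 0;
    while informed.len() < 16 {
        informed = gossip_round(16, 2, &informed);
        rounds += 1;
    }
    println!("rumor reached all 16 nodes in {rounds} rounds");
}
```

Cut any segment of the informed set and the remainder keeps spreading — the rhizome's property, restated as a loop.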

Replication is honest about what it cannot promise.

You may have consistency, availability, or partition tolerance. You may pretend you have all three. The pretense is expensive.

Strong consistency demands that every reader sees the same write at the same instant. Eventual consistency relaxes the demand: every reader will, given quiet, converge on the same value. Most production systems live somewhere on that spectrum, and most outages are caused by forgetting exactly where.
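One point on that spectrum is crisply checkable: with N replicas, a read quorum R and a write quorum W must overlap in at least one replica whenever R + W > N, so a read is guaranteed to touch the latest acknowledged write. A sketch, with a function name of our own invention:

```rust
/// Quorum intersection: if R + W > N, every read quorum shares at least
/// one replica with every write quorum, so reads see the latest
/// acknowledged write. Hypothetical helper; a sketch, not a library API.
fn reads_see_latest_write(n: usize, r: usize, w: usize) -> bool {
    r + w > n
}

fn main() {
    // N = 3 replicas:
    assert!(reads_see_latest_write(3, 2, 2));  // overlapping quorums: strong
    assert!(!reads_see_latest_write(3, 1, 1)); // fast, but reads may be stale
    println!("ok");
}
```

Lower R and W and the system answers faster; it simply stops promising that the answer is current.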

The rebellion of distributed thinking is not against centralization itself -- it is against the comfort of assuming a single source of truth where none exists.

// reading mode The spread at the viewport center stays sharp; the rest soften through an IntersectionObserver setting CSS classes. The SVGs above are inline elements -- animations run via offset-path with animateMotion fallbacks, drawn directly for maximum control. Labels track 0.1em of letter-spacing per the spec.

  • Primary write
  • Acknowledged replica
  • Read quorum
01 · Network partition
The wire goes silent. Nodes disagree about who is alive.
CAP · P

02 · Byzantine fault
A node lies, deliberately or otherwise. The protocol must tolerate liars.
PBFT · 3f+1

03 · Cascading retry
Recovery traffic exceeds normal load. The cure becomes the disease.
JITTER · BACKOFF

04 · Split-brain
Two leaders, both convinced. The quorum decides whose memory is real.
QUORUM · FENCE
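The caption on the cascading retry names its own cure: exponential backoff, capped, with jitter, so recovering clients do not return in lockstep. A sketch with helper names of our own; the `seed % ...` stands in for a uniform random draw, which real code would take from a RNG:

```rust
use std::time::Duration;

/// Capped exponential backoff with full jitter — the classic defense
/// against retry storms. `seed` is a stand-in for a random draw so the
/// sketch stays dependency-free.
fn backoff(attempt: u32, base_ms: u64, cap_ms: u64, seed: u64) -> Duration {
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16));
    let ceiling = exp.min(cap_ms);
    // full jitter: uniform in [0, ceiling]
    Duration::from_millis(seed % (ceiling + 1))
}

fn main() {
    for attempt in 0..5 {
        let d = backoff(attempt, 100, 5_000, 12_345 + attempt as u64);
        println!("attempt {attempt}: sleep {:?}", d);
        // the real loop would sleep, retry, and stop on success
    }
}
```

Without the jitter, every client that failed together retries together — and the second wave is the outage.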

Design for the day the wire goes silent.

A distributed system is not a system that does not fail. It is a system whose failure is local and whose recovery is automatic.

Every honest architecture has a list of failures it has chosen to survive, and a list it has chosen to ignore. The list is part of the design — not a footnote, not a runbook, not a slide. It is the design.

Write the list down. Pin it above the desk. Reread it on the day of the incident.

Set in Playfair Display, Inter and Fira Code. Printed on a clean white background, with neon-electric ink.

  • EDITION: 2026 · 04
  • FORMAT: MAGAZINE-SPREAD
  • TOPOLOGY: DISTRIBUTED
  • STATUS: ALIVE

ncbd.dev — Not Centralized, But Distributed. A document that refuses the monolith.