parallengine.com

A computational engine for parallel processing.

Layer 01 / Thread Dispatcher

Dispatching threads across cores

The engine begins at the dispatcher level -- breaking tasks into parallel threads, routing computation across available cores like traffic through a city grid. Each thread is a messenger, each core a destination.

Think of it as rush hour, but every car knows exactly where it's going.

[Diagram: INPUT → DISPATCHER, n-way split → Core 0, Core 1, Core 2, … Core n]
Layer 02 / Shared Memory Bus

Shared memory, shared responsibility

Below the dispatcher lies the shared memory bus -- the common ground where threads exchange data. Cache coherence protocols keep every core in sync, preventing the chaos of stale reads and phantom writes.

It's like a shared fridge in an office. Someone always takes your yogurt.

[Diagram: SHARED MEMORY BUS linking each core's L1 and L2 caches to main memory]
Layer 03 / Execution Pipeline

Pipelined for throughput

The execution pipeline stages instructions in overlapping waves -- fetch, decode, execute, write-back -- each stage handling a different instruction simultaneously. Throughput multiplied, latency masked.

An assembly line where every worker is also the foreman.

[Diagram: FETCH → DECODE → EXECUTE → WRITE-BACK; instruction n and instruction n+1 overlapped]
Layer 04 / Synchronization Barriers

Where threads converge

Synchronization barriers are the rendezvous points -- where threads pause, wait for siblings to catch up, and resume together. Mutexes guard critical sections. Semaphores count available resources. The engine breathes in sync.

Like herding cats, except the cats occasionally deadlock.

[Diagram: threads T0 … T3 arrive at BARRIER, wait_all(), then T0 … T3 resume together]
Layer 05 / The Core

The heart of the engine

At the deepest layer, the arithmetic logic unit hums -- the fundamental gate-level machinery that performs computation. Billions of transistors switching in parallel, the ultimate granularity. This is where electricity becomes logic, and logic becomes result.

Two billion tiny switches, and they never complain about overtime.