where randomness finds its shape.
Begin with a belief. Observe the evidence. Update. The posterior distribution is not a conclusion but a living map of certainty, reshaped with every new observation. Bayes' theorem is the algebra of belief revision -- a formal framework for changing your mind rationally.
Watch as prior distributions (the faint curves) sharpen into posteriors (the bright curves) as data accumulates. Each new data point nudges the curve, and the width narrows -- uncertainty collapsing into knowledge, never fully arriving but always converging.
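The update the curves perform can be sketched in a few lines. This is a minimal illustration using a hypothetical coin-flip model with a Beta prior (the conjugate Beta-Binomial pair); the function and numbers are invented for illustration, not taken from the visualization.

```python
def update_beta(alpha, beta, flips):
    """Update a Beta(alpha, beta) prior with a list of 0/1 coin flips.

    Conjugacy makes Bayes' theorem a simple count: each head adds 1
    to alpha, each tail adds 1 to beta.
    """
    heads = sum(flips)
    return alpha + heads, beta + len(flips) - heads

# Start from a uniform prior Beta(1, 1) and observe 7 heads in 10 flips.
alpha, beta = update_beta(1, 1, [1] * 7 + [0] * 3)

# The posterior mean has shifted from 0.5 toward the data.
posterior_mean = alpha / (alpha + beta)  # 8 / 12 ≈ 0.667
```

As more flips accumulate, alpha + beta grows and the Beta posterior narrows -- the same sharpening the curves above animate.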
"Probability is not about the world. It is about our knowledge of the world."
Roll a die once: chaos. Roll it a thousand times: the average converges to 3.5 with the inexorable certainty of a physical law. The law of large numbers is probability's promise that patience reveals truth -- that the sample mean will, given enough observations, approximate the expected value.
The green trace below charts the running average. Watch it oscillate wildly at first, then settle into a narrow band around the true mean. This is convergence made visible -- the moment when randomness submits to structure.
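The running average the trace draws can be reproduced directly, assuming standard six-sided dice (the function name and seed are illustrative):

```python
import random

def running_means(n_rolls, seed=0):
    """Roll a fair six-sided die n_rolls times and record the running mean."""
    random.seed(seed)
    total = 0.0
    means = []
    for n in range(1, n_rolls + 1):
        total += random.randint(1, 6)  # one die roll
        means.append(total / n)        # mean after n rolls
    return means

means = running_means(10_000)
# Early entries swing between 1 and 6; the tail settles near 3.5,
# the expected value (1 + 2 + 3 + 4 + 5 + 6) / 6.
```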
A system without memory. The next state depends only on the current state, never on how we arrived here. Yet from this simple rule -- this radical forgetting -- emerges complex, structured behavior. Weather patterns, stock prices, the words in this sentence: all can be modeled as chains of probabilistic transitions.
The constellation before you maps five states and their transition probabilities. Arrows carry probability mass from node to node. Click any node to inject a burst of transitions and watch the chain explore its state space.
To measure the unmeasurable, throw darts at random. The fraction that land inside the region approximates its area -- a beautiful trick that converts randomness into measurement. Monte Carlo methods are the computational realization of a deep truth: enough random samples can solve problems that defy analytical solutions.
Click anywhere in the bounded region to release a burst of 200 random samples. Green dots fall inside the circle; warm dots fall outside. Watch the estimate of π converge as the sample size grows.
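The dart-throwing estimate reduces to a few lines: sample points uniformly in the unit square and count how many land inside the inscribed quarter circle. A minimal sketch (function name illustrative):

```python
import random

def estimate_pi(n):
    """Monte Carlo estimate of pi from n uniform samples in the unit square.

    A point (x, y) lies inside the quarter circle when x^2 + y^2 <= 1,
    which happens with probability pi/4 -- so 4 * (fraction inside) -> pi.
    """
    inside = sum(
        1 for _ in range(n)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4 * inside / n

random.seed(42)
pi_hat = estimate_pi(100_000)
```

The error shrinks like 1/sqrt(n), which is why the readout converges quickly at first and then crawls -- the same behavior the dots above make visible.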
Probability is the hidden architecture of the world. Every measurement, every prediction, every decision rests upon its foundations. What you have witnessed here -- particles falling into order, averages converging, random points measuring circles -- are not simulations of probability. They are probability itself, made visible.
bability.pro