The Architecture of Chance

The Branching Paths

The Law of Large Numbers

As the number of trials increases, the sample mean converges to the expected value. Flip a coin once and you have chaos. Flip it a thousand times and you have certainty. The beauty of probability lies in this transformation from the unpredictable to the inevitable.
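
The convergence is easy to watch in simulation; a minimal sketch in Python (function name and seed are illustrative):

```python
import random

def fraction_of_heads(n_flips, seed=0):
    """Flip a fair coin n_flips times; return the observed fraction of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips
```

Ten flips can land almost anywhere; a hundred thousand flips pin the fraction close to 0.5.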

Random Variables

A random variable is not a variable and is not random. It is a function from outcomes to numbers, a translator between the language of events and the language of mathematics.
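
The definition becomes concrete on a toy sample space; the two-flip example below is my own illustration:

```python
from collections import Counter

# sample space for two coin flips
omega = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]

def X(outcome):
    """A random variable: a function from outcomes to numbers (here, head count)."""
    return sum(1 for flip in outcome if flip == "H")

# pushing the uniform measure on omega through X yields X's distribution
counts = Counter(X(w) for w in omega)
pmf = {value: n / len(omega) for value, n in counts.items()}
```

Nothing about `X` is random: it is a fixed function. The randomness lives entirely in which outcome of `omega` occurs.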

Conditional Probability

P(A|B) changes everything. The probability of rain given clouds is not the probability of clouds given rain. Bayesian reasoning inverts our assumptions, forcing us to update beliefs in light of evidence.

P(A|B) = P(B|A) · P(A) / P(B)
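
The formula translates directly into code; the rain-and-clouds numbers below are invented for illustration:

```python
def bayes(p_b_given_a, p_a, p_b):
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# hypothetical numbers: P(clouds|rain)=0.9, P(rain)=0.1, P(clouds)=0.4
p_rain_given_clouds = bayes(0.9, 0.1, 0.4)   # 0.225
```

Note the asymmetry: P(clouds | rain) was 0.9, yet P(rain | clouds) comes out at only 0.225.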

Expected Value

The weighted average of all possible outcomes. Not a prediction for any single trial, but the truth that emerges from repetition. The expected value of a fair die is 3.5 -- a number the die can never show.
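
A sketch of the computation for a discrete distribution (the helper name is mine):

```python
def expected_value(pmf):
    """Weighted average of outcomes: sum of x * p(x) over the distribution."""
    return sum(x * p for x, p in pmf.items())

fair_die = {face: 1 / 6 for face in range(1, 7)}
expected_value(fair_die)   # ≈ 3.5, a face the die can never show
```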

The Accumulation

Central Limit Theorem

The most beautiful theorem in probability. Take any distribution -- skewed, bimodal, chaotic -- and average enough samples. The distribution of those averages will be Gaussian. Order emerges from any chaos, given enough repetitions.
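
The theorem is easy to check empirically; the sketch below averages draws from a heavily skewed exponential distribution (sample sizes and seed are arbitrary):

```python
import random
import statistics

def sample_means(n_means=2000, n_per_mean=50, seed=1):
    """Average n_per_mean draws from a skewed distribution, n_means times."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.expovariate(1.0) for _ in range(n_per_mean))
            for _ in range(n_means)]
```

The individual draws are skewed, but a histogram of `sample_means()` is already close to a Gaussian centered at 1 with standard deviation near 1/√50.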

Variance

The average squared distance from the mean. Variance measures the spread of outcomes, the width of the bell curve, the uncertainty that remains after expectation has been accounted for.

Var(X) = E[(X - μ)²]
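
In code, the definition reads almost verbatim (helper name mine):

```python
def variance(pmf):
    """Var(X) = E[(X - mu)^2]: the average squared distance from the mean."""
    mu = sum(x * p for x, p in pmf.items())
    return sum(p * (x - mu) ** 2 for x, p in pmf.items())

fair_die = {face: 1 / 6 for face in range(1, 7)}
variance(fair_die)   # 35/12 ≈ 2.92
```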

Independence

Two events are independent when knowing one tells you nothing about the other. The coin does not remember its last flip. The die does not owe you a six.
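
The defining identity P(A∩B) = P(A)·P(B) can be verified exactly on a finite sample space; the two-dice events below are my own example:

```python
from itertools import product

# two fair dice: the product sample space under the uniform measure
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if w[0] == 6}        # first die shows a six
B = {w for w in omega if w[1] % 2 == 0}    # second die shows an even face

def p(event):
    return len(event) / len(omega)

independent = abs(p(A & B) - p(A) * p(B)) < 1e-12   # True
```

Knowing the first die shows a six changes nothing about the second die's parity: the product of the marginals equals the joint.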

Monte Carlo Methods

When analysis fails, simulation succeeds. Throw enough random darts at a circle inscribed in a square and you can compute pi. Named for the casino where chance is the house specialty.
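
The dart-throwing estimate of pi takes only a few lines (seed and dart count are arbitrary):

```python
import random

def estimate_pi(n_darts, seed=42):
    """The fraction of random darts in the unit square that land inside the
    inscribed quarter circle approximates pi / 4."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n_darts))
    return 4 * inside / n_darts
```

The error shrinks like 1/√n: slow, but indifferent to dimension, which is why the method scales where analysis cannot.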

Markov Chains

The future depends only on the present, not the past. A memoryless walk through state space, where each step is determined by transition probabilities. The drunk man's walk home becomes a mathematical object of profound utility.
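
A minimal sketch of such a walk, using an invented two-state weather chain (the transition numbers are illustrative, not from the text):

```python
import random

# transition probabilities: the next state depends only on the current one
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def walk(start, steps, seed=0):
    """A memoryless walk through state space."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        states, weights = zip(*TRANSITIONS[state])
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path
```

Run long enough, the walk forgets where it started: this particular chain spends about two thirds of its time sunny regardless of the initial state.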

Bayesian Inference

Prior beliefs, updated by evidence, yield posterior knowledge. Bayes' theorem is not merely a formula; it is an epistemology. It describes how rational agents should change their minds.

posterior ∝ likelihood × prior
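
A grid-approximation sketch of the update, inferring a coin's bias (the flat prior and the 7-heads-in-10 data are invented):

```python
# candidate values for the coin's unknown P(heads)
grid = [i / 100 for i in range(101)]
prior = [1 / len(grid)] * len(grid)          # flat prior: no initial opinion
heads, tails = 7, 3                          # the observed evidence

likelihood = [p ** heads * (1 - p) ** tails for p in grid]
unnormalized = [lk * pr for lk, pr in zip(likelihood, prior)]
posterior = [u / sum(unnormalized) for u in unnormalized]

# the posterior peaks at the bias the evidence favors
map_estimate = grid[max(range(len(grid)), key=posterior.__getitem__)]   # 0.7
```

The proportionality in the formula is resolved by the final normalization: dividing by the sum makes the posterior a distribution again.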

Stochastic Processes

Random phenomena evolving through time. Stock prices, particle diffusion, population dynamics -- all dance to the rhythm of stochastic differential equations.
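
The simplest such process is the symmetric random walk, the discrete ancestor of Brownian motion; a sketch:

```python
import random

def random_walk(steps, seed=0):
    """Symmetric random walk: at each tick, step +1 or -1 with equal probability."""
    rng = random.Random(seed)
    position, path = 0, [0]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path
```

Rescale time and step size appropriately and this walk converges to Brownian motion, the driving noise in most stochastic differential equations.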

Entropy

The measure of uncertainty, the average surprise. A fair coin has maximum entropy for a binary outcome. A loaded die has less. Information theory and probability theory are two faces of the same truth.

H(X) = -Σ p(x) log p(x)
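
The formula in code, with the usual convention that 0 · log 0 = 0 (helper name mine; log base 2 gives bits):

```python
import math

def entropy(probs):
    """H(X) = -sum of p(x) * log2 p(x), in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])                         # 1 bit: the fair coin, maximum for binary
entropy([1 / 6] * 6)                        # ≈ 2.585 bits: the fair die
entropy([0.5, 0.1, 0.1, 0.1, 0.1, 0.1])    # less: the loaded die is more predictable
```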

The Distribution

A walk along the Gaussian curve, from −3σ to +3σ.

At two standard deviations below the mean, the tail has already thinned: only 2.28% of outcomes lie further out.

Approaching the peak, we reach familiar territory: roughly 68% of all outcomes fall within one standard deviation of the mean.

At μ, the mean, the mode, and the median coincide. In a perfect Gaussian all three converge on this single point of maximum density. This is where most stories end.

Past the peak comes the mirror. The Gaussian is symmetric: what was true on the left is true on the right. Beauty in bilateral balance.

Descending toward +2σ and +3σ, certainty falls away again. The density decays as exp(−x²/2σ²), faster with every step: about 61% of the peak at one standard deviation, 14% at two, barely 1% at three.