P(x) = The mathematics of chance
The normal (Gaussian) distribution is the most fundamental in probability theory. From heights in populations to measurement errors, nature converges toward the Gaussian. The central limit theorem explains why: sum enough independent random variables with finite variance and normality emerges.
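A minimal sketch of the central limit theorem in action: averaging many independent uniform draws yields sample means that cluster normally. The sample sizes and seed are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(0)

def sample_mean(n_terms: int) -> float:
    """Mean of n_terms independent Uniform(0, 1) draws."""
    return sum(random.random() for _ in range(n_terms)) / n_terms

# Uniform(0, 1) has mean 1/2 and variance 1/12; the CLT predicts the
# sample means cluster near 1/2 with spread sqrt(1 / (12 * 30)) ≈ 0.053.
means = [sample_mean(30) for _ in range(10_000)]

print(round(statistics.mean(means), 2))   # close to 0.5
print(round(statistics.stdev(means), 2))  # close to 0.05
```

A histogram of `means` would show the familiar bell curve, even though each underlying draw is uniform, not Gaussian.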
Prior knowledge meets new evidence. Bayes' theorem transforms uncertainty into refined understanding. Every observation shifts the posterior, concentrating probability mass on the hypotheses most consistent with the data.
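The update rule can be sketched with a hypothetical biased-coin example: three candidate values for the probability of heads, a uniform prior, and a posterior recomputed after each flip (posterior ∝ likelihood × prior). The hypotheses and flip sequence are invented for illustration.

```python
hypotheses = [0.3, 0.5, 0.7]   # candidate values of P(heads)
prior = [1 / 3, 1 / 3, 1 / 3]  # uniform prior over the hypotheses

def update(prior, heads: bool):
    """One application of Bayes' theorem: posterior ∝ likelihood × prior."""
    likelihood = [p if heads else 1 - p for p in hypotheses]
    unnorm = [lk * pr for lk, pr in zip(likelihood, prior)]
    total = sum(unnorm)  # the evidence, P(data)
    return [u / total for u in unnorm]

posterior = prior
for flip in [True, True, False, True, True]:  # 4 heads, 1 tails
    posterior = update(posterior, flip)

print([round(p, 3) for p in posterior])  # mass shifts toward P(heads) = 0.7
```

After five flips most of the probability mass sits on the 0.7 hypothesis, exactly the narrowing the paragraph describes.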
Information entropy quantifies the average surprise in a random variable. Maximum entropy at the uniform distribution; minimum at certainty. Shannon's formula bridges probability and information, revealing the fundamental limits of compression and communication.
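Shannon's formula, H(X) = −Σ p(x) log₂ p(x), is a few lines of code. A small sketch showing the two extremes the paragraph names, plus a skewed case in between:

```python
import math

def entropy(probs) -> float:
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25] * 4))            # uniform over 4 outcomes: maximal, 2.0 bits
print(entropy([1.0]))                 # certainty: minimal, 0.0 bits
print(round(entropy([0.9, 0.1]), 3))  # skewed coin: between the extremes
```

The 2.0-bit figure is also the compression limit: no lossless code can average fewer than 2 bits per symbol for four equally likely outcomes.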
From Brownian motion to stock prices, stochastic processes model systems evolving randomly over time. Each step is uncertain, yet the aggregate behavior reveals deep mathematical structure. For processes with the Markov property, the future depends only on the present state, not on the path taken to reach it.
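The simplest Markov process is a symmetric random walk on the integers: each tick the position moves ±1 with equal probability, and the next state depends only on the current one. A minimal sketch, with an arbitrary seed and step count:

```python
import random

random.seed(1)

def random_walk(steps: int) -> list[int]:
    """Positions of a walk starting at 0, stepping ±1 each tick."""
    position, path = 0, [0]
    for _ in range(steps):
        # Markov property: the transition uses only `position`,
        # never the earlier history stored in `path`.
        position += random.choice([-1, 1])
        path.append(position)
    return path

path = random_walk(100)
print(path[-1])  # final position; E[position] = 0, Var[position] = steps
```

Averaged over many such walks, the endpoint distribution is approximately normal, tying the Markov sketch back to the central limit theorem above.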