확률 · 確率 · probability
The bell curve emerges from the sum of countless independent random variables. As sample size grows, the distribution of means converges to this fundamental shape -- the central limit theorem made visible in every particle above.
P(x) = (1/(σ√(2π))) · e^(-(x-μ)²/(2σ²))
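The convergence can be watched numerically as well as visually. A minimal Python sketch, assuming Uniform(0, 1) particles and illustrative names like `sample_mean`:

```python
import random
from statistics import stdev

random.seed(0)

def sample_mean(n):
    """Mean of n independent Uniform(0, 1) draws."""
    return sum(random.random() for _ in range(n)) / n

# Each entry is the mean of 100 draws. By the central limit theorem
# these means pile up around mu = 0.5 in an approximately normal bell.
means = [sample_mean(100) for _ in range(10_000)]
grand_mean = sum(means) / len(means)
spread = stdev(means)  # roughly sigma/sqrt(n) = sqrt(1/12)/10, about 0.029
```

The individual draws are flat and featureless; only their averages trace the bell.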
Prior beliefs meet new evidence. The posterior probability updates what we thought we knew, integrating observation with expectation. Each particle carries a weighted history of its past positions -- a Bayesian memory of where it has been.
P(A|B) = P(B|A) · P(A) / P(B)
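The update rule fits in a few lines. A sketch in Python, with P(B) expanded by the law of total probability; the diagnostic-test numbers are hypothetical, chosen only to show how a strong test meets a weak prior:

```python
def posterior(p_b_given_a, p_a, p_b_given_not_a):
    """Bayes' rule: P(A|B) = P(B|A) · P(A) / P(B),
    where P(B) = P(B|A)·P(A) + P(B|¬A)·P(¬A)."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Hypothetical test: 1% prior, 99% sensitivity, 5% false-positive rate.
# Even a positive result leaves the posterior at only 1/6.
p = posterior(0.99, 0.01, 0.05)
```

The prior drags the answer down: evidence updates belief, it does not replace it.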
The long-run average of repetitions of an experiment. Where probability meets prediction -- the value the running average gravitates toward as trials approach infinity. Watch the particles above: their average position converges, even as individuals wander.
E[X] = Σ xᵢ · P(xᵢ)
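The sum above translates directly into code. A minimal sketch over a discrete distribution, using a fair die as the worked example:

```python
def expectation(dist):
    """E[X] = sum of x · P(x) over a discrete distribution
    given as a {value: probability} mapping."""
    return sum(x * p for x, p in dist.items())

# A fair six-sided die: E[X] = (1 + 2 + ... + 6) / 6 = 3.5,
# a value no single roll can ever land on.
die = {x: 1 / 6 for x in range(1, 7)}
```

The expectation is a center of mass, not a possible outcome.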
A measure of uncertainty -- the average surprise in a random variable. High entropy means maximum unpredictability, a uniform scattering across all states. Low entropy reveals order hidden within apparent chaos.
H(X) = -Σ P(xᵢ) · log₂ P(xᵢ)
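The two extremes in the caption can be checked directly. A sketch of Shannon entropy in Python, with the conventional 0 · log 0 = 0 handled by skipping zero-probability states:

```python
from math import log2

def entropy(probs):
    """H(X) = -sum of p · log2(p), in bits.
    Terms with p = 0 contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Uniform over four states: maximum unpredictability, 2 bits.
# A certain outcome: zero bits of surprise.
```

Any departure from the uniform scattering lowers H: order is literally measurable.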
As the number of trials increases, the sample average converges to the expected value. Randomness submits to regularity. The chaotic dance of individual particles resolves into the smooth curve of statistical certainty.
lim(n→∞) X̄ₙ = μ (almost surely)
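The chaotic-to-smooth transition can be traced with a running mean. A sketch assuming fair coin flips, with `running_means` an illustrative name:

```python
import random

random.seed(1)

def running_means(xs):
    """Yield the average of the first i values, for i = 1..len(xs)."""
    total = 0.0
    for i, x in enumerate(xs, start=1):
        total += x
        yield total / i

# 100,000 fair coin flips: the running mean wanders wildly early on,
# then settles ever more tightly onto mu = 0.5.
flips = [1 if random.random() < 0.5 else 0 for _ in range(100_000)]
means = list(running_means(flips))
```

Early entries fluctuate; late entries barely move. That narrowing is the law of large numbers.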
Random sampling to solve deterministic problems. Cast enough points into the void and patterns emerge -- compute integrals, optimize functions, simulate the future. Every particle on this page is a Monte Carlo sample, painting probability one dot at a time.
π ≈ 4 · (hits inside circle / total throws)
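The dart-throwing estimate above is a few lines of Python. A sketch that samples the unit square and counts hits inside the quarter-circle of radius 1:

```python
import random

random.seed(2)

def estimate_pi(throws):
    """pi ~ 4 · (hits inside the unit quarter-circle / total throws).
    A point (x, y) in [0,1)² is a hit when x² + y² <= 1."""
    hits = sum(
        1 for _ in range(throws)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4 * hits / throws
```

The error shrinks like 1/√n: a million throws pin π down to a couple of decimal places, no calculus required.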
Scroll to reshape the distribution. Watch as the mean shifts and variance narrows -- randomness submitting to the observer's will.
In the limit, chaos yields to order. The law of large numbers prevails.
확률은 수렴한다 · probability converges