P(A|B) = P(B|A) · P(A) / P(B)
σ² = E[X²] − (E[X])²
f(x) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²))
E[X] = Σ x·P(x)
Var(X̄) = σ²/n
P(A∪B) = P(A) + P(B) − P(A∩B)
CLT: X̄ → N(μ, σ²/n)

bability.pro

where uncertainty becomes beautiful

Central Limit Theorem

The mean of many independent, identically distributed random variables converges to a normal distribution
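A rough sketch of the idea behind the demo (standard library only; the uniform distribution and sample sizes are illustrative choices, not taken from the page): averaging many uniform draws yields sample means that cluster around μ with spread σ/√n.

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

def sample_mean(n):
    """Mean of n draws from Uniform(0, 1)."""
    return sum(random.random() for _ in range(n)) / n

# Uniform(0, 1) has mu = 0.5 and sigma^2 = 1/12, so by the CLT the
# sample mean of n draws is approximately N(0.5, 1/(12n)).
n = 100
means = [sample_mean(n) for _ in range(10_000)]

print(round(statistics.mean(means), 3))   # close to 0.5
print(round(statistics.stdev(means), 3))  # close to sqrt(1/1200) ≈ 0.029
```

Any distribution with finite variance would do in place of Uniform(0, 1); that universality is the point of the theorem.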

Bayes' Theorem

Click to add data points and watch the posterior update

P(A|B) = P(B|A) · P(A) / P(B)
Prior → Posterior
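The demo updates a posterior as data points arrive. A minimal sketch of the same mechanism, assuming a Beta prior on a Bernoulli success probability (the conjugate pairing, chosen here for illustration; the page does not specify its model), where each observation just increments a count:

```python
# Beta(a, b) prior on the success probability p of a Bernoulli process.
# Conjugacy makes the Bayesian update trivial: a success increments a,
# a failure increments b, and the result is again a Beta distribution.

def update(a, b, observation):
    """Posterior Beta parameters after one 0/1 observation."""
    return (a + 1, b) if observation else (a, b + 1)

a, b = 1, 1  # uniform prior: Beta(1, 1)
data = [1, 1, 0, 1, 0, 1, 1]  # hypothetical observations: 5 successes, 2 failures

for x in data:
    a, b = update(a, b, x)

print(a, b)                   # Beta(6, 3)
print(round(a / (a + b), 3))  # posterior mean 6/9 ≈ 0.667
```

Each click in the demo plays the role of one observation: the prior for the next point is the posterior after the last.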

Monte Carlo

Estimating π through random sampling

Live counters track the points sampled, how many fall inside the circle, and the running estimate of π.
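The estimator behind this demo can be sketched in a few lines (standard library only; the sample size is an arbitrary choice): sample points uniformly in the unit square, count the fraction landing inside the quarter circle, and scale by 4.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def estimate_pi(n):
    """Estimate pi by sampling n points in the unit square and counting
    how many satisfy x^2 + y^2 <= 1 (the quarter circle)."""
    inside = sum(
        1 for _ in range(n)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    # The quarter circle covers pi/4 of the unit square's area,
    # so inside/n ≈ pi/4.
    return 4 * inside / n

print(estimate_pi(100_000))  # close to 3.14159
```

The error shrinks like 1/√n, which is why the on-page estimate wanders noticeably at first and settles only slowly as points accumulate.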