Probability

hwaklyul.com

Every Moment Is a Branch

You are standing at the intersection of countless possible futures. Each decision you make — each coin you flip, each step you take — collapses an infinity of alternatives into a single realized outcome.

P(you reading this) = 1.0

The Language of Chance

Probability is not about predicting the future — it is about quantifying uncertainty. When we say an event has probability 0.73, we are not prophesying. We are measuring the weight of our ignorance.

Possibility is humility expressed in numbers

The Normal Distribution

The bell curve emerges everywhere — heights, test scores, measurement errors, stock fluctuations. It is nature's favorite shape, the inevitable consequence of aggregating countless small random influences.

σ = 1, μ = 0, n = 30
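
The mechanism behind this is the central limit theorem. A minimal Python sketch (my addition; the uniform(-1, 1) influences are an illustrative assumption, not something the page specifies) shows sums of small random effects settling into the bell:

```python
import random

# Sum n small, independent random influences per sample; the totals
# pile up into a bell curve. n = 30 echoes the figure parameters above.
n, samples = 30, 10_000
totals = [sum(random.uniform(-1, 1) for _ in range(n)) for _ in range(samples)]

mean = sum(totals) / samples
var = sum((t - mean) ** 2 for t in totals) / samples
print(f"mean ≈ {mean:.3f}, variance ≈ {var:.3f}")  # near 0 and near n/3
```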

Favorable Outcome

The coin lands heads. You win the wager. The distribution favors your hypothesis. In this branch of the multiverse, luck smiled.

P = 0.62

The branch of good fortune

Unfavorable Outcome

The coin lands tails. The null hypothesis stands. In this branch, the data tells a different story — one where the expected did not materialize.

P = 0.38

The branch of misfortune

Decision Trees

Every choice is a node. Every outcome, a branch. The decision tree is not a metaphor — it is the literal architecture of rational choice under uncertainty.

[Decision tree: root node A; branches split 72%/28%, then 65%/35% and 40%/60%]
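
As a sketch of that architecture, the rollup of expected value through such a tree takes a few lines of Python. The payoffs here are hypothetical; the branch probabilities echo the diagram above:

```python
# A node is either a leaf payoff or a list of (probability, subtree)
# branches. Expected value rolls up recursively from the leaves.
def expected_value(node):
    if isinstance(node, (int, float)):   # leaf: a realized payoff
        return node
    return sum(p * expected_value(child) for p, child in node)

# Hypothetical payoffs; probabilities mirror the figure (72/28, 65/35, 40/60).
tree = [
    (0.72, [(0.65, 100), (0.35, -20)]),
    (0.28, [(0.40, 50), (0.60, 0)]),
]
print(expected_value(tree))  # 47.36: the tree's worth before any branch resolves
```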

The Monty Hall Paradox

You pick a door. The host opens another, revealing a goat. Should you switch? Intuition screams no. Mathematics whispers yes. Your first pick wins with probability 1/3; the host's reveal concentrates the remaining 2/3 on the one unopened door. Switching gives you a 2/3 chance of winning.

Perhaps the most counterintuitive result in all of probability theory — a reminder that our brains are terrible probability calculators.

Door 1: 1/3 (your original pick)
Door 2: 🐐 (opened by the host)
Door 3: 2/3 (the switch)
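
Simulation makes the 2/3 tangible where intuition fails. A short Python sketch, my addition:

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car, pick = random.randrange(3), random.randrange(3)
        # Host opens a door hiding a goat that is not your pick.
        opened = next(d for d in range(3) if d != car and d != pick)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(f"stay:   {play(False):.3f}")  # ≈ 1/3
print(f"switch: {play(True):.3f}")   # ≈ 2/3
```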

The Random Walk

A particle steps left or right with equal probability. After a thousand steps, where does it end up? The random walk is the simplest model of diffusion, stock prices, and the wandering drunk.

E[X] = 0, Var(X) = n
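A sketch of many such walks, in Python (my addition), confirms the moments quoted above:

```python
import random

# Each walk: n steps of ±1 with equal probability. The mean final
# position stays near 0 while the variance grows like n.
n, walks = 1_000, 5_000
finals = [sum(random.choice((-1, 1)) for _ in range(n)) for _ in range(walks)]

mean = sum(finals) / walks
var = sum((x - mean) ** 2 for x in finals) / walks
print(f"E[X] ≈ {mean:.2f}, Var(X) ≈ {var:.0f}")  # near 0, near n
```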

Frequentist View

Probability is the long-run frequency of events. Flip a coin ten thousand times; heads will occur approximately 50% of the time. No beliefs involved. Only counts.

Frequentism: the logic of repetition

Bayesian View

Probability is a degree of belief, updated by evidence. Before seeing data, you have a prior. After seeing data, you have a posterior. Knowledge is always conditional.

Bayesianism: the updating of belief
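
For a coin of unknown bias, the update has a closed form: the Beta-Binomial conjugate pair. A minimal sketch; the uniform prior and the 7-heads, 3-tails data are assumptions for illustration:

```python
# Prior Beta(1, 1) is uniform: no opinion about the coin's bias yet.
prior_a, prior_b = 1, 1
heads, tails = 7, 3                    # the observed evidence

# Conjugacy: the posterior is again a Beta, shifted by the counts.
post_a, post_b = prior_a + heads, prior_b + tails
print(f"posterior: Beta({post_a}, {post_b})")
print(f"posterior mean: {post_a / (post_a + post_b):.3f}")  # ≈ 0.667
```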

Law of Large Numbers

As the number of trials approaches infinity, the sample average converges to the expected value. Chaos in the individual. Order in the aggregate. This is the central miracle of probability.

X̄ₙ → μ as n → ∞

The law of large numbers: from chaos to order
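
A running average of fair-coin flips shows the convergence directly. A Python sketch, my addition:

```python
import random

# The running average of fair-coin flips drifts, then settles on 0.5.
flips = total = 0
for n in (10, 100, 1_000, 10_000, 100_000):
    while flips < n:
        total += random.random() < 0.5   # 1 for heads, 0 for tails
        flips += 1
    print(f"n = {n:>6}: average = {total / n:.4f}")
```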

Entropy and Information

Shannon's entropy measures surprise. A fair coin has maximum entropy — you cannot predict it. A loaded coin has less entropy — partial certainty reduces information content.

H(X) = −Σ p(x) log p(x)

The more uncertain an outcome, the more information you gain when it is revealed. Certainty is boring. Uncertainty is where meaning lives.

Entropy of the binomial distribution
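
For a single flip (the binomial with n = 1), the formula reduces to two terms. A small Python sketch, my addition:

```python
from math import log2

# Shannon entropy, in bits, of a coin with heads-probability p.
def entropy(p):
    if p in (0.0, 1.0):
        return 0.0                       # certainty: no surprise at all
    return -(p * log2(p) + (1 - p) * log2(1 - p))

for p in (0.5, 0.7, 0.9, 0.99):
    print(f"p = {p:.2f}: H = {entropy(p):.3f} bits")
# The fair coin gives 1 bit, the maximum; loading the coin drains the surprise.
```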

Known Unknowns

We know we do not know the outcome of the next dice roll. The probability space is defined. The sample space is finite. Uncertainty is quantified.

Calculable

Unknown Unknowns

Black swans. Events so improbable they were not even in our model. No probability distribution can capture what we have not imagined.

Undefined