The engine of rational belief revision. When new evidence arrives, your prior probability transforms into a posterior — not by gut feeling, but by the precise machinery of conditional probability. P(H|E) = P(E|H) · P(H) / P(E).
This is not merely a formula. It is an instruction manual for changing your mind correctly. Every piece of evidence you encounter is a signal, and Bayes tells you exactly how much to update, how strongly to shift your confidence, and when the evidence is speaking louder than your assumptions.
The probability tree branches outward from prior to posterior, each fork a decision point where reality winnows the actual from the possible. To be Bayesian is to hold all beliefs provisionally, ready for revision at the arrival of the next datum.
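The update rule above can be sketched in a few lines. This is a minimal illustration with hypothetical numbers (a 5% prior, a test that is 90% sensitive with a 10% false-positive rate); P(E) is expanded by the law of total probability.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' rule.

    P(E) is computed by total probability:
    P(E) = P(E|H)P(H) + P(E|~H)P(~H).
    """
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical numbers for illustration: even strong evidence
# leaves a low-prior hypothesis at modest posterior probability.
posterior = bayes_update(prior=0.05, p_e_given_h=0.90, p_e_given_not_h=0.10)
print(posterior)  # ≈ 0.321
```

Note how the posterior remains around 32% despite the "positive test": when the prior is small, the false positives from the large ~H population dominate P(E). This is the base-rate structure the formula enforces.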
The systematic errors woven into the architecture of human thought. Not random noise — structured distortions, predictable departures from rationality that evolution installed as heuristics for survival and that now betray us in the modern world of complex decisions.
Anchoring. Availability. Confirmation. The conjunction fallacy. The sunk cost trap. Each bias is a shortcut that once served our ancestors on the savanna and now leads us astray in boardrooms and ballot boxes. To name them is the first step toward defusing them.
The wheel of distortions turns endlessly, and awareness alone is not inoculation. But mapping the territory of our own unreliability is the beginning of epistemic humility — the foundation upon which all genuine rationality is built.
When the variables multiply and intuition falters, the matrix becomes your compass. A decision matrix forces clarity by decomposing a complex choice into its constituent dimensions — impact, cost, risk, reversibility — and scoring each option against each axis.
The power is not in the numbers themselves but in the act of decomposition. By breaking an overwhelming decision into weighted criteria, you externalize the reasoning process, making it visible, auditable, and correctable. The matrix is a mirror held up to your own thinking.
Every cell in the grid is a small act of honesty: how much do I really value this? How much risk am I actually willing to bear? The completed matrix often surprises its creator, revealing preferences that were always there but never articulated.
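The decomposition described above is straightforward to mechanize. The options, weights, and scores below are hypothetical placeholders; the point is the structure, not the numbers.

```python
# Hypothetical criteria weights (must sum to 1) and 1-5 scores per option.
criteria = {"impact": 0.4, "cost": 0.2, "risk": 0.2, "reversibility": 0.2}

options = {
    "Option A": {"impact": 5, "cost": 2, "risk": 3, "reversibility": 2},
    "Option B": {"impact": 3, "cost": 4, "risk": 4, "reversibility": 5},
}

def weighted_score(scores, weights):
    """Sum each criterion's score times its weight."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank options by total weighted score, best first.
ranked = sorted(options, key=lambda o: weighted_score(options[o], criteria),
                reverse=True)
print(ranked)  # ['Option B', 'Option A']
```

Here Option A dominates on impact alone, but once cost, risk, and reversibility are weighed in, Option B comes out ahead (3.8 vs 3.4) — exactly the kind of surprise the paragraph above describes.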
The study of knowledge itself — not what we know, but how we know, and whether "knowing" is even the right word. Epistemics is rationality turned inward, the telescope pointed at the eye that looks through it.
What counts as justification? When does belief become knowledge? Can we ever be certain, or is certainty itself a cognitive bias — the most seductive illusion of all? These questions have occupied philosophers for millennia, and they remain unsettled because the terrain is genuinely difficult.
The epistemic toolkit includes credence levels (degrees of belief between 0 and 1), calibration (are your 90% confidences right 90% of the time?), and the crucial distinction between object-level claims and meta-level claims about how to evaluate claims. To think about epistemics is to climb a ladder while examining the rungs beneath your feet.
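The calibration check mentioned above — are your 90% confidences right 90% of the time? — can be run against a forecast log. The log entries below are invented for illustration; each pair is (stated confidence, whether the claim turned out true).

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group (confidence, correct) pairs into confidence buckets and
    return the observed hit rate for each bucket."""
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[round(confidence, 1)].append(correct)
    return {c: sum(v) / len(v) for c, v in sorted(buckets.items())}

# Hypothetical forecast log.
log = [(0.9, True), (0.9, True), (0.9, False), (0.9, True),
       (0.6, True), (0.6, False)]

report = calibration_report(log)
print(report)  # {0.6: 0.5, 0.9: 0.75}
```

A well-calibrated forecaster's 0.9 bucket would hit about 90% of the time; here it hits 75%, a sign of overconfidence, while the 0.6 bucket (50% observed) is close to calibrated.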
The opposite of a strawman: take your opponent's argument and make it stronger. Find the most compelling version of the position you disagree with. Repair its logical gaps, supply its missing evidence, articulate its assumptions more clearly than its proponent did.
Only then — only when you have built the strongest possible version of the argument you oppose — do you earn the right to critique it. This is the discipline at the heart of rational discourse: the commitment to engaging with ideas at their best, not their weakest.
Steelmanning is not generosity. It is strategy. When you defeat the strongest version of an argument, your victory is real. When you defeat a strawman, you have defeated only yourself, and everyone watching knows it.
The calculus of belief revision under uncertainty.
The tendency to seek evidence that supports existing beliefs.
Among competing hypotheses, prefer the simplest.
If P then Q. P. Therefore Q.
A claim must be testable to be meaningful.
Aligning confidence levels with actual frequencies of correctness.
The probability-weighted average of all possible outcomes.
Engaging with the strongest form of an opposing argument.
Acknowledging the limits and fallibility of one's own knowledge.
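The expected-value definition above reduces to one line of arithmetic. The gamble below is hypothetical: a 30% chance of gaining 100 and a 70% chance of losing 20.

```python
# Hypothetical gamble: list of (probability, payoff) pairs; probabilities sum to 1.
outcomes = [(0.30, 100.0), (0.70, -20.0)]

# Probability-weighted average of all possible outcomes.
expected_value = sum(p * x for p, x in outcomes)
print(expected_value)  # 0.3*100 - 0.7*20 = 16.0
```

A positive expected value (16.0 here) says the gamble is favorable on average, even though the most likely single outcome is a loss.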