A geometric theory of gravitation unifying space, time, and matter. Gravity is the curvature of spacetime caused by mass and energy — not a force, but geometry.
Spacetime
Gravity
Tensor Calculus
Einstein's field equations Gμν + Λgμν = (8πG/c⁴)Tμν describe how the geometry of spacetime is shaped by energy and momentum. Key predictions — gravitational lensing, time dilation, frame dragging, black holes, and gravitational waves — have all been confirmed experimentally. General Relativity forms the foundation of modern cosmology and supplies the relativistic clock corrections that GPS satellite systems depend on.
The theory supersedes Newtonian gravity, predicting deviations in strong gravitational fields. Its reconciliation with quantum mechanics remains one of the great open problems in physics.
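The GPS claim can be checked numerically. A rough weak-field sketch (standard SI constants, approximate orbital radius for GPS at ~20,200 km altitude) combines the gravitational blueshift from sitting higher in Earth's potential with the special-relativistic slowing from orbital speed:

```python
import math

# Physical constants and parameters (SI units, approximate values)
G = 6.674e-11        # gravitational constant
M = 5.972e24         # Earth mass, kg
c = 2.998e8          # speed of light, m/s
R_earth = 6.371e6    # Earth radius, m
r_gps = 2.6571e7     # GPS orbital radius (~20,200 km altitude), m

# Gravitational time dilation: satellite clocks sit higher in the
# potential well, so they run faster than ground clocks.
grav = G * M / c**2 * (1 / R_earth - 1 / r_gps)

# Special-relativistic slowing from orbital velocity v = sqrt(GM/r):
v = math.sqrt(G * M / r_gps)
kinematic = -v**2 / (2 * c**2)

net = grav + kinematic                 # net fractional rate difference
drift_us_per_day = net * 86400 * 1e6   # microseconds gained per day

print(f"net fractional shift: {net:.3e}")
print(f"clock drift: {drift_us_per_day:.1f} microseconds/day")
```

The result lands near the well-known figure of roughly +38 microseconds per day, which uncorrected would accumulate into kilometers of positioning error.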
Related Theories
Special Relativity
Quantum Gravity
Cosmological Constant
Truth is what works. Ideas are instruments; their meaning lies in their practical consequences.
Truth
Experience
Developed by Peirce, James, and Dewey, pragmatism holds that the meaning of a concept is constituted by its practical effects. Truth is not a static correspondence to reality but an evolving, functional property of beliefs that guide effective action. James extended the theory: a belief is true if it works satisfactorily in the context of our experiences.
Related Theories
Instrumentalism
Fallibilism
Any consistent formal system capable of expressing basic arithmetic contains statements that are true but unprovable within that system.
Logic
Axioms
Gödel's two incompleteness theorems (1931) shattered Hilbert's program of finding a complete and consistent set of axioms for all of mathematics. The first theorem: in any consistent formal system F capable of expressing arithmetic, there exist statements that can neither be proved nor disproved within F. The second theorem: such a system cannot prove its own consistency. The proofs use self-referential Gödel numbering to encode "This statement is unprovable."
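The encoding trick itself is elementary. A toy sketch of Gödel numbering (the symbol alphabet here is hypothetical; any injective coding works): a sequence of symbol codes becomes a single integer via prime factorization, and unique factorization guarantees it can be decoded.

```python
# Toy Gödel numbering: encode a finite sequence of symbol codes as
# n = 2^a1 * 3^a2 * 5^a3 * ..., one prime per position.

def primes(n):
    """First n primes by trial division."""
    out, candidate = [], 2
    while len(out) < n:
        if all(candidate % p for p in out):
            out.append(candidate)
        candidate += 1
    return out

def godel_encode(seq):
    num = 1
    for p, a in zip(primes(len(seq)), seq):
        num *= p ** a
    return num

def godel_decode(num):
    """Recover the sequence by reading off prime exponents."""
    seq = []
    for p in primes(32):  # enough positions for short formulas
        if num == 1:
            break
        a = 0
        while num % p == 0:
            num //= p
            a += 1
        seq.append(a)
    return seq

# The formula '0 = 0' under a toy alphabet {'0': 1, '=': 2}
formula = [1, 2, 1]
n = godel_encode(formula)        # 2^1 * 3^2 * 5^1 = 90
assert godel_decode(n) == formula
```

With formulas represented as numbers, statements *about* formulas (including "this statement is unprovable") become statements about arithmetic, which is the self-reference the proofs exploit.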
Related Theories
Formal Systems
Halting Problem
Resources embedded in social networks — trust, norms, and relationships — that can be mobilized for collective and individual benefit.
Networks
Trust
Bourdieu distinguished social capital from economic and cultural capital: it consists of actual or potential resources linked to durable networks of mutual acquaintance and recognition. Putnam later broadened the concept into "bridging" (connecting across groups) and "bonding" (reinforcing within groups) social capital. Strong social capital correlates with better health, economic outcomes, and civic participation.
Related Theories
Network Theory
Habitus
Meaning arises not from individual elements but from their relationships within a system. Structure precedes and produces meaning.
Semiotics
Language
Systems
Saussure's Course in General Linguistics introduced the idea that signs derive meaning from their differences from other signs within the same system — not from any intrinsic relationship between signifier and signified. This structural insight spread to anthropology (Lévi-Strauss), psychology (Lacan), and literary theory (Barthes). Structuralism posits that underlying structures govern cultural phenomena and that surface variations mask deeper invariant patterns.
Post-structuralism later challenged the stability of these structures, arguing that meaning is always deferred and unstable (Derrida's différance).
Related Theories
Post-Structuralism
Semiotics
Deconstruction
At subatomic scales, particles exist in probabilistic superpositions until measured. Wave functions, uncertainty, and entanglement define reality.
Wave Function
Uncertainty
Developed through contributions from Heisenberg, Schrödinger, Bohr, Dirac, and Born, quantum mechanics describes the behavior of matter and energy at atomic and subatomic scales. The Schrödinger equation governs the time evolution of quantum states. The measurement problem — how superpositions "collapse" to definite values upon observation — remains philosophically unresolved. Interpretations include Copenhagen, Many-Worlds, and Pilot Wave.
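The probabilistic structure of measurement can be sketched in a few lines. A minimal Born-rule simulation for a two-state system (the amplitudes below are an arbitrary example, normalized to 1): a state a|0⟩ + b|1⟩ yields outcome 0 with probability |a|².

```python
import math
import random

# Amplitudes for |psi> = a|0> + b|1> (arbitrary equal superposition,
# with a relative phase of i on |1>)
a = complex(1 / math.sqrt(2), 0)
b = complex(0, 1 / math.sqrt(2))

p0 = abs(a) ** 2
p1 = abs(b) ** 2
assert math.isclose(p0 + p1, 1.0)   # normalization

def measure():
    """One projective measurement: collapse to 0 or 1 by the Born rule."""
    return 0 if random.random() < p0 else 1

random.seed(0)
counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)   # roughly 50/50 for this equal superposition
```

Note the phase on b is invisible to a measurement in this basis (only |b|² matters); phases show up in interference, which is what distinguishes superposition from classical ignorance.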
Related Theories
QFT
Wave-Particle Duality
A unifying language for mathematics — objects, morphisms, and functors reveal structural patterns that recur across disparate mathematical disciplines.
Functors
Morphisms
Category theory abstracts mathematical structure to its bare essence: objects and the structure-preserving maps (morphisms) between them. Functors map between categories, natural transformations map between functors. Eilenberg and Mac Lane developed it to formalize relationships between algebraic topology constructions. It has since become the lingua franca of mathematics, and its influence extends to theoretical computer science (type theory) and physics (TQFT).
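The connection to programming can be made concrete. A sketch of the two functor laws using Python lists as a stand-in for the List functor: `fmap` lifts a function on elements to a function on lists, and must preserve identities and composition.

```python
# The List functor on (Python types, Python functions):
# fmap lifts f : A -> B to fmap(f, -) : list[A] -> list[B].

def fmap(f, xs):
    return [f(x) for x in xs]

identity = lambda x: x
compose = lambda f, g: (lambda x: f(g(x)))

xs = [1, 2, 3]
f = lambda x: x + 1
g = lambda x: x * 2

# Functor law 1: mapping the identity changes nothing
assert fmap(identity, xs) == xs
# Functor law 2: mapping a composite equals composing the maps
assert fmap(compose(f, g), xs) == fmap(f, fmap(g, xs))
print(fmap(compose(f, g), xs))   # [3, 5, 7]
```

These two laws are exactly what "structure-preserving" means for a functor; type-theoretic languages like Haskell enshrine them in the `Functor` class.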
Related Theories
Topos Theory
Type Theory
Agents act to maximize expected utility given their preferences and beliefs, forming the foundation of mainstream economic analysis.
Utility
Game Theory
Rational choice theory assumes that individuals make decisions by comparing costs and benefits to maximize utility. Von Neumann and Morgenstern formalized this into expected utility theory. Arrow's impossibility theorem and behavioral economics (Kahneman, Tversky) have both challenged its descriptive accuracy, but it remains the dominant normative framework in economics, political science, and sociology.
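The core calculation is simple. A minimal sketch of the von Neumann-Morgenstern decision rule (the lotteries and the square-root utility function here are hypothetical): compute expected utility for each lottery and pick the maximum. A concave utility function produces risk aversion.

```python
import math

def expected_utility(lottery, u):
    """Lottery = list of (probability, outcome) pairs."""
    return sum(p * u(x) for p, x in lottery)

u = math.sqrt   # concave utility => risk aversion

safe  = [(1.0, 100)]               # $100 for sure
risky = [(0.5, 0), (0.5, 220)]     # coin flip with higher expected value

eu_safe  = expected_utility(safe, u)    # sqrt(100) = 10.0
eu_risky = expected_utility(risky, u)   # 0.5 * sqrt(220) ≈ 7.42
best = max([safe, risky], key=lambda L: expected_utility(L, u))
print(best is safe)   # the agent takes the sure thing
```

Note the risky lottery has the higher expected *value* ($110 vs $100) but the lower expected *utility*, which is exactly how the framework separates risk attitudes from raw payoffs.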
Related Theories
Game Theory
Bounded Rationality
A mathematical framework for quantifying information, entropy, and channel capacity — the theoretical foundation of modern communication and data compression.
Entropy
Compression
Channel Capacity
Shannon's 1948 paper "A Mathematical Theory of Communication" introduced the bit as the fundamental unit of information and entropy H = -Σ p(x) log₂ p(x) as its measure of uncertainty. The noisy-channel coding theorem establishes that reliable communication is possible at any rate below channel capacity. Information theory underpins all modern digital communication, compression algorithms (zip, JPEG, MP3), and has deep connections to statistical mechanics and machine learning.
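The entropy formula is short enough to compute directly (with the standard convention 0·log 0 = 0):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))    # ≈ 0.469 bits: a biased coin is more predictable
print(entropy([0.25] * 4))    # 2.0 bits: four equally likely symbols
```

Entropy is maximized by the uniform distribution and drops as outcomes become predictable, which is why skewed sources compress well.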
Kolmogorov complexity extends Shannon's framework to individual objects: the algorithmic information content of a string is the length of its shortest description in a universal programming language.
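Kolmogorov complexity itself is uncomputable, but a compressed length gives a crude upper bound, which makes the idea easy to demonstrate: a highly regular string compresses far more than a random-looking one of the same length.

```python
import random
import zlib

# zlib's compressed size as a rough proxy for algorithmic information content
regular = b"ab" * 500                                      # low complexity
random.seed(0)
random_ish = bytes(random.randrange(256) for _ in range(1000))  # high complexity

len_regular = len(zlib.compress(regular, 9))
len_random  = len(zlib.compress(random_ish, 9))
print(len_regular, len_random)   # the regular string compresses far better
assert len_regular < len_random
```

The incompressible string is, in the Kolmogorov sense, the one with no description shorter than itself.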
Related Theories
Algorithmic Complexity
Thermodynamics
Coding Theory
Scientific theories must be falsifiable. Science advances by bold conjectures and rigorous attempts at refutation — not by accumulating confirming evidence.
Science
Demarcation
Popper's criterion of falsifiability distinguishes science from non-science (pseudoscience, metaphysics). A theory is scientific only if it makes predictions that could in principle be shown false by experiment. Science proceeds through conjectures and refutations: we never verify theories, only fail to falsify them. This resolves the problem of induction — we don't need induction; we need only severe tests. Critics (Kuhn, Lakatos) noted that scientists rarely abandon theories immediately in the face of anomalies.
Related Theories
Inductivism
Paradigm Theory
People evaluate outcomes relative to a reference point. Losses loom larger than equivalent gains — loss aversion is a fundamental bias in human decision-making.
Loss Aversion
Heuristics
Kahneman and Tversky's prospect theory (1979) showed that people's choices systematically violate expected utility theory. Key features: (1) outcomes are evaluated relative to a reference point, not in absolute terms; (2) the value function is concave for gains and convex for losses, with losses weighted roughly twice as heavily as gains; (3) probability weighting overweights small probabilities and underweights large ones. This earned Kahneman the 2002 Nobel Prize in Economics.
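Features (1) and (2) can be sketched with the value function from Tversky and Kahneman's 1992 follow-up paper, using their estimated parameters (α = 0.88 for diminishing sensitivity, λ = 2.25 for loss aversion); outcomes are measured relative to a reference point of 0.

```python
ALPHA = 0.88    # diminishing sensitivity (curvature)
LAMBDA = 2.25   # loss aversion coefficient

def value(x):
    """Prospect-theory value: concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

gain, loss = value(100), value(-100)
print(gain, loss)         # ≈ 57.5 and ≈ -129.5
print(abs(loss) / gain)   # = LAMBDA: losses loom about 2.25x larger
```

The asymmetry means a 50/50 gamble to win or lose $100 has negative prospect value, matching the observed reluctance to accept symmetric bets.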
Related Theories
Behavioral Economics
Cognitive Biases
Four laws govern the relationships between heat, work, temperature, and energy. The second law introduces entropy — an arrow of time.
Entropy
Heat
The four laws of thermodynamics: the zeroth establishes thermal equilibrium; the first, conservation of energy; the second, that entropy in an isolated system never decreases; the third (Nernst), that absolute zero is unattainable. The second law gives time a direction — the increase of entropy. Boltzmann's statistical mechanics grounded thermodynamics in the behavior of atoms, connecting macroscopic observables to microscopic probabilities.
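Boltzmann's connection can be sketched with his formula S = k_B ln W for a toy system of N two-state "spins" (coins), where W counts the microstates realizing a given macrostate:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """S = k_B ln W for the macrostate 'n of N spins up'."""
    W = math.comb(N, n)   # number of microstates with exactly n up
    return K_B * math.log(W)

N = 100
ordered = boltzmann_entropy(N, 0)    # all spins down: W = 1, so S = 0
mixed   = boltzmann_entropy(N, 50)   # half up: W = C(100, 50), maximal S
print(ordered, mixed)
assert mixed > ordered
```

The mixed macrostate is overwhelmingly more probable because vastly more microstates realize it, which is the statistical content of the second law's arrow of time.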
Related Theories
Statistical Mechanics
Information Theory