Information Theory
Every message you send carries a measurable amount of surprise. Shannon's insight was that information is really about uncertainty reduction: the less likely a message, the more information it carries. That "no way!" moment? That's high surprisal at work. Entropy is the average surprisal across all possible messages, so a source is high-entropy when you genuinely can't predict what comes next.
H(X) = -∑ p(x) log₂ p(x)
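The formula above is easy to compute directly: sum each outcome's probability weighted by its surprisal, -log₂ p(x). A minimal sketch (the function name `entropy` is my own choice):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Skips zero-probability outcomes, since lim p->0 of p*log2(p) is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per flip.
print(entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is nearly predictable, so it carries little information.
print(entropy([0.99, 0.01]))  # ~0.081 bits
```

Note how the biased coin's entropy collapses toward zero: almost every flip confirms what you already expected, so there's little surprise left to measure.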