Every Monday, my friend Carl Richards (BehaviorGap.com) writes about financial planning for The New York Times and illustrates his ideas with a simple but illuminating drawing. One recent Monday Carl used the nearby sketch (used here with Carl's kind permission) to highlight a really important principle: we ought to make financial decisions only when they are well supported by data and evidence.
Carl's illustration brilliantly encapsulates a point I repeatedly make: One's investment process will be much better when it is data-driven at every point.
Sadly, we don't tend to do this very well or very often. Add together confirmation bias (we tend to see what we want to see), motivated reasoning (we scrutinize ideas more critically when we disagree with them, and we are much more likely to recall supporting rather than opposing evidence), optimism bias (we expect things to turn out far better than they typically do), self-serving bias (the good stuff is my doing while the bad stuff is always someone else's fault), the planning fallacy (we underestimate how long things will take and overrate our ability to shape the future) and the bias blind spot (we're sure these various foibles don't apply to us), and the result is excess certainty, hubris and lots of mistakes.
Factor in our propensity for favoring powerful stories over the facts, and our attitude is often on the order of "Don't bother me with the facts; I've already made up my mind."
This problem is hardly a new one. More than half a century ago, Stanford psychologist Leon Festinger described the issue pretty clearly in the opening lines of his book, When Prophecy Fails. "A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point." Sherlock Holmes put the issue to Watson in The Adventure of Wisteria Lodge like this: "… it is an error to argue in front of your data. You find yourself insensibly twisting them round to fit your theories."
When making his defense of British soldiers during the Boston Massacre trials in December of 1770, John Adams (later our second president) offered a famous insight: "Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence." Yet we are all prone to forcing uncooperative ("stubborn") facts, what Evgeny Morozov calls "messy reality," into the narrative of our preconceived notions, even when they simply don't fit.
Fast and slow
How can we best try to deal with our messy reality? The obvious answer is that, like Sherlock Holmes, we need to engage in good critical thinking. But doing so is really hard.
In his brilliant book, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman describes two different "systems" operating in our brains. Much of our thinking is fast and intuitive; we don't have to think about flinching when somebody takes a swing at us, for example. Kahneman labels this type of thinking "System 1."
"System 2" thinking is slow and deliberative, and takes a lot of effort and practice. Learning calculus requires System 2, which can also be used to check and even override System 1.
System 1 incorporates our personalities, training, experience and inclinations. Some of it is entirely instinctual. But we can also make difficult (System 2) tasks "natural" with sufficient practice (like driving). Much of what we perceive as decision-making is actually a consequence of System 1 and hardly deliberative at all. Learning math is largely a System 2 task but a good bit of it (our "times tables," for example) becomes System 1 over time.
A new paper, "Motivated Numeracy and Enlightened Self-Government," from Yale's Dan Kahan and colleagues finds that our political views can undermine even our most basic analytical skills. More specifically, people who are otherwise very good at math often botch a problem they should easily be able to solve, because the correct answer is not politically palatable. Worse, both liberals and conservatives with especially strong math skills were more susceptible to letting politics skew their reasoning than were those with less mathematical ability.
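To see what tripped people up, here is a minimal sketch of the style of 2x2 problem the study used, in which the right answer requires comparing rates rather than raw counts. The group labels and counts below are my own illustrative assumptions in the spirit of the published design, not figures quoted from the paper:

```python
# A sketch of a "rates vs. raw counts" problem in the style of the Kahan
# study. The counts and labels here are illustrative assumptions. The trap:
# the biggest raw number sits in the "improved with treatment" cell, but
# the correct answer depends on comparing improvement *rates*.

table = {
    "used treatment": {"improved": 223, "worsened": 75},
    "no treatment": {"improved": 107, "worsened": 21},
}

for group, counts in table.items():
    total = counts["improved"] + counts["worsened"]
    rate = counts["improved"] / total
    print(f"{group}: {counts['improved']}/{total} = {rate:.0%} improved")

# Output:
#   used treatment: 223/298 = 75% improved
#   no treatment: 107/128 = 84% improved
#
# System 1 latches onto 223 and concludes the treatment worked; System 2
# does the division and sees the untreated group actually fared better.
```

In the study, the same numbers were framed once as neutral data (a skin treatment) and once as politically charged data (gun control), and it was the most numerate partisans who were likeliest to get the charged version wrong when the correct answer cut against their side.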