The messy reality of financial decisions

Commentary June 02, 2015 at 10:57 AM

Every Monday, my friend Carl Richards (BehaviorGap.com) writes about financial planning for The New York Times and illustrates his ideas with a simple but illuminating drawing. One recent Monday Carl used the nearby sketch (used here with Carl's kind permission) to highlight a really important principle — we ought to make financial decisions only when those decisions are well supported by data and evidence.

Carl's illustration brilliantly encapsulates a point I repeatedly make: One's investment process will be much better when it is data-driven at every point.

Sadly, we don't tend to do this very well or very often. Add together confirmation bias (we tend to see what we want to see), motivated reasoning (our tendency to scrutinize ideas more critically when we disagree with them; we are also much more likely to recall supporting rather than opposing evidence), optimism bias (we tend to expect things to turn out far better than they typically do), self-serving bias (where the good stuff is my doing and the bad stuff is always someone else's fault), the planning fallacy (our tendency to overrate our ability to shape the future) and bias blindness (we're sure that these various foibles don't apply to us), and the result is excess certainty, hubris and lots of mistakes.

Factor in our propensity for favoring powerful stories over the facts and our attitude is often on the order of "Don't bother me with the facts; I've already made up my mind."

This problem is hardly a new one. More than half a century ago, Stanford psychologist Leon Festinger described the issue pretty clearly in the opening lines of his book, When Prophecy Fails. "A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point." Sherlock Holmes put the issue to Watson in The Adventure of Wisteria Lodge like this: "… it is an error to argue in front of your data. You find yourself insensibly twisting them round to fit your theories."

When making his defense of British soldiers during the Boston Massacre trials in December of 1770, John Adams (later our second president) offered a famous insight: "Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence." We are all prone to trying to force uncooperative ("stubborn") facts, what Evgeny Morozov calls "messy reality," into the narrative of our preconceived notions in ways that simply don't fit.

Fast and slow

How can we best try to deal with our messy reality? The obvious answer is that, like Sherlock Holmes, we need to engage in good critical thinking. But doing so is really hard.

In his brilliant book, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman describes two different "systems" operating in our brains. Much of our thinking is fast and intuitive. We don't have to think about flinching when somebody takes a swing at us, for example. Kahneman labels this type of thinking "System 1."

"System 2" thinking is slow and deliberative, and takes a lot of effort and practice. Learning calculus requires System 2, which can also be used to check and even override System 1.

System 1 incorporates our personalities, training, experience and inclinations. Some of it is entirely instinctual. But we can also make difficult (System 2) tasks "natural" with sufficient practice (like driving). Much of what we perceive as decision-making is actually a consequence of System 1 and hardly deliberative at all. Learning math is largely a System 2 task but a good bit of it (our "times tables," for example) becomes System 1 over time.

A new paper, "Motivated Numeracy and Enlightened Self-Government," from Yale's Dan Kahan and colleagues finds that our political views undermine our most basic analytical skills. More specifically, the study shows that people who are otherwise very good at math often totally screw up a problem they should be able to solve, because the correct answer is not politically palatable. Worse still, both liberals and conservatives with especially strong math skills were more susceptible to letting politics skew their reasoning than were those with less mathematical ability.

This study is pretty clear evidence that our inherent biases skew our reasoning abilities and that the problem is even worse for people with advanced capacities. The study was designed such that a subject's first instinct is to jump to the wrong conclusion. When that easy (but wrong) answer confirms what we already think, System 1 takes control and it's game over. When it contradicts a subject's ideology — and particularly when the subject has high ability — he or she is both equipped and motivated to use System 2 and find the error. We are rarely trained or motivated to check and contest what we already believe.
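To make the study's design concrete, here is a minimal sketch, in Python, of the kind of 2x2 problem Kahan's team posed: deciding from a table of outcomes whether a skin-cream treatment worked. The numbers below are illustrative stand-ins patterned on the study's setup, chosen so that the larger raw count points away from the correct answer.

```python
# A sketch of the 2x2 "covariance detection" problem used to test motivated
# numeracy. The figures are illustrative, not the study's actual data.

# Outcomes for patients who used a new skin cream vs. those who did not.
used_cream = {"improved": 223, "worsened": 75}
no_cream = {"improved": 107, "worsened": 21}

def improvement_rate(group):
    """Share of a group whose condition improved."""
    return group["improved"] / (group["improved"] + group["worsened"])

# The intuitive (System 1) read: 223 patients improved with the cream versus
# only 107 without it, so the cream must work. The deliberate (System 2) read
# compares rates rather than raw counts:
print(f"Improved with cream:    {improvement_rate(used_cream):.1%}")  # ~74.8%
print(f"Improved without cream: {improvement_rate(no_cream):.1%}")    # ~83.6%

# The untreated group actually fared better; the bigger raw count points the
# wrong way. In the study, a politically charged version of the same table
# is what tripped up otherwise high-numeracy subjects.
```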

Sadly, precious few of us exercise truly independent thought very often and none of us does so nearly often enough. Training ourselves to do so more intuitively, so we don't have to slow down and think about things deliberately, is even harder.

Training ourselves consistently and carefully to examine evidence that might contradict what we already think and believe is almost impossible. The academic research is crystal clear: Productive critical thinking requires adequate domain knowledge along with lots of practice. It isn't just a skill to be learned. One's knowledge base provides the foundation of and context for engaging in critical thinking.

Most individual investors, and far too many professionals, have neither. In A Study in Scarlet, for example, Holmes used both practiced deductive skill and lots of existing knowledge to deduce that Watson had been in Afghanistan prior to their first meeting. Without both, Holmes would have remained entirely in the dark about Watson's path to their meeting.

"I knew you came from Afghanistan. From long habit the train of thoughts ran so swiftly through my mind, that I arrived at the conclusion without being conscious of intermediate steps. There were such steps, however. The train of reasoning ran, 'Here is a gentleman of a medical type, but with the air of a military man. Clearly an army doctor, then. He has just come from the tropics, for his face is dark, and that is not the natural tint of his skin, for his wrists are fair. He has undergone hardship and sickness, as his haggard face says clearly. His left arm has been injured. He holds it in a stiff and unnatural manner. Where in the tropics could an English army doctor have seen much hardship and got his arm wounded? Clearly in Afghanistan.' The whole train of thought did not occupy a second. I then remarked that you came from Afghanistan, and you were astonished."

Similarly, the best investors learn to internalize what they have learned over time. What may appear to be intuition is really the result of thousands of hours of hard work, research, practice and experience.

With sufficient education and proper training, we have the opportunity to deal with messy reality for the good of our clients. Because critical thinking skills, and the knowledge base needed to use them effectively, are in extremely short supply, anyone who has them and exercises them consistently will have a tremendous advantage. It takes a lot of work, work that is never completed but that is ever so crucial. As my father used to tell me, it's what you learn after you think you know everything that really counts.
