Using simple experiments, Dan Ariely studies how people actually act in the marketplace, as opposed to how they should or would perform if they were completely rational. His interests span a wide range of daily behaviors, and his experiments are consistently interesting, amusing, and informative, demonstrating profound ideas that fly in the face of common wisdom.
Dan is the James B. Duke Professor of Psychology & Behavioral Economics at Duke University, where he holds appointments at the Fuqua School of Business, the Center for Cognitive Neuroscience, the School of Medicine, and the Department of Economics. He is also a founding member of the Center for Advanced Hindsight.
Dan earned a bachelor's degree in psychology from Tel Aviv University, master's and doctoral degrees in cognitive psychology from the University of North Carolina, and a doctorate in business administration from Duke University. He is the author of the New York Times bestsellers Predictably Irrational: The Hidden Forces that Shape Our Decisions and The Upside of Irrationality: The Unexpected Ways We Defy Logic at Work and at Home. I am the proud owner of both and have benefited from each.
His research has been published in leading psychology, economics, and business journals, and has been featured many times in the popular press. Dan has graciously agreed to answer what I hope are Five Good Questions.
1. In your work, you often argue that one's "reasoning self" should set up "guardrails" to manage things during those times when one's reason is not in charge. Would you share a favorite example of a guardrail for everyday decision-making and another from the financial world?
One of the most underused interfaces in human decision-making is the calendar. If you think about it, the calendar does many, many things wrong. But one of the interesting things about it is that when something is set there, there's a good chance you will actually go ahead and do it.
So, for example, a few years ago, before the elections, we did a study in which, a few months in advance, we emailed people and asked them to set up a time on Election Day to go and vote, setting aside three hours for the activity. Those people put the appointment in their calendars, and they ended up voting much more often than people who did not set up a time.
Because if you don't think about it in advance and only remember two days before the election, the odds are your calendar will be full of appointments and you'll have to cancel lots of things in order to vote. And so people become less likely to do it.
From this basic principle, I now try to use the calendar to protect the things I actually want to do. I take projects that I really want to work on and need to work on, and that I'm afraid will be relegated to some lower priority by the things that look like emergencies, and I set up times to meet them.
I was hoping I would do the same thing for exercise, but somehow I never got to do that. I think that will be my next move: to start creating time for exercise. And who knows? Maybe at some point my life will become so hectic I'll even start to set up time when I need to go to sleep.
In terms of guardrails in the financial system, I think automatic saving is the best example. The rational thing to do would be to look at the end of each month, see how much money we made and how many expenses we had, and then decide how much to send to our long-term savings.
But I think we all intuitively realize that if this were the case, we would not save as much. American savings rates are nothing to be proud of, but they would probably be even lower. And a fixed monthly amount clearly doesn't make rational sense, because we have different levels of expenses in the winter and in the summer. There are fluctuations. There are things that come up as emergencies.
Saving should not be the same every month. But we know that if we relied on our mediocre cognitive abilities and demanded that we make this decision every month, the odds are we would never make it, or at least make it very infrequently. So by creating a system that takes the decision out of our hands, we actually do much better.
2. You argue pretty consistently that one's reasoning self should be in charge. What, then, do you make of research suggesting that decision-making without an emotional component isn't as good as a more balanced approach? Is decision-making about money different?
I think there's a bit of confusion about the role of emotion in decision-making. There is clear evidence that people who don't have emotional input, who don't have an emotional coding of information, do very badly in decision-making. But the way to think about it is that emotion does two things. Our emotional system provides a way to read information about the world: what we like to do and what we don't like to do, what's safe and what's dangerous.
We also have a system to make decisions, and I think we need to separate those two. There is the input part, where emotion is a very quick, effective way to give us information about the world: again, what we like, what we don't like, what we're afraid of. And there is a second part, in which emotion is used for making decisions, not just for input. When it comes to making decisions using emotions, that's where emotions are likely to be unsuccessful or lead us astray. Money isn't unique in that regard.
Think about buying a house. You go and look at different houses, and you feel emotionally good about some and not about others. That provides you information you can't quantify. You're not sure exactly what is making you happy. Is it the tall ceilings or the big glass window? Is it the trees outside? It's hard to figure out which one it is, but your emotions are giving you information about it. That's the good side: emotion as a source of information you can't otherwise quantify.
But if you were going to make a decision about buying a house and you were in an emotional state, say you're really angry or vengeful, or you just had a fight, or you're hungry, is your decision about how to negotiate for the house going to be in any way better? I don't think so. So you want to use emotional information as a cue; you don't want to use emotion for decision-making.
3. In the financial world, risk and uncertainty are not identical and are often confused. How does your research relate to this problem?
My own research does very little with risk and uncertainty, even though they have basically been the building blocks, sort of the "fruit fly," of behavioral economics.
One of the most amusing pieces of research I've done on the perception of risk was a study of how men perceive the risk of STDs when they are and are not sexually aroused. This of course goes back to our earlier discussion, because it also concerns the effect of emotions on decision-making. And what we find is in line with the Robin Williams joke that God gave men two organs that need a lot of blood, but only enough blood for one of them at a time.
Perceptions of the risk of STDs vary dramatically with arousal. When men are not aroused, they think the risk is very high; when they're aroused, they think the risk is very low. And of course, different actions ensue accordingly. This, by the way, is one of those cases where we thought in advance that there was a good probability we would find this result.
Sometimes people ask why we do research that looks so trivial, research where, once you see the results, people say, "Oh, I knew that all along; of course this is the case." Well, it does happen from time to time that we show things we expected to be the case.
But it also happens sometimes that we don't. Science's role is to try to separate the cases where our intuitions are correct from the cases where they are not. Of course, when they're correct, we are blessed with the hindsight of saying, "Oh yes, I knew that all along." But that's not always so. Anyway, in this case, what we expected is what we found.