As part of a simulated stock market game, student subjects at Texas Tech sit inside a machine that resembles a landlocked white submarine. There they pick investment advisors who give them portfolio recommendations. Investment returns are random and, when they drop, Professor Russell James notes an activation of the dorsal anterior cingulate cortex—the part of the brain commonly used in detecting errors and comparing numbers.
When an advisor makes a random recommendation that turns out well, a completely different part of the brain associated with visual recognition of faces (the advisor’s) lights up. When students focus on numbers, particularly negative ones, they tend to fire the advisor and move on.
Advisors who recognize this response intuitively have already de-emphasized presenting quarterly portfolio changes to clients. New technology, such as the fMRI brain imaging machine at Texas Tech, helps explain why people respond to financial decisions in predictable yet surprising and often counterproductive ways. Neuroscience can help us understand which advising strategies are most likely to succeed and why the most rational plan isn’t always the most effective.
Scientists used to rely on subjects with damage to a specific brain region to better understand how each part of the brain impacts decisions. Some subjects lost the ability to access a pathway between the slower prefrontal cortex and the faster, more primitive, instinctive and emotional regions deeper in the brain. Being cut off from emotion effectively turned these subjects into purely rational beings. They became better at making risky decisions that involve a chance of loss. They also became insufferably bogged down when even the most basic decision had to be vetted through the slowest part of the brain.
To borrow a phrase used by Moshe Milevsky to describe a purely rational client, these subjects had become Vulcans. Most of us aren’t Vulcans. This is mainly a good thing.
What separates us from the Vulcans is our kinship with the animals. We can thank the process of natural selection and our ancestors who avoided excessive risks, hoarded for lean times, developed social skills, and occasionally acted on impulse (they had to or you wouldn’t be here).
The parts of our brain that we share with other mammals developed over time in a way that maximized our probability of survival. When humans responded the wrong way to a stimulus—for example, confronting rather than sneaking away from a bear, or being indifferent to the loss of food rather than defending it—they failed to pass their genes to subsequent generations. When we respond to a visual stimulus, our initial reaction is often an emotional one based on the more primitive response mechanisms. As James notes, “this system is not dysfunctional. It is functional for a different environment where the purpose is survival.”
This instinctive decision-making process can often figure things out faster than our more rational processes. Experiments have shown that the body senses which of two decks of cards is superior after only a few cards have been dealt, while it takes the conscious mind many more cards to detect the imbalance. Gut decision-making relies on these faster-acting emotional regions of the brain. But our gut instincts can also lead us astray, particularly when it comes to financial decisions. Instincts meant for short-run survival in a time of scarcity can be ill-suited to very different, modern conditions.
Money and emotion are linked by the powerful forces of social status and survival—two of the most fundamental motivations among all mammals. Threats within either of these domains produce a quick emotional response. Our instinctive regions of the brain are constantly reacting to stimuli over the course of a day while we intermittently apply the slower, more advanced cognitive processes. This division of labor is efficient: engaging the more advanced parts of the brain requires greater effort, and the body performs tasks as economically as it can.
This constant interplay between the slower cognitive brain regions and the much more automatic emotional and visual brain areas is what Nobel Prize-winning psychologist Daniel Kahneman refers to as thinking fast and slow. James prefers to characterize the cognitive brain parts as a rider controlling the much more powerful affective brain regions that resemble an elephant. The rider is generally under the delusion of being in control of the elephant, but when the elephant decides to change course the rider is often powerless to stop it.
Habits are our brain’s method of taking a task that would normally require greater cognitive effort and reducing it to more automatic processes. The first time we engage in a task, for example driving to a new restaurant for lunch, our brains use more resources to gauge the best route, where to activate a turn signal, and where we might encounter threats. The second time we do it we bypass many of these processes and substitute a non-cognitive approach. If someone honks to remind us to pay attention to the light, we respond immediately with our emotional brain (as if to a threat) and only then apply cognitive effort to assess whether our initial response was appropriate.
The notion that many habits occur independent of deliberate cognition is a powerful insight when developing a plan to change financial behavior. Retirement savings studies document the remarkable success of employees who make one simple decision—to walk to the benefits office and sign up for automatic payroll withdrawals. One of the primary mistakes made by our rational cognitive brain is that it over-projects its ability to change habits and delay gratification. We may understand the need to save more for retirement, but when we get home we fall into the same habits and find other things to spend money on. Breaking habits requires deliberate intention to change routines by using our rider to change the direction of the elephant. How do we motivate people to change behavior to meet long-term goals?
Neuroscience suggests that the worst way to motivate people is to focus on numbers. Telling someone they need to save a certain amount to achieve an adequate retirement accumulation goal may be convincing to the rational brain, but not so convincing to the elephant.
In his behavioral finance classes at Texas Tech, James tells his students that “we haven’t had written language all that long. Explaining a concept in a visual or emotional sense uses much more of our brain functions than is used by numbers. If you think of people as being emotional and visual, you’ve essentially tapped into 70% of the brain real estate. There is that rational side, but that rational side might be more like 20% of the real estate. The rational side used to solve math problems might be 8% of the real estate. I need to get you to visualize your end goal and attach emotion to that visualization—that this end goal requires a commitment, and that choosing not to follow the commitment endangers your goal. Our brains work visually and they work emotionally. I need to learn from you where you want to be, what life you want to be living at a certain age, what you want the freedom to be able to do, and then I attach a number to those things. To help a client visualize you have to first identify outcomes that have emotional salience for the client.”
Our brains appear to seek a condition known as homeostasis, or cognitive autopilot. We become comfortable with the status quo because breaking out of old habits and using cognitive resources to form new habits is inefficient. This feeling of satisfaction from maintaining one’s current state leads to a preference for the status quo. It can be useful to frame desired actions as the status quo in order to take advantage of this preference. For example, setting defaults that are beneficial can have an unexpectedly large impact on improving behavior. Car rental agencies know that most renters will select costly added insurance if they are forced to refuse insurance, but few will choose insurance if doing so requires added effort.
Consider disability insurance. This underused financial product protects against an important insurable risk, yet it can be difficult to sell. Presenting a client with a number of disability options with varying elimination periods and income replacement rates and explaining their importance may not be particularly convincing; often the client would just as soon make the simple choice of rejecting them all. Another approach is to recommend a specific plan based on the client’s characteristics and preferences. If the client chooses not to pay for the disability plan, the planner can then require a signed liability release form and explain that rejecting coverage could leave the client’s family destitute, creating unacceptable liability risk for the adviser. Not only is the advisor forcing the client to actively reject the plan, he is also providing an emotional appeal to explain its need.
One reason the brain prefers autopilot is because extended cognitive effort can lead to mistakes. Asking a subject a number of difficult questions will inevitably yield lower quality results and a tendency to revert to defaults or succumb to temptation. Sales tactics in which a customer is repeatedly asked complex questions before being offered a (costly) default are more effective than providing clients with a few simple choices. A good way to view the cognitive brain region is as a muscle, says James. “We know for example in temptation studies that if you make somebody do something highly cognitive before you give them the temptation then that muscle is going to be worn out. Or if you make somebody refuse temptations several times then with the nth temptation they are going to be more likely to give in. It’s just like working out—I feel weaker immediately afterward because I’ve been lifting these heavy weights.”
The most powerful emotional response related to financial choice is fear. Fear leads to a number of observed decision anomalies identified in behavioral finance, such as the excessive attention paid to a loss. One explanation for our overemphasis on fear and loss is an evolutionary concept known as punctuated equilibrium. Human history is characterized by extended periods of mundane existence punctuated by catastrophic events, such as a famine or plague that wipes out a large part of the population. Many risk-averse behaviors that appear irrational make sense if, during rare catastrophes, only the most cautious survived. According to James, these seemingly irrational traits “are not necessarily stupid, they may just not fit a particular scenario that we’re in right now. Looking at the idea that we’ve inherited what we’ve inherited because it has been successful during periods of punctuated equilibrium, one of those really important things is fear.”
Fear is generally associated with the amygdala, an emotional region of the brain shared with other mammals. “When it comes to investments, it’s a driving force behind a lot of the persistent irrationality that we see,” says James. “Let’s take for example the very first recognized concept in behavioral economics—Kahneman and [Amos] Tversky’s prospect theory, where we would treat an equivalent gain and loss unequally. What triggers the amygdala is the fear response, and the fear response is associated with a loss. It comes back to the thought process that works for survival, especially with a punctuated equilibrium. Having that sense of being able to hang onto what you’ve got as long as it’s sufficient for survival is way more important than being able to double, triple or quadruple what you’ve got. It’s nice, but it can’t compare to the importance of being able to hang on.”
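The unequal treatment of gains and losses that James describes can be sketched numerically. Below is a minimal illustration of the prospect theory value function, using the functional form and median parameter estimates Tversky and Kahneman reported in 1992; the specific numbers come from that literature, not from this article.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x (Tversky-Kahneman 1992)."""
    if x >= 0:
        return x ** alpha            # gains are valued concavely
    return -lam * (-x) ** beta       # losses loom larger (lam > 1)

# A $100 loss feels roughly 2.25 times as intense as a $100 gain:
gain = prospect_value(100)
loss = prospect_value(-100)
print(round(-loss / gain, 2))  # -> 2.25
```

The asymmetry lives entirely in the loss-aversion coefficient `lam`: setting it to 1 would make the hypothetical Vulcan client, who weighs equivalent gains and losses identically.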
This fear of loss manifests itself in a number of observed counterproductive financial behaviors. Individual investors consistently underperform the market because they tend to react badly to losses and sell securities when valuations are the most attractive. They also fail to take rational risks because they focus too much on the possibility of loss without considering the probability and possible outcomes. Examples include the disposition effect, where investors are more likely to hold losing stocks and sell winners because they don’t want to accept a loss; the endowment effect, where we place much greater value on things that we own because we have an emotional response to giving them up; and the sunk cost effect, where we ignore current reality and devote excessive resources to an unworthy investment (say a failing business) rather than give up and accept a loss.
Framing decisions so that they do not necessarily involve a loss is an important tool advisors can use to avoid bringing the amygdala to the table. A client who owns too many shares of a stock has an inefficiently diversified portfolio, but may be unwilling to give up the shares because of an emotional attachment to them (endowment effect) or their current price being below what was paid (disposition effect). Asking clients instead whether they would buy that stock right now with the remainder of their portfolio funds refocuses them on objective valuation. If they say no, then they’ve admitted that they don’t necessarily believe the stock is an attractive investment.
Another example is getting a client to stay in a bear market. James cites dollar cost averaging as an example of a rationally indefensible technique that uses client psychology to achieve a desired outcome. “Dollar cost averaging is an illusion,” notes James. “Unless we have mean reversion in the market (and if we do we can make lots of market timing bets and make ourselves rich), dollar cost averaging does not work. But if people believe that they are buying shares cheaper in a recession, the story makes people stay in the market at the times when their fear-driven emotional side wants them to get out of the market. We have a story that, even if it’s completely false, is generating the behavior that is going to be portfolio maximizing in the end. So maybe the answer to the usefulness of dollar cost averaging isn’t ‘well we’ve figured it out and it doesn’t work, so don’t use it,’ the answer is ‘actually it’s not true but it gets your clients to behave the right way so keep telling them that.’”
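The “buying shares cheaper” story rests on simple arithmetic: a fixed dollar amount buys more shares when prices are low, so the average cost per share works out to the harmonic mean of the purchase prices, which can never exceed their average price. A minimal sketch, using an invented price series for illustration:

```python
# Hypothetical monthly share prices and a fixed monthly investment;
# the numbers are illustrative, not data from the article.
prices = [20.0, 10.0, 25.0, 16.0]
monthly_investment = 100.0

shares_bought = sum(monthly_investment / p for p in prices)
total_invested = monthly_investment * len(prices)

avg_cost_per_share = total_invested / shares_bought  # harmonic mean of prices
avg_price = sum(prices) / len(prices)                # arithmetic mean

print(round(avg_cost_per_share, 2), round(avg_price, 2))
```

The cost per share always lands at or below the average price, which is what makes the story persuasive to clients even though, as James notes, it confers no advantage absent mean reversion.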
Michael Finke is a professor and coordinator of the doctoral program in personal financial planning at Texas Tech University.