
Who’s the Easiest Person to Fool?


In 1587, the Roman Catholic Church created the position of Advocatus Diaboli (the “Devil’s Advocate”) to prepare and raise all possible arguments against the canonization of any candidate for sainthood. Theology aside, it’s a fantastic idea, even if almost nobody emulates it.

There is a large body of research on what has come to be known as motivated reasoning and its flip side, confirmation bias. Confirmation bias is our tendency to notice and accept whatever fits our preconceived notions and beliefs, while motivated reasoning is our complementary tendency to scrutinize ideas more critically when we disagree with them than when we agree.

We are similarly much more likely to recall supporting evidence than opposing evidence. Upton Sinclair offered perhaps the most popular expression of motivated reasoning: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”

A related problem is what Nobel laureate Daniel Kahneman calls the “planning fallacy.” It’s a corollary to optimism bias (think Lake Wobegon, where all the children are above average) and self-serving bias (where the good stuff is my doing and the bad stuff is always someone else’s fault). The planning fallacy is our tendency to underestimate the time, costs and risks of future actions while overestimating their benefits. It’s why we think things won’t take as long as they do, why projects cost more than we expect, and why the results we achieve aren’t as good as we had hoped.

These cognitive biases, among others, are a constant threat to our decision-making. Fortunately, behavioral economics has done a terrific job of outlining what these risks look like. It is one thing to recognize these cognitive difficulties, of course, and quite another to actually do something about them.

Unfortunately, one major difficulty is the bias blind spot: our general inability to recognize that we suffer from the same cognitive biases that plague other people. If we believe something to be true, we quite naturally think it’s objectively true and assume that those who disagree have some sort of problem. It’s the same kind of thinking that allows us to smile knowingly when friends tell us how smart, talented and attractive their children are, while remaining utterly convinced of the objective truth of our own kids’ attributes.

There isn’t a lot we can do about these issues, as even Kahneman concedes. But if only we could be more like scientists, who employ a rigorous method to root out error, perhaps our process would improve.

We can start by using what Harvard Medical School’s Atul Gawande calls “the power of negative thinking,” which is the essence of the scientific method: actively looking for failures and for ways to overcome them. Yet scientists themselves fall prey to bias, despite formal protocols designed to find and eliminate error. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman described the scientific method as the best means we have of achieving progress. Even so, notice his emphasis: “The first principle is that you must not fool yourself—and you are the easiest person to fool.”

Scientists routinely acknowledge, at least in theory, that they get things wrong, but they also hold fast to the idea that these errors get corrected over time as other scientists build upon earlier work. However, John Ioannidis of Stanford has shown that “most published research findings are probably false,” and subsequent reviews support that claim.

In a commentary in Nature last year, scientists at Amgen disclosed that they had been unable to replicate the vast majority (47 of 53) of the landmark pre-clinical research studies they had examined. In a similar study, researchers at Bayer HealthCare reported that they had reproduced the published results in just a quarter of 67 seminal studies they examined. Despite rigorous protocols and a culture designed to root out error aggressively, scientists get things wrong—really important things—a lot more often than we’d like to think.

A Grim Story

The cautionary tale of Ignaz Semmelweis illustrates the problem. Semmelweis was a 19th century Hungarian obstetrician who discovered that the often-fatal puerperal fever, then common among new mothers in hospitals, could essentially be eliminated if doctors simply washed their hands before assisting with childbirth. He thereupon initiated a strict regimen at his hospital requiring everyone who assisted with a delivery to first wash their hands in a chlorinated solution. Death rates there plummeted.

Semmelweis expected a revolution in hospital hygiene as a consequence of his findings. But it didn’t come, because of a phenomenon now called the “Semmelweis Reflex.”

His hypothesis, that there was only one cause of the disease and that it could be prevented simply through cleanliness, ran counter to the prevailing medical ideology of the time, which insisted that diseases had multiple causes. Despite the practical demonstration of its effectiveness, his approach was largely ignored, rejected or even ridiculed. Ideology trumped facts. Things got so bad that Semmelweis was ultimately dismissed from his hospital post and harassed by the medical community in Vienna, forcing him to move to Budapest.

In Budapest, Semmelweis grew increasingly outspoken and hostile toward physicians who refused to acknowledge his discovery and implement his protocols. Vitriolic exchanges ensued, in the medical literature and in letters, and Semmelweis was eventually lured to an asylum where his opponents had arranged for his incarceration. He was beaten severely, put in a straitjacket, and died within two weeks. The story hardly fits the scientific ideal, and it gave us the “Semmelweis Reflex”: our tendency to reject evidence when it contradicts our favored norms and beliefs. It is exceedingly hard for us to follow the evidence wherever it leads.

Per Kahneman, organizations are more likely than individuals to succeed at overcoming bias, partly on account of resources and partly because self-criticism is so difficult. The best check we have on bad decision-making is when someone we respect (or, when possible, an empowered team) sets out to show us where and how we are wrong. Within organizations, that means trying to foster what Kahneman calls “adversarial collaboration” and making sure that everyone can be challenged without fear of reprisal and that everyone, especially anyone in charge, is accountable. But it doesn’t happen very often. Kahneman routinely asks groups how committed they are to better decision-making and whether they are even willing to spend 1% of their budgets on it. Sadly, he hasn’t had any takers yet.

Innovation in financial planning typically starts with an idea. If enough people (or the right people) think it might be a good idea, it then moves to evidence-gathering for confirmation. But that entire endeavor, designed to confirm that the idea makes sense, is inherently prone to confirmation bias. We should instead be systematically and consistently looking to disprove the idea. Without a devil’s advocate whose specific mission is to show why the idea is a bad one, and who can pursue that mission without recrimination or criticism, many bad ideas will seem to be confirmed.

Do you have a devil’s advocate or at least an accountability partner? Your decision-making and your advice are almost certainly suffering if you do not.

