
This Hidden Flaw Can Lead to Grave Errors in Economic Predictions


Daniel Kahneman, Princeton University emeritus professor of psychology and economics, and Nobel Prize-winning author of “Thinking, Fast and Slow,” now warns that “noise” — variability in judgments that ought to be identical — causes costly errors.

In “Noise: A Flaw in Human Judgment” (Little, Brown Spark, May 16), Kahneman and his co-authors, Olivier Sibony and Cass R. Sunstein, explore just how harmful noise can be in numerous fields, including economic forecasting, and provide remedies to reduce it.

“Unlike bias, noise isn’t intuitive, which is why we think we’ve discovered a new continent,” Sunstein says of the book’s new insights and ideas, in an interview with ThinkAdvisor.

A law professor on leave from Harvard University, Sunstein co-authored, with behavioral economist Richard Thaler, the bestseller “Nudge: Improving Decisions About Health, Wealth, and Happiness.” Sibony is a professor of strategy at HEC Paris and an associate fellow at Oxford.

Financial advisors are noisy, doctors are noisy, judges are noisy — no one is immune, says Sunstein, who notes that “Noise” was more than five years in the writing. That includes Zoom meetings with Kahneman, who was awarded a Nobel in economic sciences in 2002 and the Presidential Medal of Freedom in 2013.

In the interview, Sunstein discusses ways that financial advisors can be less noisy — significant since decisions of all kinds involve predictive judgment, in which noise causes errors.

He also reveals the name of the prominent professor who helped him decide between two investment advisors to manage his and his wife’s stock and bond portfolio.

Sunstein is married to Samantha Power, who was recently confirmed to head the U.S. Agency for International Development and previously served as U.S. ambassador to the U.N. in the Obama administration.

In February, Sunstein was appointed senior counselor and regulatory policy officer in the Department of Homeland Security. He was head of the Office of Information and Regulatory Affairs during the Obama presidency.

In our interview, he stresses the need to reduce noise, which, the authors point out, can cause “grave damage.” One method is to think probabilistically — “a good thing to find in an advisor,” he remarks.

Sunstein, who consulted for Apple years ago, also points to the remedy of “decision hygiene,” which may also help advisors deal with an expected increase in digital financial advice.

Even mood affects how much noise people generate and how susceptible they are to it, Sunstein maintains in a short foray into what in psychology circles is known as “bull—- receptivity.”

ThinkAdvisor recently interviewed Sunstein, who was speaking by phone from Washington, D.C. This April he released “Averting Catastrophe: Decision Theory for COVID-19, Climate Change and Potential Disasters of All Kinds” (NYU Press).

“There’s some overlap with ‘Noise’ because if you’re trying to reduce risk, you want to have some decision hygiene so your decisions aren’t noisy or biased,” he notes.

Here are highlights of our interview:

THINKADVISOR: Why is it important to be aware of noise? You write that it’s “rarely mentioned” and “goes unnoticed”; yet “it can cause grave damage.”

CASS SUNSTEIN: Noise is unwanted variability. Each of us is noisy, some more than others. We’re learning that noise is often the most important “character” in human life. Unlike bias, noise isn’t intuitive, which is why we think we discovered a new continent. Noise is like the character in the background of a movie that you don’t pay a lot of attention to but who turns out to be the most important one, as you learn at the end.

“Disagreement among traders makes markets,” you write. Is that an example of noise?

In markets, some people think a certain stock is going to go up; others think it’s not — and that’s part of the [process]. In terms of noise, we’re concerned with unwanted variability [judgments that should be identical], which isn’t good. 

“We’re really focused on reducing biases. Let’s also worry about reducing noise,” you three write. What’s the difference between bias and noise, in addition to what you’ve just mentioned?

With bias, there’s a tendency to go in one direction or another that’s really predictable. But when you enter a noisy system, you’re subject to a lottery. A bias might overvalue the present and undervalue the future, for example. If a person is too optimistic about how the economy will go, that’s a bias.

Please elaborate.

If you’re at a firm where some people discriminate against men and some discriminate against women, those different biases will result in noise, where “different” people are going to be treated differently. 

Where else does noise occur?

In a hospital, if some of the doctors say, “Let’s wait and see if it gets better” and others say, “You need surgery.” Or if a doctor concludes in the morning that a person has a serious illness and probably should undergo extensive testing but in the afternoon, if he’s tired, decides to hold off on testing for six months, that’s a noisy doctor. In a system where the judges are very variable and one says, “five years in jail,” and the other says, “probation,” that’s a noisy legal system. 

Does this type of noise apply to inanimate objects, too?

Yes. A scale that’s sometimes five pounds too high and at other times five pounds too low is noisy. If you believe in that scale, you’re going to be extremely confused about your weight. A noisy scale can screw up your consumption pattern pretty badly even if the average is accurate.
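To make the scale analogy concrete, here is a minimal Python sketch — my own illustration, not from the book — comparing a biased scale that always reads five pounds high with an unbiased but noisy one. The true weight (160 lbs) and the size of the noise are assumed values.

```python
import random

random.seed(0)
TRUE_WEIGHT = 160  # hypothetical true weight, in pounds

def biased_scale():
    # Always reads 5 lbs too high: predictable error (bias), no noise.
    return TRUE_WEIGHT + 5

def noisy_scale():
    # Correct on average, but any single reading can be off by several pounds.
    return TRUE_WEIGHT + random.gauss(0, 5)

biased_readings = [biased_scale() for _ in range(1000)]
noisy_readings = [noisy_scale() for _ in range(1000)]

def mean(xs):
    return sum(xs) / len(xs)

print("Biased scale: mean =", round(mean(biased_readings), 1),
      "spread =", round(max(biased_readings) - min(biased_readings), 1))
print("Noisy scale:  mean =", round(mean(noisy_readings), 1),
      "spread =", round(max(noisy_readings) - min(noisy_readings), 1))
# The noisy scale's average is near the true weight, yet individual
# readings vary widely -- exactly the confusion Sunstein describes.
```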

Your book focuses on forecasting quite a bit. Psychologist Philip Tetlock’s research suggests that “detailed long-term predictions about specific events are simply impossible” but that “superforecasters” can predict short-term events of less than a year, you write. Please discuss the superior talent they have for thinking analytically and probabilistically.

Superforecasters are less noisy — they don’t show the variability that the rest of us show. They’re very smart, but also, very importantly, they don’t think in terms of “yes” or “no” but in terms of probability. They break problems down to their component parts and don’t think holistically.

Care to name any superforecasters?

Here’s a concrete example: In my family in the recent past, we’ve had two investment advisors for our stock and bond portfolios. After my wife and I got married, we each stuck with the advisor we had — both successful and reputable. But they have radically different approaches, and the variability was astounding. When I started talking to them both, the first thing I noticed was noise; and also, one of them was biased.

How did you proceed?

I consulted my superforecaster, [Nobel Prize winning behavioral economist] Richard Thaler, my co-author of “Nudge.” He told me who to go with. He said the other advisor had an approach that wasn’t sensible. So I decided that one of them was right and the other was wrong. 

What qualities make Richard Thaler a superforecaster?

He’s my favorite superforecaster because he thinks about probability; he thinks in terms of long-term trends. Following his advice concerning my own investments has been a good idea. But also, Philip Tetlock found that people whose names you’ve never heard of are often unusually good forecasters because they have characteristics of the sort I’ve described.

What’s an example of probabilistic thinking?

One [of the things] I learned in this [book] collaboration is not to think in terms of [for instance], “Will this stock go up?” “Is this the right investment strategy?” but instead to think: “What’s the probability that this stock will go up?” “What’s the probability that this is a good investment strategy?” So rather than asking, “Is it good to invest in international stocks [versus] domestic stocks?”, it’s better to ask, “What probability do you assign to the proposition that international stocks will outperform domestic stocks in 2022?”

What’s another advantage to reasoning probabilistically?

It’s somehow calming because if you think, for example, the chance international stocks will perform better than domestic stocks in 2022 is 70%, you have clarity that there’s a 30% chance domestic stocks will outperform international stocks. And that’s disciplining.

Should financial advisors develop the ability to think probabilistically? 

That’s a good thing to find in an advisor. To see if certain kinds of investments are likely to do well [the advisor would] break [them] up into their component parts to [determine] if sub-judgments would [apply].

Why do most forecasters come up with inaccurate forecasts? 

When a forecaster is wrong, we think, “Why did they make that mistake?” The better question is: “Why did we think they could get that one right?” Some economic forecasters are unrealistically optimistic with respect to [U.S.] growth, for example. That’s a bias. 

So forecasting decisions can be very noisy as well?

Yes. If you go to five experts and ask about economic performance in 2022, you’ll find noise because people weight variables differently. So the smart thing is to take a consensus forecast rather than [look for] the smartest forecaster in the room or one you particularly like or the one who has the best background. The smartest one might say the economy will grow by 3% in a given year; [another] might say it will grow by zero percent.

So it’s preferable to consult multiple experts on a particular issue?

Yes. It might be better to [ask] 15 people whose business it is and just take the average. A good way to reduce error in noisy economic forecasting is to aggregate disparate judgments.
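As a rough illustration of why aggregation works, here is a small Python sketch — my construction, using a hypothetical “true” growth rate and noise level, not the authors’ data — comparing the error of a single forecaster with the error of a 15-person consensus. Averaging independent noisy judgments shrinks the noise roughly by the square root of the number of forecasters.

```python
import random

random.seed(1)
TRUE_GROWTH = 2.0     # hypothetical "true" growth rate, in percent
NOISE_SD = 1.5        # assumed spread of individual forecasters' judgments
N_FORECASTERS = 15

def one_trial():
    forecasts = [TRUE_GROWTH + random.gauss(0, NOISE_SD) for _ in range(N_FORECASTERS)]
    single_error = abs(forecasts[0] - TRUE_GROWTH)                        # one forecaster
    consensus_error = abs(sum(forecasts) / len(forecasts) - TRUE_GROWTH)  # the average
    return single_error, consensus_error

trials = [one_trial() for _ in range(10_000)]
avg_single = sum(s for s, _ in trials) / len(trials)
avg_consensus = sum(c for _, c in trials) / len(trials)

print(f"Average error, single forecaster: {avg_single:.2f} points")
print(f"Average error, consensus of {N_FORECASTERS}: {avg_consensus:.2f} points")
# Averaging independent judgments cancels much of the noise, which is
# why a consensus forecast tends to beat picking any one expert.
```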

“Noise” is subtitled, “A Flaw in Human Judgment.” You write that “algorithms will replace human judgment, and that is why it must be improved.” Financial advisors have been concerned, though less so nowadays, that robo-advisors will largely replace human ones. Thoughts?

In the fullness of time, the role of algorithms will grow. The likelihood that algorithms will have a larger role in 2023 than in 2021 is 80%. In some domains, algorithms are replacing human judgment. In medicine, often an algorithm will be the determinant of the diagnosis rather than individual clinical judgment.

What about in investing?

From the standpoint of investors, individual investment advisors can be biased and very noisy. That’s a problem. Whether the right recipe for that problem is decision hygiene for investment advisors — which we quite like — or greater reliance on algorithms is TBD. In some domains, people are very squeamish about decision by algorithms.

What’s decision hygiene?

A way of counteracting both bias and noise — reducing the effect of the virus of noise, [analogous to] washing your hands. One way to do that is to aggregate.

In discussing mood as a source of noise, you write that the trait known as “bull—- receptivity” — from Princeton University philosopher Harry Frankfurt’s book, “On Bull—-” — is often brought about by being in a good mood, which makes people more gullible in general. Please discuss. 

With regard to the avoidance of bias and noise, bull—- receptivity is not a positive thing. Some people are more bull—- -receptive than others. They tend to see sense or profundity in nonsense statements such as, “The ecological footprint of wild geese raised under conditions of momentous scrutiny is affirmatively more numerous than the ecological footprint of other kinds of mountain lions.” I just made that up. 

In relation to noise, what are the implications of someone’s having a high level of bull—- receptivity?

I would speculate that bull—- -receptive people are likely to indulge their imaginations, and they might well be prone to being noisy. If someone is receptive to bull—- that’s in the form of seemingly profound or meaningful sentences that actually don’t mean anything at all, that’s a warning sign. 
