I’m a big fan of books that help us cut through the noise of our current information overload to better understand how we got where we are and how to do better in the future. A few of my favorites are “Freakonomics” by Steven Levitt and Stephen Dubner; “Stumbling on Happiness” by Daniel Gilbert, which details why our brains are woefully bad at predicting what will make us happy; and “Guns, Germs, and Steel” by Jared Diamond, who explains how “advanced” societies got that way when others didn’t.
Diamond’s answer is that a relatively small number of technological breakthroughs (“tipping points” to use Malcolm Gladwell’s term), one built upon another, yielded huge results such as going from horse-drawn carriages to the moon in under 100 years. One of those advancements was the printing press, which enabled the sharing of information, which led to machines, steam power, the industrial revolution and the greatest explosion of wealth in human history.
Perhaps not coincidentally, in the latest addition to my favorite books list—“The Signal and the Noise”—author Nate Silver also talks about the printing press and how the aftermath of its “information explosion” paralleled that of the Internet. You may recognize Silver’s name: In his New York Times blog, FiveThirtyEight, he predicted the results of the past three national elections with great accuracy.
I found “The Signal and the Noise” to be a brilliant book, explaining a broad range of current phenomena: the extreme political divisiveness in America, how the ratings agencies (and almost everyone else) missed the mortgage meltdown of 2007, why most political pundits are terrible at predicting elections, why virtually no experts foresaw the fall of the Soviet Union, why the majority of published research findings are false, and how the Red Sox defied the odds and missed the 2009 Major League playoffs.
Silver, whose background includes successfully applying statistical analysis to sports betting (think “Moneyball”) and poker, does more than simply articulate the difficulty of prediction in the modern world. He cites some success stories as well, including vast increases in the accuracy of weather forecasting, a winning professional gambler and chess-playing computers. Silver uses these and other examples to offer a set of solutions—or, at least, guidelines—to make better predictions within the noise of too much information.
If anyone can use help separating signals of important future changes from the background noise, it’s financial advisors. Constructing investment portfolios is, after all, an effort to cut through all the noise that we’re bombarded with daily by the news media. Asset allocation models are designed to mitigate our natural impulses and tendencies. Yet, even with models designed to capture long-term upward trends in the markets while minimizing risk, advisors are making assumptions (usually backed by historical data) about how various asset classes will perform—both in terms of returns and risk profile—and how they will perform relative to each other. If we learned anything from the turbulent markets of 2007 and 2008, it’s that outside factors can and do affect both investment returns and their correlations. That suggests clients would be better served by advisors who do a better job of monitoring the factors that affect asset class performance.
According to Silver, “too much” information hampers our predictions in two ways. The first is that our statistical analysis hasn’t caught up with the flood of information. The prevailing school of statistical theory, called “frequentism,” is based on the idea that all subjective judgments should be eliminated from our calculations. I’ll spare you Silver’s lengthy discussion of the flaw in this thinking, except to say that he believes this academic notion isn’t achievable in the real world, and so it leads to faulty conclusions.
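To make the distinction concrete, here is a minimal sketch (my illustration, not an example from the book) of what “eliminating subjective judgment” means in practice. Silver’s preferred alternative, Bayesian reasoning, explicitly folds a prior belief into the estimate; the numbers and the coin-flip setup below are hypothetical.

```python
# A frequentist estimate of a probability uses only the observed data.
# A Bayesian estimate combines the data with a prior belief -- exactly the
# kind of "subjective judgment" that frequentism tries to exclude.

heads, flips = 7, 10  # hypothetical data: 7 heads in 10 coin flips

# Frequentist point estimate: simply the observed frequency.
freq_estimate = heads / flips  # 7/10 = 0.7

# Bayesian estimate with a Beta(a, b) prior expressing a belief that the
# coin is probably close to fair. With a Beta prior and binomial data, the
# posterior is Beta(a + heads, b + tails), whose mean pulls the estimate
# back toward the prior.
a, b = 10, 10  # prior "pseudo-counts" -- a subjective choice
posterior_mean = (a + heads) / (a + b + flips)  # 17/30, about 0.567

print(f"frequentist: {freq_estimate:.3f}, bayesian: {posterior_mean:.3f}")
```

The point of the contrast: with little data, the two schools can disagree substantially, and the Bayesian answer depends openly on the prior you chose, while the frequentist answer hides no such choice but can swing wildly on a small sample.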