When History and Finance Go Wrong

On a fine morning 100 summers ago in Sarajevo, an automobile driver ferrying the Austro-Hungarian Archduke Franz Ferdinand made a wrong turn off the main street into a narrow passageway and came to a stop directly in front of a teenaged member of the Serbian terrorist organization Black Hand. Gavrilo Princip drew his pistol and fired twice, mortally wounding the archduke and his wife. Within hours, World War I was well on its way to seeming inevitable (or at least necessary), all because of a wrong turn. And the idea that history is rational and tells an intelligible story, much less a story of inevitable and inexorable advance, was also shot dead, as dead as the archduke himself.

Soon thereafter, Austria-Hungary invaded Serbia in response to the assassination. Russia, which had pledged to protect the Serbs, duly responded. Germany, bound by treaty, was committed to support Austria when Russia intervened. France was bound to Russia by treaty, prompting Germany to invade neutral Belgium in order to reach Paris by the shortest possible route. Great Britain then entered the fray to support Belgium and, to a lesser extent, France. From there, the entire situation went completely to hell.

Alleged experts on both sides expected a quick outcome. “Over by Christmas” was the consensus view, even though each side expected to be the one prevailing. Forecasters then were no better than today’s, obviously. The “absolute neutrality” of the United States lasted until the spring of 1917, when Germany’s policy of unrestricted submarine warfare caused too much damage to U.S. interests to be ignored.

By 1919, after the war’s end, some 10 million soldiers had died in addition to the archduke and his wife, even as the seeds of Nazism and the Second World War were sown. Western civilization, previously so full of optimism and fairly broad prosperity, had drawn itself into a very tight circle and begun blasting away from very close range for no very good reason.

Planning Fallacy

The lead-up to World War I demonstrates the planning fallacy (outlined by Nobel laureate Daniel Kahneman in his terrific book, “Thinking, Fast and Slow”) writ large. In the same way that the narrative fallacy constructs stories to create a make-believe history of grand designs and chess-master-like wisdom (since winners tend to write the accepted accounts), the planning fallacy projects those same fanciful renderings forward with the idea that the future can somehow be managed—and perhaps controlled—despite the lack of any actual historical support for the notion.

As Adam Gopnik sagely points out in a recent edition of The New Yorker, “[w]hat history actually shows is that nothing works out as planned, and that everything has unintentional consequences.” Indeed, “the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out.”

The planning fallacy is related to optimism bias (think Lake Wobegon—where all the children are above average) and self-serving bias (where the good stuff is deemed to be my doing while the bad stuff is always someone else’s fault). We routinely overrate our own capacities and exaggerate our abilities to shape the future.

Thus the planning fallacy is our tendency to underestimate the time, costs and risks of future actions and at the same time to overestimate the benefits thereof. It’s at least partly why we underestimate the likelihood of bad results. It’s why we think it won’t take us as long to accomplish something as it does. It’s why projects tend to cost more than we expect. It’s why the results we achieve aren’t nearly as good as we expect and why they are so often disastrous.

It’s why I take three trips to Home Depot on Saturdays and why it takes me all day to finish a household chore I expected to take maybe an hour (which then doesn’t work right or look right). As John Lennon put it, “Life is what happens to you while you’re busy making other plans.”

Stay Humble

The key lesson, then, should be interpretively Hippocratic: First and foremost, do no harm. Humility is paramount. As Nate Silver emphasizes in his book, “The Signal and the Noise,” we readily overestimate the degree of predictability in complex systems. We need to promise less and expect less. Things are still not likely to turn out the way we hope, expect or claim, but at least our embarrassment will be less when they don’t—when life happens.

A second lesson is related: Avoid errors. A famous study by the U.S. Institute of Medicine concluded that as many as 98,000 people die each year from preventable medical errors. Since physicians are among the smartest and most highly trained professionals imaginable, desperately trying to do the right thing, the aggregate level of error in human life must be almost unimaginably large. Unfortunately, even though most of us readily acknowledge poor decision-making in the world at large, our bias blind spot means we tend not to expect or recognize the problem in ourselves.

Sadly, errors cost us more than good decisions help. When nearly everyone is smart together, nobody wins. When nearly everyone screws up together, nearly everyone loses, and loses much more than they otherwise would have. The more universal the error, the greater the loss. Because this tragedy of errors is such a major problem, dealing with risk first is essential to good investing, and learning what not to do is more important than learning what to do. As Charley Ellis famously established, investing is a loser’s game much of the time, with outcomes driven more by luck and high transaction costs than by skill. If we avoid mistakes, we will generally win.

The third lesson should be obvious: Plan for the worst even as you hope for the best. In a financial context, this lesson has several particular applications, including the following:

  1. Because we discount future risk too much, we ought to be particularly skeptical about our various estimates of results and outcomes and ought to consider more carefully the consequences if (when!) things don’t turn out as well as we planned.

  2. We should value the benefits of guarantees (when available) more than the benefits of potential. Accordingly, we should typically be concerned more about the costs of failure than about opportunity costs. Income annuities look particularly attractive in this context.

Approval Issue

One reason this general problem is particularly acute for advisors is the so-called “authorization imperative.” Our plans and proposals must be approved by our clients, and we have a stake in getting that approval. This dynamic tempts us to understate risk and overstate potential. Perhaps we see it as easier to get forgiveness than permission, or perhaps it’s just a sales pitch. Or maybe we have convinced ourselves that we’ve got everything covered (confirmation bias!). Whatever the reasons, and whatever the strategic benefits, we run the risk of serious misrepresentation.

Things rarely turn out the way we expect. We never have everything covered. Life happens. Act accordingly. You have been warned.

