Probabilistic retirement income strategies—such as the so-called “4% rule”—are used by advisors more than any other type. In a typical application, a withdrawal rate is deemed “safe” if a series of Monte Carlo simulations suggests a 90-95% success rate over a 30-year retirement. Not surprisingly, understanding and utilizing such strategies requires a working understanding of probability.
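The Monte Carlo framing described above can be sketched in a few lines of code. Everything here is an illustrative assumption rather than advice: the portfolio size, return, volatility, and inflation figures are placeholders, normally distributed annual returns are a simplification, and `simulate_retirement` is a hypothetical helper, not an advisor tool.

```python
import random

def simulate_retirement(portfolio=1_000_000, withdrawal_rate=0.04,
                        years=30, mean_return=0.07, stdev=0.12,
                        inflation=0.03, trials=10_000, seed=42):
    """Estimate the success rate of a fixed real withdrawal strategy.

    Returns the fraction of simulated retirements in which the
    portfolio never runs out of money over the planning horizon.
    """
    rng = random.Random(seed)
    first_withdrawal = portfolio * withdrawal_rate
    successes = 0
    for _ in range(trials):
        balance, spend = portfolio, first_withdrawal
        for _ in range(years):
            balance -= spend                   # withdraw at start of year
            if balance < 0:                    # money ran out: failure
                break
            balance *= 1 + rng.gauss(mean_return, stdev)  # market return
            spend *= 1 + inflation             # inflation-adjust withdrawal
        else:
            successes += 1                     # survived all years
    return successes / trials
```

With these placeholder assumptions, a 4% initial withdrawal typically lands in the success range the article describes; the point of the sketch is that “safe” means only that most simulated paths survive, not that any particular one will.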

Unfortunately, we are all prone to innumeracy, the inability to make sense of the numbers that run our lives. And while we’re bad at math in general, we’re *really* bad at probability. If a weather forecaster says there is an 80% chance of rain and it remains sunny, instead of waiting to see whether it rains 80% of the time over a statistically significant set of such forecasts, we race to conclude, perhaps based upon that single instance, that the forecaster isn’t any good. Therefore, retirees using probabilistic strategies must be carefully prepared to deal with an uncertain future.

Tom Stoppard’s play, *Rosencrantz and Guildenstern Are Dead*, presents Shakespeare’s *Hamlet* from the point of view of two of the Bard’s bit players, the doomed nobodies who become headliners for Stoppard. The play opens with our heroes marking time by flipping coins and getting heads each time. Guildenstern keeps flipping coins and Rosencrantz keeps pocketing them when they come up heads. Significantly, Guildenstern is less concerned with his losses than with puzzling out what the defiance of the odds says about chance and fate. “A weaker man might be moved to re-examine his faith, if in nothing else at least in the law of probability.”

Guildenstern offers, among other possible explanations for the streak, the one that mathematicians and investors should favor: “a spectacular vindication of the principle that each individual coin spin individually is as likely to come down heads as tails and therefore should cause no surprise each individual time it does.” In other words, *past performance is not indicative of future results*.

The probability that a fair coin, when flipped, will turn up heads is 50%, and the probability of two independent events both happening is the product of their individual probabilities. Thus the odds of heads turning up twice in a row are 25% (½ x ½), the odds of it turning up three times in a row are 12.5% (½ x ½ x ½), and so on. Accordingly, if we flip a coin 10 times, we would only expect a set of 10 to end up with 10 heads in a row once every 1,024 sets [(½)^{10} = 1/1024]. Rosencrantz and Guildenstern got heads more than 100 consecutive times. The chance of that happening is (½)^{100} ≈ 7.9 x 10^{-31}. We should thus expect it to happen only about once in 1.27 x 10^{30} sets, or roughly 1.3 million million million million million sets (a 31-digit number).
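The arithmetic above is easy to verify directly. This is a minimal sketch; `streak_probability` is a name chosen for illustration, and exact fractions are used to avoid rounding error.

```python
from fractions import Fraction

def streak_probability(n):
    """Probability of n heads in a row with a fair coin: (1/2)**n."""
    return Fraction(1, 2) ** n

print(streak_probability(2))            # 1/4, i.e., 25%
print(streak_probability(10))           # 1/1024
print(float(streak_probability(100)))   # ~7.9e-31
```

Note that the same function gives the probability of *any* one exact sequence of n tosses, which is the point made below: 100 heads in a row is no less likely than any other particular ordering of 100 flips.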

If anything like that had happened to you (especially with money at stake), you’d be wise to suspect that the probabilities favor a loaded coin. But then again, while 100 straight heads is less probable than 99, which is less probable than 98, and so on, *any* exact order of tosses is as likely (actually, *unlikely*) as 100 heads in a row: (½)^{100}. We notice 100 in a row because of the pattern. Other combinations *look* random to us and thus seem more normal. Looked at another way, if there will be one “winner” within a stadium of 100,000 people, each person has a 1 in 100,000 chance of winning. *But we aren’t surprised when someone does win*, even though the individual winner is shocked.

**Seeking Patterns**

The point here is that the highly improbable happens all the time but is always unexpected. This math explains why we shouldn’t be surprised when the market remains “irrational” far longer than seems possible. But we are. Randomness is difficult for us to deal with. Instead of dealing appropriately with probability, we look for patterns to convince ourselves that the numbers don’t really say what they clearly do. In this regard, we are dumber than rats—literally.

In multiple studies (most prominently those by Edwards and Estes, as reported by Philip Tetlock in his book *Expert Political Judgment*), subjects were asked to predict which side of a “T-maze” held food for a rat. The maze was rigged such that the food was randomly placed (no pattern), but 60% of the time on one side and 40% on the other. The rat quickly “gets it” and waits at the “60% side” every time and is thus correct 60% of the time. Human observers keep looking for patterns and choose sides in rough proportion to recent results. As a consequence, the humans were right only 52% of the time—they (we!) are much dumber than rats. We routinely misinterpret probabilistic strategies that accept the inevitability of randomness and error.
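A quick simulation shows why the rat’s strategy wins. The 60/40 split comes from the studies described above; the rest is an illustrative sketch, with `maze_accuracy` a hypothetical name and the human modeled by the simplest form of “probability matching” (guessing each side in proportion to its frequency).

```python
import random

def maze_accuracy(p_left=0.6, trials=100_000, seed=1):
    """Compare two prediction strategies on a 60/40 random sequence."""
    rng = random.Random(seed)
    maximize_hits = matching_hits = 0
    for _ in range(trials):
        food_left = rng.random() < p_left       # food placed at random
        # Rat-style "maximizing": always pick the more frequent side.
        if food_left:
            maximize_hits += 1
        # Human-style "probability matching": guess left 60% of the time.
        guess_left = rng.random() < p_left
        if guess_left == food_left:
            matching_hits += 1
    return maximize_hits / trials, matching_hits / trials
```

Always picking the 60% side is right about 60% of the time, while matching guesses to the observed frequencies is right only about 0.6 x 0.6 + 0.4 x 0.4 = 52% of the time, the same figures the studies report.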

If we are going to recommend and implement probabilistic retirement planning strategies, we need to prepare for client and advisor difficulty in dealing with such concepts.

When Nobel laureate Daniel Kahneman was asked why probability is such a tough concept for us, he said that “to compute probabilities you need to keep several possibilities in your mind at once. It’s difficult for most people. Typically, we have a single story with a theme. People have a sense of propensity, that the system is more likely to do one thing than the other, but it’s quite different from the probabilities where you have to think of two possibilities and weigh their relative chances of happening.”

We prefer to think linearly, manufacturing a storyline, in effect, with a beginning, middle and end. That’s why we are so susceptible to the “narrative fallacy.” We inherently prefer stories to data. Contingencies and (perhaps random) consequences don’t correspond to the way we like to see the world. We are—pretty much all the time, largely on account of confirmation bias and the narrative fallacy—fitting what we see or assume we see into a preconceived story line. A client who is told only that a withdrawal method should work “90% of the time” will not be equipped for failure or even for plan deviation if and when things don’t turn out as planned.

Dealing effectively with probabilities requires that we recognize the power of the random and contingent. No matter how good a story we have concocted with respect to what we expect to happen, no matter how careful our analysis, stuff happens that can and often does mess up, and mess with, our hopes, dreams and schemes.

In his 1974 Caltech commencement address, the great physicist Richard Feynman talked about the scientific method as the best means to achieve progress. Even so, notice what he emphasizes: “The first principle is that you must not fool yourself—and you are the easiest person to fool.” It *is* easy to fool ourselves, especially when we want to be fooled. We all *really* like to be right and have a vested interest in our supposed rightness. We also want things to turn out well and have an obvious interest in that happening.

Accordingly, we need to check our work constantly and prepare clients extremely carefully to deal with the random and the contingent, because we’re very bad at dealing with mathematical concepts and even worse at dealing with probability.