On August 18, 1913, at the famous casino in Monte Carlo, Monaco, the roulette ball landed on black 26 times in a row. Gamblers, seeing the streak grow, increasingly shifted their bets to red, feeling that a run of red was due to counterbalance the remarkable streak of black. They lost millions.
Gambler’s Fallacy I – Forgetting That The “Coin Has No Memory”
Gamblers often believe that after a long streak of one outcome, the probability of a different outcome has increased. Sports commentators make the same error when they say that a batter in a slump is “due” for a hit. Psychologically, they feel that an outcome opposite to the streak is needed to balance the sample and bring the data into conformity with the known long-term average.
In Gambler’s Fallacy I, the gambler is looking backwards and seeking balance. There is a somewhat related phenomenon, the Law of Large Numbers, that does provide balance, but only looking forward. The Law of Large Numbers says that, while smaller sets of data may deviate from the true average, as more and more data are accumulated, the cumulative average will come closer and closer to the true average. But this does not involve any short-term shift in probability: the probability that the coin will land heads remains 50%, even if 75% of the last 25 tosses came up tails. The coin does not know what happened on previous tosses and continues to land heads with a fixed probability of 50% – hence the phrase “the coin has no memory.”
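To see both halves of this in action, here is a minimal simulation sketch (assuming nothing beyond NumPy and a fair coin): each toss is 50/50 regardless of what came before, yet the cumulative proportion of heads settles toward 0.5 as tosses accumulate.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
tosses = rng.integers(0, 2, size=10_000)                        # 1 = heads, 0 = tails
cumulative_prop = tosses.cumsum() / np.arange(1, tosses.size + 1)

for n in (10, 100, 1_000, 10_000):
    print(f"After {n:>6} tosses, proportion of heads = {cumulative_prop[n - 1]:.3f}")

# The drift toward 0.5 comes from new tosses diluting early imbalances,
# not from any toss "compensating" for an earlier streak: each toss is still 50/50.
```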
The term “Gambler’s Fallacy” is also used more loosely to describe situations in which decision-makers operating under uncertainty “tilt” their decisions to counterbalance short streaks. For example,
- In judging a close pitch, baseball umpires are 5% less likely to call a strike if they have called the two previous pitches strikes.
- US judges in refugee asylum cases are 3% more likely to reject asylum if they approved the previous case. This effect is stronger if there was a streak of several approvals before the current case.
- Loan officers in India were up to 23% less likely to approve a loan application if they had approved the prior loan application.
These cases are described in detail in “Decision Making Under the Gambler’s Fallacy: Evidence from Asylum Judges, Loan Officers, and Baseball Umpires,” by Chen, Moskowitz, and Shue, in the Quarterly Journal of Economics (2016).
The Statistics 1 course offered by Statistics.com includes an exercise in which students write down an invented sequence of 50 coin tosses. It is almost always the case that students cut their runs of heads and tails too short. A class’s worth of actual coin tosses will yield a few runs of 8-9 heads or tails in a row. The invented sequences rarely go beyond 5 or 6 in a row.
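For the curious, here is a quick sketch of that exercise in Python (the class size of 30 and the random seed are arbitrary assumptions, not part of the course): it tallies the longest run of heads or tails in each of 30 genuinely random sequences of 50 tosses.

```python
import numpy as np

def longest_run(seq):
    """Length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for prev, cur in zip(seq, seq[1:]):
        current = current + 1 if cur == prev else 1
        best = max(best, current)
    return best

rng = np.random.default_rng(seed=2)
class_runs = [longest_run(rng.integers(0, 2, size=50)) for _ in range(30)]  # 30 students, 50 tosses each
print("Longest run in each of the 30 real sequences:", sorted(class_runs))
print("Sequences containing a run of 7 or more:", sum(r >= 7 for r in class_runs))
```

Runs of 7 or more show up regularly in the real sequences, while invented sequences tend to top out at 5 or 6.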
Gambler’s Fallacy II – The Hot Hand
With Gambler’s Fallacy II, the gambler compensates in the opposite direction from Gambler’s Fallacy I. In this scenario, the gambler observes a streak and believes that the probability of a positive or negative event has shifted in the direction suggested by the streak. The gambler thinks the basketball player with the hot hand is more likely to score than their long-term average would suggest, or that the baseball player who is mired in a slump is less likely to get a hit. In Gambler’s Fallacy I, the gambler shifts their estimate in the opposite direction of the streak; in Gambler’s Fallacy II, the adjustment is made in the same direction.
A famous illustration of this is Deming’s “Funnel Experiment.”
A ball is dropped through a funnel and its landing spot noted; there is some variability in where it lands due to chance. If several balls fall off-target in the same direction, a common reaction is to conclude the funnel has become biased in that direction, and to compensate by shifting the funnel back toward the target. The catch: people are likely to interpret a random clustering of several off-target balls as system bias. In such a case, moving the funnel to compensate actually exacerbates variability.
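A rough simulation sketch of that idea (not Deming’s physical apparatus; a one-dimensional funnel with Gaussian noise is assumed here) compares leaving the funnel alone with “compensating” after every drop. The compensation rule roughly doubles the variance of the landing spots.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n_drops, sigma, target = 100_000, 1.0, 0.0
noise = rng.normal(0.0, sigma, size=n_drops)

# Rule 1: never touch the funnel; every drop lands at target + noise.
rule1_landings = target + noise

# Rule 2: after each drop, move the funnel opposite to that drop's error.
funnel = target
rule2_landings = np.empty(n_drops)
for i in range(n_drops):
    landing = funnel + noise[i]
    rule2_landings[i] = landing
    funnel -= landing - target          # "compensate" for the observed error

print("Variance when hands are kept off the funnel:", round(rule1_landings.var(), 3))
print("Variance when compensating after each drop: ", round(rule2_landings.var(), 3))  # ~2x larger
```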
The implications of the experiment extend beyond well-defined manufacturing processes, the original focus of the quality movement in the 1950s. In any process involving human actions or decisions, the normal tendency to interpret random noise as signal, and to intervene on the basis of that noise, actually increases overall system variability. A key component of the quality assurance philosophy is that system variability be reduced so as to better identify the factors that truly affect a process. Statistical process control charts are one way to force people to keep their hands off a process when they otherwise might be tempted to tinker.
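As a sketch of that “hands off unless the chart says otherwise” discipline, here is a deliberately simplified individuals-style control chart in Python. Real charts typically estimate sigma from moving ranges rather than the raw standard deviation, so treat the details below as assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=4)
baseline = rng.normal(10.0, 2.0, size=50)           # in-control history of the process
center = baseline.mean()
sigma = baseline.std(ddof=1)
upper, lower = center + 3 * sigma, center - 3 * sigma

new_observations = rng.normal(10.0, 2.0, size=20)
for i, x in enumerate(new_observations, start=1):
    if x > upper or x < lower:
        print(f"Point {i}: {x:.2f} falls outside ({lower:.2f}, {upper:.2f}) -- investigate")
    # Otherwise: leave the process alone; the variation is ordinary noise.
```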
Gambler’s Fallacy III – Identifying Spurious Meaning from Chance Events
In this fallacy, which is really a variant of the second, the gambler uses information from a small set of prior chance events to make decisions about future bets, bets that should be based instead on knowledge of the system (e.g., a coin being flipped). In its most extreme form, a positive chance outcome is associated in the gambler’s mind with some earlier, equally random event: if a big roulette payoff for a corner bet (4 numbers out of the 38) occurred right after a black/red/black/red sequence, watch for that same black/red/black/red sequence to occur again, and repeat the winning bet.
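A small simulation sketch (assuming an American wheel with 38 pockets, 18 red and 18 black; the pocket numbering below is arbitrary) makes the point concrete: the chance that a corner bet wins is about 4/38 whether or not the previous four spins happened to come up black/red/black/red.

```python
import numpy as np

rng = np.random.default_rng(seed=5)
n_spins = 1_000_000
# Encode pockets 0..37; call 0-17 "red", 18-35 "black", 36-37 "green".
spins = rng.integers(0, 38, size=n_spins)
color = np.where(spins < 18, "R", np.where(spins < 36, "B", "G"))
corner_win = spins < 4                     # any fixed set of 4 numbers works as the corner bet

# Spins whose previous four outcomes were black, red, black, red (in that order):
pattern = (color[:-4] == "B") & (color[1:-3] == "R") & (color[2:-2] == "B") & (color[3:-1] == "R")
after_pattern = corner_win[4:][pattern]

print("Unconditional corner-bet win rate:", round(corner_win.mean(), 4))     # ~4/38 = 0.105
print("Win rate right after B/R/B/R:     ", round(after_pattern.mean(), 4))  # ~ the same
```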
We all know people who carry with them quirky ideas of what will and won’t work based on ephemeral evidence. My great uncle, highly educated and a senior government financial official, accumulated over his lifetime a list of food combinations that he had to avoid. Cucumber and melon together, or black beans and pork, for example. The reason? His digestion often gave him trouble, and when he had a bad episode he would think back on what he had eaten in the previous 24 hours, identify the food combinations he considered likely culprits, and stay away from them thereafter.
What ties together all the Gambler’s Fallacies is the natural human tendency to underestimate the role that random chance can play in fooling us with small samples. Much of statistics is concerned with counteracting that tendency.