
Gambler's Ruin

This blog post started out as a Coffee Time Challenge puzzle that I posted to Facebook. We'll look at this puzzle first, then go deeper down the rabbit hole and look into the wisdom of trying to double your money in a casino.

You have $100 in your pocket and meet a friend who also has $100. He offers you a wager: Flip a coin, with winner takes all.

Unbeknownst to him, you have a biased coin in your pocket, which falls heads 60% of the time. He will let you flip, and he will let you call. You feel smug.

Just before you flip, he offers to change the rules. Instead of flipping once for the entire $100, you flip for $50 at a time; win or lose, you carry on flipping until one person has all the money and the other has none (you always use your coin, and always call).


Question: Is it better to accept the modified rules, or does it not matter?

Answer

The answer is, of course, that it’s better to accept the modified rules. Let’s see why.
For the unmodified rules, it’s just a (biased) coin flip. You are going to double your money 60% of the time.
For the modified rules game, things are a little more complicated; various things can happen. Two heads could come up in a row, for an instant win. Two tails could come up, for an instant loss. Or, one of each could come up (in either order), and you are back where you started. These possibilities are enumerated in the tree diagram below.
Rather than looking at the 60% coin, we'll look at the generic case and define p to represent the probability of winning a coin toss; so, for instance, the 60% coin would be p = 0.6. Since somebody has to win, the chance of losing is therefore 1 - p.
Let's define D to represent the probability of doubling your money, and we can come up with this equation:

D = p² + 2p(1 - p)D

We can win straight away with two heads (p²), or we can revert back to where we started, two ways, each with probability p(1 - p). We can write this as a self-referencing equality, with D appearing on both sides. The easy way to think about it is that our chance of winning is the chance of getting two heads, OR the chance of getting back to the start multiplied by the chance of winning from there (which is D again). Simple rearrangement gives the generic result:

D = p² / (1 - 2p(1 - p)) = p² / (p² + (1 - p)²)
Plugging the value for p=0.6 into this equation gives the result D ≈ 69.23%, a big improvement over the basic 60% chance with a single flip.
You can improve your odds from 60% to 69% by taking two flips!
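If you want to check the algebra numerically, here's a minimal Python sketch (the function name is my own):

```python
# Probability of doubling your money under the two-flip ($50-at-a-time) rules:
# derived above as D = p^2 / (p^2 + (1-p)^2).

def double_prob(p: float) -> float:
    """Chance of winning two units before losing two, flipping one unit at a time."""
    return p**2 / (p**2 + (1 - p)**2)

print(double_prob(0.6))   # 0.6923..., up from the 0.6 single-flip chance
print(double_prob(0.75))  # 0.9
print(double_prob(0.5))   # 0.5, slicing doesn't help a fair coin
```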
Let's see how this graphs for different values of p:
We can see how the value of p = 0.6 corresponds to the 69.23% chance of doubling our money. We can see that, the more biased the coin, the better our chances become. If we have a p = 0.75 coin, for instance, we have a 90% chance of winning. The graph is rotationally symmetric (which of course it should be; there are two players, and if you're not the winner, you are the loser).
Also, for an unbiased coin, at p=0.5, it does not matter if you make one flip, or two (or, in fact, a million flips). Again this should be obvious from the symmetry argument.
If you are a regular reader of my blog, a curve of this shape might look familiar. I derived similar win/loss curves for the chance you might win a tennis match based on the probability you can win a single point.
In the words of Dr Seuss, from the classic Ten Apples Up On Top, what would happen if we sliced the problem into three flips, or four … ?
How big can we make the advantage? Let's take a journey …

Random Walks

We're going to start our journey with a look at random walks. These are sometimes given the colloquial name of drunken man walks. They occur in many places in Maths and Science. We'll start with an unbiased example.
Imagine there is a man, and he has a fair coin. He flips the coin, and if it lands heads, he walks to the left, and if it lands tails, he walks to the right. (In the drunken man version of this tale, the man is so inebriated he staggers around in a random direction with no bias).
Above is a little simulation. Click "Start" to start the man meandering. He starts off in the middle, and dithers around. Over time, he might make a few excursions to the sides, and the longer you watch, the larger these deviations might become, but, on average, he's centered around the starting position. Again, our thought experiment about symmetry tells us this is what we would expect. The coin is fair, there is no distinction between the sides, and the coin has no memory. It's the Gambler's Fallacy to think that "after a long run of heads, tails is more likely to happen!" This is totally bogus. There is no such thing as the conservation of luck!
There is no such thing as the conservation of luck!
Now, let's spice up the situation. Instead of walking side-to-side on an infinite line, imagine there are constraints. Mathematicians like to call these things boundary conditions or absorbing barriers. If you reach one of these, it's game over.
Sticking with our drunken man, imagine he is walking on the top plateau of a steep-sided island. It's dark, and he has no idea where he is. If he falls off either side, he will instantly die; there is no 'undo' or going back. Converting this to a casino analogy, it's like going bust when gambling. Once you run out of chips, the game ends. For our drunken man, there is no way to climb back up the cliff*. (In the casino analogy, an example of an absorbing state on the opposite side is "quitting when I've doubled my money").

Image: Ian Mercer
*Falling off cliffs is not a pleasant experience. I have first-hand knowledge. Thirty years ago, I fell off The Grand Canyon (it's a long story, and an even longer rescue chronicle involving a speed boat, a mustang, a helicopter, park rangers, and rope belay). I broke my pelvis into a few pieces, hurt my back, smashed my shoulder, split my right foot open, broke my left ankle and right wrist, got many bruises, numerous lacerations, and had more cactus spines inserted than I could count. Thanks to the ER in Las Vegas, and some good friends and family, I was able to make a full recovery, and not only walk again, but I've returned many times to the canyon to hike it. Last year, before I was diagnosed with cancer, I hiked the canyon rim-river-rim-river-rim (start on the South Rim; hike down to the bottom, then up the other side to the North Rim, all in one day; sleep, then turn around and do it the other way the next day). It's an epic and magical journey if you ever get the chance (be sure to train well; it's 50 miles, and close to 21,000 ft of elevation change). I was lucky enough to do the trip in the company of some great friends.
Below is a simulation of a man on a cliff. If you click start, and have patience, you will eventually see him fall.
So, how long do we expect the man to last? If we're in a casino with $100 and playing a totally fair game (yeah, good luck with that!), how long, betting $5 a time, will we be seated at the table before either going bust, or doubling our money?
Rather than drawing a gruesome mega tree of all the possible movements, let's take a different approach using something called linear summation. Below is a snapshot of our man, let's call him Stan, on top of the island. The island is w units wide, and he's currently standing at position n (where the datum for our experiment is zero on the left, and the cliff on the right is at w units).
What we need to calculate is the probability that, from position n, Stan will fall off the cliff at position w. We'll call this Pn.
There are a couple of edge cases that are 'gimmies'. If Stan is already at the right edge, he's 100% certain of falling off this edge, so Pw = 1. By similar logic, if he's at the left cliff already, he's going to fall off that cliff, so there's zero chance he'll fall off the right cliff! I know that sounds a little weird, but stick with me. From this logic we can say P0 = 0 (remember, we're calculating the probability that he will fall off the right cliff only at this point).
What happens if Stan is somewhere in the middle (anywhere that is not one of the two boundary points)? For 0 < n < w, Stan has two choices: he can move to the left (after which the ultimate probability that he eventually falls off the right cliff is Pn-1), or he can move to the right (Pn+1). Each of these has a 50% chance. The move to the left takes him to position n-1, and the move to the right takes him to position n+1. There is no option to stay where he is, so we can write the following formula:

Pn = ½ × Pn-1 + ½ × Pn+1

With rearrangement we get:

Pn+1 - 2Pn + Pn-1 = 0
This is an equation that describes the next term as a linear combination of a couple of the previous terms. Mathematicians call these linear recurrences, and there are standard ways to solve them. If you are familiar with the technique, note there is no inhomogeneous part. If you are not familiar with the technique, there is no easy way to follow along other than to learn the technique first, so here are the Cliff Notes (see what I did there?):
The characteristic equation is:

x² - 2x + 1 = (x - 1)² = 0

Which has a double root at x = 1, so the general solution is just:

Pn = a + bn

Putting in the two boundary conditions (P0 = 0 and Pw = 1) gives two simultaneous equations, and we can see that a = 0 and b = 1/w, resulting in the very simple final result of:

Pn = n/w
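Here's a quick Monte Carlo sketch (names and parameters are mine) you can use to sanity-check that result:

```python
import random

def fall_right_prob(n: int, w: int, trials: int = 100_000) -> float:
    """Estimate the chance an unbiased walker starting at n falls off
    the right edge (position w) before the left edge (position 0)."""
    falls = 0
    for _ in range(trials):
        pos = n
        while 0 < pos < w:
            pos += 1 if random.random() < 0.5 else -1
        falls += (pos == w)
    return falls / trials

# Theory says Pn = n/w, so n=3, w=10 should print something close to 0.3.
print(fall_right_prob(3, 10))
```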
We've now got the probability that Stan will fall off the right cliff. Using symmetry, we can simply flip this to calculate the probability he'll eventually fall off the left cliff. If Stan is at position n from the left cliff, he is at position w-n from the right cliff, so the probability he falls off the opposite cliff is (w-n)/w.
Logically ORing these, we add the probabilities to find the chance Stan will fall off the right cliff or the left cliff:

n/w + (w - n)/w = w/w = 1

What do you notice about this equation? That's right, it equals one. It's a certainty that poor Stan is going to die! Eventually he's going to fall off the right edge or the left edge. Nobody lives forever, Stan.
It's worse than that for poor Stan. Let's imagine that the island is really wide. Really, really wide (tending to infinitely wide), and say he starts at position n. Then, for any finite value of n:

(w - n)/w → 1 as w → ∞

Even with the far cliff pushed out to infinity, he's still certain to fall off the near one.
Poor old Stan will never escape! It's tough to be a stick figure.

Life Expectancy?

How long can Stan expect to meander around before taking the plunge?
It's possible to use the same technique as above, this time counting the number of 'steps' that Stan will take. We can derive an equation for the expected number of steps taken from position n; we'll call it Sn.

If he starts at either of the extremes, he needs to take no steps to die, so S0 = Sw = 0. From position n, he takes one step (the '1 +' below), followed by the expected number of steps from the place he ends up:

Sn = 1 + ½ × Sn-1 + ½ × Sn+1
This time there is an inhomogeneous term, and the math gets a little messy, so I'll jump straight to the solution:

Sn = wn - n²

Shuffling this gives an easier to interpret, and fascinating (geometric), result:

Sn = n(w - n)

The expected lifetime for poor Stan, in steps, is the product of his distances from the two edges!
If it's not completely obvious from symmetry: to last as long as possible, Stan should start in the middle of the island.
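Again, a small simulation sketch (same assumptions as before) if you want to convince yourself:

```python
import random

def mean_lifetime(n: int, w: int, trials: int = 20_000) -> float:
    """Average number of steps a walker starting at n survives before
    being absorbed at 0 or w."""
    total = 0
    for _ in range(trials):
        pos, steps = n, 0
        while 0 < pos < w:
            pos += 1 if random.random() < 0.5 else -1
            steps += 1
        total += steps
    return total / trials

# Theory says Sn = n(w - n); starting mid-island with w=10, n=5 gives about 25.
print(mean_lifetime(5, 10))
```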

Back to the casino …

Let's apply what we've learned from Stan to betting. Assume we could find a fair game in Vegas (paying out even odds on a 50:50 chance). Suppose Stan's sister, Lucy, goes into a casino with n dollars, places $1 bets, and plays until she either goes broke (falling off the left cliff), or quits after winning x dollars (where w = n + x), which is like falling off the right cliff at w.
The probabilities that Lucy goes broke, or goes home a winner, are:

P(broke) = x/(n + x)    P(winner) = n/(n + x)

And the expected number of bets before one of these things happens is:

Sn = n(w - n) = n × x
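This answers the question posed earlier: walking in with $100 and betting $5 a time means n = 20 betting units, and doubling your money means winning x = 20 more. In a fair game, the chance of going bust is 20/40 = 50% (as symmetry demands), and you can expect to place n × x = 20 × 20 = 400 bets before one or the other happens.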
I know this sounds kind of obvious but, like her brother on the infinitely wide island, if Lucy starts with a finite amount of funds and never sets a quitting target, she's eventually going to go broke (as x → ∞, the chance of ruin, x/(n + x), tends to 1). If you play forever, and never stop, you'll eventually go broke. You have to quit sometime whilst you are ahead and cash out!

Gambler's Ruin

Casino games, however, are not fair. There is a house edge. This is how casinos make money. Below is a little simulation showing this. It's as if Stan were more likely to walk left than right. When you click 'Start' below, it plays a game (the yellow trace). It repeats this over and over again, and the red line shows the average over many games. When the game is fair, the red line hovers over the middle. You can use the buttons to adjust the bias of the game, and watch the red line move up or down. Even if the game is biased just a slight amount, over time, it moves in that direction.
The simulation above starts off at zero, and goes above and below the line. In reality, however, you walk into a casino with finite resources, and the casino has, essentially, infinite resources. The constant whittling away from the (even if small) expected loss on each bet puts you onto a trajectory that drifts you to zero. Yes, there are swings in wins/losses, but the longer you are into the game, the larger and larger the 'luck' needs to be to pull you away from the black hole of bankruptcy. Downward drift dominates over luck.
Downward drift dominates over luck!
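Here's a rough sketch of the sort of thing the red line is computing (the parameters are my own, not the embedded simulation's actual code):

```python
import random

def average_trace(p: float, steps: int, games: int = 1_000) -> list[float]:
    """Mean bankroll trajectory over many games; each bet wins one unit
    with probability p, otherwise loses one unit."""
    totals = [0.0] * steps
    for _ in range(games):
        pos = 0
        for t in range(steps):
            pos += 1 if random.random() < p else -1
            totals[t] += pos
    return [s / games for s in totals]

# With a house edge (p < 0.5) the average drifts down by (2p - 1) per bet:
trace = average_trace(p=18/38, steps=200)
print(trace[-1])  # roughly 200 * (2*18/38 - 1), i.e. about -10.5
```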
If you visit Vegas, set yourself a target, and if you reach it, cash out! If you carry on playing, you are eventually going to lose. The house edge will always win out over your lucky streak.
I'm not saying gambling is not fun. I enjoy the odd flutter myself. There's a lot of psychology at play. I've written quite a long article about why people gamble. The bottom line is that winning feels great with its associated dopamine rush.
Know (and set) your limits. If you are susceptible to addictive behavior, please stay away from temptation.
Let's go further down the rabbit hole with a little more math and see just how much the house odds work against you.

Biased Games

Using the principles above, it's possible to derive an equation for the probabilities of reaching one of the absorbing states in a biased game. We'll call these absorbing levels a and b.

Explicitly: a is the lower absorbing level (going bust), b is the upper absorbing level (your quitting target), x is your starting value (with a ≤ x ≤ b), and p is the probability of winning each individual bet (so 1 - p is the chance of losing one).

We get the following results (writing r = (1 - p)/p for convenience):

P(reach b before a) = (1 - r^(x-a)) / (1 - r^(b-a))

P(reach a before b) = (r^(x-a) - r^(b-a)) / (1 - r^(b-a))

And the number of expected steps (how long you'll stay at the table) is:

E(steps) = [(x - a) - (b - a) × P(reach b before a)] / (1 - 2p)
When p = (1 - p), the game is fair (a simpler way of saying this is p = 0.5). In that case the formulas above divide by zero, so instead we revert to the fair-game results: the probability of reaching b is (x - a)/(b - a), and the mean number of steps is (x - a)(b - x).

Try it out

Rather than forcing you to type those equations into Excel to test them out, I've embedded a calculator below. You can try inputting values for a, b, x, p and see the results.

a: Lower bound absorbing value
b: Upper bound absorbing value
x: Starting value
p: Probability of winning game
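Here's a minimal Python sketch of the same calculation (function names are mine), in case you'd rather script it than use the embedded calculator:

```python
def hit_upper_prob(a: float, b: float, x: float, p: float) -> float:
    """Probability of reaching b before a, starting at x, betting one unit
    at a time with win probability p."""
    if p == 0.5:
        return (x - a) / (b - a)
    r = (1 - p) / p
    return (1 - r**(x - a)) / (1 - r**(b - a))

def expected_bets(a: float, b: float, x: float, p: float) -> float:
    """Expected number of bets before hitting either absorbing level."""
    if p == 0.5:
        return (x - a) * (b - x)
    return ((x - a) - (b - a) * hit_upper_prob(a, b, x, p)) / (1 - 2 * p)

print(hit_upper_prob(0, 20, 10, 18/38))  # ~0.2585: $10 -> $20 at roulette
print(expected_bets(0, 20, 10, 18/38))   # ~92 spins on average
```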

Interesting Results

Let's look at the consequences of these equations. Going back to our original coin-flipping game, we can confirm that, with one flip, the chance of doubling our money is 0.6, and if we do two flips it is 0.6923 (set a=0, b=4, x=2). For three slices our odds increase to 0.7714, and if we slice the $100 into bets of $10 a time we get to an impressive 0.9830 chance of doubling our money instead of going bankrupt. We can get to over 99.99% chance with 23 slices.
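You can reproduce that progression with the hit_upper_prob sketch from the calculator section (repeated here so the snippet runs standalone):

```python
def hit_upper_prob(a, b, x, p):  # as in the calculator sketch above
    r = (1 - p) / p
    return (1 - r**(x - a)) / (1 - r**(b - a))

# Betting $100/k at a time (k slices) with the p = 0.6 coin: a=0, b=2k, x=k.
for k in (1, 2, 3, 10, 23):
    print(k, hit_upper_prob(0, 2 * k, k, 0.6))
# 1 -> 0.6000, 2 -> 0.6923, 3 -> 0.7714, 10 -> 0.9830, 23 -> 0.99991...
```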
Now let's look at a casino favourite, the roulette wheel.
There are 38 spaces on an American* roulette wheel, and if the player makes, say, a bet on 'red', there are 18 numbers that will pay out. This gives a p of 18/38 ≈ 0.473684.
*An American roulette wheel has both 0 and 00 to improve the house odds. Most European wheels still have just one zero, so the denominator is 37, for a p ≈ 0.486486.
If we start off with $10, and make $1 bets, what is the chance we will double our money before going bust? (Have a guess before scrolling down).
There is just a 25.9% chance you will double your money before going bust. (The chances of tripling your money to $30 are just 8.27%.)
If you start with $10 in roulette, betting $1 a spin, and carry on until you either double your money, or go bust, the chance you will double your money is just 25.9%
Recall the drift chart. The longer you carry on, the more the expected loss hits you with each spin. You are constantly battling a headwind just to get to the break-even point, never mind to profit.
Paradoxically, if the odds are against you, slicing your bets thinner to 'extend your play' works against you (if your goal is to either double your money or go home empty). If you have $10 to play on slot machines (let's say with an average payout of p = 0.48), and you play the $1 slots, there's a 31% chance of doubling your money before going bust. If you play the quarter slots with the same balance, your chances of doubling before going bust reduce to just 4%. Is it any wonder why penny slots are still so popular in Vegas?*
*Of course for slots, there is a high variance in the payout matrix. Sure, on average, the value for p might be 0.48, but for slots there are massive prizes, and no prizes. A slots player is playing for the dream of that big win!
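To see those slots numbers with the formula, measure everything in betting units (again reusing the hit_upper_prob sketch, with the assumed p = 0.48):

```python
def hit_upper_prob(a, b, x, p):  # as in the calculator sketch above
    r = (1 - p) / p
    return (1 - r**(x - a)) / (1 - r**(b - a))

# The same $10 bankroll, measured in betting units:
print(hit_upper_prob(0, 20, 10, 0.48))  # $1 slots: 10 units,  ~0.31
print(hit_upper_prob(0, 80, 40, 0.48))  # 25c slots: 40 units, ~0.04
```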

What if I have a bigger bank roll?

If you watch James Bond movies, you'll know that casinos are playgrounds for the rich. If you have a bigger bank balance, surely it's easier to weather the storm of a patch of 'bad luck'? How would Bill Gates, or Mark Zuckerberg, fare in a casino?
Let's stick with roulette. Your goal is to win $10 (or go bust trying), by betting $1 a spin. What is the sensitivity of the probability of winning $10 to your starting balance? We've already seen that the chance of going from $10 to $20 is 25.9%. What is the probability of earning $10 if you have $100 in your wallet (going from $100 to $110)? How about going from $1,000 to $1,010? What about from $1,000,000 to $1,000,010? The results might surprise you.
Bank Roll   Target    Probability of hitting target
$1          $11       0.05081373132741231
$5          $15       0.17980822753180667
$10         $20       0.25853341295657256
$50         $60       0.3475059014843332
$100        $110      0.34867240790301957
$500        $510      0.3486784400999991
$1,000      $1,010    0.34867844009999915
$5,000      $5,010    0.34867844009999915
And I have to stop there, as I'm running out of precision! If you have a bankroll of $5,000 to risk, your odds of earning $10 are only a few ten-thousandths of a percent better than someone with a bankroll of $100. Even a billionaire, even if they risked their entire net worth(!), is not going to measurably improve their odds of gaining $10 before going bust. Once you start heading down that road into loss-land, it's going to take an unfeasible amount of luck to get back to just your starting point against the constant headwind of an expected loss on every spin.

Why?

Let's see why. Recall our formula from before. With a = 0, a starting bankroll of x, and a target of b = x + 10, the probability of winning $10 is:

P = (1 - r^x) / (1 - r^(x+10)), where r = (1 - p)/p
When p = 18/38, then (1 - p) = 20/38, so (1 - p)/p = 10/9
10/9 is larger than 1, and it appears as the base of an exponent, so as x gets larger and larger, the exponential terms dwarf the 1s, and we can ignore them.
We're left with a simple expression for the probability:

P ≈ (10/9)^x / (10/9)^(x+10) = (9/10)^10 ≈ 0.34868
Note that nowhere in this expression does x appear! No matter how large x becomes, there is a limit to the winning percentage. Even with a trillion dollars at risk, you are not going to do better.
Even with a trillion dollars at risk, you are not going to do better!
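You can confirm that this limit is exactly the plateau value in the table above:

```python
from fractions import Fraction

# The limiting chance of winning $10 at red, however large the bankroll:
print(float(Fraction(9, 10) ** 10))  # 0.3486784401
```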