## Another gambling puzzle

A forum for good logic/math puzzles.

Moderators: jestingrabbit, Moderators General, Prelates

### Another gambling puzzle

Given a casino with total assets A and a gambler with total assets B, where the gambler plays the St Petersburg game against the casino until one or the other is bankrupt: what is the price to enter (as a function of A and B) such that the probability of the gambler "breaking the bank" equals the probability of the gambler losing?

The St Petersburg game:

The pot starts at $1 and we toss a fair coin. As soon as a tail appears, the gambler wins what is in the pot and the game ends. After every head, the pot is doubled and we toss again. If you work out the maths, the expected value of playing the game is infinite: so any finite price to play "should" be a good deal for the gambler. However, this expected value is built from astronomically small probabilities of winning astronomically large amounts of money. With a finite casino bank, almost all of the (infinite) expected value cannot be realised. On the other hand, with a finite bankroll the gambler cannot stay in the game long enough to have a decent chance of winning any of the really huge amounts. If the price to enter is $1, then the gambler cannot lose and will always clean out the bank. According to http://en.wikipedia.org/wiki/St._Petersburg_paradox, if the price to enter is $10 then the gambler needs to play about 1,000,000 games to average $10 per game: if the gambler doesn't have a few million dollars to spare, he is likely to be wiped out before playing 1,000,000 games.
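A quick way to see the effect of a finite bank is to simulate a single round with the payout capped at the casino's assets A. This is a rough sketch; the rule "the casino pays at most A" is an assumption about how the cap works:

```python
import random

def capped_round(bank_a, rng):
    """Play one St Petersburg round; the casino pays out at most bank_a."""
    pot = 1
    while rng.random() < 0.5:  # heads: double the pot and flip again
        pot *= 2
    return min(pot, bank_a)

rng = random.Random(0)
for a in (10, 1_000, 1_000_000):
    mean = sum(capped_round(a, rng) for _ in range(200_000)) / 200_000
    print(a, round(mean, 2))  # the mean payout grows only logarithmically in a
```

The simulated mean grows very slowly with the bank size, which is the point: the "infinite" value lives almost entirely in payouts the casino could never cover.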
mward

Posts: 69
Joined: Wed Jun 22, 2011 12:48 pm UTC

### Re: Another gambling puzzle

This is just a function of the casino's assets (A).

Spoiler:
Replace the expected returns with an upper bound:
\begin{align*}
E &= \sum_{k=1}^{\infty}\left(\frac{1}{2}\right)^k P \text{ for } \{P = 2^{k-1} \text{ when } 2^{k-1} \le A,\text{ otherwise } A\} \\
&= \sum_{k=1}^{\log_2(A)+1} \left(\frac{1}{2}\right)^k 2^{k-1} + \sum_{k=\log_2(A)+2}^{\infty}\left(\frac{1}{2}\right)^k A\\
&= \sum_{k=1}^{\log_2(A)+1} \frac{1}{2} + \left(\frac{1}{2}\right)^{\log_2(A)+1} A\\
&= \left(\log_2A\right)/2+1
\end{align*}

Assuming I didn't mess up, that's how much the game is worth. I might be off by a dollar or two.
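For A an exact power of two the closed form above can be checked numerically; this sketch assumes the pot is capped at exactly A, as in the derivation:

```python
import math

def truncated_value(a):
    """Expected payout when the pot is capped at a, for a = 2**n."""
    n = a.bit_length() - 1        # log2(a)
    head = 0.5 * (n + 1)          # terms k = 1 .. n+1 each contribute 1/2
    tail = a * 2.0 ** -(n + 1)    # geometric tail: sum_{k >= n+2} 2^-k * a
    return head + tail

for a in (2, 16, 1024):
    print(a, truncated_value(a), math.log2(a) / 2 + 1)  # the two columns agree
```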
Ben-oni

Posts: 276
Joined: Mon Sep 26, 2011 4:56 am UTC

### Re: Another gambling puzzle

Ben-oni wrote:This is just a function of the casino's assets (A).

Spoiler:
Assuming I didn't mess up, that's how much the game is worth. I might be off by a dollar or two.

The question wasn't the value of the game; it's what the price should be to make the probability of the gambler running out of money equal to the probability of the casino going broke. In other words, each round we move (1-P) with probability .5, (2-P) with probability .25, (4-P) with probability .125, etc. What should P be so that the odds of moving below -B before moving above A are 50%?
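That framing can be probed by direct simulation. The sketch below estimates the gambler's chance of breaking the bank at a given entry price; the bookkeeping (e.g. that the entry fee is added to the casino's assets, and that the pot is capped at what the casino holds) is an assumption:

```python
import random

def breaks_bank(price, casino, gambler, rng):
    """Play rounds until one side is bankrupt; True if the casino goes broke."""
    a, b = float(casino), float(gambler)
    while b >= price:
        b -= price
        a += price                # the entry fee goes to the casino
        pot = 1
        while rng.random() < 0.5:  # heads: double the pot and flip again
            pot *= 2
        pot = min(pot, a)          # the casino can pay at most what it has
        a -= pot
        b += pot
        if a <= 0:
            return True            # the gambler broke the bank
    return False                   # the gambler can no longer afford a round

rng = random.Random(0)
trials = 2_000
wins = sum(breaks_bank(2.5, 64, 64, rng) for _ in range(trials))
print(wins / trials)  # tune the price until this estimate is near 0.5
```

With price $1 the gambler can never fall behind, matching the observation in the original post that he then always cleans out the bank.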
BlueSoxSWJ

Posts: 25
Joined: Tue Apr 17, 2012 4:09 am UTC

### Re: Another gambling puzzle

Okay, I'm convinced.

Spoiler:
After any particular play of the game, the gambler has either lost or gained, and the casino has gained or lost the same amount. So suppose the cost of the game exceeds its value. Then over a series of games the assets of the casino increase, which in turn increases the value of the game. If the gambler has sufficient assets to cover the costs so far, continued play will yield a net gain for him. That is, there exists a stable point where the net value of the game is zero. So the only question is whether the gambler has enough money to reach that point. It's a bit of an oversimplification, but assuming the cost of the game is C, if C < log2(A+B)/2 + 1, the gambler can reach that stable point. So how much money does he need to have remaining to be able to break the bank before the bank breaks him?
Ben-oni

Posts: 276
Joined: Mon Sep 26, 2011 4:56 am UTC

### Re: Another gambling puzzle

Ben-oni wrote:Okay, I'm convinced.

Spoiler:
assuming the cost of the game is C, if C < log2(A+B)/2 + 1, the gambler can reach that stable point. So how much money does he need to have remaining to be able to break the bank before the bank breaks him?

That doesn't make sense:
Spoiler:
If the gambler's fortune (B) is zero and the casino's fortune (A) is non-zero, then your game cost C is greater than zero, but the gambler is already bust! Clearly, as B tends to zero we must have C tending to zero, otherwise the gambler cannot play a single round. To put it another way, if the casino's fortune is huge in comparison to the gambler's, then your formula approximates closely to C < log2(A)/2 + 1. If B = C then the gambler can only play one round and is very unlikely to come out ahead on that single round.
mward

Posts: 69
Joined: Wed Jun 22, 2011 12:48 pm UTC

### Re: Another gambling puzzle

The problem simply does not make sense in the case that the gambler has less than $1 (in this post I'll use A for the gambler's assets and B for the casino's). If the cost is less than or equal to the gambler's assets, the gambler plays, and every round he wins at least as much as he pays, so he wins automatically. If the cost is greater than the gambler's assets, he cannot play at all, so he loses automatically. The same argument works when the gambler's assets are exactly $1: the gambler simply cannot lose money unless the cost is higher, in which case he loses. So no solution is valid for A <= 1. For A close to 1, the problem is also hard. If the cost to enter the game, C, is bounded by (1+A)/2 < C < A, then if the gambler enters and only wins $1 (50% likely) he is left with A - C + 1 < C and cannot play again, so he loses. There is also a slight chance that the gambler will lose even if he gets $2. This means that the probability of the gambler losing is greater than 50%, so such a C is not a solution to the original problem. Therefore 1 < C < (1+A)/2 for all A > 1.

Edit: I think that an exact solution to the original problem does not exist for most values of A and B (again, A = gambler's assets, B = casino's assets). Using the above argument, if a solution exists for a given A and B, then the probability of the gambler breaking the bank with C = (1+A)/2 (call it P(A,B)) is bounded by P(A,B) <= 0.5: the cost cannot increase, as that would make the bank more likely to win, so it can only decrease, which increases the chance of the gambler winning. Extending this argument, if a solution exists for a given A, then the probability of the gambler never running out of money (assuming the bank has unbounded assets) with C = (1+A)/2 (call it P(A)) is bounded by P(A) <= 0.5. This is true because this is the worst case the gambler can still win with, so the probability of the gambler winning in any other fair scenario must be > P(A).

This is useful because it gives the ability to give tighter bounds on the solvable values of A.
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 556
Joined: Tue Jul 27, 2010 8:48 am UTC

### Re: Another gambling puzzle

Spoiler:
Let X = A + B. Then the probability of winning is given as
P_{C, X}(a) = \begin{cases}
0 & \text{for } a < C \\
1 & \text{for } a \ge X \\
\sum_{k=1}^{\infty} 2^{-k} P_{C, X}(a - C + 2^{k-1}) & \text{otherwise}
\end{cases}

The upper bound on the summation can be given as k ≤ log2(2X + C) + 1, so it looks like it should be solvable.
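For integer C and X the recursion stays on integer states, so it can be solved by fixed-point iteration. A sketch, with two assumptions: the gambler's wealth a is an integer, and once a payout reaches X every larger payout also wins, so the tail lumps into a single geometric term:

```python
def solve_P(C, X, iters=5000):
    """Fixed-point iteration for P_{C,X}(a) on integer states a = C .. X-1."""
    p = {a: 0.0 for a in range(C, X)}
    for _ in range(iters):
        new_p = {}
        for a in p:
            total, k = 0.0, 1
            while True:
                nxt = a - C + 2 ** (k - 1)
                if nxt >= X:
                    total += 2.0 ** -(k - 1)  # every payout from here on wins
                    break
                if nxt >= C:                   # nxt < C means the gambler is bust
                    total += 2.0 ** -k * p[nxt]
                k += 1
            new_p[a] = total
        p = new_p
    return p

print(solve_P(1, 8))  # with C = 1 the gambler never loses: every value is 1
print(solve_P(3, 8))  # winning probability increases with the starting wealth a
```

Self-loops (when 2^(k-1) = C the state maps to itself) are handled naturally by the iteration, which converges because some probability always leaks to the absorbing win/loss states.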
Ben-oni

Posts: 276
Joined: Mon Sep 26, 2011 4:56 am UTC

### Re: Another gambling puzzle

If the cost is 4/3 and the gambler's assets are 5/3, the game can be simplified to one where each round you win $1 or $2 with probability 0.5 each, i.e. the gambler's assets move by -1/3 or +2/3. The probability of losing is then either 1 or (3-sqrt(5))/2 (I used the reasoning found at http://math.stackexchange.com/questions/153123/biased-random-walk-on-line, and squared the result, since ruin means dropping 2/3, i.e. two successive drops of 1/3). If the probability of losing is (3-sqrt(5))/2, then the probability of losing the larger game is less than that, and therefore less than 0.5, and therefore the cost has to be greater than 4/3. I'm just having a bit of trouble proving that the probability isn't 1.
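The (3-sqrt(5))/2 figure can be sanity-checked: if r is the probability that a walk with equally likely steps of -1 and +2 ever drops 1 below its start, then r = 1/2 + (1/2)r^3, whose relevant root is (sqrt(5)-1)/2, and ruin here means dropping by 2 units, hence r^2. A quick seeded simulation agrees (the +60 escape cutoff is an arbitrary assumption to keep trials finite; the walk drifts upward, so later ruin is vanishingly unlikely past it):

```python
import random, math

r = (math.sqrt(5) - 1) / 2
assert abs(0.5 + 0.5 * r**3 - r) < 1e-12  # r solves r = 1/2 + r^3/2
ruin_exact = r * r                         # = (3 - sqrt(5))/2, about 0.382

def ruined(rng, escape=60):
    """Walk in steps of -1 or +2; True if it hits -2 before reaching +escape."""
    pos = 0
    while -2 < pos < escape:
        pos += -1 if rng.random() < 0.5 else 2
    return pos <= -2

rng = random.Random(0)
trials = 20_000
est = sum(ruined(rng) for _ in range(trials)) / trials
print(est, ruin_exact)  # the two values should be close
```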
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 556
Joined: Tue Jul 27, 2010 8:48 am UTC