The idea is that if the coin flip goes in the player’s favor, they win double their bet. After winning, they can either collect their winnings or risk them all on another coin flip for a chance at doubling them again. The initial bet is fixed at, say, $1.
Mathematically, this seems like a fair game. The expected value of each individual round is zero for both house and player.
Intuitively, though, I can’t shake the notion that the player will tend to keep flipping until they lose. In theory, it isn’t the wrong decision to keep flipping since the expected value of the flip doesn’t change, but it feels like it is.
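To check my intuition, I sketched a quick simulation (function names are mine, and I’m assuming a win simply doubles the pot while a loss forfeits the whole $1 stake). Every “stop after n wins” strategy averages out to zero profit, matching the expected-value argument: the payout is $2^n$ with probability $2^{-n}$, so the EV is $2^{-n}\cdot 2^n - 1 = 0$ regardless of n.

```python
import random

def play_round(stop_after, rng):
    """Play one $1 round: keep flipping until `stop_after` wins
    in a row, or until the first loss. Returns net profit."""
    pot = 1  # the $1 stake
    for _ in range(stop_after):
        if rng.random() < 0.5:
            pot *= 2      # win: the pot doubles
        else:
            return -1     # loss: everything is gone
    return pot - 1        # collect: pot minus the original $1

rng = random.Random(0)
for n in (1, 2, 5, 10):
    trials = 100_000
    avg = sum(play_round(n, rng) for _ in range(trials)) / trials
    print(f"stop after {n} wins: average profit per round ~ {avg:+.3f}")
```

The averages hover around zero for every n; the longer strategies just trade a near-certain $1 loss against a rare large payout.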
Any insight?
If both players have infinite bankrolls, but only one of them is allowed to stop the game once they are ahead, the one with the option of stopping has an advantage. They can play until they are in the lead, then stop. The reason this doesn’t work in real life is that real bankrolls aren’t infinite.
I don’t know if that applies to this scenario. In this game, the player is always in the lead until they aren’t, but I don’t see how that works in their favor.
Oh wait, you mean the player has to stop if they lose? That’s different.
Well, they have to start over with a $1 bet.
That looks like the St. Petersburg Paradox. Much ink has been spilled over it.
The expected payout is infinite. At any point, the “rational” (profit-maximizing) decision is to keep flipping, since you wager a finite sum of money to win an infinite sum. It’s very counter-intuitive, hence called a paradox.
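You can see the divergence directly by truncating the sum. In one common formulation, the payout is $2^k$ if the first head comes on flip $k$ (probability $2^{-k}$), so every term of the expectation contributes exactly 1 and the truncated EV grows without bound:

```python
def st_petersburg_partial_ev(max_flips):
    """Expected payout of the St. Petersburg game truncated at max_flips:
    payout 2**k if the first head comes on flip k (probability 2**-k)."""
    return sum((0.5 ** k) * (2 ** k) for k in range(1, max_flips + 1))

# Each term is exactly 1, so the truncated EV equals max_flips:
print(st_petersburg_partial_ev(10))   # 10.0
print(st_petersburg_partial_ev(100))  # 100.0
```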
In reality, a casino has finite money. You can work out how many coin flips it takes to bankrupt it. So you can work out how likely it is to reach that point with a given, finite sum of money. Martingale strategies have already been mentioned.
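The arithmetic for that is short. Assuming the pot doubles each flip from a $1 stake (a hypothetical $1B casino here, just for illustration), the number of consecutive wins needed to exceed the bankroll is about $\log_2$ of it, and the chance of such a streak is $2^{-n}$:

```python
import math

def flips_to_break_bank(bankroll, initial_bet=1):
    """Smallest n such that the payout after n consecutive wins
    (initial_bet * 2**n) meets or exceeds the bankroll."""
    return math.ceil(math.log2(bankroll / initial_bet))

def prob_of_streak(n):
    """Probability of n consecutive wins on a fair coin."""
    return 0.5 ** n

n = flips_to_break_bank(10**9)   # hypothetical $1B bankroll
print(n, prob_of_streak(n))      # 30 wins needed, probability 2**-30
```

So the “infinite” expected value is concentrated in streaks the house can never actually pay out.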
Not quite the same, since in my scenario the player loses everything after a loss while in the St. Petersburg Paradox it seems they keep their winnings. But it does seem relevant in explaining that expected value isn’t everything.