Tucenaber, I can't really understand your graphs, care to explain them again?
Ok, I will try. (I updated the previous post with a line explaining what's on the axes)
We know that the player will always go bust eventually if he keeps playing. So I ignored the number of plays and just asked: over the whole time from when he starts until he goes bankrupt, what is the minimum bankroll for the house? Or equivalently, what is the maximum capital of the player?
The graphs show the percentiles of the minimum house bankroll.
With the full Kelly bet, there is a 10% chance that the bankroll goes below 0.35, i.e. 35% of the initial bankroll.
With 50% Kelly, there is a 10% risk of going below 0.57, or 57% of the initial bankroll.
With 25% Kelly, the 10% risk level is 0.74, or 74%.
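In case it helps, here is roughly the kind of simulation I mean, stripped down to a toy version. The game details and starting capital here are made up (an even-money bet with a 1% house edge, max profit per bet equal to the Kelly fraction times 1% of the current house bankroll, and a player starting with 0.2 of the house bankroll), so don't expect it to reproduce my exact numbers:

```python
import random

def min_house_bankroll(kelly_fraction, house=1.0, player=0.2,
                       win_prob=0.495, rng=random):
    """Play even-money bets until the player is broke and return the
    lowest value the house bankroll hit along the way."""
    lowest = house
    while player > 1e-9:
        # Max potential profit per bet: kelly_fraction times 1% of the
        # current house bankroll (1% is full Kelly for a 1%-edge even-money game).
        bet = min(kelly_fraction * 0.01 * house, player)
        if rng.random() < win_prob:   # player wins, house pays out
            player += bet
            house -= bet
        else:                         # player loses
            player -= bet
            house += bet
        lowest = min(lowest, house)
    return lowest

# 10th percentile of the minimum house bankroll over many independent runs
runs = sorted(min_house_bankroll(0.5) for _ in range(2000))
print("10% risk level:", runs[len(runs) // 10])
```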
Does that make it clearer? I'm not very good at explaining, I'm afraid.
Thanks a lot for posting those results, tucenaber. I started doing some simple simulations myself yesterday, but I keep running into one conceptual problem, and was wondering if you maybe have an idea how to solve it:
You start from the premise of a single player playing until he goes bust.
First question: for a sufficiently large player bankroll (say, 10 times the casino bankroll) there should also be the possibility that the casino in fact goes broke. Can you quantify that risk?
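My own back-of-the-envelope attempt was the textbook gambler's-ruin formula, treating it as fixed even-money unit bets instead of the Kelly-scaled max bets, so the numbers below are only a ballpark and the bet sizes and bankrolls are invented:

```python
def house_ruin_prob(house_units, player_units, p_win=0.495):
    """Classic gambler's ruin with fixed even-money unit bets: probability
    the house loses everything before the player does, given the player's
    per-bet win probability p_win."""
    r = (1 - p_win) / p_win            # > 1 whenever the house has the edge
    return (r**player_units - 1) / (r**(player_units + house_units) - 1)

# e.g. a house bankroll worth 100 base bets against a player worth 1000
print(house_ruin_prob(house_units=100, player_units=1000))   # ~0.135 in this toy setup
```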
Second question: as your simulation shows, there's a 10% risk that we reach 57% of the initial bankroll, assuming 1/2 Kelly (it's a bit of a simplification I guess, since the casino bankroll is dynamic, but it doesn't matter for our purposes). Say the whale/player stops at this point. That's a possibility, right? We could specify, ahead of time, "the player keeps gambling until he has brought the house down by 40%". It was your choice to let the simulation run until the player is broke, but in reality a player could pull out, leaving the site at a 40% loss.
Now comes the tricky part, at least for me: (a) What if another whale comes along? (b) What if "the other whale" is the old whale, continuing to play? Is the correct way to think about this as independent events? So in this case: there's a 10% chance the house lost 43% (assuming the whale "left"), and there's a 10% chance that this will happen again (from the current, reduced bankroll), either through the same whale or through another whale who plays maximum-profit bets?
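If that independence view is right (each new episode starts from the current bankroll and the same 10% figure applies afresh), then the probabilities and the drawdowns would simply multiply. A toy calculation using the 57% / 10% figures from above, just to illustrate the compounding rather than to claim the episodes really are independent:

```python
# Assume each "whale episode" is independent, with a 10% chance of
# dragging the bankroll below 57% of whatever it was at the start of
# that episode.
p_event, factor = 0.10, 0.57
for k in range(1, 4):
    print(f"{k} episode(s) in a row: probability {p_event**k:.3f}, "
          f"bankroll down to {factor**k:.2f} of the original")
```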