You are right about losing streaks. There is no doubt that the probability of losing 15 times consecutively is much lower when you wager on 1.01x.
Okay, I'll try a different approach to convey what I'm particularly curious about.
If you run 10 million bets on 1.01x, your average losing-streak length will be barely above 1: it can't be less than 1 by definition, and it can't exceed 1 by more than a small fraction because the chance of winning each bet is so high. On the other hand, if you bet on 9900x, the average length of a losing streak could be something like 1000 (I don't really know, it's just a hunch and a number to show what I mean). But neither average tells us anything about outliers. Is the frequency of outliers, once normalized to some common denominator (e.g. variance or deviations from the mean), going to be the same for these two multipliers? That's what interests me so much.
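Here's a rough sketch of the kind of check I have in mind, in case it makes the question clearer. I'm assuming a crash-style game where the win probability at multiplier m is roughly 0.99/m (i.e. a 1% edge) and that every bet is independent; both of those are my assumptions, not facts about any particular site. The script collects all losing-streak lengths over 10 million simulated bets per multiplier and reports how often a streak lands more than three standard deviations above the mean.

```python
# Rough Monte Carlo sketch, not a statement about any real game:
# assumes win probability at multiplier m is 0.99/m and independent bets.
import numpy as np

def losing_streaks(p_win: float, n_bets: int, rng: np.random.Generator) -> np.ndarray:
    """Simulate n_bets independent bets and return the lengths of all losing streaks."""
    losses = rng.random(n_bets) >= p_win            # True where the bet loses
    # Locate streaks of consecutive losses by finding where the win/loss state flips
    padded = np.concatenate(([False], losses, [False]))
    changes = np.flatnonzero(np.diff(padded.astype(np.int8)))
    starts, ends = changes[::2], changes[1::2]
    return ends - starts                            # streak lengths

rng = np.random.default_rng(0)
for mult in (1.01, 9900.0):
    p_win = 0.99 / mult                             # assumed house-edge model
    streaks = losing_streaks(p_win, 10_000_000, rng)
    mean, std = streaks.mean(), streaks.std()
    # "Outlier" here = a streak more than 3 standard deviations above the mean
    outlier_rate = np.mean(streaks > mean + 3 * std)
    print(f"{mult:>7}x  mean streak {mean:8.2f}  std {std:8.2f}  "
          f"max {streaks.max():6d}  P(streak > mean+3*std) {outlier_rate:.4%}")
```

The last column is really the thing I'm asking about: whether that normalized outlier rate comes out comparable for 1.01x and 9900x once streak lengths are measured in their own standard deviations.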