Because I can't resist the prospect of getting flamed, here is my analogy.
A guy writes down a number from 1-100 on the back of a postcard. You guess what that number is, and if you choose the correct number, you win. You keep guessing as fast as you can.
Every 30 seconds, the guy tears up the postcard and creates another one with a new number.
Can we agree that your previous guesses have no bearing on your chances of choosing the correct number now? Same when a block changes.
Expanding the analogy to include difficulty makes no difference. You now need to choose a number from 1-500, so it's going to take you longer to find a correct answer. But when the guy tears up the postcard and writes down a new number, your previous guesses do not affect your new guesses.
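If anyone wants to see that in numbers rather than words, here is a throwaway Python sketch using the 1-100 version. The trial counts and bucket sizes are my own, purely illustrative: it just checks whether the hit rate on your next guess changes depending on how many misses you've already piled up. If past guesses mattered, the buckets would drift apart; they all hover around 1/100.

```python
import random

# Toy sketch (my numbers, not from the post): guess a number from 1-100 over
# and over, and track how often the NEXT guess hits, grouped by how many
# misses have piled up since the last hit.
RANGE, TRIALS = 100, 2_000_000
stats = {}                                  # prior misses (capped at 5+) -> [hits, tries]
target, misses = random.randint(1, RANGE), 0

for _ in range(TRIALS):
    bucket = stats.setdefault(min(misses, 5), [0, 0])
    bucket[1] += 1
    if random.randint(1, RANGE) == target:
        bucket[0] += 1
        target = random.randint(1, RANGE)   # "tear up the postcard"
        misses = 0
    else:
        misses += 1

for k in sorted(stats):
    hits, tries = stats[k]
    label = f"{k}+" if k == 5 else str(k)
    print(f"after {label} prior misses: hit rate {hits / tries:.4f}  (1/100 = 0.0100)")
```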
BTW: I am all for lowering difficulty to reduce *variance*, and I'm sure there is some flaw in that analogy, so flame away
Your analogy only works if you are SOLO mining.
The correct analogy, following your setup, would be something more like this.
You have 100 people each trying to guess a number from 1-10000 on a card.
As soon as somebody guesses the number, the game is over for that round (the block is solved).
However, if anyone comes within, say, 10 numbers of the true number, they get a 'ticket'.
The problem is, everyone is guessing at a different rate. One person might guess at a rate of once per minute. Another person might guess at a rate of once per second. Guess rates span that entire range amongst the 100 people.
Now, say a researcher watches people play this game for 1000 sessions. He compiles his data and finds that, on average, the number is guessed in 5 minutes.
Therefore, on average the guy who guesses once a minute gets 5 guesses. He is not going to get within 10 very often. But you wouldn't expect him to, right? After all, he is 60 times slower than the fastest guy, so you'd expect him to get 1/60th of the tickets that the fast guy does.
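To put rough numbers on that, here is a back-of-the-envelope sketch with the same figures as above, nothing more:

```python
# Back-of-the-envelope version of the paragraph above (illustrative only).
avg_round = 5 * 60          # average session length from the example: 5 minutes
slow_rate = 1 / 60          # one guess per minute
fast_rate = 1.0             # one guess per second

slow_guesses = slow_rate * avg_round    # ~5 guesses per round
fast_guesses = fast_rate * avg_round    # ~300 guesses per round
print(slow_guesses, fast_guesses, fast_guesses / slow_guesses)   # 5.0 300.0 60.0

# If tickets land in proportion to guesses made, the slow guy should expect
# about 1/60th of the fast guy's tickets -- exactly the ratio of their rates.
```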
Let's say the correct guess occurs at 4 minutes, 50 seconds. The slow guy only got 4 guesses in. Had the guess happened 10 seconds later, he would have gotten one more guess in.
He lost 20% of his guesses.
The fast guy got 290 guesses. He was working on his 291st guess when the number was solved. He, too, lost one guess. But instead of losing 20% of his guesses, he only lost about 0.34% of them.
If you mapped this out over all 100, an EXPONENTIAL curve would form. This curve could then be reverse-engineered into an equation that would tell you exactly what percentage of your guesses you would expect to lose.
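Here is a rough sketch of that curve using the 4:50 round from the example. The only assumption (mine, matching the example) is that every guesser is exactly one guess deep when the cutoff hits:

```python
# Fraction of "owed" guesses lost to the cutoff, per guess rate, for the
# 4 minute 50 second round used in the example above (illustrative sketch).
round_length = 4 * 60 + 50                    # 290 seconds

for interval in (1, 2, 5, 10, 30, 60):        # seconds per guess
    completed = round_length // interval      # guesses actually finished
    owed = completed + 1                      # ...plus the one cut off mid-flight
    print(f"one guess every {interval:2d}s: lost {1 / owed:6.2%} of {owed} guesses")
```

The slow end of that table is the 20% figure above; the fast end is the 0.34%.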
One last way of thinking about it: say it takes you an average of 30 seconds to find a share, and you don't make it in time before the block changes. The most likely amount of time you'll have been sitting on that unfinished share when the block change happens might look something like 15-45 seconds, or, with the same spread, 5-15 seconds for the faster miner. Which amount of time would you rather go unrewarded? But guess what, the block was found and everyone gets paid... it's just that the fast guy is getting some of your cut, because maybe he's 10x faster and got in 19 shares when you only got in 1. Had you got 2 and he got 20, all would be fair, but statistically you are going to get screwed over time.
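If you only want to poke at the "which amount of time would you rather go unrewarded" part, here is a rough Monte Carlo sketch. The rates are made up (one miner averaging a share every 3 seconds, one every 30, with 10-minute average rounds); all it measures is how long each miner has been sitting on an unfinished share when the block changes:

```python
import random

# Rough sketch (my numbers): how much time is each miner sitting on an
# unfinished share when the block changes? Share completions arrive at random
# with the given average spacing; the round ends at a random (exponential) time.
random.seed(1)
ROUNDS = 20_000
BLOCK_MEAN = 600.0                      # assumed average round length, seconds

def unrewarded_tail(share_mean, round_end):
    """Seconds since this miner's last completed share when the round ends."""
    t = 0.0
    while True:
        gap = random.expovariate(1.0 / share_mean)
        if t + gap > round_end:
            return round_end - t        # work in progress that never became a share
        t += gap

fast = sum(unrewarded_tail(3.0, random.expovariate(1.0 / BLOCK_MEAN)) for _ in range(ROUNDS))
slow = sum(unrewarded_tail(30.0, random.expovariate(1.0 / BLOCK_MEAN)) for _ in range(ROUNDS))

print(f"fast miner (3 s/share):  ~{fast / ROUNDS:.1f} s unrewarded per round")
print(f"slow miner (30 s/share): ~{slow / ROUNDS:.1f} s unrewarded per round")
```

With those made-up rates it lands around 3 seconds versus around 29 seconds of work left hanging per round.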
Edit: Thought of one more
Would you rather flip one coin to see if you get an entire pie, or get nothing? Or would you rather have 7 pieces secured and be flipping a coin to see if you do or do not get the 8th piece?
This represents being "on the bubble", hoping you get your "next" share in before the block change. If you manage to get your whole pie, and the other guy manages to get his 8th piece, it's all the same. But what if you both fail? If you both fail, you get nothing. The other guy still got 7 pieces.
You had 512 diff. He had 64.
(Note to people who are going to pick that apart: I know it's mathematically possible for the first guy to get 2 pies, 3 pies, 4 pies. I know it's possible for the second guy to get 3 slices, 10 slices, or 45 slices. But for every time such a thing happens, we can count on the opposite happening with the same frequency. That is why we can trust probability. Given enough time, probability always takes us to the answer we expect. Everyone keeps talking about variance. Well, there's your variance! As more time passes, our variance goes down and the whole-pie guy WILL get screwed.)
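For anyone who wants the coin flips spelled out, here is the scenario above reduced to its four equally likely outcomes. It is just the example as written, nothing extra:

```python
from itertools import product

# The pie/coin-flip scenario above, enumerated: the 512-diff guy flips for a
# whole pie (8 slices) or nothing; the 64-diff guy already has 7 slices and
# flips for the 8th. Four equally likely outcomes.
for pie_wins, eighth_wins in product([True, False], repeat=2):
    pie_guy = 8 if pie_wins else 0
    slice_guy = 7 + (1 if eighth_wins else 0)
    print(f"512-diff guy: {pie_guy} slices | 64-diff guy: {slice_guy} slices  (p = 1/4)")
```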