The frustrating thing about arguing with many of you is that you come at this issue as though it were an economic problem. It's not an economic problem. Economically, the block size should not be artificially limited.
We agree on this, or were you just strawmanning?
What it is, instead, is a technical problem, or a political problem, or an existential problem. The problem is not whether miners will continue to get paid. That hasn't been a problem for years. Mining is so ridiculously huge at this point that any "security" achieved by a marginal increase in hashing power pales in comparison to other, much larger, existential threats to Bitcoin. And those *do* exist. What Bitcoin is attempting will not be a cakewalk.
Gavin has done a good job of laying out the technical limitations, which, frankly, are few. He says the technical limit is somewhere beyond 16.7GB. I have no reason to dispute this. And I have seen no one actually attempt to dispute it. If you think 20MB blocks are too large, you probably have sub-standard internet service. I'm right there with you. Most people probably have sub-standard internet service.
Which brings us to the real issue. No one has done a decent job of laying out the political problems and the existential threats posed by a block size increase. A lot of people have insinuated that there is a plot against Bitcoin, and if you read my posts, you'll see I even tend to agree. Yet there is little concrete discussion of what that threat actually is. The threat is usage? The threat is growth? The threat is voluntary centralization?
*One* person has suggested that 2MB blocks are acceptably large. Come on, be realistic. 2MB or 1MB really just doesn't matter at all. Such a limit is simply laughable. What an idiotic hill to choose to die on. Anyone who insists on such a limit would be part of the real "plot" against Bitcoin, as far as I'm concerned. For all of your crying about "decentralization," insisting on crippling Bitcoin at a throughput that is only useful to gigantic financial institutions is just embarrassing. At that point, if that's the best you all can come up with, then it will be time to move on to plan B, because this iteration of Bitcoin will have failed.
This is just not a serious discussion, at all. There are a dozen different possible outcomes here. There are a dozen different ways that Bitcoin can evolve in the future. So far we have explored three, maybe four of them. Please try to think a little outside the box.
Gavin has made a good start, but it is only a beginning. He's run some software tests and made some proposals.
He also looked up the historic data-network growth rates in North America and decided that that pattern (Nielsen's law) is good enough to base the protocol upon. That is where we diverge.
Gavin would argue that since it is an upper boundary, it can only be "too low" and never "too high". Where his reasoning fails is that there is a real cost to the transaction data set, and that cost is multiplied by the number of times the data must be replicated across the network.
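To make that replication cost concrete, here is a rough back-of-the-envelope sketch. The node count, effective cap, and Nielsen's ~50%-per-year growth rate are illustrative assumptions on my part, not measurements from anyone's proposal:

```python
# Back-of-the-envelope: aggregate bandwidth cost of one confirmed block.
# All figures below are illustrative assumptions, not measurements.

BLOCK_SIZE_MB = 20      # proposed cap (assumption)
FULL_NODES = 6000       # rough order of magnitude for reachable full nodes (assumption)
BLOCKS_PER_DAY = 144    # one block every ~10 minutes

# Every confirmed block must be relayed to (roughly) every full node,
# so the network-wide cost scales with node count, not with one link.
per_block_gb = BLOCK_SIZE_MB * FULL_NODES / 1024
per_day_tb = per_block_gb * BLOCKS_PER_DAY / 1024
print(f"~{per_block_gb:,.0f} GB of aggregate transfer per block")
print(f"~{per_day_tb:,.1f} TB of aggregate transfer per day")

# Nielsen's law: high-end consumer bandwidth grows ~50% per year.
# Projecting a cap forward on that schedule (assumption):
cap_mb = BLOCK_SIZE_MB
for year in range(2015, 2036, 5):
    print(f"{year}: cap ~ {cap_mb:,.0f} MB")
    cap_mb *= 1.5 ** 5
```

The point is only that whatever one node can comfortably handle, the whole network pays that cost thousands of times over, every block.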
There are pernicious effects to permitting overly large blocks to be confirmed: increased orphaning, bandwidth attacks on smaller nodes, and cheap spam, to name a few. A rough sketch of the orphaning effect follows.
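One of those effects, orphaning, can be approximated with the standard Poisson-arrival model: if blocks arrive on average every 600 seconds, a block that takes t seconds to propagate is orphaned with probability roughly 1 - e^(-t/600). The effective link speed below is an assumption for illustration, and the single-hop delay ignores relay optimizations:

```python
import math

BLOCK_INTERVAL_S = 600  # average time between blocks

def orphan_probability(propagation_delay_s: float) -> float:
    """Probability a competing block is found while ours propagates.

    Standard Poisson-arrival approximation:
    P(orphan) ~ 1 - exp(-delay / mean block interval).
    """
    return 1.0 - math.exp(-propagation_delay_s / BLOCK_INTERVAL_S)

# Illustrative effective throughput for a modest node (assumption).
EFFECTIVE_MB_PER_S = 1.0

for size_mb in (1, 2, 20, 100):
    # Naive single-hop delay; real relay is more complex.
    delay = size_mb / EFFECTIVE_MB_PER_S
    print(f"{size_mb:>4} MB block: ~{delay:5.0f}s to send, "
          f"orphan risk ~ {orphan_probability(delay):.1%}")
```

Orphan risk grows with propagation time, and propagation time grows with block size, so miners and nodes on slow links bear the cost first. That is the centralization pressure in a nutshell.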
We agree that his proposal is 'the simplest that could possibly work'. But is it too much to ask for better than 'could possibly'? We'll end up settling for 'the best we can do by the time we need to do something', and a number of folks would agree that his proposal isn't the best we can do. It is merely an expedient one.