We have no way of knowing how much protection is enough. We can't define it in a fixed fashion.
If by protection you mean hash rate, I think there's only one right answer: "as much as possible." Someone said earlier that Bitcoin should be thought of as the block chain with the largest amount of hashing power of any block chain, and I think that's the right way of looking at it. Any less, and it is vulnerable to a competing block chain that offers more security, or simply to an outright attack. I believe that any increase in the maximum block size is going to reduce the maximum hashing power that can be reached at any given time, because a larger block size means less competition for block space, which will inevitably lead to a decrease in fees.
Can anyone provide a counter argument? That having been said, if the community decides that Bitcoin is best served by trading away some of that maximum hashing power in exchange for an increase in the rate of transactions, then the maximum block size must be increased. I said earlier that the following guidelines should apply for increasing the block size:
1) Block size adjustments happen at the same time that network difficulty adjusts (every 2016 blocks)
2) On a block size adjustment, the size either stays the same or is increased by a fixed percentage (say, 10%). This percentage is a baked-in constant requiring a hard fork to change.
3) The block size is increased if more than 50% of the blocks in the previous interval have a sum of transaction fees greater than 50 BTC minus the block subsidy. The 50 BTC constant and the threshold percentage are baked in. (A rough sketch of this rule in code follows the list.)
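To make the mechanics concrete, here's a rough sketch of that third rule in Python. The constant names and the function are mine and purely illustrative, not taken from any client:

[code]
# Sketch of the fee-based growth rule. Hypothetical names, not real client code.
GROWTH_FACTOR = 1.10        # at most +10% per adjustment, baked in
BLOCK_THRESHOLD = 0.50      # more than 50% of blocks in the interval must qualify
REWARD_TARGET_BTC = 50.0    # the baked-in 50 BTC total-reward target

def next_max_block_size(current_max, block_fee_totals, block_subsidy_btc):
    """Return the max block size for the next interval.

    block_fee_totals: total fees (in BTC) collected by each block in the
    adjustment interval that just ended.
    """
    fee_floor = REWARD_TARGET_BTC - block_subsidy_btc
    qualifying = sum(1 for fees in block_fee_totals if fees > fee_floor)
    if qualifying > BLOCK_THRESHOLD * len(block_fee_totals):
        return int(current_max * GROWTH_FACTOR)
    return current_max

# Example: with a 25 BTC subsidy, a block qualifies when its fees exceed 25 BTC.
# If 1,200 of the 2,016 blocks qualify, a 1,000,000-byte cap grows to 1,100,000.
print(next_max_block_size(1_000_000, [26.0] * 1200 + [5.0] * 816, 25.0))
[/code]

Note that the function never shrinks the cap, which matches the "stays the same or is increased" wording above.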
If the idea of having a constant block reward forever bothers you, then here's another scheme:
1) Block size adjustments happen at the same time that network difficulty adjusts
2) On a block size adjustment, the size either stays the same or is increased by a fixed percentage (say, 10%). This percentage is a baked-in constant requiring a hard fork to change.
3) The block size is increased if more than 50% of the blocks in the previous interval have a size greater than or equal to 90% of the max block size. Both percentage thresholds are baked in. (Again, a sketch in code follows the list.)
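And here's the same kind of sketch for this size-based variant, under the same caveats (hypothetical names, purely illustrative):

[code]
# Sketch of the size-based growth rule. Hypothetical names, not real client code.
GROWTH_FACTOR = 1.10        # at most +10% per adjustment, baked in
BLOCK_THRESHOLD = 0.50      # more than 50% of blocks in the interval must qualify
FULLNESS_THRESHOLD = 0.90   # a block qualifies at >= 90% of the current cap

def next_max_block_size(current_max, block_sizes):
    """block_sizes: byte size of every block in the adjustment interval just ended."""
    full_enough = sum(1 for size in block_sizes
                      if size >= FULLNESS_THRESHOLD * current_max)
    if full_enough > BLOCK_THRESHOLD * len(block_sizes):
        return int(current_max * GROWTH_FACTOR)
    return current_max

# Example: if 1,100 of the 2,016 blocks are at least 90% full, the cap grows by 10%.
print(next_max_block_size(1_000_000, [950_000] * 1100 + [200_000] * 916))
[/code]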
As I said, these are just building blocks meant to stimulate ideas. Both of them respond directly to scarcity, and both look only at information already in the block chain, so reaching consensus on them is easy.
Can anyone improve on these ideas or offer a better alternative? Does anyone think that
letting miners produce blocks of arbitrary size and letting the network battle it out over them is a good idea? Will this produce more orphans and waste more of that precious bandwidth that everyone is all hopped up about? Is this better than either of the two schemes I described above? If so, how?