It could be in miners' interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability. The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on how the bandwidth constraints weigh against the need for artificial scarcity at the time. If Moore's law holds, then eventually the artificial-scarcity argument will become overwhelmingly more relevant than the bandwidth issues, and miners may want smaller blocks. Miners could push the limit either way depending on the dynamics at the time.
I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network. However, why would miners in this scenario want to manipulate the limit upwards?
It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down': they could all issue empty blocks if they wanted. And yes, miners can also 'manipulate' the block size up. So the lower bound for the 'manipulation' is zero, and the upper bound is the block size limit, currently 1MB. We all agree miners can do whatever they want within those limits. Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes from which to choose when making a block.

An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization. Miners can form their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it. But cartel or not, a max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.
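To make those bounds concrete, here is a rough sketch in Python (the names and the greedy fill are my own illustration, not Bitcoin's actual C++ validation code): consensus only enforces the upper bound, while the lower bound is the miner's free choice, since an empty block is perfectly valid.

```python
MAX_BLOCK_SIZE = 1_000_000  # bytes: the current 1 MB consensus cap

def consensus_accepts(block_size_bytes: int) -> bool:
    """Nodes reject any block over the cap; anything smaller is fine."""
    return block_size_bytes <= MAX_BLOCK_SIZE

def miner_fill(candidate_tx_sizes, base_overhead=250):
    """Greedy illustration of the miner's freedom: include zero transactions
    (an 'empty' block) or keep adding them until the next one would push
    the block over the cap. base_overhead loosely stands in for the
    header + coinbase bytes; the exact figure doesn't matter here."""
    size, included = base_overhead, []
    for tx_size in candidate_tx_sizes:
        if size + tx_size > MAX_BLOCK_SIZE:
            break
        included.append(tx_size)
        size += tx_size
    return included, size
```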
My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage".
Totally agree.
However, if you can find an appealing notion of necessity, smallness, or some alternative method of balancing centralisation risk against utility which involves fewer magic numbers and less uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.
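For reference, the "magic numbers" in a fixed-growth proposal amount to something like the sketch below. The base size, growth rate, and start date here are placeholders of my own choosing, not the parameters of Gavin's or anyone else's actual proposal.

```python
from datetime import date

# Placeholder magic numbers -- purely illustrative values.
BASE_SIZE = 1_000_000     # bytes when the schedule starts
ANNUAL_GROWTH = 1.4       # 40%/year, a stand-in for projected bandwidth growth
START = date(2015, 1, 1)  # when the schedule begins

def max_block_size(on: date) -> int:
    """Fixed-growth schedule: the cap grows exponentially with elapsed
    time, regardless of what actually happens to bandwidth or hardware."""
    years = (on - START).days / 365.25
    if years < 0:
        return BASE_SIZE
    return int(BASE_SIZE * ANNUAL_GROWTH ** years)
```

Every constant above is a judgement call made in advance, which is exactly the "magic numbers and uncertainty" objection.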
I think MaxBlockSize will remain a magic number, and I think that is okay. It is a critical variable that needs to be adjusted for environmental conditions, balancing, exactly as you put it, teukon, [de]centralization against utility. As computing power grows, it becomes easier to run such computational activities discreetly and keep them "decentralized".
Raise it too quickly and it gets too expensive for ordinary people to run full nodes.
So I'm saying: the future is uncertain, but there is a clear trend. Let's follow that trend, because it is the best predictor we have of what will happen.
If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).
The more accurate the projection of computing / bandwidth growth is, the less often the magic number would need to be changed. If we project very accurately, the magic number may never need to be adjusted again. That being said, it is safer to err on the side of caution, as Gavin has done, to make sure any MaxBlockSize formula does not allow blocks to grow bigger than the hobbyist / interested individual's ability to keep up.
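As a rough worked example of how projection error compounds (the 40% and 50% figures are arbitrary, picked only to show the shape of the effect): if the formula grants 40%/year but real capacity grows 50%/year, the cap falls further behind every year; if reality turns out slower than projected, the cap overshoots instead and the schedule would need the soft-fork correction Gavin describes.

```python
scheduled = 1.40  # growth rate baked into a hypothetical formula
actual    = 1.50  # hypothetical real-world bandwidth growth

for years in (1, 5, 10):
    ratio = scheduled ** years / actual ** years
    print(f"after {years:2d} years the cap is {ratio:.0%} of real capacity")

# after  1 years the cap is 93% of real capacity
# after  5 years the cap is 71% of real capacity
# after 10 years the cap is 50% of real capacity
```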