"...technology, sociology, and economics. But these factors aren't a simple
function; the procedure I'd prefer would be something like this: if there
is a standing backlog, we-the-community of users look to indicators to
gauge if the network is losing decentralization and then double the
hard limit with proper controls..."
This is Greg Maxwell talking about the block size increase.
He advocates raising the limit only if there is a "standing backlog".
Not only does that sound dangerous for adoption, but it's probably
not as easy as he makes it sound (he posted that in May);
we're seeing firsthand how challenging hard-fork consensus is.
Also, Mike Hearn has suggested that a standing backlog could cause technical problems (nodes crashing).
Greg's other concern seems to be about fees.
While that's important, I don't think it should be a major consideration this early:
coinbase subsidies will likely be greater than fees for
at least a decade, so low fees are not a reason to forestall an increase.
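Rough numbers back that up. Here's a minimal sketch of the subsidy schedule, assuming the standard parameters (50 BTC initial subsidy, halving every 210,000 blocks, one block roughly every ten minutes); the block heights used for "now" and "a decade out" are my own rough estimates:

```python
# Bitcoin's block subsidy halves every 210,000 blocks (~4 years).
INITIAL_SUBSIDY = 50.0      # BTC per block at genesis (2009)
HALVING_INTERVAL = 210_000  # blocks between halvings

def subsidy_at_height(height):
    """Block subsidy in BTC at a given block height."""
    return INITIAL_SUBSIDY / (2 ** (height // HALVING_INTERVAL))

# Rough heights a decade apart (estimates, not exact values):
print(subsidy_at_height(360_000))  # mid-2015: 25.0 BTC per block
print(subsidy_at_height(900_000))  # mid-2025: 3.125 BTC per block
```

Even four halvings out, the per-block subsidy is still several BTC, while total fees in a typical block today are a small fraction of one BTC.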
I think this approach is wrong!
In one of the more recent interviews I saw with Andreas Antonopoulos, he was asked for his opinion on the block size debate. He said that he believes in consensus, but that he leans toward the block size increase for one simple reason.
He said that we all hope for mass adoption. Well, imagine somebody prominent in China advertising Bitcoin, with the whole nation watching on TV. If we got an influx of 10 million new users over a very short period of time, the Bitcoin network would become paralyzed.
I completely agree with him. How can all these devs, Greg Maxwell included, wish for mass adoption and not be afraid of a scenario like this?
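The back-of-the-envelope arithmetic behind that fear is easy to sketch. Assuming roughly 2,000 transactions fit in a 1 MB block and one block every ten minutes (both rough figures), plus my own simplifying assumption that each new user sends just one transaction:

```python
# How long would 1 MB blocks take to absorb 10 million new users?
TXS_PER_BLOCK = 2_000     # rough capacity of a 1 MB block
BLOCKS_PER_DAY = 144      # one block every ~10 minutes
NEW_USERS = 10_000_000    # each assumed to send a single transaction

txs_per_day = TXS_PER_BLOCK * BLOCKS_PER_DAY  # ~288,000 txs/day
days_to_clear = NEW_USERS / txs_per_day
print(f"{days_to_clear:.0f} days to clear the backlog")  # ~35 days
```

Even one transaction per new user would saturate the network for over a month; that is the paralysis Andreas is describing.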