Does anyone here agree that sooner or later we will reach the point where we have to say 'enough is enough'?
Agreed, however there is a debate over whether 1MB is "enough". Most people would say it isn't, which likely means "enough" will be moved to a higher number at some point in the future, in some manner. It would be smarter to focus on the how & when, given that an increase is eventually an almost-certainty.
So why not just keep this limit at 1MB, as Satoshi originally designed?
Bitcoin was never designed with a 1MB limit; you can check that it doesn't exist in the early versions of the source code. It was added later as a safety limit, to prevent an early attacker from massively bloating the blockchain and thus killing off the project. Imagine if in 2010, with no block limit, you had to download a 5TB blockchain just to start using an experimental currency with very little actual value or use. Most people wouldn't, and the "ecosystem" might have died in the crib. The 1MB cap limited the growth of the blockchain to no more than ~52GB per year. That is high, but luckily early volume was much lower; it simply provided an upper bound while Bitcoin was young. When the average block has 2 to 8 transactions, it doesn't make sense to let a single bad actor add GBs worth of transactions to hinder future users. Bitcoin is far more developed now, so the time to take the training wheels off is likely in the near future.
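The ~52GB/year bound falls straight out of the block schedule; a quick sketch (assuming the average of 6 blocks per hour):

```python
# Upper bound on annual blockchain growth under a fixed block size limit.
BLOCKS_PER_HOUR = 6                            # one block every ~10 minutes on average
BLOCKS_PER_YEAR = BLOCKS_PER_HOUR * 24 * 365   # = 52,560 blocks

MAX_BLOCK_MB = 1
annual_growth_gb = BLOCKS_PER_YEAR * MAX_BLOCK_MB / 1000  # ~52.6 GB/year
print(f"Max blockchain growth at 1MB blocks: ~{annual_growth_gb:.1f} GB/year")
```

The same formula scales linearly with the limit, which is what drives the 5GB scenario below.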
If you don't increase MAX_BLOCK_SIZE people will naturally start using BTC payment processors, which will take the load off the net, and which has to eventually happen anyway.
Sure. However, the debate becomes at what point, and how much, goes off blockchain. In a perfect world I would say MAX_BLOCK_SIZE should be large enough to allow anyone to run a full node with "reasonable" resources. "Reasonable" is loosely defined here as the resources (storage, computation, and bandwidth) available to a dedicated user willing to pay to be a peer in a global network (in other words, not everyone will be a full node, but most *could* be a full node). Yeah, that is a very gray term, but it is more important to look at what MAX_BLOCK_SIZE does conceptually to the idea of centralization:
Let's look at two extreme futures where Bitcoin is massively adopted (say 100x current usage in terms of users, merchants, tx volume both on & off blockchain, etc.):
Average block size = 1MB: max annual on-blockchain tx volume of roughly ~130 million transactions (assuming ~400-byte transactions); tx fees relatively high; the overwhelming majority of tx occur off blockchain; the blockchain becomes a sort of open interbank settlement system.*
Average block size = 5GB: max annual on-blockchain tx volume of roughly ~650 billion transactions; tx fees relatively low; most non-micro tx remain on blockchain; the blockchain in theory can be used by anyone, but the cost of a full node excludes most.**
* At a 1MB average block, while the cost of running a full node is relatively trivial, the cost of transactions would exclude all but the largest bulk transactions. Remember: as the subsidy declines, tx fees pay for the cost of securing the network. So either Bitcoin is popular and thus fees are high (because only ~130 million tx can occur annually), or Bitcoin is unpopular and thus becomes less and less secure as the subsidy declines.
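To make the fee pressure concrete, here is a back-of-the-envelope sketch; the annual security budget and average transaction size are purely illustrative assumptions, not figures from this thread:

```python
# Hypothetical: what must the average fee be if fees alone fund network security?
BLOCKS_PER_YEAR = 6 * 24 * 365        # ~52,560 blocks
BLOCK_BYTES = 1_000_000               # 1MB limit
AVG_TX_BYTES = 400                    # assumed average transaction size

tx_per_year = BLOCK_BYTES // AVG_TX_BYTES * BLOCKS_PER_YEAR  # ~131 million
security_budget_usd = 500_000_000     # assumed annual cost of securing the network
fee_per_tx = security_budget_usd / tx_per_year
print(f"~{tx_per_year / 1e6:.0f}M tx/year -> ${fee_per_tx:.2f} average fee required")
```

The point is the shape of the result, not the exact dollar figure: with a hard cap on annual tx count, any fixed security budget translates into a floor on the average fee.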
** At 5GB, just about anyone running a full node can directly interact with the blockchain at low transaction cost; however, the resources for a full node would be on the order of:
- 262TB per year in storage requirements
- ~500 Mbps connectivity (bidirectional); this could likely be reduced by up to 80% through optimal tx and block-header sharing, but it would still be high
- a memory pool of ~2 blocks' worth of tx would be ~26 million transactions, so RAM requirements (to avoid painfully slow disk lookups during validation) would be something like 32GB
- The UTXO set is likely very large, as the number of independent direct users of the blockchain is high. It is hard to estimate, but we can expect the UTXO set to be large, and efficient validation requires at least a significant portion of it to be in high-speed memory.
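The figures above can be reproduced with some quick arithmetic; the 8-peer fan-out and ~385-byte average tx size are assumptions chosen to match the estimates in this post:

```python
# Rough full-node resource estimates for 5GB average blocks.
BLOCKS_PER_YEAR = 6 * 24 * 365
BLOCK_GB = 5

# Storage: every block kept forever.
storage_tb_per_year = BLOCK_GB * BLOCKS_PER_YEAR / 1000           # ~263 TB/year

# Bandwidth: relay each 5GB block to ~8 peers within the ~600s block interval.
PEERS = 8                                                          # assumed fan-out
bandwidth_mbps = BLOCK_GB * 8000 * PEERS / 600                     # ~533 Mbps

# Memory pool: ~2 blocks of pending tx at ~385 bytes each.
AVG_TX_BYTES = 385                                                 # assumed tx size
mempool_tx = 2 * BLOCK_GB * 1_000_000_000 // AVG_TX_BYTES          # ~26 million tx

print(f"storage ~{storage_tb_per_year:.0f} TB/yr, "
      f"bandwidth ~{bandwidth_mbps:.0f} Mbps, mempool ~{mempool_tx / 1e6:.0f}M tx")
```

Note that the raw bandwidth number is before any savings from not re-sending transactions a peer already has, which is where the "up to 80% reduction" above would come from.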
Both extremes result in centralization. The low limit results in a centralization of transactions: it becomes too limited and expensive to transact on blockchain, so most transactions occur off blockchain. The high limit results in a centralization of nodes: the extreme cost of running a node means there will be fewer of them.
The "optimal" block size would be one that perfectly balances the centralization of transactions against the centralization of nodes. Now, 1MB obviously isn't that perfect limit, and whatever the limit is raised to likely isn't either, but it would certainly be moving in the right direction. In other words, a 10MB limit is closer to optimal than 1MB is.