Finally, there is no such thing as a course of action to prepare for the future that is not based on some form of extrapolation or prediction. But that doesn't mean failing to be prepared is a good idea.
And I still have no idea what miner would deliberately lose money by taking block size up to the very limit using bogus transactions, or why they would try to. We're talking these days about people running actual businesses that they have invested major resources in and want to see some ROI. This is not just a set of randoms that's going to contain trolls who came along just to wreck things for the lulz. So I'm just not seeing the immediate threat model of a ridiculously enormous block chain that you claim to see.
There is a course of action to prepare for the future that is not based on some form of extrapolation. Many proposals have taken this form. All of them share the same failing: they are not implementable without adding some code for metrics.
What will be there for us in 2, 5, 10, or 20 years that knows how big Bitcoin blocks are and how big they need to be? The block chain will.
Simply stated, the threat model of a ridiculously enormous block chain is that the cost of running Bitcoin can be made unreasonably high, to the point where it becomes non-useful and even uneconomic.
Perhaps you do not see it because you consider only those within the Bitcoin economy as important. But consider the larger scope of players in the game theory, and consider that there might be some who would like our experiment in cryptocurrency called Bitcoin to fail.
The Bitcoin Network Cost = Data Size * Decentralization.
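A minimal back-of-the-envelope sketch of that relationship, using the number of full nodes as a stand-in for decentralization. Every figure below is hypothetical, chosen only to illustrate how the two terms multiply:

    # Rough sketch of "network cost = data size * decentralization".
    # All figures are assumed, for illustration only.
    block_size_mb   = 8              # assumed average block size, MB
    blocks_per_year = 6 * 24 * 365   # ~52,560 blocks at one per ~10 minutes
    full_nodes      = 6000           # assumed full-node count (decentralization proxy)

    data_per_node_gb   = block_size_mb * blocks_per_year / 1024
    network_storage_gb = data_per_node_gb * full_nodes   # every node stores every block

    print(f"per-node growth: ~{data_per_node_gb:.0f} GB/year")
    print(f"network-wide storage: ~{network_storage_gb:,.0f} GB/year")

With these assumptions each node adds roughly 410 GB a year, and the network as a whole pays that cost six thousand times over. Bigger blocks raise the first factor; keeping decentralization raises the second.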
It doesn't take a 50+% attack to grow the data size if the protocol permits it. The attacker does not lose money doing it, except at the margins through orphaned blocks. Orphan risk is a small fraction of mining revenue, though it matters to those working to be the most competitive.
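To put a rough number on "at the margins": a simple sketch of the expected orphan cost of one deliberately bloated block, treating block discovery as a Poisson process. The extra propagation delay and the block reward are assumed values, not measurements:

    import math
    # Sketch of expected orphan cost for one oversized block. Assumptions:
    # extra propagation delay of 30 s, ~600 s block interval, 25 BTC reward.
    block_interval_s = 600.0
    reward_btc       = 25.0
    extra_delay_s    = 30.0   # assumed extra relay/validation time for a bloated block

    orphan_prob       = 1 - math.exp(-extra_delay_s / block_interval_s)
    expected_loss_btc = orphan_prob * reward_btc

    print(f"orphan probability: {orphan_prob:.1%}")                      # ~4.9%
    print(f"expected loss: ~{expected_loss_btc:.2f} BTC per bloated block")

Under these assumptions the expected cost is on the order of one coin per bloated block, a marginal expense for a well-funded attacker.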
Consider an 'attacker' who is willing to absorb whatever losses accrue from the very occasional orphaned block in order to grow the data size by as much as the new maximum allows with every block they solve. This will have several bad effects.
The first bad effect is increased cost for all miners and node operators (these are not necessarily the same folks). The node operators take the biggest hit because there is no revenue for them anyway.
The miners also take a hit in increased cost (bandwidth, maintenance, storage). The smallest miners, and those with the most expensive bandwidth, may fall below profitability and leave the market. This benefits the 'attacker' in two ways (a rough sketch follows the list below):
1) they get a larger share of the hash rate by knocking out competition
2) they increase the centralization of node operations and mining, making Bitcoin ever easier to attack.
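Here is the promised sketch of effect 1), with assumed numbers for the attacker's starting hash share and the fraction of marginal hash rate pushed below profitability:

    # Sketch of how an attacker's hash-rate share grows when marginal miners
    # are squeezed out by higher bandwidth/storage costs. Figures are assumed.
    attacker_share = 0.20   # attacker's share of total hash rate before the squeeze
    squeezed_out   = 0.10   # fraction of total hash rate that leaves the market

    # The attacker keeps the same absolute hash rate while the total shrinks.
    new_attacker_share = attacker_share / (1.0 - squeezed_out)

    print(f"attacker share: {attacker_share:.0%} -> {new_attacker_share:.1%}")
    # 20% -> 22.2% with these assumptions; the gain compounds if the squeeze continues.

The attacker spends nothing extra on hashing hardware; the share grows simply because competitors drop out.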
The second bad effect of the exponential growth plan is its perniciousness.
An attack does the greatest damage when it is sudden, persistent, and overbearing.
We may have a great majority of Bitcoin-economy miners who manage their block sizes sensibly even though the maximum is 16MB-16GB, or whatever the limit is for the period. The average block size may remain under 1MB for many years to come, or grow much more slowly than the extrapolation predicts; we simply do not know what it will be. An attacker can wait until a time when Bitcoin is particularly vulnerable and only then start mining huge, bloated blocks to make running it more expensive. The risk slowly increases until the moment it can be exploited.
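A small sketch of that "wait, then bloat" scenario, comparing chain growth before and during such an attack. The attacker's hash share, the typical block size, and the cap are all assumed figures:

    # Sketch of chain growth when an attacker with some share of the hash rate
    # switches from normal-sized blocks to maximum-sized blocks. Figures assumed.
    blocks_per_day = 6 * 24     # ~144 blocks/day
    avg_block_mb   = 0.5        # assumed typical block size before the attack
    max_block_mb   = 16.0       # assumed block size limit of the period
    attacker_share = 0.25       # assumed fraction of blocks the attacker solves

    normal_growth = blocks_per_day * avg_block_mb
    attack_growth = blocks_per_day * ((1 - attacker_share) * avg_block_mb
                                      + attacker_share * max_block_mb)

    print(f"normal chain growth: ~{normal_growth:.0f} MB/day")
    print(f"under attack:        ~{attack_growth:.0f} MB/day")
    # ~72 MB/day vs ~630 MB/day with these assumptions -- far worse at a GB-scale cap.

One quarter of the hash rate, mining legal blocks, is enough to multiply the chain's growth rate several times over, and the multiplier scales with whatever the cap has grown to by then.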
The third bad effect is that the limit could be too low. Ridiculously high may not be high enough.
Bitcoin could become wildly successful much sooner than expected.
Do you seriously think that it might take 20 years to solve the block size measurement problem?
I like that the new proposal does have a sunset provision (only 10 doublings, so a 2^10 increase). With each revision Gavin has improved the proposal, though it still seems very pessimistic. If we are postulating exponential Bitcoin transaction growth, why not also postulate exponential growth in Bitcoin developer expertise?
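For reference, the arithmetic of ten doublings, using the 16MB figure mentioned earlier purely as an illustrative starting point; the two-year doubling interval is likewise assumed here, not taken from the proposal text:

    # Ten doublings from an assumed 16 MB starting limit, one doubling every
    # two years (interval assumed for illustration).
    start_mb = 16
    for i in range(11):
        print(f"after {2 * i:2d} years: {start_mb * 2**i:>6d} MB")
    # ends at 16 * 2**10 = 16,384 MB, i.e. ~16 GB -- a 1024x increase overall.

Whether that 1024x factor turns out to be too generous or not generous enough is exactly the question the block chain itself could answer, if we measured it.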