Even if it takes time: the blockchain is already very big. If you make it bigger, normal people will need to upgrade their hardware to use it, and they won't do that.
Right now there isn't even an immediate need to fork, so the proposal doesn't make sense at this point in time.
As noted before: reaching the block limit will at first push microtransactions off the chain, and that won't be an issue for most users.
Forking to a bigger chain isn't rational at this point in time. Period.
Do you know how many viable blockchains are out there with almost only empty blocks and very small chains (below 1 GB of storage)? Dozens!
Blockchains aren't scarce. So why would I use one blockchain that requires hundreds of GB of storage when I can use one almost as secure with much less disk use? I personally will leave BTC behind for good if there's a larger chain (I simply refuse to use Gavincoin; it isn't even 'bitcoin', it is really 'gavincoin'), or I'll stick to the old fork in case it can survive.
Microtransactions aren't the only thing that will get pushed off the chain. I already mentioned that the Lighthouse project is being hindered by the 1MB block limit. Why not discuss that? Stop with the microtransactions already. Those will be free and not relevant. What is relevant is allowing services to run on the blockchain!
Gavin perfectly understands the economics of the block fee. You do not seem to understand that a block size of 1MB hinders and blocks services from running on top of the blockchain. I have already pointed out that the Lighthouse project is having issues because of the 1MB block size limit. Nobody seems to talk about the subject and I don't understand why. Are you happy that the Lighthouse project is hindered? I'm definitely not!
There is currently room for up to 500 KB, which is enough for most kinds of contracts. As long as current usage does not even fill the blocks, it's unlikely that increasing the available space by 20x is going to increase the usage by more than 20x. This has not been Gavin's argument for increasing the size either.
Increasing the size may be a good idea some time in the future but, until the usage actually increases, it's a bad idea.
I read everywhere that Bitcoin is programmable money. 500 KB doesn't seem to leave a lot of space for complex scripts. Thanks to the great analysis posted by D&T we can see that a 15-of-15 P2SH transaction needs about 1.5 KB, which means that only ~300 transactions of this type fit in a single block, without counting the rest of the transactions.
P2PkH: 131 bytes per script round trip (25 byte scriptPubKey + 106 byte scriptSig)
2-of-3 P2SH: 253 bytes per script round trip (22 byte scriptPubKey + 231 byte scriptSig)
3-of-5 P2SH: 383 bytes per script round trip (22 byte scriptPubKey + 361 byte scriptSig)
15-of-15 P2SH: 1,481 bytes per script round trip (22 byte scriptPubKey + 1,459 byte scriptSig)
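The capacity arithmetic behind the ~300-transaction figure can be sketched in a few lines of Python. This is just an illustration using the per-script byte counts from D&T's list above and the 500 KB headroom figure; it counts only the script round trip, not the fixed per-transaction overhead, so real capacity would be somewhat lower:

```python
# Script round-trip sizes in bytes, taken from the analysis above.
SCRIPT_SIZES = {
    "P2PKH": 131,
    "2-of-3 P2SH": 253,
    "3-of-5 P2SH": 383,
    "15-of-15 P2SH": 1_481,
}

def txs_per_block(script_bytes, available_bytes=500_000):
    """How many transactions of a given script size fit in the available space.

    500 KB is the headroom quoted in this thread (the hard limit is 1 MB).
    Ignores per-transaction overhead (version, locktime, input/output counts),
    so this is an upper bound.
    """
    return available_bytes // script_bytes

for name, size in SCRIPT_SIZES.items():
    print(f"{name}: at most {txs_per_block(size)} transactions in 500 KB")
```

For the 15-of-15 case this gives roughly 337 transactions, which matches the "only ~300" estimate above once you account for the overhead the sketch leaves out.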
If you have 300 corporations willing to do a 15-of-15 P2SH transaction, then you have already filled the block, leaving no space for the free transactions.
Nobody said that increasing the available space by 20x will increase the usage by more than 20x, but at least we would have the possibility of increasing usage instead of being blocked. I have no idea what services the smart minds are preparing for the blockchain, but increasing the block limit allows development!
Also don't forget about the Lighthouse project! I don't understand why everyone is ignoring it.