I'm not smart enough to figure this out. I wish Satoshi would weigh in on this issue. I suspect he may have already envisioned the outcome.
Austrian economic theory says that no one is smart enough, because no one can have all of the information. The best that we can do is take a guess, and I would guess that it isn't going to be a real problem. Certainly not in my lifetime.
The main issue raised by this thread does seem like a real concern for the long-term health of this project: there is a very real possibility that the current limit will not be sufficient for "ordinary" transactions at some point in the future. Judging by the data processing and transaction rates of other payment processors such as PayPal, the current limit is not merely insufficient but woefully insufficient. I realize we aren't anywhere near those demand levels, but it is still an issue worth thinking about.
There are also plenty of examples where software-architecture decisions, including the use of hard-coded constants, have had profound real-world impact simply because the design team was short-sighted and didn't anticipate the future very effectively. Examples include the Y2K bugs, the Unix 2038 date-overflow bug (it remains to be seen how that one will be completely solved), and, perhaps most similar to the current situation, the exhaustion of the IPv4 address space. There are plenty of other instances where a hard-coded constant has come back to bite end-users in unexpected ways; it's one of the reasons software developers call these figures "magic numbers". When some very intelligent people are complaining that an issue of this nature will have significant impact, it at least deserves some attention.
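To make the 2038 example concrete, here is a small Python sketch (my own illustration, nothing to do with the Bitcoin code) of how a signed 32-bit time_t wraps around one second past its maximum value:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # the largest second count a signed 32-bit time_t can hold

def wrap32(t: int) -> int:
    """Simulate signed 32-bit integer overflow, as a too-small time_t would."""
    return (t + 2**31) % 2**32 - 2**31

last_moment = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
one_second_later = datetime.fromtimestamp(wrap32(INT32_MAX + 1), tz=timezone.utc)

print(last_moment)       # 2038-01-19 03:14:07+00:00
print(one_second_later)  # wraps all the way back to 1901
```

The point is that the "magic number" was invisible until the real world grew past it, which is exactly the worry with a hard-coded block size.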
The specifics of how to avoid this problem are the point of this thread, along with a strong suggestion that rules ought to be incorporated into the network governing how this hard-coded limit can be adjusted.
The moral of this story is that the non-generating clients operate on the network at the pleasure of the miners. The miners are effectively in control of the "health" of the network, and the current block size limits reflect that. So, for example, block http://blockexplorer.com/b/92037 is about 200455 bytes long and mostly contains spam. Normal blocks max out at 50k. This shows that at least one generator has chosen to waive the current fee scheme. I think that letting miners effectively decide their own fee schemes will be seen to be the least bad option.
I will say, in regard to miner control of the network, that this is mostly true but not absolute. Blocks sent out by miners can also be rejected by the vast mass of clients, who simply refuse to recognize a block. A miner may do something that the rules enforced by the rest of the network don't allow, and that particular block is simply going to be rejected. With the rules as currently established, a miner who chooses to create a very large block will have that block ignored by the current network. This is, in effect, "proof" that the miners don't have absolute authority here: miners also work at the pleasure of the network as a whole, and have "constitutional limits" imposed upon them by the network rules. The maximum block size is one of the few rules outside the control of any single miner. Similar rules could be adopted by a significant portion of the clients to exclude certain miners, or even groups of miners, providing a check against a sort of "tyranny of the miners".
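As a sketch of how such a "constitutional limit" works in practice: every client checks each incoming block against the same hard-coded constant, so an oversized block is orphaned no matter who mined it. The one-megabyte figure below mirrors the MAX_BLOCK_SIZE constant in the current client; the check itself is my own simplified illustration, not the actual validation code:

```python
MAX_BLOCK_SIZE = 1_000_000  # bytes; mirrors the hard-coded limit in the client

def client_accepts(serialized_block: bytes) -> bool:
    """Each client independently enforces the size rule on incoming blocks."""
    return len(serialized_block) <= MAX_BLOCK_SIZE

# The ~200 kB spam block mentioned above is still well under the limit:
print(client_accepts(b"\x00" * 200_455))    # True
# A miner who builds a block past the limit simply has it ignored:
print(client_accepts(b"\x00" * 1_000_001))  # False
```

Because every client runs the same check, no single miner can unilaterally relax it; changing the limit means changing the rule on a large majority of the network at once.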
I'm not going to speculate about how such rules might be established, or what other potential rules might look like, other than to suggest that the block size limit is one such rule and that it needs to be reconsidered, certainly as a fixed size. The long-term consequence of leaving it unchanged is that transaction fees may escalate to absurd levels: with more and more people competing to get the network to incorporate their transactions, a sort of "fee arms race" develops, particularly because miners would simply be unable to get larger blocks accepted by the network.
The opposite approach is voluntary self-limiting, where miners simply choose not to grow their blocks to large sizes. As long as some miner somewhere is allowed to build an arbitrarily large block, that miner will pick up the transactions with low fees, or perhaps no fees at all, even if those blocks take a while to get incorporated into the network.