Include the soft limit in the verification rules of as many clients as possible, and miners who first comment out that rule for themselves will be punished by the network, at least until a majority of users upgrade their clients to match. The rest of the miners that didn't comment out the rule would benefit from the harm the first mover takes upon himself.
Huh...wha...eh??? This makes no sense. The "soft limit" is not a verification rule; it is part of the algorithm that the mining example code uses to put together a candidate block. It stops adding transactions when the candidate block reaches 250 KB. This doesn't mean that miners will reject blocks that are over 250 KB, it just means that they will not PRODUCE them (unless someone modifies the code). This is neither a hard fork nor a soft fork. Think of it as a "canary in the coal mine." Right now, there is little economic incentive to modify that piece of code, for two reasons: 1) the transaction volume is not high enough, and 2) block subsidies are orders of magnitude larger than fees. When these conditions change, miners at the margin will have a financial incentive to change the code. Someone like Gavin can study the blocks in the block chain to see what fraction of blocks are larger than 250 KB. This will provide insight into how miners react to the soft limit.
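To make the distinction concrete, here is a minimal Python sketch of the kind of block-assembly loop the soft limit lives in. It is purely illustrative: the reference client's mining code is C++, and the function name, tuple layout, and selection policy below are my own simplifications, not the actual implementation.

```python
# Minimal, illustrative sketch of a "soft limit" in block assembly.
# Names and policy are mine, not code from any real client.

SOFT_BLOCK_SIZE_TARGET = 250_000   # bytes: the 250 KB soft limit
BLOCK_HEADER_SIZE = 80             # bytes

def assemble_candidate_block(mempool_txs):
    """Select transactions for a block this miner is about to work on.

    mempool_txs: iterable of (txid, size_in_bytes, fee) tuples, assumed
    already sorted by whatever priority/fee policy the miner prefers.
    """
    block_size = BLOCK_HEADER_SIZE
    selected = []
    for txid, tx_size, fee in mempool_txs:
        # Stop PRODUCING a bigger block once the soft target would be exceeded.
        # Nothing here rejects a block over 250 KB received from another miner;
        # that is what makes this local policy rather than a verification rule.
        if block_size + tx_size > SOFT_BLOCK_SIZE_TARGET:
            break
        selected.append(txid)
        block_size += tx_size
    return selected, block_size
```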
Making the 250 KB limit a verification rule of clients is a fork (not sure if it's a hard fork or a soft fork). It makes no sense to do this. You can't assume that everyone is going to upgrade to this version, nor should you assume that once this rule is adopted by clients it will ever go away. You have effectively reduced the 1 megabyte hard limit down to a 250 kilobyte hard limit. Good job, LOL, the opposite of what people are arguing for here!
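For contrast, here is a hedged sketch of what turning the soft limit into a verification rule would mean. Again, the names and structure are mine and not code from any client; the point is only that an upgraded client would start rejecting blocks between 250 KB and 1 MB that are valid today.

```python
# Illustrative only, not code from any client.

SOFT_BLOCK_SIZE_TARGET = 250_000    # bytes
HARD_BLOCK_SIZE_LIMIT = 1_000_000   # bytes: the 1 MB consensus limit

def block_size_valid_today(block_size_bytes):
    # Current consensus rule: only the 1 MB hard limit is enforced.
    return block_size_bytes <= HARD_BLOCK_SIZE_LIMIT

def block_size_valid_with_soft_limit_enforced(block_size_bytes):
    # Proposed change: clients also reject anything over 250 KB.
    # A 600 KB block that is valid today becomes invalid to upgraded clients,
    # so the 250 KB figure has effectively become the new hard limit.
    return block_size_bytes <= SOFT_BLOCK_SIZE_TARGET

assert block_size_valid_today(600_000) is True
assert block_size_valid_with_soft_limit_enforced(600_000) is False
```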
The fundamental market logic behind that idea seems solid enough that it actually doesn't matter too much how the relation is calculated.
But it does, and I gave an example. The transition from FPGAs/GPUs to ASICs will cause the network hashing rate to skyrocket in a way that is totally unconnected to the value of Bitcoin or the amount collected in fees. This alone should tell you that the connection between hash rate and transaction scarcity is tenuous at best (non-existent at worst, as is the case currently). If we had this system in place now, it would cause the block size to grow despite the absence of scarcity, resulting in less miner revenue, not more.
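To put rough numbers on that failure mode: the adjustment rule below is hypothetical (just one straightforward way "tie the block size to the hash rate" could be written down), and the hash-rate figures are invented for illustration, but the shape of the problem is the same for any rule of this kind.

```python
def hashrate_linked_max_size(base_size_bytes, current_hashrate, reference_hashrate):
    # Hypothetical rule: scale the maximum block size with network hash rate.
    return int(base_size_bytes * current_hashrate / reference_hashrate)

base_size = 1_000_000              # 1 MB starting limit
gpu_era_hashrate = 25e12           # ~25 TH/s, an invented GPU/FPGA-era figure
asic_era_hashrate = 2_500e12       # a 100x jump once ASICs ship (also invented)

new_limit = hashrate_linked_max_size(base_size, asic_era_hashrate, gpu_era_hashrate)
print(new_limit)  # 100000000, i.e. a 100 MB limit

# The limit grew 100x because hardware got better, while transaction demand,
# fees, and the value of Bitcoin were unchanged: scarcity disappears and
# fee revenue per block falls rather than rises.
```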
I hope that we can now put to rest the idea of tying the block size to the network hash rate. It is a bad idea.
I believe that any scheme for adjusting the maximum block size should:
1) React to scarcity
2) Not promote centralization by forcing out marginal miners