The tragedy of the commons relates to unregulated use of common resources.
I don’t think this is in any way relevant. Mining resources are not under common ownership, and there is no common right to use them. They are generally owned by private entities trying to make a commercial profit, and you have no right to make them mine your transaction. Miners can choose which transactions they mine, so the use of this resource is regulated by the miners.
The common resource here is abstract. It is the willingness of users to pay tx fees. By accepting low-fee txs, a miner consumes this resource (that is, makes users less willing to pay tx fees) for his own benefit, at the expense of the total benefit of all miners. And, since Bitcoin needs miners, this is a problem for all Bitcoin users.
Nobody suggested that the mining devices themselves are a common good.
Miners will simply set prices at the level that makes mining worth their while and yields a reasonable profit. With no block size limit they will be able to turn a profit at a lower fee level.
economic theory says that in a competitive market, supply, demand, and price will find an equilibrium where the price is equal to the marginal cost to suppliers plus some net income (because suppliers can always choose to do something more profitable with their time or money)
(Gavin’s blog)
Quoting this theory blindly fails to acknowledge some characteristics specific to Bitcoin mining. Most importantly, that we want to keep mining costs artificially high. Absent any limiting mechanism, sure, the market will reach an equilibrium... an equilibrium where tx fees are low, total hashrate is low, and the network is vulnerable to hashrate attacks.
In a healthy network with a high hashrate, the main cost of a tx is amortized, and the marginal cost is negligible. Hence, having the price equal to the marginal cost (as in the quote) is a disaster.
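To make that concrete, here is a rough back-of-the-envelope sketch; every number in it is made up for illustration, not measured:

Code:
# All figures below are hypothetical, purely to illustrate the argument.
marginal_cost_per_tx = 0.000001   # BTC: cost of relaying, verifying and storing one tx
txs_per_block = 4000              # assumed throughput with no effective limit
security_budget_per_block = 25.0  # BTC: what keeping the hashrate attack-resistant might cost

# In the quoted equilibrium, the fee converges to roughly the marginal cost.
fee_revenue_per_block = txs_per_block * marginal_cost_per_tx

print("fee revenue per block: %.4f BTC" % fee_revenue_per_block)
print("share of the hashing budget covered: %.2f%%"
      % (100 * fee_revenue_per_block / security_budget_per_block))
# With these made-up figures, fees cover about 0.02% of the hashing budget:
# marginal-cost pricing funds the marginal cost, not the amortized cost of
# securing the chain.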
Put another way: mining has a positive externality which is difficult to monetize, due to "race to the bottom" effects. Left to the market's own devices, no mining will be done. Hence we need some way to coordinate players into providing this benefit.
Speaking of caps: I'd like to stress the point that there are two separate costs in the Bitcoin network, each of which should be addressed in its own way:
1. The marginal cost of propagating, verifying and storing transactions. Caps on block data size and on the number of ECDSA signature operations help with funding this (a rough sketch of such caps appears after this list).
2. The amortized cost of hashing blocks to secure the network. This has nothing to do with data size, and using data size caps to fund this is misguided and creates perverse incentives.
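A minimal sketch of what those per-block caps look like from a verifying node's point of view; the constants and function are hypothetical, not Bitcoin's actual consensus code:

Code:
# Hypothetical cap values; real consensus rules differ.
MAX_BLOCK_BYTES = 1_000_000    # cap on serialized block size
MAX_BLOCK_SIGOPS = 20_000      # cap on ECDSA signature operations per block

def block_within_caps(block_bytes, sigops):
    # Both caps bound the marginal cost (bandwidth, CPU, storage) a block can
    # impose on every verifying node; neither touches the amortized cost of
    # hashing, which is the point of the distinction above.
    return block_bytes <= MAX_BLOCK_BYTES and sigops <= MAX_BLOCK_SIGOPS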
Meni
Thanks for this interesting comment. I guess you are correct that the amortised cost of hashing blocks has nothing to do with the data size. However, as you say, knowing the value for the user is difficult. I am not sure the value of bitcoins sent is a good proxy, because of additional layers like colored coins, and who is to say that a transaction for one person buying medical care has less value than a millionaire pointlessly moving money between wallets? Why not assume all transactions are equal? The number of transactions may be the best proxy.
Additional layers are an issue to consider, but even so I believe value sent is a better proxy than treating all transactions as equal.
Regarding your other point:
1. I hope you realize that my suggestion makes things easier for the person paying for medical care. The quantity sent is lower for him, so he is expected to pay lower fees (see the rough sketch after this list).
2. I'm talking about value in the economic sense, not in the emotional, personal sense. Someone who is rich and is sending large amounts is generally willing to pay higher fees. This willingness is an advantage, and should be used to fund the network.
3. If the moving of money between wallets is "pointless", I see no harm in a policy that discourages it. If it's not pointless (security concerns etc.), then the sender is willing to pay for it, and we should use that.
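For concreteness, a toy comparison of a flat per-transaction fee against a fee proportional to value sent; the rate and the amounts are made up, and the exact fee function is only a stand-in for the idea of using value sent as the proxy:

Code:
# Made-up numbers, purely to illustrate the two policies discussed above.
FLAT_FEE = 0.0005     # BTC per transaction, regardless of value
VALUE_RATE = 0.0002   # 0.02% of the value sent

def flat_fee(value_sent):
    return FLAT_FEE

def value_proportional_fee(value_sent):
    return value_sent * VALUE_RATE

examples = [("medical payment", 0.5),    # the small, personally important payment
            ("wallet shuffle", 500.0)]   # the millionaire moving money around

for label, value in examples:
    print("%s: flat %.5f BTC, proportional %.5f BTC"
          % (label, flat_fee(value), value_proportional_fee(value)))
# Proportional fees charge the small payment 0.00010 BTC (less than the flat
# fee) and the large transfer 0.10000 BTC, shifting the burden toward senders
# with greater economic willingness to pay.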
Therefore I would alter my “transaction fee targeting”, mentioned at https://bitcointalksearch.org/topic/m.9208935, to be the following:
I propose the following rule to determine the limit on the number of transactions in a block, once the block reward is low: the limit would increase (or decrease) by X% if the total transaction fees in the last N blocks are Y bitcoins or more (or less).

This kind of dynamic rule can have very unstable, unpredictable behavior. It should be approached with great care.
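For concreteness, a rough sketch of the quoted rule, with X, N and Y left as hypothetical parameters; the feedback loop it creates (fees move the limit, the limit moves fees) is exactly the kind of dynamic that can misbehave:

Code:
# X, N, Y are unspecified in the proposal; these values are placeholders.
X = 0.05     # adjust the per-block transaction limit by 5%
N = 2016     # look back over the last N blocks
Y = 50.0     # target: Y BTC of total fees across those N blocks

def next_tx_limit(current_limit, recent_block_fees):
    # recent_block_fees: list of total fees (in BTC) of recent blocks.
    total_fees = sum(recent_block_fees[-N:])
    if total_fees >= Y:
        return int(current_limit * (1 + X))   # fees at or above target: raise the limit
    return int(current_limit * (1 - X))       # fees below target: lower the limit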
Most of the conversation seems to have been about ensuring sufficient transaction fees are paid to miners. But what about the verifiers? Currently, all tx validation is done by volunteers. I think Satoshi initially intended for validators to double as miners, but in a world where the two are largely mutually distinct, how do we support the verifiers? And if we can't, isn't the network doomed anyway?
This is true, and I believe research such as the Red Balloons paper is a step towards resolving this issue.
Anyway, I expect the cost of verifying to be much lower than that of hashing, so finding a way to fund it should be easier.
Related to this is something I have not seen considered: the upper limit on hash speed per device.
There is no known physical limit on computation. (There are some erroneous limits occasionally quoted, based on the physical cost of erasing information; but in truth, it is not known that computation requires erasing information in proportion to the amount of computation.)
And even if there is:
1. We're decades away from being anywhere near it
2. I'm not sure why the rest of your scenario would follow.
I couldn't resist peeking at the literature; the first hit on a google search for "experimental economics free rider" turns up this 1984 paper:

Both conventional wisdom and economic theory have been called into question recently by a series of research papers which report experimental studies of collective decision-making about public goods. Almost without exception, these papers have reported results that cast serious doubt upon the importance - and, in some cases, even upon the very existence - of the free rider problem.
AFAICT, this paper discusses the methodological errors in those research papers that dismissed the free rider problem, and itself did not find evidence against the free rider problem.
Anyway, in the real world the free rider problem obviously exists.