
Topic: Individual Block Difficulty Based on Block Size - page 2. (Read 4688 times)

member
Activity: 129
Merit: 14
Work calculations would need to follow the declared work of the block -- e.g. a 3x block is valued at 3x the work.

Is this a significant forking risk?  Miners could continue to work on an old block and ignore a new small low-difficulty block in order to get the higher fees.  This miner would know that if they found a large 3x-difficulty block they would have the longest chain and everyone would ignore the small blocks.
legendary
Activity: 1400
Merit: 1013
Only if block sizes are limited. If they are unlimited, even a tiny portion of the hashrate can continually clear the market at effectively no cost to themselves; that's the whole point of this post-- creating a cost for doing so.
Why do you think this is a problem, or more specifically, why do you think there is no cost to larger blocks in the system I proposed (did you read it?)

If nodes are charging for bandwidth, and those transactions are in the mempool of most of the network, then the people who created the transactions already paid for them to be distributed.

If some miner with 5% of the hash rate includes all transactions in the mempool regardless of fee, then it just means that paying no fee means the transaction takes 3 hours to get confirmed. Anyone who wants a confirmation faster than that will pay a higher fee.
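The three-hour figure above is simple expected-value arithmetic; a quick sketch (only the 5% share and 10-minute interval are from the post, the code is illustrative):

```python
# If only miners controlling fraction p of the hashrate will include a
# zero-fee transaction, each ~10-minute block is one of theirs with
# probability p, so the expected wait until inclusion is 10/p minutes.
def expected_wait_minutes(hashrate_share, block_interval_min=10.0):
    return block_interval_min / hashrate_share

print(expected_wait_minutes(0.05))  # 200.0 minutes, i.e. roughly 3 hours
```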

If miners are independently collecting transactions which are not broadcast to the network ahead of time, then they lose O(1) propagation speed. Their block will be more expensive to propagate in terms of direct cost and orphan risk.
legendary
Activity: 905
Merit: 1014
Work calculations would need to follow the declared work of the block -- e.g. a 3x block is valued at 3x the work.
member
Activity: 129
Merit: 14
Dear foolish_austrian

Thanks for this really interesting writeup.  I think its very well written and could be a great idea or move us closer to a great idea.

How does the "longest chain rule" concept fit into this?  If there are two valid blocks of equal height but different difficulty, should miners work on the larger, higher-difficulty block due to its greater cumulative difficulty, or on the first block they receive?   Does one block with 3x the average difficulty equal three average blocks?  Transactions could then have a below-average number of confirmations.

If so would this increase the orphan rate?

On the other hand, if all valid blocks of equal height are treated the same, would this make a double spend attack less costly?  Also, could this support a selfish mining attack as a selfish miner focuses on smaller, lower difficulty blocks to get the crucial two block lead? The selfish miner also has less competition as other miners try to find larger blocks.
legendary
Activity: 905
Merit: 1014
This is a challenge for the distant future, if ever. It seems pretty likely the value of the currency at the next halving will be at least double its value at the last halving (which was only 12 USD), with 3 more doublings already provided for at today's price.

It's a problem today in contexts such as sidechains where there is no block subsidy. That's why we were looking at this exact proposal internally at Blockstream (alas, I can find no public descriptions of the idea, so we've been scooped!). I share gmaxwell's concern that there seem to be an infinite number of possible curves to use, and it is not even obvious what metric should be used to rate them. It would be ideal to show that for various distributions of priority/fee-per-kb in the mempool that there exist Nash equilibrium mining strategies which result in a stable block size. The block size adjustment algorithm must also do something to mitigate centralization pressures of larger block sizes -- larger blocks make better connected miners have larger apparent hash rates than they actually do, due to differences in propagation time. Neither of these are well formalized yet, and there may be other requirements remaining to be discovered.

A few minor notes: since the difficulty is specified in the block header, it is possible for the miner to claim a block size greater than the actual block size by adjusting their reported difficulty (the bits field of the PoW). Additionally, for Bitcoin you have the question of whether or not to adjust the subsidy by the same factor. Arguments for and against adjusting the subsidy have to do with whether you want there to be a near-term cost to adjusting the block size.
newbie
Activity: 54
Merit: 0
They can only influence users' expectations of required fees in proportion to their hashrate, which you specified was small.
Only if block sizes are limited. If they are unlimited, even a tiny portion of the hashrate can continually clear the market at effectively no cost to themselves; that's the whole point of this post-- creating a cost for doing so.

The size of the portion must matter, since an even tinier portion could still mine all the free txes at 3x difficulty, just that much less frequently.

This is a challenge for the distant future, if ever. It seems pretty likely the value of the currency at the next halving will be at least double its value at the last halving (which was only 12 USD), with 3 more doublings already provided for at today's price.
newbie
Activity: 13
Merit: 5
An interesting concept. I need to spend some time digesting it but it is an interesting direction to take.   I think the biggest risk is the complexity factor.  Bitcoin is at a conceptual level very simple and yet the number of potential edge cases, attack vectors, and exploits is relatively large.  Take a more complex system and all the risks become harder to model.  Still that alone isn't a reason to discount it, just a risk factor.

I agree that simplicity is always preferable. I mulled the concept of adding additional PoW into the Merkle tree, or somewhere parallel to the header, and those approaches all added complexity and in the end didn't add to the network security. I really didn't like the idea of changing the header structure.

That said, I don't think this is all that complex, and the average user shouldn't have to care. All they need to know is that miners need to make a profit, and the higher the fee they pay, the sooner a miner can include their transaction in a block. If they pay too low, a miner will need to wait until it's profitable, which may never happen if they pay far below the market price.

In the past when I'd worked on this I'd struggled at producing a difficulty-size relation which was well justified; there seem to be an infinite class of functions that work just about as well. "Why this shape and not this other one?".

I think you're right that there are an infinite number of polynomials that work. This is where it seems we cannot divorce the politics from the algorithm. Each polynomial is going to produce a slightly different outcome. I simply chose one as a starting point to illustrate the concept of difficulty vs. size (which I'm glad to find you've mulled also!). I think the key is that anything sub-linear will eventually cause most transactions to become profitable, but with a varying delay. For example, we could make a polynomial that is nearly linear, which would result in the time penalty for lower-fee transactions being even longer. Linear would be the opposite extreme to what Satoshi chose. I'm not sure we can say that one is correct, but I would point out that Satoshi did choose one curve, namely constant, and it's at the other extreme. I would also say that we have no reason to claim this one is any more correct, but we do know it has the potential to produce certain outcomes that we're not too excited about... so in other words, we cannot be neutral and say we just won't pick a polynomial, because one has already been picked.

https://i.imgur.com/n5pPYES.png



but I was unable to prove that even with homogenous-cost rational miners if there is a single supply level at which miner income is maximized that the system has an equilibrium there. I think it would be really good to be able to show that.

I actually started down this path, then realized that with anything other than constant (Satoshi's parameters), the population and distribution in the unconfirmed transaction pool is always going to be changing, and therefore the profitability of mining will constantly be changing. As a result, I think you're going to see a whole array of mining strategies.

One of my concerns with this is if these complex strategies would lead to the expectation value for the block time to always remain 10 min, or if it would oscillate. It might produce undesirable oscillations if the natural frequency is too close to 2016 blocks, but perhaps this could be mitigated by calculating the average block size over a longer time span.


because the function you've shown there doesn't obey the invariant that the result should be 1 if s,d = 1.


My bad, I copied the equation wrong from my code. I think the figures were correct. Are the labels confusing? Here are the axes with numbers instead (I was trying to make them more descriptive by adding the "x average" labels).

https://i.imgur.com/UE08LJy.png


As far as implementation goes, ... using floating point in consensus would be an unmitigated disaster. But that's no problem; a reasonable, cheap, fixed-point approximation is no problem-- esp. for the limited range this function needs to operate over, a rational polynomial fit will work fine.

Yep... I've revealed that I'm at best an informed hobbyist. I don't know the code, but I did do my best to read about the block structure and such to attempt to suggest minimal changes. I have done a tiny bit of embedded design, so I know how painful floating point can be.
donator
Activity: 1218
Merit: 1080
Gerald Davis
An interesting concept. I need to spend some time digesting it but it is an interesting direction to take.   I think the biggest risk is the complexity factor.  Bitcoin is at a conceptual level very simple and yet the number of potential edge cases, attack vectors, and exploits is relatively large.  Take a more complex system and all the risks become harder to model.  Still that alone isn't a reason to discount it, just a risk factor.
staff
Activity: 4326
Merit: 8951
No, they can't.  They can only influence users' expectations of required fees in proportion to their hashrate, which you specified was small.
Only if block sizes are limited. If they are unlimited, even a tiny portion of the hashrate can continually clear the market at effectively no cost to themselves; that's the whole point of this post-- creating a cost for doing so.

Some useful thought fodder... Monero/Bytecoin/etc. do something like this, but they require the miner to throw away some of the coins they'd receive in their coinbase txn. The problem with that is that once subsidy is small the miners can simply bypass it by accepting fees "out of band" (e.g. with addition txouts). Difficulty scaling avoids that trap.
member
Activity: 129
Merit: 14
Brilliant post!!!
newbie
Activity: 54
Merit: 0
A small minority of miners could impose a situation where all transactions are mined, even no-fee transactions.

No, they can't.  They can only influence users' expectations of required fees in proportion to their hashrate, which you specified was small.
staff
Activity: 4326
Merit: 8951
tl;dr Non-mining nodes can in principle receive compensation for transaction handling without radical changes to Bitcoin.
I don't see why you think this is relevant. Relaying by other nodes plays no role in the economics of mining: miners can (and already do, and have since 2011) advertise locations which can receive transactions for them, and can do so trivially and in a jamming-proof way by listing them in the blocks they mine. Transactions then go straight from users to miners, which is inherently more efficient, since the multi-hop flooding network relay was just pure overhead (a product of trying to achieve decentralization at great bandwidth cost). Anyone can "in principle" pay anyone else, sure. But there is no obvious reason for anyone to do so.

foolish_austrian, I'm very interested in that kind of closed-loop control vs difficulty... In the past when I'd worked on this I'd struggled at producing a difficulty-size relation which was well justified; there seem to be an infinite class of functions that work just about as well. "Why this shape and not this other one?".  Have you considered an absolute floor (e.g. the size limit can't go below 1MB regardless of what the average size is) and whether that creates weird incentives?

I tried doing some work where I assumed there was a unimodal optimum quantity and tried to prove (for some function) that the system would always find equilibrium there but I was unable.  All these 'quadratic curve' like systems do have a nice property that differences in miner efficiency turn into differences in the block sizes they produce, thus equalizing their participation some-- more efficient miners produce larger blocks than typical rather than just driving people out of the market quite so strongly (though this is less powerful when you assume transaction fees/byte are power-law rather than uniform)... but I was unable to prove that even with homogenous-cost rational miners if there is a single supply level at which miner income is maximized that the system has an equilibrium there. I think it would be really good to be able to show that.

Back to your writeup; can you check your figures? because the function you've shown there doesn't obey the invariant that the result should be 1 if s,d = 1. Using 1/2 as both your coefficients does, but disagrees with your illustrations; did you have some scale and offset you're not listing here?

Subsidy can be addressed by scaling it with the function (e.g. at 2x difficulty, you get 2x the subsidy) and just letting it move the inflation schedule in or out somewhat, so the subsidy is neutral to the process... though it's annoying because it makes it more complex and harder to reason about.   Your notion of having coin-holders influence the target hashrate growth is interesting; though if you do so internally to the network with transactions, miners can censor them to change the result, so some care and thought must be taken there. It sort of combines a proposal I liked before (basically only increasing the block size if there are enough coin-days destroyed voting for it) but thought perhaps was too costly to use directly.

As far as implementation goes, ... using floating point in consensus would be an unmitigated disaster. But that's no problem; a reasonable, cheap, fixed-point approximation is no problem-- esp. for the limited range this function needs to operate over, a rational polynomial fit will work fine.
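A minimal sketch of what integer-only evaluation could look like. The curve d/D = (1 + s)/2 used here is an assumption for illustration (it satisfies the d = D at s = 1 invariant discussed above); the point is only that the factor can be computed without any floating point:

```python
# Fixed-point evaluation of an assumed difficulty-scale curve d/D = (1+s)/2,
# where s is the block size normalized to the recent average. All arithmetic
# is integer; SCALE plays the role of fixed-point "1.0".
SCALE = 1 << 32

def difficulty_factor_fixed(size_bytes, avg_size_bytes):
    # s in fixed point: size / average, rounded down
    s_fp = (size_bytes * SCALE) // avg_size_bytes
    # (1 + s) / 2 in fixed point
    return (SCALE + s_fp) // 2

# A block exactly at the average size gets factor 1.0 (i.e. SCALE):
assert difficulty_factor_fixed(1_000_000, 1_000_000) == SCALE
# A smaller-than-average block gets a factor below 1.0:
assert difficulty_factor_fixed(100_000, 1_000_000) < SCALE
```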
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
newbie
Activity: 13
Merit: 5
Abstract— The current economic incentives of the Bitcoin network result in a tragedy of the commons that could cause transaction fees to eventually fall to zero. This is because the marginal cost of mining each transaction is zero for the individual miner (assuming O(1) block propagation), but non-zero for the network. If all miners are capable of externalizing the cost of mining ‘spam’ transactions, and if anyone can mine, then nothing within the consensus protocol today exists to establish a market price for transaction fees. As a result, miners must either rely on charity, or form cartels outside of the consensus protocol to enforce transaction fees. One solution to this problem is to require a local difficulty correction for each block based on the size of the block in bytes. Blocks that are larger or smaller than the network average would require a slightly more or less difficult proof-of-work, respectively. The result would be that less efficient miners, with a higher marginal cost of mining, would mine only the highest-fee transactions in smaller-than-average blocks, leaving lower-fee transactions to more efficient miners. The most efficient miners will be able to mine larger blocks at a net discount per byte compared with their less efficient counterparts. If the difficulty correction per byte is sub-linear, then very large blocks will always become profitable eventually, so that even extremely low-fee transactions will eventually be mined, although with significant delay. No-fee transactions will always cost miners in the form of increased difficulty, and therefore will only be mined as charity. A localized block difficulty does not need to change the block interval or the coinbase schedule, or alter any of the economic incentives that have made Bitcoin successful up to the present time, although attention must be given to the special (current) case where the coinbase is larger than the transaction fees. It would, however, require the addition of a block size field in the header. Localized block difficulty would be ideally suited for a sidechain where there is no block reward.



I.   INTRODUCTION

If proposals for O(1) block propagation are successful, then the cost of mining will be dominated by hashrate. However, there is no correlation between the amount of data published to the blockchain and the hashing required to publish it. As a result, a spam-miner can mine all transactions irrespective of their transaction fees with no additional cost to themselves. A small minority of miners could impose a situation where all transactions are mined, even no-fee transactions. If no-fee transactions are mined, then only charity remains as the motivation for including a transaction fee.

One possible solution to this is to keep the block size limit low enough to cause a market to form which causes users of the network to bid for space in each block. The difficulty with this approach is that it requires a central planner to correctly guess the growth rate of the network and determine the ‘correct’ size based on a subjective opinion of the legitimate uses of the blockchain (i.e. store of value vs. microtransactions).

Another possible solution is for the majority hashrate to form a cartel and censor spam-miners. This is undesirable because it encourages consensus outside of the consensus algorithm, which ultimately is not public and cannot be audited. The best solution to incentivizing the payment of transaction fees would conform to the following principles:

  • The mechanism for creating a transaction fee market should be built into the consensus algorithm as much as possible so that it is public, auditable, and enforceable by all individual nodes (i.e. wallets and holders of bitcoin).
  • The solution should not assume the growth rate of the network, and should work equally well regardless of the use cases of Bitcoin that develop.
  • The solution should be as conservative as possible with respect to making changes to the protocol, and should not change any fundamental parameters of the inflation schedule or block time.

II.   INDIVIDUAL BLOCK DIFFICULTY

Let D be the global difficulty of the network, and d be the target difficulty for an individual block based on its normalized size in bytes (s) relative to the average size for the network, calculated every 2016 blocks. We can define d as

                     https://i.imgur.com/iLm0MID.jpg

In this case, the average difficulty d is equal to the global difficulty D, therefore the block rate remains constant at 10 min on average. The function is well behaved and should not cause divergence away from the mean block time and size.

The validity of each block would be determined by the PoW hash meeting d, not D. If computational resources are extremely limited, it may be desirable to create a lookup table every 2016 blocks of valid d values to avoid numerous floating-point operations. In order to preserve self-validating headers, the header must contain a new field to capture s.
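The lookup-table idea can be sketched as follows. The curve d = D(1 + s)/2 is an assumed stand-in for equation 1 (it satisfies the invariant d = D at s = 1 and is consistent with the 1.82x small-block figure in section VI), not necessarily the exact function in the image:

```python
# Precompute per-size difficulty targets once per 2016-block retarget.
# s is block size normalized to the network average; the assumed curve
# d = D*(1+s)/2 stands in for equation 1.
def build_difficulty_table(D, max_multiple=10, steps_per_multiple=10):
    table = {}
    for i in range(1, max_multiple * steps_per_multiple + 1):
        s = i / steps_per_multiple        # normalized size: 0.1, 0.2, ..., 10.0
        table[s] = D * (1 + s) / 2        # per-block target difficulty
    return table

table = build_difficulty_table(D=1e12)
assert table[1.0] == 1e12                 # an average-size block needs exactly D
assert table[0.1] < table[1.0] < table[3.0]
```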

III.   ECONOMIC CONSEQUENCES OF VARYING BLOCK DIFFICULTY


          https://i.imgur.com/nVaRQqn.jpg

The first thing to note is that if a block is smaller than average, the difficulty to mine it is less than the average difficulty. For simplicity, the following discussion assumes a minimal block reward, which is the condition under which a robust fee market is required. However, in the interim, the implications of the block reward must be addressed; for this discussion see section IV.

In the absence of a block reward, a miner can mine small blocks at a reduced difficulty. The marginal cost per block (in hashrate) would be higher, so that only the highest paying transactions would be profitable to include in a small block. Assuming a roughly power-law distribution in fees paid, the cutoff for transactions included in the block will depend on the marginal costs of the miner. Efficient miners will be able to produce larger blocks profitably, while less efficient miners will need to be selective and will produce smaller blocks on average.

The pool of unconfirmed transactions is unlikely to follow any simple distribution because as blocks are mined, the highest fee transactions will be pulled from the pool first, causing the tail of low-fee transactions to grow relative to the high-fee transactions. Eventually, the pool of low-fee transactions will grow large enough that it will become profitable to mine a super block (i.e. 10x the normal network size) due to the decreasing difficulty per byte. As long as a transaction pays a fee greater than the marginal cost of the most efficient miners, it should always be picked up in a superblock.
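The super-block economics follow from the falling difficulty per byte; a sketch, again using the assumed curve d = D(1 + s)/2 in place of equation 1:

```python
# With d = D*(1+s)/2, the difficulty spent per unit of block size, d/s,
# falls as the block grows, so a large enough backlog of low-fee
# transactions eventually covers the extra work of a 10x "super block".
def difficulty_per_unit_size(s, D=1.0):
    d = D * (1 + s) / 2
    return d / s

# An average-size block costs 1.0 unit of difficulty per unit of size;
# a 10x super block costs only 0.55 per unit.
assert difficulty_per_unit_size(10) < difficulty_per_unit_size(1)
```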

Although it’s difficult to model the behavior of such a system, it seems reasonable that a natural frequency would start to emerge where a distribution of small and average sized blocks are mined continually until enough hashing power is incentivized to mine a super block, which might start to occur at roughly regular intervals. Individual behavior of wallet users might start to predict this and pay lower fees as the expected super block approaches. However, because the marginal cost per transactions is continually changing as the pool size changes, it is not clear that any one mining strategy will emerge as the only strategy.

Under no circumstances will it ever be more profitable to mine lower-fee transactions and not higher-fee transactions. However, if the miner experiences real costs associated with increasing the block size, their expenses measured in electricity and equipment will place an absolute floor on the minimum transaction fee that will not result in a loss for the miner.

IV.   VARIABLE BLOCK DIFFICULTY AND BLOCK REWARDS

In a setting such as a sidechain where there is no block reward, equation 1 would work well. However, if the block reward is much greater than the transaction fees, there is an incentive to mine the smallest block possible because of the reduced difficulty of blocks smaller than the network mean. This will create a race to zero block size. To prevent this, the block difficulty should be limited such that

                     https://i.imgur.com/1PtIAKM.jpg

where m starts at 1. It may be wise to have m decrease by 0.1 every block reward halving so that there is a known schedule for the reduction in the minimum difficulty.
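The proposed floor schedule is straightforward; a sketch of the text above (equation 2 itself is an image, so the floor form min d = m·D is my reading of it):

```python
# m starts at 1 and drops by 0.1 at each subsidy halving, so the minimum
# per-block difficulty m*D relaxes on a known schedule and never goes
# negative.
def min_difficulty_multiplier(halvings_elapsed, start=1.0, step=0.1):
    return max(start - step * halvings_elapsed, 0.0)

assert min_difficulty_multiplier(0) == 1.0
assert abs(min_difficulty_multiplier(3) - 0.7) < 1e-9
assert min_difficulty_multiplier(20) == 0.0
```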

V.   BEHAVIORAL IMPACT TO VARIOUS PLAYERS

  • The spam miner – The spam miner is a minority miner who wants to include all unpaid transaction in the blockchain. This miner could accomplish this by including only small amounts of spam, and would always do so at their own expense because their increased block size will result in increased difficulty for themselves only. The more spam they mine, the more uncompetitive they become at producing blocks.
  • The greedy miner / cartel of miners – This miner or cartel of miners refuse to mine any transactions except for transactions above some artificially high price. In this case, the cost of mining the average-fee transaction decreases as the transaction pool grows, so that eventually the greedy miner will need to pass up a large number of profitable transactions to maintain his artificial requirement.
  • The fake-transaction miner – This is the miner who pads his transaction with fake data to increase the block size. In this case, the miner will be hurting himself because he will artificially increase the difficulty of his own blocks.
  • The cheapskate – This is the bitcoin holder who refuses to pay a fee comparable to his peers. If his fee is simply too low, the added difficulty/byte will never reach a profitable point for a miner to include it in a block. The cheapskate will need to rely on charity of a miner to get his transaction published.

VI.   ATTACKS

  • Increased risk of 51% attack and double spends - If the polynomial chosen for block difficulty vs. size gives a discounted difficulty for blocks less than the mean size for the network, then an attacker wanting to perform a 51% attack could inflate their effective hash power by generating small blocks during the attack. For instance, if a double-spender controls 15% of the hash power, their effective hashing power with blocks 1/10th the network average size would be 1.82 times larger, making it appear for the purposes of the attack that they control 27.3% of the network hashrate. However, this behavior would be mitigated by the fact that the miner would need to perform the attack by mining smaller blocks, which would result in them leaving profitable transaction fees un-mined for other miners. In the case where the dishonest miner can achieve an effective hashrate above 51%, they may be able to recoup this loss at the end of their chain. This attack would need to be mitigated by choosing a polynomial that does not provide a significant discount for small blocks, or by a simple min(f(D),l) limit. [Credit to jonny1000 for this contribution to the discussion]
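The arithmetic in the bullet above checks out under the assumed curve d = D(1 + s)/2 (which reproduces the stated 1.82x boost at s = 0.1):

```python
# A block at normalized size s needs only d = D*(1+s)/2 work, so the
# attacker's hashrate is effectively boosted by D/d = 2/(1+s).
def apparent_share(true_share, s):
    boost = 2 / (1 + s)
    return true_share * boost

# 15% of the hashrate mining blocks at 1/10th the average size
# appears, for attack purposes, as about 27.3%:
print(round(apparent_share(0.15, 0.1), 3))  # 0.273
```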


VII.   THE SLOW DRIFT PROBLEM

One scenario that isn’t solved is the slow drift problem. What happens if the average transaction fee slowly drifts lower and lower, such that eventually the Bitcoin network is no longer secure, yet everyone is paying the average transaction fee? This seems like a rather hypothetical situation, and there is no obvious reason to insist it will necessarily happen. However, in the event that this happens, equation 1 can be generalized such that

                     https://i.imgur.com/GXkK4M9.jpg

where i is the inflation rate of the hash power desired to maintain the 10-min block time. Setting i to 1.02 would require a 2% increase in the hash power every 2016 blocks for D to remain constant. If i were ever to be adjusted, it could be adjusted through some proof-of-holding mechanism, although such a process would need to be reserved for a case in which no other mechanism exists to secure the network. The meaning of this would essentially be the network demanding that “the hashrate must increase by 2% per 2016 blocks regardless of the fees paid.” The consequence of failing to meet this imposed growth in hashrate would be that the block time slowly drifts longer and longer until more hashpower is added to restore it.
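A sketch of how the demanded hashrate compounds under this generalization (equation 3 is an image above, so the compounding form here is my reading of the text):

```python
# An inflation factor i applied every 2016-block retarget demands that
# much more hashpower just to hold the 10-minute block time.
def required_hashrate(initial_hashrate, i, retarget_periods):
    return initial_hashrate * i ** retarget_periods

# With i = 1.02, the demanded hashpower roughly doubles over 35 retarget
# periods (about 1.3 years at 2016 blocks * 10 min per period).
assert required_hashrate(1.0, 1.02, 35) > 1.99
```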

VIII.   Closing Thoughts

It is my belief that the block size must increase, and therefore something in the consensus algorithm must change. However, increasing the block size indefinitely could very likely remove the only existing incentive (except for charity) to pay a transaction fee. There is no relationship between the amount of data being securely stored on the block chain and the cost of securing it. This leads to a scenario where transaction fees could go lower than the marginal cost of mining, essentially destroying the security of the Bitcoin network.

As a user of the network, I should be purchasing security. If the addition of my transaction requires additional hash power, then the miner has a need to require a fee of me. This need can be built into the consensus algorithm such that all participants in the network know the origin and nature of the mechanism that gives rise to the fees.


EDIT 02-16-2015: The equations were copied wrong. Pointed out by gmaxwell
EDIT 02-17-2015: Changed some title headings for clarity and added a comment about the risk of 51% attack.
