
Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive - page 11. (Read 14297 times)

legendary
Activity: 3878
Merit: 1193
What would happen if the blocksize were increased to 1 GB tomorrow? Pretty much nothing. Miners will always be able to create blocks smaller than the maximum blocksize.
What would happen if the blocksize were decreased to 1 KB tomorrow? Bitcoin would come grinding to a halt.

Too small blocksize = death to bitcoin.
Too big blocksize = non-issue.

I'd rather see the blocksize too big than too small.
legendary
Activity: 2408
Merit: 1121
I think this is where "Bitcoin is just an experiment" rears its ugly head.

You see, Gavin could totally nuke Bitcoin, but he has the plausible deniability that Bitcoin is just an "experiment". You know, something you just putter about on in the garage; if raccoons break in and tear it apart, hell, it's just foolin' around, no big loss.

And that is the attitude that is being put forth here. 50%? Sure, why the hell not. Maybe roll a D-100 and decide that way; it would be just as rigorous as a complete and utter guess.

What is completely unreasonable is that none of these metrics are based on actual USAGE, with a sliding window a la difficulty adjustments, adhering to what is actually HAPPENING in the network.

Gavin doesn't know, but hey, we have to trust him.

I don't think we do...
legendary
Activity: 1050
Merit: 1002
Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.
Not only that, but there will always be non-infinite bandwidth and storage available to users, while anyone can create transaction spam essentially for free. So minimum fees remain necessary for non-priority transactions.

Check your assumptions.

1) We don't know what the future networks will look like.

No, but we do know the science of today. I'm not sure you appreciate the meaning of infinite.

It's not possible to transmit information with perfect efficiency, except perhaps by using quantum entanglement. It's also not possible to store unlimited meaningful information within a confined space, never mind making it all computationally accessible. I'd say my statement is less an assumption and more an observation, unless of course you can show how it's reasonably possible to make use of quantum phenomena in ways we can't imagine today.


2) Commodity prices do go to zero for periods of time.  Sometimes they rot in silos, and cost money to dispose of them (negative worth).

I think he meant go to zero permanently, or at least for substantially long periods of time.
legendary
Activity: 2324
Merit: 1125

If this forks as currently proposed, I'll be selling all my BTC on Gavin's fork and mining on the other.  I suspect I will not be the only one.

Okey dokey. You can join the people still mining on the we-prefer-50-BTC-per-block fork (if you can find them... I think they gave up really quickly after the 50 to 25 BTC subsidy decrease).


This is so weak. If we follow this analogy, YOU are the one wanting to mine 50 BTC blocks ad infinitum, since halving to 25 BTC is what Satoshi proposed.

I really don't like the way you are handling this. It seems like you are trying to push your pet project through like a little dictator. As long as you don't change, I'm with NewLiberty on this one and will hold Bitcoin instead of GavinCoin.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.
Not only that, but there will always be non-infinite bandwidth and storage available to users, while anyone can create transaction spam essentially for free. So minimum fees remain necessary for non-priority transactions.

Check your assumptions.

1) We don't know what the future networks will look like.
2) Commodity prices do go to zero for periods of time.  Sometimes they rot in silos, and cost money to dispose of them (negative worth).
legendary
Activity: 1050
Merit: 1002
Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

Not only that, but there will always be non-infinite bandwidth and storage available to users, while anyone can create transaction spam essentially for free. So minimum fees remain necessary for non-priority transactions.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
It also may be contrary to the eventual goal of usage-driven mining, where transaction fees ultimately overtake the block reward in value.  This proposal may drive TX fees to zero forever.  Block chain space is a somewhat scarce resource, just as the total # of coins is.  Adding an arbitrary 50% yearly inflation changes things detrimentally.

I'm sending a follow-up blog post to a couple of economists to review, to make sure my economic reasoning is correct, but I don't believe that even an infinite blocksize would drive fees to zero forever.

Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

That has very little to do with whether or not transaction fees will be enough to secure the network in the future. I think both the "DON'T RAISE BLOCKSIZE OR THE WORLD WILL END!" and "MUST RAISE THE BLOCKSIZE OR THE WORLD WILL END!" factions confuse those two issues.

Great, we agree on all of this.

I don't think adjusting the block size up or down or keeping it the same will have any effect on whether or not transaction fees will be enough to secure the network as the block subsidy goes to zero (and, as I said, I'll ask professional economists what they think).
Here is where it jumps the tracks.
Your thoughts and my thoughts aren't going to answer this.
Math will. It is not about opinion, it is about measurement and calculation. Picking 50% out of a hat is hubris, and you know it in your heart.
Justify it, show your work, or it cannot be taken seriously. Looking forward to your follow-up and its analysis; economists, sure, but let's have a game-theory analysis as well as an analysis of new risks.

If this forks as currently proposed, I'll be selling all my BTC on Gavin's fork and mining on the other.  I suspect I will not be the only one.
Okey dokey. You can join the people still mining on the we-prefer-50-BTC-per-block fork (if you can find them... I think they gave up really quickly after the 50 to 25 BTC subsidy decrease).
Strawmen will make you look stupid and petty.  Play well with the other scientists, please.  If this was your best and final offer, you needn't bother responding.  I don't know the answer, but so far we haven't seen it in sufficient detail to end dialog and discovery.

Not to belabor it, but the obvious difference is that the 50 BTC folks were going against Satoshi's design, whereas those rejecting the 50% love-it-or-leave-it fork would be following Satoshi's design.  If we need a hard fork, we do it right so that it need not be repeated.

Your proposal started a dialog that may bring a good result.  
The first effort isn't that end result.  If we think we got it perfect on a first guess, our minds are closed to learning and consensus.


No comment on this?
Quote
One example of a better way would be to use a sliding window of x number of blocks, 100+ deep, and base the max allowed size on some percentage over the average, while dropping anomalous outliers from that calculation.  Using some method that is sensitive to the reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
Do it right, do it once.

There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that.  Where we are guessing, we ought to acknowledge that.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
It also may be contrary to the eventual goal of usage-driven mining, where transaction fees ultimately overtake the block reward in value.  This proposal may drive TX fees to zero forever.  Block chain space is a somewhat scarce resource, just as the total # of coins is.  Adding an arbitrary 50% yearly inflation changes things detrimentally.

I'm sending a follow-up blog post to a couple of economists to review, to make sure my economic reasoning is correct, but I don't believe that even an infinite blocksize would drive fees to zero forever.

Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

That has very little to do with whether or not transaction fees will be enough to secure the network in the future. I think both the "DON'T RAISE BLOCKSIZE OR THE WORLD WILL END!" and "MUST RAISE THE BLOCKSIZE OR THE WORLD WILL END!" factions confuse those two issues. I don't think adjusting the block size up or down or keeping it the same will have any effect on whether or not transaction fees will be enough to secure the network as the block subsidy goes to zero (and, as I said, I'll ask professional economists what they think).

If this forks as currently proposed, I'll be selling all my BTC on Gavin's fork and mining on the other.  I suspect I will not be the only one.

Okey dokey. You can join the people still mining on the we-prefer-50-BTC-per-block fork (if you can find them... I think they gave up really quickly after the 50 to 25 BTC subsidy decrease).
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer

It doesn't make sense to guess at this.  Any guess is bound to be wrong.
If, after picking the low-hanging fruit, there is still an issue here (and there may be), it ought not be resolved by a guess when there is data within the block chain that would be useful for making a determination on max block size.
In the same way that difficulty adjustment is sensitive to data within the block chain, so too could this be.

I don't know what the right answer is any more than Gavin does, but making an estimate would not be the best way to solve this in any case.

.....
One example of a better way would be to use a sliding window of x number of blocks, 100+ deep, and base the max allowed size on some percentage over the average, while dropping anomalous outliers from that calculation.  Using some method that is sensitive to the reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
Do it right, do it once.

There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that.



Is this 50% per year intended to be a hardcoded rule like the block reward?

That's not how I interpreted Gavin's report. It sounded more like a goal that the developers thought was attainable.

That said, 50% per year does seem aggressive. At some point, the opportunity cost of including more transactions is going to exceed the tx fee value, certainly as long as the block reward exists, so the block size cannot increase indefinitely. And so what if there is little room in the blockchain? Not every single tiny transaction needs to be recorded indefinitely. Since the cost of increasing the block size is (I expect) increased centralization, shouldn't the developers be hesitant to make such a commitment without allowing for discretion?

I also wonder what the best approach will be, way out in the future, when the block reward is near zero. Can there be an equilibrium transaction fee if the difficulty is allowed to continue to fall? A simple, kludgy solution might be to fix the difficulty at some level, allowing blockrate to depend on the accumulated bounty of transaction fees.

Though I'm sure some new kind of proof of work/stake approach could best solve this problem and make the network more secure and cheaper.

It also may be contrary to the eventual goal of usage-driven mining, where transaction fees ultimately overtake the block reward in value.  This proposal may drive TX fees to zero forever.  Block chain space is a somewhat scarce resource, just as the total # of coins is.  Adding an arbitrary 50% yearly inflation changes things detrimentally.

If this forks as currently proposed, I'll be selling all my BTC on Gavin's fork and mining on the other.  I suspect I will not be the only one.
member
Activity: 75
Merit: 10

It doesn't make sense to guess at this.  Any guess is bound to be wrong.
If, after picking the low-hanging fruit, there is still an issue here (and there may be), it ought not be resolved by a guess when there is data within the block chain that would be useful for making a determination on max block size.
In the same way that difficulty adjustment is sensitive to data within the block chain, so too could this be.

I don't know what the right answer is any more than Gavin does, but making an estimate would not be the best way to solve this in any case.

.....
One example of a better way would be to use a sliding window of x number of blocks, 100+ deep, and base the max allowed size on some percentage over the average, while dropping anomalous outliers from that calculation.  Using some method that is sensitive to the reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
Do it right, do it once.

There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that.



Is this 50% per year intended to be a hardcoded rule like the block reward?

That's not how I interpreted Gavin's report. It sounded more like a goal that the developers thought was attainable.

That said, 50% per year does seem aggressive. At some point, the opportunity cost of including more transactions is going to exceed the tx fee value, certainly as long as the block reward exists, so the block size cannot increase indefinitely. And so what if there is little room in the blockchain? Not every single tiny transaction needs to be recorded indefinitely. Since the cost of increasing the block size is (I expect) increased centralization, shouldn't the developers be hesitant to make such a commitment without allowing for discretion?

I also wonder what the best approach will be, way out in the future, when the block reward is near zero. Can there be an equilibrium transaction fee if the difficulty is allowed to continue to fall? A simple, kludgy solution might be to fix the difficulty at some level, allowing blockrate to depend on the accumulated bounty of transaction fees.

Though I'm sure some new kind of proof of work/stake approach could best solve this problem and make the network more secure and cheaper.
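
As a back-of-envelope version of the opportunity-cost point above: a miner's marginal cost of including one more transaction is roughly the block reward times the extra orphan risk from the added propagation delay. Every number in the sketch below is an assumption, not a measurement:

Code:
// Illustrative only: all figures are assumptions, not measurements.
#include <cstdio>

int main()
{
    const double blockReward      = 25.0;   // BTC subsidy per block
    const double delaySecPerKB    = 0.08;   // assumed extra propagation delay per extra kB
    const double avgBlockInterval = 600.0;  // seconds between blocks
    const double txSizeKB         = 0.5;    // a typical ~500-byte transaction

    // Crude orphan model: each extra second of propagation costs roughly a
    // 1/600 chance of losing the block race, and with it the whole reward.
    double expectedLoss = blockReward * (delaySecPerKB * txSizeKB / avgBlockInterval);

    // ~0.0017 BTC under these assumptions: fees below that are not worth
    // including, no matter how much room the block size limit allows.
    std::printf("expected cost of including the tx: %.5f BTC\n", expectedLoss);
    return 0;
}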
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
(e.g. by using transaction hashes and/or IBLT), once implemented, will certainly keep the new block message size growth rate much lower than the bandwidth growth rate.  
Keep in mind these techniques don't reduce the amount of data that needs to be sent (except, at most, by a factor of two). They reduce the amount of latency-critical data. Keeping up with the blockchain still requires transferring and verifying all the data.

Quote
Not a big deal?  Well, except that we can expect the power of nodes to follow some sort of curve ("exponential" in the vernacular) such that most nodes are barely above the threshold to be viable.  Meaning that such an event would cause the majority of nodes to shut down, likely permanently.
Right. There is a decentralization trade-off at the margin.  But this isn't scaleless-- there is _some_ level, even some level of growth, which presents little to no hazard even way down the margin.  As a soft stewardship goal (not a system rule, since it can't be one), the commitment should be that the system is run so that it fits into an acceptable portion of common residential broadband, so that it does not become dependent on centralized entities. As some have pointed out, being decentralized is Bitcoin's major (and perhaps only) strong competitive advantage compared to traditional currencies and payment systems. How best to meet that goal is debatable in the specifics.

At the moment there are a bunch of silly low-hanging fruit that make running a node more costly than it needs to be; we're even at the point where some people developing on Bitcoin Core have told me they've stopped running a node at home. It's hard to reason about the wisdom of these things while the system is still being held back by some warts we've long known how to correct and are in the process of correcting.

It doesn't make sense to guess at this.  Any guess is bound to be wrong.
If, after picking the low-hanging fruit, there is still an issue here (and there may be), it ought not be resolved by a guess when there is data within the block chain that would be useful for making a determination on max block size.
In the same way that difficulty adjustment is sensitive to data within the block chain, so too could this be.

I don't know what the right answer is any more than Gavin does, but making an estimate would not be the best way to solve this in any case.

.....
One example of a better way would be to use a sliding window of x number of blocks, 100+ deep, and base the max allowed size on some percentage over the average, while dropping anomalous outliers from that calculation.  Using some method that is sensitive to the reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
Do it right, do it once.

There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that.
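
As a rough illustration of the sliding-window idea above (it is not a concrete proposal from this thread), here is a sketch in C++; the window depth, trim fraction, headroom multiplier, and 1MB floor are all made-up parameters chosen purely for illustration:

Code:
// Hypothetical sketch only -- not from Bitcoin Core or any BIP.
// All constants here are assumptions chosen for illustration.
#include <algorithm>
#include <cstdint>
#include <vector>

static const uint64_t FLOOR_MAX_BLOCK_SIZE = 1000000; // never drop below today's 1MB
static const size_t   WINDOW_DEPTH         = 144;     // ~1 day of blocks
static const double   TRIM_FRACTION        = 0.10;    // drop the top/bottom 10% as outliers
static const double   HEADROOM_MULTIPLIER  = 2.0;     // allow 2x the trimmed average

// Given the sizes (in bytes) of the most recent blocks, return the maximum
// size allowed for the next block.
uint64_t NextMaxBlockSize(std::vector<uint64_t> recentSizes)
{
    if (recentSizes.size() < WINDOW_DEPTH)
        return FLOOR_MAX_BLOCK_SIZE; // not enough history yet

    std::sort(recentSizes.begin(), recentSizes.end());
    size_t trim = static_cast<size_t>(recentSizes.size() * TRIM_FRACTION);

    uint64_t sum = 0;
    size_t count = 0;
    for (size_t i = trim; i + trim < recentSizes.size(); ++i) {
        sum += recentSizes[i]; // average over the middle 80%
        ++count;
    }
    uint64_t trimmedAverage = (count > 0) ? sum / count : 0;

    uint64_t proposed = static_cast<uint64_t>(trimmedAverage * HEADROOM_MULTIPLIER);
    return std::max(proposed, FLOOR_MAX_BLOCK_SIZE);
}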

staff
Activity: 4284
Merit: 8808
(e.g. by using transaction hashes and/or IBLT), once implemented, will certainly keep the new block message size growth rate much lower than the bandwidth growth rate.  
Keep in mind these techniques don't reduce the amount of data that needs to be sent (except, at most, by a factor of two). They reduce the amount of latency-critical data. Keeping up with the blockchain still requires transferring and verifying all the data.

Quote
Not a big deal?  Well, except that we can expect the power of nodes to follow some sort of curve ("exponential" in the vernacular) such that most nodes are barely above the threshold to be viable.  Meaning that such an event would cause the majority of nodes to shut down, likely permanently.
Right. There is a decentralization trade-off at the margin.  But this isn't scaleless-- there is _some_ level, even some level of growth, which presents little to no hazard even way down the margin.  As a soft stewardship goal (not a system rule, since it can't be one), the commitment should be that the system is run so that it fits into an acceptable portion of common residential broadband, so that it does not become dependent on centralized entities. As some have pointed out, being decentralized is Bitcoin's major (and perhaps only) strong competitive advantage compared to traditional currencies and payment systems. How best to meet that goal is debatable in the specifics.

At the moment there are a bunch of silly low-hanging fruit that make running a node more costly than it needs to be; we're even at the point where some people developing on Bitcoin Core have told me they've stopped running a node at home. It's hard to reason about the wisdom of these things while the system is still being held back by some warts we've long known how to correct and are in the process of correcting.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
My concern is that there is little room for error with geometric growth.  Let's say that things are happily humming along with bandwidth and block size both increasing by 50% per year.  Then a decade goes by where bandwidth only increases by 30% per year.  In that decade the block size grew to 5767% of its starting value while bandwidth grew to only 1379%.  So now people's connections are only 24% as capable of handling the blockchain.

Not a big deal?  Well, except that we can expect the power of nodes to follow some sort of curve ("exponential" in the vernacular) such that most nodes are barely above the threshold to be viable.  Meaning that such an event would cause the majority of nodes to shut down, likely permanently.

Compression techniques (e.g. by using transaction hashes and/or IBLT), once implemented, will certainly keep the new block message size growth rate much lower than the bandwidth growth rate.  

At the moment the 1MB limit in CheckBlock is agnostic as to how blocks are received.

Code:
    // Size limits
    if (block.vtx.empty() || block.vtx.size() > MAX_BLOCK_SIZE || ::GetSerializeSize(block, SER_NETWORK, PROTOCOL_VERSION) > MAX_BLOCK_SIZE)
        return state.DoS(100, error("CheckBlock() : size limits failed"),
                         REJECT_INVALID, "bad-blk-length");

Consider that bandwidth is the binding constraint, and disk space perhaps 10x less so. This implies that a 1MB maximum for transmitted blocks should be reflected as a 10MB maximum for old blocks read from / written to disk (especially when node bootstrapping is enhanced by headers-first and an available utxo set).

Put another way, a newly mined block of 2MB might be transmitted across the network in a compressed form of perhaps only 200KB, but it will still get rejected, even though it should be acceptable because it is within currently accepted resource constraints.
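
To make that distinction concrete, here is a hedged sketch of the two-limit idea; the constant names and the 10x figure are taken from the paragraph above as assumptions and do not exist in Bitcoin Core:

Code:
// Hypothetical sketch of a two-limit rule; neither constant exists in Bitcoin Core.
static const unsigned int MAX_WIRE_BLOCK_SIZE   = 1000000;  // cap on bytes actually sent over the network
static const unsigned int MAX_STORED_BLOCK_SIZE = 10000000; // cap on the fully serialized block (10x, per the argument above)

// wireBytes: size of the (possibly compressed/differential) block message as received
// serializedBytes: size of the reconstructed block as validated and stored on disk
bool WithinResourceLimits(unsigned int wireBytes, unsigned int serializedBytes)
{
    // Under this rule a 2MB block relayed as a ~200KB compact message would be
    // accepted, whereas the CheckBlock() test above rejects it on serialized size alone.
    return wireBytes <= MAX_WIRE_BLOCK_SIZE && serializedBytes <= MAX_STORED_BLOCK_SIZE;
}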
hero member
Activity: 1008
Merit: 531
My concern is that there is little room for error with geometric growth.  Let's say that things are happily humming along with bandwidth and block size both increasing by 50% per year.  Then a decade goes by where bandwidth only increases by 30% per year.  In that decade the block size grew to 5767% of its starting value while bandwidth grew to only 1379%.  So now people's connections are only 24% as capable of handling the blockchain.

Not a big deal?  Well, except that we can expect the power of nodes to follow some sort of curve ("exponential" in the vernacular) such that most nodes are barely above the threshold to be viable.  Meaning that such an event would cause the majority of nodes to shut down, likely permanently.
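
For reference, the compounding arithmetic behind those percentages works out as follows (a trivial check, using the growth rates assumed above):

Code:
// Back-of-envelope check of the figures above.
#include <cmath>
#include <cstdio>

int main()
{
    double blocksizeGrowth = std::pow(1.5, 10); // 50%/year for a decade -> ~57.67x (5767%)
    double bandwidthGrowth = std::pow(1.3, 10); // 30%/year for a decade -> ~13.79x (1379%)
    std::printf("block size x%.2f, bandwidth x%.2f, connections %.0f%% as capable\n",
                blocksizeGrowth, bandwidthGrowth,
                100.0 * bandwidthGrowth / blocksizeGrowth); // ~24%
    return 0;
}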