
Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive - page 8. (Read 14297 times)

sr. member
Activity: 333
Merit: 252
My 2uBTC on this issue:
Instead of guessing the costs of the network via extrapolation, code in a constant-cost negative feedback mechanism.  For example, similar to difficulty adjustments, if mean non-coinbase block reward > 1 BTC, increase max size.  If mean block reward < 1 BTC, decrease max size (floor of 1MB).


A miner can include in his block a transaction with an arbitrarily large fee (which he gets back, of course), throwing the mean off the charts.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
legendary
Activity: 2324
Merit: 1125
I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).

It is offered as an example of the sort of thing that can work, rather than a finished product.

This is the problem.

People don't seem to realize Gavin's proposal may be the best we can do. I happen to think it is. If anyone had a better idea we'd have heard it by now. We, the entire community, have brooded on this issue for months if not years now. Here is a spoiler alert: nobody can predict the future.

New Liberty's unpolished prototype is already far superior to Gavin's nonsense, so this is easily debunked.
hero member
Activity: 709
Merit: 503
My 2uBTC on this issue:
...
I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.
I realize transactions can come in a wide variety of sizes, so my back-of-the-envelope calculations need to be taken with a big grain of salt.

https://blockchain.info/charts/n-transactions-per-block shows, around 3-Mar-2014, a peak of 618 transactions per block (averaged over 24 hours), and https://blockchain.info/block-index/477556 is a 396KB block with 710 transactions in it.
1 BTC / 710 txn ~= 0.0014 BTC/txn, or about $0.53 at the current exchange rate; so much for micro-transactions.
Also, 396KB / 710 txn ~= 558 B/txn, so 1MB / 558 B/txn ~= 1792 txn/MB. Even 1 BTC / 1792 txn * $377.79/BTC ~= $0.21/txn.
I think maybe 0.1 BTC/block would be nice. If the exchange rate climbs to $2000/BTC and the block size is still 1MB, then 0.1 BTC / 1792 txn * $2000/BTC ~= $0.11/txn; but if the block size were 2MB, the per-transaction fee drops to about $0.055.
As legit transaction rates climb, presumably so does the exchange rate. At what point does BTC decouple from fiat?
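The arithmetic above can be re-run in a few lines of Python. The block size, transaction count, and exchange rates are the figures quoted in the post, not live chain data:

```python
# Back-of-the-envelope check of the per-transaction fee figures above.
block_size_bytes = 396_000   # the 396KB example block
txs_in_block = 710
usd_per_btc = 377.79         # exchange rate quoted at time of writing

bytes_per_tx = block_size_bytes / txs_in_block   # ~558 B/txn
txs_per_mb = 1_000_000 / bytes_per_tx            # ~1793 (the post rounds to 1792)

def fee_per_tx_usd(total_block_fee_btc, txs, rate=usd_per_btc):
    """USD fee per transaction if a block's fees total total_block_fee_btc."""
    return total_block_fee_btc / txs * rate

one_btc_fee = fee_per_tx_usd(1.0, txs_per_mb)            # ~$0.21
future_fee = fee_per_tx_usd(0.1, txs_per_mb, 2000.0)     # ~$0.11
```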
hero member
Activity: 709
Merit: 503
A government with a strong enough military/police can potentially take over a miner's equipment by force/violence, all in the name of supposed social good, while calling it eminent domain (http://en.wikipedia.org/wiki/Eminent_domain).
hero member
Activity: 709
Merit: 503
We should anticipate governments becoming miners; if they aren't already.
newbie
Activity: 6
Merit: 0
My 2uBTC on this issue:
Instead of guessing the costs of the network via extrapolation, code in a constant-cost negative feedback mechanism.  For example, similar to difficulty adjustments, if mean non-coinbase block reward > 1 BTC, increase max size.  If mean block reward < 1 BTC, decrease max size (floor of 1MB).

Here's why I think this is a long term solution.  With Bitcoin, "costs" and "value" have a very interesting relationship; currently with mining, the costs to run the network are determined by the exchange value of a bitcoin.  Long term, the block size constrains both the cost and value of the network.  By "long term", I mean 100 years from now.  Long term, there's no more coinbase reward.  So miners compete for transaction fees.  Limited block size causes transactors to compete for space in the block, driving up the fees.  An unlimited block size would, without other costs, tend to drive fees to near-zero, and then there's not enough incentive for miners to bother, and the security of the system is compromised.  That's the death spiral idea anyway, which may not actually happen, but it's a legitimate risk, and should be avoided.  The value and utility of bitcoin today has a lot to do with the probability that it will have value in 100 years.

Max block sizes doubling every two years makes them pretty much unlimited.  Capping after 20 years is also a big guess, and it extrapolates Moore's law for potentially longer than the law keeps going.  Gigabit Ethernet is, what, 15 years old?  That's what every PC has now, and I've never seen 10G Ethernet over copper.  Reliance on everything else becoming awesome is a very fragile strategy.

An issue I have with an exponentially increasing block size, or a static block size, is that there's no feedback; it can't respond to changes in the system.  The block size in many ways determines the value of the network.  All else being equal, a network that can handle more transactions per day is more useful and more valuable.

I think that, similar to the current system of mining costs being determined by bitcoin value, block propagation, verification and storage costs should be determined by how much people are willing to pay.  If transaction fees are high, block space is scarce, and the max block size will expand.  If transaction fees are low, block space is too cheap, and the max block size will shrink.

This fixes a cost independent of the mining coinbase reward, allowing for sustainable, predictable mining revenue.  The issue is we would have to come up with a number.  What should it cost to run the bitcoin network?  1% of M0 per year?  That would be 210,000 coins per year in transaction fees to miners, or about 4 BTC per block.

0.5% of M0 annually would be about 2 BTC per block, and so on.  This would be a ceiling cost; it could cost less, if people didn't make too many transactions, or most things happened off-blockchain, and the blocks tended back towards the 1MB floor.  It would effectively put a ceiling on the maintenance cost of the network, however: if blocks were receiving double the target in fees, the size would double at the next difficulty adjustment, which would tend to push total fees back down.
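Recomputing those per-block figures as a sketch, taking M0 as the eventual 21M coin supply and assuming the nominal one-block-per-ten-minutes cadence:

```python
# Sanity check of the "percent of M0 per year" fee targets.
M0 = 21_000_000                  # eventual total bitcoin supply
blocks_per_year = 6 * 24 * 365   # ~52,560 blocks at one per 10 minutes

def target_fee_per_block(pct_of_m0):
    """BTC of fees per block needed for annual fees to total pct_of_m0 of M0."""
    return M0 * pct_of_m0 / blocks_per_year

one_pct = target_fee_per_block(0.01)     # ~4.0 BTC/block
half_pct = target_fee_per_block(0.005)   # ~2.0 BTC/block
```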

If you wanted to get fancy you could have hysteresis, non-linearity, and stuff like that, but if it were up to me I'd keep it really simple and say that max block size is a linear function of the previous epoch's block rewards.
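As a toy sketch of that "keep it really simple" rule (my own illustration; nothing here is specified in the thread beyond the linear relationship, the fee target, and the 1MB floor):

```python
# Minimal sketch: max block size scales linearly with the previous
# epoch's mean fee reward relative to a fixed target.
TARGET_FEE_BTC = 1.0   # target mean non-coinbase reward per block
FLOOR_BYTES = 1_000_000  # 1MB floor, per the thread

def next_max_size(current_max, epoch_fee_rewards):
    """New max size after one epoch, floored at 1MB.

    epoch_fee_rewards: list of per-block fee totals (BTC) in the epoch.
    """
    mean_fees = sum(epoch_fee_rewards) / len(epoch_fee_rewards)
    return max(FLOOR_BYTES, int(current_max * mean_fees / TARGET_FEE_BTC))
```

Scaling by mean/target gives the behaviour described above: fees running at double the target double the max size at the next adjustment, which then tends to push fees back down.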

This can be "gamed" in two ways.  It can be gamed to a limited extent by miners who want to push up the max block size: they can pay a bunch of fees to themselves and push up that average.  I can't think of a clean way to get rid of that, but hopefully that's OK; isn't it the miners who want smaller blocks anyway?  If miners are competing for larger blocks, why would the non-mining users complain?  The only problem case is one miner who wants larger blocks while everyone else wants smaller ones.  Maybe use the median instead of the mean to chop out malicious miners or fat-fingered giant transaction fees.

It can also be gamed the other way.  Your transaction fee is 0, but you have some off-channel account with my mining group which includes all your txs for a flat monthly rate.  This also seems unlikely; if it were more expensive that way, transactors would stop using the off-channel method and just go to the open market for transaction inclusion.  If it were cheaper, why would the miner forgo that revenue?

So if I ran this whole Bitcoin thing (which would defeat the point... :) ), that's what I would do.  The question is how much it should cost.  1 BTC per block sounds OK; it's a nice round number.  That's ~50K BTC per year for the miners.

I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.
hero member
Activity: 1008
Merit: 531
Decreasing the block limit (note, not required block size) in the future would not be a hard fork, it would be a soft fork.

It won't be a soft fork, it will be an impossibility.  The miners of the future will be few in number and hostile to the ideas of bitcoin.  This is the reality that we need to design for.  Entities in control of a node will want to keep the price of maintaining a node as high as possible, so that they can control access to the information in the blockchain.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
It would seem that there could be a simple mathematical progressive increase/decrease, based on the actual block chain needs and realities of the time, that can work forever into the future.

This can be easily gamed by stuffing transactions into the blockchain, shutting out smaller players prematurely.
Thank you for contributing.
This was already mentioned earlier; you may have missed it.  Yes, it can possibly be gamed in the way you mention; it is just unlikely, unprofitable, and ineffective to do so.

The effect of such an "attack" is limited by:
1) Anomaly dropping
2) The % of blocks won
3) The disadvantage to those that do so, since they must transmit larger blocks
4) Even if this "attack" is performed with 100% success by all miners, the max size grows only a bit over 50% per year anyway (with the proposed numbers), so in the worst-case scenario it is about the same as Gavin's proposal.
5) It is counter-balanced, perhaps, by other miners who may want to shrink the limit and make inclusion in a block more valuable.

If you think that these factors are insufficient disincentive, and the benefits of such an attack are still worth it, please help us better understand why.

I maintain that I do not think we have the best answer yet, so these criticisms are valuable.  This is better than the other proposals we have seen so far simply because it accommodates an unpredictable future, but IMHO it is not yet good enough for implementation.  Regression testing against the previous block chain, and some more game-theory analysis, are still needed.
legendary
Activity: 1596
Merit: 1100
It would seem that there could be a simple mathematical progressive increase/decrease, based on the actual block chain needs and realities of the time, that can work forever into the future.

This can be easily gamed by stuffing transactions into the blockchain, shutting out smaller players prematurely.

legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

This.


To imagine I (or anyone) can predict the future would be engaging in hubris.
Thanks to Satoshi we do not have to predict anything, because the block chain will be there, in the future, telling us what is needed.

I've offered one option.  Heard one good criticism and responded to that with a modification that I think will resolve the concern.
Then outlined one research task to help further refine this option (regression testing with the block chain).

There is more work to be done here, that much is clear.  There are graphs to be plotted, data to be crunched, and code to be written.  There are a LOT of smart folks engaged in this; who else can step up with a critique, or spare some cycles to work on the data?
newbie
Activity: 17
Merit: 0
I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.
legendary
Activity: 1050
Merit: 1002
I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).

It is offered as an example of the sort of thing that can work, rather than a finished product.

This is the problem.

People don't seem to realize Gavin's proposal may be the best we can do. I happen to think it is. If anyone had a better idea we'd have heard it by now. We, the entire community, have brooded on this issue for months if not years now. Here is a spoiler alert: nobody can predict the future.

Did anyone ever stop to think Bitcoin couldn't work? I mean I have, not for reasons technological, but for reasons of solving issues via consensus. Have you ever watched a three-legged human race, you know where one leg gets tied to the leg of another person? The reason they're funny is because it's hard to coordinate two separate thinking entities with different ideas on how to move forward, the result being slow or no progress and falling over. That may be our fate and progress gets harder the more legs get tied in. That's the reason for taking action sooner rather than later.

I've posted it before, but I'll say it again. I think a big reason Satoshi left is because he took Bitcoin as far as he could. With Gavin and other devs coming on-board he saw there was enough technical expertise to keep Bitcoin moving forward. I don't think he thought he had any more ironclad valuable ideas to give Bitcoin. Its fate would be up to the community/world he released it into. Bitcoin is an experiment. People don't seem to want to accept that, but it is. What I'd love to see is somebody against Gavin's proposal offer an actual debatable alternative. Don't just say, sorry it has to be 1MB blocks and as for what else, well that's not our thought problem; and don't just say no we don't want Gavin's proposal because it doesn't matter-of-factly predict the future, and as for what else, well we don't know.

Come up with something else or realize we need to take a possibly imperfect route, but one which could certainly work, so that we take some route at all.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
It would seem that there could be a simple mathematical progressive increase/decrease, based on the actual block chain needs and realities of the time, that can work forever into the future.

Here is an example that can come close to Gavin's first proposal of 50% increase per year.

If the average block size of the last 2 weeks is 60-75% of the maximum, increase the maximum by 1%; if >75%, by 2%.
If the average block size of the last 2 weeks is 25-40% of the maximum, decrease the maximum by 1%; if <25%, by 2%.

Something like this would have no external dependencies, would adjust to whatever future events may come, and won't expire or need to be changed.

These percentage numbers are ones that I picked arbitrarily.  They are complete guesses, and I don't like them any more than any other number.  This is just to create a model of the sort of thing that would be better than extrapolating.  To do even better, we can run a regression analysis over previous blocks to see where we would be now and tune it further from there.

This may be manipulable:  miners with good bandwidth can start filling the blocks to capacity, to increase the max and push miners with smaller bandwidth out of competition.

Agreed.  And thank you for contributing.

It is offered as an example of the sort of thing that can work, rather than a finished product.
It is merely "better" not best.  I don't think we know of something that will work yet.
By better, I mean that Gavin gets his +50%/year, iff it is needed, and not if it isn't.  And if circumstances change, so does the limit.

If it is 100% manipulated, it is only as bad as Gavin's first proposal (+4% or so).
That, of course, could only happen if miners with good bandwidth won all blocks and also wanted to manipulate.

If we fear manipulation, we can add anomaly dropping and exclude the 10% most extreme values outside of standard variance (so that fully padded and empty blocks are dropped out of the calculations).
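One way to read "anomaly dropping" (my interpretation; the trim fraction is an arbitrary illustration) is a trimmed mean over the epoch's observed block sizes:

```python
# Trim the most extreme observations before averaging, so fully padded
# and empty blocks can't drag the size adjustment around.
def trimmed_mean(block_sizes, trim_fraction=0.10):
    """Mean block size after dropping trim_fraction/2 from each tail."""
    s = sorted(block_sizes)
    k = int(len(s) * trim_fraction / 2)   # observations to drop per tail
    kept = s[k:len(s) - k] if k else s
    return sum(kept) / len(kept)
```

A median, as suggested elsewhere in the thread, is the limiting case of this: trim everything except the middle value.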

It would be good to avoid creating any perverse incentives entirely wherever possible.

And again, the percentages chosen here are samples only, arbitrarily chosen.  A regression analysis of the block chain ought to be employed to determine where we would be with this sort of thing, as well as how it would affect the path forward.


The point here is to allow market forces to dictate.  If some miners want to shrink block size to make transactions more precious and extract fees, others will want to get those fees and increase block size.  We want something that can work in perpetuity, not a temporary fix which may get adjusted centrally whenever the whim arises.

Our guide must be math and measurement, not central committees, no matter how smart they may be.
sr. member
Activity: 333
Merit: 252
It would seem that there could be a simple mathematical progressive increase/decrease, based on the actual block chain needs and realities of the time, that can work forever into the future.

Here is an example that can come close to Gavin's first proposal of 50% increase per year.

If the average block size of the last 2 weeks is 60-75% of the maximum, increase the maximum by 1%; if >75%, by 2%.
If the average block size of the last 2 weeks is 25-40% of the maximum, decrease the maximum by 1%; if <25%, by 2%.

Something like this would have no external dependencies, would adjust to whatever future events may come, and won't expire or need to be changed.

These percentage numbers are ones that I picked arbitrarily.  They are complete guesses, and I don't like them any more than any other number.  This is just to create a model of the sort of thing that would be better than extrapolating.  To do even better, we can run a regression analysis over previous blocks to see where we would be now and tune it further from there.

This may be manipulable:  miners with good bandwidth can start filling the blocks to capacity, to increase the max and push miners with smaller bandwidth out of competition.
hero member
Activity: 709
Merit: 503
One never wants to see a burst of transactions sit in a queue for very long, waiting to be included in the chain.  Instead we should address the source of transactions, to make sure excessive spam is precluded or excluded.  Having a maximum block size at all is an aberration, except where there is some functional restriction; in the face of such a restriction, the only reasonable course forward would be to reduce the time between blocks.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
It would seem that there could be a simple mathematical progressive increase/decrease, based on the actual block chain needs and realities of the time, that can work forever into the future.

Here is an example that can come close to Gavin's first proposal of 50% increase per year.

If the average block size of the last 2 weeks is 60-75% of the maximum, increase the maximum by 1%; if >75%, by 2%.
If the average block size of the last 2 weeks is 25-40% of the maximum, decrease the maximum by 1%; if <25%, by 2%.

Something like this would have no external dependencies, would adjust to whatever future events may come, and won't expire or need to be changed.

These percentage numbers are ones that I picked arbitrarily.  They are complete guesses, and I don't like them any more than any other number.  This is just to create a model of the sort of thing that would be better than extrapolating.  To do even better, we can run a regression analysis over previous blocks to see where we would be now and tune it further from there.
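For concreteness, the band rule quoted above can be transcribed directly. This is my own sketch, not anything canonical: the 1MB floor is borrowed from elsewhere in the thread, and I read the second "2%" threshold as <25% (the "<29%" in some copies of the post looks like a typo):

```python
# One two-week adjustment of the max block size, per the bands above.
def adjust_max(avg_size, max_size):
    """avg_size: 2-week average block size; max_size: current limit (bytes)."""
    ratio = avg_size / max_size
    if ratio > 0.75:
        factor = 1.02
    elif ratio >= 0.60:
        factor = 1.01
    elif ratio < 0.25:
        factor = 0.98
    elif ratio <= 0.40:
        factor = 0.99
    else:
        factor = 1.00        # 40-60% band: leave the limit alone
    return max(1_000_000, int(max_size * factor))   # assumed 1MB floor
```

Worth noting: with 26 two-week periods per year, always hitting the +2% band compounds to 1.02**26, about 1.67x, i.e. roughly 67% per year at the absolute maximum, in the same ballpark as (somewhat above) Gavin's 50%.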
sr. member
Activity: 302
Merit: 250
Sure, we were also able to get X.25 and X.75 telecom to run over barbed wire, in the lab.  (There are places in the world that still use these protocols, some of which would deeply benefit from bitcoin in their area.)
The logistical challenges of implementation are not what you find in the lab.
This stuff has to go out in environments where someone backs a truck into a cross-country line so they can cut it and drive off with a few miles of copper to sell as scrap.  We live in the world, not in the lab.

We're in luck then, because one advantage of fiber lines over copper is that they're no good for anything other than telecom :)

I'm no telecommunications specialist, but do have an electronics engineering background. Raise some issue with fundamental wave transmission and maybe I can weigh in. My understanding is it's easier to install fiber lines, for example, because there is no concern over electromagnetic interference. Indeed, the fiber lines I witnessed being installed a week ago were being strung right from power poles.

However, is such theoretical discussion even necessary? We have people being offered 2Gbps bandwidth over fiber not in theory but in practice in Japan, today.

That's already orders of magnitude over our starting bandwidth numbers. I agree with Gavin that demand for more bandwidth is inevitable. It's obvious all networks are converging - telephone, television, radio, internet. We'll eventually send all our data over the internet, as we largely do now, with ever increasing bandwidth usage. To imagine progress in technology will somehow stop for no apparent reason, when history is chock full of people underestimating the technological capacity we actually achieve, is not only shortsighted, it borders on unbelievable.

Perhaps few disagree that Bitcoin can be improved by a plan for block size maximum adjustment.  My issue with the proposals is less what they achieve (a good thing) than what they don't: preventing this from having to happen again in the future.

There are myriad external realities that we can not know about.  The development of the telecom technology is perhaps less the issue than what the world has in store for us in the coming decades.  I don't know, and no one else does either, but that shouldn't stop us from striving to achieve what has not been done before.

Undersea cables are cut accidentally, and by hostile actions, economic meltdowns and military conflicts halt or destroy deployments, plagues, natural disasters etc, OR new developments can accelerate everything, robots might do this all for us.  We can't know by guessing today what the right numbers will be.  We could be high or low.  I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).  We do not have a crisis today other than some pending narrow business concerns (some of which are on the board of TBF and possibly suggested that Gavin "do something").  I am also thankful that he is doing so.  This is an effort that deserves attention (even with the other mitigating efforts already in development).  Gavin is a forward thinking man, and is serving his role well.  We should be all glad that he is not alone in this, and that no one person has the power to make such decisions arbitrarily for others.

The difficulty adjustment algorithm works without knowing the future.  We should similarly look for a way that can also work for many generations, come what may, and save Bitcoin from as many future hard forks as we can. 

This is our duty, to our future, by virtue of us being here at this time.

Decreasing the block limit (note, not required block size) in the future would not be a hard fork, it would be a soft fork.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Sure, we were also able to get X.25 and X.75 telecom to run over barbed wire, in the lab.  (There are places in the world that still use these protocols, some of which would deeply benefit from bitcoin in their area.)
The logistical challenges of implementation are not what you find in the lab.
This stuff has to go out in environments where someone backs a truck into a cross-country line so they can cut it and drive off with a few miles of copper to sell as scrap.  We live in the world, not in the lab.

We're in luck then, because one advantage of fiber lines over copper is that they're no good for anything other than telecom :)

I'm no telecommunications specialist, but do have an electronics engineering background. Raise some issue with fundamental wave transmission and maybe I can weigh in. My understanding is it's easier to install fiber lines, for example, because there is no concern over electromagnetic interference. Indeed, the fiber lines I witnessed being installed a week ago were being strung right from power poles.

However, is such theoretical discussion even necessary? We have people being offered 2Gbps bandwidth over fiber not in theory but in practice in Japan, today.

That's already orders of magnitude over our starting bandwidth numbers. I agree with Gavin that demand for more bandwidth is inevitable. It's obvious all networks are converging - telephone, television, radio, internet. We'll eventually send all our data over the internet, as we largely do now, with ever increasing bandwidth usage. To imagine progress in technology will somehow stop for no apparent reason, when history is chock full of people underestimating the technological capacity we actually achieve, is not only shortsighted, it borders on unbelievable.

Perhaps few disagree that Bitcoin can be improved by a plan for block size maximum adjustment.  My issue with the proposals is less what they achieve (a good thing) than what they don't: preventing this from having to happen again in the future.

There are myriad external realities that we can not know about.  The development of the telecom technology is perhaps less the issue than what the world has in store for us in the coming decades.  I don't know, and no one else does either, but that shouldn't stop us from striving to achieve what has not been done before.

Undersea cables are cut accidentally, and by hostile actions, economic meltdowns and military conflicts halt or destroy deployments, plagues, natural disasters etc, OR new developments can accelerate everything, robots might do this all for us.  We can't know by guessing today what the right numbers will be.  We could be high or low.  I am just hoping that some more serious thought goes into avoiding the need to guess or extrapolate (an educated guess but still a guess).  We do not have a crisis today other than some pending narrow business concerns (some of which are on the board of TBF and possibly suggested that Gavin "do something").  I am also thankful that he is doing so.  This is an effort that deserves attention (even with the other mitigating efforts already in development).  Gavin is a forward thinking man, and is serving his role well.  We should be all glad that he is not alone in this, and that no one person has the power to make such decisions arbitrarily for others.

The difficulty adjustment algorithm works without knowing the future.  We should similarly look for a way that can also work for many generations, come what may, and save Bitcoin from as many future hard forks as we can. 

This is our duty, to our future, by virtue of us being here at this time.