
Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive - page 4. (Read 14297 times)

hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
NewLiberty, thanks again for taking the time to explain your point of view.

The reason, by the way, I was asking the earlier questions was that I actually didn't know the answers. In particular, this answer (happily) surprised me:

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Regarding Nielsen's law:
Yes, I disagree.  
Both block size and transaction fee may be better tools than Nielsen's law, and the combination may be better still.  Dismissing inquiry on the matter is a missed opportunity.

I don't disagree that Nielsen's law is inaccurate; however, I remain quite skeptical that there's something in the blockchain that can more accurately predict grandma's computing resources. Having said that, I think I'm misunderstanding your goal here (and I'm maybe OK with that): it seems as though you're not interested in using grandma's computing resources as a block size limit; you'd prefer a much lower bound at times when transaction volume isn't growing.

My biggest concern with the alternatives discussed in this thread isn't the potential for unchecked growth, but rather the potential for miners imposing artificial scarcity (hence my first question, for which I expected a different response).

For example, in the first algorithm you suggested, a majority mining cartel could artificially limit the max block size, preventing a mining minority from including transactions. It's this lack of free-market choice that I'd disagree with.
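
As a toy illustration of that concern (this is not your exact algorithm; the rule and numbers here are made up), imagine the cap were set from the median of recent block sizes:

Code:
# Hypothetical feedback rule (NOT the exact proposal from this thread):
# next max block size = 2 x the median size of the last 100 blocks.
# A hashrate majority mining tiny blocks drags the median (and the cap) down.
import random
from statistics import median

def next_cap(recent_sizes):
    return 2 * median(recent_sizes)

random.seed(1)
cap = 1_000_000          # start at 1 MB
cartel_share = 0.6       # fraction of blocks found by the cartel
for period in range(10):
    blocks = []
    for _ in range(100):
        if random.random() < cartel_share:
            blocks.append(1_000)           # cartel mines near-empty blocks
        else:
            blocks.append(int(cap * 0.8))  # others mine fairly full blocks
    cap = next_cap(blocks)
print(f"cap after 10 periods: {cap:,} bytes")  # collapses toward ~2,000 bytes

Under a rule shaped like that, a 60% cartel squeezes the cap down no matter how full the minority's blocks are, which is exactly the free-market choice I'd want to preserve.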

If the difference between average block size and max block size were an order of magnitude or two, I'd find it much more agreeable.

My (ideal) goals, in particular, would be to (1) never kick out grandma, and (2) never prevent a miner from including a legitimate transaction. (Edited to add: those are in priority order.)
member
Activity: 129
Merit: 14
Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course, supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

Gavin
Thanks for being so responsive on this issue, although I am still not fully convinced by the block size economics post.

Suppliers of “commodities” need to make a profit, and in this case, if mining is competitive, the difficulty will adjust and miners' profit will reach a new equilibrium level.  The question is: what is the equilibrium level of difficulty?  Letting normal market forces work means the price reaches some level; however, this market has a “positive externality”, which is network security.  Using an artificially low block size limit could be a good, effective and transparent way of manipulating the market to ensure network security.

Network security can be looked at in two ways:
1.   The network hashrate
2.   Aggregate mining revenue per block (since, in theory at least, the cost of renting the Bitcoin network’s hashrate to attack it could be related to mining revenue)

Mining revenue is therefore an important factor in network security.  Please weigh this carefully when considering the maximum block size issue.  To be clear, I am not saying it shouldn’t increase above 1MB; I think it should.  However, please consider mining incentives once the block reward falls as one of the factors.  Bandwidth and the related technical issues should not be the only consideration.
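
To make the second view concrete, here is the kind of back-of-envelope arithmetic I have in mind (the fee figure is only an assumed ballpark):

Code:
# Rough security-budget arithmetic (illustrative, assumed numbers only).
subsidy_btc = 25.0      # current block subsidy
fees_btc = 0.1          # assumed average total fees per block today
blocks_per_day = 144    # ~one block every 10 minutes

revenue_per_block = subsidy_btc + fees_btc
revenue_per_day = revenue_per_block * blocks_per_day
print(f"aggregate miner revenue: ~{revenue_per_block:.1f} BTC/block, "
      f"~{revenue_per_day:,.0f} BTC/day")

# If the cost of renting enough hashrate to attack the network scales with
# this revenue, then as the subsidy halves, fees must grow to keep the
# security budget (and hence the attack cost) from shrinking.

Today almost all of that revenue is subsidy; the point is what happens to the total once the subsidy falls.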
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can shrink what is actually stored and transferred; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.

Very well.

No metric that can be gleaned from the blockchain has a causal relationship with "the bandwidth and disk space an average enthusiast can afford", and therefore any such predictor has a high danger of being either too restrictive or not restrictive enough.

Using Nielsen's Law also has a danger of being inaccurate; however, given that it has at least been historically accurate, I find this danger much lower.

Do you disagree? (let's leave ossification out of this just for the moment, if you don't mind)

Thank you.  You saved yourself a lot of time.  I had enough of the Socratic method in law school.  And we'll set aside ossification for your benefit, even though it cuts against your position here.

Yes, I disagree. 
Both block size and transaction fee may be better tools than Nielsen's law, and the combination may be better still.  Dismissing inquiry on the matter is a missed opportunity.

Having worked in multinational telcos for a few decades, designing resilient, scalable systems serving 193+ countries, managing teams of security software engineers, and holding responsibility for security and capacity management, I find the concepts are not so foreign.  The benefit of something like the block chain providing consolidated data to rightsize applications over time for their audience is ripe fruit.


Nielsen's law is less fit for purpose. 
1) It has measured fixed-line connections.
- Usage demographics have changed over the period of history it covers.  More connections are mobile now than previously, and telco resources and growth have shifted.  There are other shifts to come.  These are not accommodated in the historical averages, nor are they factored into the current ones under Nielsen.

2) It is not a measure of the average enthusiast.
- It measures a first-world enthusiast, whose means have improved with age, in a rich nation with good infrastructure in a time of peace.  This is not average with respect to place and time through history.

3) Following bandwidth growth is not the only function of max block size, though tying it to the average enthusiast's capabilities (if that were possible) would be a suitable way of addressing other functions.
- Ultimately it must accommodate transactions with sufficient fees to maintain the network, and not constrain reasonable commerce.  These will be business decisions, which may depend on the capacity and cost of the Bitcoin network and its associated fees.  These may radically bend the curve in one way or another.  A fixed, non-responsive rate cannot be flexible in a changing environment.  A requirement for central decision makers to accommodate (or not) such changes puts perverse incentives on Bitcoin developers; avoiding that requirement is part of the point.

I get that the core devs (and former core devs) do have to deal with a lot of crazies.  But what is not needed are the "either you agree with me or you are stupid, crazy, or lazy" dismissals in place of doing real science rather than mere technicians' work.  Science is hard, but it is often worth it.

I recall Gavin's talk in San Jose in 2013 being a lot more nuanced on this matter, and it looked like there were real solutions coming, with a future-proof, market-sensitive approach.  That conference was better in many ways than the one TBF held this year in Amsterdam.

That earlier stance was optimistic and well founded, yet it was abandoned.  The explanations for why it was abandoned don't seem compelling at all.


In my first proposal in this thread, I mirrored Gavin's Nielsen's-law approach with a simple algorithm that replicated it in effect but took its cues from the block chain (so growth would stop or accelerate if real-world circumstances changed).  This was simply an exercise to show that it would be easy enough to do.
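
For illustration only (these parameters are placeholders, not the exact figures from that earlier post), the shape of such a rule is roughly:

Code:
# Sketch of a block-chain-cued growth rule (hypothetical parameters).
# Target roughly 50%/year growth, but only while blocks are actually filling up;
# if demand stalls, the cap stops growing with it.
def adjust_cap(current_cap, recent_block_sizes):
    """Run once per retarget-like period (~2016 blocks, ~2 weeks)."""
    per_period_growth = 1.5 ** (1 / 26) - 1   # ~1.6%/period compounds to ~50%/year
    avg = sum(recent_block_sizes) / len(recent_block_sizes)
    utilization = avg / current_cap
    if utilization > 0.5:
        # real demand: allow growth, capped per period
        return int(current_cap * (1 + per_period_growth))
    # demand flat or falling: hold the cap
    return current_cap

The point isn't this particular rule; it's that the chain itself carries enough information to throttle or release growth without anyone picking new constants.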
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can shrink what is actually stored and transferred; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.

Very well.

No metric that can be gleaned from the blockchain has a causal relationship with "the bandwidth and disk space an average enthusiast can afford", and therefore any such predictor has a high danger of being either too restrictive or not restrictive enough.

Using Nielsen's Law also has a danger of being inaccurate; however, given that it has at least been historically accurate, I find this danger much lower.

Do you disagree? (let's leave ossification out of this just for the moment, if you don't mind)
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can shrink what is actually stored and transferred; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?

The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can shrink what is actually stored and transferred; the block size is measured after decompression.

Fair enough- block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple parameter to tweak.

Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?

I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?

Yes.  Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network and disk compression can shrink what is actually stored and transferred; the block size is measured after decompression.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.

Thank  you.

Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer

I don't know why there's so much discussion about the max block size when the real issue should be how you are going to increase adoption (more people willing to do more txs and pay more fees to keep the network secure) so that Bitcoin is sustainable once the block subsidy is much lower or zero in the future.

There are WAY more people working on the adoption issue than there are on this one.  If your point is that I should go do something else: granted.  If this is done properly, I would certainly be doing something else.  But consider that doing it properly is accretive to adoption as well.

In later days, Bitcoin will be supported by its tx fees.  Currently fees provide about 1/300th of the miner payment.
We are doing about 1 tx per second or so, and the limit is about 7 tx per second, so now is the time to address this.
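
That ceiling is just arithmetic (assuming an average transaction of roughly 250 bytes, which is an assumption for illustration):

Code:
# Back-of-envelope for the current throughput ceiling.
max_block_bytes = 1_000_000   # 1 MB cap
avg_tx_bytes = 250            # assumed average transaction size
block_interval_s = 600        # ~10 minutes

tx_per_block = max_block_bytes / avg_tx_bytes      # 4,000 tx per block
tx_per_second = tx_per_block / block_interval_s    # ~6.7, i.e. about 7 tx/s
print(round(tx_per_second, 1))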

It takes time to do things right.  The alternative is that we just patch it and move on, sweeping the problem under the rug until the next time it needs to be patched.  I think that is the likely outcome, and that we (or our children) may regret it.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
NewLiberty, I have a quick question for you which will hopefully clarify your position in my mind.

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?

Primarily the limit is a safeguard, a backstop.  It is not meant to be a constraint on legitimate commerce.
It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable.
Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.
legendary
Activity: 1260
Merit: 1019
Quote
It is certainly true that nobody can predict the future with 100% accuracy.

I can.
Bitcoin will die in 5 months.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
NewLiberty, I have a quick question for you which will hopefully clarify your position in my mind.

Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?
sr. member
Activity: 252
Merit: 250
Coin Developer - CrunchPool.com operator

I don't know why there's so much discussion about the max block size when the real issue should be how you are going to increase adoption (more people willing to do more txs and pay more fees to keep the network secure) so that Bitcoin is sustainable once the block subsidy is much lower or zero in the future.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Are you simply unaware of the other ways scalability is limited and only focused on this one?
We can go into it if you like.
I was looking to keep this discussion on the more narrow issue.

Start a new thread.  HAVE you read my Scalability Roadmap blog post?

I read it, I offered some criticisms in the thread by that title a while back.

It is nice and theoretical.  There are practical things it misses (such as the zero-cost mining that does occur in the real world from time to time, when the equipment is not owned by the person controlling it).
There are also non-economic actors that do things for reasons other than money, and actors working within larger economies (in which Bitcoin is only a minor part) with entirely different agendas.

It was a nice blog post and explained things in a simple way under ideal conditions.  I would refer people who need a primer on the matter to it.


In basic physics we give students problems that assume they are operating in a vacuum.  Basic economics also does this.  The real world is more complex.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
Are you simply unaware of the other ways scalability is limited and only focused on this one?
We can go into it if you like.
I was looking to keep this discussion on the more narrow issue.

Start a new thread.  HAVE you read my Scalability Roadmap blog post?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Yes, I think Bitcoin can surpass this.  There are other limits on scalability besides the block size that must be overcome to get there; this is just one.
And we agree on the purpose of the block size limit.  Just not on how to set it.

You don't know what the future market will look like.  You don't know what bandwidth or storage will be available.  Neither do I or anyone else.

When you respond to me saying patronizing things like "there are other problems with the way scalability is limited," I have trouble not thinking you are either confused or insane. Or just lazy, and did not read my "Scalability Roadmap" blog post.

It is certainly true that nobody can predict the future with 100% accuracy. We might get hit by an asteroid before I finish this sentence. (whew! didn't!)

But extrapolating current trends seems to me to be the best we can do-- we are just as likely to be too conservative as too aggressive in our assumptions.

Are you simply unaware of the other ways scalability is limited and only focused on this one?
We can go into it if you like.
I was looking to keep this discussion on the more narrow issue.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
Yes, I think Bitcoin can surpass this.  There are other limits on scalability besides the block size that must be overcome to get there; this is just one.
And we agree on the purpose of the block size limit.  Just not on how to set it.

You don't know what the future market will look like.  You don't know what bandwidth or storage will be available.  Neither do I or anyone else.

When you respond to me saying patronizing things like "there are other problems with the way scalability is limited," I have trouble not thinking you are either confused or insane. Or just lazy, and did not read my "Scalability Roadmap" blog post.

It is certainly true that nobody can predict the future with 100% accuracy. We might get hit by an asteroid before I finish this sentence. (whew! didn't!)

But extrapolating current trends seems to me to be the best we can do-- we are just as likely to be too conservative as too aggressive in our assumptions.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
I continue to maintain that market forces can rightsize MAX_BLOCK_SIZE if an algorithm with a feedback mechanism can be introduced, and that doing so introduces both less centralization risk than an arbitrary patch, and less risk of future manual arbitrary adjustments.
Fix it right, fix it once.

I think you are confusing MAX_BLOCKSIZE with the floating, whatever-the market-demands blocksize.

MAX_BLOCKSIZE is, in my mind, purely a safety valve-- a "just in case" upper limit to make sure it doesn't grow faster than affordable hardware and software can support.

Ideally, we never bump into it. If we go with my proposal (increase to 20MB now, then double ten times over the next twenty years) I think it is reasonably likely the market-determined size will never bump into MAX_BLOCKSIZE.

I think it is very unlikely that in 20 years we will need to support more Bitcoin transactions than all of the cash, credit card and international wire transactions that happen in the world today (and that is the scale of transactions that a pretty-good year-2035 home computer and network connection should be able to support).


No, I am not confused on this matter.  I don't know why you would imagine this.  
It seems weird and bizarre (as if you imagine anyone who disagrees with your proposal must obviously be confused or insane...)


Visa today does about 2000 tx per second at average (non-peak) times.
Yes, I think Bitcoin can surpass this.  There are other limits on scalability besides the block size that must be overcome to get there; this is just one.
And we agree on the purpose of the block size limit.  Just not on how to set it.
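
For scale, handling that rate on-chain would mean blocks on the order of (rough arithmetic, with a ~250-byte average transaction assumed for illustration):

Code:
# Block size needed to carry ~2,000 tx/s on-chain.
tx_per_second = 2_000
avg_tx_bytes = 250            # assumed average transaction size
block_interval_s = 600        # ~10 minutes

bytes_per_block = tx_per_second * block_interval_s * avg_tx_bytes
print(f"~{bytes_per_block / 1e6:.0f} MB per block")   # ~300 MB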

You don't know what the future market will look like.  You don't know what bandwidth or storage will be available.  Neither do I or anyone else.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
I continue to maintain that market forces can rightsize MAX_BLOCK_SIZE if an algorithm with a feedback mechanism can be introduced, and that doing so introduces both less centralization risk than an arbitrary patch, and less risk of future manual arbitrary adjustments.
Fix it right, fix it once.

I think you are confusing MAX_BLOCKSIZE with the floating, whatever-the market-demands blocksize.

MAX_BLOCKSIZE is, in my mind, purely a safety valve-- a "just in case" upper limit to make sure it doesn't grow faster than affordable hardware and software can support.

Ideally, we never bump into it. If we go with my proposal (increase to 20MB now, then double ten times over the next twenty years) I think it is reasonably likely the market-determined size will never bump into MAX_BLOCKSIZE.

I think it is very unlikely that in 20 years we will need to support more Bitcoin transactions than all of the cash, credit card and international wire transactions that happen in the world today (and that is the scale of transactions that a pretty-good year-2035 home computer and network connection should be able to support).
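
Worked out, that schedule implies (simple arithmetic on the stated numbers):

Code:
# What "20 MB now, then double ten times over twenty years" works out to.
start_mb = 20
doublings = 10
years = 20

final_mb = start_mb * 2 ** doublings          # 20,480 MB, i.e. ~20 GB blocks by ~2035
annual_growth = 2 ** (doublings / years) - 1  # ~0.414, i.e. roughly 41% per year
print(f"{final_mb:,} MB cap, ~{annual_growth:.0%}/year growth")

Roughly 41% per year, a bit below the 50%/year figure in this thread's title.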
legendary
Activity: 3878
Merit: 1193
Yeah, something that looks at block sizes over a past period of time to determine the next max block size for a certain period would be OK, for example.

Any feedback loop can be gamed. You might as well just pick a fixed +xx% per year and be done with it.