
Topic: Konrad S. Graf finally weighs in on the Bitcoin block size debate (Read 709 times)

member
Activity: 115
Merit: 10
It's very dangerous to increase the max block size at all. As we know, miners would simply include all transactions to infinity and bloat up the blocks, hurting the decentralisation of Raspberry Pi nodes on rural ISPs.

Unfortunately, Bitcoin was never designed to rely on free market incentives for security. We are lucky that Gregory Maxwell has taken the burden upon himself to secure the system by setting miners' production levels and tx pricing for them. This keeps us safe and censorship resistant.

With segwit in April and Lightning Hubs out this summer, enough of the transaction volume will have been moved off the Bitcoin blockchain that it won't be a concern any longer.

Tl;dr:
This Konrad Graf guy doesn't know what he's talking about.
legendary
Activity: 2674
Merit: 2965
Terminated.
What do you think would be "good limitations"?
You're asking quite a difficult question. I can't really tell you what the perfect combination is. There have been several different proposals: some would give miners more control, some base the limit on the average block size over the previous X period of time, and so on. More research and testing is needed in this area to determine which proposal would be right.
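As a very rough illustration of the rolling-average family of proposals, it could look something like this (every parameter here is made up for illustration, not taken from any actual BIP):

Code:
# Sketch of a rolling-average dynamic block size limit.
# Window length, growth cap and floor are illustrative placeholders.

def next_block_size_limit(recent_block_sizes, current_limit,
                          max_growth=2.0, floor=1_000_000):
    """Derive the next limit from the average size of recent blocks (bytes)."""
    avg = sum(recent_block_sizes) / len(recent_block_sizes)
    proposed = 2 * avg                                     # target twice the recent average
    proposed = min(proposed, current_limit * max_growth)   # cap how fast the limit can grow
    return int(max(proposed, floor))                       # never drop below the floor

# Example: blocks averaging ~900 kB under a 1 MB limit
print(next_block_size_limit([900_000] * 2016, 1_000_000))  # -> 1800000

Those knobs (the window, the growth cap, the floor) are exactly where the "good limitations" question bites, since miners who can fill blocks with cheap transactions can drag any average upward over time.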
newbie
Activity: 37
Merit: 0
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

...

Did you read the article? He doesn't advocate for removal of the block size limit. He does seem in favor of keeping the limit well above the average block size, whether by a series of hard forks or a dynamically adjusting block size limit, but he doesn't really take a strong position one way or another. He does briefly discuss what he thinks would happen in the absence of a block size limit on the last page, but I certainly wouldn't say he's pushing for it.

If you didn't read it, I would recommend checking it out - I think he presents some nuanced and logical arguments about what is happening and could happen under different scenarios.

No, I barely skimmed the title.

My comments are about block size removal in general, as that is the pertinent issue on people's minds with this whole Craig Wright fiasco.

The issue at hand is that Gavin's BIP101 proposal had an absolutely absurd block size limit of 8MB, which doubled every two years to reach 8GB near the end of its term, and I want people to realize that this is essentially the same as removing the block size limit altogether.

I was not making any comments on his specific opinion if that is what you thought.

You should definitely give it a read when you get the chance.
sr. member
Activity: 281
Merit: 250
The Gold Standard of Digital Currency.
The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.
Correct.

numpties!

even with the 1mb still in place this summer.. segwit allows for up to 1.8mb of data (think about it: a hard rule saying 1mb is the limit being abused to actually allow 1.8mb).. then comes confidential payment codes, which combined allow over 2.5mb of data whilst the blocksize limit is still set to 1mb..

can anyone see the hypocrisy of the blockstreamers yet..

2.5mb of their features (meaning you have to use their software and the different signing algorithm) all for what.. well it definitely won't be 2.5x current capacity. in fact it's more like 1.8x capacity.

then in 2017 they will finally give in to moving the coded block limit to 2mb.. but with their features it will be REAL data of over 5mb.
so we ask, will this 5mb be 5x today's capacity?
nope.

will it be 4x today's capacity?
nope.

it will be 3.6x today's capacity..

so next time a blockstreamer tells you that right now 2mb is bad because the network cannot cope, ask them these questions:

1. ignoring the light, pruned, no-witness fluff and concentrating on a true full relay, full archival node: how much REAL data is being pushed when segwit and confidential payment codes are released?

2. is it harmless to blindly tell people that running pruned, light or no-witness mode poses no threat, knowing that those are not true full nodes?

3. knowing the answers to 1 and 2, how can blockstreamers really argue the data propagation debate, the full node count dilution debate, the capacity per mb debate... and still think that blockstream's roadmap is better than other, simpler solutions?

Right now, network bandwidth can handle a 5MB block every 10 minutes without cutting off major portions of the world from being able to run full nodes (feasibly, i.e. without saturating their pipes). This basically amounts to an ISDN or better connection, which in itself is reasonable.

The additional problem with increasing the block size to 5MB (or anything substantially large, for that matter) is that some clever exploiters (mining farms, mainly) may then spam the network with cheap transactions to pad blocks out to their maximum size. That bloats casual computers' storage, which may centralize the network around those particular farms and give them the ability to dictate the chain's direction.

I'm all for a 2-4MB block size for bitcoin though.. At the very least it'd be a stopgap measure for the current transaction volume issues.
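The ISDN figure checks out on a napkin, at least for raw block download. A quick sketch (it deliberately ignores transaction relay, serving peers and propagation bursts, which push real requirements several times higher):

Code:
# Back-of-the-envelope: sustained download needed just to keep up with
# blocks of a given size. Ignores tx relay, serving peers and bursts.
def min_bandwidth_kbit(block_mb, interval_min=10):
    bits = block_mb * 1_000_000 * 8
    return bits / (interval_min * 60) / 1000

for size in (1, 2, 5, 8):
    print(f"{size} MB blocks: ~{min_bandwidth_kbit(size):.0f} kbit/s")
# 5 MB blocks work out to ~67 kbit/s, roughly a single 64 kbit/s ISDN channel.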

legendary
Activity: 3136
Merit: 1116
...

No, I barely skimmed the title.
...

I was not making any comments on his specific opinion if that is what you thought.

Yea, I thought you might be, you know, commenting on what the thread is actually about Tongue
legendary
Activity: 4396
Merit: 4755
The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.
Correct.

numpties!

even with the 1mb still in place this summer.. segwit allows for up to 1.8mb of data (think about it: a hard rule saying 1mb is the limit being abused to actually allow 1.8mb).. then comes confidential payment codes, which combined allow over 2.5mb of data whilst the blocksize limit is still set to 1mb..

can anyone see the hypocrisy of the blockstreamers yet..

2.5mb of their features (meaning you have to use their software and the different signing algorithm) all for what.. well it definitely won't be 2.5x current capacity. in fact it's more like 1.8x capacity.

then in 2017 they will finally give in to moving the coded block limit to 2mb.. but with their features it will be REAL data of over 5mb.
so we ask, will this 5mb be 5x today's capacity?
nope.

will it be 4x today's capacity?
nope.

it will be 3.6x today's capacity..

so next time a blockstreamer tells you that right now 2mb is bad because the network cannot cope, ask them these questions:

1. ignoring the light, pruned, no-witness fluff and concentrating on a true full relay, full archival node: how much REAL data is being pushed when segwit and confidential payment codes are released?

2. is it harmless to blindly tell people that running pruned, light or no-witness mode poses no threat, knowing that those are not true full nodes?

3. knowing the answers to 1 and 2, how can blockstreamers really argue the data propagation debate, the full node count dilution debate, the capacity per mb debate... and still think that blockstream's roadmap is better than other, simpler solutions?
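For anyone who wants to check the ~1.8x maths: segwit counts base data at 4x and witness data at 1x against a 4,000,000 weight cap, so how much total data fits depends on the witness share of the transaction mix. A rough sketch (the witness shares below are assumptions, not measurements):

Code:
MAX_WEIGHT = 4_000_000  # consensus cap: base bytes weigh 4, witness bytes weigh 1

def max_block_bytes(witness_fraction):
    """Largest total block (bytes) fitting MAX_WEIGHT for a given witness share."""
    weight_per_byte = 4 * (1 - witness_fraction) + witness_fraction
    return MAX_WEIGHT / weight_per_byte

for wf in (0.0, 0.5, 0.6, 0.75):
    print(f"witness share {wf:.0%}: ~{max_block_bytes(wf)/1e6:.2f} MB")
# 0% witness   -> 1.00 MB (legacy-only blocks)
# ~60% witness -> ~1.82 MB, roughly the 1.8x figure quoted above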
sr. member
Activity: 281
Merit: 250
The Gold Standard of Digital Currency.
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

...

Did you read the article? He doesn't advocate for removal of the block size limit. He does seem in favor of keeping the limit well above the average block size, whether by a series of hard forks or a dynamically adjusting block size limit, but he doesn't really take a strong position one way or another. He does briefly discuss what he thinks would happen in the absence of a block size limit on the last page, but I certainly wouldn't say he's pushing for it.

If you didn't read it, I would recommend checking it out - I think he presents some nuanced and logical arguments about what is happening and could happen under different scenarios.

No, I barely skimmed the title.

My comments are about block size removal in general, as that is the pertinent issue on people's minds with this whole Craig Wright fiasco.

The issue at hand is that Gavin's BIP101 proposal had an absolutely absurd block size limit of 8MB, which doubled every two years to reach 8GB near the end of its term, and I want people to realize that this is essentially the same as removing the block size limit altogether.

I was not making any comments on his specific opinion if that is what you thought.
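For scale, BIP101's doubling schedule is easy to work out (a quick sketch; the actual proposal also interpolated linearly between the doubling points rather than jumping):

Code:
# BIP101-style growth: 8 MB starting cap, doubling every two years.
start_mb = 8
for years in range(0, 21, 2):
    print(f"year +{years:2d}: {start_mb * 2 ** (years // 2):>5} MB")
# After 10 doublings the cap reaches 8 * 2**10 = 8192 MB, i.e. ~8 GB,
# which is why critics treat it as effectively no limit at all.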
newbie
Activity: 37
Merit: 0
Why does everybody keep focusing on the blocksize issue, when the real issue is block generation frequency? Is it just that blocksize is an apparently trivial change that everybody can understand, but they don't appreciate the broader implications, such as possible mining centralisation?

I think a lot of people would argue that mining centralization is caused by economies of scale found in the bitcoin mining industry and wouldn't be affected much in either direction by any change (or lack thereof) in the block size.
legendary
Activity: 3136
Merit: 1116
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

...

Did you read the article? He doesn't advocate for removal of the block size limit. He does seem in favor of keeping the limit well above the average block size, whether by a series of hard forks or a dynamically adjusting block size limit, but he doesn't really take a strong position one way or another. He does briefly discuss what he thinks would happen in the absence of a block size limit on the last page, but I certainly wouldn't say he's pushing for it.

If you didn't read it, I would recommend checking it out - I think he presents some nuanced and logical arguments about what is happening and could happen under different scenarios.
sr. member
Activity: 281
Merit: 250
The Gold Standard of Digital Currency.
Why does everybody keep focusing on the blocksize issue, when the real issue is block generation frequency? Is it just that blocksize is an apparently trivial change that everybody can understand, but they don't appreciate the broader implications, such as possible mining centralisation?

Frequency raises the issue of time-warp attacks and network propagation delays. This problem has been solved in GoldCoin (GLD), which can handle 10X Bitcoin's current transaction volume.

See, most chains accept blocks with timestamps hours into the future or the past. GLD only accepts blocks that are at most 45 seconds into the future and 2 min 45 sec into the past, which makes it very difficult to pull off a time-warp attack on our chain.

It is not a simple change for bitcoin, as it would most likely invalidate some miners' blocks, and that may cause controversy.
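For anyone curious what that amounts to in code, here is a minimal sketch of the acceptance window described above (an illustration of the stated rule, not code from the GoldCoin client):

Code:
import time

MAX_FUTURE_DRIFT = 45          # seconds into the future
MAX_PAST_DRIFT = 2 * 60 + 45   # 2 min 45 sec into the past

def timestamp_acceptable(block_time, now=None):
    """Accept only block timestamps inside the narrow window around local time."""
    now = time.time() if now is None else now
    return (now - MAX_PAST_DRIFT) <= block_time <= (now + MAX_FUTURE_DRIFT)

# A block stamped 10 minutes in the past is simply rejected,
# leaving little room for timestamp manipulation.
print(timestamp_acceptable(time.time() - 600))  # False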
legendary
Activity: 2814
Merit: 2472
https://JetCash.com
Why does everybody keep focusing on the blocksize issue, when the real issue is block generation frequency? Is it just that blocksize is an apparently trivial change that everybody can understand, but they don't appreciate the broader implications, such as possible mining centralisation?
newbie
Activity: 37
Merit: 0
The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.
Correct. This would only work if all participants of the system were honest and meant no harm. This is obviously not the case with Bitcoin, as it keeps being attacked via various methods (FUD campaigns, spam, TX spam, etc.).

It is, however, far too early for this to happen. What is needed right now is a dynamic block size limit that expands according to a predictive model of the minimum increase in network bandwidth over time around the world.
There was talk about this being implemented at a later point. However, one has to set up very good 'limitations' so that the dynamic block size limit can't (hopefully) be abused in any way.

What do you think would be "good limitations"?
hero member
Activity: 1694
Merit: 502
★Bitvest.io★ Play Plinko or Invest!
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

It is possible, in a system with billions or trillions of clients or more, to remove the block size limits and simply let non-full-node clients connect to whatever is the most popular network out there; this may have been Satoshi's final intent.

It is, however, far too early for this to happen. What is needed right now is a dynamic block size limit that expands according to a predictive model of the minimum increase in network bandwidth over time around the world.

For me as an amateur this is a very simple and helpful explanation. You cleared up some things that confused me before about block size.
I see the logic in your explanation and the good side of a limited block size. I hope everything ends up in the most positive way for bitcoin.
legendary
Activity: 2674
Merit: 2965
Terminated.
The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.
Correct. This would only work if all participants of the system were honest and meant no harm. This is obviously not the case with Bitcoin, as it keeps being attacked via various methods (FUD campaigns, spam, TX spam, etc.).

It is, however, far too early for this to happen. What is needed right now is a dynamic block size limit that expands according to a predictive model of the minimum increase in network bandwidth over time around the world.
There was talk about this being implemented at a later point. However, one has to set up very good 'limitations' so that the dynamic block size limit can't (hopefully) be abused in any way.
newbie
Activity: 37
Merit: 0
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

It is possible, in a system with billions or trillions of clients or more, to remove the block size limits and simply let non-full-node clients connect to whatever is the most popular network out there; this may have been Satoshi's final intent.

It is, however, far too early for this to happen. What is needed right now is a dynamic block size limit that expands according to a predictive model of the minimum increase in network bandwidth over time around the world.

Good points.
sr. member
Activity: 281
Merit: 250
The Gold Standard of Digital Currency.
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?

The potential for abuse without a block-size limit is too high; if the block size limit is removed, then it will be the end of all chains that adopt the removal.

It is possible, in a system with billions or trillions of clients or more, to remove the block size limits and simply let non-full-node clients connect to whatever is the most popular network out there; this may have been Satoshi's final intent.

It is, however, far too early for this to happen. What is needed right now is a dynamic block size limit that expands according to a predictive model of the minimum increase in network bandwidth over time around the world.
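One way to read "a predictive model of the minimum increase in network bandwidth" is a fixed growth curve baked into consensus. A minimal sketch, with the 17.7%/year rate purely as a stand-in (it happens to match the conservative figure from BIP103, but the real worldwide minimum would need actual research):

Code:
ANNUAL_GROWTH = 0.177    # assumed worst-case yearly bandwidth growth (placeholder)
BASE_LIMIT_MB = 1.0
BASE_YEAR = 2016

def limit_for_year(year):
    return BASE_LIMIT_MB * (1 + ANNUAL_GROWTH) ** (year - BASE_YEAR)

for y in (2016, 2020, 2026, 2036):
    print(f"{y}: ~{limit_for_year(y):.1f} MB")
# roughly 1.0, 1.9, 5.1 and 26 MB respectively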
newbie
Activity: 37
Merit: 0
So Bitcoin.com just published an interview with Konrad S. Graf (https://news.bitcoin.com/konrad-graf-bitcoin-block-size-economy/). He's very well respected in this community for his economic/historical work on the origin of Bitcoin's value, so I think his views on the block size are important. Apparently, he views the 1 MB limit as something similar to a government-mandated output ceiling, and that it disrupts a block size 'free market,' so to speak. What do you guys think?