
Topic: Bitcoin 20MB Fork - page 77. (Read 154787 times)

legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 10, 2015, 06:07:58 AM
Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.

The frustration with arguing with many of you is that you come at this issue as though it were an economic problem.  It's not an economic problem.  Economically, the block size should not be artificially limited.

We agree on this, or were you just strawmanning? ;)


What it is, instead, is a technical problem, or a political problem, or an existential problem.  The problem is not whether miners will continue to get paid.  That hasn't been a problem for years.  Mining is so ridiculously huge at this point that any "security" achieved by a marginal increase in hashing power pales in comparison to other, much larger, existential threats to Bitcoin.  And those *do* exist.  What Bitcoin is attempting will not be a cakewalk.

Gavin has done a good job of laying out the technical limitations, which, frankly, are few.  He says the technical limit is somewhere beyond 16.7GB.  I have no reason to dispute this.  And I have seen no one actually attempt to dispute it.  If you think 20MB blocks are too large, you probably have sub-standard internet service.  I'm right there with you.  Most people probably have sub-standard internet service.

Which brings us to the real issue.  No one has done a decent job of laying out the political problems, and the existential threats posed by a block size increase.  A lot of people have made various insinuations that there is a plot against Bitcoin, which, if you read my posts, I would even tend to agree with.  Yet there is little concrete discussion of what that threat even is.  The threat is usage?  The threat is growth?  The threat is voluntary centralization?

*One* person has suggested that 2MB blocks are acceptably large.  Come on, be realistic.  2MB or 1MB, really just doesn't matter at all.  Such a limit is simply laughable.  What an idiotic hill to choose to die on.  Anyone who insists on such a limit would be part of the real "plot" against Bitcoin, as far as I'm concerned.  For all of your crying about "decentralization," to insist on crippling Bitcoin at a rate that is only useful for gigantic financial institutions is just embarrassing.  At that point, if that's the best you all can come up with, then it will be time to move on to plan B, because this iteration of Bitcoin will have failed.

This is just not a serious discussion, at all.  There are a dozen different possible outcomes, here.  There are a dozen different ways that Bitcoin can evolve in the future.  So far we have explored three, maybe four of them.  Please try to think a little outside the box.

Gavin has made a good start but it is only a beginning.  He's run some software testing and made some proposals.
He also looked up the historic data-network growth rates in North America and decided that that pattern is good enough to base the protocol upon: Nielsen's Law.  That is where we diverge.

Gavin would argue that since it is an upper boundary, it can only be "too low" and never "too high".  Where his reasoning fails is that there is a cost to the transaction data set, and this cost is multiplied by the number of times the data must be replicated across the network.

There are pernicious effects of permitting overly large blocks to be confirmed: increased orphaning, bandwidth attacks on smaller nodes, and excessive spamming, to name a few.

We agree that his proposal is 'the simplest that could possibly work'.  However, is it so much to ask for better than 'could possibly'?  We'll end up settling for 'the best we can do by the time we need to do something', and a number of folks would agree that his proposal isn't the best we can do.  It is merely an expedient one.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
February 10, 2015, 05:26:02 AM
Guys, I am trying to understand the whole thing, but please ELI5: how will my coins stored in paper wallets be affected? Will I have to sell them and buy new coins on the new fork, or will I be able to use my coins on both blockchains? I don't get it ??? If it's the case that we will have to sell our coins and buy new ones, I don't want that....
Bitcoin will always remain backwards compatible with cold storage. This change will only affect miners and nodes. Your private keys would control the same coins on both sides of a fork, so coins in paper wallets are unaffected.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
February 10, 2015, 05:23:00 AM
Wow, 56 pages and 500ish votes later..
Still no clear view about the issue.

Good.. I'm sure Bitcoin's anti-fragility will empower US even more. 8)


PS: but try to keep it a civilized and healthy debate, folks :)

This is a designated troll thread. The original debate thread was over and closed.
hero member
Activity: 748
Merit: 500
February 10, 2015, 05:20:29 AM
Guys, I am trying to understand the whole thing, but please ELI5: how will my coins stored in paper wallets be affected? Will I have to sell them and buy new coins on the new fork, or will I be able to use my coins on both blockchains? I don't get it ??? If it's the case that we will have to sell our coins and buy new ones, I don't want that....
legendary
Activity: 1260
Merit: 1002
February 10, 2015, 05:16:34 AM
Wow, 56 pages and 500ish votes later..
Still no clear view about the issue.

Good.. I'm sure Bitcoin's anti-fragility will empower US even more. 8)


PS: but try to keep it a civilized and healthy debate, folks :)
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
February 10, 2015, 04:55:05 AM
Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.

The frustration with arguing with many of you is that you come at this issue as though it were an economic problem.  It's not an economic problem.  Economically, the block size should not be artificially limited.
...

Neither Gavin nor you nor any of the hard-fork crowd seems anxious to answer this.

"We are targeting the top {n}% of world population by gross income being able to perform {n} native Bitcoin transactions per year and pay {x}% of their transaction values to miners as fees.  Here is our roadmap."

It seems unrealistic to complain about a lack of decent analysis without such a basic goal being stated.

It's interesting you use "roadmap" as your analogy. You are thinking in two dimensions. Bitcoin isn't about money management. Bitcoin will open transit systems (to extend your analogy) never thought possible. Bitcoin doesn't need to replace coinage, because I imagine material scientists will design physical bills and tokens that are very difficult to counterfeit. Bitcoin doesn't need to replace lending or credit, because trust is how people help each other. Instead it will create trustless contracts between normally unreliable and even hostile producers and consumers. It will be a tool for economic expansion, not bean counting. So to answer your fast-food-managerial-level question about who gets paid: it isn't how big the block reward or fees are; it's how much you save and allow to deflate that will pay the biggest dividends.
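For scale, the quoted "{n}% / {m} tx per year" template can be filled in with a back-of-envelope calculation. A sketch follows; every figure in it is an illustrative assumption, none comes from this thread:

Code:
# Rough reading of the "top {n}% performing {m} tx/year" target.
# All numbers below are illustrative assumptions.
population  = 7_000_000_000
top_share   = 0.10      # top 10% of world population by gross income
tx_per_year = 50        # native Bitcoin transactions per person per year
tx_bytes    = 250       # typical transaction size in bytes
block_secs  = 600       # target block interval

tx_per_sec = population * top_share * tx_per_year / (365 * 24 * 3600)
block_mb   = tx_per_sec * tx_bytes * block_secs / 1e6
print(f"{tx_per_sec:,.0f} tx/s -> ~{block_mb:.0f} MB blocks")

Under those assumptions the answer is roughly 1,110 tx/s, i.e. blocks on the order of 166 MB, which shows how sharply stating the goal pins down the block size question.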
hero member
Activity: 1328
Merit: 563
MintDice.com | TG: t.me/MintDice
February 10, 2015, 03:19:05 AM
Who is the #1 most credible poster or human in favor of keeping 1MB?
legendary
Activity: 4760
Merit: 1283
February 10, 2015, 02:05:05 AM
Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.

The frustration with arguing with many of you is that you come at this issue as though it were an economic problem.  It's not an economic problem.  Economically, the block size should not be artificially limited.
...

Neither Gavin nor you nor any of the hard-fork crowd seems anxious to answer this.

"We are targeting the top {n}% of world population by gross income being able to perform {n} native Bitcoin transactions per year and pay {x}% of their transaction values to miners as fees.  Here is our roadmap."

It seems unrealistic to complain about a lack of decent analysis without such a basic goal being stated.

sed
hero member
Activity: 532
Merit: 500
February 10, 2015, 01:42:34 AM
I'm following this topic closely nevertheless.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
February 09, 2015, 11:38:23 PM
Don't forget, there are altcoins with 1MB blocks every minute. They don't seem to be complaining about bandwidth problems.
legendary
Activity: 1330
Merit: 1000
February 09, 2015, 11:31:12 PM
Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.

The frustration with arguing with many of you is that you come at this issue as though it were an economic problem.  It's not an economic problem.  Economically, the block size should not be artificially limited.

What it is, instead, is a technical problem, or a political problem, or an existential problem.  The problem is not whether miners will continue to get paid.  That hasn't been a problem for years.  Mining is so ridiculously huge at this point that any "security" achieved by a marginal increase in hashing power pales in comparison to other, much larger, existential threats to Bitcoin.  And those *do* exist.  What Bitcoin is attempting will not be a cakewalk.

Gavin has done a good job of laying out the technical limitations, which, frankly, are few.  He says the technical limit is somewhere beyond 16.7GB.  I have no reason to dispute this.  And I have seen no one actually attempt to dispute it.  If you think 20MB blocks are too large, you probably have sub-standard internet service.  I'm right there with you.  Most people probably have sub-standard internet service.

Which brings us to the real issue.  No one has done a decent job of laying out the political problems, and the existential threats posed by a block size increase.  A lot of people have made various insinuations that there is a plot against Bitcoin, which, if you read my posts, I would even tend to agree with.  Yet there is little concrete discussion of what that threat even is.  The threat is usage?  The threat is growth?  The threat is voluntary centralization?

*One* person has suggested that 2MB blocks are acceptably large.  Come on, be realistic.  2MB or 1MB, really just doesn't matter at all.  Such a limit is simply laughable.  What an idiotic hill to choose to die on.  Anyone who insists on such a limit would be part of the real "plot" against Bitcoin, as far as I'm concerned.  For all of your crying about "decentralization," to insist on crippling Bitcoin at a rate that is only useful for gigantic financial institutions is just embarrassing.  At that point, if that's the best you all can come up with, then it will be time to move on to plan B, because this iteration of Bitcoin will have failed.

This is just not a serious discussion, at all.  There are a dozen different possible outcomes, here.  There are a dozen different ways that Bitcoin can evolve in the future.  So far we have explored three, maybe four of them.  Please try to think a little outside the box.
legendary
Activity: 1470
Merit: 1004
February 09, 2015, 09:47:34 PM
^ Gavin is actually proposing increasing the limit to ~16.8 MB, and then having it increase by 1.4X every year, for 20 years.

He picked this number after running experiments to see how a full node would function with 20 MB blocks, and seeing that it could easily handle it.

Good. Create Bitcoin 2. It will be fun :)
hero member
Activity: 772
Merit: 501
February 09, 2015, 09:45:58 PM
^ Gavin is actually proposing increasing the limit to ~16.8 MB, and then having it increase by 1.4X every year, for 20 years.

He picked this number after running experiments to see how a full node would function with 20 MB blocks, and seeing that it could easily handle it.
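For scale, the compounding in that schedule works out as follows; a sketch, using the ~16.8 MB start and 1.4x annual rate stated above (the 20 MB comparison in the final comment refers to the figure cited earlier in the thread):

Code:
# Compound the proposed limit: ~16.8 MB start, growing 1.4x per year.
limit_mb = 16.8
for year in range(0, 21, 5):
    print(f"year {year:2d}: {limit_mb * 1.4 ** year:,.0f} MB")
# 16.8 * 1.4**20 is ~14,000 MB; a 20 MB start gives 20 * 1.4**20,
# ~16,700 MB, i.e. the "somewhere beyond 16.7GB" figure cited earlier.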
hero member
Activity: 687
Merit: 500
February 09, 2015, 07:52:14 PM
Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.

Yes, I agree with this. This seems a bit rushed. And why 20MB? Seems like an odd number. Anyone willing to explain?

hero member
Activity: 772
Merit: 501
February 09, 2015, 07:09:33 PM
Yes, we have effective spam/bloat countermeasures.  That's why at present most blocks aren't full.

And Bitcoin certainly sees sudden spurts in adoption.  Thus my concern with the ultimate form of bloat: widespread actual usage.  :o

We need to understand how the system reacts to heavy actual usage.

Will anything break, or rather, what will degrade/break first?  How will the markets react?  What can be optimized and/or substituted given proper incentives such as the removal of free riders and their subsidized blockchain space?

It's nice we agree on a geometric increase, but wouldn't it be great to have actual data on which to better determine the optimum initial increase and eventual rate of increase?

Before changing the max_blocksize constant, we should know what happens to the BTC function at (and over) the 100% limit of the tx/block variable.

We do have some data on what happens when blocks hit a limit, as they've hit a soft limit at 250 KB before. It wasn't an extended experiment, but it did give us an idea of what happens at the outset. What happened is that people had to wait a few blocks to get a confirmation, which was an inconvenience.
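The dynamics at a limit are easy to see in a toy queueing sketch (entirely my own illustration; the capacity and arrival numbers are assumptions, not measurements from the 250 KB period):

Code:
# Toy model: transactions arrive at a steady rate; each block confirms
# at most CAPACITY of them. Demand just 10% over capacity makes the
# backlog grow without bound, so confirmation waits keep lengthening.
CAPACITY = 1000   # tx per block (~250 KB of ~250-byte transactions)
ARRIVALS = 1100   # tx arriving per 10-minute block interval

backlog = 0
for block in range(1, 145):          # one simulated day of blocks
    backlog += ARRIVALS
    backlog -= min(backlog, CAPACITY)
    if block % 36 == 0:
        print(f"block {block}: {backlog} tx waiting")

With demand below capacity the backlog stays at zero; the qualitative change right at the limit is the whole point.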

Doing a longer experiment on block scarcity would provide even more data, and create incentives to come up with off-chain solutions. However, I don't think it's wise to run such an experiment, because it comes with some pretty big risks. Here's why I don't think now is the time for a block scarcity experiment:

1. It could turn a lot of people away from Bitcoin as they have difficulty getting transactions to confirm.

2. Off-chain solutions might not get invented. We actually don't know if there are adequate off-chain solutions for scalability. We hope there are, but it's possible they are all inadequate. If that's the case, then during the entire block scarcity experiment, users would be left without a way to transact in bitcoin.

3. Due to 1 and 2, it could hurt Bitcoin's momentum. Deploying a technology and achieving mass adoption is about maintaining momentum. Eventually a technology will fade from the public consciousness, so it's important the opportunity to achieve mass adoption is seized. Further, governments are creating new legislation all the time to put Bitcoin under greater restrictions. Mass adoption is the best defense against new legislation, as we saw with the internet.

4. We have plenty of opportunities to conduct scarcity experiments when we actually need to limit block sizes. There will come a time when Bitcoin blocks simply cannot get larger without harming decentralization. At that point, blocks will have to be artificially limited in size, and then we will get all of that experimental data you want on how a system behaves under scarcity.

In conclusion: 1 MB, 4 MB, or 5 MB is absolutely not the right point at which to conduct a block scarcity experiment. Additionally, we don't need a hard limit to run a scarcity experiment. A hard limit is a crude and dangerous way to limit block space, because if we find the experiment is harmful, a hard limit is difficult to quickly remove. Experiments with block scarcity should be done with soft limits, not with unchangeable protocol rules.
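For context on that last distinction: miner policy caps of this kind already exist as node settings, separate from the consensus constant. A sketch of the contrast, assuming the option name and values from Bitcoin Core of that era (treat the specifics as from memory, not gospel):

Code:
# bitcoin.conf -- a soft limit: policy for blocks THIS node mines,
# changeable by each miner at any time, no fork required
blockmaxsize=250000

# The hard limit, by contrast, is a consensus rule compiled into every
# node (MAX_BLOCK_SIZE = 1000000 in the source); changing it requires
# the hard fork this whole thread is about.

Raising or lowering a soft limit like this is exactly the kind of reversible scarcity experiment the post argues for.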
legendary
Activity: 924
Merit: 1132
February 09, 2015, 04:33:26 PM
Look, I'm ready to vote with my feet. 

Where can I download the source (or patch, or signed executable) for a bitcoind / bitcoin-qt / bitcoin-cli that has the 20MB block limit + annual growth?  I will start running it, immediately.

After all, it's not going to reject any blocks that current versions accept.  When a larger block comes along, I don't want to be one of the nodes that rejects it.

legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
February 09, 2015, 04:23:25 PM
I just think that rule should be very well thought out.

It's not a rule that's needed; like tvbcof said, a line needs to be drawn.

You're more than welcome to draw whatever lines you want.  Stay on your limited chain on your side of the line if you like.  Don't expect everyone else to stick around, though.  Everyone who supports the fork is going to leave you behind.  We'll see how long your line stays intact before you come crawling back.  :P
legendary
Activity: 1284
Merit: 1001
February 09, 2015, 04:15:25 PM
Bitcoin has survived for 6 years without the block size limit effectively constraining the block size, so why will this established pattern suddenly reverse if the limit is maintained high enough to not constrain the block size (or removed entirely)?

Do you think the hashing power that is currently protecting the bitcoin chain would be the same if the 25 BTC subsidy wasn't there? Because eventually it won't be. I find it hard to believe that you don't know this.

That would certainly be the case if multiplication didn't exist.
Fee revenue is (number of transactions) * (transaction fee).
Because of the magic of multiplication, having high transaction fees isn't the only way to have high fee revenue.
As a matter of fact, the producers of every product and service in the economy (except a few minor corner cases) maximize their revenue by increasing volume, not price.

The recent oil price development is a good example of how wrong your simple-minded economic model can be. I advise you to look up the term price elasticity.
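The two positions can be made concrete with a toy constant-elasticity demand curve, Q(p) = A * p^(-eps); whether cutting fees raises total fee revenue then depends entirely on eps. A sketch (the curve and all numbers are my own illustrative assumptions):

Code:
# Fee revenue = price * quantity, with demand Q(p) = A * p**(-eps).
def fee_revenue(price, A=1_000_000, eps=1.5):
    return price * (A * price ** -eps)

for eps in (0.5, 1.5):   # inelastic vs elastic demand
    print(f"eps={eps}: 1c fee -> {fee_revenue(0.01, eps=eps):,.0f}, "
          f"10c fee -> {fee_revenue(0.10, eps=eps):,.0f}")

With inelastic demand (eps = 0.5) the higher fee earns more revenue; with elastic demand (eps = 1.5) the lower fee does. Multiplication alone settles nothing; the elasticity decides.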
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 09, 2015, 02:35:25 PM
If this limit increases faster than the interest in spending money on transactions, then it will kill Bitcoin.
Bitcoin has survived for 6 years without the block size limit effectively constraining the block size, so why will this established pattern suddenly reverse if the limit is maintained high enough to not constrain the block size (or removed entirely)?

I wouldn't go so far as to claim that it WILL kill Bitcoin, but a too-high max block size does permit bad behaviors, which can have pernicious effects.  There really isn't a reason for miners (or pools) to have the ability to include 60x the normal number of transactions, creating huge blocks that they can propagate more quickly than smaller nodes can download and verify.  This gives yet another edge to the largest mining pools, and it is unnecessary.

Gavin's solution is a simple one, which is a good thing, but it is obviously not the BEST solution.

If we implement anything whilst IN a crisis, we will end up with a sub-optimal solution, because folks will do whatever it takes to get out of the crisis.

We are NOT in a crisis now: our average block size is well less than 50% of the max, there isn't any massive queueing, and things are fine today, so let's do it right rather than just what is expedient?

Let's implement a maximum block size that can adjust to what is needed rather than just guessing at what might be needed?  Let's build a protocol for the next 100 years, not the next 100 weeks?  I think we can do better, and we currently have the time to do so.
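One shape such an adaptive rule could take, purely as a sketch of the idea (the window, multiple, and floor below are my assumptions, not a proposal from this thread): cap the next block at some multiple of the median size of recent blocks, with a floor, so the limit tracks demonstrated demand instead of a guessed growth curve.

Code:
# Sketch of a demand-tracking cap: the next block may be at most
# `multiple` times the median size of the last `window` blocks,
# and never less than `floor`.
def adaptive_max_block_size(recent_sizes, window=2016, multiple=2,
                            floor=1_000_000):
    sizes = sorted(recent_sizes[-window:])
    median = sizes[len(sizes) // 2]
    return max(multiple * median, floor)

# Quiet network: ~400 KB blocks keep the cap at the 1 MB floor.
print(adaptive_max_block_size([400_000] * 2016))   # -> 1000000
# Sustained growth: ~900 KB blocks lift the cap to 1.8 MB.
print(adaptive_max_block_size([900_000] * 2016))   # -> 1800000

Miners gaming such a rule (padding their own blocks to ratchet the cap upward) is the obvious attack to weigh against it.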
newbie
Activity: 14
Merit: 0
February 09, 2015, 02:23:43 PM

tl;dr: MPEx worked. That is where the money is.
MP et al. saved his investors and some of us eejits from Pirate, Gox, Stamp, Ethereum, GAW, the lot of them...
Then he put his foot down again, and said Gavin, bitcoin core, Vessenes, the phoundation all that crap too is a scam, a United fucking Soviet fucking States fucking Government scam.
And it shall not pass.
Whatever 'we the community' think or want, he can outspend us, and he will if he needs to.
So let's all stop twittering as if our say mattered; get our heads down and study the code that's so royally fucked up by USSG; and keep our axes sharp for the real life and death wars to come.
Login under your normal username next time and perhaps we'll give you the time of day.  ::)

I think you just did.
Anyhoo, why would I make it easy for your friend the USG? They have big(gish) guns.