
Topic: A Scalability Roadmap - page 5. (Read 14938 times)

hero member
Activity: 1008
Merit: 531
October 08, 2014, 11:16:27 PM
#33
Let's keep in mind some important factors:

1) It will be impossible to limit the block size in the future if Bitcoin is very successful.  Governments and the ultra-rich will control all of the nodes that are powerful enough to mine blocks.  They will also (due to their power) have indirect influence over the other nodes.
 
If the 10,000 Bitcoin enthusiasts decide that the block size is too big, it won't matter; they don't control any of the 10 nodes capable of mining blocks.  If the enthusiasts decide to start rejecting blocks that are too big, they will find themselves on a fork that no one cares about.  We need to assume that the miners of the future will be hostile to the principle of decentralization.

If bandwidth is the bottleneck, the big players can simply take action to slow the rate of bandwidth increase.  No one will notice if bandwidth increases by 49% per year instead of 50%.  One by one, nodes will drop out until there are only a few left, and then the crisis will be resolved by implementing a Central Bitcoin Bank to determine the block size.

2) It will be possible to increase the block size in the future if it has to be done.  The big players will press for it, and the small players will be convinced that they need to go along for the good of the system.

3) Trends don't continue forever, and even if they do, it isn't always relevant.  Right now bandwidth is the foreseeable bottleneck.  Perhaps the growth of bandwidth will slow.  Perhaps it will continue, but something else (that we aren't thinking of right now) will become the bottleneck.

So, putting these three principles together, here is what I see:

Increase the block size by 2X% per year, where X is the block reward.  So we'd have a couple more years at 50%, then four at 25%, then four at 12.5%, and so on.  This is still an astounding growth rate.
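
A minimal sketch of that rule, assuming the late-2014 starting point of a 1 MB limit and a 25 BTC reward, with the subsidy halving every four years:

Code:
size_mb = 1.0      # assumed starting limit (MB)
reward = 25.0      # block subsidy in BTC as of 2014
for year in range(2015, 2027):
    growth = 2 * reward / 100            # "2X% per year, where X is the block reward"
    print(f"{year}: limit ~{size_mb:8.1f} MB, growing {growth:.1%} this year")
    size_mb *= 1 + growth
    if (year - 2015) % 4 == 3:           # assume the subsidy halves every 4th year
        reward /= 2

Over 12 years that works out to roughly a 20x increase, versus about 130x at a flat 50% per year, so it is gentler but still exponential.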
legendary
Activity: 3920
Merit: 2349
Eadem mutata resurgo
October 08, 2014, 07:34:27 PM
#32
Put another way:  I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future, but adjusting the Y-intercept to put us back on track, and then making a good guess about the future would be better.

+1

A correction now, and then less aggressive growth going forward, could be a better idea.

... or begin with 50% growth, then halve the blocksize growth rate every 4 years?
legendary
Activity: 1246
Merit: 1011
October 08, 2014, 07:05:23 PM
#31
I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?

40%/year = 96%/(2 years).  I hope that gets rounded to "double once every 2 years (105 000 blocks)".

Also, do you propose this growth be open-ended or terminate at some block-size commensurate with the transaction volume of an existing, mature payment system such as Visa?  Given an open-ended proposal I'd echo Theymos' concern.
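
For the record, the arithmetic behind that rounding (just a throwaway check, not part of any proposal):

Code:
# 40%/year compounds to ~96% over two years, i.e. roughly a doubling,
# and two years of 10-minute blocks is about 105,000 blocks.
print(f"{1.40 ** 2 - 1:.0%} growth over 2 years")        # -> 96%
print(f"{2 * 365.25 * 24 * 6:,.0f} blocks in 2 years")   # -> 105,192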
hero member
Activity: 714
Merit: 500
Martijn Meijering
October 08, 2014, 06:15:32 PM
#30
Tree chains with a fixed block size could also work. Has anyone looked at the pros and cons compared to what's being proposed here?
member
Activity: 98
Merit: 10
October 08, 2014, 06:10:42 PM
#29
I agree that we need to increase the block size to prevent transactions from getting stuck without a confirmation, but I strongly disagree with the rate at which it would be increased. I would support any of the following:

  • 1 MB increase per year. - Simple to implement and doesn't lead to exponential growth.
  • 10% increase per year. - Exponential, but with a much smaller constant than has been discussed in this post.
  • Based on the fullness of last year's blocks (max 20% increase). - Exponential, but more tightly tied to actual usage. Allows for block size decreases. Miners essentially get to vote on blocksize (sketched at the end of this post).

Basically, my hope is that it gets easier to run a full node over time. The total number of transactions that the world's population makes is not increasing exponentially, so (IMO) the blocksize shouldn't either.
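
A minimal sketch of the third option above. The "aim for twice the observed usage" target and the yearly (~52,500-block) adjustment period are my assumptions, not part of the proposal:

Code:
def next_limit(current_limit_bytes, block_sizes):
    # Hypothetical rule: retarget the limit once per ~year from observed usage,
    # capped at +20% per period; decreases are allowed.
    avg_fullness = sum(block_sizes) / (len(block_sizes) * current_limit_bytes)
    target = 2 * avg_fullness * current_limit_bytes   # aim for blocks ~50% full
    return round(min(target, current_limit_bytes * 1.20))

# Example: last year's blocks averaged ~70% full, so the limit grows by the 20% cap.
print(next_limit(1_000_000, [700_000] * 52_500))      # -> 1200000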
member
Activity: 129
Merit: 14
October 08, 2014, 03:34:32 PM
#28
Put another way:  I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future, but adjusting the Y-intercept to put us back on track, and then making a good guess about the future would be better.

+1

A correction now, and then less aggressive growth going forward, could be a better idea.
member
Activity: 129
Merit: 14
October 08, 2014, 03:31:25 PM
#27
Would 40% initial size and growth make you support the proposal?

I don’t think the difference between 50% and 40% matters that much; the real issue may be whether there should be permanent exponential growth in the blocksize limit or some other model.  Nothing grows exponentially forever; consumer bandwidth speeds won’t, and neither will demand for Bitcoin transactions, so why require permanent exponential growth?  Why not consider a model where the blocksize grows by a fixed percentage each year, but the rate of increase falls, by 50% for example, whenever the block reward drops?

Consider the example below:

Year    Blocksize limit (MB)    Growth rate
2015         1.0                   100%
2016         2.0                   100%
2017         3.0                    50%
2018         4.5                    50%
2019         6.8                    50%
2020        10.1                    50%
2021        12.7                    25%



Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")


Theymos, I agree that the maximum blocksize is also an economic/game-theory problem; that’s why I drew those supply and demand curves, to try to analyse this using an economic framework.  If the blocksize limit increases and transaction fees fall, this can increase the velocity of money, boost inflation and stimulate the economy.  In contrast, if the blocksize limit falls, the velocity of money can fall, causing deflation and an economic slowdown.

I agree with Gavin that having a “Blocksize policy committee” to manage the economy is not very consistent with Bitcoin’s values.  There should not be a problem here, as all the economic variables (volume of transactions, transaction fee data) are in the blockchain, and therefore economic policy could be automated.  Perhaps a simple fixed 50% increase per year is the best solution, or maybe a more complicated economic formula is required.  However, consumer bandwidth speeds obviously cannot be obtained from blockchain data, so it could be important to act with caution and keep the blocksize growth somewhat restricted.  Permanent exponential growth should be avoided for this reason.
legendary
Activity: 1176
Merit: 1020
October 08, 2014, 03:24:16 PM
#26
I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?

I made a chart showing how some different slopes and y-intercepts compare over time.  As you might be able to guess from the chart, I am partial to the idea of jump-starting the process with an initial increase of a few MB, then having a slightly more conservative growth rate going forward.  We know 10 MB blocks (70 TPS) can be supported by many of today's home internet connections.  I would argue it is quite a bit less certain whether 3,000 MB blocks would be realistic in 20 years.  Put another way:  I think we are behind the curve.  Making a steeper curve to compensate could really throw us off in the future, but adjusting the Y-intercept to put us back on track, and then making a good guess about the future, would be better.
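
Not the original chart, but a rough sketch of the comparison being described: a steep 50%/year curve from today's 1 MB versus a one-time jump to a larger starting size followed by slower growth (the 10 MB and 20% figures are illustrative assumptions):

Code:
for years in (0, 5, 10, 15, 20):
    steep = 1.0 * 1.50 ** years     # 50%/year from the current 1 MB limit
    jump  = 10.0 * 1.20 ** years    # one-time jump to 10 MB, then 20%/year
    print(f"year {years:2d}: steep {steep:7.0f} MB   jump-start {jump:7.0f} MB")

The steep curve lands a bit above the ~3,000 MB figure mentioned above after 20 years.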



legendary
Activity: 1176
Merit: 1020
October 08, 2014, 02:19:06 PM
#25
Is there a relationship between hashrate and bandwidth?  If increasing the blocksize increases bandwidth requirements, would that eat into the bandwidth available for hashing?  For example, if a peta-miner maxes out their bandwidth with hashrate alone, then increasing the blocksize would lower their hashrate.  They would have to buy more bandwidth, if it is available.  It might favor miners living where there is better internet connectivity rather than cheap electricity or a cold climate.  Enlarging the blocksize could thus help decentralize mining.

There is a subtle but important relationship between hash power and bandwidth.  The hash power / bandwidth relationship can best be looked at in terms of orphan blocks.  A miner needs to have adequate bandwidth to make orphan blocks unlikely.  What is adequate?  What is unlikely?  Assuming 10-minute blocks, every second there is roughly a 0.17% (1/600) chance that some miner will discover a block.  If you're a miner and you find a block, you obviously need to broadcast it to the network ASAP.  A 6-second delay in publishing would mean about a 1% chance that someone else finds a block in the meantime.

Diminishing returns help to keep the bandwidth 'race' to a minimum.  Having enough bandwidth to keep your orphan rate under 1% would probably be good enough.  If your mining operation were big enough that 1% was a substantial sum, perhaps you would buy bandwidth to push the orphan rate down to 0.1%, but at some point it no longer makes financial sense to invest in more bandwidth.  Consider that being able to push out blocks infinitely fast would mean paying for infinite bandwidth, but at best would mean a 1% increase in returns over the miner who took 6 seconds to publish a block.

Also, as miners are not required to include any transactions in the blocks they publish, they have another way to compensate for low-bandwidth bottlenecks: publish empty blocks (this works because, currently, the block reward is so much bigger than transaction fees).  Since orphan blocks divert hash power away from the main chain, they are a security consideration, and if the system design encourages miners to publish empty blocks, that certainly doesn't help the network to thrive either.
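
A back-of-envelope version of the orphan-risk numbers above, assuming block discovery is a Poisson process with a 10-minute average interval:

Code:
import math

def orphan_risk(delay_seconds, block_interval=600):
    # Probability that someone else finds a block during our propagation delay.
    return 1 - math.exp(-delay_seconds / block_interval)

for d in (1, 6, 30, 60):
    print(f"{d:3d} s delay -> ~{orphan_risk(d):.2%} orphan risk")

A 6-second delay comes out at about 1%, matching the figure above.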

The block-propagation-race issue could certainly be impacted by larger block sizes.  Fortunately a protocol is being (has been?) implemented that allows blocks to essentially be pre-constructed in real time, by each miner, as transactions flow across the network.  If that system were perfectly successful, the miner who finds a new block would just need to transmit the block header, and the block header does not scale with transaction volume.  Another way of thinking about it is this:  currently blocks are pushed out all at once, creating massive peak loads on miner bandwidth.  The new design spreads that load over the whole 10 minutes between blocks (or however long).

Miners would always need to have enough bandwidth to comfortably process the entire transaction load on the network.  I currently have a 20 Mbps down / 5 Mbps up cable connection here in Washington State, USA.  It costs me $45 per month.  That is 1,500 MB down and 375 MB up every 10 minutes, which equates to roughly 10,000 TPS down and 2,500 TPS up, respectively.
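
The arithmetic behind those figures, with the usual ~250 bytes per transaction assumed (the connection speeds are the ones quoted above):

Code:
def tps(mbit_per_s, tx_bytes=250):
    # Transactions per second a link can carry, ignoring protocol overhead.
    return mbit_per_s * 1_000_000 / 8 / tx_bytes

print(f"20 Mbps down: {20 * 1e6 / 8 * 600 / 1e6:,.0f} MB per 10 min, ~{tps(20):,.0f} TPS")
print(f" 5 Mbps up:   {5 * 1e6 / 8 * 600 / 1e6:,.0f} MB per 10 min, ~{tps(5):,.0f} TPS")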

newbie
Activity: 42
Merit: 0
October 08, 2014, 02:16:37 PM
#24
This is what 50% growth per annum looks like.  How will miners earn income when the block reward is low and the block size limit is increasing at such an exponential rate that transaction fees will also be low, even if demand grows at, say, 40% per annum?


My Opinion:
Larger blocks mean more transactions.
The fee per transaction stays low, but the total fee revenue per block can grow along with the block size.
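
A toy illustration of that point; the flat fee and the ~250-byte transaction size are made-up figures:

Code:
fee_per_tx = 0.0001                          # BTC, assumed constant
for limit_mb in (1, 8, 32):
    txs = limit_mb * 1_000_000 // 250        # ~250-byte transactions
    print(f"{limit_mb:2d} MB blocks: ~{txs:6,d} txs, ~{txs * fee_per_tx:5.2f} BTC in fees per block")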
legendary
Activity: 3066
Merit: 1147
The revolution will be monetized!
October 08, 2014, 02:13:32 PM
#23
...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers. A fork like this will have to happen sooner or later.
full member
Activity: 153
Merit: 100
October 08, 2014, 02:06:56 PM
#22
Raise the block size too slowly and you discourage transactions and increase their price.  The danger is that Bitcoin becomes irrelevant for anything besides huge transactions, used only by big corporations and too expensive for individuals.  Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all though: it's incredibly clunky, and only works with government-issued currencies.

If anything, it'd be like reinventing the gold standard, but a digital, cryptographically verifiable, lightning fast and relatively cheap to use version. (In many implementations of the gold standard, gold wasn't actually used in day-to-day trade.)

Furthermore, SWIFT is not decentralized, and certain transfers (to specific countries for instance) can technically be censored. Nor is it anonymous or even pseudonymous.

See:

http://www.swift.com/news/press_releases/SWIFT_disconnect_Iranian_banks
http://www.bloomberg.com/news/2014-08-29/u-k-wants-eu-to-block-russia-from-swift-banking-network.html
http://www.spiegel.de/international/world/spiegel-exclusive-nsa-spies-on-international-bank-transactions-a-922276.html

Quote
Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely.

LOL :D

(Perhaps mathematicians, though?)
legendary
Activity: 1652
Merit: 2311
Chief Scientist
October 08, 2014, 12:36:07 PM
#21
Lowering the limit afterward wouldn't be a soft-forking change if the majority of mining power was creating too-large blocks, which seems possible.

When I say "soft-fork" I mean "a majority of miners upgrade and force all the rest of the miners to go along (but merchants and other fully-validating, non-mining nodes do not have to upgrade)."

Note that individual miners (or sub-majority cartels) can unilaterally create smaller blocks containing just higher-fee transactions, if they think it is in their long-term interest to put upward pressure on transaction fees.

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?


Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")
full member
Activity: 179
Merit: 151
-
October 08, 2014, 12:08:34 PM
#20
Ordinary people aren't supposed to run full nodes anyway  :P

This is absurd and false. Bitcoin is deliberately a publicly verifiable system.
newbie
Activity: 6
Merit: 0
October 08, 2014, 10:58:47 AM
#19
50% yearly growth is conservative relative to extrapolated storage and computing cost efficiencies, which have been improving by roughly 100% and 67% annually.

50% is very risky relative to extrapolated bandwidth costs, which have only been improving by about 50% annually.

I would dial it back from 50% to 40%.  Hobbyists will still want to sync and run full nodes over ordinary home connections, and 50% is just too close for comfort.
sr. member
Activity: 252
Merit: 250
Skoupi the Great
October 08, 2014, 10:13:43 AM
#18
Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

Ordinary people aren't supposed to run full nodes anyway  :P
legendary
Activity: 1232
Merit: 1094
October 08, 2014, 10:07:06 AM
#17
What is the plan for handling the 32MB message limit?

Would the 50% per year increase be matched by a way to handle unlimited block sizes?

Blocks would have to be split over multiple messages (or the message limit increased).
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
October 08, 2014, 05:51:55 AM
#16
Is there a relationship between hashrate and bandwidth?  If increasing the blocksize increases bandwidth requirements, would that eat into the bandwidth available for hashing?  For example, if a peta-miner maxes out their bandwidth with hashrate alone, then increasing the blocksize would lower their hashrate.  They would have to buy more bandwidth, if it is available.  It might favor miners living where there is better internet connectivity rather than cheap electricity or a cold climate.  Enlarging the blocksize could thus help decentralize mining.
sr. member
Activity: 362
Merit: 262
October 08, 2014, 05:25:25 AM
#15
The post is a great read on the direction of ongoing development.  Such posts are really helpful for hobbyists such as myself to get an idea of where things are headed.  I'm keen on testing and supporting some of the new stuff; I'd love to test out headers-first sync, for example.

Quote
After 12 years of bandwidth growth that becomes 56 billion transactions per day on my home network connection — enough for every single person in the world to make five or six bitcoin transactions every single day. It is hard to imagine that not being enough; according to the Boston Federal Reserve, the average US consumer makes just over two payments per day.

I have no idea, but the average consumer is not the only one making transactions; there are also businesses.  So the 2-per-day stat is not all that's relevant.  But 5 or 6 transactions per person per day should cover that also :)

Small typo:
Quote
I expect the initial block download problem to be mostly solved in the next relase or three of Bitcoin Core. The next scaling problem that needs to be tackled is the hardcoded 1-megabyte block size limit that means the network can suppor only approximately 7-transactions-per-second.

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
October 08, 2014, 02:27:56 AM
#14
I'm not really qualified to comment on the merits of Gavin's plan, but on the surface it sounds like a thoughtful proposal.  I must say, it is exciting to see solutions to scalability being proposed, and I'm sure it is encouraging to the greater Bitcoin community at large.  Just the fact that a plan is on the table should be a nice jab at the naysayers/skeptics who have been "ringing the alarm bell" on this issue.

Although, in a sense, they are correct that the issue requires action.  I would like to thank Gavin and the other developers for all the great work they've done and continue to do for Bitcoin.

Hats off to you, sir.