Topic: A Scalability Roadmap

hero member
Activity: 588
Merit: 500
October 13, 2014, 06:12:53 AM
#72
just do it...
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
October 12, 2014, 02:04:53 PM
#71
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
It doesn't need to be fixed; it needs to be offered by suppliers. If there is a demand, they will supply it. They are not supplying it because vendors and consumers are not aware of the issue. Education is what's needed. If they can be shown the profitability, then they will fill the niche.
legendary
Activity: 1246
Merit: 1011
October 12, 2014, 01:01:13 PM
#70
I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think that its headline is a bit misleading. Essentially what Peter Todd is saying is that a large blocksize limit in general encourages miners to drive out low-bandwidth competition. He is actually opposed to Gavin's plan as well:

It was simply because many heavy-hitters were expressing opposing views that I found the thread informative.

According to Peter Todd it is essential that miners do not control the blocksize limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be (substantially) influenceable by miners, but should instead be driven by the transacting parties. So if it were possible to determine the dynamic block size limit from the number of transactions multiplied by a fixed "reasonably large" size constant, plus a safety margin, the problem would go away.

Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Peter linked an idea from Gavin of determining an appropriate block size from the time taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

I'd rather say: "Only raise the block size limit if required, and then only by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.
legendary
Activity: 1400
Merit: 1013
October 12, 2014, 11:27:55 AM
#69
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
They can only do this as long as network bandwidth is donated and the consumers of it do not pay the suppliers.

Fix that problem and we'll never need to have this debate again.
legendary
Activity: 1153
Merit: 1012
October 12, 2014, 11:03:48 AM
#68
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

I've read the initial post of that thread several times and I think that its headline is a bit misleading. Essentially what Peter Todd is saying is that a large blocksize limit in general encourages miners to drive out low-bandwidth competition. He is actually opposed to Gavin's plan as well:

I primarily want to keep the limit fixed so we don't have a perverse incentive. Ensuring that everyone can audit the network properly is secondary.

If there was consensus to, say, raise the limit to 100MiB that's something I could be convinced of. But only if raising the limit is not something that happens automatically under miner control, nor if the limit is going to just be raised year after year.

According to Peter Todd it is essential that miners do not control the blocksize limit. He argues based on the assumption of a rolling-average mechanism that takes its data from previously observed block sizes. But that's not an argument against a dynamic block size limit (increase/decrease) in general. The point is that the dynamic block size limit should not be (substantially) influenceable by miners, but should instead be driven by the transacting parties. So if it were possible to determine the dynamic block size limit from the number of transactions multiplied by a fixed "reasonably large" size constant, plus a safety margin, the problem would go away.
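To make that concrete, here is a minimal sketch of the kind of rule I mean (the look-back window, per-transaction size constant, and safety margin are placeholder assumptions, not a worked-out proposal):

```python
# Hypothetical sketch of a transaction-count-driven limit; all constants are
# placeholder assumptions, not values from any actual proposal.

TX_SIZE_CONSTANT = 500      # assumed "reasonably large" byte allowance per transaction
SAFETY_MARGIN = 1.2         # assumed 20% headroom on top of observed demand
LOOKBACK_BLOCKS = 2016      # assumed look-back window, borrowed from difficulty retargeting
FLOOR_LIMIT = 1_000_000     # never drop below the current 1 MB limit

def dynamic_block_size_limit(tx_counts_per_block):
    """Derive the next limit from the number of transactions observed recently,
    rather than from the raw byte size of recent blocks (which miners could
    inflate directly with padding)."""
    recent = tx_counts_per_block[-LOOKBACK_BLOCKS:]
    avg_tx_per_block = sum(recent) / len(recent)
    limit = int(avg_tx_per_block * TX_SIZE_CONSTANT * SAFETY_MARGIN)
    return max(limit, FLOOR_LIMIT)
```

Of course, a miner willing to mine its own fee-paying transactions could still inflate the transaction count, so this only narrows the gaming problem rather than eliminating it.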


Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".

I'd rather say: "Only raise the block size limit if required, and then only by the minimum amount necessary."
hero member
Activity: 1008
Merit: 531
October 11, 2014, 10:50:13 PM
#67
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Powerful entities would game the system, turning it into a proof-of-bandwidth system, which would be a bad thing.
hero member
Activity: 1008
Merit: 531
October 11, 2014, 10:47:25 PM
#66
Just exposing some ideas:

Gavin's plan appears to me to be VERY conservative (maybe too much).

To be able to process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks should go up ~1,000x at least to accommodate that many transactions.
And we don't just want to take on VISA's load; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take about 20 years to overtake VISA alone, never mind the unbanked and DACs.

You're not thinking about safety.  Yes, it would be nice for bitcoin to be able to handle 2,000 times as many transactions as it can now, but that is not as important as keeping bitcoin decentralized.  Let's keep in mind why bitcoin was created: to create a digital gold standard so that people could protect their assets from central banks.  If bitcoin also becomes a ubiquitous payment system that would be great, but not if it comes at the expense of decentralization.
legendary
Activity: 1246
Merit: 1011
October 11, 2014, 05:24:21 PM
#65
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I'm not aware of more recent discussions but I found the first three pages of this Feb 2013 thread good food for thought.

Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.

There's risk in everything and nothing is absolute.  This attitude would yield the obvious answer: "Don't ever raise the block limit at all".
legendary
Activity: 1153
Merit: 1012
October 11, 2014, 03:51:19 PM
#64
I'm not sure if I missed something, but why isn't the block size limit defined dynamically based on previous usage (plus some safety margin)?

Is it impossible to implement a self-regulating block size limit mechanism similar to the way difficulty is adjusted, which allows the block size limit to be increased and decreased based on "demand"?

I imagine that a dynamic mechanism would be much better at encouraging responsible (resource preserving) network use.
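To make the analogy with difficulty adjustment concrete, something like the following retargeting rule is what I have in mind (a rough sketch only; the window length, target fullness, and clamping factor are illustrative assumptions):

```python
# Illustrative sketch of a difficulty-style block size limit retarget.
# All constants are assumptions chosen for the example.

RETARGET_WINDOW = 2016        # blocks per adjustment period, as with difficulty
TARGET_FULLNESS = 0.5         # aim for blocks that are, on average, half full
MAX_ADJUSTMENT = 2.0          # clamp each step to at most 2x up or down
MIN_LIMIT = 1_000_000         # 1 MB floor

def retarget_block_size_limit(current_limit, recent_block_sizes):
    """Raise the limit when recent blocks are persistently fuller than the
    target, lower it when they are persistently emptier."""
    window = recent_block_sizes[-RETARGET_WINDOW:]
    avg_fullness = (sum(window) / len(window)) / current_limit
    factor = avg_fullness / TARGET_FULLNESS
    factor = min(max(factor, 1 / MAX_ADJUSTMENT), MAX_ADJUSTMENT)
    return max(int(current_limit * factor), MIN_LIMIT)
```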

I'm very sceptical of a fixed-percentage increase, because there is zero assurance that Moore's "law" will remain true in the future; as you know, past performance is no indicator of future results, and we're quickly approaching the atomic level in storage solutions, for example. Decentralization should be preserved by all means possible, because it is the very core that ensures the safety and thereby the value of Bitcoin.
legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
October 11, 2014, 02:28:58 PM
#63
...Hurray, we just reinvented the SWIFT or ACH systems.

SWIFT doesn't work like this at all...

I think what he meant was that it would be like SWIFT in that it would mostly be for large international transfers. A fork like this will have to happen sooner or later.


Sooner, please. Let's do the major changes now and the rest on top of Bitcoin in other layers. There will (hopefully) be no more changes once we reach a market cap of 100 or 500 billion.
member
Activity: 129
Merit: 14
October 11, 2014, 01:25:11 PM
#62
Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.

+1

Very good point hello_good_sir.  I was trying to say this but you put it in a far more articulate way. I think we may need some artificial inefficiency at some point.



If supply is not constrained, transaction fees fall to the marginal cost, mining profit falls and then miners exit and the difficulty falls.  The remaining miners can then find blocks more easily, but they don’t necessarily get compensated more for this, because the fees would still be low.
If there are fewer miners competing for the same amount of transaction fees, then each miner's revenue has increased. The process you describe will continue until the oversupply of miners is corrected and equilibrium is restored.

Yes, but a key factor to consider is: at what equilibrium?  Will this be at a high enough difficulty, and if not, do we need to manipulate the market?  Users pay transaction fees for their transactions to be included in a block; users are not directly paying for network security or network consensus. After the block reward falls, the incentive for network consensus can be considered an indirect consequence of users paying for their transactions to be included in blocks, and therefore a pure unrestricted competitive market may not be an effective mechanism for determining transaction fees.  Getting a transaction included in a block and the network reaching consensus about the longest chain may be two slightly different things.  There is a mismatch here which I think some people miss.  This could be somewhat analogous to the classic tragedy of the commons problem.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
October 11, 2014, 05:26:36 AM
#61
Has the dust attack threat been abated? Block size was an issue at one time.
sr. member
Activity: 420
Merit: 250
October 11, 2014, 05:21:31 AM
#60
Great idea, Gav. I know it is ridiculously hard to choose appropriate protocol changes when there are MANY different people thinking differently. I like that you guys took your time, discussed it with each other, and came to the conclusion that the pros outweigh the cons.

I think BTC is growing, and this will encourage innovation and more on-chain transactions and take BTC to the next level.
legendary
Activity: 1400
Merit: 1013
October 10, 2014, 09:46:11 AM
#59
Right.  The majority of the miners, working together, will always have the ability to set a lower block limit.

The block limit embedded in the reference client is there to prevent massive blocks that cannot be handled in a distributed way.  Large enough blocks mean that miners who don't have a high speed connection can't keep up.

As long as the average VPS can handle the block size, then centralisation risk is low.

Miners might still decide to keep the block size small to push up fees.  There is an inherent cost for larger blocks, since they take longer to distribute (though that depends on low latency optimisations).
Limits are still the wrong word to use.

Each miner will decide which transactions to include in their block. The point at which they stop adding transactions to a block will depend on their own best guess of where it is no longer profitable to do so.

This is not a limit - it's an equilibrium.

The "centralization risk" everybody keeps talking about is an artifact of the the P2P network lacking price discovery and operating entirely via donated bandwidth. Fix that problem and we'd never need to have these debates ever again.
legendary
Activity: 1246
Merit: 1011
October 10, 2014, 09:45:17 AM
#58
If 1 MB blocks give us, say, 3 transactions per second, then 20 years of "double every 2 years" growth starting from 20 MB would leave us with about 60 million transactions per second.  That's about 25 transactions per hour per human (assuming a world population of 8.5 billion in 20 years' time).

This sounds a bit excessive to me but then again I've not thought seriously about how such a volume of transactions could be utilised.  https://en.bitcoin.it/wiki/Scalability doesn't speculate beyond a few hundred thousand transactions per second.  I'd certainly appreciate a link if a discussion on the utility of millions of transactions per second exists.

I like this line of thinking. What TPS are we shooting for and when? That's what will determine what size blocks we need and how to grow to that target.

Simple growth rates like "50% increase per year" are guaranteed to end up with blocks that are too large, which will require another hard fork. Hard forks are bad, mkay?

Apologies.  I miscalculated; the figure should be 60,000 transactions per second, not 60 million (so about 4 transactions per human per week).

More meaningfully, the maximum block size would rise to a final value of about 20 GB.
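For reference, the corrected arithmetic, assuming ~3 transactions per second per MB of block space and a population of 8.5 billion:

```python
# Back-of-the-envelope check of the corrected figures.
tps_per_mb = 3                                # assumed throughput per MB of block space
start_mb = 20                                 # initial jump to 20 MB
doublings = 20 / 2                            # "double every 2 years" over 20 years
final_mb = start_mb * 2 ** doublings          # 20 * 1024 = 20,480 MB, i.e. ~20 GB
final_tps = final_mb * tps_per_mb             # ~61,000 tx/s, i.e. roughly 60,000
per_week = final_tps * 3600 * 24 * 7 / 8.5e9  # ~4.4 transactions per human per week
print(final_mb, final_tps, per_week)
```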
legendary
Activity: 1232
Merit: 1094
October 10, 2014, 09:19:23 AM
#57
A higher limit does not imply suddenly larger blocks, just the possibility for larger blocks to be created when the need exists and the price is right.

Right.  The majority of the miners, working together, will always have the ability to set a lower block limit.

The block limit embedded in the reference client is there to prevent massive blocks that cannot be handled in a distributed way.  Large enough blocks mean that miners who don't have a high speed connection can't keep up.

As long as the average VPS can handle the block size, then centralisation risk is low.

Miners might still decide to keep the block size small to push up fees.  There is an inherent cost for larger blocks, since they take longer to distribute (though that depends on low latency optimisations).
legendary
Activity: 1400
Merit: 1013
October 10, 2014, 08:53:41 AM
#56
Can people stop talking about increasing the block size?

It's the block size limit that needs to increase or be abolished.

A higher limit does not imply suddenly larger blocks, just the possibility for larger blocks to be created when the need exists and the price is right.
legendary
Activity: 1246
Merit: 1011
October 10, 2014, 08:51:05 AM
#55
Gavin's plan appears to me to be VERY conservative (maybe too much).

To be able to process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks should go up ~1,000x at least to accommodate that many transactions.
And we don't just want to take on VISA's load; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take about 20 years to overtake VISA alone, never mind the unbanked and DACs.

Thank you.  You made me aware of a calculation error I made earlier.

I believe the proposal involves an initial jump in block size followed by temporary exponential growth with fixed parameters.  If the blocksize were increased to, say, 20 MB and then grown at 50% per year, we'd be up to 2 GB blocks in 11-12 years.  At 40% per year (doubling every 2 years) it would take 13-14 years.
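The arithmetic behind those figures, for anyone who wants to check it (plain compound growth, nothing more):

```python
import math

# Years needed to grow a 20 MB limit to 2 GB at a fixed annual growth rate.
growth_needed = 2000 / 20                                        # 2 GB / 20 MB = 100x
years_at_50_percent = math.log(growth_needed) / math.log(1.5)    # ~11.4 years
years_at_40_percent = math.log(growth_needed) / math.log(1.4)    # ~13.7 years
print(years_at_50_percent, years_at_40_percent)
```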
sr. member
Activity: 453
Merit: 254
October 10, 2014, 08:08:32 AM
#54
Just exposing some ideas:

Gavin's plan appears to me to be VERY conservative (maybe too much).

To be able to process the same number of transactions as VISA, Bitcoin would need to grow ~2,000x.
The size of blocks should go up ~1,000x at least to accommodate that many transactions.
And we don't just want to take on VISA's load; we also want to offer a service to the currently unbanked (whether humans or DACs).
If the block size increases 50% every year, it will take about 20 years to overtake VISA alone, never mind the unbanked and DACs.

With a 100% increase every year, it would take 11 years to take over VISA. This will bring us to 2025, when the inflation rate will be around 1% and the coins mined will be 3.125 BTC per block.

If we suppose the income of the miners stays the same as now once the block reward becomes irrelevant and Bitcoin handles the same number of transactions as VISA, we need about $0.01 of fees per transaction (roughly 200M transactions x $0.01 = $2M/day, like today).
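A quick check of those numbers (taking ~2,000x as the gap to VISA and assuming roughly 200 million transactions per day at VISA scale, which is only a ballpark figure):

```python
import math

# Years of doubling (100% growth per year) needed to close a ~2,000x gap.
years_to_visa = math.log(2000) / math.log(2)   # ~11 years

# Daily miner income if VISA-scale volume paid $0.01 per transaction.
tx_per_day = 200_000_000                       # assumed ballpark for VISA-level volume
fee_usd = 0.01
daily_income = tx_per_day * fee_usd            # $2,000,000 per day
print(years_to_visa, daily_income)
```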

We must keep in mind that the low cost of Bitcoin transactions will cause greater use of them, not lower use.
Today it is uneconomic to bill people daily because VISA costs 20 cents + 2% per transaction, but tomorrow, with a deployed BTC infrastructure, you could have people paying daily for a lot of things they currently pay for weekly or monthly.

We must also keep in mind that BTC transactions will be what keeps the network working. We could compensate with larger fees (10 cents per transaction?), but this would weed out marginal transactions and push them into some other coin.

I would suggest some schedule to go from 1 MB to 32 MB within the next 5 years at most (better within two years), and a plan to increase exponentially from there up to 1 GB blocks (at least).

Because as merchants and adopters reach a critical mass, there will be an explosion of transactions.
Today, the great majority of BTC holders have little chance to spend their bitcoins or be paid in bitcoins in their daily lives.
But as BTC becomes popular and adopted, they will start to use it more frequently and the number of transactions will explode. We have seen nothing yet; growth was fairly linear over the last two years.


The big problem with Gavin's plan is that exponential growth is tricky to manage.
Too slow, and it grows too slowly initially to keep up with demand.
Too fast, and the available resources will not keep up.
legendary
Activity: 2128
Merit: 1073
October 10, 2014, 01:47:03 AM
#53
Limiting block size creates an inefficiency in the bitcoin system.  Inefficiency = profit.  This is a basic law of economics, though it is usually phrased in such a way as to justify profits by pointing out that they eliminate inefficiencies.  I am taking the other position, that if we want mining to be profitable then there needs to be some artificial inefficiency in the system, to support marginal producers.  Of course that profit will attract more hashing power thus reducing/eliminating the profit, but at a higher equilibrium.  However, I am not too worried about this aspect of large block sizes.  It is a fairly minor problem and one that is a century away.
This is a fairly common misconception: that the only way to pay for the space in a mined Bitcoin block is with fees denominated in bitcoins. But this is not true when a miner is integrated with an exchange, because an exchange can shave commissions on both sides of the transactions.

Imagine for a moment that Bitfury branches out into Consolidated Furies and spawns Hryvnafury, Roublefury, Eurofury, DollarFury, etc.; all of them being exchanges. It can then easily outcompete pure Bitcoin miners because it can directly funnel fiat commissions into electric utility bills without having to go twice through the fiat<->coin exchange.

Edit: In fact, opportunities for integration are not limited to mining + coin exchange. Imagine, for example, Marijuanafury, which does two things demanding lots of electricity: Bitcoin mining and indoor marijuana growing. If only somebody could come up with a new optical ASIC technology that is pumped with energy via photons at the same wavelengths that stimulate photosynthesis...