Topic: Thoughts on raising the Hard 1Mb block size limit

legendary
Activity: 1988
Merit: 1012
Beyond Imagination
By default Bitcoin will not create blocks larger than 250KB even though it could do so without a hard fork. We have now reached this limit. Transactions are stacking up in the memory pool and are not getting cleared fast enough.

Just reiterating my prediction so we can see how it plays out. We are currently at #2 (a lot of unconfirmed transactions) and starting to see #3. We should see transaction fees increase, and also more and more blocks larger than 250KB as miners uncap the soft limit.

The number of unconfirmed transactions, measured over a 24-hour period, is not larger than average.

A snapshot of the mempool -- like the blockchain.info link above -- does not support the thesis, for two reasons:

  • Never-will-confirm and low-priority transactions bloat the mempool
  • Some miners already sweep far more than 250KB worth of transactions into their blocks

This situation has been ongoing for months now.

legendary
Activity: 1400
Merit: 1013
By default Bitcoin will not create blocks larger than 250KB even though it could do so without a hard fork. We have now reached this limit. Transactions are stacking up in the memory pool and are not getting cleared fast enough.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
So far I haven't heard about any problem
You haven't been paying attention. Use the search feature.

If I have to use the search feature, then it is not a problem, or people can solve it by themselves; otherwise it would pop up in the hottest threads
legendary
Activity: 1400
Merit: 1013
k folks, well judging by the sentiment, a hard fork will probably be happening at some point... should be interesting times.
This was posted on the Bitcoin Foundation blog before you started the thread:

https://bitcoinfoundation.org/blog/?p=135
sr. member
Activity: 451
Merit: 250
Honestly, I'm sick of people ignoring all the optimizations that have already been identified and are just waiting to be coded, as if we're going to scale to 2000 tps without anyone bothering to implement any of them.
+1


What an apt post to mark Gavin's 2000th

k folks, well judging by the sentiment, a hard fork will probably be happening at some point... should be interesting times.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Honestly, I'm sick of people ignoring all the optimizations that have already been identified and are just waiting to be coded, as if we're going to scale to 2000 tps without anyone bothering to implement any of them.
+1


What an apt post to mark Gavin's 2000th
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
This is there to create demand/limit supply, and to allow miners to collect fees for securing transactions in the network.

Where did Satoshi mention coding the 1MB limit so that miners can collect more fees?

does it matter?
My point is that a sensible version of Bitcoin will have a limit on how much data is put into the blockchain, i.e. that scarce resources (disk space and network bandwidth) are allocated in an optimal way. The optimal way will be determined by which transactions miners voluntarily include and which blocks the market considers valid.

Think about this scenario.

A hard fork is created with a self-adjusting block size limit. All users have a balance of original, authentic 1MB-limit bitcoins, and also a balance of (1+X)MB-limit bigblockchaincoins (BBCcoins). Exchanges will be created that allow people to exchange these two types of coins between forks.

As the two forks compete, the BBCcoins will see their blockchain keep growing at an exponential rate, while the bitcoin blockchain is pinned to linear growth at 1MB/10 minutes. People will start realizing that the vast size of the BBC blockchain is leading to lots of centralization (only a few nodes verifying everything, only large mining operations able to compete). Now they are wondering whether the original bitcoin blockchain is preferable to mine/verify due to its smaller data size, the presence of less spam, and higher fees/MB mined.

Inevitably, the original bitcoin miners are less centralized, due to the smaller, spam-free blockchain. The added security makes bitcoins more valuable than the BBCcoins, and people start selling their BBCcoins. The lower fees/MB in mining BBCcoins also make people want to mine bitcoins for the higher fees, and over time BBCcoins become worth a lot less, having a 4TB blockchain, while bitcoins are worth a lot more, having a 500GB blockchain. The smart investors win, the free market wins, BBCcoin holders lose, SD bots lose.
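
For scale, a rough sketch of the growth arithmetic behind those chain sizes (a minimal illustration; the 500GB figure is the scenario's assumption, not measured data):

Code:
# Rough sketch: linear blockchain growth under a fixed 1 MB limit.
BLOCKS_PER_YEAR = 6 * 24 * 365            # one block per 10 minutes
gb_per_year = 1 * BLOCKS_PER_YEAR / 1024  # ~51 GB/year at 1 MB/block
years_to_500gb = 500 / gb_per_year
print(f"~{gb_per_year:.0f} GB/year; ~{years_to_500gb:.0f} years to 500 GB")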

I'm sensing some serious deja vu reading this. It sounds like another 14 point plan...

Let's be generous and assume an average 90% probability that each step in the predicted chain of events occurs as described...

Event   Probability
  1        100%
  2         90%
  3         81%
  4         73%
  5         66%
  6         59%
  7         53%
  8         48%
  9         43%
 10         39%
 11         35%
 12         31%
 13         28%
 14         25%

End result:

25%  Smooth transition.  All: "Hail misterbigg"
75%  Train wreck, emergency block size increase. misterbigg: "Sorry, my next prediction will be better!"
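
For reference, each row in the table above is simply 0.9^(n-1), rounded; a minimal sketch of the arithmetic:

Code:
# Probability that the first n steps all occur, assuming each step
# after the first independently succeeds with probability 0.9.
p_step = 0.9
for n in range(1, 15):
    p_all = p_step ** (n - 1)   # step 1 is taken as given (100%)
    print(f"Event {n:2d}: {p_all:4.0%}")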

>totaleclipseofthebank
sr. member
Activity: 294
Merit: 250
Honestly, I'm just sick of people saying it is feasible to have a 2000 tps distributed protocol, where everyone is downloading and storing half-gig blocks every 10 minutes or so. (see the wiki) https://en.bitcoin.it/wiki/Scalability#Current_bottlenecks

Don't people understand that this would mean thousands of GB each year in blockchain size? How do you think this is reasonable? There are limits to data storage, and that's just one of the economic constraints that have to be faced.

That is completely feasible to do TODAY. Would it ultimately be as distributed as you or I want it to be? Nope.

Is there a big difference between 7tps and 2000tps? Yup.

Would it require investments in bandwidth and storage to be a participant as a full node? Yup.

Is storage and bandwidth going to get cheaper over time? Ab-so-fucking-lutely

Should we forever cripple Bitcoin because of a failure to allow for reasonable scaling capacity? Sure, if you hate Bitcoin and want it to fail.

Do you have any idea how valuable a single bitcoin would be if we were able to get the network to 2000 tps while maintaining transaction fees in the tenths-of-a-cent to single-cent range? They would be worth thousands.

I don't need to run a full node on my watch. What I do need to do is use disruptive technology to unseat Paypal, Western Union, ACH, SWIFT, Visa, and the FED. We don't have to get there overnight. We do have to get there.




those mirror my thoughts ...I don't even need to post anymore since you do all the work
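
For reference, the storage arithmetic behind the "thousands of GB each year" figure quoted above, as a rough sketch (the half-gigabyte block size is the thread's own assumption):

Code:
# Rough sketch: yearly chain growth at 2000 tps with half-gig blocks.
block_mb = 500                         # ~0.5 GB per block (quoted figure)
blocks_per_year = 6 * 24 * 365         # one block per 10 minutes
gb_per_year = block_mb * blocks_per_year / 1024
print(f"~{gb_per_year:,.0f} GB/year")  # ~25,664 GB/year, i.e. ~25 TB
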
legendary
Activity: 1176
Merit: 1015
It seems you either have to be a full node or nothing at all.

Could a function be built where people can connect into a distributed node? So, based on my bandwidth and space, I connect into a node that contains 1,000 computers (a node swarm) and thus can continue to verify blocks and be a part of the network, but only need 1/1000 of the bandwidth a full node needs.

You wouldn't need the entire blockchain either, and these node swarms could be all different sizes, so the poor can still participate.

Any ideas?
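
One purely hypothetical way such a swarm could split the work: deterministically assign each transaction to one member by hash, so each member verifies roughly 1/N of the load. The scheme below is an illustration of the idea, not an existing protocol:

Code:
import hashlib

# Hypothetical sketch: assign each transaction in a block to one of
# N swarm members, so each member verifies ~1/N of the transactions.
def my_share(txids, member_index, swarm_size):
    """Return the txids this swarm member is responsible for verifying."""
    mine = []
    for txid in txids:
        digest = hashlib.sha256(txid.encode()).digest()
        if int.from_bytes(digest[:4], "big") % swarm_size == member_index:
            mine.append(txid)
    return mine

# Member 7 of a 1,000-member swarm handles roughly 0.1% of the load.
sample = [f"tx{i}" for i in range(10_000)]
print(len(my_share(sample, 7, 1000)))   # ~10 of 10,000
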
legendary
Activity: 1400
Merit: 1013
So far I haven't heard about any problem
You haven't been paying attention. Use the search feature.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
We need to first see the effect of breaching the 250KB soft limit. So far I haven't heard about any problem; does that mean even the 250KB limit will work?
sr. member
Activity: 247
Merit: 250
This is there to create demand/limit supply, and to allow miners to collect fees for securing transactions in the network.

Where did Satoshi mention coding the 1MB limit so that miners can collect more fees?

does it matter?

It wouldn't if you hadn't made it the basis of your argument. 

I was under the impression the 1MB limit was a safeguard put in place because Satoshi was worried early on that someone would abuse the network. But I believe we are now well beyond that point. If someone tried to submit a well-above-average block to the network, it would fail to propagate and be orphaned. Of course, in time, when average bandwidths are larger and HDD space is cheaper, submitting that same block may not be a big deal, because it will propagate fine. Which is why setting a limit at all is only kicking the problem down the road to be "fixed" again.
sr. member
Activity: 310
Merit: 250


Mine/verify what you like, but I think the number of nodes on a BBCcoin, or should we say FoundationCoin, network is going to keep decreasing because the blockchain size is literally too damn high.

Chicken Little, YOU DON'T EVEN KNOW WHAT THE NEW LIMITS ARE GOING TO BE.

It could be increased 10x, today, then maxed out, with no new optimizations AT ALL, and I can still run a full node for $20 a year in storage costs (and decreasing).




What if they are increased 10x every year for 3 years? For 20 years?

As long as the limit increase mimics the cost decrease in storage / bandwidth it isn't a problem for anyone.

As long as the limit increase does not prevent profitability for miners, it isn't a problem for them. And guess who gets to decide which transactions to include in their blocks?

Your saying "what if they are increased 10x every year" is the same as saying "what if we have no limit at all". We do, and we will, and even if it is adjustable, it's not going parabolic.
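
For scale, a rough sketch of why a sustained 10x-per-year increase would outrun falling storage prices (the price and rate below are illustrative assumptions, not figures from the thread):

Code:
# Rough sketch: node storage cost if the limit (and usage) grows 10x/year
# while storage gets ~30% cheaper per year (illustrative assumptions).
usd_per_gb = 0.04        # assumed storage price today
gb_per_year = 500        # ~10 MB blocks: 10 MB * 52,560 blocks/year
for year in range(1, 6):
    print(f"year {year}: +{gb_per_year:>9,.0f} GB, ~${gb_per_year * usd_per_gb:,.0f}")
    gb_per_year *= 10    # limit up another 10x
    usd_per_gb *= 0.7    # storage ~30% cheaper

A one-time 10x with flat usage, by contrast, stays near the $20/year quoted above.
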
sr. member
Activity: 451
Merit: 250


Mine/verify what you like, but I think the number of nodes on a BBCcoin, or should we say FoundationCoin, network is going to keep decreasing because the blockchain size is literally too damn high.

Chicken Little, YOU DON'T EVEN KNOW WHAT THE NEW LIMITS ARE GOING TO BE.

It could be increased 10x, today, then maxed out, with no new optimizations AT ALL, and I can still run a full node for $20 a year in storage costs (and decreasing).




What if they are increased 10x every year for 3 years? For 20 years?
sr. member
Activity: 310
Merit: 250


Mine/verify what you like, but I think the number of nodes on a BBCcoin, or should we say FoundationCoin, network is going to keep decreasing because the blockchain size is literally too damn high.

Chicken Little, YOU DON'T EVEN KNOW WHAT THE NEW LIMITS ARE GOING TO BE.

It could be increased 10x, today, then maxed out, with no new optimizations AT ALL, and I can still run a full node for $20 a year in storage costs (and decreasing).


That would represent 40x the transactional capacity we are using right now, while transactional fees are already rising.
sr. member
Activity: 451
Merit: 250
This is still just a really crappy time to go ballistic about this, because network upgrades have been stalled for about a year and the end of the stalling is not yet really in firm sight.

For good or bad, upgrading hashing hardware to ASIC technology came first, and that disrupted all upgrades for about a year because upgrading to obsolete technology is pointless and the fact that the obsoleting would not take place for a year or more was concealed.

Basically the whole network has been conned into putting off upgrades for a year or so.

There are now, at last, signs that it might not be a whole 'nother year before this ASIC upgrade phase is over, whereupon we can start on a bandwidth upgrade phase.

But until we can order ready-to-ship ASIC units to upgrade our hashing capabilities, it is just too soon to start ordering more bandwidth. For all we know, only a few people who ordered ASIC units during development will ever get any; all the manufacturers could yet decide they now have their development costs paid and can keep all further ASICs to themselves.

We just do not yet know whether the ASIC upgrade is even going to really happen. Until we do, budgeting any other upgrades is simply premature.

-MarkM-


You bring up a very interesting and important point: that demand for bitcoin mining could ultimately drive the expansion of internet bandwidth. I just think we are a long way away from that, given the length of time it takes to install such bandwidth around the world. Basically, I just don't think the hardware is going to be able to scale fast enough for FoundationCoins to work, whereas the limits in Bitcoin allow the market to develop over several years, being limited to O(time) rather than O(time^spamgrowth).
legendary
Activity: 2940
Merit: 1090
This is still just a really crappy time to go ballistic about this, because network upgrades have been stalled for about a year and the end of the stalling is not yet really in firm sight.

For good or bad, upgrading hashing hardware to ASIC technology came first, and that disrupted all upgrades for about a year because upgrading to obsolete technology is pointless and the fact that the obsoleting would not take place for a year or more was concealed.

Basically the whole network has been conned into putting off upgrades for a year or so.

There are now, at last, signs that it might not be a whole 'nother year before this ASIC upgrade phase is over, whereupon we can start on a bandwidth upgrade phase.

But until we can order ready-to-ship ASIC units to upgrade our hashing capabilities, it is just too soon to start ordering more bandwidth. For all we know, only a few people who ordered ASIC units during development will ever get any; all the manufacturers could yet decide they now have their development costs paid and can keep all further ASICs to themselves.

We just do not yet know whether the ASIC upgrade is even going to really happen. Until we do, budgeting any other upgrades is simply premature.

-MarkM-
sr. member
Activity: 451
Merit: 250
Honestly, I'm just sick of people saying it is feasible to have a 2000 tps distributed protocol, where everyone is downloading and storing half-gig blocks every 10 minutes or so. (see the wiki) https://en.bitcoin.it/wiki/Scalability#Current_bottlenecks

Don't people understand that this would mean thousands of GB each year in blockchain size? How do you think this is reasonable? There are limits to data storage, and that's just one of the economic constraints that have to be faced.

That is completely feasible to do TODAY. Would it ultimately be as distributed as you or I want it to be? Nope.

Is there a big difference between 7tps and 2000tps? Yup.

Would it require investments in bandwidth and storage to be a participant as a full node? Yup.

Is storage and bandwidth going to get cheaper over time? Ab-so-fucking-lutely

Should we forever cripple Bitcoin because of a failure to allow for reasonable scaling capacity? Sure, if you hate Bitcoin and want it to fail.

Do you have any idea how valuable a single bitcoin would be if we were able to get the network to 2000 tps while maintaining transaction fees in the tenths-of-a-cent to single-cent range? They would be worth thousands.

I don't need to run a full node on my watch. What I do need to do is use disruptive technology to unseat Paypal, Western Union, ACH, SWIFT, Visa, and the FED. We don't have to get there overnight. We do have to get there.




Mine/verify what you like, but I think the number of nodes on a BBCcoin, or should we say FoundationCoin, network is going to keep decreasing because the blockchain size is literally too damn high.
legendary
Activity: 1400
Merit: 1013
People are not going to stick with a protocol if it involves lugging huge amounts of data around for no reason.
Too bad nobody has thought about how to drastically reduce the amounts of data people need to transmit and store.