
Topic: Once again, what about the scalability issue? - page 7. (Read 11259 times)

donator
Activity: 1218
Merit: 1079
Gerald Davis
People, stop saying that scalability is not a problem and writing about how cheap hard drives are.
The scalability is the number one problem stopping Bitcoin from becoming mainstream.
It doesn't matter how fast the drives are growing, right now the blockchain keeps all the old information which is not even needed, and grows indefinitely, how hard is it to understand that it is a non-scalable non-future-friendly scheme?
I am sure the devs know this and are doing their best to address it and I am grateful for that. But saying that it's not a problem is just ignorant and stupid.

Quote
We won't get some real big transaction volume because of this issue.
I can't see how anybody is even arguing against this. I mean, it's even in the wiki: https://en.bitcoin.it/wiki/Scalability

The historical storage is a non-issue and the scalability page points that out.  Bandwidth (for CURRENT blocks) presents a much harder bottleneck at extreme transaction levels, and after bandwidth comes memory, as fast validation requires the UTXO set to be cached in memory.  Thankfully dust rules will constrain the growth of the UTXO set; however, both bandwidth and memory will become an issue much sooner than storing the blockchain on disk.

The idea that today's transaction volume is held back by the "massive" blockchain isn't supported by the facts.  Even the 1MB block limit provides for roughly 7 tps, and the current network isn't even at 0.5 tps sustained.  We could see a 1,300% increase in transaction volume before the 1MB limit became an issue.  At 1 MB per block the blockchain would grow by about 50 GB per year.  It would take 20 years of maxed-out 1MB blocks before the blockchain couldn't fit on an "ancient" (in the year 2033) 1TB drive.
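The arithmetic above can be sanity-checked in a few lines (a sketch; the 250-byte average transaction size is an assumed figure, not a protocol constant):

```python
# Back-of-the-envelope capacity and growth at the 1MB block limit.
MAX_BLOCK_BYTES = 1_000_000     # 1 MB block size limit
AVG_TX_BYTES = 250              # assumption: average transaction size
BLOCK_INTERVAL_SEC = 600        # one block every ~10 minutes on average

tps = MAX_BLOCK_BYTES / AVG_TX_BYTES / BLOCK_INTERVAL_SEC
blocks_per_year = 365 * 24 * 3600 / BLOCK_INTERVAL_SEC
growth_gb_per_year = MAX_BLOCK_BYTES * blocks_per_year / 1e9

print(f"capacity: {tps:.1f} tps")                    # ~6.7 tps, i.e. "about 7"
print(f"growth:   {growth_gb_per_year:.0f} GB/year") # ~53 GB/year, i.e. "about 50"
```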

Beyond 1MB the storage requirements will grow, but they will run up against memory and bandwidth long before disk space becomes too expensive.  Still, as pointed out, eventually most nodes will not maintain a copy of the full blockchain; that will be a task reserved for "archive nodes".  Instead they will just retain the block headers (which is ~4MB per year) and a deep enough section of the recent blockchain.
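The ~4MB per year of headers follows directly from the fixed 80-byte serialized header size:

```python
HEADER_BYTES = 80               # fixed size of a serialized block header
BLOCKS_PER_YEAR = 6 * 24 * 365  # ~52,560 blocks at one per 10 minutes

mb_per_year = HEADER_BYTES * BLOCKS_PER_YEAR / 1e6
print(f"headers: {mb_per_year:.1f} MB per year")  # ~4.2 MB
```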

So as far as addressing the bandwidth bottleneck problem, you are in the off-chain transaction camp, correct?

No, although I believe off-chain tx will happen regardless.  They happen right now.  Some people leave their BTC on MtGox, and when they pay someone who also has a MtGox address it happens instantly, without fees, and off the blockchain.  Now imagine MtGox partners with an eWallet provider and both companies hold funds in reserve to cover transfers to each other's private books.  Suddenly you can transfer funds between users of either service instantly and off the blockchain.

So off chain tx are going to happen regardless.
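A toy sketch of that idea, with two hypothetical providers keeping internal books and tracking only a net inter-provider position (the names "gox" and "wallet", the accounts, and the amounts are purely illustrative):

```python
from collections import defaultdict

# Each provider keeps its own internal ledger; no on-chain tx is needed
# for a transfer, only a running net position to settle later.
books = {"gox": defaultdict(float), "wallet": defaultdict(float)}
books["gox"]["alice"] = 5.0
books["wallet"]["bob"] = 2.0
interprovider_net = 0.0   # positive: gox owes wallet; negative: the reverse

def transfer(src, dst, sender, receiver, amount):
    """Move funds between providers by adjusting private books only."""
    global interprovider_net
    assert books[src][sender] >= amount, "insufficient balance"
    books[src][sender] -= amount
    books[dst][receiver] += amount
    interprovider_net += amount if (src, dst) == ("gox", "wallet") else -amount

transfer("gox", "wallet", "alice", "bob", 1.5)
print(books["wallet"]["bob"], interprovider_net)  # 3.5 1.5
```

Only the accumulated net position would ever need to touch the blockchain, which is exactly why such transfers are instant and fee-free.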

I was just pointing out that among the four critical resources:
bandwidth
memory
processing power
storage

storage is so far behind the other ones that worrying about it is kinda silly.  We will hit walls in memory and bandwidth at much lower tps than it would take for disk space to become critical.  The good news is last-mile bandwidth is still increasing (doubling every 18-24 months); however, there is a risk of centralization if tx volume grows beyond what the "average" node can handle.  If tx volume grows so fast that 99% of nodes simply can't maintain a full node because they lack sufficient bandwidth to keep up with the blockchain, then you will see a lot of full nodes go offline, and there is a risk that the network ends up in the hands of a much smaller number of nodes (likely in datacenters with extremely high bandwidth links).

Bandwidth is both the tightest bottleneck AND the one many users have the least control over.  As an example, I recently paid $80 and doubled my workstation's RAM to 16GB.  Let's say my workstation is viable for another 3 years.  $80/36 ≈ $2.25 per month, call it $3.  Even if bitcoind today were memory constrained on 8GB systems, I could bypass that bottleneck for a mere $3 a month.  I like Bitcoin, I want to see it work, I will gladly pay $3 to make sure it happens.  However, I can't pay an extra $3 a month and double my upstream bandwidth (and for residential connections upstream is the killer).  So hypothetically, if Bitcoin were not memory or storage constrained but bandwidth constrained today, I would be "stuck": I am looking at either a much higher cost or a need for more exotic solutions (like running my node on a server).
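To put rough numbers on the bandwidth wall, here is a naive sketch of a full node's upstream relay load (assumed figures: 250-byte transactions, 8 connected peers, every tx forwarded once to each peer):

```python
AVG_TX_BYTES = 250   # assumption: average transaction size
PEERS = 8            # assumption: peers receiving relayed transactions

def upstream_kbps(tps):
    # bytes/sec forwarded to all peers, converted to kilobits/sec
    return tps * AVG_TX_BYTES * PEERS * 8 / 1000

for tps in (0.5, 7, 100, 1000):
    print(f"{tps:>6} tps -> {upstream_kbps(tps):>8.1f} kbps upstream")
```

At 7 tps this is on the order of 100 kbps, trivial for a residential line; at 1,000 tps it is ~16 Mbps of sustained upstream, which is where the "average" residential node starts to drop out.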

Yeah that was longer than I intended. 

TL;DR: Yes, scalability will ALWAYS be an issue as long as tx volume is growing; however, storage is the least of our worries.  The point is also somewhat moot because eventually most nodes won't maintain full blocks back to the genesis block.  That will be reserved for "archive" nodes.  There will likely be fewer of them, but as long as there are enough to maintain a decentralized consensus, the network can be just as secure, and users have a choice (full node, full headers & recent blocks, lite client) depending on their needs and risk.


legendary
Activity: 1722
Merit: 1217
People, stop saying that scalability is not a problem and writing about how cheap hard drives are.
The scalability is the number one problem stopping Bitcoin from becoming mainstream.
It doesn't matter how fast the drives are growing, right now the blockchain keeps all the old information which is not even needed, and grows indefinitely, how hard is it to understand that it is a non-scalable non-future-friendly scheme?
I am sure the devs know this and are doing their best to address it and I am grateful for that. But saying that it's not a problem is just ignorant and stupid.

Quote
We won't get some real big transaction volume because of this issue.
I can't see how anybody is even arguing against this. I mean, it's even in the wiki: https://en.bitcoin.it/wiki/Scalability

The historical storage is a non-issue and the scalability page points that out.  Bandwidth (for CURRENT blocks) presents a much harder bottleneck at extreme transaction levels, and after bandwidth comes memory, as fast validation requires the UTXO set to be cached in memory.  Thankfully dust rules will constrain the growth of the UTXO set; however, both bandwidth and memory will become an issue much sooner than storing the blockchain on disk.

The idea that today's transaction volume is held back by the "massive" blockchain isn't supported by the facts.  Even the 1MB block limit provides for roughly 7 tps, and the current network isn't even at 0.5 tps sustained.  We could see a 1,300% increase in transaction volume before the 1MB limit became an issue.  At 1 MB per block the blockchain would grow by about 50 GB per year.  It would take 20 years of maxed-out 1MB blocks before the blockchain couldn't fit on an "ancient" (in the year 2033) 1TB drive.

Beyond 1MB the storage requirements will grow, but they will run up against memory and bandwidth long before disk space becomes too expensive.  Still, as pointed out, eventually most nodes will not maintain a copy of the full blockchain; that will be a task reserved for "archive nodes".  Instead they will just retain the block headers (which is ~4MB per year) and a deep enough section of the recent blockchain.

So as far as addressing the bandwidth bottleneck problem, you are in the off-chain transaction camp, correct?
legendary
Activity: 1498
Merit: 1000
Leaving it offline too long?

Aye, I'm a non-hardcore casual bitcoiner. But that was an example of an issue related to slow downloading/uploading speed. Freshly mined blocks can't be pruned.

If you are a casual user unable to keep the client online, why not just use an SPV client?  You aren't contributing to the decentralization of the network if your node has an uptime of ~3%.

You do realize not everyone is doing it to contribute to the network; some just want the most trustless solution. But I guess the core devs don't want those people using bitcoin-qt.
donator
Activity: 1218
Merit: 1079
Gerald Davis
People, stop saying that scalability is not a problem and writing about how cheap hard drives are.
The scalability is the number one problem stopping Bitcoin from becoming mainstream.
It doesn't matter how fast the drives are growing, right now the blockchain keeps all the old information which is not even needed, and grows indefinitely, how hard is it to understand that it is a non-scalable non-future-friendly scheme?
I am sure the devs know this and are doing their best to address it and I am grateful for that. But saying that it's not a problem is just ignorant and stupid.

Quote
We won't get some real big transaction volume because of this issue.
I can't see how anybody is even arguing against this. I mean, it's even in the wiki: https://en.bitcoin.it/wiki/Scalability

The historical storage is a non-issue and the scalability page points that out.  Bandwidth (for CURRENT blocks) presents a much harder bottleneck at extreme transaction levels, and after bandwidth comes memory, as fast validation requires the UTXO set to be cached in memory.  Thankfully dust rules will constrain the growth of the UTXO set; however, both bandwidth and memory will become an issue much sooner than storing the blockchain on disk.

The idea that today's transaction volume is held back by the "massive" blockchain isn't supported by the facts.  Even the 1MB block limit provides for roughly 7 tps, and the current network isn't even at 0.5 tps sustained.  We could see a 1,300% increase in transaction volume before the 1MB limit became an issue.  At 1 MB per block the blockchain would grow by about 50 GB per year.  It would take 20 years of maxed-out 1MB blocks before the blockchain couldn't fit on an "ancient" (in the year 2033) 1TB drive.

Beyond 1MB the storage requirements will grow, but they will run up against memory and bandwidth long before disk space becomes too expensive.  Still, as pointed out, eventually most nodes will not maintain a copy of the full blockchain; that will be a task reserved for "archive nodes".  Instead they will just retain the block headers (which is ~4MB per year) and a deep enough section of the recent blockchain.


legendary
Activity: 1722
Merit: 1217
Once again, people are working on scalability. Donate if you really care about the problem and want to help:

http://utxo.tumblr.com/

so is the idea here just to expand the max block size for miners whenever we hit a wall, and make it so that non-mining nodes don't need that level of bandwidth to audit transactions?

sorry, i have put a lot of work into understanding bitcoin in the abstract but i'm not a computer scientist. a lot of the technical minutiae goes over my head, especially with proposed alterations to bitcoin, when all my effort has been put towards understanding bitcoin as it stands.
full member
Activity: 203
Merit: 100
People, stop saying that scalability is not a problem and writing about how cheap hard drives are.
The scalability is the number one problem stopping Bitcoin from becoming mainstream.
It doesn't matter how fast the drives are growing, right now the blockchain keeps all the old information which is not even needed, and grows indefinitely, how hard is it to understand that it is a non-scalable non-future-friendly scheme?
I am sure the devs know this and are doing their best to address it and I am grateful for that. But saying that it's not a problem is just ignorant and stupid.

Quote
We won't get some real big transaction volume because of this issue.
I can't see how anybody is even arguing against this. I mean, it's even in the wiki: https://en.bitcoin.it/wiki/Scalability
legendary
Activity: 3878
Merit: 1193
The blockchain on my phone is 1.06 MB. I think the blockchain is just fine in size.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Leaving it offline too long?

Aye, I'm a non-hardcore casual bitcoiner. But that was an example of an issue related to slow downloading/uploading speed. Freshly mined blocks can't be pruned.

If you are a casual user unable to keep the client online, why not just use an SPV client?  You aren't contributing to the decentralization of the network if your node has an uptime of ~3%.
legendary
Activity: 2142
Merit: 1010
Newbie
Leaving it offline too long?

Aye, I'm a non-hardcore casual bitcoiner. But that was an example of an issue related to slow downloading/uploading speed. Freshly mined blocks can't be pruned.
legendary
Activity: 1498
Merit: 1000
What 3rd party server?  I run my own node and will be able to for at least a century at this rate of blockchain growth.

That is what you commented on in my post; that is what I was talking about. 3rd party servers will never be a factor in this system.
hero member
Activity: 630
Merit: 500
Yesterday I spent a whole hour downloading blocks for the last 4 weeks. Not very convenient! Seems my 1 TB drive didn't help much.

What am I doing wrong?

Leaving it offline too long?
legendary
Activity: 2142
Merit: 1010
Newbie
Yesterday I spent a whole hour downloading blocks for the last 4 weeks. Not very convenient! Seems my 1 TB drive didn't help much.

What am I doing wrong?
sr. member
Activity: 354
Merit: 250
Are any miners considering allowing people to sweep their 0.00000001 amounts to addresses with larger amounts fee-free? Though I guess if there's only 100 MB of addresses with positive balances no one might care.
donator
Activity: 1218
Merit: 1079
Gerald Davis
What 3rd party server?  I run my own node and will be able to for at least a century at this rate of blockchain growth.
legendary
Activity: 1498
Merit: 1000
We won't get some real big transaction volume because of this issue.

It will just force the economies around the system to change.  Not everyone will be able to maintain a real block chain.  We need to work on trust issues with relying on 3rd parties to verify transactions for us....

this doesn't slow the machine down, just causes change.

WOW, so we should just give up and forget about the core of bitcoin. We should just roll over and die, I guess. I see another person that drinks the core dev team juice. The blockchain needs to be reworked to fix a very simple problem without the need for a complex solution.
legendary
Activity: 1498
Merit: 1000
We won't get some real big transaction volume because of this issue.

It will just force the economies around the system to change.  Not everyone will be able to maintain a real block chain.  We need to work on trust issues with relying on 3rd parties to verify transactions for us....

this doesn't slow the machine down, just causes change.

WOW, so we should just give up and forget about the core of bitcoin. We should just roll over and die, I guess. I see another person that drinks the core dev team juice. The blockchain needs to be reworked to fix a very simple problem without the need for a complex solution.

Satoshi believed from day 1 that not every user would maintain a full node.  That is why his paper includes a section on SPV.

There's a huge difference between a 3rd party server and SPV clients. Yes, one day, when it takes 100s of GBs and there are no more optimizations that can be done.
donator
Activity: 1218
Merit: 1079
Gerald Davis
We won't get some real big transaction volume because of this issue.

It will just force the economies around the system to change.  Not everyone will be able to maintain a real block chain.  We need to work on trust issues with relying on 3rd parties to verify transactions for us....

this doesn't slow the machine down, just causes change.

WOW, so we should just give up and forget about the core of bitcoin. We should just roll over and die, I guess. I see another person that drinks the core dev team juice. The blockchain needs to be reworked to fix a very simple problem without the need for a complex solution.

Satoshi believed from day 1 that not every user would maintain a full node.  That is why his paper includes a section on SPV.  Decentralized doesn't have to mean every single human on the planet is an equal peer in a network covering all transactions for the human race.  Tens of thousands or hundreds of thousands of nodes in a network used by millions or tens of millions provides sufficient decentralization that attacks to limit or exploit the network become infeasible.
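The SPV idea from the paper can be sketched concretely: a lite client holds only block headers and verifies that a transaction is in a block using the header's Merkle root plus a short branch of sibling hashes, never the full block. This is a simplified sketch using double-SHA256 over toy byte strings; real Bitcoin additionally uses little-endian hash serialization and real transaction IDs.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Double-SHA256, as Bitcoin uses for transaction and block hashing."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:            # Bitcoin duplicates the last hash
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify_branch(leaf, branch, root):
    """branch: list of (sibling_hash, sibling_is_right) pairs up the tree."""
    node = h(leaf)
    for sibling, is_right in branch:
        node = h(node + sibling) if is_right else h(sibling + node)
    return node == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d"]
root = merkle_root(txs)          # this is all an SPV client stores per block
# Branch proving tx-a is included: its sibling, then the hash of the right pair.
branch = [(h(b"tx-b"), True), (h(h(b"tx-c") + h(b"tx-d")), True)]
print(verify_branch(b"tx-a", branch, root))  # True
```

The proof size grows logarithmically with the number of transactions in the block, which is why a lite client stays cheap even if blocks get large.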
full member
Activity: 157
Merit: 101
We won't get some real big transaction volume because of this issue.

It will just force the economies around the system to change.  Not everyone will be able to maintain a real block chain.  We need to work on trust issues with relying on 3rd parties to verify transactions for us....

this doesn't slow the machine down, just causes change.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Blockchain size - 8.3 GB

UTXO set ~0.24 GB and growing linearly by about 0.1 GB per year.

Seeing as my workstation has 16GB of RAM and 3TB of storage, I will probably need to upgrade my system by the year 2130.  I put it on my google calendar.
legendary
Activity: 2142
Merit: 1010
Newbie
Blockchain size - 8.3 GB