
Topic: what are they going to do when the bitcoin chain becomes prohibitively large?

donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
I will have my blockchain gold-plated and wear it on my bronzed bare chest.
hero member
Activity: 868
Merit: 1000
newbie
Activity: 42
Merit: 0
All valid and good points, thanks franky1!
legendary
Activity: 4424
Merit: 4794
I was wondering about another angle actually: download speeds are not going to get significantly faster to catch up with the size growth.
Average Joes will not bother to download something of, say, 200-300 GB, let alone when it gets bigger. Even mining enthusiasts will give up (or not join mining any more).

So, in the end we could end up with a few centralized servers running the blockchain (when it is 10 TB+ in size), and this could cause all sorts of other issues.

What is your view on this?

You say download speeds when the blockchain is 200-300 GB...
OK, rebuttal time.
At the moment transactions are nowhere near the 52 GB yearly potential, so a 4-6 year timescale is a minimum; I'm thinking maybe 8 years before the blockchain gets to that size.

So 8 years ago a lot of people were on dial-up, and only just getting into ADSL at maybe half a Mb/s... now the majority are on ADSL, ranging from 2 Mb/s in the country to 8-12 Mb/s in most industrial towns and cities and up to 30 Mb/s in Singapore.

So let's look 8 years into the future... hmm, it looks pretty fast to me, enough to cope with 300 GB of data.
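As a rough back-of-envelope sketch (Python; the 300 GB figure above and the connection speeds below are assumed round numbers, and protocol overhead and block verification time are ignored), the initial download would take roughly:

Code:
# Back-of-envelope: hours to download a 300 GB chain at a few assumed speeds.
CHAIN_GB = 300
for mbit in (2, 8, 12, 30, 100):          # assumed download speeds in Mb/s
    gb_per_hour = mbit / 8 / 1000 * 3600  # Mb/s -> GB per hour (decimal units)
    hours = CHAIN_GB / gb_per_hour
    print(f"{mbit:>4} Mb/s: ~{hours:.0f} hours ({hours / 24:.1f} days)")

Even the slowest of those speeds gets it done in a couple of weeks, and the faster ones in hours.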

But with that said, the majority of people will use phone apps attached to places like blockchain.info and other clones. Those serious about Bitcoin will have no problem downloading the blockchain.

Short story: don't worry about something years away; it may never happen.
legendary
Activity: 1536
Merit: 1000
electronic [r]evolution
but I'm sure there are better technical solutions possible
The concepts you presented are not solutions necessarily applicable to cryptocurrency. You need to understand the way the blockchain works before you can begin suggesting alternative storage mechanisms. If you want to read a viable proposal then read my post on the first page of this thread and follow the link.
legendary
Activity: 1001
Merit: 1005
Oh no.. I did some research and we are all screwed! 


sr. member
Activity: 434
Merit: 250
In Hashrate We Trust!
Downloading the whole blockchain means less decentralization as it grows bigger, which is against the ideal of Bitcoin.
Besides that, it's an ugly architecture. Statistically you would be fine with a fraction of the data.

It's convenient to not touch the current design of the blockchain, but I'm sure there are better technical solutions possible:

Distributed hash table, examples: Freenet, BitTorrent
http://en.wikipedia.org/wiki/Distributed_hash_table

Graph database, examples: Neo4j, Titan
http://en.wikipedia.org/wiki/Graph_database

Distributed data store, examples: Voldemort, Cassandra
http://en.wikipedia.org/wiki/Distributed_data_store
newbie
Activity: 42
Merit: 0
I can only say amen to this (and I am not even religious).
legendary
Activity: 4760
Merit: 1283

The blockchain will never become too large... for Google, Facebook, etc.  They can 'peer' with one another so we still have the much-hyped 'p2p' system.

At that point we can just hook up with our clients as planned.  The cool thing is that transaction fees will likely be out the window, since the value of the user data harvested will far exceed the cost of operation.

Not only that, but these companies can handle all the laborious work of keeping track of real-world identities, biometric authentication, honoring blacklists, etc.  Hell, they'll probably get things so nailed down that they'll fund charge-backs out of their own profits, as VISA does when my credit card gets mysteriously used to buy gasoline all over Northern California.  No more losing BTC by being an idiot about security!

What's not to love?!?

newbie
Activity: 42
Merit: 0
I was wondering about another angle actually: download speeds are not going to get significantly faster to catch up with the size growth.
Average Joes will not bother to download something of, say, 200-300 GB, let alone when it gets bigger. Even mining enthusiasts will give up (or not join mining any more).

So, in the end we could end up with a few centralized servers running the blockchain (when it is 10 TB+ in size), and this could cause all sorts of other issues.

What is your view on this?
legendary
Activity: 1512
Merit: 1049
Death to enemies!
Bitcoin will become specialized. Miners (pool operators) need to have the full blockchain. Miners already have specialized hardware that is expensive; large storage devices to complement the mining are cheap compared to ASICs. I still see no problem.
legendary
Activity: 1536
Merit: 1000
electronic [r]evolution
Don't worry, it will be many years before 0.1% of the blockchain is too big
Lol... how long it takes is irrelevant, unless it's longer than the lifetime of the sun.

Facebook uses distributed databases on Hadoop (HDFS), which is 21 PB (petabytes); that's 1 million times bigger than the size of our beloved blockchain.
Typically the best way to store large amounts of data is on a distributed file system where each system stores a small part of the whole thing. But that is not true for Bitcoin, for the reasons I already mentioned, mainly that it would require a prohibitive amount of network bandwidth to transmit the same data over and over again. It is also a bad idea because it opens the possibility for crucial data to be lost. Unless every node has a copy of the same data, all that needs to happen to lose parts of the blockchain is for the nodes holding that data to leave the network and never return. That is not an acceptable situation, because it could wipe out the balances of coin holders and break the network.
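To put rough numbers on the data-loss point, here is a minimal sketch (my own illustration, not anything from the Bitcoin code; the leave-probability is an arbitrary assumption) of how the chance of permanently losing a block depends on how many independent nodes hold a copy:

Code:
# A block held by r independent nodes is lost only if all r copies leave the
# network and never return: P(loss) ~= p ** r, with p an assumed per-node
# probability of vanishing for good.
p_leave = 0.5                              # assumed, for illustration only
for copies in (1, 3, 10, 100):
    print(f"{copies:>3} copies: P(block lost) ~= {p_leave ** copies:.3g}")

With every node keeping everything there are thousands of copies, so the loss probability is effectively zero; that is the property a sparse distributed store gives up.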
legendary
Activity: 1512
Merit: 1049
Death to enemies!
As I'm writing this message I am running a full Bitcoin node on a 12-year-old Pentium 4 computer that was made in 2002. It might not be fast, but it works. I think that any Core 2 Duo or AMD FX based desktop computer with a 1 TB+ hard drive will be able to run a full node for at least the next 10 years, long after these machines have become useless for running the latest computer games or watching hyper-HD videos.

If not, use SPV clients like Electrum or MultiBit. And at some point Bitcoin-Qt will have pruning; the basics to do this are already in the design and code. The blockchain size is not a problem as I see it.
sr. member
Activity: 434
Merit: 250
In Hashrate We Trust!
Proposal: a distributed database, where each node keeps track of random blocks of the blockchain.
Each node should probably keep at least 0.1% of the blockchain to prevent data corruption, data loss and double spending.
There are multiple problems with this idea. First of all it doesn't solve anything in the long term because eventually people will be asking "what happens when 0.1% of the blockchain becomes prohibitively large". Secondly, nodes require most of the blockchain in order to verify transactions and calculate address balances. If they only held a small portion of the blockchain then every time they needed to verify a transaction or make a transaction they would need to search around for other nodes which had the data they needed, and that would dramatically increase the amount of network traffic. It's just a totally impracticable way of going about things.
Don't worry, it will be many years before 0.1% of the blockchain is too big.

Facebook uses distributed databases on Hadoop (HDFS), which is 21 PB (petabytes); that's 1 million times bigger than the size of our beloved blockchain.
http://hadoopblog.blogspot.se/2010/05/facebook-has-worlds-largest-hadoop.html

They are of course two completely different use cases, but as you can see, such architectures can handle huge datasets.
http://hbase.apache.org/

Here is another thread suggesting a similar solution:
Bitcoin addon: Distributed block chain storage
https://bitcointalksearch.org/topic/bitcoin-addon-distributed-block-chain-storage-197810
legendary
Activity: 1536
Merit: 1000
electronic [r]evolution
Proposal: a distributed database, where each node keeps track of random blocks of the blockchain.
Each node should probably keep at least 0.1% of the blockchain to prevent data corruption, data loss and double spending.
There are multiple problems with this idea. First of all it doesn't solve anything in the long term because eventually people will be asking "what happens when 0.1% of the blockchain becomes prohibitively large". Secondly, nodes require most of the blockchain in order to verify transactions and calculate address balances. If they only held a small portion of the blockchain then every time they needed to verify a transaction or make a transaction they would need to search around for other nodes which had the data they needed, and that would dramatically increase the amount of network traffic. It's just a totally impracticable way of going about things.
legendary
Activity: 4424
Merit: 4794
Buy an 8 TB external HDD for the Bitcoin data directory and be prepared for the next 6 or 7 years of blockchain growth! Or buy a second-hand 8 TB drive 5 years from now for a couple of dollars.

Don't be silly with your bad maths.

Each block = 1 MB, so 6 blocks an hour, 24 hours a day = 144 MB a day.
365 days a year = 52,560 MB (about 52 GB).

That's 52 GB per year, so let's multiply that by 10 years... oh look, about half a terabyte, not 8 TB, but 16 times less.
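The same arithmetic as a quick sketch (Python, assuming the worst case where every single block is a full 1 MB):

Code:
# Worst-case growth: every block completely full at 1 MB.
MB_PER_BLOCK = 1
BLOCKS_PER_DAY = 6 * 24                    # roughly one block every ten minutes
mb_per_year = MB_PER_BLOCK * BLOCKS_PER_DAY * 365
print(f"per year: {mb_per_year:,} MB (~{mb_per_year / 1000:.0f} GB)")
print(f"after 10 years: ~{mb_per_year * 10 / 1_000_000:.2f} TB")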

To add another point: 10 years ago people thought a game using 1 GB was epic; now CoD and Crysis use more than 15 GB. So imagine in 10 years' time.

Hard drive capacity used to be 100 GB max ten years ago; now you can buy a 2 TB drive for under £/$100.

Imagine what will be the norm in 10 years.

member
Activity: 84
Merit: 10
Are you aware that they make bigger and bigger hard drives each year?  Cool
sr. member
Activity: 434
Merit: 250
In Hashrate We Trust!
The blockchain is a very primitive method of distributing data among many nodes.
Instead we need a distributed database.

Current numbers for the Bitcoin network:
Nodes = 100,000
Blocks = 300,000

Blockchain: each node keeps track of all blocks as a reference.

Proposal: a distributed database, where each node keeps track of random blocks of the blockchain.
Each node should probably keep at least 0.1% of the blockchain to prevent data corruption, data loss and double spending.
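For a sense of scale, a rough sketch of what those numbers would mean on average (the node and block counts above are round assumptions, and "random blocks" is taken to mean each node picks blocks uniformly at random):

Code:
# Rough averages for the proposal above: 100,000 nodes, 300,000 blocks,
# each node keeping a random 0.1% of the blocks.
NODES = 100_000
BLOCKS = 300_000
FRACTION_PER_NODE = 0.001

blocks_per_node = BLOCKS * FRACTION_PER_NODE    # blocks stored per node
copies_per_block = NODES * FRACTION_PER_NODE    # nodes holding any given block
print(f"each node stores ~{blocks_per_node:.0f} blocks")
print(f"each block is held by ~{copies_per_block:.0f} nodes on average")

So each node would store roughly 300 blocks, and each block would live on roughly 100 nodes.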
legendary
Activity: 952
Merit: 1005
--Signature Designs-- http://bit.ly/1Pjbx77
I don't like the huge size of the blockchain,
not because it takes too much space on my hard disk,
but because of the time it takes to download it (or update it).

Storage is cheap, and it will continue to get cheaper, so it is not a problem
It's a real hassle to wait and wait to synchronise...

Time is much more valuable Cheesy
legendary
Activity: 1540
Merit: 1000
This is becoming a serious problem now. Rather than adapting their code, programmers everywhere just shrug and expect their customers to buy high-end PCs that easily cost over £800; for a lot of people on a budget that simply isn't realistic. Games are a perfect example I'm worried about: rather than deciding on a certain number of polygons etc. to use, programmers now just vomit effects all over the screen and expect people to buy high-end systems to cope with it. That's sloppy coding practice and bad forethought, and it will also drive away a lot of customers who don't want to spend that money just to play one or two games on it, a bit like the whole console-exclusives problem.

Never mind games: if you do this to a business with software they depend on, and force them to buy several hundred pounds of equipment just to cope until the next release requires yet another machine, you're just going to get the middle finger. I think the Bitcoin dev team are more intelligent than this, but this is precisely why I support altcoins and wouldn't have taken an interest in Bitcoin if it weren't open source; somebody will find a more efficient solution to the blockchain taking up so much space if they won't.

You might make companies like Intel and AMD happy with that kind of attitude, but everybody else is going to hate it.

Yeah, there was a time when games were written in a relatively high-level language such as C++, but then they would go in and optimize specific sections of the code in a much lower-level language (typically assembly) to make it as efficient as possible. Nowadays the thinking is that since computers are so powerful, it's a waste of time and expertise to do that, so they just churn out sloppy code in a high-level language and waste a ton of resources. I do not like this trend either.

Also, the level of skill in the average programmer now vs 10 years ago is appalling... now just about any retard that managed to graduate college with even the most rudimentary understanding of programming can get a job.

I'm really glad I'm not the only one noticing this Tongue The thing is, though, I'm not even really a programmer, but it's so easy to tell the more you poke and prod at software these days.