where does he get this?:
But 50% per year growth is really good. According to my rough back-of-the-envelope calculations, my above-average home Internet connection and above-average home computer could easily support 5,000 transactions per second today.
That works out to 400 million transactions per day. Pretty good; every person in the US could make one Bitcoin transaction per day and I’d still be able to keep up.
After 12 years of bandwidth growth that becomes 56 billion transactions per day on my home network connection — enough for every single person in the world to make five or six bitcoin transactions every single day. It is hard to imagine that not being enough; according to the Boston Federal Reserve, the average US consumer makes just over two payments per day.
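For what it's worth, the arithmetic in that quote is at least internally consistent; here's a quick sketch using only the figures the quote itself assumes (5,000 tps today, 50% bandwidth growth per year, 12 years):

```python
# Sanity check on the quoted growth arithmetic. All inputs are the
# quote's own assumptions: 5,000 tps today, 50% growth per year.
TPS_TODAY = 5_000
SECONDS_PER_DAY = 86_400
GROWTH_PER_YEAR = 1.5
YEARS = 12

tx_per_day_now = TPS_TODAY * SECONDS_PER_DAY                  # 432,000,000 (the "400 million")
tx_per_day_later = tx_per_day_now * GROWTH_PER_YEAR ** YEARS  # ~5.6e10

print(f"today:          {tx_per_day_now:,.0f} tx/day")
print(f"after 12 years: {tx_per_day_later:,.0f} tx/day")      # ~56 billion, as claimed
```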
Using 5,000 tps and an average transaction size of 512 bytes (https://en.bitcoin.it/wiki/Scalability#Network), that is 2.44 MB/s, which would require a 19.53 megabit/s connection, which is indeed reasonable. Signature verification can be handled by a 2.2GHz i7 at a rate of 4k tps, so 5k isn't out of reach for a high-end CPU.
So yes, he's right that his computer can handle the throughput. What he doesn't mention is that this amount of data fills up ~206 GB of hard drive space per day while it is also saturating your bandwidth and CPU throughput.
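Here's a rough sketch of that back-of-the-envelope math (5,000 tps and 512-byte transactions, per the wiki figures above; the unit conversions are mine and approximate):

```python
# Bandwidth and raw storage at 5,000 tps with 512-byte transactions
# (rates from the linked wiki page; conversions are approximate).
TPS = 5_000
TX_SIZE_BYTES = 512
SECONDS_PER_DAY = 86_400

bytes_per_sec = TPS * TX_SIZE_BYTES                      # 2,560,000 B/s
mib_per_sec   = bytes_per_sec / 2**20                    # ~2.44 MiB/s
mbit_per_sec  = bytes_per_sec * 8 / 2**20                # ~19.5 "megabit"/s (binary), ~20.5 decimal Mbit/s
gib_per_day   = bytes_per_sec * SECONDS_PER_DAY / 2**30  # ~206 GiB/day
days_per_tib  = 2**40 / (bytes_per_sec * SECONDS_PER_DAY)

print(f"{mib_per_sec:.2f} MiB/s ({mbit_per_sec:.1f} Mbit/s line rate)")
print(f"{gib_per_day:.0f} GiB of new block data per day, i.e. ~1 TiB every {days_per_tib:.0f} days")
```

And that's only the raw transaction data; indexes and the UTXO set come on top of it.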
I may have a ridiculous setup with 6 hard drives and way more storage than I need, but even I can't handle 1 TB of new data every 5 days without laying out serious cash on a network storage solution for my mining computer to keep the blockchain on.
I agree, and I would like to add:
You cannot push 20-year-old unspent outputs off into an archive. If you are a miner, you need instant access to those bitcoins.
Don't be so sure that you won't be able to store 20 years of UTXO dust 15 years from now. You'll probably be able to do it in a $5 chip the size of a postage stamp.
http://www.tweaktown.com/reviews/6815/sandisk-ulltradimm-ddr3-400gb-ssd-enterprise-review/index.html
Storage density has been EXCEEDING Moore's law for the last 10+ years...

What's your source? Because my source (Computer Architecture: A Quantitative Approach, 5th edition, 2012) says that magnetic storage has been growing at about 40% per year since 2004:
Even flash is doubling every 2 years, not on the 18-month timeline of Moore's law.
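To put the rates being argued about here side by side, a quick sketch over 15 years (the 40%/yr, 2-year, and 18-month figures are the ones claimed in this thread, not independently verified):

```python
# Capacity growth over 15 years under the three rates mentioned above:
# ~40%/yr for magnetic disk, flash doubling every 2 years, and an
# 18-month "Moore's law" doubling. Rates are this thread's claims.
YEARS = 15

magnetic = 1.40 ** YEARS        # 40% per year
flash    = 2 ** (YEARS / 2)     # doubling every 2 years
moore    = 2 ** (YEARS / 1.5)   # doubling every 18 months

print(f"magnetic (40%/yr):      {magnetic:7.0f}x")   # ~156x
print(f"flash (2-yr doubling):  {flash:7.0f}x")      # ~181x
print(f"Moore (18-mo doubling): {moore:7.0f}x")      # ~1024x
```

Which curve you assume changes the 15-year answer by roughly an order of magnitude.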
Edit: not sure why my image is choking the proxy: http://yrral.net/storage_growth.png

I've been fiddling with computers since the early '80s. The growth in magnetic capacities reasonably available to me as a relatively interested enthusiast has fallen off a cliff. I posted about it in this thread earlier. So when it comes to Moore's Law, 'show me the beef', but don't lead me to NASA's supercomputer cluster to do so. Anyway, Bitcoin is in its tiny head-of-the-pin phase and it has already gone through significant phases of struggle.
I will also say that people who have not actually tried transferring big wads of data around the global internet should keep their mouths shut. That includes transferring said data chunks from outfits like Amazon through very high-end connectivity to outfits like {very large tech company here}. Even very experienced engineers who had not tried it were shocked when theory and practice didn't come anywhere close to meeting. It comes down to basic physics and how TCP/IP works. Yes, there are tricks to make the problems somewhat diminish, but there are no silver bullets here... or at least ones which are very applicable to lower-end users. Maybe there will be, or they will develop, but absent market demand it is not likely. Opening 10,000 sockets is not going to be very welcome by one's ISP (even if they do have an OS, router, and modem which handle it with grace).
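For readers who haven't hit this in practice: the "basic physics" here is largely the bandwidth-delay product. A single TCP stream can't push more than roughly one window of data per round trip, no matter how fat the pipe. A toy sketch (the window sizes and RTT below are illustrative, not measurements from any real transfer):

```python
# Single-stream TCP throughput is bounded by window_size / RTT
# (the bandwidth-delay product). Example values are illustrative only.
def max_tcp_throughput_mbit(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on one TCP stream's throughput, in Mbit/s."""
    return window_bytes * 8 / rtt_seconds / 1e6

# A classic 64 KiB window over a ~150 ms intercontinental path:
print(max_tcp_throughput_mbit(64 * 1024, 0.150))    # ~3.5 Mbit/s, regardless of link capacity
# Even a 1 MiB window over the same path:
print(max_tcp_throughput_mbit(1024 * 1024, 0.150))  # ~56 Mbit/s per stream
```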
I once asked Gavin if he thought about trying to get the testnet up to maximum capacity. Blank stare followed. I shit you not! That was back in 2011. Too bad nobody ever did try it, or the emergency hard fork for the BDB mis-config (NOT 'bug') might not have been such a shocker. At the very least one should try to run the 10-year-out capacities (adjusted for a realistic effect of Moore's Law) as a test and make sure they can get it working before inserting the exponential system growth that Gavin promotes. Not holding my breath for that one. I expect that TBF's party line will be 'Don't worry, code will improve.'
edits: minor