when will that be feasible? probably not in the next 10 years, right?
That's 390 zettabytes. Various estimates (linked below) put global storage at around 175-200 zettabytes by 2025, so globally we will be storing 390 zettabytes by around 2030, I would imagine. How long will it take to turn the storage for 8 billion people into a medium which can be bought, owned, and operated by a single person? I would say well over 100 years.
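For what it's worth, a rough extrapolation lands in the same ballpark. The ~20% annual growth rate below is my own assumption for illustration, not a figure from the linked estimates:

```python
# Rough extrapolation of global storage capacity. The 2025 starting figure
# comes from the estimates mentioned above; the 20% yearly growth rate is
# an assumption for illustration only.
zb = 200.0                       # ~200 zettabytes of global storage in 2025
for year in range(2026, 2031):
    zb *= 1.20                   # assumed ~20% annual growth
    print(year, round(zb), "ZB")
# 2026: 240, 2027: 288, 2028: 346, 2029: 415 -> crosses 390 ZB around 2029-2030
```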
DNA could store that in about 3 kilograms, apparently. DNA data storage has its issues though, so it won't make the cut to users' desktops.
I'm sure when hard drives were first developed, writing to them also carried a high research cost. But obviously the price has to come down to reach consumers' desktops; it always does.
The price to read/write "DNA" will come down, but the cost to process that much data will exceed the resources available to process it. See the picture posted by NeuroticFish above.
in order to check n private keys, the computer would need to perform n calculations.
It's far more than a single calculation per private key to arrive at an address which can be checked for a balance. And if you don't want to perform those calculations every single time you want to check a balance, and would rather just have a list of addresses to look up, then you are going to need to multiply your storage capacity several times over if you want to cover every address type.
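To make that concrete, here is a minimal sketch of what deriving one legacy (P2PKH) address from one private key involves. It's pure Python for readability; a real scanner would use an optimized C library, but the steps are the same:

```python
# A minimal sketch of deriving one legacy P2PKH address from one private key.
import hashlib

# secp256k1 curve parameters
P  = 2**256 - 2**32 - 977                      # field prime
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def ec_add(a, b):
    """Add two curve points (None is the point at infinity)."""
    if a is None: return b
    if b is None: return a
    (x1, y1), (x2, y2) = a, b
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if a == b:
        m = 3 * x1 * x1 * pow(2 * y1, -1, P) % P
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Scalar multiplication by double-and-add: ~256 point operations,
    each of which is several 256-bit modular multiplications."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

def address_from_privkey(priv: int) -> str:
    # Step 1: EC multiplication -> public key (the expensive part)
    x, y = ec_mul(priv, (Gx, Gy))
    pubkey = (b'\x02' if y % 2 == 0 else b'\x03') + x.to_bytes(32, 'big')
    # Steps 2-3: SHA-256 then RIPEMD-160 of the public key
    # (hashlib.new('ripemd160') needs OpenSSL with RIPEMD-160 enabled)
    h160 = hashlib.new('ripemd160', hashlib.sha256(pubkey).digest()).digest()
    # Step 4: version byte + double-SHA-256 checksum
    data = b'\x00' + h160
    data += hashlib.sha256(hashlib.sha256(data).digest()).digest()[:4]
    # Step 5: Base58 encoding, preserving leading zero bytes as '1's
    n = int.from_bytes(data, 'big')
    alphabet = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz'
    out = ''
    while n:
        n, r = divmod(n, 58)
        out = alphabet[r] + out
    pad = len(data) - len(data.lstrip(b'\x00'))
    return '1' * pad + out

print(address_from_privkey(1))  # the well-known address for private key 1
```

And that is just one address type; P2SH, P2WPKH, and P2TR each need their own derivation on top of this.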
You are right. I was thinking in terms of Big O notation for the time complexity of calculating an address from a private key. So if you want to calculate j addresses from their private keys, you will perform p * j calculations, and if you want to calculate (j + 1) addresses from their private keys, you will need to perform p * (j + 1) calculations. Or, to put it another way, for every additional address you want to calculate from a private key, you will need to perform a constant additional number of calculations, with that constant p being a positive integer.
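As a toy illustration of that linear scaling (the per-key constant p below is made up for the example, not a measured figure):

```python
# Toy illustration of the O(j) scaling described above.
p = 5_000                      # assumed calculations per private key

def total_calculations(j: int) -> int:
    return p * j               # T(j) = p * j, i.e. linear in j

print(total_calculations(10))  # 50000
print(total_calculations(11))  # 55000 -> exactly p more calculations
```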