
Topic: FACT CHECK: Bitcoin Blockchain will be 700GB in 4 Years - page 7. (Read 9345 times)

legendary
Activity: 1806
Merit: 1024
wow, it will take me almost a week to finish the download I suppose, rofl, or maybe even more than that

And that's the problem: not everyone will be able or willing to download the full blockchain. True decentralization can only be achieved if everyone has the same opportunity to download the full blockchain; if only rich countries can afford to run a full node, that is already the beginning of centralization. Basically back to square one.

Yes, it may become a problem. And the problem of bandwidth is much more pressing than that of storage, because most people in industrialized countries greatly overestimate the real upload speeds available worldwide. Many providers - even in countries with very good infrastructure - have explicit or implicit bandwidth limits, and in many developing countries sufficient bandwidth is either not reliably available or extremely expensive (sometimes even by Western standards, without even taking into account the significantly lower buying power of the population in those countries).

So to preserve decentralization, Bitcoin must be scaled in the least resource-expensive way possible. In my opinion, Segregated Witness and the Lightning Network are a step in the right direction, because they address the issue of microtransactions in a responsible way. Trying to store all transactions in blocks in the current format by allowing unlimited block sizes would be the end of a decentralized Bitcoin network.

ya.ya.yo!
legendary
Activity: 1148
Merit: 1000
wow, it will take me almost a week to finish the download I suppose, rofl, or maybe even more than that

And that's the problem: not everyone will be able or willing to download the full blockchain. True decentralization can only be achieved if everyone has the same opportunity to download the full blockchain; if only rich countries can afford to run a full node, that is already the beginning of centralization. Basically back to square one.
hero member
Activity: 1568
Merit: 511
wow, it will take me almost a week to finish the download I suppose, rofl, or maybe even more than that
legendary
Activity: 1456
Merit: 1000
Well, in 5 years or more the production process will be much more refined than it is right now, and the price of electronics will be considerably cheaper - I would say more than 50% cheaper than it is now. For example, the price of SSDs has almost halved in recent years, so I would assume the price of HDDs 5 or 10 years from now will be more or less dirt cheap. Plus, not everyone is running or planning to run a Bitcoin node on their home computer. Heck, you could buy a 5TB home server right now and not break the bank.

It's not Moore's Law that's the issue.

It's Nielsen's Law that's the issue: user bandwidth only grows by roughly 50% per year, noticeably slower than compute and storage.
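A rough sketch of what that gap means in practice - every starting figure below is an assumption of mine, and the ~4% per month chain growth is just the OP's rate, not a measurement:

Code:
# Initial sync, download time only (no validation): chain assumed to grow
# ~4% per month while home bandwidth follows Nielsen's Law (~50% per year).
chain_gb = 90.0        # assumed chain size today, in GB
bandwidth_mbps = 10.0  # assumed sustained download speed, in Mbit/s

for year in range(0, 11, 2):
    size_gb = chain_gb * 1.04 ** (12 * year)
    speed = bandwidth_mbps * 1.5 ** year
    hours = size_gb * 8000 / speed / 3600
    print("year %2d: %6.0f GB at %5.0f Mbit/s -> %5.1f h" % (year, size_gb, speed, hours))

Even with bandwidth growing 50% a year, the download time keeps creeping up, because 4% a month compounds to roughly 60% a year.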

legendary
Activity: 2604
Merit: 1036
Well, in 5 years or more the production process will be much more refined than it is right now, and the price of electronics will be considerably cheaper - I would say more than 50% cheaper than it is now. For example, the price of SSDs has almost halved in recent years, so I would assume the price of HDDs 5 or 10 years from now will be more or less dirt cheap. Plus, not everyone is running or planning to run a Bitcoin node on their home computer. Heck, you could buy a 5TB home server right now and not break the bank.
legendary
Activity: 2674
Merit: 3000
Terminated.
Imagine 100 gigabytes 20 years ago: terrifyingly large. It's not such a big deal of a file today, and in 20 years I imagine it will be a similar story.
We can't predict technological growth that far out into the future.

To be honest, I have found 100 gigs to be fairly large to download; however, it isn't all that big of a file once it is downloaded and ready to go. It might be something like 1/10th of a terabyte, but that isn't that bad if you do enough data management and so on.
It isn't just the download; it's also the validation time. A full client may occasionally get broken for whatever reason, and that usually requires a reindex, which takes a painful amount of time on non-high-end hardware.

Well, in the future only big miners will be mining Bitcoin...
This is also known as ignoratio elenchi (irrelevant conclusion).
sr. member
Activity: 434
Merit: 250
And what about in 10 or 20 years?

1000TB?? Does anyone have a longer-term projection on this?


Imagine 100 gigabytes 20 years ago: terrifyingly large. It's not such a big deal of a file today, and in 20 years I imagine it will be a similar story.
To be honest, I have found 100 gigs to be fairly large to download; however, it isn't all that big of a file once it is downloaded and ready to go. It might be something like 1/10th of a terabyte, but that isn't that bad if you do enough data management and so on.

In 20 years, chances are 20 terabytes will just be a moderate amount of data.

Hope the technology can grow exponentially too; however, it's 2016 and I only have a 300GB HDD... with a 5Mb internet connection, lol.

Well, in the future only big miners will be mining Bitcoin...
legendary
Activity: 1218
Merit: 1007
And what about in 10 or 20 years?

1000TB?? Does anyone have a longer-term projection on this?


Imagine 100 gigabytes 20 years ago: terrifyingly large. It's not such a big deal of a file today, and in 20 years I imagine it will be a similar story.
To be honest, I have found 100 gigs to be fairly large to download; however, it isn't all that big of a file once it is downloaded and ready to go. It might be something like 1/10th of a terabyte, but that isn't that bad if you do enough data management and so on.

In 20 years, chances are 20 terabytes will just be a moderate amount of data.
legendary
Activity: 1484
Merit: 1004
And what about in 10 or 20 years?

1000TB?? Does anyone have a longer-term projection on this?


Imagine 100 gigabytes 20 years ago: terrifyingly large. It's not such a big deal of a file today, and in 20 years I imagine it will be a similar story.
Yeah, 700GB 20 years from now will maybe feel like 20GB does today. Technology is growing so fast that you won't think of 700GB as a large size by then; people will find new technology that makes storing 700GB as cheap as possible. Really not a big deal.
legendary
Activity: 2674
Merit: 3000
Terminated.
And what about in 10 or 20 years?

1000TB?? Does anyone have a longer-term projection on this?
This question is pointless, and so would the follow-up questions be (30, 40, 50 years, etc.). Just do your own math; it's really simple: average expected block size x number of blocks per day x number of days per year x number of years you want. You won't get the most accurate projection, but you will get a rough picture and avoid redundant questions.
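For reference, the same formula in a few lines of Python; the block size and time span are placeholders that you substitute yourself, not a prediction:

Code:
# average expected block size x blocks per day x days per year x years
avg_block_mb = 1.0    # assumed average block size, in MB
blocks_per_day = 144  # one block roughly every 10 minutes
years = 10

added_gb = avg_block_mb * blocks_per_day * 365 * years / 1000.0
print("roughly %.0f GB added over %d years" % (added_gb, years))
# 1 MB blocks work out to about 525 GB per decade, on top of today's chain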
sr. member
Activity: 322
Merit: 250
And what about in 10 or 20 years?

1000TB?? Does anyone have a longer-term projection on this?

legendary
Activity: 2674
Merit: 3000
Terminated.
SegWit will rebuild the blocks... after 1 year.
And then the whole local blockchain will be reduced.
And then old clients will not be able to connect, in order to eradicate the 0-confirmation trap.

It's a revolution.

The local blockchain will be at 30 GB by the end of 2017.
What are you talking about? I'm pretty sure that nothing like that is going to happen.
legendary
Activity: 1512
Merit: 1012
SegWit will rebuild the blocks... after 1 year.
And then the whole local blockchain will be reduced.
And then old clients will not be able to connect, in order to eradicate the 0-confirmation trap.

It's a revolution.

The local blockchain will be at 30 GB by the end of 2017.

It's really a revolution.

legendary
Activity: 2674
Merit: 3000
Terminated.
Hey, thanks for chipping in. I'm concerned that if I apply and show good moderator skills, I might get asked to join the gang  Grin
I wouldn't worry about it, that isn't likely to happen just because you run self-moderated threads. Cheesy

Anyway, to your substantive point. SegWit is not yet implemented, so I want to see the evidence before making a change to my assumptions on the ~4% monthly growth rate.
False. SegWit is implemented in 0.13.0; there are just no activation parameters (i.e. it is not active). They're making some final changes for 0.13.1, which should be released soon.

Another example, to counter your point: if multi-sig keeps growing in popularity, does that wipe out the benefits of SegWit, making the need for a block size increase more likely?
Well, the same could be asked about a block size increase, could it not (e.g. what if a bigger average TX size becomes more popular)? Multisig with P2SH should be safer with SegWit, so I expect even more usage as well. Actually, if more people used multi-sig, SegWit should be able to provide a bit more headroom, IIRC.
Capacity is expected to land anywhere between 1.6 MB and 2 MB on average. The exact numbers are debatable and yet to be seen in practice (you can/could check testnet blocks). Someone did the math a few months back on the mailing list; I used to quote that, but can't find it at this time. If I do, I'll post it here.
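For what it's worth, here is the back-of-the-envelope version of where the 1.6 MB to 2 MB range comes from; the 4,000,000 weight limit is from BIP 141, but the witness fractions are just guesses on my part:

Code:
# BIP 141 caps block weight (base_size * 3 + total_size) at 4,000,000.
# If a fraction w of a typical block's bytes is witness data, the maximum
# serialized block size works out to roughly 4,000,000 / (4 - 3*w).
WEIGHT_LIMIT = 4000000

for label, w in [("mostly single-sig", 0.50), ("heavy multisig", 0.67)]:
    max_bytes = WEIGHT_LIMIT / (4 - 3 * w)
    print("%s: ~%.1f MB per block" % (label, max_bytes / 1e6))
# mostly single-sig: ~1.6 MB, heavy multisig: ~2.0 MB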
legendary
Activity: 1456
Merit: 1000
I have no idea how this thread didn't catch my eye yet. I guess there's too much crap in the section.

In some ways this is very technical, so it really requires a lot of analytical thinking.
No, all that this graph and 'calculation' requires is a brain.

But even as a not-so-technical person, I know this thing would really happen; and as we can see, computer processors keep improving, right? So I think their systems could still handle it.
Speculating on an ever-dying Moore's "law" (which isn't even a law in the traditional sense) can and will lead to trouble.

@OP: There's too much crap in this thread that isn't worth reading; please start using self-moderated threads in the future. That said, are you assuming a growth rate in %, or have you done your calculations based on an average block size of X per day/week/month? Did you factor in SegWit? If not, you will have to update that chart.

Hey, thanks for chipping in. I'm concerned that if I apply and show good moderator skills, I might get asked to join the gang  Grin

Anyway, to your substantive point. SegWit is not yet implemented, so I want to see the evidence before making a change to my assumptions on the ~4% monthly growth rate.

Another example, to counter your point: if multi-sig keeps growing in popularity, does that wipe out the benefits of SegWit, making the need for a block size increase more likely?

However, I have put out a challenge on the assumptions I have made...and so far you have made the best challenge. Thanks.
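To make that assumption concrete, this is all the projection boils down to; the starting size below is a placeholder rather than the exact figure from my chart:

Code:
# Compounding the assumed ~4% monthly growth rate on the current chain size.
size_now_gb = 90.0  # placeholder starting size, in GB
for years in (1, 2, 4):
    print("%d year(s): ~%.0f GB" % (years, size_now_gb * 1.04 ** (12 * years)))
# 48 months at 4% per month multiplies the starting size by about 6.6x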

legendary
Activity: 2674
Merit: 3000
Terminated.
I have no idea how this thread didn't catch my eye yet. I guess there's too much crap in the section.

In some ways this is very technical, so it really requires a lot of analytical thinking.
No, all that this graph and 'calculation' requires is a brain.

But even as a not-so-technical person, I know this thing would really happen; and as we can see, computer processors keep improving, right? So I think their systems could still handle it.
Speculating on an ever-dying Moore's "law" (which isn't even a law in the traditional sense) can and will lead to trouble.

@OP: There's too much crap in this thread that isn't worth reading; please start using self-moderated threads in the future. That said, are you assuming a growth rate in %, or have you done your calculations based on an average block size of X per day/week/month? Did you factor in SegWit? If not, you will have to update that chart.
legendary
Activity: 1456
Merit: 1000
Due to the continuous usage of Bitcoin transactions, it's not surprising to hear that the blockchain's size will be 700GB.
That's a normal size. Even a medium e-commerce site would generate 1TB of data in a year. I have worked on a live project, so I'd call that a normal size.

We are talking about Bitcoin generating 1TB every month, at current levels. So at 700GB it would generate 10TB per month.
full member
Activity: 210
Merit: 100
Due to the continuous usage of Bitcoin transactions, it's not surprising to hear that the blockchain's size will be 700GB.
That's a normal size. Even a medium e-commerce site would generate 1TB of data in a year. I have worked on a live project, so I'd call that a normal size.
hero member
Activity: 1246
Merit: 588
In some ways this is very technical, so it really requires a lot of analytical thinking. But even as a not-so-technical person, I know this thing would really happen; and as we can see, computer processors keep improving, right? So I think their systems could still handle it.
legendary
Activity: 4354
Merit: 3614
what is this "brake pedal" you speak of?
Yeah, that initial download is a killer. I have a 1.5 megabit connection - yes, megabit, not megabyte - and downloading it from scratch at this size is not realistic for me. The only reason I have it at all is that I started in 2013 or so, when the size wasn't too bad, and have kept up with it every day.

My bandwidth is so bad that I regularly back up the blockchain to several places, so I don't have to download it again if it gets corrupted; I just restore from the last backup, which I make about every week or so.
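In case it helps anyone, a minimal sketch of that weekly backup; the paths are assumptions for a default Linux setup (adjust them for yours), and bitcoind should be stopped before copying:

Code:
# Copy the node's block and chainstate directories to a dated backup folder.
# Run this only while bitcoind is stopped, or the copy may be inconsistent.
import shutil
from datetime import date
from pathlib import Path

datadir = Path.home() / ".bitcoin"         # assumed default data directory
backup_root = Path("/mnt/backup/bitcoin")  # assumed backup destination

dest = backup_root / date.today().isoformat()
for sub in ("blocks", "chainstate"):
    shutil.copytree(str(datadir / sub), str(dest / sub))
print("backed up to %s" % dest)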