
Topic: FACT CHECK: Bitcoin Blockchain will be 700GB in 4 Years (Read 9345 times)

legendary
Activity: 1456
Merit: 1000
Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
Six degrees of separation / Kevin Bacon logic:
Knowing there are under 6000 nodes:
If everyone had 146 connections, then everyone would receive data within 2 hops (relays):
146*146 = 21316 nodes get it.
If everyone had 6 connections, then everyone would receive data within 5 hops (relays):
6*6*6*6*6 = 7776 nodes get it.
And so on:
9 connections means 4 hops (relays) needed: 9*9*9*9 = 6561 nodes get it.
18 connections means 3 hops (relays) needed: 18*18*18 = 5832 nodes get it.
75 connections means 2 hops (relays) needed: 75*75 = 5625 nodes get it.

I would suggest anything over 75 connections is overkill and only for pools.
And if you do have limited bandwidth, you can bring the numbers down to save bandwidth while still connecting to the network and sending data happily and healthily.
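The back-of-the-envelope numbers above follow from requiring connections^hops >= node count. A minimal sketch of that calculation (the `hops_needed` helper is hypothetical, and 5500 is roughly the full-node count mentioned later in the thread):

```python
import math

def hops_needed(connections, total_nodes):
    """Smallest h such that connections**h >= total_nodes,
    assuming an idealized gossip tree with no overlapping peers."""
    return math.ceil(math.log(total_nodes) / math.log(connections))

for c in (6, 9, 18, 75, 146):
    print(f"{c} connections -> {hops_needed(c, 5500)} hops")
```

With 18 connections this gives 3 hops, matching the 18*18*18 = 5832 figure above.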

@franky1

That guidance was put into the white paper, thanks.

It's all now part of a model that Satoshi predicted, but with some new stuff we've introduced to add encrypted security for those launching a Bitcoin full node using SPV clients.

We're also going to be looking at adding 2FA and hardware key generation, but that needs more time to spec out.

http://heliumpay.com
legendary
Activity: 2674
Merit: 3000
Terminated.
Going into Jan 2017, the 100GB prediction is accurate, so still on track for 700GB blockchain by the time of the next halving.

SegWit acceptance and or an increase in the block size will cement the prediction.
Wasn't your prediction already debunked, without SegWit or a block size increase?[1] The maximum it can grow per year is around 52 GB right now (assuming all blocks are 1 MB and found every 10 minutes on average). Because of the consensus limits, you can't assume a fixed percentage of monthly growth (since the absolute 'amount it grew' would have to keep increasing as well).

[1] I don't remember where we left this thread at and I don't want to read all that franky1 stuff to verify either.
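The ~52 GB ceiling can be checked directly (a sketch assuming the then-current consensus limits: every block exactly 1 MB, one block every 10 minutes on average):

```python
# Upper bound on yearly chain growth: 1 MB blocks, one every 10 minutes
blocks_per_year = 6 * 24 * 365              # 6 blocks/hour * 24 h * 365 d = 52,560
max_growth_gb = blocks_per_year * 1e6 / 1e9  # 1 MB per block, expressed in GB
print(f"max yearly growth: {max_growth_gb:.1f} GB")  # ~52.6 GB
```

In practice blocks are not all full, so actual growth stays below this bound.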

Isn't that a good thing?
-snip-
Obviously it's a bad thing, and the rest of your post is useless.

-snip-
You should stop spamming since you're very obvious.
jr. member
Activity: 36
Merit: 2
Depends on who gets their way. It could be much larger than that.
hero member
Activity: 1414
Merit: 505
Backed.Finance
Going into Jan 2017, the 100GB prediction is accurate, so still on track for 700GB blockchain by the time of the next halving.

SegWit acceptance and or an increase in the block size will cement the prediction.

That 700GB is a very large file to download. It will take time if your connection is slow, like most of us here (maybe). I think it's much better to use an online wallet, but the disadvantage is that you don't have full control over it.
full member
Activity: 784
Merit: 100
Isn't that a good thing?
Well, it means that every year Bitcoin users are increasing, and I think the blockchain will need more storage. I think it will get better in 2017, because in 2017 people have begun making their wallets, whether in blockchain or anywhere else.
legendary
Activity: 1456
Merit: 1000
Going into Jan 2017, the 100GB prediction is accurate, so still on track for 700GB blockchain by the time of the next halving.

SegWit acceptance and or an increase in the block size will cement the prediction.
legendary
Activity: 1456
Merit: 1000
OK, thanks.

It is on topic because of the optimal use of bandwidth, especially as the chain grows.
legendary
Activity: 4424
Merit: 4794
No, I mean in relation to optimal bitcoin node numbers / distribution.

You can come up with a variation of the 6 degrees of separation concept for bitcoin. Then I can give you attribution
We are digressing off the topic of bloat, but to answer your questions:

Many different devs say 9 is a nice round number for a minimum healthy node.

I'm not interested in attribution. Just use the theory from back in 1929 and make your own calculations.

But keep this in mind: the other thing about the degrees-of-separation theory is that the separations are not always organised.

E.g. with 3 degrees of separation, everyone can claim they are connected to 3 other people without having to mix with any new faces. The concept assumes it's all organised so that each person is a stranger (top part); the reality is that there are clusters of 4 where each person already knows the other 3 (bottom part), so both states are true.

Even in Bitcoin we can have connection loops (clusters), where a node a few hops away is connected back to a node that's also a few hops behind it, which is something that cannot be controlled much in a truly decentralised network. Bitcoin has some good ways to mitigate potential loops, but they're not perfect, and loops mess up the count of how many hops it really takes to reach everyone on the network, because some hops loop back on themselves.

But anyway, just use the theory from back in 1929 and make your own calculations.


legendary
Activity: 1456
Merit: 1000
Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
Six degrees of separation / Kevin Bacon logic:
Knowing there are under 6000 nodes:
If everyone had 146 connections, then everyone would receive data within 2 hops (relays):
146*146 = 21316 nodes get it.
If everyone had 6 connections, then everyone would receive data within 5 hops (relays):
6*6*6*6*6 = 7776 nodes get it.
And so on:
9 connections means 4 hops (relays) needed: 9*9*9*9 = 6561 nodes get it.
18 connections means 3 hops (relays) needed: 18*18*18 = 5832 nodes get it.
75 connections means 2 hops (relays) needed: 75*75 = 5625 nodes get it.

I would suggest anything over 75 connections is overkill and only for pools.
And if you do have limited bandwidth, you can bring the numbers down to save bandwidth while still connecting to the network and sending data happily and healthily.

Can I use your 6 degrees of separation in a white paper? How can I attribute you or how do you want to be attributed? Pm if preferred.

Six degrees of separation is a theory that has existed for decades; they have made movies about it and drinking games about it. Google it:
https://en.wikipedia.org/wiki/Six_degrees_of_separation
Quote
It was originally set out by Frigyes Karinthy in 1929

It's not my theory, so use the source (Frigyes Karinthy, 1929) and then make your own calculations.

No, I mean in relation to optimal bitcoin node numbers / distribution.

You can come up with a variation of the 6 degrees of separation concept for bitcoin. Then I can give you attribution
legendary
Activity: 4424
Merit: 4794
This post is useless and is not relevant at all to anything that has been said here. Just let the franky1 from the other shift do the talking, you're only ruining your own image.

Once again, there is nothing wrong with a pruned node and it has been clearly stated that it is different from a full node.
E.g. there is nothing wrong with Electrum, but Electrum's network status is not the same as a full node's.
There's no point trying to say that Core's pruned mode is the same as a full node by saying "there's nothing wrong".

What you don't say and don't clarify makes just as bad a case as what could have been said.

After all, people believe they are downloading Core because it's a full node, so they need to be informed that using Core's pruned mode does not give full-node status.

Also, trying to sell the fiction that everyone can be a full node without worrying about bloat is misleading; hiding the truth simply by "never saying" it doesn't make them a full node, because people think just running Core makes them a full node no matter what features they run or don't run.
E.g. saying that running an old node is perfectly fine, when it should be highlighted that updates reduce old nodes' full-node status.

Seems you are sounding more like Carlton again. I'm guessing he has retrained you as a fanboy sheep.

Please think more about USERS than you do about protecting Core. You're getting obviously too involved in the fanboy rhetoric, more so than in what USERS need to know to make educated decisions.

Being a full node means you have to keep the full data to be a full part of the network, meaning bloat (this topic) is relevant for full nodes.

Again, before Lauda derailed the topic: if you don't want to be a full node by storing the blockchain, you might as well just use Electrum.

End of.
legendary
Activity: 2674
Merit: 3000
Terminated.
if ""core" want to pretend its the "core" program for bitcoin. then core should stick to full node status.
if core want to offer light node. then core should be VERY CLEAR that pruning reduces status a node has in the network.

seriously stop shoe shining the devs..

you say core have not said X. but they are clearly not saying Y either. so X needs to be clarified to make people aware of Y
This post is useless and is not relevant at all to anything that has been said here. Just let the franky1 from the other shift do the talking, you're only ruining your own image.

Once again, there is nothing wrong with a pruned node and it has been clearly stated that it is different from a full node.
legendary
Activity: 4424
Merit: 4794
The only problem with pruned mode is that once you have pruned, you are no longer part of the ~5500 full nodes able to help the network.
Nobody claimed that they were, ergo strawman.

If people don't want to hold all the data, then just download Electrum; stop messing with full nodes!
There's nothing wrong with a pruned node.

if ""core" want to pretend its the "core" program for bitcoin. then core should stick to full node status.
if core want to offer light node. then core should be VERY CLEAR that pruning reduces status a node has in the network.

seriously stop shoe shining the devs..

you say core have not said X. but they are clearly not saying Y either. X needs to be clarified to make people aware of Y
EG
if they have not said pruned does not help the network. but also not clearly saying pruned enabling pruned reduces network support. then saying atleast something should be said.

but hey, brush it under the carpet like a good old fanboy right? and stick to the core is utopia rhetoric, right?

legendary
Activity: 2674
Merit: 3000
Terminated.
The problem with pruned mode is... you need to download the entire blockchain the first time in order to enable it, so the problem persists... will we be able to enable pruned mode without having to download all of the blockchain?
That's related to bandwidth then, not storage. I don't see how this is a problem though, as you're usually presented with two choices:
1) Run a full client and do full validation.
2) Run a SPV wallet without full validation.

The total blockchain size will keep growing, which will keep making option 1 harder.

The only problem with pruned mode is that once you have pruned, you are no longer part of the ~5500 full nodes able to help the network.
Nobody claimed that they were, ergo strawman.

If people don't want to hold all the data, then just download Electrum; stop messing with full nodes!
There's nothing wrong with a pruned node.
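For reference, pruning in Bitcoin Core is enabled with the `prune=` option in `bitcoin.conf` (the value is the target size of the retained block files in MiB; 550 is the documented minimum):

```ini
# bitcoin.conf -- keep only roughly the last 550 MiB of raw block files;
# the node still downloads and validates everything on initial sync,
# it just discards old blocks afterwards
prune=550
```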
legendary
Activity: 4424
Merit: 4794
The problem with pruned mode is... you need to download the entire blockchain the first time in order to enable it, so the problem persists... will we be able to enable pruned mode without having to download all of the blockchain?

The only problem with pruned mode is that once you have pruned, you are no longer part of the ~5500 full nodes able to help the network.
In layman's terms, no newbie to the network can leech the last 7 years of blockchain data from you, so you become second class, and people drop their connections to you because you can't hand them what they want.

This affects the six degrees of separation and also the distribution of the blockchain.

If people don't want to hold all the data, then just download Electrum; stop messing with full nodes! Edit: if you don't want to be a full node.
legendary
Activity: 1204
Merit: 1028
Devs should just build the clients with a built-in, highly pruned blockchain, and let that be the starting point for syncing old blocks when running that client.

Nodes don't need the history from day 1 to work; all they need is a starting point they know will agree with the rest of the network.
So exactly what is this supposed to do? Centralize Bitcoin in order to avoid downloading a part of the blockchain? That's a terrible idea. If you don't have enough storage space right now, you can run a pruned node.

it means placing some trust in the client you run, but that's already the case today, and it kinda always will be...
False.

The problem with pruned mode is... you need to download the entire blockchain the first time in order to enable it, so the problem persists... will we be able to enable pruned mode without having to download all of the blockchain?
legendary
Activity: 4424
Merit: 4794
Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
Six degrees of separation / Kevin Bacon logic:
Knowing there are under 6000 nodes:
If everyone had 146 connections, then everyone would receive data within 2 hops (relays):
146*146 = 21316 nodes get it.
If everyone had 6 connections, then everyone would receive data within 5 hops (relays):
6*6*6*6*6 = 7776 nodes get it.
And so on:
9 connections means 4 hops (relays) needed: 9*9*9*9 = 6561 nodes get it.
18 connections means 3 hops (relays) needed: 18*18*18 = 5832 nodes get it.
75 connections means 2 hops (relays) needed: 75*75 = 5625 nodes get it.

I would suggest anything over 75 connections is overkill and only for pools.
And if you do have limited bandwidth, you can bring the numbers down to save bandwidth while still connecting to the network and sending data happily and healthily.

Can I use your 6 degrees of separation in a white paper? How can I attribute you or how do you want to be attributed? Pm if preferred.

Six degrees of separation is a theory that has existed for decades; they have made movies about it and drinking games about it. Google it:
https://en.wikipedia.org/wiki/Six_degrees_of_separation
Quote
It was originally set out by Frigyes Karinthy in 1929

It's not my theory, so use the source (Frigyes Karinthy, 1929) and then make your own calculations.
legendary
Activity: 1456
Merit: 1000
Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
Six degrees of separation / Kevin Bacon logic:
Knowing there are under 6000 nodes:
If everyone had 146 connections, then everyone would receive data within 2 hops (relays):
146*146 = 21316 nodes get it.
If everyone had 6 connections, then everyone would receive data within 5 hops (relays):
6*6*6*6*6 = 7776 nodes get it.
And so on:
9 connections means 4 hops (relays) needed: 9*9*9*9 = 6561 nodes get it.
18 connections means 3 hops (relays) needed: 18*18*18 = 5832 nodes get it.
75 connections means 2 hops (relays) needed: 75*75 = 5625 nodes get it.

I would suggest anything over 75 connections is overkill and only for pools.
And if you do have limited bandwidth, you can bring the numbers down to save bandwidth while still connecting to the network and sending data happily and healthily.

Can I use your 6 degrees of separation in a white paper? How can I attribute you or how do you want to be attributed? Pm if preferred.
legendary
Activity: 4424
Merit: 4794
Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
Six degrees of separation / Kevin Bacon logic:
Knowing there are under 6000 nodes:
If everyone had 146 connections, then everyone would receive data within 2 hops (relays):
146*146 = 21316 nodes get it.
If everyone had 6 connections, then everyone would receive data within 5 hops (relays):
6*6*6*6*6 = 7776 nodes get it.
And so on:
9 connections means 4 hops (relays) needed: 9*9*9*9 = 6561 nodes get it.
18 connections means 3 hops (relays) needed: 18*18*18 = 5832 nodes get it.
75 connections means 2 hops (relays) needed: 75*75 = 5625 nodes get it.

I would suggest anything over 75 connections is overkill and only for pools.
And if you do have limited bandwidth, you can bring the numbers down to save bandwidth while still connecting to the network and sending data happily and healthily.
legendary
Activity: 3080
Merit: 1688
lose: unfind ... loose: untight
We are likely to be in a transition state where home connections are going to reach max capacity. The transition then is going to be towards professionally hosted nodes.

This guy on r/btc posted some stats about his node consumption. He has 200mbit download and 20mbit upload, which is a very good connection in the western world:

He is reaching his connection limits, but that does depend on how he sets his in / out connections.

Well, not every node requires 146 connections. While I commend that user on his beneficial fanout, it is little wonder that his meager 20Mb/s upload rate is a limitation. While I've not seen a survey, I would assume 'a handful of up connections and a balanced number of down connections' might be a more pervasive usage model. Maybe divide by ten?
legendary
Activity: 2114
Merit: 1090
=== NODE IS OK! ==
What do you mean? There is a block size limit, and blocks come at a predefined frequency governed by the difficulty adjustment.