
Topic: Block chain size/storage and slow downloads for new users

legendary
Activity: 1526
Merit: 1134
You can use either. Electrum relies on special/custom servers and offloads most of the work to them. MultiBit uses the regular P2P network. There are pros and cons to both models. I prefer the pure SPV/P2P model of course (which is why I implemented it!) but if you have some particular reason why you want to use Electrum then go for it.
newbie
Activity: 50
Merit: 0
Thanks for the complete explanation. This is really good reading; at least now I can tell some newbs why they should use MultiBit to start with and not worry about it. Smiley

A quick question: for what reasons should we use MultiBit instead of Electrum?
legendary
Activity: 2282
Merit: 1050
Monero Core Team
At some point Bitcoin-Qt will change such that it's able to delete old blocks. The details are still being worked out, but most likely you'll be able to say "Use up to 10 GB of disk space" and it will never use more than that. Nodes will broadcast how much of the chain they have and are able to serve. New nodes that are starting from scratch will have to search out other nodes that still have the full chain and sync from them, but any node that just wasn't online for a while and needs to grab the latest parts of the chain will be able to use most of the others. By controlling disk space usage, you can also indirectly control bandwidth usage (you can't upload data you don't have).

What happens if, a hundred years from now, almost everyone is using the "Use up to 10 GB of disk space" feature? All the old nodes can probably keep up, but new nodes starting from scratch won't be able to find a node with the full chain, or will have difficulty doing so.

Or is that a problem we shouldn't worry about for the next 50 years? I can see full nodes "centralizing" then, but it's possible that there will be entities or countries that maintain a full node independent of other full nodes.

Right now, everyone running Qt or the reference client is a full node.

I can see a far distant future where no normal or regular individual has a full node (except for the geeks, enthusiasts, and those who can afford it), but there will still be at least one full node per country, with the bigger countries having several, maybe one per university, one per "department" or "agency", or one per "private corporation".

I mean, large companies already maintain servers for all sorts of purposes. What's one more dedicated just to bitcoind with 500+ GB of space (or whatever the then-current capacity of hard drives or equivalent is)?

I disagree, because we are forgetting about the past in a very big way. While the size of the blockchain is growing at an exponential rate, the resources required to store and transmit data are falling at an even greater exponential rate, so running a full node will remain within the financial means of regular individuals who choose to do so for years to come. I actually expect the number of full nodes to increase over time. They will for the most part lie with those who choose freedom, Free Software and GNU/Linux, over those who choose subservience, DRM, proprietary software, and locked devices such as those running iOS and, recently, some running Microsoft Windows.

Instead of trying to look 100 years into the future, let us take a look 100 years into the past and ask the following question: how much would it have cost to send 1 MB of data over the telegraph network in 1913? Take, say, 0.50 USD per 10 words (http://eh.net/encyclopedia/article/nonnenmacher.industry.telegraphic.us), and let's say we can encode 4 bytes per word. This gives us 250,000 words, or 12,500 USD in 1913 dollars. That was way outside the budget of the average person, but something a millionaire (a billionaire in today's USD) such as J. P. Morgan (http://en.wikipedia.org/wiki/J._P._Morgan) could have afforded. Today, of course, sending 1 MB of data is so close to free that it is affordable even to some very poor people.
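A quick sanity check of that arithmetic, as a minimal Python sketch using the same assumed figures (0.50 USD per 10 words, 4 bytes per word):

Code:
# Back-of-envelope cost of sending 1 MB over the 1913 telegraph network,
# using the assumed rates from the paragraph above.
BYTES_PER_MB = 1000000
BYTES_PER_WORD = 4          # assumed encoding density
USD_PER_10_WORDS = 0.50     # assumed 1913 telegraph rate

words = BYTES_PER_MB / BYTES_PER_WORD            # 250,000 words
cost_1913_usd = words / 10 * USD_PER_10_WORDS    # 12,500 USD in 1913 dollars
print(int(words), "words,", cost_1913_usd, "USD (1913)")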

There is a real danger here for Bitcoin: that many, for fear of losing the decentralized nature of Bitcoin to a growing blockchain, will cripple Bitcoin with arbitrary limits, while ignoring the real threat to its decentralized nature. That threat is that in the future regular people will not be able to run a full Bitcoin node because their computing devices are locked down and infected with DRM. Cost will not be the limitation in the future; DRM will. Even today I can purchase a computer perfectly capable of running a full Bitcoin node, and pay for it with Bitcoin, at FreeGeek (http://www.freegeekvancouver.org/) for way less than the price of an iPad or Windows 8 RT "computer".


legendary
Activity: 1862
Merit: 1114
WalletScrutiny.com
OK, so let's assume one of Mike Hearn's microtransaction channels per person, with a one-year timeout, was used. This way only 2 transactions would need to hit the blockchain per person per year. All the rest could be instant, micro, off-chain, anonymous transactions without third-party risk. The dream of every Bitcoin user, but:
8 billion people * 2 kB = 16 TB of data per year. With about 50k blocks per year, that is 16 TB / 50k ≈ 320 MB per block. Let's get there in 10 years, keep incentives to move transactions off the blockchain, and nobody will complain. Even Satoshi Dice would work better with microtransaction channels anyway.
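The same back-of-envelope numbers as a minimal Python sketch (assuming ~1 kB per transaction and roughly 50,000 blocks per year):

Code:
# Two on-chain transactions per person per year at ~1 kB each,
# spread over ~50,000 blocks per year (assumed round figures).
people = 8 * 10**9
bytes_per_person_per_year = 2 * 1000
blocks_per_year = 50000

total_bytes = people * bytes_per_person_per_year     # 16 TB per year
bytes_per_block = total_bytes / blocks_per_year      # ~320 MB per block
print(total_bytes / 1e12, "TB/year,", bytes_per_block / 1e6, "MB/block")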
newbie
Activity: 53
Merit: 0
Thanks, it's nice to hear feedback like this and get a bit more understanding of the technical aspects.
newbie
Activity: 8
Merit: 0
Are there any stats someone could provide as to how much bandwidth and storage per month we are talking about to run a full node like Blockchain?
My "blocks" folder is currently at 11GB. Since block size is limited, the growth can only be linear in time (even if we occasionally increase the block size limit), but storage per dollar increases exponentially over time. Therefore, I don't see it as a problem in the long run. Also, see http://blockchain.info/charts/blocks-size

Bandwidth is currently a tougher question. I typically see bitcoin-qt using tens of kB/s, sometimes running at 200-300 kB/s for minutes or hours (this is mostly my upload to new users downloading old blocks). Unfortunately, I don't have any solid data to offer, just these observations.

Thank you, niko, for the info. Those numbers look light, all things considered.
hero member
Activity: 756
Merit: 501
There is more to Bitcoin than bitcoins.
Are there any stats someone could provide as to how much bandwidth and storage per month we are talking about to run a full node like Blockchain?
My "blocks" folder is currently at 11GB. Since block size is limited, the growth can only be linear in time (even if we occasionally increase the block size limit), but storage per dollar increases exponentially over time. Therefore, I don't see it as a problem in the long run. Also, see http://blockchain.info/charts/blocks-size

Bandwidth is currently a tougher question. I typically see bitcoin-qt using tens of kB/s, sometimes running at 200-300 kB/s for minutes or hours (this is mostly my upload to new users downloading old blocks). Unfortunately, I don't have any solid data to offer, just these observations.
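To put the "linear growth" point in rough numbers, here is a minimal Python sketch assuming the 1 MB block size limit and one block every ten minutes on average:

Code:
# Worst-case chain growth under the current 1 MB block size limit,
# assuming one block every 10 minutes.
block_limit_bytes = 1000000
blocks_per_year = 365 * 24 * 6                       # ~52,560 blocks
max_growth_gb = blocks_per_year * block_limit_bytes / 1e9
print(round(max_growth_gb, 1), "GB/year at full blocks")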
newbie
Activity: 8
Merit: 0
Are there any stats someone could provide as to how much bandwidth and storage per month we are talking about to run a full node like Blockchain?
hero member
Activity: 980
Merit: 500
FREE $50 BONUS - STAKE - [click signature]
At some point Bitcoin-Qt will change such that it's able to delete old blocks. The details are still being worked out, but most likely you'll be able to say "Use up to 10 GB of disk space" and it will never use more than that. Nodes will broadcast how much of the chain they have and are able to serve. New nodes that are starting from scratch will have to search out other nodes that still have the full chain and sync from them, but any node that just wasn't online for a while and needs to grab the latest parts of the chain will be able to use most of the others. By controlling disk space usage, you can also indirectly control bandwidth usage (you can't upload data you don't have).

What happens if, a hundred years from now, almost everyone is using the "Use up to 10 GB of disk space" feature? All the old nodes can probably keep up, but new nodes starting from scratch won't be able to find a node with the full chain, or will have difficulty doing so.

Or is that a problem we shouldn't worry about for the next 50 years? I can see full nodes "centralizing" then, but it's possible that there will be entities or countries that maintain a full node independent of other full nodes.

Right now, everyone running Qt or the reference client is a full node.

I can see a far distant future where no normal or regular individual has a full node (except for the geeks, enthusiasts, and those who can afford it), but there will still be at least one full node per country, with the bigger countries having several, maybe one per university, one per "department" or "agency", or one per "private corporation".

I mean, large companies already maintain servers for all sorts of purposes. What's one more dedicated just to bitcoind with 500+ GB of space (or whatever the then-current capacity of hard drives or equivalent is)?

Make full nodes paid (as in mining), but, to make them really spread around, make a rule of no more than x nodes per geographical area (city/county/country), per ISP, or per certain IP address range. That way it will be as decentralised as possible, with lots of people simply co-hosting from their web servers, rather than some big company datacenter full of servers holding a big chunk of the global node count. Also, the reward for hosting a node should be very small, just enough to cover average costs plus 5-10%.
legendary
Activity: 4760
Merit: 1283
...
tvbcof, much though I dislike industrial-scale spying, it has nothing to do with how the chain is stored or served in future. The chain is a public document by definition.

My comments were directed toward presenting a potential explanation of the 'mystery' of how 'public services' provided by corporate entities come to be.

Many of us never really saw the storage aspect of the block chain as much of a problem, and I'm pretty sure we've been through that before. Access to the data, both locally for functional purposes and over the WAN for catch-up operations, is a somewhat more salient concern, but it is surmountable. The defining issue here is real-time and near real-time economic activity on the network, and how core Bitcoin is going to evolve to support this. If it does so natively, I argue that it is almost certain to fall victim to abusive surveillance practices and be under constant threat of technical and legal attacks if it challenges other solutions.

That said, it is a perfectly valid point of view that most potential users don't really care about the privacy issues and the service providers are right to make a dime from their efforts anyway, and also that the solution does not necessarily need to present a challenge to other solutions if implemented 'correctly'.  It's not my point of view, but I can accept it as valid.

legendary
Activity: 1526
Merit: 1134
Yeah. That tech sounds cool, but even magnetic tape can store massive quantities of data for almost no money at all. You don't get fast access to it, of course, but to serve up the chain you don't need that; what you need is bulk streaming reads, which is exactly what tape robots provide.

tvbcof, much though I dislike industrial-scale spying, it has nothing to do with how the chain is stored or served in future. The chain is a public document by definition.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
In theory, Bitcoin can still operate even if every copy of the old parts of the chain is destroyed. It means that to start a new node, you have to copy the directory of an existing node that you trust to be correct. This is obviously a problem if you don't have a node that you trust, and it weakens the "trust nobody" aspect of Bitcoin somewhat, but payments would still flow.

In practice, yes, if the chain became absolutely gigantic then you'd get a small number of large organisations that commit to keeping it around as a kind of public service, I'm sure. Or they could sell copies of it on stacks of Blu-rays or magnetic tapes. Look at it this way: several organisations keep entire histories of the web (Google, archive.org, probably Microsoft as well). That's a titanic amount of data too. Somehow it happens anyway.

Such a data storage scenario seems highly unlikely, with this kind of tech emerging:
"The [nanostructured glass] allows unprecedented parameters including 360 TB/disc data capacity, thermal stability up to 1000°C and practically unlimited lifetime."
http://www.southampton.ac.uk/mediacentre/news/2013/jul/13_131.shtml

It really is transmission bandwidth, and the in-memory (RAM) requirements for handling the blockchain, that are the limiting factors.
legendary
Activity: 4760
Merit: 1283
In theory, Bitcoin can still operate even if every copy of the old parts of the chain is destroyed. It means that to start a new node, you have to copy the directory of an existing node that you trust to be correct. This is obviously a problem if you don't have a node that you trust, and it weakens the "trust nobody" aspect of Bitcoin somewhat, but payments would still flow.

In practice, yes, if the chain became absolutely gigantic then you'd get a small number of large organisations that commit to keeping it around as a kind of public service, I'm sure. Or they could sell copies of it on stacks of Blu-rays or magnetic tapes. Look at it this way: several organisations keep entire histories of the web (Google, archive.org, probably Microsoft as well). That's a titanic amount of data too. Somehow it happens anyway.

Typically it happens through monetizing the intelligence information coming off user access, and generally by enticing traffic toward other properties that do so, though in the case of archive.org I think they may be primarily a philanthropic organization. One way or another, if they are US-based it is a fair assumption that any network interaction will be analyzed by various state and private organizations. Indeed, unless one is exceedingly careful, that assumption should be made about any Internet activity anywhere.

Bringing things back to Bitcoin, I see this sort of utility as a threat to the solution. Any crypto-currency will be a rich source of intelligence information in proportion to its success. There is probably a role for a solution which welcomes such analysis for users who don't care, and a role for a different implementation which focuses on hardening against such analysis (and against the potential for sister monitoring and legal infrastructure to attack the solution), but each implementation will have one opportunity to decide which camp it wishes to be in. Being the first, Bitcoin gets the 'first round draft choice' here, though it looks to me like the principal movers of Bitcoin have already made the decision. For my part, I'm falling back into the mode of deciding to live with whatever decision is made...as if I have a choice...

legendary
Activity: 1526
Merit: 1134
In theory, Bitcoin can still operate even if every copy of the old parts of the chain is destroyed. It means that to start a new node, you have to copy the directory of an existing node that you trust to be correct. This is obviously a problem if you don't have a node that you trust, and it weakens the "trust nobody" aspect of Bitcoin somewhat, but payments would still flow.

In practice, yes, if the chain became absolutely gigantic then you'd get a small number of large organisations that commit to keeping it around as a kind of public service, I'm sure. Or they could sell copies of it on stacks of Blu-rays or magnetic tapes. Look at it this way: several organisations keep entire histories of the web (Google, archive.org, probably Microsoft as well). That's a titanic amount of data too. Somehow it happens anyway.
legendary
Activity: 3416
Merit: 1912
The Concierge of Crypto
At some point Bitcoin-Qt will change such that it's able to delete old blocks. The details are still being worked out, but most likely you'll be able to say "Use up to 10 GB of disk space" and it will never use more than that. Nodes will broadcast how much of the chain they have and are able to serve. New nodes that are starting from scratch will have to search out other nodes that still have the full chain and sync from them, but any node that just wasn't online for a while and needs to grab the latest parts of the chain will be able to use most of the others. By controlling disk space usage, you can also indirectly control bandwidth usage (you can't upload data you don't have).

What happens if, a hundred years from now, almost everyone is using the "Use up to 10 GB of disk space" feature? All the old nodes can probably keep up, but new nodes starting from scratch won't be able to find a node with the full chain, or will have difficulty doing so.

Or is that a problem we shouldn't worry about for the next 50 years? I can see full nodes "centralizing" then, but it's possible that there will be entities or countries that maintain a full node independent of other full nodes.

Right now, everyone running Qt or the reference client is a full node.

I can see a far distant future where no normal or regular individual has a full node (except for the geeks, enthusiasts, and those who can afford it), but there will still be at least one full node per country, with the bigger countries having several, maybe one per university, one per "department" or "agency", or one per "private corporation".

I mean, large companies already maintain servers for all sorts of purposes. What's one more dedicated just to bitcoind with 500+ GB of space (or whatever the then-current capacity of hard drives or equivalent is)?
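As a rough illustration of the "Use up to 10 GB of disk space" option quoted above, here is a minimal Python sketch (assuming worst-case full 1 MB blocks and one block every ten minutes):

Code:
# How much recent history a 10 GB pruning cap would hold,
# under assumed worst-case conditions (every block full at 1 MB).
cap_bytes = 10 * 10**9
block_bytes = 1000000
blocks_per_day = 144                        # one block every ~10 minutes

blocks_kept = cap_bytes / block_bytes       # ~10,000 blocks
days_kept = blocks_kept / blocks_per_day    # ~69 days of recent history
print(int(blocks_kept), "blocks,", round(days_kept, 1), "days")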
legendary
Activity: 4760
Merit: 1283
The more I think I understand about Bitcoin, the less I actually do! Undecided
Good read though!

Unsurprising. The 'marketing' is not what I would consider to be particularly up-front. Many of the claims about privacy, peer-to-peer, scalability, centralization, etc. were at best temporarily true. Many of the legitimate advocates of the solution honestly believed the 'hype', and very few of those who understood the technical aspects of the solution sufficiently to see the looming issues did much toward disabusing the crowd of some of the misconceptions. We all have at least a financial stake in growing the userbase, after all, and it would probably be disingenuous to neglect that as a factor in how the solution has been presented.

full member
Activity: 238
Merit: 100
KUPO!
The more I think I understand about Bitcoin, the less I actually do! Undecided
Good read though!
hero member
Activity: 675
Merit: 507
Freedom to choose
I have also found that updating my client to the current blockchain seems to "time out" sometimes, and I have to restart my wallet to get it to start downloading the chain again.
legendary
Activity: 1526
Merit: 1134
I did a 'Run as Administrator' (which is noted as being needed under Win8 - I'm on Win7 64 bit) and it started up and works.

That's weird. You don't have to run the app as administrator on MacOS or Linux. OK, it'd be nice to fix that. I guess the issue is directory permissions somewhere. If you start a thread in the MultiBit forum, perhaps Jim will explain a bit more.

Quote
Anyone working on a C implementation, preferably without a GUI?

Jeff Garzik started on one called picocoin. However, you're crazy if you prefer C to Java for this work. The Java installer on Windows is obnoxious and Jim shouldn't rely on a pre-installed JVM, but you can easily bundle a stripped-down VM with an app so that it doesn't come with any kind of crapware and is self-contained inside the application itself. The JVM is open source and has multiple competitors anyway, so this will always be an option.

Once you fix Oracle's bad distribution habits, the advantages are pretty major - specifically, a guarantee against buffer, heap or stack corruptions that could allow a remote peer to compromise your wallet. Software written in C has a long history of containing such bugs. Also, you don't have to write platform-specific code.

The right fix for the Java+SPV wallet problem is just to improve MultiBit so it bundles its own custom JVM instead of using the system-provided one.
legendary
Activity: 4760
Merit: 1283

Anyone working on a C implementation, preferably without a GUI?

I'd (with regret) run an SPV client in certain circumstances (compiled by myself on remote systems), but I'll never run anything Java, and I re-installed my one and only Windows machine from scratch some months ago to make sure I got completely rid of it. Among other things, I was creeped out by that 'Look' toolbar which got on there 'by accident' and could not be removed from Chrome. That was well before Snowden informed the rest of the world how vulnerable we all are to PRISM machinations and the like.
