Topic: Once again, what about the scalability issue?

hero member
Activity: 658
Merit: 500
Sorry if it wasn't clear enough. The point of this thread is to get a solution to a problem that exists objectively. Not to argue with those who don't agree that the problem does exist.

Once again, I don't see you proposing any solution to this objective problem.
newbie
Activity: 25
Merit: 0
So the issue is internet speeds?

Yet I do not hear people complaining that they had to download 15 GB for last year's Call of Duty, 20 GB for this year's Call of Duty, and even more for the next Call of Duty via Steam.

Well, those games entertain them. It is not the same.
legendary
Activity: 2142
Merit: 1010
Newbie
There is no problem.

So the issue is internet speeds?

Yet I do not hear people complaining that they had to download 15 GB for last year's Call of Duty, 20 GB for this year's Call of Duty, and even more for the next Call of Duty via Steam.

Sorry if it wasn't clear enough. The point of this thread is to get a solution to a problem that exists objectively. Not to argue with those who don't agree that the problem does exist.
legendary
Activity: 4410
Merit: 4766
So the issue is internet speeds?

Yet I do not hear people complaining that they had to download 15 GB for last year's Call of Duty, 20 GB for this year's Call of Duty, and even more for the next Call of Duty via Steam.
legendary
Activity: 1036
Merit: 1000
Thug for life!
Blockchain size has crossed the 10000 MB mark. I think it's time to close this thread until we see 20000 MB...

Sorry for the bad timing; I missed the moment when the blockchain hit 20000 MB. It's larger than 22000 MB now. Could anyone point me to a solution to the problem (if one is implemented)?
There is no problem. Bandwidth available for ~$40 per month is increasing at a faster rate than the blockchain is growing; the same is true of both hard drive storage and RAM.
hero member
Activity: 658
Merit: 500
Good old BitcoinTalk... OK, I'll come back when we cross the 30 GB mark; maybe you'll have a solution by then.

You say that as if we're obliged to comply with you. Why don't you come up with something instead of complaining?
legendary
Activity: 2142
Merit: 1010
Newbie
Good old BitcoinTalk... OK, I'll come back when we cross the 30 GB mark; maybe you'll have a solution by then.
hero member
Activity: 510
Merit: 500
Blockchain size has crossed the 10000 MB mark. I think it's time to close this thread until we see 20000 MB...

Sorry for the bad timing; I missed the moment when the blockchain hit 20000 MB. It's larger than 22000 MB now. Could anyone point me to a solution to the problem (if one is implemented)?

Is this a joke, or are you serious? Honestly, you should try to keep up with the news or search for an answer.


:S This sounds like it could be a massive show stopper.

Maybe searching the forum and seeing that there is a plan of action means the show will continue:
https://bitcointalksearch.org/topic/gavin-andresen-proposes-bitcoin-hard-fork-to-address-network-scalability-816298
http://www.coindesk.com/gavin-andresen-bitcoin-hard-fork/


The average person can get bandwidth of 30 Mbps download and 5 Mbps upload for less than $45 per month in some areas of the U.S.

Nodes will be fine as storage becomes cheaper, paired with cheaper technology and ever-cheaper internet. Pretty soon phone carriers will be moving to 5G as the next thing, then 6G and 7G and whatever. I used to pay $87.99 a month for my 30 Mbps internet and now it's a lot cheaper. I also used to pay $79.99 for 15 Mbps internet. There is no problem, and you need to stop making things up. Point us to an actual problem if you are so convinced there is one. The potential problems you point out are not problems and never were; you just got a bunch of idiots commenting on them because they don't know any better.


/thread

In addition, these types of issues should be put into the context of an overall risk analysis. Just pulling out one issue and saying "something should be done" is not the way to manage risk. A first cut at such a report can be found at https://bitcoinfoundation.org/static/2014/04/Bitcoin-Risk-Management-Study-Spring-2014.pdf

I think they left out some risks posed by the Bitcoin Foundation itself, but it is a start.
hero member
Activity: 700
Merit: 500
Blockchain size has crossed the 10000 MB mark. I think it's time to close this thread until we see 20000 MB...

Sorry for the bad timing; I missed the moment when the blockchain hit 20000 MB. It's larger than 22000 MB now. Could anyone point me to a solution to the problem (if one is implemented)?

Is this a joke, or are you serious? Honestly, you should try to keep up with the news or search for an answer.


:S This sounds like it could be a massive show stopper.

Maybe searching the forum and seeing that there is a plan of action means the show will continue:
https://bitcointalksearch.org/topic/gavin-andresen-proposes-bitcoin-hard-fork-to-address-network-scalability-816298
http://www.coindesk.com/gavin-andresen-bitcoin-hard-fork/


The average person can get bandwidth of 30 Mbps download and 5 Mbps upload for less than $45 per month in some areas of the U.S.

Nodes will be fine as storage becomes cheaper, paired with cheaper technology and ever-cheaper internet. Pretty soon phone carriers will be moving to 5G as the next thing, then 6G and 7G and whatever. I used to pay $87.99 a month for my 30 Mbps internet and now it's a lot cheaper. I also used to pay $79.99 for 15 Mbps internet. There is no problem, and you need to stop making things up. Point us to an actual problem if you are so convinced there is one. The potential problems you point out are not problems and never were; you just got a bunch of idiots commenting on them because they don't know any better.


/thread
full member
Activity: 182
Merit: 100
★Bitin.io★ - Instant Exchange
It seems to me the Bitcoin Core devs prefer an ostrich policy. The blockchain keeps growing, pruning is not implemented yet (is it even possible, btw?), and Gavin spoke about everything except the scalability issue at the Bitcoin 2013 conference...
Is there any progress? Or is the game over?

Bitcoin's technology has developed very slowly up to now. Perhaps the developers are pleased with what they have already achieved, but that is not a good thing; they need to write high-quality code!
legendary
Activity: 2142
Merit: 1010
Newbie


This doesn't solve the bandwidth problem: blockchain size doubled within a year while connection speeds increased by only 50%. That is not sustainable even without raising the 7 TPS limit.
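A back-of-envelope sketch makes the divergence concrete (the starting figures and growth rates are assumptions taken from this thread, not measurements):

```python
# Rough model of the claim above: the chain doubles yearly while affordable
# bandwidth grows only ~50% yearly, so the time to sync a fresh full node
# from scratch keeps getting worse.
chain_gb = 22.0         # blockchain size now, per this thread (GB)
bandwidth_mbps = 30.0   # typical affordable downlink now (Mbit/s)

for year in range(6):
    sync_hours = chain_gb * 8000 / bandwidth_mbps / 3600  # GB -> Mbit -> hours
    print(f"year {year}: {chain_gb:7.0f} GB chain, "
          f"~{sync_hours:6.1f} h initial sync at {bandwidth_mbps:.0f} Mbit/s")
    chain_gb *= 2.0         # "size doubled within a year"
    bandwidth_mbps *= 1.5   # "connections increased only by 50%"
```

Under these assumptions, the initial-sync time grows by a factor of 2/1.5 ≈ 1.33 every year, so the gap compounds rather than closes.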

newbie
Activity: 28
Merit: 0
We need to advance technologically and managerially.

It's all bureaucracy at the moment.
legendary
Activity: 4410
Merit: 4766
Could anyone point me to a solution to the problem (if one is implemented)?

legendary
Activity: 2142
Merit: 1010
Newbie
Blockchain size has crossed the 10000 MB mark. I think it's time to close this thread until we see 20000 MB...

Sorry for the bad timing; I missed the moment when the blockchain hit 20000 MB. It's larger than 22000 MB now. Could anyone point me to a solution to the problem (if one is implemented)?
legendary
Activity: 1120
Merit: 1152
Oh really! That is interesting. I only looked at Litecoin early on and have lived under the assumption that it would have the same issues as Bitcoin (or worse) if it achieved the same utilization. Although there were other things to like about Litecoin, I've generally been too lazy to pay much attention since my initial scan. If they are getting serious about scalability, and about retaining lower-end users as a critical component of the network infrastructure, I had better take another look.

Warren Togami is the major driver of Litecoin development right now, and he's got very strong feelings about spam and scalability. I don't necessarily always agree with his approach to the issues on a technical level, but his heart is in the right place.

What is the best way to follow your work?

Anything concrete will be posted to the bitcoin-development mailing list.
legendary
Activity: 4690
Merit: 1276
FWIW, I've been asked to write a (funded) proposal to make improvements to UTXO scalability for Litecoin. I'm still working on the proposal - and won't be able to finish it until I finish a semi-related job for another client - but it looks like it will be an implementation of pruning, with nodes also storing some amount of archival data for bootstrapping. I've also got some more complex changes that would, for the most part, eliminate concerns about UTXO growth entirely at the expense of a soft-fork (though I have no idea whether the changes would be politically acceptable in Bitcoin). I'm still thinking through the latter, however, and how a pruning implementation would work in conjunction with it; I'll publish soonish. The ideal end-goal would be to eliminate the notion of an SPV client in exchange for a model where everyone validates/contributes between 0% and 100% of the blockchain resource effort and can pick that % smoothly.

Oh really! That is interesting. I only looked at Litecoin early on and have lived under the assumption that it would have the same issues as Bitcoin (or worse) if it achieved the same utilization. Although there were other things to like about Litecoin, I've generally been too lazy to pay much attention since my initial scan. If they are getting serious about scalability, and about retaining lower-end users as a critical component of the network infrastructure, I had better take another look.

What is the best way to follow your work?

legendary
Activity: 1498
Merit: 1000
The world will move to off-blockchain wallets and sites like inputs.io

Why wait? Do it now: use a bank!
legendary
Activity: 1120
Merit: 1152
The thing I was referring to was not blockchain compression (that's not making a big difference) but rather pruning, i.e. deleting old data from disk. There has been no progress on that front. Sipa worked on other things instead.

That is what I meant. It is too bad that it's not being worked on. Again, it was one of the things in the whitepaper that gave me some hope for the sustainability of the system as a more realistic community-maintained solution. I've suspected for some time now that once the perception of possible scalability in this respect was implanted, the goal of that text was achieved.

FWIW, I've been asked to write a (funded) proposal to make improvements to UTXO scalability for Litecoin. I'm still working on the proposal - and won't be able to finish it until I finish a semi-related job for another client - but it looks like it will be an implementation of pruning, with nodes also storing some amount of archival data for bootstrapping. I've also got some more complex changes that would, for the most part, eliminate concerns about UTXO growth entirely at the expense of a soft-fork (though I have no idea whether the changes would be politically acceptable in Bitcoin). I'm still thinking through the latter, however, and how a pruning implementation would work in conjunction with it; I'll publish soonish. The ideal end-goal would be to eliminate the notion of an SPV client in exchange for a model where everyone validates/contributes between 0% and 100% of the blockchain resource effort and can pick that % smoothly.
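To picture the "pick your %" end-goal, here is a minimal sketch of one way it could work (an illustration under assumed mechanics, not the poster's actual design): each node deterministically keeps a pseudo-random subset of blocks sized to its chosen storage fraction, so the archival burden spreads across many peers.

```python
import hashlib

def keeps_block(node_id: bytes, height: int, fraction: float) -> bool:
    """Hash (node id, block height) to a uniform value in [0, 1) and keep
    the block iff it falls under this node's chosen storage fraction."""
    digest = hashlib.sha256(node_id + height.to_bytes(4, "big")).digest()
    return int.from_bytes(digest[:8], "big") / 2**64 < fraction

# A node choosing to contribute ~10% of the archival storage effort:
kept = [h for h in range(300_000) if keeps_block(b"node-42", h, 0.10)]
print(f"keeps {len(kept)} of 300000 blocks ({100 * len(kept) / 300_000:.1f}%)")
```

Because the subset is a deterministic function of the node's identity, peers could predict which blocks to request from it, and varying `fraction` smoothly between 0 and 1 spans the whole range from SPV-like operation to full archival.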
legendary
Activity: 4690
Merit: 1276
The thing I was referring to was not blockchain compression (that's not making a big difference) but rather pruning, i.e. deleting old data from disk. There has been no progress on that front. Sipa worked on other things instead.

That is what I meant. It is too bad that it's not being worked on. Again, it was one of the things in the whitepaper that gave me some hope for the sustainability of the system as a more realistic community-maintained solution. I've suspected for some time now that once the perception of possible scalability in this respect was implanted, the goal of that text was achieved.

If your problem is that you can't afford to download even 10 GB of data, then you're better off using an SPV client instead.

It's easy to get spoiled when working on high-capacity networks and neglect to consider the various use cases. I've never argued that POTS or satellite should be supported as a baseline, but I have argued that if they could be, the result would be a system much more difficult to subvert. I personally don't think it is worth the tradeoff, though.

As soon as a simple SPV client implemented in a language which does not effectively require untrusted dependencies exists, I'll likely use it for certain things. Of course, it adds no value to the Bitcoin network other than 'headcount' perhaps, so I guess the operators of the network will be extracting value from clients like this in other ways.

I'm pretty sure almost any VPS could run bitcoind - where did you find a VPS that has <20 GB of disk and bandwidth?

I typically try to plan my infrastructure investments (which are often more about time than money) for a reasonable life expectancy. If the resource utilization rate is predictable, then this is possible. If, say, the transaction rate could change on a whim and necessitate a rapid escalation of the resources needed to deploy such a system, then there is less likelihood that I will bother in the first place. I doubt that I am alone in such a calculus.

 - edit: to answer your question, I was looking for 1) a system which would allow me to compile my own OS, and 2) a provider in a jurisdiction sufficiently miffed at the NSA spying that it may have taken real steps to prevent it and has the technical expertise to do so. I found myself here: http://nqhost.com/freebsd-vps.html. I also considered AWS micro instances, which may or may not work.

If you ran one on a VPS you could use an SPV client locally that connects to it, and that'd be an equivalent security level.

If I ran bitcoind on a VPS, there would be little or no need to run anything locally, at least on a VPS that I could have some confidence in from a security perspective (a big question mark in my mind at this time).
donator
Activity: 1218
Merit: 1079
Gerald Davis
If your problem is that you can't afford to download even 10 GB of data, then you're better off using an SPV client instead. I'm pretty sure almost any VPS could run bitcoind - where did you find a VPS that has <20 GB of disk and bandwidth? If you ran one on a VPS, you could use an SPV client locally that connects to it, and that would be an equivalent security level.

This is a good point, and one that I think will become more common in the future. In residential scenarios there is something called "the last mile". It is relatively easy to drop a multi-gigabit data connection into a neighborhood, but the installation and maintenance of the last mile into thousands of residences (from each of which you will only collect $30 to $100 monthly) is a bottleneck. The good news is that datacenter bandwidth is an order of magnitude cheaper and continues to get cheaper at a faster rate.

A full node has (relatively) high bandwidth requirements.
A user's personal transactions and confirmations have low bandwidth requirements.
Move the high-bandwidth portion to where bandwidth is both cheap and available (see the sketch below).
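A rough comparison of the two bandwidth profiles (all figures are illustrative assumptions, not measurements): a full node downloads every block and re-uploads it to peers, while an SPV-style client needs only 80-byte headers plus a few Merkle proofs for its own transactions.

```python
BLOCK_BYTES    = 1_000_000   # assumed ~1 MB per block
BLOCKS_PER_DAY = 144         # one block every ~10 minutes
RELAY_PEERS    = 8           # assumed peers each block is re-uploaded to
HEADER_BYTES   = 80
PROOFS_PER_DAY = 10          # assumed personal txs, ~500 B Merkle proof each

full_node = BLOCK_BYTES * BLOCKS_PER_DAY * (1 + RELAY_PEERS)
spv       = HEADER_BYTES * BLOCKS_PER_DAY + PROOFS_PER_DAY * 500

print(f"full node : ~{full_node / 1e9:.1f} GB/day (download + relay)")
print(f"SPV client: ~{spv / 1e3:.0f} KB/day")
```

The roughly five-orders-of-magnitude gap is the point: the heavy side can live in a datacenter while the user-facing side fits comfortably on a phone plan.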

I imagine we will even see the development of ultra-light clients which communicate with a specific trusted peer (probably one run by the user). For example, a user could have Bitcoin wallets on a mobile phone, laptop, desktop, and some hardware device, all of which communicate via an encrypted and authenticated channel to a full-node peer operated by the user. The best of both worlds.
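A minimal sketch of that pattern (the hostname, port, and certificate handling are placeholders, not a recommendation): a wallet that skips peer discovery entirely and opens a TLS connection only to the user's own full node.

```python
import socket
import ssl

# Placeholder address of the user's own full node on a VPS (assumption):
TRUSTED_NODE = ("mynode.example.org", 50002)

# No peer discovery at all: the wallet talks to this one machine only,
# over TLS, so the channel is encrypted and the endpoint is fixed.
ctx = ssl.create_default_context()
ctx.check_hostname = False       # self-signed certificate on one's own box...
ctx.verify_mode = ssl.CERT_NONE  # ...so pin the certificate out of band instead

with socket.create_connection(TRUSTED_NODE, timeout=10) as raw:
    with ctx.wrap_socket(raw, server_hostname=TRUSTED_NODE[0]) as conn:
        print("connected to trusted node:", conn.version())
        # ...speak the wallet<->node protocol of your choice from here
```

In practice, authentication would come from pinning the node's certificate or tunneling over SSH/VPN; the point is simply that every wallet the user owns terminates at one machine the user controls.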