
Topic: . - page 6. (Read 24771 times)

hero member
Activity: 546
Merit: 500
Warning: Confirmed Gavinista
February 10, 2016, 06:57:08 PM

... many people around here seem to think that operating a node is comparable to downloading a web page every 10 minutes.


You are the first person to mention that.  I don't think that anyone here believes that.

You've missed much of the discussion spanning back to summer of last year. You're new here. I won't hold it against you.

By the way, I was partly responding to this guy who was talking about the number of web pages he downloads in 10 minutes:

But I sometimes look at more than one webpage in 10 minutes.

Reading comprehension, people! It's real important when you're trying to decode what smarter people are trying to tell you.

Indeed. A lot of people either have trouble understanding the concept of "implication" or are playing dumb about it for the sake of argument. Hence why Nick Szabo's comment, "If you want to store your money on the web use Mt. Gox," went over so many people's heads.

Nah, you still don't make sense. So the domain for this discussion has been extended to include "everything written since summer 2015" - that's quite the leap. Let's just keep it to this thread - right back to the start, OK? Still, you are the only person I am aware of who thinks that *anyone* believes that running a node is equivalent to downloading a webpage every 10 minutes. Do you believe this yourself? Of course not, so why would you assume that anyone else would be that dickish or clueless?

I think your issue is that deep down you realise you are defending the indefensible and some part of your inner soul is rebelling at your conscious attempts to promote idiotic fallacies.

Do you agree that that is what is going on?  Or did someone say something in 2011 about something that we should be aware of?
legendary
Activity: 4424
Merit: 4794
February 10, 2016, 06:55:17 PM

Once we mitigate these bandwidth limitations, increasing the block size limit stops looking so dangerous.

Franky, all you've proven is that video-conferencing is a slightly better analogy for the upload requirements of running a bitcoin node than is downloading a web page. I don't need to "debunk Skype," as you put it, because the question is not whether "the internet cannot cope with their activity".

The question is absolutely not "can more than 2MB be uploaded?" The question is not "is it possible to run a node?" The question is: at what point do bandwidth limitations disincentivize the operation of full nodes to the extent that centralization endangers security and fungibility?


2MB, or even 2MB + segwit, is not an issue and is not a nuclear-bomb-level threat. So at this point in this topic about 2MB: CHILL OUT. There is no doomsday.
I do agree that 20MB would be a doomsday right now, but 2MB is not. So that hopefully ends your argument that 2MB is a problem.

Maybe save the doomsday arguments for if, within the next year, we suddenly debate 8-20MB. But until then, CHILL OUT.

As for the rest of your post:

We already know that over the past several years, as block size has gradually increased, the number of operating nodes has persistently fallen. Would you suggest that block size, which is directly related to bandwidth requirements for nodes, is not related to the perpetual decline in nodes?

Blaming it purely on bandwidth is false.
Many users are not running a full node 24/7, not because of bandwidth concerns, but because people are not day trading as much. People are not spending 3-10 times a day, so they see less point in keeping their computer on 24/7 if they are not actually going to use it.

E.g. in gaming: if you want to go away for half an hour and do something else, you just go AFK (away from keyboard) and minimise the game while keeping it running. But if you're only going to play the game for 10 minutes a day, you will log out and switch off the computer, because you know you won't be using it any time soon.

I've even seen it in IRC chat rooms: those who just go AFK return promptly and never log out of IRC, but when people say 'I'm going out for the day' they log out completely.

As a node operator, I can tell you that bandwidth is the only possible reason why I wouldn't run a node (as opposed to storage, hard disk resources, hardware). That's the only pressure. Most people do not have unlimited fiber connections. Most people have capped-bandwidth cable or low quality DSL. So the question is not, "can these people run a node, using much or all of their upload bandwidth? Or will they choose not to? The latter is what we must contend with -- and is related to the perpetual decline in nodes over the past couple years.

Again, internet speed is not an issue; the fear that it would be an issue is the issue. Even on standard DSL in 2011, people on livestreams/Twitch/Skype were fine, and 5 years later they are still fine. In fact, millions of people -- not 5,000, millions -- are making HD content, which is actually more bandwidth than my numbers in my last post.
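The comparison being argued over here can be put in rough back-of-envelope terms. The 30 MB Skype figure comes from earlier in the thread; the block size and peer count below are illustrative assumptions, not measurements.

```python
# Rough upload-volume comparison per 10-minute block interval.
# Assumptions (not measurements): a 2 MB block relayed in full to 8 peers,
# versus the ~30 MB of outgoing Skype video per 10 minutes cited in the thread.

BLOCK_MB = 2    # assumed block size under discussion
PEERS = 8       # assumed number of peers a node uploads each block to
SKYPE_MB = 30   # outgoing Skype video per 10 minutes, per the thread

node_upload_mb = BLOCK_MB * PEERS  # worst case: full block to every peer
print(f"node relay upload per block: {node_upload_mb} MB")  # 16 MB
print(f"Skype upload per 10 minutes: {SKYPE_MB} MB")        # 30 MB
```

Under these assumptions, a node's worst-case relay upload per interval is in the same ballpark as the video-call figure; real relay protocols send far less than a full block copy to most peers, so this is an upper bound.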


Now here is my doomsday theory about the lack of nodes:
Segwit, with its promise of non-witness relay, is going to wrongly tempt people to run in compatibility mode and not archival mode. That alone will kill off more full-node (archival-mode) users than any bandwidth concerns will, purely because they won't be holding the full data needed to be classed as full nodes; even if they run 24/7, they are still not going to be full nodes.

What needs to be done is to chill out about bandwidth doomsdays, and instead highlight how important it is for full nodes to run archival mode under segwit, and also to drum up new uses of bitcoin so that people want to use bitcoin more than once a day and are tempted to leave their computers on 24/7. Things like increasing the transaction fee are not a good way to tempt people to spend more often; they are more likely to wait it out and spend more wisely and less often, losing the desire to be part of the network.
full member
Activity: 154
Merit: 100
February 10, 2016, 06:53:08 PM
... Hence why Nick Szabo's comment, "If you want to store your money on the web use Mt. Gox," went over so many people's heads.

How about we stop talking about "non-mining nodes" and use more appropriate nomenclature, like "secure wallets"?
Because otherwise, seems like I should worry whether someone other than myself chooses to use a web wallet, "run a full node," or use an exchange as their wallet.

Quote
[decline in the number of secure wallets is] sad, as it defeats the purpose of bitcoin ...
Bitcoin's purpose du jour being what?
sr. member
Activity: 400
Merit: 250
February 10, 2016, 06:50:57 PM

The question is absolutely not "can more than 2MB be uploaded?" The question is not "is it possible to run a node?" The question is: at what point do bandwidth limitations disincentivize the operation of full nodes to the extent that centralization endangers security and fungibility? We already know that over the past several years, as block size has gradually increased, the number of operating nodes has persistently fallen. Would you suggest that block size, which is directly related to bandwidth requirements for nodes, is not related to the perpetual decline in nodes? As a node operator, I can tell you that bandwidth is the only possible reason why I wouldn't run a node (as opposed to storage, hard disk resources, or hardware). That's the only pressure. Most people do not have unlimited fiber connections. Most people have capped-bandwidth cable or low-quality DSL. So the question is not "can these people run a node, using much or all of their upload bandwidth?" but "will they choose not to?" The latter is what we must contend with -- and it is related to the perpetual decline in nodes over the past couple of years.


The falling full-node count is simply a result of people preferring to use off-chain web wallets because of their simplicity and convenience (before, they were using full nodes as their wallet). In fact, 99% of my transactions nowadays happen between exchanges and web wallets; my node is never used to process transactions, but simply works as a dedicated mining/relay node.

Any proof of this, or just based on your personal anecdote?

This trend will continue. The majority of average users will often forget their password or get their phone/computer hacked easily, so they prefer a human-maintained service where they can reset the password or use 2FA login to increase security (and in case they lose the 2FA device, they also need human support to reset it).

That's sad, as it defeats the purpose of bitcoin, but...SPV and off-chain solutions are fine as long as the number of full nodes remains healthy enough. It's not about growth in SPV or off-chain usage -- it's about decline in full nodes. You can talk all day about how SPV nodes are growing, but if in the end full nodes become highly centralized in the process, bitcoin is susceptible to many security failures.

On the other hand, thousands of enthusiasts with enough IT expertise will set up dedicated full nodes around the world. They understand bitcoin much better than the average Joe, and they will make sure that all these nodes meet the technical requirements of the current block size limit.

Will they? It's not about whether they understand the technical requirements -- it's whether they are willing to sacrifice the bandwidth at any cost. How do you know?
sr. member
Activity: 400
Merit: 250
February 10, 2016, 06:45:09 PM

... many people around here seem to think that operating a node is comparable to downloading a web page every 10 minutes.


You are the first person to mention that.  I don't think that anyone here believes that.

You've missed much of the discussion spanning back to summer of last year. You're new here. I won't hold it against you.

By the way, I was partly responding to this guy who was talking about the number of web pages he downloads in 10 minutes:

But I sometimes look at more than one webpage in 10 minutes.

Reading comprehension, people! It's real important when you're trying to decode what smarter people are trying to tell you.

Indeed. A lot of people either have trouble understanding the concept of "implication" or are playing dumb about it for the sake of argument. Hence why Nick Szabo's comment, "If you want to store your money on the web use Mt. Gox," went over so many people's heads.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
February 10, 2016, 06:25:24 PM

The question is absolutely not "can more than 2MB be uploaded?" The question is not "is it possible to run a node?" The question is: at what point do bandwidth limitations disincentivize the operation of full nodes to the extent that centralization endangers security and fungibility? We already know that over the past several years, as block size has gradually increased, the number of operating nodes has persistently fallen. Would you suggest that block size, which is directly related to bandwidth requirements for nodes, is not related to the perpetual decline in nodes? As a node operator, I can tell you that bandwidth is the only possible reason why I wouldn't run a node (as opposed to storage, hard disk resources, or hardware). That's the only pressure. Most people do not have unlimited fiber connections. Most people have capped-bandwidth cable or low-quality DSL. So the question is not "can these people run a node, using much or all of their upload bandwidth?" but "will they choose not to?" The latter is what we must contend with -- and it is related to the perpetual decline in nodes over the past couple of years.


The falling full-node count is simply a result of people preferring to use off-chain web wallets because of their simplicity and convenience (before, they were using full nodes as their wallet). In fact, 99% of my transactions nowadays happen between exchanges and web wallets; my node is never used to process transactions, but simply works as a dedicated mining/relay node.

This trend will continue. The majority of average users will often forget their password or get their phone/computer hacked easily, so they prefer a human-maintained service where they can reset the password or use 2FA login to increase security (and in case they lose the 2FA device, they also need human support to reset it).

On the other hand, thousands of enthusiasts with enough IT expertise will set up dedicated full nodes around the world. They understand bitcoin much better than the average Joe, and they will make sure that all these nodes meet the technical requirements of the current block size limit.
legendary
Activity: 1554
Merit: 1014
Make Bitcoin glow with ENIAC
February 10, 2016, 06:01:38 PM
But I sometimes look at more than one webpage in 10 minutes.

That's still just another single download. Think "single download vs. many uploads," and bear in mind that upload bandwidth is more limited.

10 minutes is just the target time. Sometimes valid blocks are published within seconds of one another.

It was an understatement.

It wasn't an analogy. It was an example to show how small an amount of data it is.

Not sure about that. In any case, it's an important topic to discuss since many people around here seem to think that operating a node is comparable to downloading a web page every 10 minutes. That leads to a fundamental misunderstanding of bitcoin's capacity limitations.

Reading comprehension, people! It's real important when you're trying to decode what smarter people are trying to tell you.
full member
Activity: 154
Merit: 100
February 10, 2016, 05:52:23 PM
...
The question is absolutely not "can more than 2MB be uploaded?" The question is not "is it possible to run a node?" The question is: at what point do bandwidth limitations disincentivize the operation of full nodes to the extent that centralization endangers security and fungibility? We already know that over the past several years, as block size has gradually increased, the number of operating nodes has persistently fallen. Would you suggest that block size, which is directly related to bandwidth requirements for nodes, is not related to the perpetual decline in nodes? As a node operator, I can tell you that bandwidth is the only possible reason why I wouldn't run a node (as opposed to storage, hard disk resources, or hardware). That's the only pressure. Most people do not have unlimited fiber connections. Most people have capped-bandwidth cable or low-quality DSL. So the question is not "can these people run a node, using much or all of their upload bandwidth?" but "will they choose not to?" The latter is what we must contend with -- and it is related to the perpetual decline in nodes over the past couple of years.

The tragedy of vanishing non-mining nodes can not be overstated.
sr. member
Activity: 400
Merit: 250
February 10, 2016, 05:48:01 PM
The issue is the bandwidth load that individual nodes take on, which due to this "massive redundancy" is not at all comparable to downloading a web page. This speaks to the misinformed comparisons that XT/Classic supporters have often made between a single download stream and many upload streams -- hence why "web page" vs. "block size" makes for such a horrible analogy.

Skype video call = 30 MB per 10 minutes, outgoing (UPLOAD)
live streaming, outgoing (UPLOAD)
online gaming, outgoing (UPLOAD)

the list goes on..

Good luck debunking the claim that bitcoin cannot scale purely because the internet can't cope with web pages, when real examples of things far more data-heavy than web pages are being uploaded from users' HOMES all the time. And I'm not talking about single instances of 5 seconds; I'm talking about constant data transmissions that last for hours, which proves that in a 10-minute window more than 2MB can be uploaded.

If that's not good enough, here is another solution:
Miners can set up 'blind relay nodes': nodes dotted around the internet used purely to relay data without checking, because the miners are sending the same data out to their own blind relays, so there is no point in those relays re-checking it.

Imagine it: a miner sends its data to 7 blind relays, and those blind relays each send it out to 7 other nodes (49 total). The time saving is noticeable compared to one node sending out data to 49 nodes directly.

If you want to debunk Skype, online gaming and live-stream uploads, maybe you should call those companies first and tell them that the internet cannot cope with their activity, and that millions of people cannot livestream, game online or make video calls, before you post another comment on this topic.

I never said that bitcoin cannot scale. I never said that the internet cannot cope with webpages...

I said that bitcoin has capacity limitations that are linked to bandwidth limitations for individual nodes. Further optimizations like IBLTs and weak blocks could greatly mitigate those limitations (and over time, infrastructural limits to upload bandwidth should improve as well).

Quote
IBLTs and weak blocks: 90% or more reduction in critical bandwidth to relay blocks created by miners who want their blocks to propagate quickly with a modest increase in total bandwidth, bringing many of the benefits of the Bitcoin Relay Network to all full nodes. This improvement is accomplished by spreading bandwidth usage out over time for full nodes, which means IBLT and weak blocks may allow for safer future increases to the max block size.

Once we mitigate these bandwidth limitations, increasing the block size limit stops looking so dangerous.

Franky, all you've proven is that video-conferencing is a slightly better analogy for the upload requirements of running a bitcoin node than is downloading a web page. I don't need to "debunk Skype," as you put it, because the question is not whether "the internet cannot cope with their activity".

The question is absolutely not "can more than 2MB be uploaded?" The question is not "is it possible to run a node?" The question is: at what point do bandwidth limitations disincentivize the operation of full nodes to the extent that centralization endangers security and fungibility? We already know that over the past several years, as block size has gradually increased, the number of operating nodes has persistently fallen. Would you suggest that block size, which is directly related to bandwidth requirements for nodes, is not related to the perpetual decline in nodes? As a node operator, I can tell you that bandwidth is the only possible reason why I wouldn't run a node (as opposed to storage, hard disk resources, or hardware). That's the only pressure. Most people do not have unlimited fiber connections. Most people have capped-bandwidth cable or low-quality DSL. So the question is not "can these people run a node, using much or all of their upload bandwidth?" but "will they choose not to?" The latter is what we must contend with -- and it is related to the perpetual decline in nodes over the past couple of years.
hero member
Activity: 546
Merit: 500
Warning: Confirmed Gavinista
February 10, 2016, 05:09:59 PM

... many people around here seem to think that operating a node is comparable to downloading a web page every 10 minutes.


You are the first person to mention that.  I don't think that anyone here believes that.
legendary
Activity: 4424
Merit: 4794
February 10, 2016, 05:02:29 PM
The issue is the bandwidth load that individual nodes take on, which due to this "massive redundancy" is not at all comparable to downloading a web page. This speaks to the misinformed comparisons that XT/Classic supporters have often made between a single download stream and many upload streams -- hence why "web page" vs. "block size" makes for such a horrible analogy.

Skype video call = 30 MB per 10 minutes, outgoing (UPLOAD)
live streaming, outgoing (UPLOAD)
online gaming, outgoing (UPLOAD)

the list goes on..

Good luck debunking the claim that bitcoin cannot scale purely because the internet can't cope with web pages, when real examples of things far more data-heavy than web pages are being uploaded from users' HOMES all the time. And I'm not talking about single instances of 5 seconds; I'm talking about constant data transmissions that last for hours, which proves that in a 10-minute window more than 2MB can be uploaded.

If that's not good enough, here is another solution:
Miners can set up 'blind relay nodes': nodes dotted around the internet used purely to relay data without checking, because the miners are sending the same data out to their own blind relays, so there is no point in those relays re-checking it.

Imagine it: a miner sends its data to 7 blind relays, and those blind relays each send it out to 7 other nodes (49 total). The time saving is noticeable compared to one node sending out data to 49 nodes directly.

If you want to debunk Skype, online gaming and live-stream uploads, maybe you should call those companies first and tell them that the internet cannot cope with their activity, and that millions of people cannot livestream, game online or make video calls, before you post another comment on this topic.
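The two-level fan-out argument above can be sketched numerically. Everything here is a hypothetical assumption for illustration only: a 2 MB block, a 1 MB/s uplink per node, and fully serialized uploads on each link.

```python
# Sketch of the 'blind relay' fan-out: one miner uploading a block to 49
# peers directly, versus sending it to 7 relays that each forward it to 7
# more nodes in parallel. All figures are illustrative assumptions.

BLOCK_MB = 2.0      # assumed block size
UPLOAD_MBPS = 1.0   # assumed upload capacity per node, MB/s

def serial_upload_time(copies: int) -> float:
    """Seconds to push `copies` full copies of the block through one uplink."""
    return copies * BLOCK_MB / UPLOAD_MBPS

direct_s = serial_upload_time(49)                         # miner -> 49 peers
fanout_s = serial_upload_time(7) + serial_upload_time(7)  # miner -> 7 relays,
                                                          # relays -> 7 each, in parallel
print(f"direct: {direct_s:.0f}s, fan-out: {fanout_s:.0f}s")  # direct: 98s, fan-out: 28s
```

Under these assumed numbers, the two-level fan-out finishes in roughly a third of the time of a direct broadcast, which is the time saving the post is pointing at; real propagation overlaps transfers and sends compact data, so both figures overstate the cost.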
sr. member
Activity: 400
Merit: 250
February 10, 2016, 04:57:18 PM
But I sometimes look at more than one webpage in 10 minutes.

That's still just another single download. Think "single download vs. many uploads," and bear in mind that upload bandwidth is more limited.

10 minutes is just the target time. Sometimes valid blocks are published within seconds of one another.

It wasn't an analogy. It was an example to show how small an amount of data it is.

Not sure about that. In any case, it's an important topic to discuss since many people around here seem to think that operating a node is comparable to downloading a web page every 10 minutes. That leads to a fundamental misunderstanding of bitcoin's capacity limitations.
legendary
Activity: 1554
Merit: 1014
Make Bitcoin glow with ENIAC
February 10, 2016, 04:34:24 PM
You're right -- "massively" is probably more appropriate.

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on?

That was already answered.

Could you explain it? Because I don't think it was.

A miner can broadcast a solved block to as many or few nodes as they want.  Those nodes in turn will broadcast to other nodes.  That is the multiplier effect I am referring to. 

Sure, that's just how bitcoin works, like other networks. No one ever said every node in existence has to constantly communicate with every other node in existence. That's just an arbitrary comparison. What you're talking about doesn't mitigate this:

Quote
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.

The issue is the bandwidth load that individual nodes take on, which due to this "massive redundancy" is not at all comparable to downloading a web page. This speaks to the misinformed comparisons that XT/Classic supporters have often made between a single download stream and many upload streams -- hence why "web page" vs. "block size" makes for such a horrible analogy.
...
..
.

But I sometimes look at more than one webpage in 10 minutes.

It wasn't an analogy. It was an example to show how small an amount of data it is.
sr. member
Activity: 400
Merit: 250
February 10, 2016, 04:29:24 PM
You're right -- "massively" is probably more appropriate.

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on?

That was already answered.

Could you explain it? Because I don't think it was.

A miner can broadcast a solved block to as many or few nodes as they want.  Those nodes in turn will broadcast to other nodes.  That is the multiplier effect I am referring to. 

Sure, that's just how bitcoin works, like other networks. No one ever said every node in existence has to constantly communicate with every other node in existence. That's just an arbitrary comparison. What you're talking about doesn't mitigate this:

Quote
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.

The issue is the bandwidth load that individual nodes take on, which due to this "massive redundancy" is not at all comparable to downloading a web page. This speaks to the misinformed comparisons that XT/Classic supporters have often made between a single instance of download stream with many instances of upload stream -- hence why "web page" vs. "block size" makes for such a horrible analogy.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
February 10, 2016, 03:17:11 PM
Increase demands relative to what?

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.

So now we went from 'massively' redundant to 'quite' redundant. Ok.

Yes, nodes have to validate; that is a good point. But we still get the multiplier effect, where information gets propagated exponentially. Also, we have the relay networks. Greg is so fond of pointing these out in the context of the fee market discussion, where it serves his arguments about the lack of orphaning risk, so why ignore them here when they can help propagation?

Regardless, 2MB is still small when it comes to internet bandwidth. (You initially made it sound like 2MB would have to be uploaded thousands of times by a single node.)

You're right -- "massively" is probably more appropriate.

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on?

That was already answered.

Could you explain it? Because I don't think it was.

A miner can broadcast a solved block to as many or few nodes as they want.  Those nodes in turn will broadcast to other nodes.  That is the multiplier effect I am referring to. 
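The multiplier effect described above can be made concrete: if each node forwards a newly solved block to f fresh peers, coverage grows geometrically with each hop. The fan-out of 7 echoes the relay example elsewhere in the thread; it is an illustrative assumption, not a protocol constant, and real networks have overlapping peer sets.

```python
# Geometric growth of block propagation: nodes reached after h hops when
# every node forwards to `fanout` fresh peers (overlap ignored).

def nodes_reached(fanout: int, hops: int) -> int:
    # the miner itself, plus fanout + fanout^2 + ... + fanout^hops
    return sum(fanout ** h for h in range(hops + 1))

for hops in (1, 2, 3):
    print(hops, nodes_reached(7, hops))  # 1 -> 8, 2 -> 57, 3 -> 400
```

Note that this growth is about how fast a block reaches the whole network; it does not reduce the per-node load, since each participating node still validates the full block and uploads it to each of its own peers.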
sr. member
Activity: 400
Merit: 250
February 10, 2016, 03:06:39 PM
Are you suggesting that jtoomim is not the lead maintainer? Are you playing ignorant regarding the definition of a hard fork?

I have been consistent. Please show otherwise. You've never addressed anything I've said. How have I been hypocritical? How have I been ignorant? The burden is on you to actually show how that's true -- otherwise, as usual, you're just talking shit.

You are hypocritical in that you charged me with ad hominem when you yourself personalise your arguments against one or two people in Classic.

Whether it's your entire argument, a large portion, or merely a constituent part is of little importance.

It's nothing to do with who the lead maintainer is, or what a HF is or is not.

You need to deal with the hurt you feel and accept that you will not see your $1100 bitcoins for the foreseeable future. It's not Gavin's fault, or jtoomim's.


I called you out for that because everything you've said to me has been a personal attack or a simple mischaracterization of my words. Also, the mere mention of Gavin or jtoomim does not constitute ad hominem.

Right -- so we're supposed to ignore virtually everything that's been said, regardless of merit or substance. Never mind the discussions of: node centralization and associated security threats, the dangers and likelihood of a chain fork without miner consensus, the definition of consensus, the inexperience and lack of proven competence of the Classic dev team, the philosophical question of defining "money" and whose roadmap (or lack thereof) -- Core or Classic -- fits bitcoin's stated and practical purposes, whether Core is "corrupted" by Blockstream, the technical merits of a 2MB hardfork vs. 1MB or 1.5MB + segwit, etc.

We're supposed to ignore all this discussion (is it because the discussion has made Classic's hard fork look so unattractive?) because: "madjules007 is a bagholder." And please come correct with your endless personal attacks -- my base cost after being in bitcoin for years is $4000, not $1100. And it's super relevant to this discussion, obviously.
sr. member
Activity: 400
Merit: 250
February 10, 2016, 01:54:50 PM
Increase demands relative to what?

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.

So now we went from 'massively' redundant to 'quite' redundant. Ok.

Yes, nodes have to validate; that is a good point. But we still get the multiplier effect, where information gets propagated exponentially. Also, we have the relay networks. Greg is so fond of pointing these out in the context of the fee market discussion, where it serves his arguments about the lack of orphaning risk, so why ignore them here when they can help propagation?

Regardless, 2MB is still small when it comes to internet bandwidth. (You initially made it sound like 2MB would have to be uploaded thousands of times by a single node.)

You're right -- "massively" is probably more appropriate.

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on?

That was already answered.

Could you explain it? Because I don't think it was.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
February 10, 2016, 01:34:42 PM
Increase demands relative to what?

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.

So now we went from 'massively' redundant to 'quite' redundant. Ok.

Yes, nodes have to validate; that is a good point. But we still get the multiplier effect, where information gets propagated exponentially. Also, we have the relay networks. Greg is so fond of pointing these out in the context of the fee market discussion, where it serves his arguments about the lack of orphaning risk, so why ignore them here when they can help propagation?

Regardless, 2MB is still small when it comes to internet bandwidth. (You initially made it sound like 2MB would have to be uploaded thousands of times by a single node.)

You're right -- "massively" is probably more appropriate.

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on?

That was already answered.

full member
Activity: 126
Merit: 100
February 10, 2016, 07:29:56 AM
You worry too much. Core supporters have been begging large-block proponents to stick with consensus. Surely, when that has changed, those same people will stick with the new consensus. Anything else would be hypocrisy.
I do not worry. You haven't answered my question.
Quote
Consider a 50-50 split. Which one would you call Bitcoin?

In your hypothetical scenario you've already quoted how that works.

Yeah, but if Lauda don't get it, is it still a burn?

... Due to the oscillations in the hashrate, we would never have a perfect 50-50 split.

Edit: replace "Yeah, but if" with "Yeah, but when."
legendary
Activity: 2674
Merit: 3000
Terminated.
February 10, 2016, 07:28:04 AM
In your hypothetical scenario you've already quoted how that works.
That means that you agree, good. Due to the oscillations in the hashrate, we would never have a perfect 50-50 split, and thus the 'majority' argument would be useless in this case (i.e. it would likely look like 49-51). Explain how and why a 75-25 split is different?