
hero member
Activity: 534
Merit: 500
February 10, 2016, 04:43:23 AM

You can assume that everyone will immediately update to Classic nodes and therefore that only one chain will survive. But that's unlikely. I sure won't be updating.

You'll probably be DUMPING!!!  Grin

Well, that depends how it pans out. Cheesy

Yes, dumping is a realistic option. So is double spending on one chain vs. the other. Or some combination. It will be interesting if the hard fork breaks consensus, I'll say that much.

I know that I will be dumping both chains if we fork to a new client.

I will buy bitcoins for $1 each after you and your Core supporters crash the market and then wait for the next new moon! Grin
sr. member
Activity: 400
Merit: 250
February 10, 2016, 02:39:19 AM
Increase demands relative to what? Huh

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.

So now we went from 'massively' redundant to 'quite' redundant.  Ok.

Yes, nodes have to validate; that is a good point. But we still get the multiplier effect where information gets propagated exponentially. Also, we have the relay networks. Greg is so fond of pointing these out in the context of the fee market discussion, where they serve his arguments about the lack of orphaning risk, so why ignore them here when they can help propagation?

Regardless, 2MB is still small when it comes to internet bandwidth. (You initially made it sound like 2MB would have to be uploaded thousands of times by a single node.)

You're right -- "massively" is probably more appropriate. Cheesy

What does this multiplier effect have to do with the bandwidth load that individual nodes must take on? Aren't relay networks about overcoming latencies (as opposed to reducing bandwidth requirements for nodes)?

When you say 2MB is small, what are we comparing it to? Web pages?

Depending on maxconnections and the time a node is turned on, 2MB would be uploaded thousands of times by a single node fairly quickly.
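As a back-of-the-envelope sketch of that claim (illustrative assumptions only: a fixed number of peers served and the full block sent to each, which real nodes only approximate; the peer count is an assumption, not a measured figure):

Code:
# Rough upload volume for a node that relays every new 2 MB block to its peers.
# All parameters here are illustrative assumptions, not protocol constants.

BLOCK_SIZE_MB = 2        # proposed block size
BLOCKS_PER_DAY = 144     # roughly one block every ten minutes
PEERS_SERVED = 50        # assumed number of peers the node uploads each block to

def daily_upload_mb(block_size_mb, blocks_per_day, peers_served):
    """Upload volume per day if each new block is sent in full to each peer."""
    return block_size_mb * blocks_per_day * peers_served

transmissions = BLOCKS_PER_DAY * PEERS_SERVED
volume = daily_upload_mb(BLOCK_SIZE_MB, BLOCKS_PER_DAY, PEERS_SERVED)
print(f"{transmissions} block transmissions per day, ~{volume:,} MB uploaded")

Under those assumed figures a node serves a 2MB block several thousand times within a day or two, which is where the "thousands of times" number comes from.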
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
February 10, 2016, 02:02:03 AM
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.
This isn't the part of Bitcoin that they are talking about.
They aren't talking about processing the data that they receive; they are talking about spreading those data to other nodes.
Nodes don't need to process the data again and again for every other node that they connect to, and certainly not with every possible node.

Correct.  And the fact that each node processes blocks makes the processing redundant, but it doesn't increase demands on individual nodes.  I think Greg knows full well what I meant; it raises an eyebrow about the intentions of his posts.

Increase demands relative to what? Huh

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.

So now we went from 'massively' redundant to 'quite' redundant.  Ok.

Yes, nodes have to validate; that is a good point. But we still get the multiplier effect where information gets propagated exponentially. Also, we have the relay networks. Greg is so fond of pointing these out in the context of the fee market discussion, where they serve his arguments about the lack of orphaning risk, so why ignore them here when they can help propagation?

Regardless, 2MB is still small when it comes to internet bandwidth. (You initially made it sound like 2MB would have to be uploaded thousands of times by a single node.)

sr. member
Activity: 400
Merit: 250
February 10, 2016, 01:35:35 AM
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.
This isn't the part of Bitcoin that they are talking about.
They aren't talking about processing the data that they receive; they are talking about spreading those data to other nodes.
Nodes don't need to process the data again and again for every other node that they connect to, and certainly not with every possible node.

Correct.  And the fact that each node processes blocks makes the processing redundant, but it doesn't increase demands on individual nodes.  I think Greg knows full well what I meant; it raises an eyebrow about the intentions of his posts.

Increase demands relative to what? Huh

What "part of bitcoin" are we talking about? Think about what you guys are saying. If every node did not need to receive and process every transaction in every block, and that load was distributed, why would nodes be propagating data to other nodes at all? If a node is not self-validating, it is necessarily depending on trusted third parties, so any data it propagates is worthless (e.g. SPV). Self-validation is inextricably linked to the idea that peers are propagating data to one another.

The entire point is that self-validating and propagating each block to many peers is quite redundant, and requires far more resources than, say, downloading a web page.

Hence, epic meta commentary.
legendary
Activity: 1260
Merit: 1115
February 09, 2016, 08:46:10 PM
I think we should probably all agree going forward to assume that everybody here has the best of intentions.

https://en.wikipedia.org/wiki/Principle_of_charity

Just sayin'.

In fact, I'd like to formally invite gmaxwell to my Blockstream thread. I think it's time to give not war an opportunity.

For Heaven's sake!


staff
Activity: 4214
Merit: 1203
I support freedom of choice
February 09, 2016, 08:43:42 PM
Maybe he just read it too fast
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
February 09, 2016, 08:24:38 PM
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.
This isn't the part of Bitcoin that they are talking about.
They aren't talking about processing the data that they receive; they are talking about spreading those data to other nodes.
Nodes don't need to process the data again and again for every other node that they connect to, and certainly not with every possible node.

Correct.  And the fact that each node processes blocks makes the processing redundant, but it doesn't increase demands on individual nodes.  I think Greg knows full well what I meant; it raises an eyebrow about the intentions of his posts.
staff
Activity: 4214
Merit: 1203
I support freedom of choice
February 09, 2016, 08:16:46 PM
Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.
This isn't the part of Bitcoin that they are talking about.
They aren't talking about processing the data that they receive; they are talking about spreading those data to other nodes.
Nodes don't need to process the data again and again for every other node that they connect to, and certainly not with every possible node.
legendary
Activity: 4214
Merit: 4458
February 09, 2016, 08:12:48 PM
. Bitcoin requires massive redundancy.

So ?

So Gavin's comparison of bitcoin blocks to average web pages is completely inappropriate, and Nick Szabo's comparison to MT Gox is apt. Hence, epic meta commentary.

That's not how it works.

Nodes relay information to other nodes but one node doesn't need to broadcast to the entire network!
The burden of 'massive redundancy' you speak of is distributed across the thousands of nodes.


relax.. it's people like madjules and lauda that think bitcoin needs datacentres with 6000 connections. they don't see the logic of things like the '7 degrees of separation'
EG
7 nodes that each connect to 7 separate nodes (meaning 7 layers, each multiplying by 7, i.e. 7^7) is over 800,000 different possible nodes connected to the same network.

it does not require each node having 5000 connections; a minimum of 6 connections is enough to form a good bitcoin network and allow the full nodes to receive data.. and supernodes (and miners) can add the main 16 mining nodes to ensure fast propagation direct to miners (to help the fastest-first race) and a few other nodes to propagate out to the community. something at a minimum of 22+ connections is acceptable for miners and for those that deem themselves important to the miners' block-height race (rather than the later sync of archival data)
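The fan-out arithmetic in the post above can be sketched roughly like this (the fan-out and hop count are the post's own assumptions; overlapping connections in a real network make these upper bounds):

Code:
# Sketch of the '7 degrees of separation' fan-out arithmetic from the post above.
# Fan-out and hop count are the post's assumptions, not measured network figures.

def reachable_nodes(fanout: int, hops: int) -> int:
    """Total nodes reached across all layers if every node relays to `fanout` fresh peers."""
    return sum(fanout ** layer for layer in range(1, hops + 1))

print(7 ** 7)                 # 823,543 nodes in the 7th layer alone ("over 800,000")
print(reachable_nodes(7, 7))  # 960,799 nodes reached within 7 hops in total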
staff
Activity: 4158
Merit: 8382
February 09, 2016, 08:08:34 PM
That's not how it works.

Nodes relay information to other nodes but one node doesn't need to broadcast to the entire network!
The burden of 'massive redundancy' you speak of is distributed across the thousands of nodes,
each responsible for communication with other nodes.
Wow. Now I understand why you're so confused about this blocksize issue.  That _is_ how it works.

Every node in the network must receive and process every transaction in every block. There is no distribution or sharing of that load.

Are you suggesting that jtoomim is not the lead maintainer? Are you playing ignorant regarding the definition of a hard fork?
I don't think classic has been very transparent about its governance. Jtoomim is supposedly lead maintainer, but if you look in the repository he has only made a few small changes and not done much merging. Meanwhile, the project was supposedly created with extensive involvement with people at cryptsy but they seem to have backed out and been removed from the site; ... and Jeff Garzik had one of his change requests summarily overridden and closed by Olivier Janssens. Beyond non-developers with commit access like Olivier there are also many commits being made by people I've never heard of before such as "rusty-loy". None of the membership of the project appears to be public.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
February 09, 2016, 07:57:18 PM
. Bitcoin requires massive redundancy.

So ?

So Gavin's comparison of bitcoin blocks to average web pages is completely inappropriate, and Nick Szabo's comparison to MT Gox is apt. Hence, epic meta commentary.

That's not how it works.

Nodes relay information to other nodes but one node doesn't need to broadcast to the entire network!
The burden of 'massive redundancy' you speak of is distributed across the thousands of nodes,
each responsible for communication with other nodes.


sr. member
Activity: 400
Merit: 250
February 09, 2016, 07:24:05 PM
For context:


If we were in the Wall Observer thread, I'd say "fair game" but we're trying to discuss the likelihood of a chain fork and what level of risk in that context is justifiable.

And you do that by making a statement along the lines of "The keys of the kingdom to toomim"?

How is that a valid argument on a topic titled "Gavin proposes BIP for 2Mb...." (paraphrased) ?


A valid argument? You snipped 7 words from this post:

Why would you assume that? The most important thing to consider here is that miners are working off incomplete information. They don't really know how many nodes are running what implementation as it's very easy to run fake nodes. And it's nodes -- not hashing power -- that determine the validity of a blockchain. It's a more diverse and interesting question than most realize. Miners are pretty centralized. I think this is why Gavin is targeting them: it's much easier to trick a small number of highly centralized mining pools than it is to trick thousands of node operators. And if the 2MB implementation is capable of triggering the rule change based on hashing power (at 75% or whatever bullshit "democratic" threshold Gavin & Co. come up with -- 51%, etc.), then everyone else will crumble in submission, right?

Well...the dozen nodes that I run won't. The definition of "majority" and "minority" chain can change in a heartbeat; that's just a matter of miners temporarily pointing their hashing power at one chain or the other. It doesn't matter what Coinbase and Bitstamp say now, or where Bitfury points its hashing power. What really matters are the nodes that determine block validity, and what proportion of them enforce the new fork's consensus rules. Because if a significant proportion of them enforce the old rules, we will have an irreparable chain fork.

These irrelevant musings about how a majority of hashing power will render all other blockchains instantly dead are amusing but not very informative. If nodes do not approach consensus, miners will have to choose which fork to build on top of. But, which one? All of the Classic/XT rhetoric says that a temporary majority of hashing power will surely solve everything. But what the hell does that have to do with nodes? What proof do you have that Classic nodes will comprise a majority of nodes -- simply because Bitfury and a few mining pools upgraded (if that happens at all)?

Well, if a majority of nodes continue to enforce the 1MB rule, you may find quickly that the "majority chain" isn't a very meaningful phrase. It's all about validity. Miners will point their hashing power at the longest, valid chain. If it isn't clear which one is the longest valid chain (due to no clear consensus among nodes), we will have multiple blockchains and this will be irreconcilable. IMO, the most likely outcome of that is for mining farms to shut down en masse and for difficulty adjustment to drop significantly, as miners cannot risk expending resources to build on potentially invalid blockchains. The market would likely never recover -- probably rightfully so. For this to happen would mean that the only mechanism to enforce rules within the bitcoin protocol was broken, and all it took was the prodding of a loud minority.

By the way, you know that pre-fork coins could also be sold off on majority-fork exchanges? Particularly because early adopters might be a little pissed off at the commit keys for bitcoin's dominant implementation being in the hands of a junior dev who wants to make the question of inflating the money supply a democratic one (jtoomim). How do you know who controls millions of pre-fork coins? You can be sure that I'll be dumping everything the second Toomim gets the keys to the kingdom, and I know several likeminded people.

...And you're suggesting that "the keys of the kingdom to toomim" is the argument I'm making? Have you ever made an honest argument in your life?


... are you making a "toomim kingdom" argument now?   Huh srly, pick a point and be consistent.

To be honest, the amount of fucks I give for your 'arguments' tends to zero; I was only making a point concerning "ad-homs". You seemed concerned that I would resort to such a thing - I'm pointing out that you have been hypocritically personalising most of your classic-ignorance towards Gavin and jtoomim.

U still don't get it?


Are you suggesting that jtoomim is not the lead maintainer? Are you playing ignorant regarding the definition of a hard fork?

I have been consistent. Please show otherwise. You've never addressed anything I've said. How have I been hypocritical? How have I been ignorant? The burden is on you to actually show how that's true -- otherwise, as usual, you're just talking shit.
hero member
Activity: 546
Merit: 500
Warning: Confrmed Gavinista
February 09, 2016, 06:58:43 PM


A valid argument? You snipped 7 words from this post:

[snipity snip again]

...And you're suggesting that "the keys of the kingdom to toomim" is the argument I'm making? Have you ever made an honest argument in your life?


Are these supposed to be connected points?  Huh  OK - so you are not making the "toomim kingdom" argument....

Quote

You do realize that the intent of a successful hard fork is for all nodes to update to the new consensus rules? And that Gavin's intention, then, is for all nodes to update to Classic? Do you realize, further, that Toomim is the lead maintainer of Classic -- that he controls commit access, and that Core will obviously not control commit access to the dominant implementation in that case?

Never mind that I already explained that in a subsequent post, since your method of debate is to delete everything substantive your opponent says and take the one phrase that's left out of context.

... are you making a "toomim kingdom" argument now?   Huh srly, pick a point and be consistent.

To be honest, the amount of fucks I give for your 'arguments' tends to zero; I was only making a point concerning "ad-homs". You seemed concerned that I would resort to such a thing - I'm pointing out that you have been hypocritically personalising most of your classic-ignorance towards Gavin and jtoomim.

U still don't get it?
sr. member
Activity: 400
Merit: 250
February 09, 2016, 06:06:11 PM
. Bitcoin requires massive redundancy.

So ?

So Gavin's comparison of bitcoin blocks to average web pages is completely inappropriate, and Nick Szabo's comparison to MT Gox is apt. Hence, epic meta commentary.
legendary
Activity: 4214
Merit: 4458
February 09, 2016, 04:57:51 PM

Downloading a web page is not comparable to uploading blockchain data to many peers on a constant basis. Bitcoin requires massive redundancy. Downloading web pages, or storing your coins on Gox, does not.

gavin should have used skype video calls as a better comparison to the home user experience. although domains can send out millions of webpages a day (their UPLOAD is millions of megabytes) to show the internet can cope, a better analogy would be online gaming or skype video calls, which involve a larger user UPLOAD rate than bitcoin does
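A rough comparison sketch of those upload rates (illustrative figures only: the video-call number is an assumption of roughly 30MB per 10 minutes, echoing the figure used elsewhere in this thread, and real rates vary widely):

Code:
# Average upload rates, illustrative figures only.

def upload_rate_kbit_s(megabytes: float, seconds: float) -> float:
    """Average upload rate in kilobits per second."""
    return megabytes * 1_000_000 * 8 / seconds / 1000

block_to_one_peer = upload_rate_kbit_s(2, 600)   # one 2 MB block per ~10 min, to a single peer
video_call_upload = upload_rate_kbit_s(30, 600)  # assumed ~30 MB webcam upload per 10 min

print(f"2 MB block to one peer every 10 min: ~{block_to_one_peer:.0f} kbit/s")
print(f"assumed video-call upload:           ~{video_call_upload:.0f} kbit/s")

The per-peer block figure multiplies by however many peers a node actually serves, which is the redundancy point being argued over above.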
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
February 09, 2016, 04:47:17 PM
. Bitcoin requires massive redundancy.

So ?
sr. member
Activity: 400
Merit: 250
February 09, 2016, 04:37:11 PM
Re: 'epic commentary' by spazo.

Complete non sequitur.

2MB being an average page size means it's a small payload.
Got nothing to do with Gox or security whatsoever.

Downloading a web page is not comparable to uploading blockchain data to many peers on a constant basis. Bitcoin requires massive redundancy. Downloading web pages, or storing your coins on Gox, does not.
sr. member
Activity: 400
Merit: 250
February 09, 2016, 04:33:09 PM
nor his comments that the block size limit was only a temporary measure.

Satoshi never said that the block size limit was a temporary measure. He only said that it can be increased when needed, which is exactly the position of the core devs and me personally (note he never said it should be removed).

Actually Satoshi seemed to be very much against block size bloat and did not even want namecoin on top of bitcoin.

Core developers are more in line with "Satoshi's vision" than Gavin, who wants to get rid of the block limit altogether.

Not that I think it's a valid argument, but I hate how Hearn and big-blockers try to hijack Satoshi and claim to speak for him.

Agreed, temporary was the wrong way to frame it. An as-needed basis was what I was trying to convey. My personal view is that bitcoin's decentralized nature (i.e. self-validation) is integral to its function. Any technical issues, like blockchain bloat, should be measured against that. Centralization pressures always take priority over capacity issues -- if the whitepaper has anything to say about "Satoshi's vision," it's that.
legendary
Activity: 4214
Merit: 4458
February 09, 2016, 04:21:34 PM
is this topic still the circle jerk of lauda and mad jules trying to claim that Classic is part of the debate?

lol

if CORE implemented gavin's bip, then where would these 2 twerks be?

so let's stick to the debate about the 2MB proposal as a whole and not the crappy politics of band camps.

lauda said ages ago that bitcoin won't work with 2MB.. yet bitcoin now works on a raspberry pi, and normal home computers have at least twice the capacity, speed, etc of a raspberry pi.. so that debunks the data-center theory.

as for the internet speed theory: well, millions of customers do skype video calls at 30MB every 10 minutes (video is real, constant upload data sending your webcam video OUT), which debunks the internet-limitations theory too.

but what makes me laugh is the third theory that 2MB would make 5000 nodes turn into fewer.. yet they do not concede that segwit's no-witness mode will turn 5000 full nodes into fewer as well.. because people are blindly told by the twerks that having no witness is still compatible and nothing bad will happen.

they have not considered that by increasing REAL capacity, more people will use bitcoin more regularly because they are no longer waiting up to an hour for a confirmation. and as such, those people who used to power down their full nodes because they currently see no point in running a full node 24/7 to use it irregularly will in fact keep a full node powered up longer, because they can use bitcoin more easily and more often, multiple times a day compared to maybe once a day.

i wait for them to claim "those that want to use it regularly can just pay the fee", yet there are MANY examples where, even with a fee, there is no guarantee of being part of the very next block. all because capacity needs to increase to make sure that all fee payers have room in the next block.

if they then claim that segwit is the capacity solution, they have forgotten that the average signature's (71-byte) saving is eaten up by a couple of bytes for the flags, then 40 extra bytes for the payment code changes of blockstream's other features, and other bytes added for things like lightning and sidechains.. so segwit's capacity increase is just a bait-and-switch temporary patch that doesn't last more than a couple of months.. people want REAL capacity increases.
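Taking the post's own byte figures at face value (they are the poster's assumptions, not measured protocol numbers), the arithmetic is:

Code:
# Net per-input byte change under the post's own figures.

SIG_SAVING_BYTES = 71    # claimed average signature bytes moved out by segwit
FLAG_OVERHEAD = 2        # "a couple of bytes for the flags" (post's figure)
PAYMENT_CODE_BYTES = 40  # "40 extra bytes for the payment code changes" (post's figure)

net = SIG_SAVING_BYTES - FLAG_OVERHEAD - PAYMENT_CODE_BYTES
print(f"net saving per input under these figures: {net} bytes")

That leaves about 29 bytes per input under these figures, so whether the saving is fully "eaten up" depends on the further bytes the post mentions for lightning and sidechains, which are not quantified there.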
sr. member
Activity: 400
Merit: 250
February 09, 2016, 04:06:08 PM

If we were in the Wall Observer thread, I'd say "fair game" but we're trying to discuss the likelihood of a chain fork and what level of risk in that context is justifiable.

And you do that by making a statement along the lines of "The keys of the kingdom to toomim"?

How is that a valid argument on a topic titled "Gavin proposes BIP for 2Mb...." (paraphrased) ?

A valid argument? You snipped 7 words from this post:

Why would you assume that? The most important thing to consider here is that miners are working off incomplete information. They don't really know how many nodes are running what implementation as it's very easy to run fake nodes. And it's nodes -- not hashing power -- that determine the validity of a blockchain. It's a more diverse and interesting question than most realize. Miners are pretty centralized. I think this is why Gavin is targeting them: it's much easier to trick a small number of highly centralized mining pools than it is to trick thousands of node operators. And if the 2MB implementation is capable of triggering the rule change based on hashing power (at 75% or whatever bullshit "democratic" threshold Gavin & Co. come up with -- 51%, etc.), then everyone else will crumble in submission, right?

Well...the dozen nodes that I run won't. The definition of "majority" and "minority" chain can change in a heartbeat; that's just a matter of miners temporarily pointing their hashing power at one chain or the other. It doesn't matter what Coinbase and Bitstamp say now, or where Bitfury points its hashing power. What really matters are the nodes that determine block validity, and what proportion of them enforce the new fork's consensus rules. Because if a significant proportion of them enforce the old rules, we will have an irreparable chain fork.

These irrelevant musings about how a majority of hashing power will render all other blockchains instantly dead are amusing but not very informative. If nodes do not approach consensus, miners will have to choose which fork to build on top of. But, which one? All of the Classic/XT rhetoric says that a temporary majority of hashing power will surely solve everything. But what the hell does that have to do with nodes? What proof do you have that Classic nodes will comprise a majority of nodes -- simply because Bitfury and a few mining pools upgraded (if that happens at all)?

Well, if a majority of nodes continue to enforce the 1MB rule, you may find quickly that the "majority chain" isn't a very meaningful phrase. It's all about validity. Miners will point their hashing power at the longest, valid chain. If it isn't clear which one is the longest valid chain (due to no clear consensus among nodes), we will have multiple blockchains and this will be irreconcilable. IMO, the most likely outcome of that is for mining farms to shut down en masse and for difficulty adjustment to drop significantly, as miners cannot risk expending resources to build on potentially invalid blockchains. The market would likely never recover -- probably rightfully so. For this to happen would mean that the only mechanism to enforce rules within the bitcoin protocol was broken, and all it took was the prodding of a loud minority.

By the way, you know that pre-fork coins could also be sold off on majority-fork exchanges? Particularly because early adopters might be a little pissed off at the commit keys for bitcoin's dominant implementation being in the hands of a junior dev who wants to make the question of inflating the money supply a democratic one (jtoomim). How do you know who controls millions of pre-fork coins? You can be sure that I'll be dumping everything the second Toomim gets the keys to the kingdom, and I know several likeminded people.

...And you're suggesting that "the keys of the kingdom to toomim" is the argument I'm making? Have you ever made an honest argument in your life?

You do realize that the intent of a successful hard fork is for all nodes to update to the new consensus rules? And that Gavin's intention, then, is for all nodes to update to Classic? Do you realize, further, that Toomim is the lead maintainer of Classic -- that he controls commit access, and that Core will obviously not control commit access to the dominant implementation in that case?

Never mind that I already explained that in a subsequent post, since your method of debate is to delete everything substantive your opponent says and take the one phrase that's left out of context.