Topic: Limits to accepting a new longest chain to prevent >50%

legendary
Activity: 1232
Merit: 1094
So, how do you prevent that from basically being the same as "everyone flip a coin"? I'm not sure what kind of guidance you could usefully give there.

"Everyone" would know which is the real chain.  It is saying that the rule in that case is that it requires manual discussion.  

You would need to go to the forums etc. to say which is the "real" chain.

I might, with enough effort and thought into how to do something useful with the instructions there, be convinced that there could be something useful along those lines— or at least, no more harmful than any other rearrangement of the Titanic's deck chairs— but then there is the issue that if someone has enough computing power to create a 10k reorg, they could constantly reorg 9,999 blocks over and over again.  "This isn't any better".

Maybe they are willing to do it once, but not over and over.  In fact, maybe even 10k is too high; perhaps the node could auto-checkpoint all blocks that are at least 24 hours old.

A 24-hour reversal inherently requires manual intervention.

You connect daily and your client tells you a reversal has happened and manual intervention is required.

95% of users would be on the historical chain.  The reversal would have to be short to not trigger the warning.
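
A minimal sketch of that "connect daily and get warned" behaviour, assuming the client persists its last-accepted tip between sessions; the file name, the saved-state shape, and the message texts are all illustrative, not anything in the real client:

Code:
import json, os

STATE = "last_tip.json"    # hypothetical: where the client saved its tip

def check_on_startup(active_chain_hashes):
    """active_chain_hashes: the block hashes of the chain the node now sees.
    Warn if the tip accepted last session has been orphaned."""
    if not os.path.exists(STATE):
        return "first run - nothing to compare against"
    with open(STATE) as f:
        saved_tip = json.load(f)["tip"]
    if saved_tip in active_chain_hashes:
        return "ok - the history we relied on is intact"
    # The previously accepted tip was orphaned; anything more than a shallow
    # reorganization since the last session lands here and asks for a human.
    return "WARNING: reorganization past last known tip - manual intervention required"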
staff
Activity: 4284
Merit: 8808
What about just defining a fork more than 10k blocks from the main chain as just that, a fork?  Have the client consider both alt chains and tell the user he needs to check the internet as to which one the community considers the real one.
So, how do you prevent that from basically being the same as "everyone flip a coin"? I'm not sure what kind of guidance you could usefully give there.

I might, with enough effort and thought into how to do something useful with the instructions there, be convinced that there could be something useful along those lines— or at least, no more harmful than any other rearrangement of the Titanic's deck chairs— but then there is the issue that if someone has enough computing power to create a 10k reorg, they could constantly reorg 9,999 blocks over and over again.  "This isn't any better".

If it really is only a matter of picking which color tie the corpse of Bitcoin will wear— well, the software to implement anything like that itself has a cost (size, review bandwidth, vulnerability surface)— and I can't get too excited about something costly that doesn't make a real improvement. And I also think that's a general issue for these sorts of last-resort things: if they're only attractive in an already doomed outcome, it's hard to justify their cost.
legendary
Activity: 1232
Merit: 1094
What about just defining a fork more than 10k blocks from the main chain as just that, a fork?  Have the client consider both alt chains and tell the user he needs to check the internet as to which one the community considers the real one.
kjj
legendary
Activity: 1302
Merit: 1026
Personally, I am not convinced it is needed.

There have been, so far, about two "emergencies" with the blockchain (the "generate a lot of bitcoins" bug, and the doublespend in the recent fork), neither of which was exploited by a real malicious party.

As far as I can tell, many major banks have a less stellar record.

It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.

My issue was always that I didn't like the idea of a hidden chain attack.  If someone has a bunch of hashing power (very unlikely) and generates a chain offline, then publishes it, overturning a huge number of blocks, we pretty much just have to sit and watch.  Of course, we can then intervene after the fact to put it back.

But it seems like a better way would be to devise a scheme where the attacker would be unable to keep their longer chain secret.  That is why I like the exponential difficulty method.  Under ordinary circumstances, and even honest chain forks, the network would operate as usual.  But a high-powered attacker is fighting against the clock, and people can look at the number of blocks protecting their transaction, calculate the exponential, and evaluate the risk with something more closely approaching certainty.
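
For concreteness, a minimal sketch of one way such an exponential rule could look. The post doesn't pin down a schedule, so the 6-block grace depth, the doubling base, and the (block_id, work) chain representation are all assumptions for illustration:

Code:
GRACE = 6     # assumed: ordinary longest-chain rule within 6 blocks
BASE = 2.0    # assumed: required work doubles per block beyond that

def fork_depth(current, candidate):
    """How many blocks of `current` would be orphaned by switching."""
    common = 0
    for a, b in zip(current, candidate):
        if a != b:
            break
        common += 1
    return len(current) - common

def total_work(blocks):
    return sum(work for _, work in blocks)

def should_reorg(current, candidate):
    """current/candidate: lists of (block_id, work) pairs, genesis first."""
    depth = fork_depth(current, candidate)
    fork = len(current) - depth
    ours = total_work(current[fork:])
    theirs = total_work(candidate[fork:])
    if depth <= GRACE:
        return theirs > ours              # honest forks resolve as usual
    # A deep reorg must beat the displaced branch by an exponential margin.
    return theirs > ours * BASE ** (depth - GRACE)

Under this kind of schedule a recipient can put a number on the risk: overturning a transaction buried N blocks deep costs the attacker roughly 2**(N-6) times the work the honest network spent on those blocks, which is the "calculate the exponential" step described above.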

The cost is, however, that the notion of correctness gets a little fuzzy.  I still think that it is better to protect the network as a whole, in exchange for individual nodes needing manual intervention during some attacks.  Gmaxwell gives the opposing view, which is widely shared.  It is a really critical part of bitcoin, and won't be tweaked lightly, if ever.
legendary
Activity: 1232
Merit: 1094
Errr. I thought dev-hardcoded checkpoints prevent a complete reversal even in the face of a stupidly overwhelming adversary...

Yes, I mean if you didn't want central checkpointing.
newbie
Activity: 26
Merit: 0
Errr. I thought dev-hardcoded checkpoints prevent a complete reversal even in the face of a stupidly overwhelming adversary...
legendary
Activity: 1232
Merit: 1094
It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.

I think the main point is that no scheme can prevent a complete reversal from the genesis block.  If you have two chains that fork at the genesis block, then you can only compare the total POW.

However, if they fork at a later point, you could use something like proof of stake, or proof of burn, or whatever.  Only stakes from before the fork would count, though.

If the coin value before the fork is distributed, then this is distributed checkpointing.
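
As a concrete illustration of "compare the total POW", a sketch that scores two genesis-rooted chains by summed work, deriving per-block work from the compact "bits" target the same way Bitcoin's chain-work calculation does; representing each chain as a bare list of bits values is an assumption for brevity:

Code:
def target_from_bits(bits):
    """Expand the compact 'bits' encoding into the full 256-bit target
    (assumes exponent > 3, which holds for every mainnet block)."""
    exponent = bits >> 24
    mantissa = bits & 0x007FFFFF
    return mantissa << (8 * (exponent - 3))

def block_work(bits):
    # Expected number of hashes to find the block: 2^256 / (target + 1).
    return 2**256 // (target_from_bits(bits) + 1)

def heavier_chain(bits_a, bits_b):
    """bits_a / bits_b: the 'bits' field of every header in each chain,
    both chains starting from the same genesis block."""
    work_a = sum(block_work(b) for b in bits_a)
    work_b = sum(block_work(b) for b in bits_b)
    return "A" if work_a >= work_b else "B"

# e.g. at equal difficulty the longer chain simply carries more work:
print(heavier_chain([0x1d00ffff] * 2, [0x1d00ffff] * 3))   # -> B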
newbie
Activity: 26
Merit: 0
I do agree that we haven't seen such a threat - but it always is a nagging concern (damn it, Satoshi, are you sure you got it right?).

Grin


Unless you run a bitcoin service with aspirations to eventual greatness (which I do Wink), doublespends that require immense hashrates should not be a concern for you at all (unless you happen to be good at making enemies among major pool operators Cheesy)
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
I do agree that we haven't seen such a threat - but it always is a nagging concern (damn it, Satoshi, are you sure you got it right?).

Grin
newbie
Activity: 26
Merit: 0
Personally, I am not convinced it is needed.

There have been, so far, about two "emergencies" with the blockchain (the "generate a lot of bitcoins" bug, and the doublespend in the recent fork), neither of which was exploited by a real malicious party.

As far as I can tell, many major banks have a less stellar record.

It doesn't seem to me that bitcoin's "the hashiest chain wins" approach is "broken", so maybe we should refrain from "fixing" it.
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
I started a discussion on this back in February.

https://bitcointalksearch.org/topic/blockchain-rollback-limit-140695

Thanks for the link - don't know how I missed that.
legendary
Activity: 1193
Merit: 1003
9.9.2012: I predict that single digits... <- FAIL
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?

I started a discussion on this back in February.

https://bitcointalksearch.org/topic/blockchain-rollback-limit-140695
staff
Activity: 4284
Merit: 8808
So basically I think what you are saying is that if anyone gets >50% we are screwed no matter what (so therefore why try to mitigate anything) - correct?
(I am willing to accept that there may be nothing we can do about it, but of course it does leave some concern if we simply have no defense at all)
There are things that can be done, but they depend on the specifics of the attacker and the attack... and if the attacker knows about them they will be less effective. You can be confident that Bitcoin wouldn't go down without a fight.

But fundamentally: the security assumption of Bitcoin is that the honest users control the majority.  If it could be stronger, it would be— but at least so far as I've seen, the proposals to strengthen it within the algorithm end up trading off one weakness for a worse one. If you break the majority assumption then no algorithm can protect you— but people, acting externally to the system, adapting it with the consent of the honest users, still can.  People can make value judgements ("this chain is good, that chain is an attack") which are very hard for an algorithm to make, especially when the attacker can see the algorithm.  Those value judgements are a liability— they're part of why traditional monies are untrustworthy— but if Bitcoin's security assumptions get broken by an overt attack, I expect there would easily be universal consensus for some kind of manual intervention.
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
So basically I think what you are saying is that if anyone gets >50% we are screwed no matter what (so therefore why try to mitigate anything) - correct?

(I am willing to accept that there may be nothing we can do about it, but of course it does leave some concern if we simply have no defense at all)
staff
Activity: 4284
Merit: 8808
Although perhaps not a very likely scenario, such an attack would be a massive confidence destroyer - so I am wondering: would it not be reasonable for a client to reject a new chain if it contains blocks that it hasn't seen that are much older than blocks in the chain it is already building on (or is this already the case)?
If you make the longest-chain decision stateful, and not a pure function of the universe equally visible to all nodes, then you replace a consensus change with an even more devastating consensus _failure_.

As an example, an oft-repeated suggestion is "just refuse to make any reorg greater than 50 blocks". Great, so now an attacker who can outpace the network can produce a fork 49 blocks back and then mine two more blocks— one on the real branch, one on the fork— and concurrently announce them each to half of the network... and from one currency you have two: nodes are forever split and will never converge.  ... Or more simply, he makes his longer chain and all new nodes will accept it while all old nodes reject it.

Of course, if you set the limit far back enough then "okay, it'll never happen"— indeed, but if it'll never happen, what value is it?
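
A toy simulation of that split, with the numbers scaled down (a reorg limit of 3 instead of 50) so the trace stays short; the block ids and the acceptance function are illustrative:

Code:
MAX_REORG = 3   # scaled-down stand-in for the 50-block limit

def fork_depth(current, candidate):
    common = 0
    for a, b in zip(current, candidate):
        if a != b:
            break
        common += 1
    return len(current) - common

def try_accept(chain, candidate):
    """The stateful rule: longest wins, but never reorg deeper than MAX_REORG."""
    if len(candidate) > len(chain) and fork_depth(chain, candidate) <= MAX_REORG:
        return candidate
    return chain

main = ["M%d" % i for i in range(10)]       # honest chain, 10 blocks
group_a = main[:]                           # half the network
group_b = main[:]                           # the other half

# The attacker secretly forks MAX_REORG blocks back and mines a longer branch,
fork = main[:-MAX_REORG] + ["X%d" % i for i in range(MAX_REORG + 1)]
# then simultaneously shows group A the fork and group B one honest block.
group_a = try_accept(group_a, fork)             # depth 3 <= 3: A reorgs
group_b = try_accept(group_b, main + ["M10"])   # plain extension: B extends

# Neither half can ever adopt the other's chain: the reorg each would need
# (depth 4, and growing) already exceeds MAX_REORG. One currency became two.
print(fork_depth(group_b, group_a))             # -> 4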

legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer

Thanks for the link - well, it does seem that, although not as worried about it as perhaps I am, Gavin has thought that this might be something that should be addressed.
full member
Activity: 154
Merit: 100
I must admit I hadn't thought about SPV clients, although if all headers of all blocks (including those of a fork) were available, then couldn't an SPV client also decide to ignore new headers that it decides are too old (i.e. they still have the timestamps for every header they are using, don't they)?

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last-ditch options available.

Any link to where this has been discussed in depth before (maybe I am not searching on the right thing)?


http://gavintech.blogspot.ca/2012/05/neutralizing-51-attack.html
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
I must admit I hadn't thought about SPV clients, although if all headers of all blocks (including those of a fork) were available, then couldn't an SPV client also decide to ignore new headers that it decides are too old (i.e. they still have the timestamps for every header they are using, don't they)?
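
A minimal sketch of that idea, assuming the SPV client keeps each header's timestamp and cumulative work; the Header type and the 24-hour cutoff are illustrative, and the rule inherits the statefulness objection raised elsewhere in the thread:

Code:
from dataclasses import dataclass

MAX_AGE = 24 * 3600    # assumed cutoff: refuse to rewrite >24h of history

@dataclass
class Header:
    time: int          # timestamp from the 80-byte block header
    chain_work: int    # cumulative proof of work up to this header

def choose_tip(current, fork_point, candidate):
    """Usual SPV rule: most cumulative work wins. Added rule: if switching
    to `candidate` would orphan headers older than MAX_AGE, keep `current`."""
    if current.time - fork_point.time > MAX_AGE:
        return current                   # competing branch is "too old"
    return candidate if candidate.chain_work > current.chain_work else current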

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last-ditch options available.

Any link to where this has been discussed in depth before (maybe I am not searching on the right thing)?
legendary
Activity: 1232
Merit: 1094
If Bitcoin thinks that 100/120 is the *safe* point to allow spending from coinbase, then I would be proposing a figure closely related to that (making it no more subjective than the limit already in place).

The rule could be something like: auto-checkpoint a block if
- the block is on the longest chain
- the block is at least 2160 blocks deep
- the block was received by the node more than 30 days previously
- during the last 30 days,
-- the daily hashing for the chain has never been less than 50% of the median for the 30 days
-- the chain received more than 90% of the total hashing power mined across all forks

Auto-checkpoints could be discarded if the fork gets long enough, so they are soft checkpoints.  They would get harder the older they are.
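
A sketch of how a node might evaluate that rule. The thresholds come straight from the list above; the block fields and the daily-work bookkeeping are assumptions for illustration:

Code:
import statistics, time

DEPTH_MIN  = 2160             # blocks deep on the longest chain
AGE_MIN    = 30 * 24 * 3600   # received more than 30 days ago
HASH_FLOOR = 0.50             # no day below 50% of the 30-day median
SHARE_MIN  = 0.90             # chain got >90% of all work incl. forks

def eligible_for_checkpoint(block, chain, daily_work, fork_work, now=None):
    """block: a block on `chain`, the node's active (longest) chain, so the
    first condition holds by construction. Each block has .height and
    .received (unix time). daily_work: this chain's work for each of the
    last 30 days. fork_work: total work seen on all competing forks."""
    now = now or time.time()
    if chain[-1].height - block.height < DEPTH_MIN:
        return False                                  # not deep enough
    if now - block.received < AGE_MIN:
        return False                                  # not old enough
    median = statistics.median(daily_work)
    if any(day < HASH_FLOOR * median for day in daily_work):
        return False                                  # hashrate dipped
    chain_work = sum(daily_work)
    if chain_work <= SHARE_MIN * (chain_work + fork_work):
        return False                                  # forks too heavy
    return True                                       # soft-checkpoint it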
legendary
Activity: 1120
Merit: 1164
You can use https://twitter.com/blockheaders as your blockchain information source and create a secure Bitcoin wallet that has no concept of the P2P network or the blockchain itself. If you have a few other sources of block header information you aren't even trusting any one entity; information is easy to spread and difficult to stifle.
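
For example, a header published by such a feed can be checked on its own: parse the 80 bytes and verify that its double-SHA256 hash meets the target encoded in its "bits" field. A minimal sketch (no real header values are assumed here):

Code:
import hashlib, struct

def header_meets_target(header_hex):
    raw = bytes.fromhex(header_hex)
    assert len(raw) == 80, "a Bitcoin block header is exactly 80 bytes"
    # layout: version(4) prev_hash(32) merkle_root(32) time(4) bits(4) nonce(4)
    bits = struct.unpack_from("<I", raw, 72)[0]
    exponent, mantissa = bits >> 24, bits & 0x007FFFFF
    target = mantissa << (8 * (exponent - 3))   # compact bits -> full target
    digest = hashlib.sha256(hashlib.sha256(raw).digest()).digest()
    # the hash is compared against the target as a little-endian integer
    return int.from_bytes(digest, "little") <= target

With a few such independent feeds agreeing on the same sequence of valid, correctly linked headers, a wallet can follow the most-work chain without ever touching the P2P network.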

Any attempt to change the rules for what constitutes a valid blockchain to something other than longest-wins has the ugly consequence that SPV clients no longer work. You can do it - in an emergency we may have no choice at all - but remember that it has ugly consequences.

Anyway, the issue has been discussed to death before. Taking priority into account is something Gavin mentioned on his blog last year, and that was just to make sure the public realized there are last ditch options available.