Topic: 2 Bitcoin papers published at Financial Cryptography 2012 conference (Read 2448 times)

legendary
Activity: 3920
Merit: 2349
Eadem mutata resurgo
full member
Activity: 210
Merit: 100
Thanks for the links Simon.
staff
Activity: 4284
Merit: 8808
Checkpoints don't solve anything.

They bound some corner-case DoS attacks (e.g. I send you a million-block difficulty-1 fork, filling up your disk space), and they reduce isolation attacks (a new user installs the software and an evil ISP keeps them from connecting to the real Bitcoin network) to the problem of getting an authentic copy of the software, or the problem of plausibly maintaining a fork at recent difficulties.

Don't read too much more into them than that.
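The bounding effect described above can be sketched in a few lines. This is a hypothetical illustration of checkpoint enforcement, not the actual Satoshi-client code; the heights and hash strings are placeholders:

```python
# Hypothetical sketch of checkpoint enforcement. Heights and hashes
# are placeholders, not real block hashes.
CHECKPOINTS = {
    11111: "hash-at-height-11111",
    134444: "hash-at-height-134444",
}

def accept_chain(chain):
    """chain: list of (height, block_hash) pairs.

    Reject any fork that contradicts a hardcoded checkpoint,
    regardless of how long it is or how much work it claims.
    This is what caps the disk-filling low-difficulty-fork attack:
    a conflicting chain is discarded instead of stored.
    """
    for height, block_hash in chain:
        expected = CHECKPOINTS.get(height)
        if expected is not None and block_hash != expected:
            return False
    return True

# A chain matching the checkpoints is considered; a fork that
# rewrites a checkpointed height is dropped outright.
assert accept_chain([(11110, "anything"), (11111, "hash-at-height-11111")])
assert not accept_chain([(11111, "attacker-fork-hash")])
```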
legendary
Activity: 1526
Merit: 1134
Thanks for the links.

I agree that the Bitcoin community has already invented virtually all of the solutions proposed in the paper. This is good - it means other smart people have independently studied the issues and come to the same conclusions.

BTW there is no need for third-party filtering services. It can be integrated directly into the node software. Clients provide Bloom filters over scripts, and the nodes return matching transactions with Merkle branches linking them to the block headers. I'll probably implement it at some point if nobody else does, but right now there are more important things to do.
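The Merkle-branch part of this scheme can be sketched concretely. The idea: the node returns a matching transaction plus the sibling hashes on its path to the root, and the thin client hashes its way up and compares the result against the Merkle root stored in the block header. A minimal sketch using Bitcoin-style double SHA-256 (function names are illustrative, not from any particular codebase):

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(leaves):
    """Merkle root over raw leaves, duplicating the last node on
    odd-length levels (as Bitcoin does)."""
    level = [sha256d(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def branch_for(leaves, index):
    """Node side: collect the sibling hashes a thin client needs to
    link leaf `index` to the root. Each entry is (hash, sibling_is_left)."""
    level = [sha256d(x) for x in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        branch.append((level[sibling], sibling < index))
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify_branch(leaf, branch, root):
    """Thin-client side: hash up the branch and compare against the
    Merkle root taken from the block header."""
    h = sha256d(leaf)
    for sibling, sibling_is_left in branch:
        h = sha256d(sibling + h) if sibling_is_left else sha256d(h + sibling)
    return h == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d", b"tx-e"]
root = merkle_root(txs)
proof = branch_for(txs, 2)
assert verify_branch(b"tx-c", proof, root)       # genuine inclusion
assert not verify_branch(b"tx-x", proof, root)   # forged tx fails
```

The client only ever needs the block headers plus a logarithmic-size branch per matching transaction, which is what makes the filtered-node idea workable without a third party.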
hero member
Activity: 714
Merit: 500
Thanks for the papers.
legendary
Activity: 1358
Merit: 1003
Ron Gross
... These design choices are not compatible with a long-term sustainable system

This is only true if Bitcoin doesn't achieve world dominance and transaction fees turn out to be unsustainable.
If the entire world used Bitcoin for 1% of its online transactions, there could be enough transaction fees to properly incentivize honest miners. I agree that it's an open, hard problem for Bitcoin ... but I wouldn't go as far as saying that the current design cannot be sustained. It remains to be proven or disproven.
legendary
Activity: 1050
Merit: 1003
Checkpoints don't solve anything. A 51% attacker can still disrupt the network and prevent all txns from confirming, with or without checkpoints in place. The core problem is the currency-generation rules (declining block rewards) combined with proof-of-work-based voting. These design choices are not compatible with a long-term sustainable system. The authors point this out. I don't agree with their solutions either, but I give them credit for focusing attention on a very worrisome issue which most people ignore. It is far-sighted of them.

I agree that there is nothing original in the paper, but that is beside the point.

legendary
Activity: 1358
Merit: 1003
Ron Gross
FYI, the CommitCoin paper is not new - I saw it a couple of months ago, and it does raise interesting ideas.
It's a good paper.
legendary
Activity: 1358
Merit: 1003
Ron Gross
Section 4.3 of the Xavier paper makes me wonder if they've read any of my many posts about setting an exponential difficulty function for reorgs.

Bah

So far they promote the ludicrous idea of a "deflationary spiral" - meaning the coins are so valuable that they become worthless - and also propose the "novel" idea of checkpoints.

Quote
Countering “Revisionism” by Checkpointing the Past
We outline a distributed strategy to tackle the history-revision attack threat in a simple and elegant way

I mean, they even use the same fucking name ... would it have been so hard to Google it before publishing it as a new way to combat forks?


I was referring to this paper btw. Just finished reading it.

tl;dr - nothing new here, although it's a good intro to Bitcoin for newbies. I don't like how they claim to be innovative ... if they had treated it as a review article and done a bit more research, I would be supportive.

As I often do, I spoke too soon. They do mention the existing checkpoints (they call them Fiat Checkpoints) a few paragraphs later, but they qualify them:

Quote
Alas, there is no reason to trust a download of the software any more than one of the transaction history itself.

A claim which is false. No GPU/FPGA/ASIC farm in the world can change the published checkpoints ... so they do provide more security. Yes, you have to trust the developers to trust the checkpoints, but the point is that everyone in the Bitcoin community already trusts the devs far more than they trust "the hash power of the Bitcoin network". So they do provide an extra layer of protection.

There might be some merit in developing a more intricate checkpoint system in the far future, but it's not on the list of top priorities for the Bitcoin project.

Reading a bit further, the authors also "discover" the problem of malware stealing bitcoins and propose another "novel" approach - multi-sig. Also on their list of innovations are deterministic wallets and thin clients. They call the latter a "Filtering Service" ... but it can't just filter blocks and still have the client verify the blocks relevant to it, because the blocks depend on each other ... so it's essentially just yet another thin-client approach. I'll admit I haven't taken the time to understand their proposed filtering-service protocol, but I don't understand how something between a thin and a full client can function, so I won't bother (please correct me if I'm wrong and there is some new bit of info in this paper after all).

Also, instead of mixers they propose a "Fair Exchange Protocol" as a way to implement a zero-trust mixer. This has already been proposed a few times ... I believe Meni wrote about it.
legendary
Activity: 1050
Merit: 1003
The paper is by and large good. They are completely correct to point out that bitcoin is designed to become increasingly insecure over time.
legendary
Activity: 1358
Merit: 1003
Ron Gross
Section 4.3 of the Xavier paper makes me wonder if they've read any of my many posts about setting an exponential difficulty function for reorgs.

Bah

So far they promote the ludicrous idea of a "deflationary spiral" - meaning the coins are so valuable that they become worthless - and also propose the "novel" idea of checkpoints.

Quote
Countering “Revisionism” by Checkpointing the Past
We outline a distributed strategy to tackle the history-revision attack threat in a simple and elegant way

I mean, they even use the same fucking name ... would it have been so hard to Google it before publishing it as a new way to combat forks?
kjj
legendary
Activity: 1302
Merit: 1026
Section 4.3 of the Xavier paper makes me wonder if they've read any of my many posts about setting an exponential difficulty function for reorgs.
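The exponential-difficulty-for-reorgs idea mentioned here can be sketched as a chain-selection rule: a competing fork that rewrites d blocks must carry more work than the replaced segment by a factor that grows exponentially in d. This is a toy illustration of the concept only; the base 1.5 and the exact rule are assumptions for the example, not a spec from this thread:

```python
# Toy sketch of an exponential reorg penalty. The penalty base and
# the acceptance rule are illustrative assumptions.
def reorg_allowed(current_segment_work, fork_segment_work, depth, base=1.5):
    """Accept a fork that replaces `depth` blocks only if its work
    exceeds the replaced segment's work by base**depth. Shallow
    reorgs stay cheap; deep history rewrites become prohibitively
    expensive."""
    required = current_segment_work * (base ** depth)
    return fork_segment_work > required

# A 1-block reorg needs ~1.5x the replaced work; a 10-block reorg
# needs ~57x, so the same attacker budget no longer suffices.
assert reorg_allowed(100.0, 160.0, depth=1)
assert not reorg_allowed(100.0, 160.0, depth=10)
```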
sr. member
Activity: 303
Merit: 251
Thanks for providing these excellent links, Simon.
newbie
Activity: 56
Merit: 0
You can find the papers here:

http://fc12.ifca.ai/program.html

The paper 'Bitter to Better' has a slightly cleaner version and also presentation slides available here:

http://crypto.stanford.edu/~xb/fc12/