
Topic: On The Longest Chain Rule and Programmed Self-Destruction of Crypto Currencies (Read 17858 times)

full member
Activity: 154
Merit: 100
If I have misinterpreted his writings, he will, I assume, point that out.
You have, but I've given up responding.

Btw, a couple of months ago I solved the selfish-mining problem and proved to myself that gmaxwell was wrong. Here I quote a snippet from my private design document:

Code:
   3. Sharing revenue proportionally with the fork(s) of lower cumulative
      difficulty entirely defeats selfish mining (positive relative revenue) for
      any α. All blocks in the merged forks receive their full block reward
      adjusted by relative difficulty.

         r_others = p_0(1-α) + p_1(1-α) + p_2(1-α) + p[k>2](1-α), cases (e), (f), (g), and (h)
         r_pool = p_1(1-α)/2 + 2·p_2(1-α) + p[k>2](1-α)/2, cases (f), (g), and (h)

         r_others = p_1(1-α)(1/α + 1 + α/(1-α) + (α-1)/(2α-1) - 1 - α/(1-α))
         r_pool = p_1(1-α)(1/2 + 2α/(1-α) + (α-1)/(4α-2) - 1/2 - α/(2-2α))

         R_pool = (3α/(2-2α) + (α-1)/(4α-2))/(1/α  + 3α/(2-2α) + 3(α-1)/(4α-2))

      Plot the following at wolframalpha.com.

         (3α/(2-2α) + (α-1)/(4α-2))/(1/α  + 3α/(2-2α) + 3(α-1)/(4α-2)),  α, 0 < α < 0.5

      However, merging forks of lower cumulative difficulty would remove the
      economic incentive to propagate forks as fast as possible; and issuing a
      double-spend would be trivial. The solution is to only merge forks of
      lower cumulative difficulty when they are known by the majority before
      receiving a fork of higher cumulative difficulty. Thus an attacker or
      network hiccup that creates a hidden higher cumulative difficulty fork
      has an incentive to propagate it before that fork falls behind the
      majority. A node assumes it is part of the majority if it is participating
      non-selfishly; it will attempt to merge any lower cumulative difficulty
      forks it was aware of upon receiving a higher cumulative difficulty fork.
      If it is not part of the majority, then its attempt will not be accepted.
      If there are double-spends at stake, each node will continue to try for
      the longest fork miners wish to be able to merge, i.e. longer forks with
      double-spends won't be merged so the market will decide the value of the
      two forks. If there are no double-spends at stake then to defeat selfish
      mining, each node will continue to try during the duration of the longest
      fork probable for an attacker α < 0.5[6]. If a node selfishly forsakes its
      obligation and joins the attacker, that node attacks the value of the
      block rewards it earned.

      Also, forfeiting double-spends defeats any incentive to temporarily rent
      greater than 50% of the network hashrate because the persistent honest
      miners will forfeit the double-spends from the attacker’s fork upon it
      relinquishing control. Compared to Bitcoin, there is no increased
      incentive to double-spend, because miners will not place into any block a
      double-spend they’ve already seen; double-spends can only appear in forks.
      In Bitcoin one fork wins so one transaction from each of the double-spend
      is forfeited so the honest recipient loses when the attacker’s fork wins.
      In this new algorithm both of the transactions from each double-spend are
      forfeited, so both the honest recipient and the attacker lose.

      Defeating short-term rented hashrate attacks in conjunction with only
      accepting inputs to a transaction which are as old as the longest duration
      fork that will be merged, proves the probability of a double-spend[6] to
      be determined solely by the number of block confirmations no matter how
      fast the block period is.

      Instead of monolithically choosing a winning fork, forfeiting double-
      spends deletes selective downstream transactions. Thus fully opaque block
      chains such as Zerocash, are thought to be incompatible because they not
      only obscure which transaction consumed its inputs but also hide any coin
      identifier that could correlate spends on separate forks. Cryptonote’s
      one-time ring signatures are more flexible because mixes could choose to
      mix only with sufficiently aged transaction outputs.

      Each block of the Proof Chain contains a tree of block hashes, which are
      the forks that were merged and a branch in a subsequent block may continue
      a branch in a prior block. The restricted merging rule, double-spend
      forfeiture and proof trees avoid the risks of Ethereum’s proposed
      approximation[5]. Miners have to maintain transaction history for the
      longest fork they wish to merge. And downstream consumers of inputs must
      wait for the longest fork they anticipate[6].


   [5] https://blog.ethereum.org/2014/07/11/toward-a-12-second-block-time/
       Now, we can’t reward all stales always and forever; that would be a
       bookkeeping nightmare (the algorithm would need to check very diligently
       that a newly included uncle had never been included before, so we would
       need an “uncle tree” in each block alongside the transaction tree and
       state tree) and more importantly it would make double-spends cost-free...
       Specifically, when only the main chain gets rewarded there is an
       unambiguous argument...Additionally, there is a selfish-mining-esque
       attack against single-level...The presence of one non-standard strategy
       strongly suggests the existence of other, and more exploitative,
       non-standard strategies...True. We solved the selfish mining bugs of
       traditional mining and 1-level GHOST, but this may end up introducing
       some new subtlety; I hinted at a couple potential trouble sources in the
       post.
   [6] Meni Rosenfeld. Analysis of hashrate-based double-spending.
       Section 5 Graphs and analysis.
       http://arxiv.org/pdf/1402.2009v1.pdf#page=10
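The closed-form R_pool above can also be checked numerically; here is a minimal sketch (the function name and sample values are my own) that evaluates the quoted expression and confirms the pool's relative revenue stays below its hashrate share α, i.e. selfish mining yields no gain:

```python
# Numerical check of the pool's relative revenue R_pool under the
# proportional-sharing rule quoted above. If R_pool(alpha) < alpha for all
# 0 < alpha < 0.5, a selfish pool earns less than its fair share.

def r_pool(alpha):
    """Relative revenue of the selfish pool, per the formula above."""
    num = 3*alpha/(2 - 2*alpha) + (alpha - 1)/(4*alpha - 2)
    den = 1/alpha + 3*alpha/(2 - 2*alpha) + 3*(alpha - 1)/(4*alpha - 2)
    return num / den

for alpha in (0.1, 0.25, 0.33, 0.4, 0.49):
    r = r_pool(alpha)
    print(f"alpha={alpha:.2f}  R_pool={r:.4f}  selfish gain: {'yes' if r > alpha else 'no'}")
```

At α = 0.25 this gives R_pool ≈ 0.185 < 0.25, matching the claim that relative revenue never exceeds the pool's hashrate share.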
newbie
Activity: 10
Merit: 0


Hi Andrew,

In future it is better to create a new thread rather than resurrecting an old one, especially one as vivacious as this one.

As to the content of your article, I briefly skimmed it. A few comments -- your concerns about ASIC monopolies are largely addressed in my ASICs and Decentralization FAQ, and secondly, the "anti-monopoly" scheme by Sirer and Eyal is seriously and fundamentally broken by being progress-free. It seems to me that these authors are more concerned with promoting themselves with doomsday headlines than they are getting the fundamentals of what they write about correct, and it's best for the Bitcoin world if they not be given attention.

Andrew


Hi Andrew,

thank you for your reply, and thanks for your FAQ; it does explain a lot of things pretty well, especially why ASIC miners are so important for validating the blockchain in the long term. On the other hand, in my article I have not ruled out the use of ASIC miners; it is only possible future monopolies controlling the network that I see as a threat to Bitcoin. Please read the part of my article which deals with the position stamp for guaranteeing the decentralization of Bitcoin. By decentralization I do not mean that in the future there should not be any mining pools or big miners. On the contrary, I mean a safeguard, implemented in the blockchain itself, to counter attacks and to detect monopolies. My suggestions would also have an impact on the hardware flow, but that we would have to discuss in detail.

I think I will open a new thread about this topic Smiley

Cheers,
Andrew
full member
Activity: 179
Merit: 151
-
A very interesting paper. But I don't think that crypto currencies will self-destruct if we can make the right adjustments.
I have thought about a way to guarantee the decentralization of a crypto currency, or Bitcoin, in the long term. I have written an article and would like to hear what you think about it:
http://techreports2014.wordpress.com/2014/09/07/fundamentals-of-a-possible-new-bitcoin-fork-bitcoin-2-0/

Hope it can help, and I would love to discuss it with you guys.

Hi Andrew,

In future it is better to create a new thread rather than resurrecting an old one, especially one as vivacious as this one.

As to the content of your article, I briefly skimmed it. A few comments -- your concerns about ASIC monopolies are largely addressed in my ASICs and Decentralization FAQ, and secondly, the "anti-monopoly" scheme by Sirer and Eyal is seriously and fundamentally broken by being progress-free. It seems to me that these authors are more concerned with promoting themselves with doomsday headlines than they are getting the fundamentals of what they write about correct, and it's best for the Bitcoin world if they not be given attention.

Andrew
newbie
Activity: 10
Merit: 0
@all

A very interesting paper. But I don't think that crypto currencies will self-destruct if we can make the right adjustments.
I have thought about a way to guarantee the decentralization of a crypto currency, or Bitcoin, in the long term. I have written an article and would like to hear what you think about it:
http://techreports2014.wordpress.com/2014/09/07/fundamentals-of-a-possible-new-bitcoin-fork-bitcoin-2-0/

Hope it can help, and I would love to discuss it with you guys.

Greetings.
Andrew
legendary
Activity: 924
Merit: 1129
The heaviest subtree model?   What's that? 
I've not heard that one yet.

In any decision about which of two potential branches to accept, the transactions-as-proof-of-stake method (aka heaviest-subtree model) prefers the branch whose transactions have spent the greatest proportion of the txouts that existed at the moment when those two branches diverged. 

In order for this to work, transactions must belong clearly to one branch or the other.  So the transaction itself has to have a block ID embedded in it, and it counts as "support" for the branch that includes that block ID. 

So the 'finite resource' that must be used up to support a branch and which, if used, cannot also be used to support another branch, is coins in txouts, not hashing power.  And the people who deserve the coins for helping to secure the blockchain are everybody who made a transaction using their stake, rather than whoever came up with the winning hash. 
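As a sketch of the scoring rule just described (all names and values are hypothetical), support for a branch is the summed value of txouts spent by transactions embedding a block ID on that branch:

```python
# Sketch of transactions-as-proof-of-stake branch scoring (hypothetical
# names). Each transaction embeds the block ID of a recent block; its spent
# value counts as support for whichever branch contains that block.

def branch_support(transactions, branch_block_ids):
    """Sum the spent txout value of transactions voting for this branch."""
    return sum(tx["spent_value"] for tx in transactions
               if tx["block_id"] in branch_block_ids)

def prefer_branch(transactions, branch_a_ids, branch_b_ids):
    """Prefer the branch whose transactions spent more of the coin supply."""
    a = branch_support(transactions, branch_a_ids)
    b = branch_support(transactions, branch_b_ids)
    return "A" if a >= b else "B"

txs = [
    {"block_id": "a1", "spent_value": 10},   # votes for branch A
    {"block_id": "b1", "spent_value": 25},   # votes for branch B
    {"block_id": "a2", "spent_value": 40},   # votes for branch A
]
print(prefer_branch(txs, {"a1", "a2"}, {"b1"}))  # branch A wins: 50 vs 25
```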

hero member
Activity: 518
Merit: 521
Academics have produced nothing but perfect nonsense on the topic of Bitcoin. This is one of the worst.

Nice "not invented here" bravado but incorrect.


He's also right about the effects of block reward halving on hash power allocation.

No he isn't, or at least his conclusions on what "will" happen are just speculation.

...

a) continue to mine bitcoin for half the revenue
b) sell the hardware to a miner with lower costs (namely cheaper/free electricity and cool climate)
c) mine an altcoin.

The author jumps right to c.

The author might have the wrong justification for the conclusion, but the "Programmed Self-Destruction" conclusion is not speculation because the self-evident fact is that investors only spend a small fraction of their income, thus if you don't redistribute currency then its use dies[1].

[1] https://bitcointalksearch.org/topic/m.7900846
hero member
Activity: 518
Merit: 521
I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

Agreed.  One issue is that it makes risk analysis difficult.  This means the simplicity of "wait for x confirmations and you are safe (unless the attacker has a majority of the hashrate)" no longer applies.

I don't know if I have missed some discussion that would have changed the understanding I formed, but I pointed out egregious flaws in the original proposal for Transactions as a Proof-of-stake.

The fundamental math problem with using any metric from the block chain (or any consensus voting, such as proof-of-stake) is that the input entropy can be gamed deterministically, unlike proof-of-work, which is a randomized process; i.e., the input entropy is not orthogonally unbounded as it is in the randomization of proof-of-work.
hero member
Activity: 518
Merit: 521
If I have misinterpreted his writings, he will, I assume, point that out.
You have, but I've given up responding.

Nice to see technical discussion has been reduced to politics. Smells like the typical "not invented here, so ignore it" phenomenon of vested interests (or I don't want to help the competition).

What is the point of technical discussion if you are not going to make the necessary clarifications?

I believe my interpretation is complete. It is up to you to show otherwise, or give up if you can't/won't.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
The heaviest subtree model?   What's that? 
I've not heard that one yet.
donator
Activity: 1218
Merit: 1079
Gerald Davis
I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

Agreed.  One issue is that it makes risk analysis difficult.  This means the simplicity of "wait for x confirmations and you are safe (unless the attacker has a majority of the hashrate)" no longer applies.

Quote
In the long run, it can provide an absolute security guarantee given enough time; Once more than half of all the coins in txouts that existed before a block was created have been spent, that block becomes absolutely irrevocable no matter what proof-of-work anybody pours on or what manipulations they do with spending and transactions.  

One problem is that a large number of outputs have not ever been spent, and may not be spent for years or decades.  So it could be some time before a block was absolutely irrevocable.  The large amount of old unspent outputs creates uncertainty.  One variant would be to only include outputs which are below a certain age at the time of the block.   For example, you could say that for the purpose of block scoring, outputs older than one block month (4,320 blocks) aren't included in the score.  This would reduce the requirement to only a majority of the outputs less than a month old.

Still, it will require some careful analysis to avoid some unexpected weaknesses.   For as complex as Bitcoin is in implementation, it is rather simple (maybe elegant is a better word) in design.  There are still nuances and gotchas in the Bitcoin protocol, even though it is built on a simple design.  More complexity may not be the answer.
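The age-cutoff variant described above can be sketched as follows (names and numbers are illustrative):

```python
# Sketch of the variant above: for block scoring, only count spent outputs
# younger than one block month (4,320 blocks) at the time of the scored
# block. All names and values are illustrative.

BLOCK_MONTH = 4320  # roughly 30 days at 10-minute blocks

def score_block(block_height, spent_outputs):
    """Sum the value of spent outputs created within the last block month."""
    return sum(value for created_height, value in spent_outputs
               if block_height - created_height <= BLOCK_MONTH)

# (created_height, value): one ancient output, two recent ones.
outputs = [(1000, 50), (97000, 30), (99500, 20)]
print(score_block(100000, outputs))  # only the recent outputs count: 30 + 20 = 50
```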
legendary
Activity: 924
Merit: 1129
I believe transactions-as-proof-of-stake (the heaviest subtree model) is probably the best alternative to proof-of-work - and it isn't all that good.

The basic idea is that the "finite resource" available for deciding to prefer one chain over another, is the set of unspent txouts that exist at the point of the chains' divergence from each other.  If transactions must give the hash of a very recent block (parent or grandparent to the block they'll be in) then they can be counted as a "vote" for a chain including that block. 

In practice, this makes it possible for an attacker to spend ten coins in one chain, then support a different chain by spending a thousand coins (probably to himself) there, and if the second chain is accepted it 'unspends' his ten coins.  Obviously this only works as a double spend if he does it before everybody else in the course of regular spending puts the first chain a thousand coins ahead. 

But it gets worse than that, because at any given moment there may be dozens or even hundreds of crooks looking for a chance to double spend, and if two competing chains appear, their efforts to make a small initial expenditure in the apparent leading chain and then dump a huge transaction into the second chain all reinforce each other. 

On one hand, if everybody understands the security requirement for transactions as proof of stake and regularly transacts their coins several times a day, (which you can arrange with a proof-of-stake interest/security payment for each transaction) the crooks shouldn't be able to overwhelm that traffic with their timing games.  On the other, that would generate an absolutely enormous blockchain and have a high communications overhead.

So in the short run, it doesn't work.  In the long run, it can provide an absolute security guarantee given enough time; Once more than half of all the coins in txouts that existed before a block was created have been spent, that block becomes absolutely irrevocable no matter what proof-of-work anybody pours on or what manipulations they do with spending and transactions.   
staff
Activity: 4172
Merit: 8419
If I have misinterpreted his writings, he will, I assume, point that out.
You have, but I've given up responding.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
The bottom line is your proposal just shifts around the conditions for an attack, it doesn't really add additional security without trading off security elsewhere.

If you disagree, the only person you seem to have convinced is yourself.
hero member
Activity: 518
Merit: 521
gmaxwell said "your proposal is completely ineffective".

But whatever, keep arguing... its what you're best at.

I am not arguing, I am clarifying that you (are intellectually handicapped—which I avoided stating until you attacked me—and) don't understand what Gmaxwell wrote:

and where was my solution proposed before?

But it's not a solution, alas. Ignoring other issues, at best it still leaves it at a simple piece of extortion "return most of the funds to me or I will reliably destroy your payment". In that sense it is pretty much isomorphic to "replace by fee scorched earth". The ongoing effort has other problems— a txout can be spent again immediately in the same block. Imagine it takes months to get the fraud notice out (heck, imagine a malicious miner creating one and intentionally withholding it).  By that time perhaps virtually all coins in active circulation are derived from the conflicted coins. Now they finally get the notice out (/finally stop hiding it). What do you do?  Nothing? Invalidate _everyone's_ coins? Partially invalidate everyone's coins?  Each option is horrible. Do nothing makes the 'fix' ineffective in all cases: the attacker just always sends the coins to themselves in the same block, the others make the failure propagate— potentially forever, and don't just hit the unlucky merchant with the potentially unwise policy.

The "makes the 'fix' ineffective in all cases" refers to "The ongoing effort has other problems", so he means there is no solution in Bitcoin (in "the ongoing effort").

And my response:

But it's not a solution, alas. Ignoring other issues, at best it still leaves it at a simple piece of extortion "return most of the funds to me or I will reliably destroy your payment".

That specific threat was paramount in my mind as I was designing my proposal and I think I eliminated it.

The mining nodes reject any double-spend transaction which conflicts with the block chain. The only transactions that can be unwound are those which appear in a competing fork and only when that competing fork does not have enough sustained agreement. The premise is that the attacker can't maintain 50+% of the hashrate indefinitely. Essentially what I am proposing is that orphaned chains are not forgotten by the sustained majority when the longer chain temporarily double-spends the orphaned chain, so the sustained majority (eventually) unwinds the temporary attack. The attack is differentiated from the majority because it is not sustained indefinitely. Abstractly, I am proposing a smoothing filter on the proof-of-work longest chain rule. The ephemeral attacker is the aliasing error.

And I think (perhaps) they can be unwound to eliminate the double-spend, rather than to the ether.

Gmaxwell asserted that if transactions can be unwound then any recipient of funds could be under threat by the payer to send a double-spend and invalidate the transaction.

I rebutted by explaining that transactions are only unwound if a double-spend appeared in a block chain, but that consensus nodes were not going to accept a double-spend into the block chain. The only way to get a double-spend into the block chain is to do a 50% attack, thus payers won't be able to make such a threat.

And Gmaxwell admitted that for the 50% attack scenario, Bitcoin has the same weakness in that transactions can be unwound when a chain is orphaned.

If I have misinterpreted his writings, he will, I assume, point that out.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
gmaxwell said "your proposal is completely ineffective".

But whatever, keep arguing... its what you're best at.
hero member
Activity: 518
Merit: 521
Uh, pretty sure they did say your idea won't work...more than once.

Quote them to document your assertion. You can't.
legendary
Activity: 1302
Merit: 1004
Core dev leaves me neg feedback #abuse #political
I have such a headache from trying to understand this thread so instead, I am bookmarking it for later.  Cheesy

A lot of Anonymint's ideas (Aliasing) are analogies and technobabble, and really have nothing to do with Bitcoin or blockchain technology.

(He should really stop trying to impress everyone with big words and explain his ideas in plain English.)

Hey ad hominem bullshit flows out of your mouth. I can't compensate for your intellectual handicap.

Anyway, Anonymint, I don't think more devs need to see the proposal.  You've already got Gmaxwell and DeathandTaxes telling you it won't work, what more do you want?

Neither Gmaxwell nor DeathandTaxes have stated that my idea won't work. D&T hasn't even addressed my idea. Gmaxwell stated that the existing strategy has the same problem with derivative unwinds as my strategy-- that is not the same as saying my idea won't work. But you aren't even able to comprehend.

I think I've given some fairly clear arguments also.

And I've addressed all of your posts.

I doubt you can rent half the network power anyway.

I've put that out there as an open question in my prior post and no one has credibly addressed it yet.

Uh, pretty sure they did say your idea won't work...more than once.

Notice they stopped posting because they are probably sick of
your antics...AGAIN.  You earned your badge in trolling long ago with
your hysterics that Bitcoiners are going to be broke and
also go to jail because of clawbacks.  

And no, you didn't address all my points.  You didn't solve the
51% attack problem in any way, shape, or form.

And if you want to disprove my assertion that you're a
pseudo-intellectualist, go ahead... explain in plain,
simple English how "Aliasing" relates to blockchains or
distributed consensus or anything related.


hero member
Activity: 518
Merit: 521
...Gmaxwell stated that the existing strategy has the same problem with derivative unwinds as my strategy-- that is not the same as saying my idea won't work...

The cited paper:

Variants in measuring the blockchain, such as following the heaviest subtree, not the longest chain, have been proposed, and are the best hope at improving that basic piece of the protocol. https://eprint.iacr.org/2013/881.pdf

Quote
Perhaps the most important question that will affect Bitcoin's success is whether or not it will be able to scale to support the high volume of transactions required from a global currency system. We investigate the restrictions on the rate of transaction processing in Bitcoin as a function of both the bandwidth available to nodes and the network delay, both of which lower the efficiency of Bitcoin's transaction processing.

The following summarizes Gmaxwell's point as reinterpreted in the quote above, and also my orthogonal point that there is an unbounded increase in the number of confirmations needed to protect against an attacker who can sustain > 50% of the network hashrate:

https://eprint.iacr.org/2013/881.pdf#page=7

Quote
The replacement of the current world-view with an alternative one has far reaching consequences: some transactions may be removed from the current ledger. This fact can be used by an attacker to reverse transactions. The attacker may pay some merchant and then secretly create a blockchain that is longer than that of the network that does not include his payment. By releasing this chain he can trigger a switch that effectively erases the transaction, or redirects the payment elsewhere. This is a difficult undertaking, since the honest nodes usually have a great deal of computational power, and the attacker must get very lucky if he is to replace long chains. The longer the chain, the more difficult it becomes to generate the proof-of-work required to replace it. Satoshi's original security analysis defines a policy for receivers of payments: a transaction is only considered sufficiently irreversible after it was included in a block and some n additional blocks were built on top of it. With this policy, Satoshi shows that the probability of a successful attack can be made arbitrarily low. As a receiver of funds waits for more blocks (larger n), this probability goes down exponentially.

However, if an attacker has more computational power than the rest of the network combined (i.e., it holds at least 50% of the computing power), it is always able to generate blocks faster than the rest of the network and thus to reverse transactions at will (given enough time). This stronger form of attack is known as the 50% attack.
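The exponential decay described in the quote is the familiar catch-up random walk; here is a minimal sketch under Satoshi's simplified model (Rosenfeld's paper [6] gives the more precise analysis; the function name is mine):

```python
# Probability that an attacker with hashrate share q ever overtakes a chain
# that is z blocks ahead (Satoshi's random-walk bound; Rosenfeld [6] gives
# the more precise finite-time treatment).

def catch_up_probability(q, z):
    """Chance the attacker's chain ever catches up from z blocks behind."""
    p = 1.0 - q  # honest hashrate share
    if q >= p:
        return 1.0  # a majority attacker eventually wins with certainty
    return (q / p) ** z

# The probability falls exponentially in z for a minority attacker.
for z in (1, 6, 17):
    print(f"q=0.24, z={z}: {catch_up_probability(0.24, z):.6f}")
```

Note the quote's caveat: with network delays, an attacker below 50% (the paper's 24% example) can behave like a majority attacker, for whom this probability is 1 regardless of z.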

The paper even points out that with network propagation advantages the attacker may be able to sustain the longest chain indefinitely with < 50% of the network hashrate:

Quote
In fact, the assumption that at least 50% of the computational power is required for such an attack to succeed with high probability is inaccurate. If we assume the attacker is centralized and does not suffer from delays, he can beat a network that does suffer from delays using fewer resources. We formulate the exact conditions for safety from this attack, and amend Satoshi's analysis below. We return to the analysis of the weaker double spend attack in Sections 6 and 7.

The following calculation applies to a block period of 3.5 (1/0.29) seconds and 17 (59/3.5) confirmations:

https://eprint.iacr.org/2013/881.pdf#page=17

Quote
in some network configurations that match the assumptions above, an attacker with just over 24% of the hash-rate can successfully execute a so-called 50% attack, i.e., to replace the main chain at will

The paper has some insights related to those I had for my idea:

https://eprint.iacr.org/2013/881.pdf#page=18

Quote
The basic observation behind the protocol modification that we suggest, is that blocks that are off the main chain can still contribute to a chain's irreversibility. Consider for example a block B, and two blocks that were created on top of it C1 and C2, i.e., parent(C1) = parent(C2) = B. The Bitcoin protocol, at its current form, will eventually adopt only one of the sub-chains rooted at C1 and C2, and will discard the other. Note however, that both blocks were created by nodes that have accepted block B and its entire history as correct. The heaviest sub-tree protocol we suggest makes use of this fact, and adds additional weight to block B, helping to ensure that it will be part of the main chain.

However, it is not trying to address the ephemeral > 50% attack that my idea does. Instead, the paper's GHOST protocol mitigates the fact that network propagation delay topologies can otherwise give the attacker an advantage, such that it can execute attacks with the same probability of success with less than 50% of the hashrate (actually, less than any probability curve calculated in Meni Rosenfeld's paper as cited).

GHOST aggregates the proof-of-work over n confirmations of all forks in the subtree above (i.e. after) B, i.e. it is a smoothing function:

Quote
[Footnote 10] We are in fact interested in the sub-tree with the hardest combined proof-of-work, but for the sake of conciseness, we write the size of the subtree instead.

https://eprint.iacr.org/2013/881.pdf#page=19

Quote
Thus, if we wait long enough, the honest subtree above B will be larger than the one constructed by the attacker, with sufficiently high probability.
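For illustration, GHOST's selection rule (at each fork, descend into the child whose subtree is heaviest) can be sketched as follows; the tree encoding and names are my own, and block count stands in for combined proof-of-work (see the paper's footnote on this simplification):

```python
# Sketch of GHOST chain selection: at each fork, follow the child whose
# ENTIRE subtree contains the most blocks, rather than the child heading
# the longest chain. Tree encoding and names are illustrative.

def subtree_size(tree, block):
    """Count blocks in the subtree rooted at `block` (including it)."""
    return 1 + sum(subtree_size(tree, c) for c in tree.get(block, []))

def ghost_chain(tree, genesis):
    """Walk from genesis, always entering the heaviest subtree."""
    chain, block = [genesis], genesis
    while tree.get(block):
        block = max(tree[block], key=lambda c: subtree_size(tree, c))
        chain.append(block)
    return chain

# Fork at B: C1 heads the longer chain (B-C1-D1-E1), but C2's subtree
# contains more blocks (5 vs 3), so GHOST adopts C2.
tree = {"B": ["C1", "C2"], "C1": ["D1"], "D1": ["E1"],
        "C2": ["D2", "D3", "D4"], "D2": ["E2"]}
print(ghost_chain(tree, "B"))  # ['B', 'C2', 'D2', 'E2']
```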

In my idea the nodes of the network utilize an additional piece of information which is the observation that a fork above B was orphaned by a fork which double-spends transactions in B (which applies weighting to the forks of the subtree above B by the observations of the nodes). Thus I believe my idea is more powerful and able to address the ephemeral > 50% attack (as well as < 50% attacks with greater probability) because it utilizes more information.

That paper and my idea are applying smoothing filters which incorporate more information, so that aliasing error is mitigated. There is a general concept in sampling theory-- don't discard information, filter it instead.

The paper also says we shouldn't discard information when retargeting the difficulty:

https://eprint.iacr.org/2013/881.pdf#page=21

Quote
Retargeting (difficulty adjustment). Given potentially complex relations between the growth rate of the main chain and the rate of created blocks, and the fact that GHOST depends more on the total rate of block creation, we suggest a change in the way difficulty adjustments to the proof-of-work are done. Instead of targeting a certain rate of growth for the longest chain, i.e., β (which is Bitcoin's current strategy), we suggest that the total rate of block creation be kept constant (λ). As our protocol requires knowledge of off chain blocks by all nodes, we propose that information about off chain blocks be embedded inside each block (blocks can simply hold hashes of other blocks they consider off-chain). This can be used to measure and re-target the difficulty level so as to keep the total block creation rate constant.
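The retargeting change suggested in the quote can be sketched as follows; the names and constants are illustrative, not from the paper:

```python
# Sketch of GHOST-style retargeting: adjust difficulty from the TOTAL
# observed block creation rate (main chain plus off-chain blocks referenced
# in block headers) instead of from the growth rate of the longest chain
# alone. Names and constants are illustrative.

TARGET_TOTAL_RATE = 1.0 / 12.0  # e.g. one block (on or off chain) per 12 s

def retarget(old_difficulty, total_blocks_seen, elapsed_seconds):
    """Scale difficulty so the observed total rate matches the target."""
    observed_rate = total_blocks_seen / elapsed_seconds
    return old_difficulty * (observed_rate / TARGET_TOTAL_RATE)

# Blocks (including off-chain ones) came in at one per 6 s, twice the
# target rate, so the difficulty doubles.
print(retarget(100.0, 200, 1200))  # 200.0
```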
hero member
Activity: 518
Merit: 521
The only other alternative Blockchain-by-Proof-of-X method that has been proposed since Nakamoto's solution has been Blockchain-by-Proof-of-Stake...

Recent rigorous security analyses of Blockchain-by-Proof-of-Stake methods are troubling...

Proof-of-stake == centralization.
hero member
Activity: 518
Merit: 521
I found my post where I had analyzed this paper on May 14.

Well, I see that as of January 2014, others below started to expound upon what I had explained in November 2013 in the threads given by the quoted links above.

On The Longest Chain Rule and Programmed Self-Destruction of Crypto Currencies


...

The rest of the point of the above paper regarding tx timestamps is really a flawed, ad hoc way of attempting to achieve the decentralization that the prior sentence would achieve more correctly.

http://arxiv.org/pdf/1405.0534.pdf#page=29

Quote from: Nicolas T. Courtois
A big question is whether timestamps are needed at all, see Section 7.3. An alternative to timestamps could be various pure consensus mechanisms without timestamps by which numerous network nodes would certify that they have seen one transaction earlier than another transaction. In this paper we take the view that they should be present by default and further confirmed by (the same) sorts of additional mechanisms.