
Topic: DECENTRALIZED crypto currency (including Bitcoin) is a delusion (any solutions?) - page 33. (Read 91125 times)

sr. member
Activity: 420
Merit: 262
I refuse to repeat myself again.

Yet this phrase was repeated several times...

Ok then, I refuse to refuse to refuse to repeat myself again.   Cheesy

(reader's hacker humor, I guess)

Ok, I'll just leave this here for the interest of the reader

Thanks for attempting to misinform the readers. That will help prevent anyone from seriously implementing this before I do. A future white paper can put this issue to rest.
legendary
Activity: 2142
Merit: 1009
Newbie
I refuse to repeat myself again.

Yet this phrase was repeated several times...
member
Activity: 81
Merit: 10
Ok, I'll just leave this here for the interest of the reader

Quote
Profitability of mining has nothing to do with the miner's economics of double-spending.

versus

http://nakamotoinstitute.org/bitcoin/

Quote
6. Incentive
By convention, the first transaction in a block is a special transaction that starts a new coin owned by the creator of the block. This adds an incentive for nodes to support the network, and provides a way to initially distribute coins into circulation, since there is no central authority to issue them. The steady addition of a constant amount of new coins is analogous to gold miners expending resources to add gold to circulation. In our case, it is CPU time and electricity that is expended.

The incentive can also be funded with transaction fees. If the output value of a transaction is less than its input value, the difference is a transaction fee that is added to the incentive value of the block containing the transaction. Once a predetermined number of coins have entered circulation, the incentive can transition entirely to transaction fees and be completely inflation free.

The incentive may help encourage nodes to stay honest. If a greedy attacker is able to assemble more CPU power than all the honest nodes, he would have to choose between using it to defraud people by stealing back his payments, or using it to generate new coins. He ought to find it more profitable to play by the rules, such rules that favour him with more new coins than everyone else combined, than to undermine the system and the validity of his own wealth.
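The fee rule in the quoted passage can be sketched in a few lines. This is a minimal illustration, not Bitcoin's actual validation code; values are illustrative, denominated in satoshis:

```python
# Sketch of the fee rule quoted above: when a transaction's output value is
# less than its input value, the difference is the fee added to the incentive
# of the block containing it. Values here are illustrative, in satoshis.

def transaction_fee(input_values, output_values):
    # fee = sum(inputs) - sum(outputs); a negative fee is an invalid transaction
    fee = sum(input_values) - sum(output_values)
    if fee < 0:
        raise ValueError("outputs exceed inputs: invalid transaction")
    return fee

# 150,000 satoshis in, 145,000 out -> 5,000 satoshis go to the block's miner
print(transaction_fee([100_000, 50_000], [120_000, 25_000]))  # prints 5000
```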
sr. member
Activity: 420
Merit: 262
This requires unprofitable mining

No incentives = system will not work.

I already justified the incentives upthread. I refuse to repeat myself again.
member
Activity: 81
Merit: 10
This requires unprofitable mining

No incentives = system will not work. The basic equation of any such system is that all participants/nodes maximise their expected income. So I'd posit the opposite: verification has to be profitable in the long term. There remains the problem of defining an appropriate time horizon and setting in a competitive situation.
Such equations could be modelled with game theory, but I'm not aware of anyone having done that yet. Rather than speculating about AI, it would be more insightful to formulate the utility maximization in a model. There is a long history in AI and economics of studying decisions under uncertainty and adversarial behaviour (Bernoulli, von Neumann, Nash, etc.).

Some links on these topics:
http://gambit.sourceforge.net/
https://en.wikipedia.org/wiki/Game_theory
https://en.wikipedia.org/wiki/Mechanism_design
https://en.wikipedia.org/wiki/Agent-based_model
https://en.wikipedia.org/wiki/Utility_maximization_problem
https://en.wikipedia.org/wiki/Solomonoff%27s_theory_of_inductive_inference
https://en.wikipedia.org/wiki/List_of_Nobel_Memorial_Prize_laureates_in_Economics , relevant Nobel prizes in 1994, 2005, 2007
https://en.wikipedia.org/wiki/Utility
https://en.wikipedia.org/wiki/Decision_theory
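The utility-maximization modelling suggested above can be sketched as a toy expected-value comparison between honest mining and a double-spend attempt. All parameters here (hash share, block reward, success probability, spend size) are entirely hypothetical assumptions for illustration, not a serious game-theoretic model:

```python
# Toy sketch: expected payoff of honest mining vs. a double-spend attempt.
# All parameters are hypothetical; a real model would derive success_prob
# from hash share and confirmation depth, and include hardware/energy costs.

def honest_payoff(hash_share, block_reward, blocks):
    # Expected coins earned by mining honestly over `blocks` blocks.
    return hash_share * block_reward * blocks

def double_spend_payoff(hash_share, block_reward, blocks, spend, success_prob):
    # Expected value of a double-spend of size `spend`: with probability
    # success_prob the attacker keeps the spend plus the rewards mined on
    # the attack fork; otherwise the orphaned fork rewards are lost.
    win = spend + hash_share * block_reward * blocks
    return success_prob * win

if __name__ == "__main__":
    share, reward, n = 0.3, 6.25, 100
    print(honest_payoff(share, reward, n))                       # 187.5
    print(double_spend_payoff(share, reward, n, 50.0, 0.2))      # 47.5
```

Under these (made-up) numbers, playing by the rules dominates, which is the intuition from the whitepaper's incentive section quoted earlier in the thread.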
legendary
Activity: 1008
Merit: 1007
I believe I have already explained the solution upthread, which is that verification is centralized, but the PoW voting control is decentralized. This requires unprofitable mining, and a very good memory-hard hash will also help with the asymmetry of costs between a mining farm and decentralized PoW computation.

I believe there is a way to do this in a completely decentralised way (i.e. signing your PoW). I'm writing a white paper at the moment, I hope to describe it in sufficient detail, once I've run through all the edge cases in my own mind.
sr. member
Activity: 420
Merit: 262
It could be pseudo-decentralized, like what we have in Bitcoin: it is decentralized, but since the majority of the mining is held by the Chinese, it seems they can control it at will.

I believe I have already explained the solution upthread, which is that verification is centralized, but the PoW voting control is decentralized. This requires unprofitable mining, and a very good memory-hard hash will also help with the asymmetry of costs between a mining farm and decentralized PoW computation.
hero member
Activity: 728
Merit: 500
It could be pseudo-decentralized, like what we have in Bitcoin: it is decentralized, but since the majority of the mining is held by the Chinese, it seems they can control it at will.
legendary
Activity: 996
Merit: 1013

That is not global consensus. It is a Partitioning, with exchange via free market pricing between Partitions, i.e. chaos. A principal property of currency is a common unit-of-exchange.

You're wrong and probably stupid Grin

I appreciate your brevity.

Man, you've gotten pretty chill. A few pages back that would have gone differently.

The caliber of the addressee probably matters.
hero member
Activity: 980
Merit: 1001

That is not global consensus. It is a Partitioning, with exchange via free market pricing between Partitions, i.e. chaos. A principal property of currency is a common unit-of-exchange.

You're wrong and probably stupid Grin

I appreciate your brevity.

Man, you've gotten pretty chill. A few pages back that would have gone differently.
sr. member
Activity: 420
Merit: 262

That is not global consensus. It is a Partitioning, with exchange via free market pricing between Partitions, i.e. chaos. A principal property of currency is a common unit-of-exchange.

You're wrong and probably stupid Grin

I appreciate your brevity.
sr. member
Activity: 687
Merit: 269

That is not global consensus. It is a Partitioning, with exchange via free market pricing between Partitions, i.e. chaos. A principal property of currency is a common unit-of-exchange.

You're wrong and probably stupid Grin
sr. member
Activity: 420
Merit: 262
Create more altcoins Grin

That is not global consensus. It is a Partitioning, with exchange via free market pricing between Partitions, i.e. chaos. A principal property of currency is a common unit-of-exchange.
sr. member
Activity: 687
Merit: 269
member
Activity: 109
Merit: 10
I am not sure if the ideal solution exists yet, but I have faith it will come. When it does, hopefully Bitcoin will be flexible enough to adopt it.
sr. member
Activity: 420
Merit: 262
If AI progresses to the point where it can model and predict human economics, it would rapidly replace humans as primary economic decision makers. However, rather than a single central AI, I suspect you would see multiple AIs managing the economics of corporations and even households. The artificial intelligences would presumably not be able to fully model the behavior of other AIs due to processing power limitations. Thus equilibrium would again be obtained by a multitude of actors (this time artificial) working towards their individual goals and walking towards equilibrium as if guided by an invisible hand.

The impossibility of a top-down omniscience was already proved:

This has already been refuted (by Leslie Lamport and other Byzantine fault tolerance researchers) because the speed-of-light is not infinite, thus no perspective can be a total ordering. Or stated another way, due to the delay of propagation of information, there will exist a plurality of arbitrary perspectives, none of which is a total ordering.

Sorry. It is impossible to argue with that truth.

But my pragmatism is, damn the torpedoes and cover thy eyes, ears, and logic. Buy the dips with your student lunch allowance!

It doesn't matter how observers alter their environment, because they can never alter ALL OF IT because the speed-of-light is not infinite.

thaaanos can chase his tail with unbounded failed attempts to obfuscate that inviolable fundamental fact.

Only the Invisible Hand of the trend of entropy to maximum (because time can't be reversed, thermodynamic processes are irreversible) is in control. The entropic force is fundamental. Even gravity has recently been argued to derive from it.
member
Activity: 81
Merit: 10
edit: anyway, I think that's enough. cheers
sr. member
Activity: 420
Merit: 262
enet, you have a reading comprehension handicap. The thread already addresses your reply. I refuse to repeat myself again.
member
Activity: 81
Merit: 10
My post referred to the quote from the first post in this thread

Quote
"The CAP theorem is fundamental. There will be no way to solve it. "

CAP is not fundamental at all; quite the opposite, it is misleading in this context. What are you trying to say, that Bitcoin does not work, because...? It worked quite well for a while. A DAG is, in my opinion, incoherent and a step backwards. It doesn't recognize Bitcoin's achievement of agreeing on a partial order.

If one codes up a P2P system, one has to reason from the perspective of a single node. Then one realises that local view != global view. But there is no good terminology for these things (yet). If there were, it should be easy to show where the DAG idea goes wrong.

Quote
Perhaps one day you will graduate to higher Computer Science concepts such as pure functional programming and asynchronous programming

All data in modern languages are treated as pointers to memory. Even if you know FP, it's the same thing: you name a pointer, and then the pointed-to value changes. That is called a variable. What I meant by "almost nobody understands this" is that 99.9% of all programming works this way. Variables are the only way to define facts, which is strange. It's not a good way to reason about time. For distributed systems there is:

https://web.archive.org/web/20160120095606/http://research.microsoft.com/en-us/um/people/lamport/pubs/time-clocks.pdf
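The mechanism in the linked time-clocks paper can be illustrated with a minimal Lamport logical clock. This is a sketch (the class shape is my own, not from the paper): each process increments a counter on local events, and on receiving a message takes the maximum of its own clock and the message timestamp, plus one. The result is a partial order of events, not a total one, which is the point being argued in this thread:

```python
# Minimal Lamport logical clock: counters that respect causal ("happened
# before") order across processes, yielding a partial order of events.

class LamportClock:
    def __init__(self):
        self.time = 0

    def local_event(self):
        self.time += 1
        return self.time

    def send(self):
        # Increment, then attach the timestamp to the outgoing message.
        self.time += 1
        return self.time

    def receive(self, msg_time):
        # Take the max of local and received time, then increment.
        self.time = max(self.time, msg_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
t = a.send()         # a's clock: 1
b.local_event()      # b's clock: 1 (concurrent with a's send)
print(b.receive(t))  # max(1, 1) + 1 = 2: the receive happens after the send
```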

sr. member
Activity: 420
Merit: 262
Bitcoin continues because it doesn't allow multiple Partitions, and because in the case where chaos would result from a fork, the centralization of mining is able to coordinate to set policy. But we also see the negative side of centralization: recently the Chinese miners who control roughly 67% of Bitcoin's network hashrate were able to veto any block size increase, and lie about their justification, since an analysis by smooth and me (mostly me) concluded that their excuse about slow connections across China's firewall is not a valid technical justification. Large blocks can be handled with their slow connection by putting a pool outside of China to communicate block hashes to the mining hardware in China.

Hmm, you have a deep background in relativity, but somehow things go wrong somewhere. Bitcoin partitions all the time; that's the default for everything. Nodes only synchronize ex post, hence the block cycle.

Dude, I haven't written anything in this entire thread (unless you crop out the context, as you did!) to disagree with the fact that Bitcoin's global consensus is probabilistic. My point is that the system converges under the Longest Chain Rule (LCR) and doesn't diverge. Duh! The distinction between convergence and divergence has been my entire point when comparing Satoshi's LCR PoW to other divergent Partition-tolerant designs such as a DAG.
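The convergence claim can be made concrete with a sketch: a node following the Longest Chain Rule always adopts the heaviest known chain, so any two nodes that have seen the same chains agree. The chain representation below is a deliberate simplification (a list of (block_id, work) pairs rather than real block headers):

```python
# Sketch of Longest Chain Rule convergence: nodes adopting the chain with
# the most cumulative proof-of-work converge on a single history. Note that
# in practice "longest" means heaviest (most cumulative work), not most blocks.

def chain_work(chain):
    # Total proof-of-work accumulated along the chain.
    return sum(work for _, work in chain)

def best_chain(chains):
    return max(chains, key=chain_work)

chain_a = [("genesis", 1), ("a1", 1), ("a2", 1)]           # 3 blocks, work 3
chain_b = [("genesis", 1), ("b1", 2), ("b2", 2)]           # 3 blocks, work 5
print(best_chain([chain_a, chain_b])[-1][0])               # prints b2
```

Any node applying this rule to the same set of candidate chains picks the same tip, which is the sense in which the system converges rather than diverges.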

I request you quote me properly next time, so I don't have to go hunting for the post you quoted from. I fixed it for you above by inserting the proper quote and underlining the part you had written without attribution and link. I presume you know how to press the "Quote" button on each post. Please use it. Respect my time. Don't create extra work for me.

I'd humbly suggest starting with some thorough research of the basics:

Passive aggressively implying that I haven't studied fundamentals is not humble.

* computers are electronic devices with billions of components. How does such a machine achieve consistent state? See: Shannon and von Neumann and the early days of computing (maybe even Zuse)

* partitions, blocks, DAGs, ... all this stuff generally confuses the most fundamental notions. After investigating this matter for a very long time, I can assure you that almost nobody understands this.

Humble  Huh

Blah blah blah. Make your point.

I can assure you I've understood the point deeply about the impossibility of absolute global consistency in open systems (and a perfectly closed system is useless, i.e. static). Go read my debates with dmbarbour (David Barbour) from circa 2010.

I'll give an example: in any computer language and modern OS, you have the following piece of code:

Code:
declare variable X
set X to 10
if (X equals 10) do Z

Will the code do Z? Unfortunately the answer in general is no, and it's very hard to know why. The answer: concurrency. A different thread might have changed X, and one needs to lock X safely.
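A runnable version of that pseudocode (Python assumed for this sketch): the lock makes the check-then-act on X atomic with respect to a concurrent writer, which is exactly the hazard described above:

```python
import threading

X = 10
x_lock = threading.Lock()

def check_and_do_z():
    # Holding the lock guarantees no other thread mutates X between
    # the test (X == 10) and the action (do Z).
    with x_lock:
        return "Z" if X == 10 else None

def concurrent_writer():
    global X
    with x_lock:
        X = 11  # without the lock, this write could interleave with the check

print(check_and_do_z())   # prints Z -- no writer has run yet
t = threading.Thread(target=concurrent_writer)
t.start()
t.join()
print(check_and_do_z())   # prints None -- the other thread changed X
```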

Perhaps one day you will graduate to higher Computer Science concepts such as pure functional programming and asynchronous programming (e.g. Node.js or Scala Akka), which simulate multithreading safely with one thread using promises and continuations. But you are correct to imply there is never a perfect global model of consistency. This is fundamentally due to the unbounded entropy of the universe, which is also reflected in the unbounded recursion of Turing completeness, which in turn yields the proof that the Halting problem is undecidable.

If you are still using non-reentrant, impure programming with threads (and mutexes), or otherwise using threads in anything other than worker-thread mode, you are probably doing it wrong (in most every case).
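For contrast with the threaded pseudocode earlier in this exchange, here is a minimal sketch of the single-threaded asynchronous style being described, using Python's asyncio as a stand-in for Node.js or Akka (my choice, not the poster's): only one coroutine runs at a time, so the check-then-act needs no lock:

```python
import asyncio

async def fetch_x():
    await asyncio.sleep(0)  # yield to the event loop (stand-in for real I/O)
    return 10

async def main():
    x = await fetch_x()
    # No other coroutine can run between this check and the return:
    # with cooperative scheduling, interleaving happens only at `await`.
    return "Z" if x == 10 else None

print(asyncio.run(main()))  # prints Z
```

The design point is that concurrency hazards are confined to explicit `await` suspension points, instead of being possible between any two machine instructions as with preemptive threads.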

In other words, data or state in modern computing is based on memory locations. Programs always assume that everything is completely static, when in reality it is dynamic (OS and CPU caches on many levels). These are all human abstractions. The actual physical process of a computing machine is not uniform. In fact, it is amazing that such things can exist at all, since Heisenberg discovered it's impossible to determine even the most elementary properties of a particle with certainty. Shannon found that one can still build reliable systems from many unreliable parts (the magic of computing).

Higher level abstractions and quantum decoherence. I am not operating at the quantum scale as a classical human being.

With regard to your basic thesis, you're right and wrong at the same time. Total coordination is impossible even on the microscopic level. Bitcoin implements a new coordination mechanism, based on the Internet, previously unknown to mankind. It's certainly not perfect, but that notion leads nowhere anyway. The foundation of computing is how one treats unreliable physical parts to create reliable systems (things that are imperfect add up to something which is as reliable as necessary).

Whoever said "perfect"? I said probabilistic. The key distinction was convergence versus divergence, but that seems to have escaped you along the way here.