
Topic: Getting rid of pools: Proof of Collaborative Work - page 4. (Read 1861 times)

legendary
Activity: 3430
Merit: 3074
Proof of everything other than Work

Annoymint doesn't like the implications of proof of work; he's been claiming for 5-6 years that he's working on a "blockchain breakthrough", but never proves he's working on anything Smiley


@Annoymint, you need to start a new Bitcointalk user called "Proof of everything other than work"
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
    There is always a way around, an escape, and that has been driving new Physics and technological innovations.

    Sorry no. You are handwaving.

    I do not buy into false hopes. There are invariants here which cannot be overcome.

    Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

    ...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


    Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?
    I'm sorry to say this, but I think we have been trolled by @anunymint.  Sad

    PoW is one of the most important innovations in modern history (kudos, Satoshi  Smiley). It is a very irresponsible decision to abandon it because of some flaws and limitations, claiming every flaw to be an inherent, essential one, and to jump back to a pre-crypto, failed, subjective alternative (like reputation-based systems), often rebranded by using the same terminology as Satoshi Nakamoto and bitcoin!

    I'm not against change; on the contrary, I strongly support any new idea, whenever and by whoever. But I personally feel good about a change when it is suggested (at least mainly) to help people do something better, not as an instrument in the hands of an opportunist who has found or been informed about a weakness in a newly born technology and, instead of trying or helping to fix it, initiates a hypocritical campaign just to sell us his name or to convince his dad that he is a genius, ... whatever.

    I'm not that kind of person. It's so tempting to take advantage of the weaknesses and flaws of a system, but I don't like such a miserable life. This proposal is a fix, not a hypocritical alternative to PoW.

    It is a fix for a series of important challenges of bitcoin and PoW networks, and it deserves decent reasoning and discussion instead of trolling and vandalism.

    To understand how unacceptable that kind of behavior is, it is better to understand the importance and beauty of the subject, imo. Let's take a look:

    1- It fixes pooling pressure, the biggest centralization threat to bitcoin, by:
    • eliminating the (solo) mining variance flaw by dividing mining into 3 phases. In the most important one, the Contribution phase (the second one), where 98% of the block reward is distributed, miners can partially contribute to the PoW process directly, tens of thousands of times more easily.
    • eliminating the proximity premium flaw by uniquely distributing the 'new block found' information across tens of thousands of points in the network and incentivizing simultaneous announcement of this information.

    2- Although this proposal is ready for an alpha implementation and the consequent deployment phases, it is too young to be thoroughly understood in terms of its other impacts and applications, the ones it was not primarily designed for. As some premature intuitions, I can list:
    • It seems to be a great infrastructure for sharding, the most important on-chain scalability solution.
      The current situation with pools makes sharding almost impossible. When 50+% of the mining power is centralized in the palms of a few pools (5 for bitcoin and 3 for Ethereum), the problem is not just security and vulnerability to cartel attacks; unlike what is usually assumed, it is more importantly a prohibiting factor for implementing sharding (and many other crucial and urgent improvements).
      If my intuition proves correct, it would have a disruptive impact on the current trend that prioritizes off-chain over on-chain scalability solutions.
    • This protocol can probably offer a better chance for signaling and autonomous governance solutions
    • {TODO: suggest more}

    A thorough analysis of the details suggested in the design would convince a non-biased reader that this proposal is well thought out and not so immature as to encourage anybody to attempt a slam dunk and reject it trivially. On the contrary, considering the above features and promises, and the importance of pooling pressure as one of the critical flaws of bitcoin, it deserves a fair, extensive discussion.

    Now, when someone comes and ruins such a decent topic, like what @anunimint did here, by repeating nonsense objections and never being convinced no matter what, it could be due to his naivety, or a result of him being obsessively biased by his own history in the public sphere, full of Proof of everything other than Work obsessions and vague claims about PoW being a boring, old-fashioned, weak system, doomed to be centralized, vulnerable to every possible attack vector, blah, blah, blah ... that he has trapped himself in, or both.

    I vote for the second option about this guy, but if he is really smart, he should put the load (of his own history) off his shoulders and be ready to revise and improve.
    member
    Activity: 518
    Merit: 21
    Well, it sounds good to get rid of pools for mining crypto in order to promote individual or small-scale mining, where mining earnings would reach an optimal profit. Shared mining profit favors the mining pool operators, while the actual miners only get a percentage of it. We should promote and create a mining opportunity that gives miners a good profit for doing it.
    jr. member
    Activity: 56
    Merit: 3
    ONNI COIN! The New Paradigm!
    There is always a way around, an escape, and that has been driving new Physics and technological innovations.

    Sorry no. You are handwaving.

    I do not buy into false hopes. There are invariants here which cannot be overcome.

    Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end?  

    ...Much better for me if the competition wastes time on an insoluble direction, while I am working on a promising one...


    Not really aware of a competition. Is the promising one you are working on a solution to the issues of Bitcoin with an entirely new algorithm?
    jr. member
    Activity: 56
    Merit: 3
    ONNI COIN! The New Paradigm!
    There is always a way around, an escape, and that has been driving new Physics and technological innovations.

    Sorry no. You are handwaving.

    I do not buy into false hopes. There are invariants here which cannot be overcome.

    Is it possible for you to supply your mathematical details of these insurmountable invariants so we can look into it from our end? 
    jr. member
    Activity: 56
    Merit: 3
    ONNI COIN! The New Paradigm!
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    The Shared Coinbase Transaction is not a part of the header; its hash (id) is,

    All the small proof-of-work solutions have to be communicated and calculated before the winning block can be communicated. So that is up to 10,000 (if the difficulty target is 0.0001) multiplied by the 64 B size of a SHA256 hash, which is 640 KB of data that must be communicated across the network. That’s not factoring in the case where the network is subdivided and miners are mining on two or more leader Prepared blocks, in which case the network load can be double or more of that.
    You are mixing up heterogeneous things, imo:
    As I have said before, the Shared Coinbase Transaction is just a transaction, with a size as small as 60 bytes (likely; implementation dependent) up to a maximum of 60,000 bytes, with a normal distribution of probabilities and an average of 30,000 bytes. That is it. There is just one (double-)SHA256 hash that is committed to the block header.
    This special transaction is verified by checking the asserted score and reward of each row (from 1 to 10,000 rows) by computing the hash of the row appended to the previous block hash. There is no need to attach this hash to each row, neither in storage nor in communication.
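    The row check just described can be sketched as follows. This is a minimal sketch under stated assumptions: the row byte layout, the function names, and the 0.0001 minimum score are illustrative, not taken from the proposal's spec:

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    # Double SHA-256, the hash used throughout Bitcoin.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_row(prev_block_hash: bytes, row_bytes: bytes,
               claimed_score: float, target: int,
               min_score: float = 0.0001) -> bool:
    # Recompute the row's hash on the fly: the hash itself is never stored
    # or transmitted; only the row fields are. `target` is the network
    # target expressed in hash space (bigger target = easier).
    h = int.from_bytes(sha256d(prev_block_hash + row_bytes), "big")
    score = min(target / max(h, 1), 1.0)  # share difficulty relative to target
    return score >= min_score and score >= claimed_score
```

    A row whose hash lands exactly on the target scores 1.0; a row that only meets half the target scores about 0.5 and fails a claimed score of 1.0.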

    As for the need for peers to fetch this special transaction in order to verify the finalized block, it is very common.
    Since BIP 152, peers check whether each transaction committed to the Merkle root of the block under validation is present in their version of the mempool. If not, they fetch the transaction from the peer and validate it.

    For ordinary transactions, as I have stated before, the validation process is by no means trivial: it involves ECDSA signature verification and a UTXO consistency check for each input of each transaction, both of which are harder by orders of magnitude than what has to be done for the (output) rows of the special transaction under consideration, the Shared Coinbase Transaction.

    For each row of this transaction, only a few processor cycles are needed to compute the hash, and not even for all of the rows, just for the rows missing from the node's memory.

    Conclusion: I maintain my previous assertion of zero computation overhead and an average 32 KB block size increase.
    Quote
    Now I do understand that these proof-of-work share solutions are communicated continuously and not all at once at the Finalized block, but you’ve got at least three possible issues:

    1. As I told you from the beginning of this time-wasting discussion, the small miners have to verify all the small proof-of-work solutions, otherwise they’re trusting the security to the large miner which prepares the Finalized block. If they trust, then you do have a problem with non-uniform hashrate which changes the security model of Bitcoin. And if they trust, you also have a change to the security model of Bitcoin.

    Easy, dude, it is not time-wasting, and if it is, why in the hell should we keep doing this? Nobody reads our posts, people are busy with more important issues, and nobody is going to be the president of bitcoin or anything.

    I'm somewhat shocked reading this post, tho.
    We have discussed it exhaustively before. It is crystal clear, imo.

    First of all (I have to repeat): mining has nothing to do with verifying shares, blocks, whatever ... Miners just perform zillions of nonce incrementations and hash computations to find a good hash. It is a full node's job to verify whatever should be verified. Agreed?

    Now, full nodes, busy with I/O operations and stuff that needs extensive networking and disk access, have a lot of free CPU power, and a modern OS can utilize it to perform hundreds of thousands of SHA256 hashes without hesitation or any bad performance consequence, as if nothing ever happened.

    Is it that hard to keep this in mind and forget about what has been said in another context (the infamous block size debate)? Please concentrate.

    In that debate, the core team was against the block size increase because they were worried about transaction verification being an I/O-bound task. With your share verification nightmare, we are dealing with a CPU-bound task; it is not the same issue, don't worry about it.
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    The Shared Coinbase Transaction is typically 32 KB of data (an average of 4,500 items) and doesn't need any further verification, like checking UTXO, mempool, whatever.

    Since PoW should be considered an essential part of the header, what you are proposing is to increase the header size from 80 bytes up to 72 KB (worst case 10,000 items), a nearly 1000-fold increase...

    This is more significant when considered in conjunction with the 0.02 * threshold on finishing a block. That threshold means it’s more likely that two blocks will be finished closer together than for 10 minute block periods and thus the increased propagation and verification (for the up to 10,000 block solutions) can be significant relative to the spacing between duplicate finished blocks. As I wrote in my prior post, all of this contributes to amplifying the selfish mining attack.
    Well, @tromp, you are not on point, and neither are you, @anunymint:

    The Shared Coinbase Transaction is not a part of the header; its hash (id) is.
    The transaction itself is part of the block, like the conventional coinbase transaction and the other transactions. The block size remains what the protocol dictates, plus the size of this transaction, which implies an almost 5% increase (worst case); not a big deal.
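    That "almost 5%" figure is easy to sanity-check from the sizes quoted earlier in the thread (the 6 bytes per row is inferred from the 60,000-byte worst case over 10,000 rows; the 1 MB block size is just for scale, so all numbers are illustrative):

```python
MAX_ROWS   = 10_000          # worst case at the 0.0001 share difficulty threshold
ROW_BYTES  = 6               # inferred: 60,000 B worst-case tx / 10,000 rows
BLOCK_SIZE = 1_000_000       # 1 MB legacy block size, for scale

worst_case = MAX_ROWS * ROW_BYTES   # 60,000 B
average    = worst_case // 2        # 30,000 B, per the stated normal distribution

print(f"worst-case overhead: {100 * worst_case / BLOCK_SIZE:.0f}%")  # roughly 5-6%
print(f"average overhead:    {100 * average / BLOCK_SIZE:.0f}%")     # roughly 3%
```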
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    @anunymint

    As for the classical selfish mining attack itself, I personally disagree with calling it an attack at all. I rather see it as a fallacy, a straw man fallacy.
    My reasoning:
    PoW has nothing to do with announcement. Once a miner prefers to keep his block secret, it is his choice and his right as well; he is risking his block becoming an orphan in exchange for a possible advantage over the rest of the network in mining the next block.

    Although, like PoW, this proposal is not about prohibiting people from selfish mining, there is a point in rephrasing the above reasoning somewhat differently: this proposal is about reducing the pooling pressure and helping the network become more decentralized by increasing the number of miners. How? By reducing the variance of mining rewards, which is one of the 2 important factors behind this pressure (I will come back to the second factor soon).

    So, it might be a reasonable expectation for PoCW to have something to do with selfish mining.

    It has, but first of all it is worth mentioning that, according to the protocol, miners are free to choose not to collaborate and to go solo if they wish, although by keeping the costs of participation very low and the benefits high enough, this practice is discouraged.

    PoCW improves this situation by reducing the likelihood of pools forming, eliminating one of the most important factors that makes their existence possible at all.

    Your second objection happens to be about the second important factor behind pooling pressure: proximity.

    It is about having access to information (a freshly mined block, for instance) and taking advantage of it, or not having access to such information and wasting resources (mining stale blocks) because of it. Even with completely loyal nodes, in bitcoin and other PoW-based networks there is always a proximity premium for the nodes nearer to the source (the lucky finder of the fresh block) compared to other nodes.

    I have to accept that, by pushing for more information circulating around, PoCW, this proposal, is suspected of reinforcing this second pressure toward pooling.

    I have been investigating it for a while, and my analysis suggests otherwise. It is a bit complicated and deserves to be considered more cautiously. I need to remind you that the proximity premium is a known flaw in PoW's decentralization agenda.

    For a traditional winner-takes-all PoW network like bitcoin, there is just one piece of information (the fresh block) that causes the problem, true, but the weight of this information and the resulting premium is very high, and it is focused in one spot: the lucky miner at the focal point and its neighbors in the hot zone.

    For this proposal, this premium is distributed far more evenly, across tens of thousands of points.

    Oops! There is almost no proximity premium flaw in Proof of Collaborative Work!

    Without a proximity premium and a mining variance flaw, there will be no pooling pressure and no threat of centralization. This is how selfish mining concerns (again, not a flaw) are addressed too: it turns into simple, innocent solo mining.

    As for @tromp's and your concerns about share validation overhead, I have already addressed them: there is no resource other than a few CPU cycles to be consumed for it, not a big deal according to my analysis, and by distributing the proximity premium almost evenly, the proposal does more than enough to compensate  Wink

    legendary
    Activity: 988
    Merit: 1108
    Shared Coinbase transaction typically is 32 kB data (an average of 4500 items)  and doesn't need any further verification, like checking UTXO, mempool, whatever.

    Since PoW should be considered an essential part of the header, what you are proposing then is to increase header size from 80 bytes upto 72 KB (worst case 10000 items), a nearly 1000 fold increase...
    newbie
    Activity: 1
    Merit: 0
    i like it
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    Additionally, I think I found another game-theory flaw in his design.

    The design presumes that the leadership (for finding the 0.05 * Prepared blocks) can’t be attacked to subdivide the rest of the hashrate, because you assume they would need 50+% to get a lead, but AFAICT that is not true because of selfish mining.

    The 33% attacker can mine on his hidden Prepared block and then release it right before the rest of the network catches up.

    Thanks for the comment. I have to analyse it more thoroughly; I am very glad to see you guys engaging this well. I will be back in like half an hour with the analysis and possible mitigations.
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    • Verification process involves:
      • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively
    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    The Shared Coinbase Transaction is typically 32 KB of data (an average of 4,500 items) and doesn't need any further verification, like checking UTXO, mempool, whatever.
    Although the shares have to be verified to have the required difficulty (hashed and examined), it is a CPU-bound task and far faster than verifying the block itself.

    Note: verifying a block takes a lot of communication: accessing the mempool on disk, querying/fetching the missing transactions from peers, verifying transaction signatures (which is a hell of a lot of processing, although not I/O-bound), accessing the disk to check each transaction against the UTXO set, ...

    According to my assessments, this verification will be done with zero or very small added latency, because the verifier is multitasking and the job will be done in CPU idle time.
    legendary
    Activity: 988
    Merit: 1108
    Verification process involves:
    • Checking both the hash of the finalized block and all of its Shared Coinbase Transaction items to satisfy network difficulty target cumulatively

    This is a serious problem with your proposal. The proof of work is not self-contained within the header.
    It requires the verifier to obtain up to 10000 additional pieces of data that must all be verified, which is too much overhead in latency, bandwidth, and verification time.
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    miner charges all transaction fees to his account <--- why is a miner paying transaction fees?
    First of all, glad to see you back and showing your commitment; I appreciate it:  Smiley
    The miner is not paying, he is charging; he is not being charged. I mean he rewards his wallet with the transaction fees (only the transaction fees, not the block reward).
    Quote
    Quote
    calculated difficulty using previous block hash padded with all previous fields <--- padded? how does a hash provide a difficulty adjustment?
    Who said anything here about difficulty adjustment? It is about calculating the difficulty of the share by:
    1- appending some fields to each other: the previous block hash + the other fields of the structure (Net Merkle root + the miner's wallet address + nonce)
    2- performing a SHA2 hash
    3- evaluating the difficulty of the hash
    Quote
    Quote

    A computed difficulty score using the hash of ...
    A calculated difficulty score is the ratio of the difficulty of the share to the target difficulty. It is typically less than 1, and greater scores (if any) are capped at 1.
    Quote
    Quote

    For each share difficulty score is at least as good as 0.0001 <--- why is a difficulty good or bad? criteria?
    Being good means being close to the target difficulty.
    Quote
    Quote

    Sum of reward amount fields is equal to block reward and for each share is calculated proportional to its difficulty score <--- Do you mean weighted sum? Huh? Needs better explanation.
    Yes, it deserves more explanation. It is about the structure of the Shared Coinbase Transaction. It is a magical structure that we use both for proving the work of the contributors (the sum of the scores/difficulties of all the items MUST satisfy the required difficulty target) and for distributing the reward (each share gets a fraction proportional to its score/difficulty).
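    A minimal sketch of this dual role, assuming scores are normalized so that the cumulative target is 1.0 and ignoring the preparation/finalization cuts; the function name and the flat row structure are illustrative, not from the proposal:

```python
def settle_shared_coinbase(rows, block_reward):
    # rows: list of (wallet_address, score), each score in [0.0001, 1].
    # One structure, two jobs: the score sum proves the cumulative work,
    # and the same scores fix each contributor's slice of the reward.
    total = sum(score for _, score in rows)
    if total < 1.0:
        raise ValueError("cumulative shares do not meet the difficulty target")
    return {addr: block_reward * score / total for addr, score in rows}

payouts = settle_shared_coinbase(
    [("alice", 0.6), ("bob", 0.3), ("carol", 0.2)], block_reward=12.5)
```

    The split conserves the block reward exactly, and a transaction whose scores fall short of the target is rejected outright, which is the "proof" half of the structure.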
    Quote
    Quote

    It is fixed to yield a hash that is as difficult as target difficulty * 0.05  <--- how so? Where? What algorithm?
    It is about the Prepared Block difficulty target, which should be set to 0.05 of the calculated network difficulty. Nothing new in terms of algorithm, just a matter of protocol, exactly like how traditional PoW enforces the difficulty for blocks.
    Quote
    Quote

    It is fixed to yield a hash that is as difficult as target difficulty * 0.02

    Mining process goes through 3 phases for each block: <--- these sections are not a sufficient explanation of the algorithm. You expect the reader to read your mind. Help us out here and explain how this thing you invented works

    Ok I'll do my best:

    Unlike the situation with traditional PoW, in PoCW miners should go through three phases (they had better do so, unless they want to solo mine, which is not in their interest, or to commit an attack against the network, which is not feasible as long as they do not have the majority):

    Phase 1: Miners SHOULD try to find a block whose hash is at least 5% as difficult as the network target, while rewarding the transaction fees to their wallets through a coinbase transaction (free of the block reward, just transaction fees) committed to the Merkle tree whose root is committed to the block header. This is called the Preparation phase.

    Phase 2: After the network reaches a state in which one, two, or three competing instances of such a block have been mined and propagated, miners MAY eventually realise that the window for mining such a block is closing, because of the risk of not getting to the final stage due to the competition.
    Instead, they accept the fact that they won't be rewarded the transaction fees and choose to produce/mine Contribution shares for one of the above-mined blocks, i.e. putting its Merkle root in the data structure named Contribution Share, which can (later) trivially be translated into a Shared Coinbase row and used for difficulty evaluation and reward distribution at the same time (if the miner happened to choose the most popular Prepared Block).
    I have extensively discussed this phase with @ir.hn and have shown that it is an exponentially convergent process; in the midst of the process we will be witnessing the whole network busy producing shares for the same Net Merkle Tree root.
    This is called the Contribution phase. Note: as you might have already realized, this is not mandatory. Also note that in this phase miners don't generate blocks; these are just shares, Contribution Shares, that have to wait for the next phase, in which a miner (just one miner) may include them in a block, using their scores both to prove the work and to share the reward.

    Phase 3: After enough shares have been accumulated for a Merkle root, miners SHOULD start to search for one final block (with a difficulty fixed at 0.02 of the calculated network difficulty) encompassing:
    1- The Merkle root (remember, it contains one coinbase transaction, rewarding only the original miner of the first phase) of one of the blocks mined in the first phase.
    2- A new coinbase transaction, the Shared Coinbase Transaction, containing the required shares to prove the work and the weighted distribution of the block reward as an integrated whole.
    3- The other usual fields.

    This is the Finalization phase.
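    The three phases above can be sketched end to end with a toy hash-grinding loop. Everything here is illustrative: the targets are made deliberately easy so the demo finishes instantly, and only the 0.05 / 0.0001 / 0.02 ratios mirror the proposal:

```python
import hashlib

MAX = 2 ** 256
NET_TARGET = MAX // 50   # toy network target; very easy so the demo runs instantly

def toy_hash(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def mine(prefix: bytes, target: int, nonce: int = 0):
    # Grind nonces until the hash meets the target; return (nonce, hash).
    while True:
        h = toy_hash(prefix + nonce.to_bytes(8, "big"))
        if h < target:
            return nonce, h
        nonce += 1

prev = b"\x00" * 32

# Phase 1 (Preparation): a Prepared block at 5% of the network difficulty,
# i.e. a 20x easier target; its coinbase pays only the transaction fees.
_, prep_hash = mine(prev + b"prepared-block", NET_TARGET * 20)

# Phase 2 (Contribution): miners emit shares of difficulty >= 0.0001 of the
# network's; each share scores its difficulty relative to the network target,
# capped at 1, and scores accumulate until they cover a full block's work.
scores, nonce = [], 0
while sum(scores) < 1.0:
    nonce, h = mine(prev + b"contribution-share", min(NET_TARGET * 10_000, MAX), nonce)
    scores.append(min(NET_TARGET / max(h, 1), 1.0))
    nonce += 1

# Phase 3 (Finalization): one miner wraps the accumulated shares (the Shared
# Coinbase Transaction) into a final block at 2% of the network difficulty.
_, final_hash = mine(prev + b"finalized-block", NET_TARGET * 50)

assert sum(scores) >= 1.0   # cumulative share scores prove a full block's work
```

    The point of the sketch is the accounting, not the networking: the work of one block ends up spread over many small shares, each independently checkable against the previous block hash.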
    Quote
    Quote

    These phrases are devoid of meaning for me. Any key words that really confound me as making no sense are highlighted in bold.

    Without being able to understand these, I can’t digest your specification unless I holistically reverse engineer your intended meaning. And I am unwilling to expend that effort.

    Please translate.

    I could figure it out if I really wanted to. But as I said, I have a lot of things to do and enough puzzles on my TODO list to solve already.
    Did my best. Thanks for the patience/commitment  Smiley
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    @anunymint

    I understand; you are a good writer and a respected crypto advocate, and I have shown my respect for you more than once. But it just happens that the level of noise and weird claims, and the number of white papers and proposals about Proof of Something other than Work, is annoyingly high, and it was my fault in the first place to start a thread and try to convince people, guys like you especially, that this one is not alike.

    I have to apologize for my too-high expectations and for getting too personal and hurting you. I didn't mean it.

    As I've just mentioned, it is too much to expect from advocates (who are already vaccinated against the said noise and hype) to take this proposal seriously and try to digest it thoroughly (why should they?).

    You might be right; I'm not potent enough to present a proposal with such an ambitious agenda, shifting Nakamoto's winner-takes-all tradition to a collaborative proof of work alternative, as a serious paradigm shift, and to encourage people to spend some time digesting it.

    But it is what I've got, and it makes me angry sometimes, with myself primarily and with the whole situation secondly, not with you. You are just one more advocate; you are not alone. People are busy investigating PoS or pumping bitcoin; nobody cares. I'm sick of it.

    And when you came on board and I started getting more optimistic, my expectations got too high and I went off the rails. Sorry.

    Imo, despite the bitterness, we have made some progress, and I sincerely ask you to schedule some time and take a closer look at the proposal. I assure you, every single objection you have made here is already addressed in the starting post or in the replies I have made. Thank you for participating, and sorry for the inconvenience.  Smiley
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    Are you kidding? Running a SHA256 hash takes a few microseconds even on an average CPU!

    An ASIC miner does it in a few nanoseconds!

    Am I missing something, or are you just somewhat confused?
    ...
    Also please remember my objection was in terms of unbounded validators for OmniLedger shards. I was never coming over to refute your proposal for uses outside OmniLedger.
    Oh, you did; we are not discussing OmniLedger here. But thank you, you are taking back your objection, well, somehow; it is progress.
    So you are serious!
    Really? One nanosecond at 100X is just 0.1 microseconds, and 1 microsecond at 100X is 0.1 milliseconds.

    The absolute time is irrelevant. It is the relativity that matters. Please take some time to understand what that means. I shouldn’t have to explain it.
    I have been studying theoretical physics for a while, and I'm somewhat of an expert in relativity theory, yet I can't find any relativity-related issue here.
    Essentially all you’re doing is lowering the block period, which is known to be insecure as it approaches a smaller multiple of the propagation delay in the network. So I am also thinking your proposal is flawed for that reason. I think that was the reason it was shot down in the past. As I wrote in one of my edits, Vitalik had explained some of these flaws in GHOST.
    Reducing the block time to 1 minute is not a part of this proposal from the algorithmic point of view, but I vote in favor of it and can rebut any argument against it. Ethereum uses a 15-second block time with an uncle rate lower than 10%; I believe even a 30-second block time is feasible.
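    A common back-of-the-envelope model for stale/uncle rates (assuming Poisson block finds and a fixed effective propagation delay d) puts the rate near 1 - exp(-d/T). With an assumed 1-second delay, the numbers below are roughly consistent with Ethereum's sub-10% uncle rate and hint at what 30-second or 1-minute blocks would imply; d is an assumption, not a measured figure:

```python
import math

def stale_rate(delay_s: float, block_time_s: float) -> float:
    # Poisson arrivals: probability a competing block is found within
    # the propagation delay, i.e. 1 - exp(-d/T).
    return 1.0 - math.exp(-delay_s / block_time_s)

d = 1.0  # assumed effective propagation delay, in seconds
print(f"15 s blocks: {stale_rate(d, 15):.1%}")   # about 6.4%
print(f"30 s blocks: {stale_rate(d, 30):.1%}")   # about 3.3%
print(f"60 s blocks: {stale_rate(d, 60):.1%}")   # about 1.7%
```

    The model says the rate scales roughly with d/T, so halving the block time roughly doubles the stale rate; whether that is acceptable is the real argument, not the absolute hash timing.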
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    Actually, I guess handling more than 180,000 shares per minute (3,000 shares per second) by a full node on a commodity PC is totally feasible.
    With the parameters I have proposed in this version, however, there would be fewer than 20,000 shares per minute in the worst scenario.
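    As a rough, single-threaded sanity check of that claim (pure Python is about the slowest realistic setting; the 100-byte payload stands in for an assumed share layout of previous block hash + Net Merkle root + wallet address + nonce):

```python
import hashlib
import time

SHARES  = 20_000        # the stated worst case per minute
payload = bytes(100)    # placeholder share bytes (assumed layout)

start = time.perf_counter()
for i in range(SHARES):
    # One double-SHA256 per share, as in the verification described above.
    hashlib.sha256(hashlib.sha256(payload + i.to_bytes(4, "big")).digest()).digest()
elapsed = time.perf_counter() - start

print(f"{SHARES} double-SHA256 checks took {elapsed:.3f} s")
```

    On commodity hardware this finishes in a small fraction of the one-minute budget, which is consistent with the claim that share verification is CPU-cheap; a C or ASIC implementation would be orders of magnitude faster still.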
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    Are you kidding? Running a SHA256 hash takes a few microseconds even on an average CPU!

    An ASIC miner does it in a few nanoseconds!

    Am I missing something, or are you just somewhat confused?

    No you’re not thinking. Think about what you just wrote. Everything is relative. And go back to my original objection and the point about “100X”.

    So you are serious!
    Really? One nanosecond at 100X is just 0.1 microseconds, and 1 microsecond at 100X is 0.1 milliseconds.

    Come on, you have to take it back, your objection about a validation crisis. There is no crisis; just take it back.
    legendary
    Activity: 1456
    Merit: 1174
    Always remember the cause!
    @anunimint
    Please edit your latest reply; some quote tags are missing there. I won't quote; I will simply reply to your most important point in that post:

    You say that your objection is not about the signatures, UTXO, etc. of the Merkle path and the transactions included in the block, but about its hash being qualified enough!

    Are you kidding? Running a SHA256 hash takes a few microseconds even on an average CPU!

    An ASIC miner does it in a few nanoseconds!

    Am I missing something, or are you just somewhat confused?