
Topic: Proof of Work: Limit node hashrate to improve decentralisation? - page 2. (Read 442 times)

newbie
Activity: 2
Merit: 0
ASIC-resistant algorithms are not doomed to failure.  The problem is that they are implemented so poorly, because they have to cater to GPUs and therefore make the memory requirement too small.  Even Monero made it small enough to fit in the processor cache; it is just not big enough.  You may say the ASICs will just add more memory, but the problem is that random memory accesses cost a lot of processor speed, which GPUs and ASICs need to minimize to be competitive.  Scrypt, the classic alt algorithm, only picked 1024 for its memory size (N value), whereas for real ASIC resistance you need 20,000, and even 50,000-100,000 is no problem for CPUs.  You can't have your cake and eat it too: if you want GPUs to be fast at mining, you will have an ASIC problem; if you design it so GPUs will struggle and GPU miners won't like mining your coin, then you are safe.  We need to mature as a community, I feel.  It is time to drop our GPU love affair.  The ideal algorithm will require a CPU and GPU in tandem, and this algorithm is "factorization of large numbers", for which an ASIC has never been created even though an incentive has existed for decades.  A miner for this algorithm requires a CPU and GPU.

Right, there are actually quite a number of ASIC-resistant one-way functions out there that really can help disincentivize Sybil attacks. The easiest approach is to look into anti-denial-of-service mechanisms, which I believe is what led to Bitcoin's choice of Hashcash, originally proposed for fighting e-mail spam.

Apart from scrypt, there is the memory-hard function Argon2, and a whole paper titled "Symmetrically and Asymmetrically Hard Cryptography" from ASIACRYPT 2017 that describes how to plug resource-hardness into a wide variety of cryptographic hash functions.

The key to creating anti-DDoS mechanisms (such as those used for node identity derivation in S/Kademlia) really is to make a function that is only practical to compute on general-purpose computing devices. Functions that are selective in their forms of resource-hardness only provide ASIC resistance for so long (as we could see with Equihash), given an ASIC's capability to specialize in computing very selective, specific tasks.
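To illustrate the memory-size point from the quoted post: Python's standard library exposes scrypt with a tunable N, and the working set scrypt needs is roughly 128 * r * N bytes, so raising N from 1024 to 65536 moves it from about 1 MiB to about 64 MiB. This is just a sketch of the cost knob, not mining code; the inputs and headroom margin are made up for illustration:

```python
import hashlib

def scrypt_cost_demo(n: int) -> int:
    """Return the approximate memory (bytes) scrypt needs for cost parameter n,
    and run one derivation to show it is still computable on a normal CPU."""
    r, p = 8, 1                      # common block-size / parallelism settings
    approx_mem = 128 * r * n         # scrypt's working set is ~128 * r * N bytes
    hashlib.scrypt(b"block-header", salt=b"nonce", n=n, r=r, p=p,
                   maxmem=approx_mem + 1024 * 1024,  # small headroom over the estimate
                   dklen=32)
    return approx_mem

print(scrypt_cost_demo(1024))    # ~1 MiB working set: easy for GPUs and ASICs
print(scrypt_cost_demo(65536))   # ~64 MiB working set: far harder to parallelize cheaply
```

Note that `maxmem` must be raised explicitly for large N, since the underlying OpenSSL implementation caps memory at about 32 MiB by default.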
member
Activity: 322
Merit: 54
Consensus is Constitution
ASIC-resistant algorithms are not doomed to failure.  The problem is that they are implemented so poorly, because they have to cater to GPUs and therefore make the memory requirement too small.  Even Monero made it small enough to fit in the processor cache; it is just not big enough.  You may say the ASICs will just add more memory, but the problem is that random memory accesses cost a lot of processor speed, which GPUs and ASICs need to minimize to be competitive.  Scrypt, the classic alt algorithm, only picked 1024 for its memory size (N value), whereas for real ASIC resistance you need 20,000, and even 50,000-100,000 is no problem for CPUs.  You can't have your cake and eat it too: if you want GPUs to be fast at mining, you will have an ASIC problem; if you design it so GPUs will struggle and GPU miners won't like mining your coin, then you are safe.  We need to mature as a community, I feel.  It is time to drop our GPU love affair.  The ideal algorithm will require a CPU and GPU in tandem, and this algorithm is "factorization of large numbers", for which an ASIC has never been created even though an incentive has existed for decades.  A miner for this algorithm requires a CPU and GPU.
legendary
Activity: 3122
Merit: 2178
Playgram - The Telegram Casino
You need to look up the Sybil attack. What constitutes a node? A port? All the mining farms have to do is split their existing hashrate across whatever limit you're defining, and they achieve the same result.

Proof of Stake solves the Sybil problem by simply making it irrelevant how many nodes your coins are sitting on...

Likewise, PoW solves Sybil attacks by making it irrelevant how many nodes your hashrate is coming from.

Also keep in mind that PoS-based schemes are arguably more prone to centralization than PoW. Their initial monetary base needs to be centrally issued, as otherwise there would be nothing to stake with; the rich get richer by definition; and unlike PoW coins, where you usually have a "division of power" between devs, miners and holders, with PoS coins all three usually fall into the same exclusive circle.


monsterer2 is correct. It is worth noting that exactly this inability to ensure that every node / user / market participant has only "one vote" (i.e. the maximum hashrate permitted) is why PoW is applied to cryptocurrencies in the first place.

But that brings known issues, such as the BTC network being much less decentralized than Satoshi intended. Originally it was meant to be mined by *every single user of the network*, with their mere CPU or, let's say, GPU. But ASICs and mining farms totally wrecked this concept. Getblocktemplate lowers the danger of (pool) centralisation a little, but not the issue of huge farms like Bitmain providing a lot of hashrate on their own.

ASIC-resistant algorithms are not a solution to save PoW either, as every form of ASIC "resistance" has been broken sooner or later.

Actually Satoshi did foresee mining farms:
https://bitcointalksearch.org/topic/m.6306

In a way, at least.

I concur that centralization and the current dominance of Bitmain are problematic. I also fully agree that any attempt at creating ASIC-resistant algorithms is likely to fail.

This doesn't change anything about the inability to limit the hashrate of individual nodes, though. I'm afraid that's an inherent property of PoW. And I'm afraid that, for all its flaws, PoW is currently the most decentralized, secure consensus algorithm cryptocurrencies have to offer.
member
Activity: 187
Merit: 20
monsterer2 is correct. It is worth noting that exactly this inability to ensure that every node / user / market participant has only "one vote" (i.e. the maximum hashrate permitted) is why PoW is applied to cryptocurrencies in the first place.

But that brings known issues, such as the BTC network being much less decentralized than Satoshi intended. Originally it was meant to be mined by *every single user of the network*, with their mere CPU or, let's say, GPU. But ASICs and mining farms totally wrecked this concept. Getblocktemplate lowers the danger of (pool) centralisation a little, but not the issue of huge farms like Bitmain providing a lot of hashrate on their own.

ASIC-resistant algorithms are not a solution to save PoW either, as every form of ASIC "resistance" has been broken sooner or later.
member
Activity: 187
Merit: 20
Maybe for Bitcoin and all other Proof of Work coins: would it be possible to somehow limit the hashrate the network accepts from a node? Like, what if you limit the hashrate per node to what a normal desktop CPU can do? This would cause all ASICs to become irrelevant and/or increase the effort of setting up huge mining farms, as these, which generate thousands or millions of times the hashrate of a CPU, would need to run as many nodes as their hashrate exceeds the abilities of a standard desktop CPU. It would also remove the need to create new ASIC-resistant algorithms just for them to be cracked a few years later.

You need to look up the Sybil attack. What constitutes a node? A port? All the mining farms have to do is split their existing hashrate across whatever limit you're defining, and they achieve the same result.

Proof of Stake solves the Sybil problem by simply making it irrelevant how many nodes your coins are sitting on...
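A quick sketch of why the per-node cap in the proposal above doesn't bite: a farm's chance of finding the next block depends only on its total hashrate, so splitting that hashrate across many capped "nodes" leaves its share of blocks unchanged. All the numbers below are made up purely for illustration:

```python
# Illustrative only: block-finding probability is proportional to hashrate,
# no matter how many "nodes" that hashrate is split across.
CAP = 50_000                      # hypothetical per-node hashrate cap (H/s)
farm_hashrate = 5_000_000_000     # one farm's total hashrate (hypothetical)
network_hashrate = 50_000_000_000 # whole network's hashrate (hypothetical)

# Scenario A: the farm mines as one giant node.
share_single = farm_hashrate / network_hashrate

# Scenario B: the farm Sybils itself into many nodes, each exactly at the cap.
num_sybils = farm_hashrate // CAP            # 100,000 capped identities
share_split = (num_sybils * CAP) / network_hashrate

print(share_single, share_split)  # the two shares are identical
```

The cap only changes the bookkeeping (100,000 fake identities instead of one), not the economics, which is exactly the Sybil problem being described.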
legendary
Activity: 3122
Merit: 2178
Playgram - The Telegram Casino
monsterer2 is correct. It is worth noting that exactly this inability to ensure that every node / user / market participant has only "one vote" (i.e. the maximum hashrate permitted) is why PoW is applied to cryptocurrencies in the first place.

To add to that, the network doesn't know anything about a per-node hashrate. It only knows about block intervals and periodically adjusts the network difficulty (i.e. the average amount of work / hashes required per block) to keep this block interval the same. The network "knows" that the hashrate of the network as a whole has increased or declined, but it has no means to reliably derive the hashrate share a single node (e.g. a mining pool) holds.
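The adjustment described above can be sketched as a simplified Bitcoin-style retarget rule (Bitcoin actually adjusts a 256-bit target every 2016 blocks and clamps the change to a factor of 4; this toy version works on a scalar difficulty):

```python
def retarget(old_difficulty: float, actual_seconds: float,
             expected_seconds: float = 2016 * 600) -> float:
    """Simplified Bitcoin-style difficulty retarget.

    The network only observes how long the last 2016 blocks took in total;
    it rescales difficulty so the *aggregate* hashrate again produces
    ~10-minute blocks. Nothing here reveals any single node's share.
    """
    ratio = expected_seconds / actual_seconds
    ratio = max(0.25, min(4.0, ratio))   # Bitcoin clamps each adjustment to 4x
    return old_difficulty * ratio

# Blocks arrived twice as fast as intended -> difficulty doubles.
print(retarget(1000.0, actual_seconds=2016 * 300))  # 2000.0
```

Note that the only input is elapsed time for a window of blocks: total hashrate is inferred, but a per-node hashrate never appears anywhere in the rule, which is the point of the post above.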
full member
Activity: 351
Merit: 134
Maybe for Bitcoin and all other Proof of Work coins: would it be possible to somehow limit the hashrate the network accepts from a node? Like, what if you limit the hashrate per node to what a normal desktop CPU can do? This would cause all ASICs to become irrelevant and/or increase the effort of setting up huge mining farms, as these, which generate thousands or millions of times the hashrate of a CPU, would need to run as many nodes as their hashrate exceeds the abilities of a standard desktop CPU. It would also remove the need to create new ASIC-resistant algorithms just for them to be cracked a few years later.

You need to look up the Sybil attack. What constitutes a node? A port? All the mining farms have to do is split their existing hashrate across whatever limit you're defining, and they achieve the same result.
member
Activity: 187
Merit: 20
Maybe for Bitcoin and all other Proof of Work coins: would it be possible to somehow limit the hashrate the network accepts from a node? Like, what if you limit the hashrate per node to what a normal desktop CPU can do? This would cause all ASICs to become irrelevant and/or increase the effort of setting up huge mining farms, as these, which generate thousands or millions of times the hashrate of a CPU, would need to run as many nodes as their hashrate exceeds the abilities of a standard desktop CPU. It would also remove the need to create new ASIC-resistant algorithms just for them to be cracked a few years later.