
Topic: Improved Measurement of Proof-of-Work using entropy (Read 345 times)

member
Activity: 99
Merit: 91
Use of the apparent difficulty has been proposed many times before -- it and other schemes that result in a unique value for each block which is calculable by the block finder immediately lead to a withholding attack: if you know your block is unusually good, such that it is likely to win even if announced late, you are better off not announcing it. The withholding profits are proportional to your share of hash power (as that's how often your advantage gets you a second block), so it's also a centralization pressure.

Bitcoin's criterion has always been that each node adopts the first block it sees which meets the target, which creates an incentive to announce (one which breaks down if your share of the hashpower is over a large threshold, but if any party controls that much hashpower the system has other, worse problems).


First off, I really appreciate you taking the time to reply. I am aware of the previous proposals to use lexicographic mechanisms to create a preference in block choice. However, the proposal here is different in that it rests on two new realizations that, although subtle, are what make it powerful. I know the paper is very dense, so I will try to give a more intuitive idea of what is going on.

1) Although each hash is an independent event, each block is a dependent event because it references the hash of the parent on which it is built.
2) The weighting of each block should be based on the expected number of hashes that are needed to reproduce the chain of blocks specified by the tip block.

These two ideas, taken together, allow us to propose a new weighting mechanism, entropy reduction, which eliminates the issues around withholding attacks.

When a block is produced, the number of extra bits beyond the block threshold (additional leading binary zeros, i.e. log2(target/hash)) follows an exponential distribution with a mean expectation of 1. This means a miner will expect, on average, to get one more bit than the threshold, but they could get lucky and find more than one extra bit, or unlucky and find less. The interesting thing here is that even if they get lucky and find, say, 3 extra bits, there is still a 12.5% chance that they lose if a competing block is produced. The likelihood of losing never goes to zero.
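To make that concrete, here is a minimal Monte Carlo sketch (my own illustration, not code from the paper; the toy target and the helper p_lose_to_competitor are assumptions), assuming idealized uniform hashes below a power-of-two target T. A block with k extra bits has hash h < T / 2^k, and a competing block beats it only if its own hash is smaller still, which happens with probability 2^-k.

Code:
import random

T = 1 << 44  # hypothetical target in a 64-bit toy hash space

def p_lose_to_competitor(k, trials=200_000):
    """Monte Carlo estimate of the chance that a competing block beats a
    block which landed k bits below the target."""
    cutoff = T >> k  # a competitor wins only with a hash below T / 2^k
    wins = sum(1 for _ in range(trials) if random.randrange(T) < cutoff)
    return wins / trials

for k in (1, 2, 3):
    print(f"k={k}: analytic {2**-k:.4f}, simulated {p_lose_to_competitor(k):.4f}")

For k = 3 both columns come out near 0.125, matching the 12.5% figure above, and the analytic value 2^-k never reaches zero for any finite k.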

In addition, weighting by entropy means adding bits, which is the same as multiplying in the linear field. Specifically, Bitcoin uses Total Difficulty(n) = diff(n) + Total Difficulty(n-1), whereas PoEM uses Total Difficulty(n) = diff(n) * Total Difficulty(n-1). This means that even if a miner were to get super "lucky" and find 20 extra bits (about 1 in 1 million blocks), the extra entropy would only count for about a third of a normal block, and they would still have roughly a 2^-20 chance of losing if they withheld the block. This means that no matter how lucky a miner gets, they will always maximize revenue by broadcasting the block as quickly as possible. This actually reduces the withholding attack that exists within the current difficulty weighting system.
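Here is a toy calculation contrasting the two accumulation rules (my own numbers, not from the paper; the 60-bit threshold is a hypothetical value chosen so that 20 extra bits are about a third of a normal block):

Code:
# Additive (linear difficulty) accounting vs. entropy (bit) accounting.
D = 60        # hypothetical threshold bits for a "normal" block
extra = 20    # extra bits found by the lucky miner

# Adding apparent difficulties: the lucky block outweighs ~2^20 normal blocks.
print((2 ** (D + extra)) / (2 ** D))   # 1048576.0

# Multiplying difficulties = adding bits: the lucky block carries
# D + extra bits versus D bits, only a fractional advantage.
print((D + extra) / D)                 # 1.333...

Under the multiplicative (entropy) rule the lucky block gains only about a third of a normal block's weight, so withholding it buys almost nothing; under additive apparent difficulty it would outweigh roughly a million normal blocks.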

If we took the same example but used the current method of adding difficulties to determine block weight, the withholding period for finding 20 extra bits would be 1 million blocks. So compare 1 million to 1. This example illustrates that this is a very different proposal from simply using apparent difficulty. Rather, this proposal is doing two things. First, it recognizes that apparent difficulty is beneficial because it creates a rule which makes heavier chains and actually adds randomness into winning a block, which prevents withholding. Second, it uses the proper combinatorics, treating blocks as a series of dependent events, to weight them, which in turn solves all of the issues with using apparent difficulty.

I hope that provides some intuition. In the paper we have shown that this is indeed the case, both formally and empirically. If you look at the graph on page 12, we show that for any g (block production rate per network delay) the confirmation delay is smaller using this method compared to the current Bitcoin weighting.



We also explored the concept of concentration, i.e. looking at how the outcome distribution affects the statistical convergence. Bitcoin's would be a binomial distribution and PoEM's would be a biased gamma.

 

PS: Another way to think about this is that there is infinite variance in a single sample, so reintroducing randomness in the weighting creates a better outcome, because it means that a miner with <50% of the hashrate is less likely to get sequentially lucky and produce a sequence of N blocks that is longer than the honest majority's.

legendary
Activity: 990
Merit: 1108
Use of the apparent difficulty has been proposed many times before -- it and other schemes that result in a unique value for each block which is calculable by the block finder immediately lead to a withholding attack
The withholding attack also reduces bitcoin's stochastic finality: a tx 6 blocks deep still has about a 1% chance of being reorged. One has to wait much longer before a tx can be considered finalized.

The scheme has other downsides as well:

Any time two blocks are found in short succession, with the later one having a lower hash, it causes the earlier one to be reorged, even when that earlier one has already been relayed across most of the network, wasting bandwidth and causing unnecessarily many small reorgs.

staff
Activity: 4284
Merit: 8808
Use of the apparent difficulty has been proposed many times before -- it and other schemes that result in a unique value for each block which is calculable by the block finder immediately lead to a withholding attack: if you know your block is unusually good, such that it is likely to win even if announced late, you are better off not announcing it. The withholding profits are proportional to your share of hash power (as that's how often your advantage gets you a second block), so it's also a centralization pressure.

Bitcoin's criterion has always been that each node adopts the first block it sees which meets the target, which creates an incentive to announce (one which breaks down if your share of the hashpower is over a large threshold, but if any party controls that much hashpower the system has other, worse problems).
member
Activity: 99
Merit: 91
There is now a formal security proof showing the safety and liveness of Proof-of-Entropy-Minima (PoEM): https://eprint.iacr.org/2024/200

This relatively simple adjustment to the measurement of block weight has been empirically shown to prevent selfish mining, create faster finalization times, increase throughput, and decrease block times while maintaining safety and liveness of the chain. Modifications to the current heaviest-chain rule are very straightforward.

The TL;DR is:

Presently, the heaviest-chain rule does not treat blocks as dependent events, even though they are dependent because of the hash link to the parent.

PoEM uses combinatorics to weigh the dependent block events appropriately. This generates a guaranteed unique weight for each block, which is the expected number of hashes that would be required to produce a chain of equal weight (see the sketch at the end of this summary).

The guaranteed unique weight prevents the network from mining on competing forks, because the likelihood of two blocks having equal weight is 1/2^256 ~= 0.

Additionally, the process naturally introduces randomization on each sample (i.e. each block), which prevents profitable withholding attacks.

Finally, it shows that the threshold for a Sybil attack can be improved from 33% to 40%.
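For readers who want to see the weighting written out, here is a rough sketch of how I read the TL;DR above (the helper names intrinsic_bits and chain_weight_bits and the toy inputs are my own, not reference code): each block's intrinsic weight is the expected number of hashes needed to reproduce its hash, 2^256 / h, and the chain weight is the product of the per-block weights, or equivalently the sum of their log2 values.

Code:
import hashlib
import math

HASH_SPACE_BITS = 256

def intrinsic_bits(block_header: bytes) -> float:
    """log2 of the expected number of hashes needed to reproduce this
    block's hash (2^256 / h), i.e. the bits of entropy it removes."""
    h = int.from_bytes(hashlib.sha256(block_header).digest(), "big")
    return HASH_SPACE_BITS - math.log2(h + 1)  # +1 guards against h == 0

def chain_weight_bits(headers):
    """Total chain weight in bits; a tie between two tips would require
    equal 256-bit hashes, i.e. probability ~1/2^256."""
    return sum(intrinsic_bits(h) for h in headers)

print(chain_weight_bits([b"parent", b"child", b"grandchild"]))

Comparing tips by this total gives a strict ordering in practice, which is the "guaranteed unique weight" point above.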
legendary
Activity: 4466
Merit: 3391
I is finite, so it's a discrete probability space. The elementary event in this probability space is "hash h, with h in {0, ..., 2^I - 1}, has been generated". The probability of every elementary event is 2^-I, regardless of which hash h corresponds to this elementary event.

I think that is the key point. The amount of work needed to produce a block depends on the target value and not on the value of its hash. Therefore, using the hash value to determine the longest chain could be considered incorrect, since the criterion is the greatest amount of work.

However, I don't see a problem with the distortion in practice, and it does provide the benefits of a reduction in orphaned blocks and a deterrence of selfish mining.

BTW, using the hash values rather than the target values to determine the longest chain is not a new idea, though this is the first time I have seen the idea explored in detail. My main criticism is the misapplication of "entropy".
sr. member
Activity: 1190
Merit: 469

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Yeah, I mean Satoshi did things using the KISS principle, but amazingly he got things right. People keep trying to come up with "improvements" to certain things, but they are invariably way overcomplicated and have issues, so they don't turn out to be an improvement in any real sense. Just keep it simple.

If you can't explain something with a simple example, then it's probably not something worth trying to figure out.
hero member
Activity: 882
Merit: 5834
not your keys, not your coins!
~
What do you think about the takeaway/explanation by one of the authors? https://twitter.com/mechanikalk/status/1633694566200094723
Quote them here, please.

New paper on novel mechanism used in @QuaiNetwork to scale blockchain. Proof-of-Entropy-Minima (PoEM). TL;DR blockchains are time invariant static sequence of commitments. PoW is massively inefficient bc it doesn’t do the math correctly. #btc #ETH #Crypto

Some more takeaways:
1) entropy reduction is the goal of work
2) deltaS=1/2^n defines blockchain
3) All PoW vulnerabilities arise bc #2 is not used
4) 67% of all orphans are completely avoidable with PoEM.
5) The best hash algos reduce entropy most efficiently while preserving one-way collision resistance w uniform distribution.
6) Using time to describe a blockchain is completely incorrect
7) Withholding and reordering attacks only exist bc difficulty is computed incorrectly
8) Correct computation of head entropy makes perfect head choices, even with withholding and stales, without any secondary measurement or consideration
9) Satoshi added a friendly finalization mechanism to #BTC. Coercion of intrinsic bits to threshold difficulty. Without it #BTC would not have any finite guarantee of finalization.

The point about 'PoW is massively inefficient' is immediately wrong, in the sense that, as a general concept, if you make it easier / more efficient to calculate valid block hashes, the difficulty would adjust upwards to keep the ASICs running for 10 minutes on average to find a new block. Therefore, the energy consumption will not be reduced.
This holds true for any system or proposal that aims to make mining more efficient. Even reusing heat is just an efficiency improvement that leads to more money / value extracted from the same amount of electricity, driving up the difficulty to account for it.
member
Activity: 77
Merit: 28
Hold on.

The first lesson in Probability Theory class is the following.

Whenever you approach a problem, the first thing you should do is define a probability space. If the probability space is not defined, then none of these concepts makes sense. There is no "probability", there is no "expectation" or "divergence". There is also no "entropy" or "Shannon entropy".

The probability space has not been defined in the paper. If the choice of the probability space is obvious, then you can easily fill this gap and define the probability space instead of the author.



They do define the probability space, 2^I, which in the case of Bitcoin is 2^256.

I is finite, so it's a discrete probability space. The elementary event in this probability space is "hash h, with h in {0, ..., 2^I - 1}, has been generated". The probability of every elementary event is 2^-I, regardless of which hash h corresponds to this elementary event.

Now let's consider the event "block with target T has been mined". It means a miner has calculated a hash h <= T. Any two hashes h1 and h2 that satisfy this inequality have the same probability, regardless of what they look like and how many zeros there are in their binary representation. They carry the same "information".

The probability P of mining a block in this setting is (T+1)/2^I. The average number of attempts before success is 1/P, that is, 2^I/(T+1). This number is called the "block difficulty" or "block weight". It's an unbiased estimate of the work executed by miners. That's what we need, and the story often ends here.
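As a quick numeric check of those formulas (the target value below is my own toy choice, not taken from the paper):

Code:
# With I hash bits and target T, one attempt succeeds with probability
# P = (T + 1) / 2^I, and the expected number of attempts is
# 1/P = 2^I / (T + 1), i.e. the classical block difficulty / weight.
I = 256
T = 2**200 - 1  # hypothetical target

P = (T + 1) / 2**I                    # 2^-56, about 1.4e-17
expected_attempts = 2**I / (T + 1)    # 2^56, about 7.2e16
print(P, expected_attempts)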

In the paper an extra step is made. There is an assumption that every hash has an "information" or "weight". There is an attempt "to reduce the entropy" or, in other words, as I understood it, an attempt to reduce the standard deviation of the estimate from the estimated value. The fact is that when every hash has the same weight, the deviation is minimal.

I think that at some point in the paper, when the entropy was introduced, an unintentional switch to another probability space occurred. When you start treating hashes by their number of leading zeros, the fraction of "information" within the hash gets lost. You get a new probability space, a new problem and a new optimal unbiased estimate. However, this new solution might not be a solution to the original problem in the original probability space.
member
Activity: 99
Merit: 91
Hold on.

The first lesson in Probability Theory class is the following.

Whenever you approach a problem, the first thing you should do is define a probability space. If the probability space is not defined, then none of these concepts makes sense. There is no "probability", there is no "expectation" or "divergence". There is also no "entropy" or "Shannon entropy".

The probability space has not been defined in the paper. If the choice of the probability space is obvious, then you can easily fill this gap and define the probability space instead of the author.



They do define the probability space, 2^I, which in the case of Bitcoin is 2^256.
member
Activity: 77
Merit: 28
Hold on.

The first lesson in Probability Theory class is the following.

Whenever you approach a problem, the first thing you should do is define a probability space. If the probability space is not defined, then none of these concepts makes sense. There is no "probability", there is no "expectation" or "divergence". There is also no "entropy" or "Shannon entropy".

The probability space has not been defined in the paper. If the choice of the probability space is obvious, then you can easily fill this gap and define the probability space instead of the author.

member
Activity: 99
Merit: 91
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Please tell me what you think the mistake in probability theory is.

Thanks in advance.


There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?

I think the paper is correct.

In your opinion, what definition of "entropy" does the author use throughout the paper?

delta_S = 1/2^n, where n is the number of leading zeros. This makes sense from a Shannon entropy perspective, where entropy is a reduction in divergence, and it also makes sense from a system entropy standpoint, where miners exert work to lower the entropy and increase the system order.
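As a tiny illustration of that definition (the helper leading_zero_bits is my own, not code from the paper): with n the number of leading zero bits of a hash, delta_S = 1/2^n.

Code:
import hashlib

def leading_zero_bits(digest: bytes, width_bits: int = 256) -> int:
    """Number of leading zero bits in a big-endian digest."""
    value = int.from_bytes(digest, "big")
    return width_bits if value == 0 else width_bits - value.bit_length()

digest = hashlib.sha256(b"example block header").digest()
n = leading_zero_bits(digest)
delta_S = 1 / 2**n
print(n, delta_S)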
member
Activity: 99
Merit: 91
Quote
Current difficulty based weighting systems do not take the intrinsic block weight into account.
And it is good that they don't, because this change will lower the requirements for chain reorganization. If I understand it correctly, this proposal is about using chainwork based on the current block, instead of difficulty. Those kinds of changes were also discussed in other topics.

If you look at the paper, it is not just proposing that you change chainwork, but that you simultaneously change the calculation of tip weight from (very roughly) Td_new = Td_old + Td_threshold to Td_new = Td_old * Td_chainwork. That would make both chainwork and chain weight geometric, which would actually improve finalization time by minimizing conflicts and maximizing recorded work.
copper member
Activity: 821
Merit: 1992
Quote
Current difficulty based weighting systems do not take the intrinsic block weight into account.
And it is good that they don't, because this change will lower the requirements for chain reorganization. If I understand it correctly, this proposal is about using chainwork based on the current block, instead of difficulty. Those kinds of changes were also discussed in other topics.
member
Activity: 77
Merit: 28
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Please tell me what you think the mistake in probability theory is.

Thanks in advance.


There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?

I think the paper is correct.

In your opinion, what definition of "entropy" does the author use throughout the paper?
member
Activity: 99
Merit: 91
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Please tell me what you think the mistake in probability theory is.

Thanks in advance.


There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?

I think the paper is correct.
member
Activity: 77
Merit: 28
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Please tell me what you think the mistake in probability theory is.

Thanks in advance.

There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?
member
Activity: 99
Merit: 91
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

Please tell me what you think the mistake in probability theory is.

Thanks in advance.
member
Activity: 77
Merit: 28
I spent 15 minutes reading this paper, but didn't understand a thing. It's a shame they didn't spend some time writing an example or a simpler explanation on their website.

--snip--

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.

What do you think about the takeaway/explanation by one of the authors? https://twitter.com/mechanikalk/status/1633694566200094723

Quote them here, please.
member
Activity: 77
Merit: 28
This paper POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305) was just published to arxiv.  It seems like a much better way to measure the heaviest chain tip as well as minimize time to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to have better finalization time guarantees for a given hash.  Also, it has an equation that relates to finalization that would create objective measurable preference between different hash functions.

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me we can dive into details.
sr. member
Activity: 1190
Merit: 469
Can you give a simple example? Because it seems kind of abstract.