Topic: How can I get a hashrate from hash-difficulty

legendary
Activity: 4522
Merit: 3426
December 27, 2024, 03:30:44 PM
#9
Quote
The calculated hashrate of miners, based on the difficulty of the shares they submit

That is not how mining pools normally work. Normally, the mining pool creates its own target value that is (for example) 2^32 * T, where T is the current network target. A miner submits all hashes less than the pool target. These are the "shares".

The pool may use the rate of submitted shares to estimate a miner's hash rate, but the hash rate value is only informational. The number of shares is what is used to determine the payout according to the pool's distribution system.
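A minimal sketch of that scheme in Python (the names and the example factor are illustrative, not taken from any particular pool):

```
# Sketch, assuming the pool target is an easier multiple of the
# network target T; any hash below the pool target counts as a share.
POOL_FACTOR = 2**32  # example factor from the post above

def is_share(hash_int, network_target):
    # A hash qualifies as a share if it is below the (easier) pool target.
    return hash_int < network_target * POOL_FACTOR

def estimate_hashrate(share_count, share_difficulty, window_seconds):
    # Informational only: a share of difficulty D takes about D * 2**32
    # hashes on average, so the share rate implies an expected hashrate.
    return share_count * share_difficulty * 2**32 / window_seconds
```

The payout itself would still be driven by the share count; the hashrate figure is only shown for information.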

I suggest taking a look at the Mining forum.
member
Activity: 294
Merit: 11
Lord Shiva
December 25, 2024, 09:35:54 AM
#8
We are working with an engine based on https://github.com/stratum-mining/stratum - you are probably right, it's worth basing calculations on the target rather than the average hash_diff.
hero member
Activity: 714
Merit: 1010
Crypto Swap Exchange
December 24, 2024, 04:24:29 PM
#7
From my understanding, since mining is a completely random process, what determines the hashrate of a worker is the frequency with which it can deliver valid shares above a certain lower difficulty threshold.

The pool tells the worker (via the work items it hands out) to report back only if it finds a share above difficulty x. This threshold x has to be chosen based on the worker's capabilities and is subject to adjustment on the fly. The worker shouldn't report back too often, with too high a frequency (work items whose difficulty is too low for the worker's hashpower), but also not with too low a frequency, as the pool then might not know whether the worker is actually doing any work.

I think it's a misconception to calculate the hashrate from the specific difficulty value of each submitted share. You should instead observe the frequency with which a certain difficulty is exceeded, regardless of the specific value that exceeded that lower limit.

If I'm not wrong (see https://en.bitcoin.it/wiki/Difficulty), to statistically exceed difficulty D within a 600 s timeframe you have to execute a hashrate of at least D * 2^32 / 600. To sample with a higher frequency you can adjust the denominator accordingly.
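A quick numeric check of that formula (the function is mine, purely for illustration):

```
# Hashrate needed to statistically exceed difficulty D about once per
# window, per the formula above: D * 2**32 / window_seconds.
def required_hashrate(difficulty, window_seconds=600.0):
    return difficulty * 2**32 / window_seconds

print(required_hashrate(1.0))       # ~7.16 MH/s to hit difficulty 1 per 600 s
print(required_hashrate(1.0, 300))  # sampling twice as often doubles it
```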


The mining pool code of ckPool is open source. From the last time I looked at the code, I don't remember it being very well documented, but you may have a look at it if you haven't done so already. The repository link is https://bitbucket.org/ckolivas/ckpool
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
December 24, 2024, 10:03:04 AM
#6
50% of the shares come in with a difficulty between 0.05 and 0.15 at a rate of 10 shares per minute. However, episodically, CPUMiner calculates shares with a difficulty of 10.0 and even 40.0
Shouldn't each share have the same difficulty?
member
Activity: 294
Merit: 11
Lord Shiva
December 24, 2024, 05:54:53 AM
#5
Essentially, we need the hashrate displayed by the miner's device to match the hashrate we calculate based on the difficulty of the shares (hashes) it submits.
Can't you accomplish that by making the shares small enough that the miner has multiple shares per reward period? That way you should get a nice average instead of peaks.

A real-world example: during local testing with CPUMiner, 50% of the shares come in with a difficulty between 0.05 and 0.15 at a rate of 10 shares per minute. However, episodically, CPUMiner calculates shares with a difficulty of 10.0 and even 40.0 - and this completely skews the hashrate calculation over 10-minute intervals. I've tried trimming extreme values in the calculations, but I still can't achieve acceptable accuracy between the hashrate reported by CPUMiner in its logs and the hashrate we calculate. I'm currently experimenting with different normalization schemes, but haven't found a suitable solution yet. So I thought - maybe someone has a ready-made solution.
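One thing worth trying (a sketch of the frequency-based idea discussed in this thread, not a confirmed fix): credit every accepted share at the difficulty the pool assigned to the job rather than the difficulty the hash happened to reach, so a lucky diff-40.0 hash found on a diff-0.1 job still counts as one diff-0.1 share:

```
# Hypothetical sketch: credit accepted shares at the assigned job
# difficulty so that lucky high-difficulty hashes cannot skew the estimate.
def window_hashrate(assigned_difficulties, window_seconds=600.0):
    # One list entry per accepted share, each at the job's assigned difficulty.
    return sum(d * 2**32 for d in assigned_difficulties) / window_seconds

# 100 accepted shares on diff-0.1 jobs over 600 s, no matter how "lucky"
# any individual hash was:
print(window_hashrate([0.1] * 100))  # ~71.6 MH/s
```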
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
December 24, 2024, 03:39:26 AM
#4
Essentially, we need the hashrate displayed by the miner's device to match the hashrate we calculate based on the difficulty of the shares (hashes) it submits.
Can't you accomplish that by making the shares small enough that the miner has multiple shares per reward period? That way you should get a nice average instead of peaks.
member
Activity: 294
Merit: 11
Lord Shiva
December 24, 2024, 03:14:56 AM
#3
Let me start by saying I'm not a Bitcoin miner, but can't you increase the time period over which you calculate the hashrate?

We are developing a mining pool, and it would be beneficial for us to know the hashrate statistics of each worker over a 10-minute period to distribute rewards fairly. Essentially, we need the hashrate displayed by the miner's device to match the hashrate we calculate based on the difficulty of the shares (hashes) it submits.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
December 24, 2024, 02:44:56 AM
#2
However, in practice, CPUMiner, for example, finds shares with difficulties ranging from 0.01 to 10.0 within a 10-minute period. As a result, our formula leads to a hashrate that fluctuates wildly (by several times), deviating significantly from the actual hashrate.
Isn't that normal? That's why the network hashrate is an estimate, and the difficulty is only adjusted every 2016 blocks.

Quote
Could you suggest a solution or at least point us in the right direction for further investigation?
Let me start by saying I'm not a Bitcoin miner, but can't you increase the time period over which you calculate the hashrate?
member
Activity: 294
Merit: 11
Lord Shiva
December 23, 2024, 12:44:51 PM
#1
Greetings!

During testing, while calculating the hashrate of workers (in a Bitcoin application), we've encountered a challenge:

- The calculated hashrate of miners, based on the difficulty of the shares they submit, differs significantly (by a factor of several) from the actual hashrate of the workers – both for ASICs and CPUMiner.

We calculate the hashrate as follows (this is pseudocode for simplicity):

```
# Difficulty implied by one submitted share's hash value
difficulty = max_target / hash_value
# Expected hashes per share is difficulty * pow(2, 32); sum over the window
total_hashes = sum(d * pow(2, 32) for d in share_difficulties)
# Average hashrate over the window (600 seconds in our case)
hash_rate = total_hashes / window_seconds
```

Essentially, we determine the expected number of hashes required to find a hash of a given difficulty (`difficulty * pow(2, 32)`), do this for every share found in the last 10 minutes, sum the results, and divide by 600 seconds.
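As a worked example, a share whose hash corresponds to difficulty 1.0 is credited 2^32 ≈ 4.29 * 10^9 hashes, so ten such shares in 600 seconds give 10 * 2^32 / 600 ≈ 71.6 MH/s.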

It seems like it should work.

However, in practice, CPUMiner, for example, finds shares with difficulties ranging from 0.01 to 10.0 within a 10-minute period. As a result, our formula leads to a hashrate that fluctuates wildly (by several times), deviating significantly from the actual hashrate.

Have you encountered anything similar? Could you suggest a solution or at least point us in the right direction for further investigation?

Thanks in advance to everyone who participates in this discussion.