
Topic: Pool mathematics (Read 4707 times)

newbie
Activity: 11
Merit: 0
April 24, 2011, 01:04:33 PM
#13
Shares and GetWorks are completely and utterly unrelated.

There is an expected number of hashes that you have to calculate, on average, before you find a share. That expected number depends on the difficulty. All current pools use a difficulty of 1, so the expected number of hashes per share is 2^32.

It is only a coincidence that the number of possible hashes in a GetWork is also 2^32. There is absolutely no causation between the two.

Oh, OK, so it's based on the difficulty. Thanks!
sr. member
Activity: 406
Merit: 250
April 24, 2011, 08:30:11 AM
#12
Shares and GetWorks are completely and utterly unrelated.

There is an expected number of hashes that you have to calculate, on average, before you find a share. That expected number depends on the difficulty. All current pools use a difficulty of 1, so the expected number of hashes per share is 2^32.

It is only a coincidence that the number of possible hashes in a GetWork is also 2^32. There is absolutely no causation between the two.
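The quoted relationship can be written down directly; a minimal sketch in Python (the function name is mine):

```python
def expected_hashes_per_share(difficulty=1.0):
    """Average number of hashes needed to find one share.

    A difficulty-1 share target gives each hash a 1-in-2**32 chance
    of qualifying, so the expectation is difficulty * 2**32.
    """
    return difficulty * 2 ** 32
```

At difficulty 1 this gives 2^32 ≈ 4.29 billion hashes per share, which is where all the other numbers in this thread come from.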
newbie
Activity: 11
Merit: 0
April 24, 2011, 03:01:56 AM
#11

It works like this...

Minutes_to_Average = 5
Delta = 60 * 5

Current_Shares = The total number of shares submitted in the round
Shares_Before_Delta = Shares submitted before Delta-seconds ago

Total_Hash/s = (((Current_Shares - Shares_Before_Delta) * 2^32) / Delta)

Ghash/s = (((Total_Hash/s / 1000) / 1000) / 1000)


I was inclined to believe this was true, but upon further thought, I don't think so anymore. Each share is simply some hash that's below the share target. There can be 0, 1, 2, or more of these hashes in a single getwork response. While a getwork response is worth 2**32 hashes of work, a share isn't necessarily worth that much.

Looking at how other pools calculate their hash speeds, it seems they assume each getwork yields, at least on average, a single valid share.

Is there a way to find which getwork created a share? That way, I can disregard double-share getworks and have a more realistic pool measurement.
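One possible approach to the question above, sketched under the assumption that a getwork submission is the same block data the miner was handed with only the 4-byte nonce changed (bytes 76..79 of the 80-byte header). Indexing issued work by the header bytes before the nonce would let a submitted share be traced back to its getwork. All names here are hypothetical:

```python
# Hypothetical share-to-getwork tracker. The key is the first 76 bytes
# (152 hex chars) of the header: the part a nonce-rolling miner can't
# change. Miners that also roll ntime would alter bytes inside this
# prefix, so a real implementation might exclude the timestamp field too.

issued = {}  # header prefix -> number of shares seen for that getwork

def header_key(data_hex):
    # First 76 header bytes = 152 hex characters, everything up to the nonce.
    return data_hex[:152]

def record_getwork(data_hex):
    # Remember a getwork we handed out, starting with zero shares.
    issued.setdefault(header_key(data_hex), 0)

def record_share(data_hex):
    # Return how many shares this getwork has now produced,
    # or None if the share matches no getwork we issued.
    key = header_key(data_hex)
    if key not in issued:
        return None
    issued[key] += 1
    return issued[key]
```

A getwork whose counter reaches 2 would be one of the "double-share" getworks to discount.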
sr. member
Activity: 258
Merit: 250
April 22, 2011, 08:04:48 AM
#10
Multiply by 71. That will be some kind of guesstimate.

How did you arrive at that number? I read something about each getwork being 2^x hashes worth of work, but I forgot what x was and can't find that post again. It may have been 2^18 or 262,144 hashes, which would mean 4 getworks is 1MHashes of work? That seems to match slush's stats (deepbit doesn't say how many getworks there are).

1 minute timedelta  (60sec)

hashrate = (shares_per_timedelta * (2 ** 32)) / timedelta

to Mhash/sec:

hashrate / 1000000

~71.5


Where did you get the shares_per_timedelta from? It looks like you answered my question indirectly. Each share is one valid getwork and 2**32 hashes, so valid_getworks_per_second * (2**32) / (10**9) = GHashes/s.

Thanks!


It works like this...

Minutes_to_Average = 5
Delta = 60 * 5

Current_Shares = The total number of shares submitted in the round
Shares_Before_Delta = Shares submitted before Delta-seconds ago

Total_Hash/s = (((Current_Shares - Shares_Before_Delta) * 2^32) / Delta)

Ghash/s = (((Total_Hash/s / 1000) / 1000) / 1000)
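The steps above translate directly into Python; a minimal sketch (function and variable names are mine):

```python
import time

def pool_hashrate(share_times, minutes_to_average=5, now=None):
    """Estimate pool hash rate from share-submission timestamps (seconds).

    Shares inside the window correspond to
    Current_Shares - Shares_Before_Delta in the formula above;
    each difficulty-1 share is counted as 2**32 hashes of work.
    """
    now = time.time() if now is None else now
    delta = 60 * minutes_to_average
    shares_in_window = sum(1 for t in share_times if now - t <= delta)
    total_hash_per_sec = shares_in_window * 2 ** 32 / delta
    ghash_per_sec = total_hash_per_sec / 1000 / 1000 / 1000
    return total_hash_per_sec, ghash_per_sec
```

For example, 100 shares submitted in the last 5 minutes works out to 100 * 2^32 / 300 ≈ 1.43 GH/s.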


newbie
Activity: 11
Merit: 0
April 20, 2011, 11:24:25 PM
#9
Multiply by 71. That will be some kind of guesstimate.

How did you arrive at that number? I read something about each getwork being 2^x hashes worth of work, but I forgot what x was and can't find that post again. It may have been 2^18 or 262,144 hashes, which would mean 4 getworks is 1MHashes of work? That seems to match slush's stats (deepbit doesn't say how many getworks there are).

1 minute timedelta  (60sec)

hashrate = (shares_per_timedelta * (2 ** 32)) / timedelta

to Mhash/sec:

hashrate / 1000000

~71.5


Where did you get the shares_per_timedelta from? It looks like you answered my question indirectly. Each share is one valid getwork and 2**32 hashes, so valid_getworks_per_second * (2**32) / (10**9) = GHashes/s.

Thanks!
hero member
Activity: 742
Merit: 500
BTCDig - mining pool
April 20, 2011, 11:18:37 PM
#8
Multiply by 71. That will be some kind of guesstimate.

How did you arrive at that number? I read something about each getwork being 2^x hashes worth of work, but I forgot what x was and can't find that post again. It may have been 2^18 or 262,144 hashes, which would mean 4 getworks is 1MHashes of work? That seems to match slush's stats (deepbit doesn't say how many getworks there are).

1 minute timedelta  (60sec)

hashrate = (shares_per_timedelta * (2 ** 32)) / timedelta

to Mhash/sec:

hashrate / 1000000

~71.5
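For what it's worth, the ~71.5 figure is just 2^32 hashes spread over 60 seconds: one difficulty-1 share per minute corresponds to roughly 71.58 MH/s, so MH/s ≈ shares-per-minute × 71.5. A quick check (variable name is mine):

```python
# MH/s represented by one difficulty-1 share per minute.
mhash_per_share_per_minute = 2 ** 32 / 60 / 1_000_000
```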


newbie
Activity: 11
Merit: 0
April 20, 2011, 11:14:06 PM
#7
Multiply by 71. That will be some kind of guesstimate.

How did you arrive at that number? I read something about each getwork being 2^x hashes worth of work, but I forgot what x was and can't find that post again. It may have been 2^18 or 262,144 hashes, which would mean 4 getworks is 1MHashes of work? That seems to match slush's stats (deepbit doesn't say how many getworks there are).
hero member
Activity: 742
Merit: 500
April 20, 2011, 11:02:20 PM
#6
Yeah, I was looking at that earlier. I'll contact him for the proof he mentioned.
Does anybody know the correct calculation for GHashes/s based on getworks? Or is there another way to calculate it?
You can't calculate hashing luck from getworks. Anyone can request them at any rate.
That's why I qualified it with valid getworks earlier, those are getworks that are also submitting previous work with a valid hash.
Multiply by 71. That will be some kind of guesstimate.
newbie
Activity: 11
Merit: 0
April 20, 2011, 10:56:03 PM
#5
Yeah, I was looking at that earlier. I'll contact him for the proof he mentioned.

Does anybody know the correct calculation for GHashes/s based on getworks? Or is there another way to calculate it?
You can't calculate hashing luck from getworks. Anyone can request them at any rate.

That's why I qualified it with valid getworks earlier, those are getworks that are also submitting previous work with a valid hash.
hero member
Activity: 742
Merit: 500
April 20, 2011, 09:48:39 PM
#4
Yeah, I was looking at that earlier. I'll contact him for the proof he mentioned.

Does anybody know the correct calculation for GHashes/s based on getworks? Or is there another way to calculate it?
You can't calculate hashing luck from getworks. Anyone can request them at any rate.
newbie
Activity: 11
Merit: 0
April 20, 2011, 08:53:42 PM
#3
Yeah, I was looking at that earlier. I'll contact him for the proof he mentioned.

Does anybody know the correct calculation for GHashes/s based on getworks? Or is there another way to calculate it?
sr. member
Activity: 392
Merit: 250
April 20, 2011, 06:32:21 PM
#2
I don't see a need to start up another pool using the same strategy as someone else. There are already two proportional pools and two score-based ones.

You might want to have a look at this https://bitcointalksearch.org/topic/geometric-method-new-cheat-proof-mining-pool-scoring-method-4787

Holy-Fire thinks he's got a method that is better against cheating than the current pool strategies in place.

I don't know whether it is or isn't, but he might help you implement it.

newbie
Activity: 11
Merit: 0
April 20, 2011, 06:23:29 PM
#1
Hey guys,

I'm working on starting up another pool to increase the options available, but I'm not as knowledgeable about the implementation details, so I'm hoping somebody can help me out.

How should I go about calculating the pool performance (Mhashes/s and GHashes/s) from the number of valid getworks (the ones that result in a hash smaller than the minimum difficulty)? From some basic math, I've figured it's 4*valid_getworks GHashes/s, but I'd like confirmation (or rebuttal).
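The 4x figure is consistent with each valid getwork representing one difficulty-1 share, i.e. 2^32 ≈ 4.29 × 10^9 hashes of expected work, so valid getworks per second × ~4.29 ≈ GH/s; the "4" just rounds that down. A quick check (variable name is mine):

```python
# Gigahashes of expected work behind one difficulty-1 share.
ghash_per_share = 2 ** 32 / 10 ** 9
```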

I'm using score-based payout according to slush's pool but I'm willing to change that if users are interested in another cheat-resistant/proof method.

Thanks!