
Topic: Vanity Pool - vanity address generator pool - page 28. (Read 147800 times)

donator
Activity: 1064
Merit: 1000
I have just started mining the vanity keys Smiley
I am currently getting 19.3 MKeys/s with my 6970, which usually gets around 410 MHash/s.
And 22 MKeys/s with my 5850, which usually gets around 340 MHash/s.
Any idea why the 5850 is performing better at generating keys when my 6970 performs better at normal Bitcoin mining?
Also any suggestions or tips on how to increase my MKey/s rate?
Thanks Cheesy
//DeaDTerra
hero member
Activity: 759
Merit: 502
Cool. I added mine to the table. If anyone has data for their hardware and doesn't feel like editing the wiki, please at least post here!

Seems I cannot modify the wiki without account so my info

MSI R5850 Twin Frozr II, overclocked core/mem 885/885
Phoenix 2.0: 365.5 MHash/s
oclvanityminer: 23.5 Mkeys/s

I have a considerably lower ratio, 0.0643 (yours is 0.075), so while you can still mine vanity addresses more profitably right now, I can't
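The keys-per-hash ratio being compared above can be reproduced in a couple of lines of Python (a sketch using the figures from this post; the 0.075 figure belongs to the other poster's hardware):

```python
# Rough keys-per-hash ratio for comparing vanity mining to bitcoin
# mining on the same card (figures from the post above).
mkeys_per_s = 23.5   # oclvanityminer rate on the R5850
mhash_per_s = 365.5  # Phoenix 2.0 rate on the same card
ratio = mkeys_per_s / mhash_per_s
print(round(ratio, 4))  # 0.0643
```

A higher ratio means the card gives up relatively less of its bitcoin-mining throughput when switched to key generation.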
hero member
Activity: 720
Merit: 525
I pushed another commit with BTC/Gkey instead of BTC/Mkey. Would you please try it out? I don't have a machine to test it on available right now. Thanks!

https://github.com/fizzisist/vanitygen/commit/9981bbeed126a38543a4a869d4a6da957f3a8526
hero member
Activity: 720
Merit: 525
I just tried fizzisist's patch:

Quote
Total value for current work: 0.000000 BTC/Mkey

Actually, the current jobs don't look like they're worth the electricity ;-). However, I don't know if it is just a bug in the calculation or if the current work is really that hard/underpaid.

Heh, whoops. Actually, current work is well paid (better than bitcoin mining), but BTC/Mkey is 0 when shown with so few digits after the decimal. ;p

I'll submit another patch to change it to BTC/Gkey, since that will be a large enough number to easily make sense of.

Thanks for testing!
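The display bug is easy to reproduce: with six digits after the decimal point, a per-Mkey value on the order of today's rewards prints as all zeros, while the same value per Gkey is visible (a sketch assuming a value near the 0.0000586 BTC/Gkey figure quoted later in the thread):

```python
# Why "0.000000 BTC/Mkey" showed up: the value underflows the
# six-decimal display, while the per-Gkey figure does not.
value_btc_per_mkey = 5.86e-8                     # assumed example value
value_btc_per_gkey = value_btc_per_mkey * 1000.0

print("%0.6f BTC/Mkey" % value_btc_per_mkey)  # 0.000000 BTC/Mkey
print("%0.6f BTC/Gkey" % value_btc_per_gkey)  # 0.000059 BTC/Gkey
```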
legendary
Activity: 1386
Merit: 1097
I just tried fizzisist's patch:

Quote
Total value for current work: 0.000000 BTC/Mkey

Actually, the current jobs don't look like they're worth the electricity ;-). However, I don't know if it is just a bug in the calculation or if the current work is really that hard/underpaid.
sr. member
Activity: 444
Merit: 313
Okay, after a small hiccup, the new version of the pool is up. You can see the lavishness of all the work items. I haven't implemented the total sum for the entire Pool due to a lack of time. I also did some code refactoring, but that's on the internal side of things.

https://vanitypool.appspot.com/availableWork

Hopefully more updates are coming this week, and I will probably do something for Bitcoin Friday...
sr. member
Activity: 444
Merit: 313
Well, I think I managed to implement the basic Lavishness display on the Test Pool:
https://vanitypooltest.appspot.com/availableWork

Now I need to make some sort of calculator page, some more metrics and yeah, checking vanity profitability should be a breeze.

And yeah, as per an old feature request, I will also be looking into some sane bounty levels for different pattern lengths. Generally, the Lavishness will need to be high enough, meaning it will vary based on the pattern complexity/difficulty and bounty.
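For a feel of how pattern length drives the required bounty, a common back-of-the-envelope estimate is that each extra Base58 character past the leading '1' multiplies the search space by about 58. This is only a rough guide, not the pool's or vanitygen's exact difficulty calculation:

```python
# Order-of-magnitude difficulty estimate for a case-sensitive Base58
# address prefix. Ignores the range effects that vanitygen's exact
# calculation handles, so treat it as a rough guide only.
def rough_difficulty(pattern):
    # pattern includes the leading '1', which is effectively free
    return 58 ** (len(pattern) - 1)

print(rough_difficulty("1abc"))  # 195112
```

So a sane bounty for a pattern one character longer should be very roughly 58 times larger to keep Lavishness comparable.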
hero member
Activity: 720
Merit: 525
What's the algorithm which selects the work? Currently my miner (HD5870) has about a 50% probability of solving its job within one day. Personally I'd prefer easier jobs even with lower income, just to have sane variance...

I think you would need to ask samr7 about it. As far as I know, the algorithm selects the most profitable job clusters (grouped by public key).

Btw, I don't see the expected income for this job anywhere; is there any chance to obtain it?

I'm working on that metric.

Yeah, oclvanityminer doesn't actually display it. The algorithm it uses is essentially the following (this is the code I used to produce the above value, in BTC/Gkey):

Code:
# Body of the value-calculation function (Python 2 era: urllib.urlopen,
# dict.iteritems). vanitycalc.getdiff comes from the surrounding project.
url = 'https://vanitypool.appspot.com/getWork'
try:
    urlfh = urllib.urlopen(url)
except:
    return 0
pubkeys = {}
for work in urlfh:
    # each line looks like pattern:pubkey:networkbyte:reward;
    work = work.strip().strip(';')
    pattern, pubkey, networkbyte, reward = work.split(':')
    try:
        difficulty = vanitycalc.getdiff([pattern], 0)
    except:
        return 0
    # accumulate reward/difficulty per pubkey, scaled to BTC/Gkey
    if pubkey in pubkeys:
        pubkeys[pubkey] += 1000000000.0 * float(reward) / float(difficulty)
    else:
        pubkeys[pubkey] = 1000000000.0 * float(reward) / float(difficulty)
# pick the pubkey cluster with the highest summed value
max_value = 0
for pubkey, value in pubkeys.iteritems():
    if value > max_value:
        max_value = value
return max_value

This means it calculates the reward/difficulty for each piece of work. If a single public key has more than one job associated with it, it sums the values (reward/difficulty) for all of those jobs. It then picks the group of work (sharing a common public key) with the highest sum.

You can calculate the expected earnings for your hardware by multiplying the number obtained from that calculation by the key rate for your hardware. Currently, the max_value is: 0.0000585876 BTC/Gkey. My 5870 does about 29 Mkey/s (or 104.4 Gkey/hr), so my expected earnings are:

0.0000585876 BTC/Gkey * 104.4 Gkey/hr = 0.0061165 BTC/hr = 0.1468 BTC/day
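The arithmetic above can be checked directly (same numbers as the post):

```python
# Expected earnings from the pool's current best work cluster.
value = 0.0000585876      # BTC/Gkey, the max_value quoted above
rate_mkeys = 29.0         # HD5870 key rate in Mkey/s

rate_gkey_hr = rate_mkeys * 3600 / 1000.0   # 104.4 Gkey/hr
btc_per_hr = value * rate_gkey_hr           # ~0.0061165 BTC/hr
btc_per_day = btc_per_hr * 24               # ~0.1468 BTC/day
print(round(btc_per_hr, 7), round(btc_per_day, 4))
```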
sr. member
Activity: 444
Merit: 313
What's the algorithm which selects the work? Currently my miner (HD5870) has about a 50% probability of solving its job within one day. Personally I'd prefer easier jobs even with lower income, just to have sane variance...

I think you would need to ask samr7 about it. As far as I know, the algorithm selects the most profitable job clusters (grouped by public key).

Btw, I don't see the expected income for this job anywhere; is there any chance to obtain it?

I'm working on that metric.
legendary
Activity: 1386
Merit: 1097
What's the algorithm which selects the work? Currently my miner (HD5870) has about a 50% probability of solving its job within one day. Personally I'd prefer easier jobs even with lower income, just to have sane variance...

Btw, I don't see the expected income for this job anywhere; is there any chance to obtain it?
sr. member
Activity: 444
Merit: 313
ThePiachu, if you'd like to display the current max_value on your page (or lavishness as you call it), I made a simple API for you to grab this. Anyone else is welcome to as well. The value here is shown in BTC/Gkey, assuming multiplicative mining (greatest sum of values for work available for each pubkey).

http://fizzisist.com/api/vanitypool-value-mult

Multiply this by your key generation rate in, for example, Gkey/hr and you get BTC/hr.

I am currently working on implementing this and a couple other things as well. I hope to get it done and running by tomorrow, but results may vary Wink.
hero member
Activity: 720
Merit: 525
ThePiachu, if you'd like to display the current max_value on your page (or lavishness as you call it), I made a simple API for you to grab this. Anyone else is welcome to as well. The value here is shown in BTC/Gkey, assuming multiplicative mining (greatest sum of values for work available for each pubkey).

http://fizzisist.com/mining-value/api/vanitypool-value-mult

Multiply this by your key generation rate in, for example, Gkey/hr and you get BTC/hr.

EDIT: Updated API URL.
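Consuming the API is a one-liner plus the multiplication described above. A sketch (the endpoint URL is the one from this post, and the assumption that it returns a bare decimal number is mine):

```python
import urllib.request  # modern Python; the era's code used urllib.urlopen


def btc_per_hour(value_btc_per_gkey, rate_gkey_per_hr):
    """Pool value (BTC/Gkey) times your key rate (Gkey/hr)."""
    return value_btc_per_gkey * rate_gkey_per_hr


def fetch_pool_value(url="http://fizzisist.com/mining-value/api/vanitypool-value-mult"):
    # Assumes the endpoint returns a bare decimal number as text.
    with urllib.request.urlopen(url) as fh:
        return float(fh.read())


# Offline example using the value quoted earlier in the thread:
print(round(btc_per_hour(0.0000585876, 104.4), 7))  # 0.0061165
```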
sr. member
Activity: 444
Merit: 313
It's so refreshing to see something nice and useful taking shape!

How many useless things have been taking shape recently? Wink
legendary
Activity: 916
Merit: 1003
It's so refreshing to see something nice and useful taking shape!
hero member
Activity: 720
Merit: 525
What is the difference in performance on average? We should probably be calculating both of these options, with another ratio factor. The client should decide which method to use. This could become important when scaling up, if there is a lot of available work from different pubkeys.

I have no idea what the difference in performance is, as I don't think anyone has implemented the other method yet. However, I'm afraid it might be dependent on the number of keys to check. I will add the columns in.

Cool. I added mine to the table. If anyone has data for their hardware and doesn't feel like editing the wiki, please at least post here!
sr. member
Activity: 444
Merit: 313
What is the difference in performance on average? We should probably be calculating both of these options, with another ratio factor. The client should decide which method to use. This could become important when scaling up, if there is a lot of available work from different pubkeys.

I have no idea what the difference in performance is, as I don't think anyone has implemented the other method yet. However, I'm afraid it might be dependent on the number of keys to check. I will add the columns in.
hero member
Activity: 720
Merit: 525
For multiplicative mining, one would take the biggest Lavishness sum among patterns sharing a single public key; for additive mining, one would add up the Lavishnesses of all patterns (but also keep in mind the different key generation rate of this method).

What is the difference in performance on average? We should probably be calculating both of these options, with another ratio factor. The client should decide which method to use. This could become important when scaling up, if there is a lot of available work from different pubkeys.

Add these numbers to the new wiki page, too! Tongue
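The two selection strategies quoted above can be sketched over a toy job list; the job values here are made up for illustration:

```python
# Sketch of the two strategies: jobs are (pubkey, reward, difficulty).
jobs = [
    ("pubA", 0.05, 1.0e9),   # made-up example jobs
    ("pubA", 0.02, 0.5e9),
    ("pubB", 0.08, 1.2e9),
]

# Sum reward/difficulty per pubkey, as in oclvanityminer's selection.
values = {}
for pubkey, reward, difficulty in jobs:
    values[pubkey] = values.get(pubkey, 0.0) + reward / difficulty

# Multiplicative mining: work only on the single best pubkey cluster.
mult_value = max(values.values())
# Additive mining: search all patterns at once (at a lower key rate,
# which this sketch does not model).
add_value = sum(values.values())

print(mult_value > 0 and add_value >= mult_value)  # True
```

The additive sum is always at least as large per key checked, so the deciding factor is how much the key rate drops when matching many patterns at once.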
sr. member
Activity: 444
Merit: 313
If you'd like to help this effort, please post your performance figures for your hardware (GPU model and hash/s and key/s).

This problem needs a wiki page Smiley :

https://en.bitcoin.it/wiki/Vanity_mining_hardware_comparison