
Topic: bitHopper: Python Pool Hopper Proxy - page 197. (Read 355823 times)

full member
Activity: 196
Merit: 100
July 13, 2011, 10:34:59 PM
Hi,

Statsdump is an intermediate step. It had a bug, which should now be fixed. On server change it dumps shares, server, and difficulty.

I'm trying to rig something up with btcguild. Then mtred, then we'll see.
EDIT: btcguild is rigged up. Add your API key to passwords.py or you will get my stats.
-c00w
sr. member
Activity: 302
Merit: 250
July 13, 2011, 10:16:23 PM
Do you have a plan for what will be logged? I have a suggestion: it would be good to log the number of shares sent to each pool, as well as which worker completed the work (maybe log the username of the worker that connected to the proxy with each share), to help with troubleshooting and statistics.

Log for each share:
Timestamp, pool, worker name, pool round shares (as per the API at the time the share was sent), difficulty

As for the hopping aspect log:
Timestamp, pool hopped to, pool share count

A log to a text file or two would be great for now, just so we can get it into a spreadsheet or parse it somehow with a different program.

That's what I'd like to see. Any other info that should be logged?

EDIT: Maybe for each share also record what block number we were on at the time, although now that I think about it, every share will end up being for a different block than the one it was submitted during...
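A per-share log like the one described could be as simple as appending CSV rows; a sketch (the log_share helper and field order are hypothetical, not part of bitHopper):

```python
import csv
import time

def log_share(pool, worker, round_shares, difficulty, path='shares.csv'):
    # One row per share: timestamp, pool, worker name, pool round shares, difficulty
    with open(path, 'a') as f:
        csv.writer(f).writerow([int(time.time()), pool, worker,
                                round_shares, difficulty])

log_share('btcguild', 'worker1', 12345, 1564057)
```

The resulting file loads straight into a spreadsheet or can be parsed by another program, as suggested above.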
hero member
Activity: 742
Merit: 500
July 13, 2011, 10:10:13 PM
OK, so maybe I'm just too impatient, but --statsdump creates an empty file and hasn't populated anything yet. How often does this file get written to?
legendary
Activity: 2618
Merit: 1007
July 13, 2011, 07:38:38 PM
Can anybody give an estimate on what % increase they are seeing in revenue through using this tool?
I got a crazy 200% increase as I hopped in on a very lucky round yesterday that gave nearly a day's earnings within an hour or so...

I'd need more stats to give a definite answer though. I just slapped something together in Excel/Google Spreadsheets for now, but then gave up, since it's useless after the next difficulty change anyway - reading directly from the APIs and dumping stats seems to be implemented already.

(Edit: @c00w: Of course feel free to include my code for triplemining, I just don't have a github account and am not that used to git anyways, so I just posted a diff instead of making a proper pull request)

Edit2: triplemining is not a Ponzi scheme, it is a one-level pyramid scheme with actually no gains for the operator - so not much better or worse than any other referral thingie out there. The other 1% that is distributed via a "lottery" just increases variance a bit... In the end you can just mine solo there without referring people and pay 1% fees (plus potentially a long wait until you get that other 1% back), or maybe even create two accounts (one referred by the other) and gain back a little bit of this 1%. The only thing that's not cool, imho, is the spam it produced lately; other than that it's a solid proportional pool just waiting to be hopped!
Writing this stuff is really easy, btw - just tell me/us a few more pools and I'll try to add them.
full member
Activity: 196
Merit: 100
July 13, 2011, 07:24:51 PM
Efficiency can mean a number of things. A lot of people use it to mean hashes per getwork request.

In terms of payouts, yes: expected payout = shares / difficulty * 50 for normal mining, i.e. payout per share = 50 / difficulty.

I would probably just display the efficiency as actual/estimated * 100.
sr. member
Activity: 302
Merit: 250
July 13, 2011, 07:22:30 PM
How is efficiency calculated? I was under the impression that one should expect to get:

(Payout per block / difficulty) * number of shares submitted     

Difficulty is the average number of shares needed to solve a block, so each share should be worth 50 BTC / 1,564,057 = 0.000031968 BTC? Please correct me if I'm wrong.

If that's right, then I would guess efficiency is how much you get over that expected amount?
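A quick check of that arithmetic, with the efficiency ratio on top (the shares_submitted and actual values here are made up for illustration):

```python
difficulty = 1564057
block_reward = 50.0

# Each share's expected value under proportional payout
per_share = block_reward / difficulty  # ~0.000031968 BTC, matching the figure above

# Expected payout for a batch of shares, and efficiency vs. what was actually paid
shares_submitted = 10000   # illustrative
actual = 0.40              # illustrative BTC received
expected = shares_submitted * per_share
efficiency = actual / expected * 100  # >100 means paid above expectation
```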
full member
Activity: 196
Merit: 100
July 13, 2011, 07:05:20 PM
Well, in terms of revenue it should mathematically be 28% more if you are hopping all the time. Earlier someone said they got ~20% more. In the worst case you'll get the same revenue you'd get normally and be stuck mining at Eligius all day.

Stats are slowly in the process of going up. There is a stat dump file which records shares, difficulty, and server.

I originally wanted the stats to be integrated into the server, but I might just write a program which allows you to enter your rewards from each server and calculates the efficiency.

EDIT: I did some quick calculations on btcguild. I have 9362 shares. I expected to get 0.299 BTC for those shares; I got 0.384. So I got about 28% more than I should have, which is actually pretty amazing.
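Those numbers check out (values copied from the post; formula as given earlier in the thread):

```python
difficulty = 1564057
shares = 9362
actual = 0.384  # BTC actually received

expected = shares * 50.0 / difficulty  # ~0.299 BTC expected
ratio = actual / expected              # ~1.28, i.e. about 28% above expectation
```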
full member
Activity: 168
Merit: 100
July 13, 2011, 03:45:09 PM
Can anybody give an estimate on what % increase they are seeing in revenue through using this tool?
member
Activity: 66
Merit: 10
July 13, 2011, 03:23:19 PM
I guess I am faster with my code!

Yep, but good to see we used the same methods.
full member
Activity: 196
Merit: 100
July 13, 2011, 02:10:01 PM
1) Arsbitcoin as the second backup?
Sweet. I'll add it.

EDIT: The website is broken and I can't log in, so I haven't added it. Will try later on a Windows machine.

2) Refactor code?
Yeah, I want to do it. I was going to move everything that's not explicitly pool-related out of pool.py and into bithopper.py and lp.py.

3) Pool share count?
I can reenable it.

4) dumping the stats to a file?
Yeah, I was going to try to make an SQLite database, but then, well, LP errors and crazy cycling happened.

EDIT: I added it as --statsdump

5) Triplemining?
Well, it seems a little bit like a Ponzi scheme. I have no issues hopping them, however.
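For the SQLite stats dump mentioned in point 4, a minimal sketch (the table layout is hypothetical, not bitHopper's actual schema):

```python
import sqlite3
import time

conn = sqlite3.connect(':memory:')  # a file path would persist across runs
conn.execute('CREATE TABLE IF NOT EXISTS stats '
             '(ts INTEGER, server TEXT, shares INTEGER, difficulty REAL)')

def dump_stats(server, shares, difficulty):
    # Record the share count, server, and difficulty on each server change
    conn.execute('INSERT INTO stats VALUES (?, ?, ?, ?)',
                 (int(time.time()), server, shares, difficulty))
    conn.commit()

dump_stats('btcguild', 9362, 1564057)
```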
legendary
Activity: 2618
Merit: 1007
July 13, 2011, 01:46:44 PM
I guess I am faster with my code!

in pools.py:
Code:
@@ -9,6 +9,7 @@ import sys
 import exceptions
 import optparse
 import time
+import re
 
 from twisted.web import server, resource
 from client import Agent
@@ -68,7 +69,11 @@ servers = {
         'bitp':{'shares': default_shares, 'name': 'bitp.it',
            'mine_address': 'pool.bitp.it:8334', 'user': bitp_user,
            'pass': bitp_pass, 'lag': False, 'LP': None,
-            'api_address':'https://pool.bitp.it/api/pool'}
+            'api_address':'https://pool.bitp.it/api/pool'},
+        'triplemining':{'shares': default_shares, 'name': 'triplemining.com',
+           'mine_address': 'eu.triplemining.com:8344', 'user': triplemining_user,
+           'pass': triplemining_pass, 'lag': False, 'LP': None,
+            'api_address':'https://www.triplemining.com/stats'}
         }
 
 current_server = 'btcg'
@@ -289,6 +294,14 @@ def bitclockers_sharesResponse(response):
     servers['bitclockers']['shares'] = round_shares
     log_dbg( 'bitclockers :' + str(round_shares))
 
+def triplemining_sharesResponse(response):
+    global servers
+    statpage = response
+    shares = re.search(r"<td>[0-9]*</td>", statpage).group(0)[4:-5]
+    round_shares = int(shares)
+    servers['triplemining']['shares'] = round_shares
+    log_dbg( 'triplemining :' + str(round_shares))
+
 def errsharesResponse(error, args):
     log_msg('Error in pool api for ' + str(args))
     log_msg(str(error))
@@ -305,7 +318,8 @@ def selectsharesResponse(response, args):
         'btcg':btcguild_sharesResponse,
         'eclipsemc':eclipsemc_sharesResponse,
         'miningmainframe':mmf_sharesResponse,
-        'bitp':bitp_sharesResponse}
+        'bitp':bitp_sharesResponse,
+        'triplemining':triplemining_sharesResponse}
     func_map[args](response)
     server_update()
And you need a triplemining_user and a triplemining_pass in the passwords file, of course.

The regex currently just takes the first number in a tag, which luckily is the current shares number. A bit safer would be to check for the "MINING" line and then take the next number or so... However, it works for the time being!
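The safer variant described - anchor on the "MINING" label and take the next number - might look like this (the HTML fragment is a made-up stand-in for the real stats page):

```python
import re

# Hypothetical fragment of the triplemining stats page
statpage = '<tr><td>MINING</td><td>12345</td></tr>'

# Find the MINING label, then grab the first run of digits after it
m = re.search(r'MINING.*?([0-9]+)', statpage, re.DOTALL)
round_shares = int(m.group(1)) if m else None
```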
member
Activity: 66
Merit: 10
July 13, 2011, 01:32:15 PM
I actually have a sharesResponse definition for triplemining. I used regex and I'm sure it would break if the layout changed at all, but it's functional. I'll post the code when I get home.

The main point of refactoring the code is that changes to the main pool.py routines wouldn't require merging those changes with my own. Maybe I'll see if I can do some of that myself and do a proper fork.
legendary
Activity: 2618
Merit: 1007
July 13, 2011, 01:00:37 PM
I think that would make it pretty easy for some of us that just play around with python to add a custom pool into the rotation.
It's the same difficulty as now, only that you have to write the function(s) in pool.py, not in a separate file.

There's not much to do anyway: a sharesResponse function, an entry in the dictionary, and the user/pass info.

I am writing one for triplemining right now; the only problem is that I need to load their stats page and scrape it manually (with Beautiful Soup it would be easier, but that adds another library - I'll try to do it with some regex kung-fu) instead of having it in JSON. That can all go in the sharesResponse though, so no big deal.
sr. member
Activity: 314
Merit: 251
July 13, 2011, 11:10:05 AM
Instead of, or in addition to, --disable, what about just evaluating the existence of the username and/or password? If it doesn't exist, ignore that pool.
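A sketch of that credential check (the servers layout mirrors the diff posted earlier; the empty-string convention for "not configured" is an assumption):

```python
servers = {
    'btcguild':     {'user': 'me.worker1', 'pass': 'x'},
    'triplemining': {'user': '', 'pass': ''},  # not configured
}

# Keep only pools whose username and password are both set
enabled = {name: info for name, info in servers.items()
           if info.get('user') and info.get('pass')}
```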
full member
Activity: 157
Merit: 101
July 13, 2011, 10:32:11 AM
Ah, of course.
member
Activity: 61
Merit: 10
July 13, 2011, 10:29:29 AM

[17:05:01] Error in pool api for eclipsemc
[17:05:01] [Failure instance: Traceback: : No JSON object could be decoded
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:1076:gotResult
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:1063:_inlineCallbacks
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:361:callback
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:455:_startRunCallbacks
--- ---
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:542:_runCallbacks
pool.py:309:selectsharesResponse
pool.py:250:eclipsemc_sharesResponse
/usr/lib/python2.7/json/__init__.py:326:loads
/usr/lib/python2.7/json/decoder.py:360:decode
/usr/lib/python2.7/json/decoder.py:378:raw_decode
]


I think this is caused by the fact that the Eclipse Mining website https://eclipsemc.com/ is not currently working 100%, so no API data is currently available from them. bitHopper sees this and is reporting it as an API error; expected behavior, I would say.
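One way to tolerate such outages is to guard the JSON decode and fall back to the last known share count (a sketch, not bitHopper's actual error handling; the round_shares key is hypothetical):

```python
import json

def parse_shares(body, previous):
    # Return the round-share count from an API response body,
    # keeping the previous value if the body isn't valid JSON
    try:
        return json.loads(body)['round_shares']
    except (ValueError, KeyError):
        return previous
```

With this, a maintenance page like `'<html>down</html>'` simply returns the previous count instead of raising "No JSON object could be decoded".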
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 13, 2011, 10:11:54 AM

All in all, I fear that only client-side stats from bitHopper will be useful (though they might not reflect reality 100%, as in the end only what reaches your wallet counts!). The ones from pools might be interesting for basic statistics, but you can't even see how lucky/unlucky you were each round (something I'm interested in). They could be used to verify and/or correct the local stats though.


A whole heap of Multipool is devoted to this. I couldn't do it, but Nick from Multiclone seems like a nice guy, and he's on the case and might be able to give a hand adapting it by explaining the bits to do with HTML scraping.
full member
Activity: 157
Merit: 101
July 13, 2011, 10:07:32 AM

[17:05:01] Error in pool api for eclipsemc
[17:05:01] [Failure instance: Traceback: : No JSON object could be decoded
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:1076:gotResult
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:1063:_inlineCallbacks
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:361:callback
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:455:_startRunCallbacks
--- ---
/usr/lib/python2.7/dist-packages/twisted/internet/defer.py:542:_runCallbacks
pool.py:309:selectsharesResponse
pool.py:250:eclipsemc_sharesResponse
/usr/lib/python2.7/json/__init__.py:326:loads
/usr/lib/python2.7/json/decoder.py:360:decode
/usr/lib/python2.7/json/decoder.py:378:raw_decode
]

donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 13, 2011, 09:43:00 AM
Edit:
It seems to me that bitHopper hops too early! According to https://forum.bitcoin.org/index.php?topic=3165.0 line 169 should read min_shares = difficulty*.435, not min_shares = difficulty*.40! We still lose some "hot" shares this way in the worst case.

The min_shares = difficulty*.435 figure comes from a special case of hopping from a 'contributed' model to a 'connected' model. It was a proof of concept, and I don't know how applicable the exact figure is to hopping multiple proportional pools. I was trying to figure it out, and then thought it might just be easier to run a simulation and vary the percentages to see which gave the best outcomes.

Unfortunately I can't find a good Poisson-distribution random number generator for non-integers (i.e. fractions of difficulty). Any ideas?
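One option: the number of shares a round takes is geometric with success probability 1/difficulty, and for large difficulty that is extremely well approximated by an exponential distribution with mean equal to the difficulty - which handles fractional values (like a 0.435 * difficulty threshold) naturally:

```python
import random

difficulty = 1564057.0

def round_length():
    # Shares needed to solve a block, approximated as
    # exponential with mean = difficulty
    return random.expovariate(1.0 / difficulty)

random.seed(1)
rounds = [round_length() for _ in range(100000)]
mean = sum(rounds) / len(rounds)  # should come out close to the difficulty
```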
member
Activity: 66
Merit: 10
July 13, 2011, 09:22:57 AM
Hey, c00w, what's the chance we could have the code refactored a bit so each pool has an include or something similar? Something along the lines of the user/pass, server definition, shares definition, etc. being in a single location? The reason I ask is that it would then be very easy to 'plug in' another pool just by adding a new .py file for that pool.

I think that would make it pretty easy for some of us that just play around with python to add a custom pool into the rotation.