
Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 426. (Read 2591964 times)

hero member
Activity: 798
Merit: 1000
I know this has been mentioned in passing or requested here a few times, but I didn't see any direct request for this feature on GitHub's issues list.
So I threw one over the fence..

Feature Request - status of Merged Mining connections #177

Quote
Right now the only feedback we get is a single line in the running stdio display that shows "Got new merged mining work!". This doesn't tell us anything other than maybe at least one of the merged mining connections is working. So if we have Namecoin, Devcoin, IXCoin and IOCoin, how do we know one or more of those aren't working correctly other than hitting a block and seeing reward in the wallet? This can take a long time and won't help us diagnose any issues with connecting to any of the alt chains.

It would be great to have in the running log display, API, or web display some continuous data on whether any/all the merged mining connections are working and identify each connection, i.e. Namecoin, Devcoin, IXCoin, etc.
sr. member
Activity: 434
Merit: 250
My node has been running for some weeks already and it is still not listed, so there's no automatic listing.

And the same goes for the listing on p2pools.org.

It would be nice to know how nodes get listed on those sites.

I believe the scanner is based on all of the addresses in data/coinname/addrs. You might try setting up a direct connection to the scanner's node with -n so it sees you for sure.
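For example, you'd add one extra -n argument to your normal startup command (the scanner hostname below is just a placeholder; substitute the real address of the scanner's node):

Code:
python run_p2pool.py -n scanner-node.example.org:9333

plus whatever other options you normally run with.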
legendary
Activity: 2912
Merit: 1060
How does one get a public node on to the p2pool list? http://p2pool-nodes.info/

Just keep your pool with open ports on 9333 and 9332 and make sure it gets both inbound and outbound connections.  Eventually it'll hit the list.  It may just take a day or so. Maybe two.

Mine is the one that says it's in Nutley NJ (it's actually not, but close enough).



My node has been running for some weeks already and it is still not listed, so there's no automatic listing.

It doesn't really matter
member
Activity: 112
Merit: 10
Just Fun!
How does one get a public node on to the p2pool list? http://p2pool-nodes.info/

Just keep your pool with open ports on 9333 and 9332 and make sure it gets both inbound and outbound connections.  Eventually it'll hit the list.  It may just take a day or so. Maybe two.

Mine is the one that says it's in Nutley NJ (it's actually not, but close enough).



My node has been running for some weeks already and it is still not listed, so there's no automatic listing.

And the same goes for the listing on p2pools.org.

It would be nice to know how nodes get listed on those sites.
member
Activity: 112
Merit: 10
I've been running p2pool for a while now; is there a known problem with memory usage?
My pool has been running for days with the same hashrate and at most 20 peer connections.
Memory stays at around 500 MB, but then it suddenly starts consuming more and more (4 GB+) without any increase in peers, hashrate, or users.

Version: 13.4-16-g5ee3172-dirty running on Ubuntu in litecoin mode.
hero member
Activity: 798
Merit: 1000
How does one get a public node on to the p2pool list? http://p2pool-nodes.info/

Just keep your pool with open ports on 9333 and 9332 and make sure it gets both inbound and outbound connections.  Eventually it'll hit the list.  It may just take a day or so. Maybe two.

Mine is the one that says it's in Nutley NJ (it's actually not, but close enough).
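If you want to double-check locally that both ports are actually listening while you wait for the scanner (a quick sketch assuming a Linux box with net-tools installed):

Code:
# should show a LISTEN entry for both the p2pool peer port (9333) and the worker/web port (9332)
netstat -tln | grep -E ':(9333|9332) '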

legendary
Activity: 1361
Merit: 1003
Don`t panic! Organize!
re: creating p2pools for alt currencies
(...)
You're right, I removed it on my fork:
https://github.com/Rav3nPL/p2pool-rav/commit/a19b69536f0e79869a863b6603459e1055441a1a
Will restart all my nodes one by one to make it happen.
hero member
Activity: 994
Merit: 1000
How does one get a public node on to the p2pool list? http://p2pool-nodes.info/
sr. member
Activity: 434
Merit: 250
I modify the 'punishing share' crap in my local copy so it doesn't automatically screw people over, but I suspect nobody else does this.

What specifically do you change and why?
hero member
Activity: 630
Merit: 501
Does P2Pool support MaxCoin?

I'm not sure what you use to mine it, whether it's GPU or ASIC?
zvs
legendary
Activity: 1680
Merit: 1000
https://web.archive.org/web/*/nogleg.com
re: creating p2pools for alt currencies

Normally I just set up a solo pool, but DOGE already had too much hashrate, so I used some p2pool network that I think had the identifier/prefixes set by rav3n?

I modify the 'punishing share' crap in my local copy so it doesn't automatically screw people over, but I suspect nobody else does this. If a share is found locally during this non-punish time, then all the other clients will build off of this new share, as it'll be the longest chain (and it'll be based on a valid block and thus be a valid share... err... unless some other block is found before another p2pool share).

95% of my orphans are caused by valid shares being orphaned due to new blocks being found; I suspect at least 10-15% of the total orphan/DOA rate of the pool as a whole is caused by this.

Please modify this for any future altcoin that has block times of 60-120 s or less.

Edit: actually I just went ahead and made it a solo pool; I think the lotto is more fun than getting valid shares orphaned over and over.
full member
Activity: 162
Merit: 100
You added average time to find a share for each miner!
I did a pull request at donSchoe/p2pool-vtc for it. I thought VTC could use a bit of an advantage with a nice & shiny p2pool  Grin
Thanks again for the guide...

Quote
I might need to move to your front end instead of the default extended.

I found this frontend when searching for something prettier than the original p2pool stats... I could push it to some repo as well, but it needs a bit more cleanup. There is still a bug where non-active miners are not deleted from the list.
sr. member
Activity: 434
Merit: 250
Needs the patch I posted above for the miner_last_difficulties.

@roy7: thanks a lot for the guide, I'll try this asap.

You added average time to find a share for each miner! I love that. Smiley I might need to move to your front end instead of the default extended.
sr. member
Activity: 434
Merit: 250
Note: I do not know if those coins are compatible for merged mining, so forgive me if I am trying to mine "apples and oranges" at the same time!

The only merge-minable scrypt coin I think exists is UnitedScryptCoin. Coins have to be designed to be merge-compatible, and almost none are, since such coins don't tend to have much real value of their own. (Namecoin's value comes from the other things it can do.)
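A quick way to check whether a given coin daemon even supports merged mining, before pointing p2pool at it, is to call getauxblock on its RPC port yourself. Here's a rough sketch (Python 2.7; the port and the rpcuser/rpcpassword values are placeholders for whatever is in that coin's .conf file). A -32601 "Method not found" reply is the same error p2pool prints and means the daemon has no merged mining support:

Code:
import base64, json, urllib2

# placeholders: match these to the coin daemon's rpcport / rpcuser / rpcpassword
url = 'http://127.0.0.1:9336/'
auth = base64.b64encode('rpcuser:rpcpassword')

req = urllib2.Request(url,
    json.dumps({'id': 0, 'method': 'getauxblock', 'params': []}),
    {'Content-Type': 'application/json', 'Authorization': 'Basic ' + auth})
try:
    # success means the daemon handed back aux work, i.e. merged mining is available
    print json.load(urllib2.urlopen(req))['result']
except urllib2.HTTPError, e:
    err = json.load(e).get('error') or {}
    print 'RPC error %s: %s' % (err.get('code'), err.get('message'))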
full member
Activity: 162
Merit: 100
Time to share for the node is simply calculated as (local_stats.attempts_to_share/local_hashrate). So you could just do (local_stats.attempts_to_share/miner_hashrate).

But this doesn't take the individual share difficulties into account, so there really should be a better way to calculate this Huh

To answer my own question:
Code:
miner_time_to_share = (local_stats.attempts_to_share / local_stats.miner_hash_rates[address]) *
  (local_stats.miner_last_difficulties[address] / global_stats.min_difficulty)

Needs the patch I posted above for the miner_last_difficulties.
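For anyone who wants the same per-miner estimate from the web API instead of inside the frontend, here's a rough sketch (Python 2.7; assumes the node's web interface is on localhost:9332 and falls back to the simpler node-wide estimate when the miner_last_difficulties patch isn't applied):

Code:
import json, urllib2

def get(path):
    # local_stats / global_stats are the endpoints discussed in this thread
    return json.load(urllib2.urlopen('http://127.0.0.1:9332/' + path))

local_stats = get('local_stats')
global_stats = get('global_stats')

for addr, rate in local_stats['miner_hash_rates'].items():
    if not rate:
        continue
    est = local_stats['attempts_to_share'] / rate
    last_diff = local_stats.get('miner_last_difficulties', {}).get(addr)
    if last_diff:
        # scale by the miner's current share difficulty relative to the pool minimum
        est *= last_diff / global_stats['min_difficulty']
    print '%s: ~%.0f s to a share' % (addr, est)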

@roy7: thanks a lot for the guide, I'll try this asap.
newbie
Activity: 7
Merit: 0
Hello guys, I am trying to merge-mine Litecoin and Feathercoin but I am getting this error all the time:

2014-02-12 18:25:06.003000 > Error while calling merged getauxblock on http://127.0.0.1:9336:
2014-02-12 18:25:06.033000 > Traceback (most recent call last):
2014-02-12 18:25:06.045000 >   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 577, in _runCallbacks
2014-02-12 18:25:06.059000 >     current.result = callback(current.result, *args, **kw)
2014-02-12 18:25:06.071000 >   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1155, in gotResult
2014-02-12 18:25:06.084000 >     _inlineCallbacks(r, g, deferred)
2014-02-12 18:25:06.097000 >   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1097, in _inlineCallbacks
2014-02-12 18:25:06.110000 >     result = result.throwExceptionIntoGenerator(g)
2014-02-12 18:25:06.136000 >   File "C:\Python27\lib\site-packages\twisted\python\failure.py", line 389, in throwExceptionIntoGenerator
2014-02-12 18:25:06.242000 >     return g.throw(self.type, self.value, self.tb)
2014-02-12 18:25:06.253000 > --- ---
2014-02-12 18:25:06.268000 >   File "C:\Users\Andradek\Documents\GitHub\p2pool-rav\p2pool\util\deferral.py", line 41, in f
2014-02-12 18:25:06.279000 >     result = yield func(*args, **kwargs)
2014-02-12 18:25:06.293000 >   File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 1097, in _inlineCallbacks
2014-02-12 18:25:06.294000 >     result = result.throwExceptionIntoGenerator(g)
2014-02-12 18:25:06.305000 >   File "C:\Python27\lib\site-packages\twisted\python\failure.py", line 389, in throwExceptionIntoGenerator
2014-02-12 18:25:06.319000 >     return g.throw(self.type, self.value, self.tb)
2014-02-12 18:25:06.331000 >   File "C:\Users\Andradek\Documents\GitHub\p2pool-rav\p2pool\util\jsonrpc.py", line 133, in _http_do
2014-02-12 18:25:06.344000 >     raise Error_for_code(resp['error']['code'])(resp['error']['message'], resp['error'].get('data', None))
2014-02-12 18:25:06.371000 > p2pool.util.jsonrpc.NarrowError: -32601 Method not found


Does anyone know how to solve this?

Sorry if this has been answered before, but I am lost here! Please help!

Note: I do not know if those coins are compatible for merged mining, so forgive me if I am trying to mine "apples and oranges" at the same time!

Thank you for your help,
sr. member
Activity: 434
Merit: 250
Question: could somebody point me to a tutorial or something on how to get this merged into the p2pool-rav repo with a proper pull request? Never done that Roll Eyes

I just went through this since I don't use github much myself. From http://git-scm.com/book/ch5-2.html I figured out to do this:

 2050  git clone https://github.com/forrestv/p2pool
 2051  cd p2pool
 2054  git checkout -b vardiffbyaddress
 2069  cd p2pool/
 2070  cp ~/p2pool/p2pool/work.py .
 2071  git diff
 2072  git commit -am 'set share diff target based on address hash rate'

Stop here and go to the repo on GitHub. Click Fork at the top right. Replace the URL on the next line with your fork's repo URL:

 2074  git remote add myfork https://github.com/roy7/p2pool
 2075  git push myfork vardiffbyaddress

2071 (the git diff) was just to check that the change made sense.

Now go to your fork's web page and you should see a "Compare & pull request" button lit up. Click it and put in your comments on the pull request to submit it to Rav3n for review.

Edit: Ahhh, that pubkey_hash_to_address... I wish I had known about it sooner. I've updated my pull request to use it.
full member
Activity: 162
Merit: 100
I might want to add "average time to share" per-worker sometime soon on my nodes.

Additional note (after trying to figure this out for my own status page): if you only want "time to share" for the miners, you can get it without any changes to the p2pool code. Time to share for the node is simply calculated as (local_stats.attempts_to_share/local_hashrate). So you could just do (local_stats.attempts_to_share/miner_hashrate).

But this doesn't take the individual share difficulties into account, so there really should be a better way to calculate this Huh
full member
Activity: 162
Merit: 100
I might want to add "average time to share" per-worker sometime soon on my nodes.

I tried to produce a diff. Actually it's only a few lines added to work.py and web.py. That gives you some additional data in ..../local_stats and ..../global_stats. You will have to integrate it with your web frontend, of course. Or I'd forward my index.html if you send me your mail address.

Code:
diff -ru p2pool-rav/p2pool/web.py /usr/local/p2pool.smc/p2pool/web.py
--- p2pool-rav/p2pool/web.py 2014-02-12 16:35:17.519474099 +0100
+++ /usr/local/p2pool.smc/p2pool/web.py 2014-02-11 19:57:17.917758829 +0100
@@ -99,11 +99,15 @@
        
         nonstale_hash_rate = p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, lookbehind)
         stale_prop = p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, lookbehind)
+        diff = bitcoin_data.target_to_difficulty(wb.current_work.value['bits'].target)
+
         return dict(
             pool_nonstale_hash_rate=nonstale_hash_rate,
             pool_hash_rate=nonstale_hash_rate/(1 - stale_prop),
             pool_stale_prop=stale_prop,
             min_difficulty=bitcoin_data.target_to_difficulty(node.tracker.items[node.best_share_var.value].max_target),
+            network_block_difficulty=diff,
+            network_hashrate=(diff * 2**32 // node.net.PARENT.BLOCK_PERIOD),
         )
    
     def get_local_stats():
@@ -130,6 +134,10 @@
        
         miner_hash_rates, miner_dead_hash_rates = wb.get_local_rates()
         (stale_orphan_shares, stale_doa_shares), shares, _ = wb.get_stale_counts()
+
+        miner_last_difficulties = {}
+        for addr in wb.last_work_shares.value:
+            miner_last_difficulties[addr] = bitcoin_data.target_to_difficulty(wb.last_work_shares.value[addr].target)
        
         return dict(
             my_hash_rates_in_last_hour=dict(
@@ -152,6 +160,7 @@
             ),
             miner_hash_rates=miner_hash_rates,
             miner_dead_hash_rates=miner_dead_hash_rates,
+            miner_last_difficulties=miner_last_difficulties,
             efficiency_if_miner_perfect=(1 - stale_orphan_shares/shares)/(1 - global_stale_prop) if shares else None, # ignores dead shares because those are miner's fault and indicated by pseudoshare rejection
             efficiency=(1 - (stale_orphan_shares+stale_doa_shares)/shares)/(1 - global_stale_prop) if shares else None,
             peers=dict(
diff -ru p2pool-rav/p2pool/work.py /usr/local/p2pool.smc/p2pool/work.py
--- p2pool-rav/p2pool/work.py 2014-02-12 16:35:17.519474099 +0100
+++ /usr/local/p2pool.smc/p2pool/work.py 2014-02-11 19:44:08.277761641 +0100
@@ -36,6 +38,7 @@
         self.removed_unstales_var = variable.Variable((0, 0, 0))
         self.removed_doa_unstales_var = variable.Variable(0)
        
+        self.last_work_shares = variable.Variable( {} )
        
         self.my_share_hashes = set()
         self.my_doa_share_hashes = set()
@@ -319,6 +322,12 @@
             self.current_work.value['subsidy']*1e-8, self.node.net.PARENT.SYMBOL,
             len(self.current_work.value['transactions']),
         )
+        #need this for stats
+        self.last_work_shares.value[bitcoin_data.pubkey_hash_to_address(pubkey_hash, self.node.net.PARENT)]=share_info['bits']
        
         ba = dict(
             version=min(self.current_work.value['version'], 2),

Question: could somebody point me to a tutorial or something on how to get this merged into the p2pool-rav repo with a proper pull request? Never done that Roll Eyes
member
Activity: 70
Merit: 10
Hey guys, I've released the proxypool that sits between p2pool and miners, ensuring smaller miners get paid accordingly. It's currently in a beta stage and fully open source; it'd be great if you guys can help test.

It's at http://proxypool.doge.st

Very cool. Wasn't sure what you were actually doing. So the proxy will mine p2pool using a local wallet payment address, and then do payouts itself as people get enough coin balance saved up to justify a payout?

Yep, we have a reddit thread up as well for people who want to test. http://www.reddit.com/r/dogemining/comments/1xo9yq/help_us_beta_test_a_p2pool_with_almost_zero/

I have a thread up as well https://bitcointalksearch.org/topic/ann-open-source-stratum-to-stratum-proxy-pool-461632