
Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 429. (Read 2591964 times)

full member
Activity: 160
Merit: 100
What is the time zone being displayed under the "Last Blocks" tab?  Is it in GMT?  The reason I ask is I see this for my node:
Code:
93499 Sun Feb 09 2014 08:40:00 GMT-0600 (Central Standard Time) 02ad14053894887b847d9060fbad4d070c2349b6eeb21a5ebd97ce329372bb6c

So is that saying that that block was found at 08:40 GMT and I need to use a -0600 offset for my time zone (which is Central btw)?
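A timestamp in that form already carries its own offset, so parsing it shows which instant it names. A minimal sketch (the split-and-parse logic assumes the format matches the example line exactly):

```python
from datetime import datetime, timezone, timedelta

# Timestamp string from the "Last Blocks" tab (example from the post above)
s = "Sun Feb 09 2014 08:40:00 GMT-0600"

# Split off the "GMT-0600" suffix and parse it as a UTC offset
stamp, offset = s.rsplit(" GMT", 1)
local = datetime.strptime(stamp, "%a %b %d %Y %H:%M:%S")
tz = timezone(timedelta(hours=int(offset[:3]), minutes=int(offset[0] + offset[3:])))
local = local.replace(tzinfo=tz)

print(local.astimezone(timezone.utc))  # 2014-02-09 14:40:00+00:00
```

In other words, "08:40 GMT-0600" is 08:40 local Central time (14:40 GMT); the display is already in the node's local zone, so no extra offset needs to be applied.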
sr. member
Activity: 434
Merit: 250
As an experiment I'm running this live on one of my VTC nodes now, after testing it out some. The share difficulties are being set exactly as expected; I'll check again in the morning, but it seems to be working great. Here is a diff for anyone's review/comments. Remember Python is whitespace-sensitive if you apply this to test it yourself. What it changes: the share target when sending out work is set based on the payment address's hash rate instead of the whole node's hash rate. This way each person/group mining to an address on a public node finds shares at the same difficulty as if they ran a local private p2pool, and doesn't get increased variance from the size of the public node.

Code:
diff --git a/p2pool/work.py b/p2pool/work.py
index e1c677d..285fa3e 100644
--- a/p2pool/work.py
+++ b/p2pool/work.py
@@ -245,12 +245,11 @@ class WorkerBridge(worker_interface.WorkerBridge):

         if desired_share_target is None:
             desired_share_target = 2**256-1
-            local_hash_rate = self._estimate_local_hash_rate()
-            if local_hash_rate is not None:
+            local_addr_rates = self.get_local_addr_rates()
+            local_hash_rate = local_addr_rates.get(pubkey_hash, 0)
+            if local_hash_rate > 0.0:
                 desired_share_target = min(desired_share_target,
                     bitcoin_data.average_attempts_to_target(local_hash_rate * self.node.net.SHARE_PERIOD / 0.0167)) # limit to 1.67% of pool shares by modulating share difficulty
-
-            local_addr_rates = self.get_local_addr_rates()
             lookbehind = 3600//self.node.net.SHARE_PERIOD
             block_subsidy = self.node.bitcoind_work.value['subsidy']
             if previous_share is not None and self.node.tracker.get_height(previous_share.hash) > lookbehind:

One weakness of the ADDR/1 workaround is it overrides vardiff completely and ignores the network.py dust threshold. The optimal solution is for every miner to run their own p2pool node.
sr. member
Activity: 434
Merit: 250
I realize that p2pool's intention is to treat each node as a single miner, and p2pool isn't intended to operate as a pseudo-public pool. So as I look through the code, I see how the 1.67% cap is applied to the node (not to individual connected miners), and that makes total sense in the context that a node is a single operation (an individual or group using p2pool with all of their own hardware as a replacement for solo mining).

However, to better support miners that want to use a public node for whatever reason I think it'd be good if that could be handled in a way that will, in effect, simulate the same result as if they were running a p2pool node of their own instead. Maybe as a command line option that is off by default so any changes make zero difference to existing operations.

Basically this comes down to making the share target for a miner (by which I mean a person or group with 1 or more physical mining devices) based on that miner, and the 1.67% cap on that miner. Not on the node as whole.

The key code in get_work currently is:

Code:
if desired_share_target is None:
    desired_share_target = 2**256-1
    local_hash_rate = self._estimate_local_hash_rate()
    if local_hash_rate is not None:
        desired_share_target = min(desired_share_target,
            bitcoin_data.average_attempts_to_target(local_hash_rate * self.node.net.SHARE_PERIOD / 0.0167)) # limit to 1.67% of pool shares by modulating share difficulty

However, we wouldn't want to just change local_hash_rate to be the miner whose new work is being assigned (the physical mining device with a connection to the pool); that would defeat the purpose of things like the 1.67% cap. What if, instead, it was based on the estimated hash rate of the destination payment address? So if I have 4 antminers all mining to ADDR_X, the target share rate is based on their combined speed, while someone else connecting their two antminers with ADDR_Y gets a lower target share rate. ADDR_X and ADDR_Y each have the 1.67% cap applied individually, etc. Someone operating a node now with dozens of pieces of equipment all paying to the same address would see zero change even if they did toggle this on. Individual miners on a public node would see reduced variance in their own shares, since the pool hash rate is taken out of the equation. They could do this by hand now with ADDR/1 (or, say, /.000001 for scrypt), but I think handling it automatically makes more sense (and keeps vardiff alive for miners too big for a fixed ADDR/1 difficulty to make sense).

The way I view this is that if ADDR_X and ADDR_Y were running their own nodes instead of connecting to a public node, their target share rates would be based on only their own hash rates anyway. The 1.67% would be applied to each of them individually (instead of all combined in the public node). By adjusting their target share rates only to their own speeds, it simulates them running their own nodes.
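The idea can be sketched in isolation. average_attempts_to_target below is modeled on the helper of the same name in p2pool's bitcoin/data.py, and SHARE_PERIOD / 0.0167 is the same 1.67% cap as in the diff earlier in the thread, but the whole block is a simplified illustration, not the actual patch:

```python
# Simplified sketch of per-address share targeting; names modeled on
# p2pool's work.py / bitcoin/data.py, values illustrative only.
SHARE_PERIOD = 30  # seconds between shares targeted by the chain (assumption)

def average_attempts_to_target(average_attempts):
    # A target T is expected to be hit roughly once per 2**256 / (T + 1)
    # hashes, so invert that relation and clip to the maximum target.
    return min(int(2**256 / average_attempts - 1), 2**256 - 1)

def desired_target_for_address(addr_hash_rate):
    # Unknown/zero-rate addresses get the easiest possible target; known
    # addresses are capped at ~1.67% of the chain (one share per ~60 periods).
    if addr_hash_rate <= 0:
        return 2**256 - 1
    return min(2**256 - 1,
               average_attempts_to_target(addr_hash_rate * SHARE_PERIOD / 0.0167))
```

A faster address gets a lower (harder) target, exactly as if it were the whole hash rate of a private node, while addresses are capped independently of each other and of the node's total rate.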

Thoughts?

TLDR: A small miner connecting to a busy public node has much higher variance than running a node of their own.
sr. member
Activity: 434
Merit: 250
Interesting. I'd always assumed small miners on public nodes have high variance simply because of share difficulty, but it's the node's size that drives it: the bigger the node is, the more it pushes up its miners' share targets so that the node as a whole hits its target share rate. As far as I understand it so far, anyway. This doesn't happen when a miner runs his own node, since you'd then be working at the minimum difficulty if you aren't finding enough shares.

Please see this post and my two followup replies:

https://bitcointalksearch.org/topic/m.5019286

For example I have a dozen of these for miners:

2014-02-08 19:06:36.628849 New work for worker! Difficulty: 0.002211 Share difficulty: 1.629049 Total block value: 50.017000 VTC including 14 transactions

And here is my ADDR/.000001 test:

2014-02-08 19:06:36.633762 New work for worker! Difficulty: 0.002211 Share difficulty: 0.054301 Total block value: 50.017000 VTC including 14 transactions

Share difficulty from the web interface is .054. So left at the defaults, I'd only be able to get shares onto the share chain about 1/30th as often?
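A quick sanity check on those two log lines (pure arithmetic on the quoted numbers): the ratio of the two share difficulties is the expected slowdown in share frequency at a fixed hash rate.

```python
node_diff = 1.629049   # share difficulty assigned by the busy public node
addr_diff = 0.054301   # share difficulty with the ADDR/.000001 override

# At the same hash rate, expected shares per hour scale inversely with
# difficulty, so the ratio is the slowdown factor.
ratio = node_diff / addr_diff
print(round(ratio))  # 30
```

So the default targeting would land shares roughly 30x less often than the overridden difficulty for this miner.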
legendary
Activity: 2968
Merit: 1198
Could you be any more vague? LOL

An example of people trying to deal with it.

http://www.reddit.com/r/Bitcoin/comments/1f4t2e/90_transaction_fee_how_can_this_be_avoided/ca6typi

You should google "bitcoin dust" and read up on how it works.

That post is slightly out of date as the current client removed the 0.01 output rule and reduced the transaction size from 10k to 3k but the principle is correct.
hero member
Activity: 616
Merit: 500
I got Satoshi's avatar!
After 21 Million BTC have been created; mining will move on to dustbusting Wink

1 dust = $1 million
Zactly! Plus the transaction fees once Bitcoin has taken over the world... Satoshi takes good care of miners Wink
legendary
Activity: 2912
Merit: 1060
After 21 Million BTC have been created; mining will move on to dustbusting Wink

1 dust = $1 million
hero member
Activity: 616
Merit: 500
I got Satoshi's avatar!
After 21 Million BTC have been created; mining will move on to dustbusting Wink
sr. member
Activity: 434
Merit: 250
Could you be any more vague? LOL

An example of people trying to deal with it.

http://www.reddit.com/r/Bitcoin/comments/1f4t2e/90_transaction_fee_how_can_this_be_avoided/ca6typi

You should google "bitcoin dust" and read up on how it works.
newbie
Activity: 23
Merit: 0
Has anyone noticed any correlation between how well distributed the p2pool hash rate is across p2pool nodes and how efficient smaller nodes are? I set up p2pool nodes for a few scrypt altcoins. For more popular coins with dozens of p2pool nodes, my node's efficiency is > 100%. For less popular coins where one p2pool node has more than 2/3 of the total p2pool hash rate, my node's efficiency is horrible - usually 60% - 80%.

My nodes all run on the same server with the same p2pool version. There seems to be a pattern: when one node has an overly large share of the hash rate, the smaller nodes suffer in efficiency. Or is this more likely just some other factor I'm overlooking? I'd love to hear from anyone who can prove/disprove this theory.
hero member
Activity: 616
Merit: 500
I got Satoshi's avatar!
There was an article posted in December of last year, that says in it:

"P2pool is technically a mining pool, but one that acts like solo mining in terms of the end user's view. You enter the pool using your wallet's address, rather than signing up to get an account, and the payments are automatically paid out once blocks are found by anyone on the pool. This cuts down on the middle man and also reduces the time it takes to get paid, although it does bring up another issue: you will get a ton of payments out of it. With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Does anyone know what this comment means? "With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Here is the original article: http://www.michaelnielsen.org/ddi/how-the-bitcoin-protocol-actually-works/

Thanks,
I think that's a reference to dust...

Could you be any more vague? LOL


Basically, tiny transactions that cost more to send than they are worth and clutter the blockchain... I know more space was added to [edit]each block[/edit] for free transactions, so I'm not sure if that affects it at all.
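The byte growth the article describes can be approximated with the classic back-of-the-envelope size formula for pay-to-pubkey-hash transactions (roughly 148 bytes per input, 34 per output, ~10 of overhead); this is an estimate, not an exact serialization, but it lands right at the article's "around 230" figure:

```python
def estimate_tx_size(n_inputs, n_outputs=2):
    # Rough pay-to-pubkey-hash serialization sizes: ~148 bytes per input
    # (outpoint + signature script), ~34 per output, ~10 bytes of overhead.
    return n_inputs * 148 + n_outputs * 34 + 10

# One payout spent vs. 50 small p2pool payouts spent at once
print(estimate_tx_size(1))   # 226 bytes -- the article's "around 230"
print(estimate_tx_size(50))  # 7478 bytes
```

Every p2pool payout you receive is a separate output you'll later have to spend as a separate input, so the more (and smaller) payouts accumulate, the bigger and more fee-expensive the eventual spending transaction becomes.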
hero member
Activity: 630
Merit: 501
There was an article posted in December of last year, that says in it:

"P2pool is technically a mining pool, but one that acts like solo mining in terms of the end user's view. You enter the pool using your wallet's address, rather than signing up to get an account, and the payments are automatically paid out once blocks are found by anyone on the pool. This cuts down on the middle man and also reduces the time it takes to get paid, although it does bring up another issue: you will get a ton of payments out of it. With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Does anyone know what this comment means? "With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Here is the original article: http://www.michaelnielsen.org/ddi/how-the-bitcoin-protocol-actually-works/

Thanks,
I think that's a reference to dust...

Could you be any more vague? LOL

hero member
Activity: 616
Merit: 500
I got Satoshi's avatar!
There was an article posted in December of last year, that says in it:

"P2pool is technically a mining pool, but one that acts like solo mining in terms of the end user's view. You enter the pool using your wallet's address, rather than signing up to get an account, and the payments are automatically paid out once blocks are found by anyone on the pool. This cuts down on the middle man and also reduces the time it takes to get paid, although it does bring up another issue: you will get a ton of payments out of it. With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Does anyone know what this comment means? "With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Here is the original article: http://www.michaelnielsen.org/ddi/how-the-bitcoin-protocol-actually-works/

Thanks,
I think that's a reference to dust...
hero member
Activity: 630
Merit: 501
There was an article posted in December of last year, that says in it:

"P2pool is technically a mining pool, but one that acts like solo mining in terms of the end user's view. You enter the pool using your wallet's address, rather than signing up to get an account, and the payments are automatically paid out once blocks are found by anyone on the pool. This cuts down on the middle man and also reduces the time it takes to get paid, although it does bring up another issue: you will get a ton of payments out of it. With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Does anyone know what this comment means? "With each new payment you get, you are potentially adding some bytes (around 230) to your transaction size for when you send coins to someone. As this number grows, the cost of sending the coins also grows along with it."

Here is the original article: http://www.michaelnielsen.org/ddi/how-the-bitcoin-protocol-actually-works/

Thanks,
newbie
Activity: 1
Merit: 0
Newbie Question......

After 20 hrs of running at 30 GH/s, I see no local efficiency / no shares in the p2pool console, and my DOA is about 5% above the pool rate.
Is that normal, and should I keep it on?
I stopped at this level thinking the setup had some problem.
Is there a way to check that my setup is doing some real work in the pool?

Thanks in advance
sr. member
Activity: 434
Merit: 250
Does anyone know how to actually add alt coins to the P2pool system?  I'd like to setup a node for DigiByte coins but I really can't seem to figure out how to add it.

https://bitcointalksearch.org/topic/m.4998082

Should help. It'd be nice if coin developers would release p2pool settings when they release their coins.
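For the record, adding a coin means appending an entry to both p2pool/bitcoin/networks.py (how p2pool talks to the coin daemon) and p2pool/networks.py (the coin's share-chain parameters). A heavily abridged, hypothetical sketch follows; the field names mirror the existing coin entries in those files, but every value below is a placeholder you must replace with the coin's real parameters:

```python
# Hypothetical, abridged entry for p2pool/bitcoin/networks.py (Python 2,
# as p2pool itself is). ALL values are placeholders.
digibyte = math.Object(
    P2P_PREFIX='fcc1b7dc'.decode('hex'),  # placeholder: coin's p2p magic bytes
    P2P_PORT=12024,                       # placeholder: coin's p2p port
    ADDRESS_VERSION=30,                   # placeholder: base58 version byte
    RPC_PORT=14022,                       # placeholder: coin daemon RPC port
    BLOCK_PERIOD=60,                      # placeholder: target block time (s)
    SYMBOL='DGB',
)
# A matching share-chain entry (SHARE_PERIOD, SPREAD, IDENTIFIER, PREFIX,
# BOOTSTRAP_ADDRS, ...) also has to be added to p2pool/networks.py.
```

The safest approach is to copy an existing scrypt coin's entry in both files and substitute the new coin's values field by field.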
full member
Activity: 160
Merit: 100
Does anyone know how to actually add alt coins to the P2pool system?  I'd like to setup a node for DigiByte coins but I really can't seem to figure out how to add it.
sr. member
Activity: 434
Merit: 250
Do nodes added with -n count on top of the maximum outgoing connections? The wiki made me think so, but I added 4 and "Out" is still sitting at 6.

Also, if/when a hard fork happens for a new major p2pool version, would a larger SPREAD make sense? That would reduce variance for smaller miners. Larger miners can just kick difficulty up higher with address/ if they don't want (possibly) more frequent payments.
hero member
Activity: 994
Merit: 1000
Hi guys, not sure if this is the right place to announce this, but since it's a new p2pool node I thought it might be a good place.

I've recently launched a bitcoin mining p2pool node in Panama, Central America, that pays bonus devcoins.

http://www.blisterpool.com

1% pool fee, using PPLNS (of course)
register bitcoin/devcoin address pair on site
connect with bitcoin address
auto-paid bitcoins and merge mined devcoins to the address pair, along with a 0.2% devcoin bonus
It also pays a 0.2% bonus to open source developers on the bitcoin and devcoin share lists.

I love the front-end you did with the exception of the status page which is obviously the extended front-end. You should stick a miner on your node/pool so that it doesn't look bare. Smiley

Thanks, yeah I was using the extended front end until I can get in there and get the data out myself. I only have weaksauce hashing power myself atm, but I'll follow your suggestion.
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
Just curious, but why do you need to download p2pool software to join the pool?
So that you aren't among the high-DOA/reject-% miners whose lost income boosts the payments of all the low-DOA/reject-% users.

p2pool by design ensures that the lower your average reject % the more profit you'll take from other p2pool miners.

Since everyone has network latency and hardware latency, it's those who manage to get them the lowest who gain from the rest of the miners.

Think of it this way: if your DOA/orphan % is average (at the moment 14%), then 14% of your work is thrown away.
Now if everyone were at 14%, that would simply be wasted electricity; but if some are at 7% or even lower, they are earning a better share of each block (relative to their hash rate) than you are. Pity the sods (more than half the pool) who are mining above 14% ...
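A toy calculation makes the effect concrete (illustrative numbers only, not measured pool data): with payout proportional to shares that actually make it onto the chain, equal hash rates but unequal DOA rates split the reward unequally.

```python
# Two miners with equal hash rate but different DOA/orphan rates.
# Payout on p2pool is proportional to shares that land on the share chain.
hash_rate = 1.0
doa = {'careful': 0.07, 'average': 0.14}

good_shares = {name: hash_rate * (1 - rate) for name, rate in doa.items()}
total = sum(good_shares.values())

for name, shares in good_shares.items():
    print(name, round(shares / total, 4))
# careful -> ~0.5196 of the reward, average -> ~0.4804,
# despite contributing identical raw hash rate
```

The 7%-DOA miner quietly takes about 2% of the block reward away from the 14%-DOA miner every block, which is exactly the "profit from other p2pool miners" described above.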