Author

Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 490. (Read 2591928 times)

hero member
Activity: 737
Merit: 500
And is it fair to let a miner mine blocks that include previous shares from others? I mean, the work the others did before the actual miner found the real Bitcoin block is lost then, right?

Yes, but that is the entire point of pooled mining.  Miners agree to split the reward for a block with those that were helping to find blocks at the same time even though those other miners failed to find anything.

The block reward can't be sent to some centralized server first that then determines how many BTC every miner gets, right?

That is how a normal pool works, and is entirely viable, but it's also contrary to the entire point of p2pool which is to avoid having any centralized server that everyone depends on and has to trust.
hero member
Activity: 896
Merit: 1000
I have some questions regarding p2pool. Sorry if this was already asked and I didn't find the answer.

I read that previously found shares from other miners are included in the block as transactions

To clarify (hopefully): for each destination address with at least one share in the sharechain, there's an output in the coinbase transaction provided to miners by all p2pool nodes. If an address got 100 valid shares, there's still only one coinbase output.

so that every miner gets his reward for the actually found Bitcoin block. But is the reward of a newly found block already splittable?

The reward is automatically split: valid shares on p2pool are the result of work on a coinbase that splits the block reward proportionally to each address's effort (as measured by the number of shares computed by each address).
A p2pool block is simply a share hitting the Bitcoin network difficulty, so it has the same properties: it automatically splits the reward.

And is it fair to let a miner mine blocks that include previous shares from others? [...]

Yes, that's PPLNS, look it up (several pools use it, not only p2pool).

I often read about problems with the time in p2pool because of the setup of p2pool. Is this a serious factor? I mean, compared to mining at normal pools or solo mining, since all p2pool miners would have the same chance.

I don't understand.

When mining bitcoins one has to run a bitcoin server. But what about merged mining? That's possible with p2pool too, as I read. So does one have to run a wallet server for each coin one is merge mining? That would mean p2pool could only run on Linux when merge mining, since many merge-mineable coins only have Linux wallets. Right?

Indeed you'll need a node for each coin you want to merge mine. It doesn't need to be a local one though, so you may run p2pool on a Windows server and access nodes running on Linux.
I don't see the point of using Windows myself (especially when revenue depends on it) so I wouldn't be of any help here sorry.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
I have some questions regarding p2pool. Sorry if this was already asked and I didn't find the answer.

I read that previously found shares from other miners are included in the block as transactions so that every miner gets his reward for the actually found Bitcoin block. But is the reward of a newly found block already splittable? And is it fair to let a miner mine blocks that include previous shares from others? I mean, the work the others did before the actual miner found the real Bitcoin block is lost then, right? The block reward can't be sent to some centralized server first that then determines how many BTC every miner gets, right?

I often read about problems with the time in p2pool because of the setup of p2pool. Is this a serious factor? I mean, compared to mining at normal pools or solo mining, since all p2pool miners would have the same chance.

When mining bitcoins one has to run a bitcoin server. But what about merged mining? That's possible with p2pool too, as I read. So does one have to run a wallet server for each coin one is merge mining? That would mean p2pool could only run on Linux when merge mining, since many merge-mineable coins only have Linux wallets. Right?
legendary
Activity: 1420
Merit: 1010
Loving the blocks found over the last 2 days on p2pool; it's been epic!

Also I have a 0%-fee p2pool BTC pool almost ready, with merged-mining coins Namecoin, Ixcoin, Devcoin and hopefully a few others coming soon...

http://fuzzypool.mine.bz/

It should be ready for beta testers really soon, but have a look and please PM me any questions / opinions / thoughts or suggestions

Many thanks

FuzzyBear
sr. member
Activity: 448
Merit: 250
I'm pleased to announce that I've released my modified low-rate friendly P2Pool bitcoin node at cryptominer.org:9332 for general usage:

Announcement: https://bitcointalksearch.org/topic/22thann-australian-p2pool-backed-low-rate-friendly-pool-280780
Website: http://cryptominer.org/bitcoin/

This node should allow miners with a low hash rate (e.g. < 1 Gh/s) to receive a consistent reward for their hash rate, instead of the high variance of an expected time-to-share of tens of hours or days.
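
To put rough numbers on that variance point, here is a back-of-the-envelope sketch. It assumes one difficulty-1 share takes about 2**32 hashes on average; the 500 Mh/s rate and difficulty-1000 share target are illustrative values, not this node's actual settings:

```python
# Hedged sketch: expected time for a miner to find one p2pool share.
# Assumes one difficulty-1 share takes ~2**32 hashes on average; the
# hashrate and share difficulty below are illustrative values only.

def expected_seconds_per_share(hashrate_hs: float, share_difficulty: float) -> float:
    """Average seconds between shares at a given hashrate and share difficulty."""
    return share_difficulty * 2**32 / hashrate_hs

# An illustrative 500 Mh/s miner facing a difficulty-1000 share target:
hours = expected_seconds_per_share(500e6, 1000) / 3600  # ~2.4 hours per share
# At tens of Mh/s the same target stretches into days, hence the variance.
```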
zvs
legendary
Activity: 1680
Merit: 1000
https://web.archive.org/web/*/nogleg.com
No, you set the difficulty in advance.  If you aim for a difficulty 1 share, but get a hash that is good enough for a difficulty 10 share, it still only counts as a difficulty 1 share.

When you get a share, it gets added to the share chain and when the next block is hit, you will get a share of that block's payout.

OK, that makes sense.  I thought that p2pool paid out each successful share based on the difficulty of that share itself (how close to zero the hash was), not based on the target difficulty.  Interesting how there's still so much variance in the payout table, then: each payout is slightly different, they don't quantize themselves into recognizable patterns.

Since there's only 8640 shares, I would think that it would be easy to see groupings in the payout table, but instead, each number is slightly different.  Does the target difficulty continuously update itself and change a little each time?  Maybe that is why.

yes, the difficulty target adjusts based on that particular pool's hashrate

not sure when the source starts calling for the actual share difficulty to go up, but the pseudo-shares used to measure your hash rate start to increase in difficulty somewhere around 5-10 Ghash?

Blah/50000+5

would set difficulty-5 pseudo-share targets for showing hash rate on the graph, and a 50000-difficulty target for you to get a portion of the block (assuming your hash rate wasn't already high enough to push it above those numbers)
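
That worker-name convention can be illustrated with a tiny parser. This is a hypothetical sketch, not p2pool code, and `parse_worker` only handles the "ADDR/S+P" order shown above, where "/S" sets the real share difficulty and "+P" the pseudo-share difficulty:

```python
import re

# Hypothetical sketch of the worker-name options discussed above:
# "ADDR/S" sets the minimum real-share difficulty to S, "ADDR+P" sets the
# pseudo-share (stats-only) difficulty to P. Only the "ADDR/S+P" order shown
# in the post is handled; real p2pool parsing may differ.

def parse_worker(name: str):
    m = re.fullmatch(r"([^/+]+)(?:/([\d.]+))?(?:\+([\d.]+))?", name)
    if m is None:
        raise ValueError("unrecognized worker name: %r" % name)
    addr, share, pseudo = m.groups()
    return (addr,
            float(share) if share else None,    # real share difficulty target
            float(pseudo) if pseudo else None)  # pseudo-share difficulty

assert parse_worker("Blah/50000+5") == ("Blah", 50000.0, 5.0)
```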

long polling still opens just as many connections; it may reduce bandwidth usage for stratum, but stratum has never worked properly for me on p2pool.  oh, the new non-punish policy is great; it would be the slow node's fault if something got orphaned anyway

http://www.nogleg.com:9332/static/share.html#000000000000ce061ad134ff10377613d98ae66dc1fc065311c8e039600a63c9

correct share, orphaned:  http://www.nogleg.com:9332/static/share.html#000000000000386801f1d4782ab9931c9a4c63c6eec24c6c4eaa2ba903e03b48

incorrect share, not orphaned:  http://www.nogleg.com:9332/static/share.html#00000000000166aebdbc7be75dfec1657a15706e21e2ad89bbb13cfa7b0ec5e2

but note it came over a minute earlier, so this reduces your risk of punishing yourself
member
Activity: 106
Merit: 10
No, you set the difficulty in advance.  If you aim for a difficulty 1 share, but get a hash that is good enough for a difficulty 10 share, it still only counts as a difficulty 1 share.

When you get a share, it gets added to the share chain and when the next block is hit, you will get a share of that block's payout.

OK, that makes sense.  I thought that p2pool paid out each successful share based on the difficulty of that share itself (how close to zero the hash was), not based on the target difficulty.  Interesting how there's still so much variance in the payout table, then: each payout is slightly different, they don't quantize themselves into recognizable patterns.

Since there's only 8640 shares, I would think that it would be easy to see groupings in the payout table, but instead, each number is slightly different.  Does the target difficulty continuously update itself and change a little each time?  Maybe that is why.
hero member
Activity: 737
Merit: 500
p2pool allows you to submit lower-difficulty shares to make its stats work, showing how fast it thinks you are hashing

That is what the +x difficulty is all about: to set it above the old default difficulty of 1

Yes, but in the context of this discussion I think we're talking about the '/x' difficulty username option (vs '+x') which actually does allow you to change the target for the real p2pool shares and not just the pseudo shares (as long as you choose something higher than what p2pool network requires by default).
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
Statistics fail + p2pool understanding fail

p2pool understanding fail
If you submit a share - it counts for nothing at all unless it is above p2pool difficulty - currently 25500 as I type this

p2pool allows you to submit lower-difficulty shares to make its stats work, showing how fast it thinks you are hashing

That is what the +x difficulty is all about: to set it above the old default difficulty of 1

So if you submit a share below around 25000 diff, but >= x, you will never get anything for it except a counter saying you found it.

Statistics fail
The simplest explanation is to see the fail

If I submit shares at 10 difficulty, I will find on average 1/10 of the shares as if I was submitting shares at 1 difficulty.

Thus each share >= 10 difficulty is worth 10x as much as a 1 difficulty share.

The share difficulty itself is irrelevant except for the fact that it is >= 10 (... unless you mine ozcoin PoT ...)
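
The statistics argument above can be checked with a quick simulation. This is a hedged sketch with made-up numbers: the per-hash share probability below is arbitrary, chosen only to make the 10:1 ratio visible.

```python
import random

# Sketch of the statistics argument above: mining against a difficulty-10
# target finds ~1/10 as many shares as difficulty 1, but each counts for 10,
# so total credited work is the same in expectation. The per-hash share
# probability here is an arbitrary illustrative number, not a real one.

random.seed(42)
TRIALS = 1_000_000
P_DIFF1 = 1 / 250  # assumed chance per hash of meeting a difficulty-1 target

diff1_shares = sum(1 for _ in range(TRIALS) if random.random() < P_DIFF1)
diff10_shares = sum(1 for _ in range(TRIALS) if random.random() < P_DIFF1 / 10)

credit_at_1 = diff1_shares * 1     # each share worth 1
credit_at_10 = diff10_shares * 10  # ~10x fewer shares, each worth 10
# Both land near TRIALS * P_DIFF1 = 4000; only the variance differs.
```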
legendary
Activity: 1232
Merit: 1094
Interesting.  I thought p2pool already paid out on a prorated basis, based on how difficult your share was, regardless of what the target was? 

No, you set the difficulty in advance.  If you aim for a difficulty 1 share, but get a hash that is good enough for a difficulty 10 share, it still only counts as a difficulty 1 share.

When you get a share, it gets added to the share chain and when the next block is hit, you will get a share of that block's payout.

Quote
Let's say I submit 6 shares, at these difficulties: 1, 1, 1, 1000, 1000, 1000.

That will pay me about 3003 difficulties worth of work.  Now, let's say I change my minimum difficulty to 500.  Instead of submitting 6 shares, I submit only 3 shares, at these difficulties: 1000, 1000, 1000.  This only pays me about 3000 difficulties worth of work, not 3003.  So I have lost a little income here.  In case there's something I'm missing.

If you set the difficulty target at 1, you will get around 1 share for every 4 billion hashes and each one will be worth 1 share.  However, some will have a hash that is low enough to be a valid difficulty 10 share. 

If you set the difficulty target to 10, you get 1 share for every 40 billion hashes, but each one of those shares would count for 10.

What happens with hashing is that you (in effect) pick a random number between 0 and a huge number (2 to the power of 256).  You need to get a result lower than the target.  A difficulty 10 share has a target that is 10 times lower than a difficulty 1 share.  This means you are 10 times less likely to hit it.
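
The target arithmetic above can be written down directly. This is a minimal sketch; `MAX_TARGET` is simplified to 2**256 rather than Bitcoin's exact maximum target:

```python
# Minimal sketch of the target math described above. A hash behaves like a
# uniform random 256-bit number; it "hits" when it falls below
# target = MAX_TARGET / difficulty, so doubling the difficulty halves the
# probability. MAX_TARGET is simplified to 2**256 here.

MAX_TARGET = 2**256

def hit_probability(difficulty: float) -> float:
    target = MAX_TARGET / difficulty
    return target / MAX_TARGET  # reduces to 1 / difficulty

assert hit_probability(1) == 1.0
assert abs(hit_probability(10) - 0.1) < 1e-12  # 10x harder to hit
```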
hero member
Activity: 516
Merit: 643
You have to decide what the target is.

For example, if you do 1 trillion hashes, you will get approx

25 difficulty 10 shares

250 difficulty 1 shares (includes the 25 difficulty 10 shares)

Interesting.  I thought p2pool already paid out on a prorated basis, based on how difficult your share was, regardless of what the target was?  If you just barely limp in over the minimum required difficulty, you get paid a small amount, but if you find a massively high difficulty share (such as a share that's good enough to become a real Bitcoin block) you get paid that much more.  You don't get paid the same amount per share, I have noticed.

Didn't know it took into account the ability to optionally choose a higher minimum required difficulty, and increased the payout ratio proportionally, to compensate you for your loss from all the little low shares that you've now chosen to throw away instead of submit.  What's the ratio that it uses?

Let's say I submit 6 shares, at these difficulties: 1, 1, 1, 1000, 1000, 1000.

That will pay me about 3003 difficulties worth of work.  Now, let's say I change my minimum difficulty to 500.  Instead of submitting 6 shares, I submit only 3 shares, at these difficulties: 1000, 1000, 1000.  This only pays me about 3000 difficulties worth of work, not 3003.  So I have lost a little income here.  In case there's something I'm missing.

Anybody have the exact formula?

Josh


How "difficult" a share is depends only on its target. Any share mined around the same time will yield the same contribution to your payout. I say "around the same time", because the minimum difficulty changes to keep it at one share every thirty seconds, and that will affect a share's reward.
member
Activity: 106
Merit: 10
You have to decide what the target is.

For example, if you do 1 trillion hashes, you will get approx

25 difficulty 10 shares

250 difficulty 1 shares (includes the 25 difficulty 10 shares)

Interesting.  I thought p2pool already paid out on a prorated basis, based on how difficult your share was, regardless of what the target was?  If you just barely limp in over the minimum required difficulty, you get paid a small amount, but if you find a massively high difficulty share (such as a share that's good enough to become a real Bitcoin block) you get paid that much more.  You don't get paid the same amount per share, I have noticed.

Didn't know it took into account the ability to optionally choose a higher minimum required difficulty, and increased the payout ratio proportionally, to compensate you for your loss from all the little low shares that you've now chosen to throw away instead of submit.  What's the ratio that it uses?

Let's say I submit 6 shares, at these difficulties: 1, 1, 1, 1000, 1000, 1000.

That will pay me about 3003 difficulties worth of work.  Now, let's say I change my minimum difficulty to 500.  Instead of submitting 6 shares, I submit only 3 shares, at these difficulties: 1000, 1000, 1000.  This only pays me about 3000 difficulties worth of work, not 3003.  So I have lost a little income here.  In case there's something I'm missing.

Anybody have the exact formula?

Josh
legendary
Activity: 1232
Merit: 1094
It seems to me that all you would be doing by increasing your difficulty would be to forfeit some potential earnings from weak shares, by throwing those shares away instead of submitting them to the sharechain, unless there's something I'm missing.

You have to decide what the target is.

For example, if you do 1 trillion hashes, you will get approx

25 difficulty 10 shares

250 difficulty 1 shares (includes the 25 difficulty 10 shares)

You have to decide in advance what your target is.  You can set it to anything you want, as long as it is higher than the minimum required.

If you set it to 10, then only the 25 difficulty 10 shares count, so you get 25 shares worth 10 each.

If you set it to 1, then the 250 difficulty 1 shares all count, so you get 250 shares worth 1 each.  The 25 difficulty 10 shares also count, but since you set them to difficulty 1, you can't claim a reward of 10.

In other words, the "value" of a share is not fixed.  It's based on the target difficulty for that share relative to the target difficulties of all the other shares that make up that block.  So if you opt in to a difficulty that is 2x as high, you get paid 2x as much for that share.

Right, and you have to pick the target before you run the hash, so you can't change it afterwards.
hero member
Activity: 737
Merit: 500

That's something I don't get: how voluntarily increasing your difficulty will help you.

Wouldn't you want to claim as many shares as possible, even if they are small amounts?


You will get paid more for the higher difficulty shares.  It will reduce your bandwidth usage if you are putting out a lot of hash power.

In other words, the "value" of a share is not fixed.  It's based on the target difficulty for that share relative to the target difficulties of all the other shares that make up that block.  So if you opt in to a difficulty that is 2x as high, you get paid 2x as much for that share.
sr. member
Activity: 434
Merit: 250
That's something I don't get: how voluntarily increasing your difficulty will help you.

Wouldn't you want to claim as many shares as possible, even if they are small amounts?

Since each share you find increases your payout allocation within p2pool's 8640-share window, it should always be beneficial to add more shares, right?

Think of the difficulty as the number of difficulty 1 shares you are submitting at the same time, if that helps. If someone submits 10 shares of difficulty 2, and someone else submits a single share of difficulty 20, they have the same score. They have both submitted 20 difficulty 1 shares worth of work.

The reason for increasing the difficulty on the miner's side is to reduce the bandwidth being used by reporting shares and server load on the pool. It won't hurt your income at all. A large enough ASIC could submit thousands of difficulty 1 shares per minute which is a huge waste of resources.
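
The "same score" equivalence in that example is just a sum of per-share difficulties; a trivial sketch using the numbers from the post:

```python
# Sketch of the equivalence above: a miner's contribution is the sum of the
# difficulties of their shares, so 10 shares at difficulty 2 and one share
# at difficulty 20 score identically.

def score(share_difficulties):
    return sum(share_difficulties)

assert score([2] * 10) == score([20]) == 20
```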
legendary
Activity: 1904
Merit: 1002
Which, for a low rate, low payout miner is what increasing your difficulty will accomplish.  When you are averaging just a few shares in the window there will be some blocks when you just don't have any shares in the window. If you up the difficulty there will be more such blocks (of course, on average you'll still end up with a fair payout, just higher variance with bigger quanta).  Since the difficulty can be set dynamically you could even mine at a higher difficulty while you had no shares in the window... and lower when you did. (I think on LTC it does this by default.)

That's something I don't get: how voluntarily increasing your difficulty will help you.

Wouldn't you want to claim as many shares as possible, even if they are small amounts?

Since each share you find increases your payout allocation within p2pool's 8640-share window, it should always be beneficial to add more shares, right?

It seems to me that all you would be doing by increasing your difficulty would be to forfeit some potential earnings from weak shares, by throwing those shares away instead of submitting them to the sharechain, unless there's something I'm missing.

Maybe if p2pool is CPU limited, or you're running remote miners that are network-limited between them and p2pool, and you have so many miners that you're constantly spamming p2pool with 1-difficulty shares, that might make a difference.  In that case, wouldn't it just be better to voluntarily increase your difficulty to something that is still well below the minimum required for a valid share (perhaps a difficulty of 1000 or so), which would cut down traffic dramatically from your miners, but still allow you to claim every share in the sharechain that you are entitled to?  Or, am I missing something here?

Josh


You will get paid more for the higher difficulty shares.  It will reduce your bandwidth usage if you are putting out a lot of hash power.
member
Activity: 106
Merit: 10
Which, for a low rate, low payout miner is what increasing your difficulty will accomplish.  When you are averaging just a few shares in the window there will be some blocks when you just don't have any shares in the window. If you up the difficulty there will be more such blocks (of course, on average you'll still end up with a fair payout, just higher variance with bigger quanta).  Since the difficulty can be set dynamically you could even mine at a higher difficulty while you had no shares in the window... and lower when you did. (I think on LTC it does this by default.)

That's something I don't get: how voluntarily increasing your difficulty will help you.

Wouldn't you want to claim as many shares as possible, even if they are small amounts?

Since each share you find increases your payout allocation within p2pool's 8640-share window, it should always be beneficial to add more shares, right?

It seems to me that all you would be doing by increasing your difficulty would be to forfeit some potential earnings from weak shares, by throwing those shares away instead of submitting them to the sharechain, unless there's something I'm missing.

Maybe if p2pool is CPU limited, or you're running remote miners that are network-limited between them and p2pool, and you have so many miners that you're constantly spamming p2pool with 1-difficulty shares, that might make a difference.  In that case, wouldn't it just be better to voluntarily increase your difficulty to something that is still well below the minimum required for a valid share (perhaps a difficulty of 1000 or so), which would cut down traffic dramatically from your miners, but still allow you to claim every share in the sharechain that you are entitled to?  Or, am I missing something here?

Josh
zvs
legendary
Activity: 1680
Merit: 1000
https://web.archive.org/web/*/nogleg.com
i changed the whole punishing share code,  curious as to how it'll work now.   i'm guessing better, since it seems some people have super high latency times

hmm, if i can see any examples.  looks like stratum messed up, always use long poll

much better, i think.  it gives you all the possible routes:

http://www.nogleg.com:9332/static/share.html#000000000002889fc03f2c9739b7ea6e43eb45ae9dba97fef40bc0511923f5ea

this way you aren't cornholed onto a path that'll be orphaned

(so essentially you won't be penalized for being 20 seconds faster than person X with hundreds of Ghash)

btw, i sent all the people who were on the pool when i took it down the equivalent of what the bitcoin mining calculator says you'd make in a day with the hash rates you were getting on my pool

eh, one more edit, here's a great example:

http://www.nogleg.com:9332/static/share.html#00000000000176b9ab070f7f71cf4e3a7f83211d6341bf08f85e65f33c77553c

if you were 'punishing' shares, then you would have followed the f0ec7e42 route, which (despite being the "proper" route), was orphaned by

http://www.nogleg.com:9332/static/share.html#000000000000470ee8f03b070dfce951c8dd5ed2ed65eee5c2c4016416d5803f
staff
Activity: 4284
Merit: 8808
Mathematically, that would be the same as manually boosting the difficulty of the shares returned to the network. You'll get paid less often (higher variance), but your payouts will be larger. This feature is already built in.
I don't think so.  The point is that some people wouldn't be paid when a block is found, who previously would have been. 
Which, for a low rate, low payout miner is what increasing your difficulty will accomplish.  When you are averaging just a few shares in the window there will be some blocks when you just don't have any shares in the window. If you up the difficulty there will be more such blocks (of course, on average you'll still end up with a fair payout, just higher variance with bigger quanta).  Since the difficulty can be set dynamically you could even mine at a higher difficulty while you had no shares in the window... and lower when you did. (I think on LTC it does this by default.)
zvs
legendary
Activity: 1680
Merit: 1000
https://web.archive.org/web/*/nogleg.com
I've always had those; it never occurred to me that there was something wrong. I thought it was just part of the way nodes talked to each other.

Well, something else that's interesting is that:

2013-08-23 02:36:38.222282 Punishing share for 'Block-stale detected! height(156670399e288c80f42d97518c063cc8763742d5fcec2c0141) < height(19197727fe571665ea6fdb4ee72ca33f95e997b1ffbc6b0d32) or 19548732 != 19548732'! Jumping from 35417e20 to 15364f72!

if you go back a few shares, a parent share is 35417e20, not 15364f72.  so some node built off of 35417e20, while mine was trying to jump to something else.

i guess we've come to the conclusion that the share right before a block "almost always" gets orphaned (35417e20 didn't), and a share directly after a block has a good chance of getting orphaned (i.e. if I had built off of 15364f72, instead of 35417e20)
 
so i guess after taking everyone's hashrate into consideration, you'd want to get your getblocktemplate slow enough to not cause problems.  the largest node is private, but you can see on http://p2pool-nodes.info/ that one at 740ghash is *10 seconds behind

ed:  slightly over 2/3rds of all my orphans are within 30 seconds of a block solve,  and, yes, i have logs for this