
Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 369. (Read 2591964 times)

legendary
Activity: 1258
Merit: 1027
As BTC has the highest difficulty of any coin you will mine, if you want to mine BTC + any other coin the difficulty will need to match BTC's, and anything that matches BTC will also match any merged coin with a lower difficulty...

No, the other coins will find blocks before that.

So you're saying that all accepted shares will be submitted to every coin? I thought only shares that met the BTC block criteria were submitted to merged coins?
legendary
Activity: 2912
Merit: 1060
As BTC has the highest difficulty of any coin you will mine, if you want to mine BTC + any other coin the difficulty will need to match BTC's, and anything that matches BTC will also match any merged coin with a lower difficulty...

No, the other coins will find blocks before that.
legendary
Activity: 1258
Merit: 1027
As BTC has the highest difficulty of any coin you will mine, if you want to mine BTC + any other coin the difficulty will need to match BTC's, and anything that matches BTC will also match any merged coin with a lower difficulty...
legendary
Activity: 2912
Merit: 1060
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?

BTC is not merge-mineable.


Yes, it must be the master chain.
legendary
Activity: 1066
Merit: 1098
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?

BTC is not merge-mineable.
legendary
Activity: 1540
Merit: 1001
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?

Yes, you can merge-mine any mergeable coin without affecting your BTC mining.

M

I guess I didn't make it clear Tongue

My goal is to solo mine BTC.

So I use p2pool to MAINLY mine IXC and then merge-mine (if you merge-mine with p2pool you are actually solo mining) BTC, NMC, FTC, etc...

I think that'll work.  Not 100% sure.

M
full member
Activity: 196
Merit: 100
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?

Yes, you can merge-mine any mergeable coin without affecting your BTC mining.

M

I guess I didn't make it clear Tongue

My goal is to solo mine BTC.

So I use p2pool to MAINLY mine IXC and then merge-mine (if you merge-mine with p2pool you are actually solo mining) BTC, NMC, FTC, etc...
legendary
Activity: 1540
Merit: 1001
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?

Yes, you can merge-mine any mergeable coin without affecting your BTC mining.

M
full member
Activity: 196
Merit: 100
Since I have p2pool already set up and running with BTC as the main coin.

If I switch BTC to merged and IXC as main, will it work?
legendary
Activity: 1540
Merit: 1001
Can someone use p2pool to solo mine and merge mine at the same time?

So mainly, is there an option to turn BTC solo mining on?

If not, would mainly mining another coin, e.g. IXC, and merge-mining BTC offer the same result?
I imagine you could... you'd have to hack the code a bit (there's a guide on how to do it here: https://bitcointalksearch.org/topic/antminer-s1-solomining-setupeasy-soloown-pool-setup-512042).  Then you'd just start up p2pool as you normally would for merged mining.  Remember, though, since merged mining is solo mining, you're now solo mining everything.  Maybe you get lucky and hit the lottery by generating the BTC block.

You could also use an earlier version ... that'd essentially put you on a different chain, so you'd be solo mining.

M
legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
Can someone use p2pool to solo mine and merge mine at the same time?

So mainly, is there an option to turn BTC solo mining on?

If not, would mainly mining another coin, e.g. IXC, and merge-mining BTC offer the same result?
I imagine you could... you'd have to hack the code a bit (there's a guide on how to do it here: https://bitcointalksearch.org/topic/antminer-s1-solomining-setupeasy-soloown-pool-setup-512042).  Then you'd just start up p2pool as you normally would for merged mining.  Remember, though, since merged mining is solo mining, you're now solo mining everything.  Maybe you get lucky and hit the lottery by generating the BTC block.
full member
Activity: 196
Merit: 100
Can someone use p2pool to solo mine and merge mine at the same time?

So mainly, is there an option to turn BTC solo mining on?

If not, would mainly mining another coin, e.g. IXC, and merge-mining BTC offer the same result?
full member
Activity: 155
Merit: 100
Hi, new node @ Poland - http://bitcoin.fastlink.pl:9332
Please add to http://p2pool-nodes.info/


Best Regards, Dexu.
legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
Quote
Hmmm, I'll have to give this some thought. Just so I'm clear, your suggestion is to include good+doa and exclude orphan from http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour
Actually, I hadn't thought about discounting orphans, since they again could potentially solve the BTC requirements.

Quote
As an aside, can anyone clarify exactly what http://mining.coincadence.com:9332/rate is reporting? From web.py:
Code:
web_root.putChild('rate', WebInterface(lambda: p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, decent_height())/(1-p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, decent_height()))))
I was about to reply that it's just calculating the overall hash rate of the p2pool network, but that little bit about the "get_average_stale_prop" threw me off.  I'd have to dig further into the code to answer you more accurately.  If I get the time tomorrow, I'll try to get back to you on it.  Hopefully someone more familiar with that could answer it sooner.
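Reading that lambda literally, it appears to take the pool's rate of non-stale share attempts (work represented by shares that made it into the share chain) and scale it back up by the average stale proportion, i.e. an estimate of the total pool hash rate including stale work. A minimal stand-alone sketch of that arithmetic (the function and variable names are mine, not p2pool's):
Code:
def estimate_total_pool_rate(nonstale_attempts_per_second, average_stale_prop):
    # Scale the observed non-stale rate back up, assuming stale shares
    # still represent real hashing that was performed
    return nonstale_attempts_per_second / (1 - average_stale_prop)

# Hypothetical figures: 800 TH/s of non-stale work with a 15% stale proportion
print(estimate_total_pool_rate(800e12, 0.15))  # ~941 TH/s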
legendary
Activity: 1258
Merit: 1027
Quote
You're cherry picking, but remember, those DOA shares could potentially still be solutions for the BTC block chain.  I think you'd be doing the calculations a disservice by not including the DOA, since that hashing does indeed add value.

Hmmm, I'll have to give this some thought. Just so I'm clear, your suggestion is to include good+doa and exclude orphan from http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour

As an aside, can anyone clarify exactly what http://mining.coincadence.com:9332/rate is reporting? From web.py:
Code:
web_root.putChild('rate', WebInterface(lambda: p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, decent_height())/(1-p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, decent_height()))))

Quote
miners on my node with known hash rates.
Sorry, I wasn't specific, I meant my own miners.... I keep that data, and can make some luck calculations as well to validate/invalidate the results...

Quote
I think you're doing a fantastic job with your site.  Keep up the great work.

Thanks!
legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
Quote
For the luck calculations I'm considering using "pool_nonstale_hash_rate" from "global_stats". While at first look it seems like I'm cherry picking the best data to shine a good light on p2pool, with pool orphan+DOA rates often touching 20% I feel that including them in the luck calculation could significantly throw things off, so I want to base luck on actual valid work... Open to feedback on this one.
You're cherry picking, but remember, those DOA shares could potentially still be solutions for the BTC block chain.  I think you'd be doing the calculations a disservice by not including the DOA, since that hashing does indeed add value.

Quote
However, unless I'm missing something obvious, p2pool does not publicly expose its share chain, which makes it difficult to determine submitted shares by miner for miners who are not on my node.
No, there is no "sharechain.info" site as far as I know.  However, by running the node, you have a complete copy of the share chain.  You can use the information in there to calculate how many shares each address has submitted across the network.  You're still bound by the limitations I pointed out (excessive luck - both good and bad) which would affect the calculations.

Honestly, though, your method of using percentage of expected block payout to then determine the approximate hash rate of that address will probably get you a relatively close approximation.  Close enough for government work, anyway... which is about the best you could hope for here since there's no way to actually accurately garner this information.

Quote
I'm going to test my original proposal, basing the estimated hash rate on the per-miner payout from the last found block, and see how accurate it looks based on miners on my node with known hash rates (I'm already collecting/storing this data so it won't bloat the DB, and requires less new code).
I emphasized that because it's a pretty important point.  Your node only approximates the miners' true hash rates based upon shares it receives from those miners.  The value can swing pretty wildly, especially if you've got a wide range of hashing power.  This is where setting the pseudo-diff comes into play.  If all of your miners actually set an appropriate value, then you'd get a better approximation.  Of course, you could always throw in the auto worker diff patch, which would dynamically assign difficulties to each worker, instead of having difficulty assigned based on the node's hash rate.  Again, though, these are approximations, but we're back to the government work again Smiley

I think you're doing a fantastic job with your site.  Keep up the great work.
legendary
Activity: 1258
Merit: 1027
Hey windpath,

...

Thanks Jonny,

I appreciate you taking the time!

I did not understand the "attempts / rate" formula either; my assumption is that it may be based on p2pool's high share diff, but I'm not sure.

Using the standard formula certainly seems to make sense, with the exception that I'll probably store the value in seconds so that when p2pool is finding 2-3 blocks an hour in the near future it will still work Wink

Code:
Difficulty * 2**32 / hashrate  = number of seconds to find a block
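As a quick sanity check, here is that conversion as a tiny Python helper; the difficulty and hash rate figures are made up purely for illustration:
Code:
def seconds_to_block(difficulty, hashrate_hs):
    # Expected seconds to find a block at the given difficulty and hash rate (H/s)
    return difficulty * 2**32 / hashrate_hs

# Hypothetical: difficulty ~13.4 billion, pool hashing ~1 PH/s
print(seconds_to_block(13.4e9, 1e15) / 3600)  # ~16 hours per block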

I'm finding that even getting the "best" value for the global hash rate is going to be a challenge, as http://mining.coincadence.com:9332/global_stats and http://mining.coincadence.com:9332/web/graph_data/pool_rates/last_hour report very different values....

After closer examination it looks like "global_stats" is the total hash rate, including orphans and DOA shares. "graph_data" separates them out by good, orphan and doa. I'm going to stick with the "global_stats" number for the overall pool speed, as that does represent the "overall" speed.

For the luck calculations I'm considering using "pool_nonstale_hash_rate" from "global_stats". While at first look it seems like I'm cherry picking the best data to shine a good light on p2pool, with pool orphan+DOA rates often touching 20% I feel that including them in the luck calculation could significantly throw things off, so I want to base luck on actual valid work... Open to feedback on this one.

This would be the ideal solution for miner hash rate:

Code:
100 shares in 24 hours = 864 seconds to find a share.
Current share difficulty = 1508968.56
1508968.56 * 2**32 / hashrate = 864
hashrate = 7501123398023.39555555555556 hashes per second =  7.5TH/s

However, unless I'm missing something obvious, p2pool does not publicly expose its share chain, which makes it difficult to determine submitted shares by miner for miners who are not on my node. Also, storing historical p2pool-wide share data, given p2pool's higher share rate, will grow 60x faster than storing the Bitcoin blockchain (a share every ~10 seconds vs. a block every ~10 minutes), which quickly becomes unsustainable...

I'm going to test my original proposal, basing the estimated hash rate on the per-miner payout from the last found block, and see how accurate it looks based on miners on my node with known hash rates (I'm already collecting/storing this data so it won't bloat the DB, and requires less new code).

For tonight I'll be setting up the data collector to grab the following and place it in its own table:

ID: unix time stamp
Global Hash Rate: http://mining.coincadence.com:9332/global_stats (pool_hash_rate)
Accepted Hash Rate: http://mining.coincadence.com:9332/global_stats (pool_nonstale_hash_rate)
Difficulty: bitcoind_rpc->getdifficulty
Number of Miners: http://mining.coincadence.com:9332/current_payouts
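A minimal Python sketch of that collector, using only the endpoints and field names listed above (the bitcoind getdifficulty call and the actual DB insert are left out):
Code:
import json, time
from urllib.request import urlopen

NODE = 'http://mining.coincadence.com:9332'

def snapshot():
    # One row per cron run: timestamp plus the pool-wide figures listed above
    global_stats = json.load(urlopen(NODE + '/global_stats'))
    payouts = json.load(urlopen(NODE + '/current_payouts'))
    return {
        'id': int(time.time()),
        'global_hash_rate': global_stats['pool_hash_rate'],
        'accepted_hash_rate': global_stats['pool_nonstale_hash_rate'],
        'miners': len(payouts),
    }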

I already have found blocks and payouts per miner per found block in a separate table....

Looking at the "last_hour" graph data, it reports 150 data points per one-hour period; that seems like overkill for our purposes. Most of my other data collectors run every minute on a standard unix cron job, so I'm going to collect every minute to keep things simple. That should give plenty of resolution for any statistically valid reports and will limit DB growth to ~526k records per year...
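(That figure is just one sample per minute for a year: 60 × 24 × 365 = 525,600 rows.)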

Your point about starting data collection immediately after a found block is an excellent one. I'll start collecting ASAP and remove any data stored before the next found block.

Again thanks.

Still very open to any other feedback, rushing this a bit based on the non-availability of p2pool.info, but would still like to get it right the first time Wink



legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
Alternative p2pool.info Suggestions:

I'm working on getting this live this week. I plan to try and reproduce all the data from the original, and include some other stuff we are already storing. I've looked over the p2pool.info Azure code (not something I am familiar with) and have decided that it will be faster to write my own version on a LAMP stack as that is already the foundation of the Coin Cadence site.

As you might expect there are already some substantial differences between Coin Cadence and p2pool.info as to what and how p2pool data is stored and collected.

The biggest change/advantage of my version will be the fact that all data is gathered locally from p2pool and bitcoind, compared with the current p2pool.info implementation, which pulls block data from blockchain.info.

I have a few questions for the community as a whole as to how to calculate and present some of the stats:

Estimated time to block:
This is currently calculated on the fly by p2pool using the pool hash rate and "attempts_to_block" from:

http://mining.coincadence.com:9332/local_stats

Code:
attempts_to_block = parseInt(local_stats.attempts_to_block || 0);
time_to_block = attempts_to_block / pool_hash_rate;
$('#expected_time_to_block').text((''+time_to_block).formatSeconds());

The problem I see is when to store this value in the DB. The pool hash rate fluctuates pretty wildly even when miners are relatively consistent; that is a fact of life when trying to calculate the global rate of a distributed pool. Add the fact that miners are joining and leaving p2pool on a regular basis and it becomes even more complicated.

Should the expected time to block be stored immediately after or before a block is found? Should it be stored every x minutes and an average calculated on a per block basis? Very open to suggestions!

Pool Luck:
Assuming we have decided on a satisfactory answer to storing the expected time to block above, this is pretty straightforward...

The question is how the luck stats are presented. In the current p2pool.info implementation luck is presented as a % over the last 7, 30 and 90 days.

I'm considering 2 alternatives:

1. Borrowing Slush's time frame and using 1, 7 and 30 days.

2. Basing it on blocks rather than time, i.e. current block luck, last 5 blocks, last 15 blocks, etc...

What do you think?

Estimated Hashrate:
In the current p2pool.info implementation:
Quote
The following users / addresses have submitted at least 1 valid share in the past 24 hours. Note: Hashrates are very rough estimates based on the number of shares submitted in the past day. They may be off by 10-20% or more due to variance.

This uses some fuzzy math that I don't fully understand. If anyone has a method of calculating this and can explain it to me I'd love to hear it....

Here is my proposed solution, and to be honest I'm not sure if it is better or worse than the current implementation, so I am very open to suggestions:

Using data provided by p2pool here: http://mining.coincadence.com:9332/current_payouts

Retrieve the following: current total block payout (including tx fees), payout per miner, global pool hashrate

Calculate miner % of total expected block payout.

Miner estimated hash rate = % of global pool speed based on % of expected payout??

So for example if a miner has 10% of the expected total payout, we can assume they have 10% of the global hash rate...

Again, fuzzy math at best and am open to suggestions....
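For what it's worth, here is a minimal sketch of that payout-proportional estimate, assuming the payouts come from /current_payouts and the pool rate from /global_stats (the function name and example addresses are illustrative only):
Code:
def estimate_miner_hashrates(payouts_btc, pool_hash_rate_hs):
    # Attribute a slice of the pool hash rate to each address in proportion
    # to its share of the total expected block payout
    total = sum(payouts_btc.values())
    return {addr: pool_hash_rate_hs * amount / total
            for addr, amount in payouts_btc.items()}

# Hypothetical payouts: an address owed 10% of the block gets 10% of the pool rate
payouts = {'1ExampleMinerA': 2.5, '1ExampleMinerB': 22.5}
print(estimate_miner_hashrates(payouts, 1e15))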

Summary
I'd like feedback on my 3 suggested methods:

1. When to store estimated time to block (every x minutes and use average, just before or just after a block is found)
2. Calculating/Display format for pool luck
3. Estimating miner hash rates

Thanks!



Hey windpath,

You can always calculate the expected time to block.  I'm not sure why they use the formula "attempts / rate".  Expected time to block is based on the following:
Code:
Difficulty * 2**32 / hashrate / 86400 = number of days to find a block
As you can see, that value is going to fluctuate considerably based on hash rate.  One minute it could be 1.2 days, and the next 0.5 days.  This actually happened just a few days ago when one minute the pool reported about 500TH/s and the next it reported just over 1PH/s.

I guess what I'm stating is that the best you can hope to provide is an "average" value.  Collect expected time to block values every X units of time.  You can't just use the value from just before, or just after, the block is found.

This will impact your luck calculations as well.  The known quantity in that equation is the actual time between blocks.  Since blocks are timestamped, you know exactly how long each one took.  Then, depending on how many "expected time to block" values you've recorded, you can make an educated guess at the luck.  By the way, you'd have to start your stats collection immediately after p2pool finds a block for the most "accurate" calculations.
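As a rough illustration of that bookkeeping (the names and numbers are mine, not anything p2pool exposes): average the sampled expected times over the interval between two found blocks, then compare against the actual interval.
Code:
def luck_percent(expected_time_samples, actual_seconds_between_blocks):
    # Luck above 100% means the block arrived sooner than the sampled expectation
    expected = sum(expected_time_samples) / len(expected_time_samples)
    return 100.0 * expected / actual_seconds_between_blocks

# e.g. expected time-to-block samples (in seconds) collected once a minute
samples = [60000, 58000, 62000, 61000]
print(luck_percent(samples, 54000))  # ~112%, a slightly lucky block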

The reason the other calculations are so far off is that they are all based on submitted shares.  That's your "fuzzy math".  Your miners know their own hashing rate.  Why?  Because every single work unit is considered.  The pool does not, because not every work unit is considered.  It's an estimate based upon how much work of a given value is submitted.  Therefore, while extremely unlikely, it is entirely possible that a miner with 1GH/s submits 100 shares in a minute.  The pool would report that miner as having a substantially higher hash rate than it actually does because of it.

Unfortunately, that's what we're stuck with.  Using a variation of the formula I gave above, you can estimate the miner's hash rate, since you know all of the other variables.  Let's use my example of the miner submitting 100 shares in a minute.  For simplicity's sake, we're going to make the assumption that those 100 shares were all that were submitted in a 24 hour period.
Code:
100 shares in 24 hours = 864 seconds to find a share.
Current share difficulty = 1508968.56
1508968.56 * 2**32 / hashrate = 864
hashrate = 7501123398023.39555555555556 hashes per second =  7.5TH/s
So, the miner is actually a 1GH/s unit, but p2pool thinks it's 7.5TH/s.  Obviously, this is a contrived example to display the effect of a miner not falling within expected parameters.  However, looking at the expected payouts, you would think this miner is in reality 7.5TH/s.
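Just to make the dependence on share difficulty explicit, here is the same arithmetic as a small helper (the function name is mine):
Code:
def hashrate_from_shares(share_count, window_seconds, share_difficulty):
    # Hash rate implied by share_count shares at share_difficulty over window_seconds;
    # only as good as the share sample, per the caveats above
    seconds_per_share = window_seconds / share_count
    return share_difficulty * 2**32 / seconds_per_share

print(hashrate_from_shares(100, 86400, 1508968.56) / 1e12)  # ~7.5 (TH/s)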

Alright, I've rambled on long enough and should probably get back to work Smiley
sr. member
Activity: 295
Merit: 250
I received 2 donations as well that were not from coin generation:

Amount: 0.00015444
Time: 2014-06-10 08:28:17

Amount: 0.00010090
Time: 2014-06-10 08:28:17

Thanks to whoever you are Smiley
Sadly, it seems my piddly 14 shares didn't rate. Sad Ah, well. Maybe I'll qualify next time. Smiley
legendary
Activity: 1270
Merit: 1000


Thanks for the link. I have not put my S1s on my p2pool node yet because with my S2 on it, the S1s get a high diff. Now that I hear the S2 is not currently efficient on p2pool, I may swap: put my S1s on it with the new cgminer and put my S2 on a regular pool.

I assume you void the warranty when you upgrade cgminer this way Smiley

Is it possible to restore to default after this update by using the reset button on the S1? Or would I have to remove the updated cgminer manually?

You can revert easily, as you only rename the old binary; simply rename it back to go back to the default. I don't think changing cgminer would void the hardware warranty, but you'll need to ask Bitmain about that.
If you have your S1s and the S2 on your node, then use the diff setting as specified a couple of pages back.

Essentially you add /0+diff to the end of your miner address. For an S1 at around 200 GH/s it's /0+230.

Quote
If you're running S1s, set it to 0 ("0" defaults to the lowest p2pool diff, currently 1677854.73)

And to 220.4 (optimized for 190GH/s)

is calculated as your hash rate in KH/s times 0.00000116

i.e. 190,000,000 * 0.00000116 = 220.4
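A one-liner sketch of that rule of thumb, purely for illustration:
Code:
def pseudo_share_diff(hashrate_khs):
    # Suggested +diff value from hash rate in KH/s, per the rule of thumb above
    return hashrate_khs * 0.00000116

print(pseudo_share_diff(190000000))  # ~220.4, i.e. use ADDRESS/0+220.4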

Is the pseudo share diff setting Address+number ever really necessary other than for graphs?