Topic: Neighbourhood Pool Watch - page 6.

donator
Activity: 2058
Merit: 1007
Poor impulse control.
August 02, 2012, 07:37:39 AM
Orphan rates for all pools are published in the Weekly Pool Stats thread. Orphans have been very low and round lengths shorter for p2Pool for a few weeks (I think).

I hope that the luck increase is real and not just an artefact of the round length hashrate calculation method - is everyone seeing an increase in earnings?
zvs
legendary
Activity: 1680
Merit: 1000
https://web.archive.org/web/*/nogleg.com
August 02, 2012, 07:14:25 AM
Evil. PPS is so doomed...

Has it been addressed that p2pool includes lots of transactions in the block, generating a bigger block?

The p2pool nodes are often (almost always) behind home broadband. Even with 1 Mbit upstream, a 130 kB block would take 8 seconds to propagate fully... could this explain the higher orphan rate?



It hasn't gotten an orphan since I started using it and set up my server (5.9.24.81)... about 3 weeks?
sr. member
Activity: 270
Merit: 250
August 02, 2012, 04:25:57 AM
8 seconds would account for slightly over 1% orphans, if it's even that high.
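For reference, a rough sketch of where a figure like that comes from (assuming blocks arrive as a Poisson process with mean interval T = 600 s, and taking the 8-second propagation delay quoted above at face value):

P(\text{orphan}) \approx 1 - e^{-t/T} = 1 - e^{-8/600} \approx 1.3\%

That is, the chance that someone else finds a competing block while yours is still propagating.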
legendary
Activity: 1361
Merit: 1003
Don't panic! Organize!
August 02, 2012, 04:16:31 AM
The p2pool nodes are often (almost always) behind home broadband. Even with 1 Mbit upstream, a 130 kB block would take 8 seconds to propagate fully... could this explain the higher orphan rate?
That's why info about new blocks is spread between p2pool nodes Smiley
sr. member
Activity: 339
Merit: 250
dafq is goin on
August 01, 2012, 11:08:11 PM
Evil. PPS is so doomed...

Has it been addressed that p2pool includes lots of transactions in the block, generating a bigger block?

The p2pool nodes are often (almost always) behind home broadband. Even with 1 Mbit upstream, a 130 kB block would take 8 seconds to propagate fully... could this explain the higher orphan rate?

legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
August 01, 2012, 10:33:22 PM
And to wake up the thread before it gets old ... Smiley
So ... if someone were to get some BFL MiniRigs ... or, some time in the far distant future, a bunch of ASICs ...
Wouldn't it be best for them to mine on a PPS pool and withhold blocks ... so they don't increase the difficulty and thus get more BTC ...
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 18, 2012, 07:19:06 AM
IF we have a slight over-calculation there (like I rounded 4.295 to 4.3), the pool hash rate can be reported inaccurately and the p2pool.info block ETA can be wrong!


That's right. However, it would only be out by the rounding error - in your example, -0.005/4.3 = -0.11%. But the mean round shares are +10% compared to expected - that's a big rounding error. There could be some other miscalculation, though.
legendary
Activity: 1361
Merit: 1003
Don't panic! Organize!
July 18, 2012, 07:07:35 AM
Diff1 share = 2^48/65535 hashes = 4,295,032,833 hashes
If you have 4.295 GH/s you get 1 diff1 share every second

1. How does p2pool calculate miner hash rate?
- it counts diff1 shares over time: 1 diff1 share/sec is ~4.3 GH/s, so 1 share every 10 seconds is ~430 MH/s
2. How does p2pool calculate network hash rate?
- it counts the last shares in the share chain over time and multiplies by the current chain share difficulty (maybe every share, not sure)

IF we have a slight over-calculation there (like I rounded 4.295 to 4.3), the pool hash rate can be reported inaccurately and the p2pool.info block ETA can be wrong!
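A quick numeric check of the above (a minimal sketch; the 2^48/65535 constant and the one-share-per-10-seconds example are taken from this post, the function name is mine):
Code:
# expected hashes per difficulty-1 share
DIFF1_HASHES = 2**48 / 65535                  # ~4,295,032,833 hashes

def hashrate_from_shares(shares, seconds, share_diff=1):
    """Estimate hash rate (H/s) from accepted shares over a time window."""
    return shares * share_diff * DIFF1_HASHES / seconds

print(hashrate_from_shares(1, 1))             # ~4.295e9 H/s: one diff1 share per second
print(hashrate_from_shares(1, 10))            # ~4.295e8 H/s: one diff1 share per 10 seconds

# sensitivity to rounding the constant up to 4.3 GH:
print((4.3e9 - DIFF1_HASHES) / DIFF1_HASHES)  # ~0.0012, i.e. only about a 0.1% overestimate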
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 18, 2012, 04:49:01 AM
Hash rate should NEVER be used to determine pool performance since there is no way to even gauge it accurately.
The hash rates shown for p2pool are simply the share rates converted to a hash rate (and who knows if that has even been done correctly or not ...)

Actually, I think it's the other way around - hashrates determine the shares per round. But otherwise I agree. I would prefer to have actual round length data, but unfortunately that's not possible, and it could be a source of error.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
July 18, 2012, 04:40:52 AM
For every gigahash the number of shares per day is a constant value, 20,116.5676 shares per day.
1000 megahashes = 20,116.5676 shares per day.
1000 megahashes = 838.190317 shares per hour.
...
The numbers of shares you state are simply the expected numbers of shares.
Every hour you check, you will see a different value.

Over time, with a LARGE sample, you would expect the numbers to converge towards the expected values; however, making a statement like "1000 megahashes = 838.190317 shares per hour" is simply naive.

GH/s -> shares is a random function.
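To illustrate that randomness, here's a minimal sketch (it assumes diff1 shares arrive as a Poisson process; the 838.19 shares/hour figure for 1 GH/s is taken from the quoted post):
Code:
import numpy as np

rng = np.random.default_rng(0)
expected_per_hour = 838.19       # expected diff1 shares per hour at 1 GH/s (from the quote)

# ten simulated one-hour windows: every hour gives a different count,
# and only the long-run average converges to the expectation
hourly_counts = rng.poisson(expected_per_hour, size=10)
print(hourly_counts)
print(hourly_counts.std())       # typical scatter is around sqrt(838) ~ 29 shares per hour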

No pool can supply total hashes accurately.
Miner programs can not even supply that accurately to the pools that accept it.

However, more importantly, it doesn't matter what the calculation of GH/s -> shares is since that is in no way relevant to the expected performance of a pool.

The performance of a pool is simply whether, over an extended period of time, it finds close to the expected number of blocks given the number of shares supplied by miners.
It is certainly not any issue at all to a pool if a miner supplies fewer shares per MH/s than the miner expects.
In fact no pool would even give a damn about it.
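Put as a formula (a sketch, with S the total diff1-equivalent shares submitted and D the network difficulty):

E[\text{blocks}] = \frac{S}{D}, \qquad \text{pool luck} = \frac{\text{blocks actually found}}{S/D}

Everything else, including whatever GH/s a miner thinks it is doing, drops out of that comparison.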

... and to give a very specific example ... I have a faulty BFL that seems to only find shares with one of the two FPGAs in it.
I send the work to it and it responds in the expected amount of time (~5.1s) every time; however, it finds half the expected number of shares.
Why would my pool give a damn about that?

In fact EVERY pool would estimate my hash rate as ~half what my BFL is saying it is doing ... as EVERY pool should.

Hash rate should NEVER be used to determine pool performance since there is no way to even gauge it accurately.
The hash rates shown for p2pool are simply the share rates converted to a hash rate (and who knows if that has even been done correctly or not ...)
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 18, 2012, 03:26:50 AM
BTC found per 2 weeks / average gigahashes per 2 weeks = (actual value paid) PPGH for that 2 week period
versus
Expected Payout, like that determined using a Bitcoin Calculator.
or
(current block reward) / (current difficulty) = Expected PPS
It appears that complex statistical analysis is NOT necessary to determine p2pool's profitability.

Appearances can be deceptive.

You might not realise it, but you are actually proposing quite a "complex statistical analysis" here. It might not be hard to calculate the results, but how do you determine the confidence interval? Stating that the results are "close to expected" is not enough. You need to be able to provide at minimum a confidence interval and show that both theoretical and empirical results lie within it.

I agree that this method would provide a more robust analysis than shares per round/D. The problem with an analysis such as the one you suggest is that you need consistent payment data for a large portion of the pool and for a large portion of the pool's history. I'm not sure that the payment results for one miner - even going back to the start of the pool's history - would be sufficient to produce results with a suitable confidence interval. I did attempt to do this in the weeks following my post, but even with a good many miners sending me data it wasn't sufficient to rule out either the expected payments or payments at 0.91*expected.

I'd still like you to post your data here so I can have a look at it though.


Edit: Perhaps a better way to explain this is by asking you exactly what you're trying to prove. You can't just prove that your results are close to expected; you have to prove that your results exclude what you're trying to disprove, to a selected confidence level - usually p<0.05 or p<0.01. Start by reading about the null hypothesis. The link to the online stats & probability course I gave in a previous post devotes a whole section to it.

I hope you realise I'm not being contrary for the sake of it, but simply pointing out that analyses of this type are harder than you think if you want to be able to come to a provable conclusion.
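For anyone who wants to try it, here is a minimal sketch of the kind of test being described: it asks whether an observed block count is consistent with the count expected from the shares submitted, at the usual p < 0.05 level. All the numbers below are made up purely for illustration.
Code:
from scipy import stats

# illustrative inputs only
total_shares = 2_500_000_000       # total diff1-equivalent shares over the period
difficulty   = 1_700_000           # network difficulty over the same period
blocks_found = 1400

expected_blocks = total_shares / difficulty          # what the pool "should" have found

# two-sided test of H0: blocks_found is drawn from Poisson(expected_blocks)
p_low   = stats.poisson.cdf(blocks_found, expected_blocks)
p_high  = stats.poisson.sf(blocks_found - 1, expected_blocks)
p_value = min(1.0, 2 * min(p_low, p_high))

print(expected_blocks, p_value)    # only if p_value < 0.05 can you call the pool lucky/unlucky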
full member
Activity: 196
Merit: 100
Web Dev, Db Admin, Computer Technician
July 18, 2012, 03:03:08 AM
For every gigahash the number of shares per day is a constant value, 20,116.5676 shares per day.
1000 megahashes = 20,116.5676 shares per day.
1000 megahashes = 838.190317 shares per hour.

Once discovered, a block becomes a constant value.
Hash rate for a given period needs to be averaged (2 weeks, to coincide with difficulty adjustments?).

BTC found per 2 weeks / average gigahashes per 2 weeks = (actual value paid) PPGH for that 2 week period
versus
Expected Payout, like that determined using a Bitcoin Calculator.
or
(current block reward) / (current difficulty) = Expected PPS

It appears that complex statistical analysis is NOT necessary to determine p2pool's profitability.
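As a worked sketch of those two formulas (the 50 BTC block reward applied at the time; the difficulty value and the resulting outputs are only illustrative; the shares-per-day figure is the one quoted above):
Code:
# expected PPS per diff1 share = (current block reward) / (current difficulty)
block_reward = 50.0               # BTC per block at the time (pre-halving)
difficulty   = 1_700_000          # illustrative network difficulty

expected_pps = block_reward / difficulty
print(expected_pps)               # expected BTC per diff1 share, about 2.9e-05 here

# expected earnings per GH/s over a 2-week window, using the shares/day figure above
shares_per_day_per_ghs = 20_116.5676
print(expected_pps * shares_per_day_per_ghs * 14)
The arithmetic isn't the sticking point in this thread, though - the disagreement is about how wide a confidence interval the actual-versus-expected comparison needs.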
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
July 18, 2012, 02:09:24 AM
...
@organofcorti If round shares are so chaotic, why would you use them in your calculations when simpler values are available?
Shares are the only accurate way to measure pool performance.
Any other choice is meaningless (including saying 'I earned more than PPS')
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 18, 2012, 02:02:02 AM
You seem to be confused. I have posted results and that's not enough for you. The only logical extension that would end the disagreement is a complete actual v. expected comparison, to which you reply 'NO'. Roll Eyes
I have 114 days of data sampled from various periods of time - beginning, middle, middle+ and recent; 4 separate periods. I did post results on this forum, twice here and once in the p2pool thread; none of my samples shows a profit loss from mining at p2pool. Can I reasonably conclude that a statement which says 'p2pool is not profitable' is erroneous?

I haven't seen these results - can you repost them here?

Quote
@organofcorti If round shares are so chaotic, why would you use them in your calculations when simpler values are available?

Which simpler values would you be referring to? I can't see a way to work out if the mean pool round length is significantly above the expected mean or not. (EDIT: without using total round shares)

I'm not sure you can say round shares are chaotic, and I was wrong if I wrote that somewhere. p2Pool round shares/D still appear geometrically distributed, as is clear from the blog post.
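For reference, the distribution being referred to (a sketch: if each diff1 share independently has probability 1/D of also solving a block, the number of shares N in a round is geometric):

P(N = n) = \frac{1}{D}\left(1 - \frac{1}{D}\right)^{n-1}, \qquad E[N] = D

So round shares divided by D should scatter around a mean of 1, with the long right tail that makes individual rounds look so erratic.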




full member
Activity: 196
Merit: 100
Web Dev, Db Admin, Computer Technician
July 18, 2012, 01:49:59 AM
Quote from: smoovious
So far, I haven't seen any comparable analysis from your side yet. So... put up or shut up?

-- Smoov
So then, a graph, from beginning to current, showing actual v. expected, would be the only counterpoint you would accept, is that right?
No... I'm saying, that... when faced with being up against one man's opinion, which he backs up with the data, which he put a lot of work into collecting, and processing it into something meaningful, explaining what he is looking for, what to expect, and what it is actually showing, in his interpretation...

...that you're going to have to do better to counter his opinion than with the equivalent of, "No it's not."

You're going to have to counter his effort, with actual effort of your own, which proves his conclusions in error.

We have yes-it-is, no-it-isn't, oh-yes-it-is, oh-no-it's-not debates all day every day in IRC, and in the forums too.

This isn't one of those debates. He stepped up. Your turn.

-- Smoov
You seem to be confused. I have posted results and that's not enough for you. The only logical extension that would end the disagreement is a complete actual v. expected comparison, to which you reply 'NO'. Roll Eyes
I have 114 days of data sampled from various periods of time - beginning, middle, middle+ and recent; 4 separate periods. I did post results on this forum, twice here and once in the p2pool thread; none of my samples shows a profit loss from mining at p2pool. Can I reasonably conclude that a statement which says 'p2pool is not profitable' is erroneous?

To cut a length of string into 8 pieces, you only need to divide it 7 times. Wink 1 string / 7 cuts = 8 pieces Cheesy

@organofcorti If round shares are so chaotic, why would you use them in your calculations when simpler values are available?
hero member
Activity: 504
Merit: 500
Scattering my bits around the net since 1980
July 03, 2012, 04:46:52 AM
Quote from: smoovious
So far, I haven't seen any comparable analysis from your side yet. So... put up or shut up?

-- Smoov
So then, a graph, from beginning to current, showing actual v. expected, would be the only counterpoint you would accept, is that right?
Considering my math skill level, that would take a month at least, and that includes the YouTube videos I will need to watch to learn how to use OoCalc. Wink
No... I'm saying, that... when faced with being up against one man's opinion, which he backs up with the data, which he put a lot of work into collecting, and processing it into something meaningful, explaining what he is looking for, what to expect, and what it is actually showing, in his interpretation...

...that you're going to have to do better to counter his opinion than with the equivalent of, "No it's not."

You're going to have to counter his effort, with actual effort of your own, which proves his conclusions in error.

We have yes-it-is, no-it-isn't, oh-yes-it-is, oh-no-it's-not debates all day every day in IRC, and in the forums too.

This isn't one of those debates. He stepped up. Your turn.

-- Smoov
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 02, 2012, 06:59:36 PM
From what I've read of your posts, I don't think that's something you know - it's something you've been told.
Actually, by actuaries, one of whom is a family member who passed the first 4 tests without a retest and does complex math in their head that I would have trouble entering into a calculator. Yes, it's true my math sk1llz are limited to some simple algebra at best, and Boolean equations make me cross-eyed.
Get your actuary family member to have a read of the blog post (if they understand pooled bitcoin mining) - I'm sure there will be at least something in there they'll disagree with, and I would like the feedback. I don't know any actuaries. I'd thought they were a myth, something to scare baby statisticians with when they're naughty.

I know you're emotional about this, but that doesn't excuse your accusing me of falsifying data when you don't yet have the knowledge on which to base such a claim.
I speak for and to those who can't do complex math to solve a simple problem. You say emotional, I say concerned, but I guess we are each attempting to portray a perspective.
You didn't answer my main question - are you still ok with so forcefully stating that I'm anti-p2Pool and that I have falsified or skewed data? If you suspect I have not been honest in my analysis, then read the blog post. If you come to something you don't follow, post a message here and I'll try to explain it or provide links. It'll be a good learning experience for lots of people who would like to understand the analyses better.

Quote from: smoovious
So far, I haven't seen any comparable analysis from your side yet. So... put up or shut up?
-- Smoov
So then, a graph, from beginning to current, showing actual v. expected, would be the only counterpoint you would accept, is that right?
Considering my math skill level, that would take a month at least, and that includes the YouTube videos I will need to watch to learn how to use OoCalc. Wink
Here's one I provided in the blog post with a 100-block rolling average (it smooths the graph):
[chart: 100-block rolling mean]
In case you're wondering what effect the rolling mean might have, here's the original data:
[chart: original (unsmoothed) data]
Finally, if you're going to use anything for math software and you know even a little BASIC or Python, I'd try R.
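If anyone wants to reproduce the smoothing above, here's a minimal sketch of a trailing 100-block rolling mean (shares_over_d is assumed to be a list of shares-per-round divided by difficulty; the names are mine):
Code:
def rolling_mean(values, window=100):
    """Trailing rolling mean: one output value per full window of inputs."""
    return [sum(values[i - window:i]) / window
            for i in range(window, len(values) + 1)]

# shares_over_d = [...]           # one entry per round: shares found / difficulty
# smoothed = rolling_mean(shares_over_d, window=100)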


full member
Activity: 196
Merit: 100
Web Dev, Db Admin, Computer Technician
July 02, 2012, 03:54:51 PM
From what I've read of your posts, I don't think that's something you know - it's something you've been told.
Actually, by actuaries, one of whom is a family member who passed the first 4 tests without a retest and does complex math in their head that I would have trouble entering into a calculator. Yes, it's true my math sk1llz are limited to some simple algebra at best, and Boolean equations make me cross-eyed.

So I'm encouraging you to learn about statistical analyses. http://oli.cmu.edu/courses/free-open/statistics-course-details/ is good, and assumes no prior stats knowledge on your part.
I appreciate the learning link, this will be handy in the future.

I know you're emotional about this, but that doesn't excuse your accusing me of falsifying data when you don't yet have the knowledge on which to base such a claim.
I speak for and to those who can't do complex math to solve a simple problem. You say emotional, I say concerned, but I guess we are each attempting to portray a perspective.

Quote from: smoovious
So far, I haven't seen any comparable analysis from your side yet. So... put up or shut up?

-- Smoov
So then, a graph, from beginning to current, showing actual v. expected, would be the only counterpoint you would accept, is that right?
Considering my math skill level, that would take a month at least, and that includes the YouTube videos I will need to watch to learn how to use OoCalc. Wink
donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 01, 2012, 07:09:38 PM
@ all the p2Pool miners that posted me data:

Thanks for all your help! Unfortunately, the data is insufficient and in some cases measures different things.

As much as I would have liked to write a follow-up from an earnings point of view, there were too few who collected actual numerical data, and I had to make too many guesses with the public node data. I'm just not very confident about the result.

If anyone wants to start collecting data now, I need:

timestamp* | block number | valid shares submitted for block | local average earnings for block | difficulty for block*

* = preferred data but not as essential as the other data listed.


If I can get at least half the pool hashrate doing this for two months, I'll have data with a much narrower and more useful 95% confidence interval.

check_status, I'd like you to be involved. If I show you what to do with the data once it's collected, you might gain more insight into probability and statistics, and you can be confident that the results are correct, since you will have collected some of the data and performed the analysis yourself as well.
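For anyone who wants to start collecting now, here's a minimal sketch of a logger for exactly those fields (the column names, file name and example values are mine, purely for illustration):
Code:
import csv, time

FIELDS = ["timestamp", "block_number", "valid_shares", "local_avg_earnings", "difficulty"]

def log_block(row, path="p2pool_blocks.csv"):
    """Append one record per block, in the column order listed above."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # new file: write the header first
            writer.writeheader()
        writer.writerow(row)

# example call with made-up values
log_block({"timestamp": int(time.time()), "block_number": 190000,
           "valid_shares": 1234, "local_avg_earnings": 0.05, "difficulty": 1700000})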



donator
Activity: 2058
Merit: 1007
Poor impulse control.
July 01, 2012, 07:31:21 AM
I know that statistics can be skewed to promote a particular view.

From what I've read of your posts, I don't think that's something you know - it's something you've been told. So I'm encouraging you to learn about statistical analyses. http://oli.cmu.edu/courses/free-open/statistics-course-details/ is good, and assumes no prior stats knowledge on your part.

As far as skewing data goes, I did the same analysis as always. If there was a problem with it, other analyses would have shown the same problem. Also, forum members better at math than I am would certainly have let me know.

I know you're emotional about this, but that doesn't excuse your accusing me of falsifying data when you don't yet have the knowledge on which to base such a claim. Get some learning, then start showing me where the analysis is wrong.

