
Topic: [~1000 GH/sec] BTC Guild - 0% Fee Pool, LP, SSL, Full Precision, and More - page 14. (Read 379084 times)

legendary
Activity: 1750
Merit: 1007
BTC Guild will likely be offering a PPS option in the near future for those who do not like variance, in a similar vein with Deepbit.  I have not decided if PPS will be included on the primary BTC page, or if I want to handle it on a forked page (like pps.btcguild.com).  It will be TRUE PPS, not SMPPS, so the pool's luck will never affect the rates/cause backpay situations.  All donator perks on BTC Guild will be automatically granted to PPS users.

The fee percentage hasn't been decided on yet, I'm going to run the numbers on how low the pool's luck has been since inception as a percentage of expectation [including invalid blocks] to determine how high my risk is based on our current relatively large historical sample size.  Just like Deepbit, over the long run you'll make more with standard proportional, but PPS is a nice insurance level where you have -guaranteed- revenue over any period of time, rather than a "close to average revenue" over a sufficiently long period of time.
legendary
Activity: 1750
Merit: 1007
"When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, given what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates."

If you're not busy, I'm a bit curious to hear about that.

Also, I had 0.5 i0coins on my account and couldn't ever get them out (even after I saw the warning) since it was below the payout threshold of 1.0.  (Not that half an i0coin is worth anything anyway.)

Keep up the excellent work good sir.

My understanding is paraphrased from what I was told by ArtForz, so if I butcher the explanation I hope somebody smarter than me can correct my understanding.

I0Coin put in a time-based difficulty adjustment of 1 week, so if the network died as much as it has, the difficulty would drop without waiting for a ton of blocks that may never get made.  This check SHOULD be run based on the timestamp of the most recent block, so that individual system clock variances don't affect how the node determines what the difficulty should be dropped to.  This wasn't done.  If you turn on a fresh node today, it tries to adjust the difficulty before the chain is even accepted, refusing the valid block chain generated thus far.

Once 7 days have elapsed since the previous difficulty increase (which was from blocks, not time), the nodes that were already up and running with the block chain will likely have variances in the difficulty calculation because they may not be on and checking the NTP server at the exact time the change will occur.  This means all the i0coind nodes out there will have variances in what they think the valid difficulty is, refusing blocks from any node that does not agree.
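A minimal sketch of the bug as described above. This is illustrative only (made-up function names and parameters, not I0Coin's actual code): the buggy form consults the local wall clock, so nodes with different clocks, or freshly syncing nodes, disagree; the fixed form consults only data in the chain itself.

```python
import time

# Hypothetical retarget window for illustration: one week, in seconds.
RETARGET_WINDOW = 7 * 24 * 3600

def next_difficulty_buggy(current_difficulty, last_retarget_time):
    """Buggy form: compares against the LOCAL wall clock, so every node
    (including a fresh node replaying old blocks) can reach a different answer."""
    if time.time() - last_retarget_time > RETARGET_WINDOW:
        return current_difficulty / 2  # drop difficulty
    return current_difficulty

def next_difficulty_fixed(current_difficulty, last_retarget_time, last_block_time):
    """Fixed form: compares against the previous block's timestamp, which is
    part of the chain itself, so all nodes reach the same answer deterministically."""
    if last_block_time - last_retarget_time > RETARGET_WINDOW:
        return current_difficulty / 2
    return current_difficulty
```

Because the fixed version depends only on chain data, two nodes with skewed clocks still agree on whether the drop has triggered, which is exactly the property the buggy version lacks.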
sr. member
Activity: 418
Merit: 250
"When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, given what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates."

If you're not busy, I'm a bit curious to hear about that.

Also, I had 0.5 i0coins on my account and couldn't ever get them out (even after I saw the warning) since it was below the payout threshold of 1.0.  (Not that half an i0coin is worth anything anyway.)



Keep up the excellent work good sir.
legendary
Activity: 1750
Merit: 1007
According to block explorer, block 142300 was found at 20:49:38 but btcguild claimed to find this one at 19:18:15. Is there any problem with your server?

Sorry, I thought I posted this on the forums as well but apparently I only talked about it on IRC [it occurred while I was at work]:

There was a small problem due to closing I0 Guild.  The code is literally a copy of BTC Guild, with an added flag at run time which tells it not to sync round data with the master server.  When I shut down I0 Guild, I realized that the payouts would likely not make it to the exchange in time, given what looks like a bug that will completely destroy the I0Coin fork until it's patched and everybody updates.  So I turned the pool server back on to help push the coins to the exchange for the final payouts.

The no-sync flag was not included, so when I0 Guild found an i0 block, it pushed a new round to the master server, which caused some of round 2547's shares to be logged under round 2548.  I caught it at 2549.  The total time shift was about 1 hour 30 minutes, hence the time differences on 2547/2548 vs. Block Explorer.  At that point 2547 had already been calculated.  I decided to leave the shares alone rather than risk duplicating or not counting some of them.  The effect on payouts would have been negligible, and altering rewards that already showed up as confirmed is something I would only do if there were a major problem.


I've gone in and corrected the end/start times of those rounds to keep them in line with the actual times per Block Explorer.  It makes the estimated hash rates for those blocks a bit off, but it keeps our database in line with actual block data, which is more important than things like estimated hash rates.  The problem that happened at I0 Guild is one that can't be repeated on SC Guild since it is on a different physical server that cannot access the MySQL servers on BTC Guild.  Future forks if they arise will also be launched on non-BTC Guild servers both as a precaution, and to avoid putting unrelated load on the BTC Guild servers.
newbie
Activity: 23
Merit: 0
According to block explorer, block 142300 was found at 20:49:38 but btcguild claimed to find this one at 19:18:15. Is there any problem with your server?
legendary
Activity: 1750
Merit: 1007
My btcguild Account page shows two blocks found, but the API page shows 0 blocks found for all workers.  I have not added/edited workers during the time when the most recent block was found (not 100% about the 1st block) so it shouldn't be because the worker that found it no longer exists.  Any ideas?

This is something I will be fixing soon.  The way block finders are recorded was drastically changed from the original code back when we were in our first 200 rounds, due to splitting up the servers.  The API still tries to pull that information from old data.
newbie
Activity: 51
Merit: 0
My btcguild Account page shows two blocks found, but the API page shows 0 blocks found for all workers.  I have not added/edited workers during the time when the most recent block was found (not 100% about the 1st block) so it shouldn't be because the worker that found it no longer exists.  Any ideas?
sr. member
Activity: 404
Merit: 250
I was asked to comment on this issue.  My claim is that Vlad's analysis is incorrect.  Since many of the people who read this thread are statistical laymen, I'm going to walk through this step by step.

First off, approximating the distribution with one huge encompassing Poisson distribution is not the best choice in this scenario.  A much better choice would be to use the central limit theorem to approximate this binomial density, separated by difficulty.  The Poisson is only valid under certain criteria, while the central limit theorem holds whenever n is large.


I will show my work for one iteration of this procedure and produce the results for the rest.  Taking difficulty 434877 as the example to go off:


// Notes - bolded letters are estimates of what the true value should be (estimates taken from Vlad's sheet)
// n - sample size
// p - estimated probability of success
// p = x/n, where x is the number of successes (estimated blocks found)

Y ~ Binomial(n,p)

n = 2016
p = 135.705/2016 = 0.0673

Y ~ Binomial(2016,0.0673)


From the central limit theorem, we know that a Binomial of sufficient sample size will follow a normal with mean n*p and variance of n*p*q.  Therefore in our example our distribution becomes the following

Y ≈ Normal(n*p, n*p*q)
Y ≈ Normal(135.677, 126.546)

To calculate the probability of observing at most some value of Y (the actual blocks found), all that's left is to convert our normal distribution to a standard normal and look up the p-value.

P(Y ≤ 134)
= P(Z ≤ (134-135.677)/sqrt(126.546))
= P(Z ≤ -0.149)
= 0.440

What does this value mean?  It means we are 44% likely to see a value this extreme or more at this difficulty (434877), which is completely acceptable.

A few things to take with a grain of salt: we used a point estimate for p, when in fact p changes quite a lot within each difficulty period as hashing power comes and goes.  I've also noted on the spreadsheet occurrences that might indicate something odd happening, explainable by DDoS attacks or other systematic errors fixed by later patches.

If the original value Vlad stated were true, I would be concerned.  Thankfully that is not the case, and I hope everyone can see the sense and reasoning posted here.

The remaining p-values are below for convenience and the sheet that I used to calculate said values is linked:

https://spreadsheets.google.com/spreadsheet/ccc?key=0AoAyWRmssbLKdHduLURqdENHckw0SzRNX3JhN3ZKV2c&hl=en_US

Difficulty  | P-value | Verdict
434877.04   | 0.439   | Nothing wrong at all
567269.53   | 0.000   | Check for DDoS/other systematic errors
876954.49   | 0.449   | Nothing wrong at all
1379192.28  | 0.341   | Nothing wrong at all
1563027.99  | 0.040   | Statistical anomaly (4% chance)
1690895.8   | 0.331   | Nothing wrong at all
1888786.7   | 0.001   | Check for DDoS/other systematic errors
1805700.83  | 0.720   | Nothing wrong at all

Before I say this, I just want to say that I completely trust Eleuthria, and I do not think that he is gaming the pool. I have chatted with him many times on IRC, and though I don't mine here anymore, it is only because it is proportional.

Anyway, I am not sure I follow the logic of breaking it out by individual difficulty. If there was theft, it would show up as everything being slightly lower, and only when you add the periods together would you get something significant. I added everything back together and get a 0.02% chance it was caused by luck. That combined figure really is the one to use.

I do believe that this was caused by technical issues coupled with legitimate bad luck, and not theft.

As an aside, this analysis only covers the pool hiding blocks found by bitcoind from its miners. It won't detect fake workers.
newbie
Activity: 21
Merit: 0
I was asked to comment on this issue.  My claim is that Vlad's analysis is incorrect.  Since many of the people who read this thread are statistical laymen, I'm going to walk through this step by step.

First off, approximating the distribution with one huge encompassing Poisson distribution is not the best choice in this scenario.  A much better choice would be to use the central limit theorem to approximate this binomial density, separated by difficulty.  The Poisson is only valid under certain criteria, while the central limit theorem holds whenever n is large.


I will show my work for one iteration of this procedure and produce the results for the rest.  Taking difficulty 434877 as the example to go off:


// Notes - bolded letters are estimates of what the true value should be (estimates taken from Vlad's sheet)
// n - sample size
// p - estimated probability of success
// p = x/n, where x is the number of successes (estimated blocks found)

Y ~ Binomial(n,p)

n = 2016
p = 135.705/2016 = 0.0673

Y ~ Binomial(2016,0.0673)


From the central limit theorem, we know that a Binomial of sufficient sample size will follow a normal with mean n*p and variance of n*p*q.  Therefore in our example our distribution becomes the following

Y ≈ Normal(n*p, n*p*q)
Y ≈ Normal(135.677, 126.546)

To calculate the probability of observing at most some value of Y (the actual blocks found), all that's left is to convert our normal distribution to a standard normal and look up the p-value.

P(Y ≤ 134)
= P(Z ≤ (134-135.677)/sqrt(126.546))
= P(Z ≤ -0.149)
= 0.440

What does this value mean?  It means we are 44% likely to see a value this extreme or more at this difficulty (434877), which is completely acceptable.

A few things to take with a grain of salt: we used a point estimate for p, when in fact p changes quite a lot within each difficulty period as hashing power comes and goes.  I've also noted on the spreadsheet occurrences that might indicate something odd happening, explainable by DDoS attacks or other systematic errors fixed by later patches.

If the original value Vlad stated were true, I would be concerned.  Thankfully that is not the case, and I hope everyone can see the sense and reasoning posted here.
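For anyone who wants to reproduce the arithmetic, the normal-approximation p-value above needs nothing beyond the standard library. This is a sketch of the same calculation (full-precision intermediates, so it lands a hair off the post's rounded 0.440; `scipy.stats.binom.cdf` would give the exact binomial tail):

```python
from math import erf, sqrt

def binom_pvalue_normal(n, p, observed):
    """Lower-tail p-value P(Y <= observed) for Y ~ Binomial(n, p),
    using the central limit theorem normal approximation."""
    mean = n * p
    var = n * p * (1 - p)
    z = (observed - mean) / sqrt(var)
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

# Difficulty 434877 example from the post: 135.705 blocks expected in 2016, 134 found.
p_val = binom_pvalue_normal(2016, 135.705 / 2016, 134)
print(p_val)  # ≈ 0.44, in line with the post's 0.440
```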

The remaining p-values are below for convenience and the sheet that I used to calculate said values is linked:

https://spreadsheets.google.com/spreadsheet/ccc?key=0AoAyWRmssbLKdHduLURqdENHckw0SzRNX3JhN3ZKV2c&hl=en_US

Difficulty  | P-value | Verdict
434877.04   | 0.439   | Nothing wrong at all
567269.53   | 0.000   | Check for DDoS/other systematic errors
876954.49   | 0.449   | Nothing wrong at all
1379192.28  | 0.341   | Nothing wrong at all
1563027.99  | 0.040   | Statistical anomaly (4% chance)
1690895.8   | 0.331   | Nothing wrong at all
1888786.7   | 0.001   | Check for DDoS/other systematic errors
1805700.83  | 0.720   | Nothing wrong at all
legendary
Activity: 1148
Merit: 1001
Radix-The Decentralized Finance Protocol
OK, that's just my 2c worth, but, well, the mining community is seriously full of superstition and I think a level minded comment is needed.

Judging by the reaction, I'd say the mining community is very mature.
kjj
legendary
Activity: 1302
Merit: 1026
I would like to add that statistics is the most difficult branch of mathematics that most people are even aware of.  In general, if you can follow a statistical argument, it is almost certainly wrong.

His analysis probably isn't very wrong, other than the way he stacked all of the gray areas against the pool, but there are some questionable areas.

For example, finding a block isn't really a function of the number of shares found.  Shares and blocks are just two different thresholds on the same probability distribution.  And hashes aren't counted, they are estimated.  I don't know that it is necessarily wrong to use a probability distribution as the interval input to the Poisson function, but I can promise that if you get a number out of it, and not another probability distribution, you've missed some important steps.

At any rate, he is talking about a 5.6 to 6.5% shortfall.  That's not nearly enough for me to worry about.
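kjj's point that shares and blocks are two thresholds on the same distribution can be made concrete. This sketch uses Bitcoin's approximate difficulty-1 target; the function is illustrative, not actual pool code:

```python
# Approximate difficulty-1 target used by Bitcoin (0x1d00ffff compact form).
MAX_TARGET = 0xFFFF * 2**208

def classify(hash_value, difficulty):
    """A hash at or below the share target counts as a share; at or below the
    (much smaller) block target it is also a block.  Same random variable,
    two different cutoffs."""
    share_target = MAX_TARGET                     # pools accept difficulty-1 shares
    block_target = MAX_TARGET // int(difficulty)  # network target at this difficulty
    return hash_value <= share_target, hash_value <= block_target

# Every block-quality hash is automatically share-quality, never the reverse:
print(classify(MAX_TARGET // 2, 1805700))   # (True, False) — a share, not a block
print(classify(0, 1805700))                 # (True, True)  — a block (and a share)
```

This is also why a block is expected roughly once per `difficulty` shares: the block cutoff is `difficulty` times stricter than the share cutoff.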
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
Vladimir has alleged that ele has 'stolen' 8,000 BTC from the guild miners since 1 June 2011...

https://bitcointalksearch.org/topic/m.474649

Not saying I agree with him, but I wouldn't mind someone debunking his methodology...

Well firstly, I went and visited that link and had a bit to say about the comment about hopping ...

But I will also add this:
MINING IS STATISTICAL

Many people REALLY do not understand this nor what it means.

It means that % will go up and down.

I could point out that for most of the past week I have been mining here almost continuously (I did a day or so of I0 mining) and been paid MORE than the expected BTC
(anyone can work out their own expected BTC using the calculator in my sig: http://tradebtc.net/bitcalc.php )

So should someone now be saying OMG I'm ripping off BTCGuild?

I don't hop because I know it makes no difference (as I said, I've been mining here for most of a week), and unless someone has come up with some bizarre mathematics for another way to beat a share-percentage-paying pool, the reason I'm getting more than my expected BTC is: probabilities, of course.

Yes, BTCGuild could be putting in fake members and stealing shares that way, but again it all boils down to percentages, and they already charge a percentage (and miners already lose a little on the invalid share bit). Otherwise, there really is no reason to even consider the rest of the scams listed on that page for a big pool like BTCGuild.

OK, that's just my 2c worth, but, well, the mining community is seriously full of superstition and I think a level minded comment is needed.
legendary
Activity: 1750
Merit: 1007
Really nothing I can do at this point.  You can't disprove a negative: you can prove someone stole, but you can't prove someone didn't steal.  I can either sit here arguing or ignore it, and I'm choosing the latter.  The last time I responded, I gave our stats over the last few difficulties that I had recorded and displayed.  Using Vladimir's method with a Poisson distribution calculator, they showed about a 20% probability of luck as bad as ours over the recorded difficulties.  I didn't go further into the past because I did not track difficulty/luck at previous levels, so I would have had to hunt down the proper block numbers and calculate our luck one at a time, at a moment when I was trying to respond to the witch hunt as quickly as possible.
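The kind of Poisson check described here is straightforward to reproduce. A sketch, with hypothetical block counts (the 150 expected / 135 found below are placeholders, not the pool's actual figures):

```python
from math import exp

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), summed term by term."""
    term = exp(-lam)          # P(X = 0)
    total = term
    for i in range(1, k + 1):
        term *= lam / i       # P(X = i) from P(X = i - 1)
        total += term
    return total

# Hypothetical: 150 blocks expected over some window, 135 actually found.
# The result is the chance of luck this bad or worse, purely by variance.
print(poisson_cdf(135, 150.0))
```

For large expected counts this agrees closely with the normal approximation used elsewhere in the thread; for small counts the Poisson sum is the safer of the two.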

All I know is there were a LOT of software problems between June and July, which was when we were massively expanding (peaking at 8 servers).  There were times where servers ran with duplicate wallet files (meaning they were [possibly] hashing the same headers/duplicating work).  There were times where the pools were going up/down regularly, which I believe caused them to resend the same getwork space a few times thus awarding shares to duplicate work due to the crashes.  And there were the JoelKatz patches trying to keep up with scaling the servers which had a number of issues before becoming stable.

At this point I'll just let http://l0ss.net do the talking, which is monitoring the luck for 6 of the largest pools, now with nearly 1 month of time included.  We swing up, we swing down, and I have no desire to make my life a living hell by refreshing our block page every 15 minutes to see which way we're swinging.

EDIT:  I can definitely say his method of adding shares from invalid blocks to other blocks skews results further in his favor, since our invalid rate in that time was much higher due to server attacks, botnets making our connections unreliable, and the problems implementing JoelKatz's patches.  It still looks bad, but as I said at the start: it's not possible to disprove the assertion.
full member
Activity: 196
Merit: 100
Vladimir has alleged that ele has 'stolen' 8,000 BTC from the guild miners since 1 June 2011...

https://bitcointalksearch.org/topic/m.474649

Not saying I agree with him, but I wouldn't mind someone debunking his methodology...
legendary
Activity: 2072
Merit: 1001
Oh god.  Now solidcoins.  What's next ??

I am not sure there will be any next ones. After three different scam artists have come out with new clones of bitcoin, people are going to run out of steam for them. There are only so many greater fools in the world with BTC they want to trade for a forked clone's coins. I expect the idea will settle down for a while as these fade away into obscurity.

but on the topic of this pool mining BTC.. the pool is on fire. Great times.
sr. member
Activity: 418
Merit: 250
Oh god.  Now solidcoins.  What's next ??
hero member
Activity: 784
Merit: 1009
firstbits:1MinerQ
Ok. I mistakenly thought a share was a fixed portion of work we did, and that only at the block level was it probabilistic.

Seems the BTC calculators project 0.334 BTC/day whereas I just got 0.289 in reality.
On BitClockers I had three days below 0.190/day. I switched here as I figured BTCGuild, finding blocks more quickly due to its larger size, would smooth out the variance more and provide averages closer to the projected values.

If the shares are also statistical then I guess I should expect some days that are also above the 0.334 projection.
(value based on 600 MH/s).
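As a sanity check on those projections, expected earnings follow directly from the difficulty. A sketch, assuming the then-current 50 BTC block reward and ignoring pool fees and invalid shares; the difficulty value is the last one quoted in the table earlier in the thread:

```python
def expected_btc_per_day(hashrate_hps, difficulty, block_reward=50.0):
    """Expected long-run earnings: each hash has a 1 / (difficulty * 2^32)
    chance of solving a block, so the expectation scales linearly with hash rate."""
    hashes_per_day = hashrate_hps * 86400
    blocks_per_day = hashes_per_day / (difficulty * 2**32)
    return blocks_per_day * block_reward

# 600 MH/s at difficulty 1805700.83:
print(expected_btc_per_day(600e6, 1805700.83))  # ≈ 0.334 BTC/day
```

That ≈0.334 BTC/day matches the calculator projection quoted above; actual daily payouts scatter around it for exactly the variance reasons discussed in this exchange.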
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
A share isn't a fixed amount of work.

There is a statistically expected number of shares in each work unit.

For a calculation example, just assume there is expected to be 1 share per 2 GHashes.
If you instead found 2 shares in every 2-GHash work unit, the pool would estimate your hash rate at twice what it really is.
So, depending on how often your hash rate is recalculated from your share acceptance, the estimate is normally expected to vary.

In case it's not obvious, a block is just a share that has higher difficulty.

Comparison: there is currently expected to be one block per 7,755,544.378 GHashes.  However, blocks don't appear exactly every 10 minutes; one pair of blocks this morning (141,798 & 141,798) were 42 minutes apart.  If you used the gap between those two blocks to estimate the total network hash rate over those 42 minutes, your estimate would most likely be about 1/4 of what it really was.
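The estimation logic in that comparison can be written down directly. A sketch (not pool code) of the naive events-per-time estimator and the 42-minute example from the post:

```python
def estimated_hashrate(events, elapsed_seconds, hashes_per_event):
    """Estimate a hash rate from how many shares (or blocks) were observed.
    Noisy, because the event count is itself a random variable."""
    return events * hashes_per_event / elapsed_seconds

# Post's example: roughly one block expected per 7.7555e15 hashes.
# A rate that would average one block per 10 minutes:
true_rate = 7.7555e15 / 600
# Naive estimate from a single block that took 42 minutes to arrive:
naive = estimated_hashrate(1, 42 * 60, 7.7555e15)
print(naive / true_rate)  # ≈ 0.238, i.e. about 1/4 of the true rate
```

The same formula with `hashes_per_event = 2**32` is essentially how a pool turns a miner's accepted shares into a displayed hash rate, which is why that number bounces around.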
hero member
Activity: 784
Merit: 1009
firstbits:1MinerQ
I'm talking about my own hash rate, not the pool estimate. Isn't a share a fixed amount of work, i.e. a fixed number of hash calculations?  (That's a real question, as I'm not sure.) I thought that for each share I submit I do 2^32 hashes (or something like that), and given that the server knows the time between submitted shares, shouldn't it also know the exact hash rate I maintain? But it always reports varying rates, different from what the miner reports.
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
Any pool estimate of hash rate is based on your share acceptance rate.

Since finding shares is random and thus is only a statistically expected value, the pool's estimate of your hash rate will of course vary and may even vary widely.

Once you go from a single miner's sample to the (statistical) population of the whole pool, the total shown by the pool is quite accurate.

There is no obvious solution to this since the only way for a pool to know your real hash rate would be for your software to supply it.
Then if people were to not report that correctly, the pool's estimate of their full hash rate would come in to doubt.

Since the pools report their total hash rate and using share rate is actually quite accurate over the full population, it is certainly best to use that.