
Topic: variance in block times --- std deviation (Read 1846 times)

legendary
Activity: 1456
Merit: 1081
I may write code in exchange for bitcoins.
April 27, 2015, 10:06:18 PM
#22
- snip -
I think I understand your question as "how can they always target 10, since adding or removing a requirement for a zero moves the difficulty target exponentially".  I would also like to know the answer to this,
- snip -

We don't just add or remove a zero.

Adding or removing a zero was an example in a concept document about how a difficulty could be adjusted.  It does not describe how the difficulty adjustment was actually implemented once the software was written.

Instead, the difficulty target is simply a 256 bit number.  Any hash value that is less than the target is valid regardless of how many zeros there are.

Now in practice, a number that is less than the target will never have fewer leading zeros than the target has, but it might have exactly the same number of zeros if the rest of the hash keeps it below the target.

I think that pretty much explains it.  Thanks Danny Hamilton!
legendary
Activity: 924
Merit: 1132
It isn't about a particular number of leading zeros; it's about getting under a particular target value.

Satoshi's code from 2008 had an 'nBits' variable which specified the number of leading zeros the hash had to have; that would allow only doubling or halving the difficulty.  But by the time it deployed, he had changed it to the current scheme to allow finer-grained adjustments.  I think the revision was Hal Finney's idea.

The target value these days is encoded in 32 bits: a 24-bit mantissa scaled by an 8-bit exponent (acting as a power of 256), which is then expanded into a 256-bit integer - the set of representable values is the same as an unsigned 24-bit integer times 2 raised to some power up to 232.

Short version of the story: the difficulty target goes up or down with a granularity of about 1 part in 8 million, so it's pretty sensitive in terms of adjustability.
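As a minimal sketch of that encoding (this follows the widely documented "nBits" compact format; the function name is just for illustration, not code from this thread):

```python
# Sketch of the 32-bit compact difficulty encoding described above:
# an 8-bit exponent byte and a 23-bit mantissa expand to a 256-bit target.
def compact_to_target(nbits: int) -> int:
    """Expand a compact 'nBits' value into a full-width integer target."""
    exponent = nbits >> 24            # high byte: target length in bytes
    mantissa = nbits & 0x007FFFFF     # bit 0x00800000 is a sign flag
    return mantissa << (8 * (exponent - 3))

# The well-known minimum-difficulty target:
print(hex(compact_to_target(0x1D00FFFF)))
```

The 0x1D00FFFF example expands to the familiar target beginning 0x00000000FFFF...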
newbie
Activity: 21
Merit: 2
Cheers for that Danny.

Btw, would a real-time difficulty adjustment be feasible in order to guarantee a 10-12 minute range, for instance?  I mean from an architectural perspective?

I guess not, since it would probably have been done already.  But it would have advantages, I take it.


sr. member
Activity: 362
Merit: 262
The other thing with the exponential distribution is it has no maximum value.

So you could potentially have a long wait (but with low probability).

Here are some more probabilities:
minutes   P(wait > t)   odds (1 in)
 10        36.7879%           2.7
 20        13.5335%           7.4
 30         4.9787%          20.1
 40         1.8316%          54.6
 50         0.6738%         148.4
 60         0.2479%         403.4
 70         0.0912%        1096.6
 80         0.0335%        2981.0
 90         0.0123%        8103.1
100         0.0045%       22026.5
110         0.0017%       59874.1
120         0.0006%      162754.8

So the chance of a block taking longer than 2 hours is 0.0006% (or 1 in 162,755 blocks).

Assuming difficulty and hash rate are in perfect balance and stable.
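The table above can be reproduced directly from the exponential survival function (a sketch, assuming the idealized stable 10-minute mean):

```python
# Reproduce the table above: with exponentially distributed block gaps
# of mean 10 minutes, P(wait > t) = exp(-t / 10).
import math

MEAN = 10.0  # average minutes between blocks

for t in range(10, 130, 10):
    p = math.exp(-t / MEAN)
    print(f"{t:3d} min  {100 * p:8.4f}%   1 in {1 / p:.1f}")
```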
legendary
Activity: 3472
Merit: 4801
- snip -
I think I understand your question as "how can they always target 10, since adding or removing a requirement for a zero moves the difficulty target exponentially".  I would also like to know the answer to this,
- snip -

We don't just add or remove a zero.

Adding or removing a zero was an example in a concept document about how a difficulty could be adjusted.  It does not describe how the difficulty adjustment was actually implemented once the software was written.

Instead, the difficulty target is simply a 256 bit number.  Any hash value that is less than the target is valid regardless of how many zeros there are.

Now in practice, a number that is less than the target will never have fewer leading zeros than the target has, but it might have exactly the same number of zeros if the rest of the hash keeps it below the target.
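A toy sketch of the validity rule described above (not Bitcoin source code; the header bytes are made up): a hash is valid iff, read as an integer, it is below the target. Leading zeros are just a side effect of that comparison.

```python
# Validity check sketch: double SHA-256 the header, interpret the digest
# as a little-endian integer (as Bitcoin does), and compare to the target.
import hashlib

def meets_target(header: bytes, target: int) -> bool:
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "little") < target

print(meets_target(b"example header", 2 ** 256))  # True: every hash qualifies
print(meets_target(b"example header", 1))         # False: only an all-zero hash would
```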
legendary
Activity: 1456
Merit: 1081
I may write code in exchange for bitcoins.
We can work out using http://en.wikipedia.org/wiki/Exponential_distribution#Quantiles that:
99% of blocks will be found within 46 minutes, which means 1 in 100 blocks will take longer than that.

This is assuming a stable hash-rate/difficulty.

I'm sure I'm missing something, but isn't that equation based on continuous parameters, whereas the number of leading zeros is always discrete?  So what happens when, say, 2 leading zeros at a given hashrate would lead to 8-minute block times but 3 leading zeros would lead to 16 minutes?  You can't require 2.5 zeros.  Sorry for my dumbness.
sr. member
Activity: 362
Merit: 262
We can work out using http://en.wikipedia.org/wiki/Exponential_distribution#Quantiles that:
99% of blocks will be found within 46 minutes, which means 1 in 100 blocks will take longer than that.  This is assuming a stable hash-rate/difficulty.

Might help with the original query.
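The quantile calculation cited above can be sketched in a couple of lines (assuming the idealized 10-minute exponential mean; the function name is illustrative):

```python
# Exponential quantile: the time by which a fraction p of blocks has
# been found is -ln(1 - p) * 10 minutes.
import math

def block_time_quantile(p: float, mean_minutes: float = 10.0) -> float:
    return -math.log(1.0 - p) * mean_minutes

print(block_time_quantile(0.99))  # ~46 minutes
print(block_time_quantile(0.50))  # ~6.93 minutes, the median
```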
legendary
Activity: 1456
Merit: 1081
I may write code in exchange for bitcoins.
I have a question complementing this thread.
I have copied a chunk from the whitepaper:

Quote
The average work required is exponential in the number
of zero bits required and can be verified by executing a single hash.
For our timestamp network, we implement the proof-of-work by incrementing a nonce in the
block until a value is found that gives the block's hash the required zero bits

Every 2 weeks the difficulty is adjusted to keep the 10-minute average in place, but how can this be guaranteed, since adding or subtracting a nonce will change the difficulty, and thus the average, exponentially?
Is there no risk the average will go from, let's say, 8 minutes to 16 minutes?

I think I understand your question as "how can they always target 10, since adding or removing a requirement for a zero moves the difficulty target exponentially".  I would also like to know the answer to this, but to clarify the quote from the whitepaper: the nonce they're incrementing is just a value included in the block in order to change the block's hash.  They keep incrementing the nonce until they find a block hash that satisfies the difficulty requirements.  But the "nonce" is not the same thing as the leading zeros themselves.  I await an answer to your actual question from someone smarter than me :)
newbie
Activity: 21
Merit: 2
I have a question complementing this thread.
I have copied a chunk from the whitepaper:

Quote
The average work required is exponential in the number
of zero bits required and can be verified by executing a single hash.
For our timestamp network, we implement the proof-of-work by incrementing a nonce in the
block until a value is found that gives the block's hash the required zero bits

Every 2 weeks the difficulty is adjusted to keep the 10-minute average in place, but how can this be guaranteed, since adding or subtracting a nonce will change the difficulty, and thus the average, exponentially?
Is there no risk the average will go from, let's say, 8 minutes to 16 minutes?
legendary
Activity: 1456
Merit: 1081
I may write code in exchange for bitcoins.
Block finding is just a Poisson process; the variance of the intervals is well defined: 1/(lambda)^2, and thus the std dev is the same as the mean.  Observed empirical variance will be slightly different due to shot noise, changes in hashrate, etc., but close enough.

It's important to keep in mind that the variance is a necessary component; without variance the network would never converge again after a fork.

Thanks gmaxwell, this definitely answers my question concretely.  I can look up the Poisson process on Wikipedia or something to know more.

Std Deviation is 10 minutes, but it's asymmetric.  Obviously, the odds of the next block arriving before the current one are zero (although timestamps sometimes lie to the contrary), but the odds of a block taking 20 minutes, 30 minutes, etc., are positive.

It's got what I think of as a "half-life."  I'm not entirely sure how long the half-life is but I think it's something around 7 minutes.  So I'd expect a distribution something like: it's ALWAYS (how long since the last one doesn't matter) got a 50% chance of arriving within the next 7 minutes, 25% chance of arriving between 7 and 14 minutes, 12.5% chance of arriving between 14 and 21 minutes, etc. 

But I'm not entirely sure how long the "half-life" period is.

Thanks cryddit, your answer makes it really clear why the tail on the right is so much longer than the one on the left (you hit a wall on the left where the previous block happened, and there's no such wall on the right).

Thanks to others as well, I'm going to read some wikipedia articles now and try to get a little smarter.  Cheers!
legendary
Activity: 1246
Merit: 1011
And a weird property of the Poisson distribution is that it doesn't matter how long you've already been waiting, the expected time to the next block is always 10 minutes from now.

Indeed.  This seems to catch many people out.  Given the information: "there is one block on average every 10 minutes", one would naturally expect the age of the current block to have some bearing on the expected time to the next block.

Oddly, absent knowledge of the most recent blocks, the expected age of the current block is also always 10 minutes.  This paradoxically suggests that the average gap between two consecutive blocks is 20 minutes.
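The apparent 20-minute paradox above is the classic inspection paradox, and it checks out numerically. A Monte Carlo sketch (made-up seed and sample count): a uniformly random instant lands in a gap with probability proportional to that gap's length, so the gap it lands in averages E[X^2]/E[X] = 20 minutes.

```python
# Inspection-paradox sketch: exponential gaps with a 10-minute mean.
import random

random.seed(1)
gaps = [random.expovariate(1 / 10) for _ in range(200_000)]
total = sum(gaps)

avg_gap = total / len(gaps)  # ordinary average gap: ~10 minutes

# Length-biased average: the gap a random instant falls into, ~20 minutes.
inspected = sum(g * g for g in gaps) / total

print(avg_gap, inspected)
```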

legendary
Activity: 1162
Merit: 1007
And a weird property of the Poisson distribution is that it doesn't matter how long you've already been waiting, the expected time to the next block is always 10 minutes from now
legendary
Activity: 1246
Merit: 1011
Std Deviation is 10 minutes, but it's asymmetric.  Obviously, the odds of the next block arriving before the current one are zero (although timestamps sometimes lie to the contrary), but the odds of a block taking 20 minutes, 30 minutes, etc., are positive.

It's got what I think of as a "half-life."  I'm not entirely sure how long the half-life is but I think it's something around 7 minutes.  So I'd expect a distribution something like: it's ALWAYS (how long since the last one doesn't matter) got a 50% chance of arriving within the next 7 minutes, 25% chance of arriving between 7 and 14 minutes, 12.5% chance of arriving between 14 and 21 minutes, etc.  

But I'm not entirely sure how long the "half-life" period is.

Yes, this is an apt analogy.

Also, good guess!  The corresponding period is
-log(1 - 50%) * 10 minutes
= log(2) * 10 minutes
= 6 minutes 55.89 seconds (2 d.p.)
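The "half-life" computed above is just the median of an exponential distribution with a 10-minute mean, which is easy to check:

```python
# Median ("half-life") of an exponential distribution with mean 10 min.
import math

half_life_min = math.log(2) * 10
minutes = int(half_life_min)
seconds = (half_life_min - minutes) * 60

print(f"{minutes} min {seconds:.2f} s")  # 6 min 55.89 s
```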
legendary
Activity: 924
Merit: 1132
Std Deviation is 10 minutes, but it's asymmetric.  Obviously, the odds of the next block arriving before the current one are zero (although timestamps sometimes lie to the contrary), but the odds of a block taking 20 minutes, 30 minutes, etc., are positive.

It's got what I think of as a "half-life."  I'm not entirely sure how long the half-life is but I think it's something around 7 minutes.  So I'd expect a distribution something like: it's ALWAYS (how long since the last one doesn't matter) got a 50% chance of arriving within the next 7 minutes, 25% chance of arriving between 7 and 14 minutes, 12.5% chance of arriving between 14 and 21 minutes, etc. 

But I'm not entirely sure how long the "half-life" period is.

staff
Activity: 4284
Merit: 8808
Block finding is just a Poisson process; the variance of the intervals is well defined: 1/(lambda)^2, and thus the std dev is the same as the mean.  Observed empirical variance will be slightly different due to shot noise, changes in hashrate, etc., but close enough.

It's important to keep in mind that the variance is a necessary component; without variance the network would never converge again after a fork.
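The std-dev-equals-mean claim above can be verified numerically with a quick simulation (seed and sample count are arbitrary choices):

```python
# Exponential intervals with rate lambda have variance 1/lambda^2,
# so the standard deviation equals the mean (10 minutes here).
import random
import statistics

random.seed(42)
lam = 1 / 10  # one block per 10 minutes on average
samples = [random.expovariate(lam) for _ in range(100_000)]

print(statistics.mean(samples))   # ~10
print(statistics.stdev(samples))  # ~10
```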
legendary
Activity: 1456
Merit: 1081
I may write code in exchange for bitcoins.
Ok, some kind of sig-ad warfare happened here with CIYAM and someone else.  Anyway, from what I recall from a course taken a long time ago, we calculate variance as the sum of squared differences from the mean divided by the sample size.  Then the square root of that is the std deviation?  I guess I was hoping that one of the whizzes out there would already have this for me (ie, it's a known) rather than me having to look into it.

@CIYAM I think the longest possible time is unbounded.  We can never guarantee that the next block will be found (ie, the true distribution of times between blocks has a tail that stretches toward infinity on one side and hits a hard wall at zero on the other), but I'm not so much interested in the max and min as in an empirical sampling over which we can calculate variance (I'm sure there's some such sampling being done in order to make the difficulty adjustments).

EDIT: @cr1776: thanks for those links.  Some of them have more info than others.  I'm not sure anyone actually did the empirical sampling.  But the consensus seems to be that the distribution is asymmetrical and has a std dev of 10 minutes.  I'd consider my question more or less answered, but I'll leave this open in case anyone does the sampling or has something else to add here.
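The textbook calculation recalled above (sum of squared deviations over the sample size, square root for std dev) looks like this in practice. The sample values are made up for illustration, not real block times:

```python
# Population variance and standard deviation from a sample of intervals.
import math

samples = [8.2, 14.5, 3.1, 22.0, 6.6, 9.9]  # hypothetical intervals, minutes

mean = sum(samples) / len(samples)
variance = sum((x - mean) ** 2 for x in samples) / len(samples)
std_dev = math.sqrt(variance)

print(mean, variance, std_dev)
```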
legendary
Activity: 4214
Merit: 1313
Hi,
Here are some discussions about block times, standard deviations, means, medians, probability densities and distributions.  Instead of re-doing the math, I think these do a reasonable job of explaining it.  But perhaps they will spawn other questions, or not answer it.  ;-)

In regard to the 30 minute comment,  ~95% of the time there is at least one block found every 30 minutes.


http://www.reddit.com/r/Bitcoin/comments/2vocap/what_is_the_standard_deviation_of_time_between/

http://bitcoin.stackexchange.com/questions/25293/probablity-distribution-of-mining

https://bitcointalksearch.org/topic/m.9955875


legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
p.s. I didn't include it at first because those (rising to one hour or more) were only isolated cases

Respect - you didn't make a separate post (sorry, but I do tend to get annoyed by ad-sigs) and yes, such very long confirmation times don't happen very often.

What would be more interesting would be a mathematical explanation for the longest possible time one might have to wait for a confirmation (assuming that the network hashing rate hadn't decreased radically).
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
it would go all the way up to 20 mins, but no more than 30 for sure; it is based on an exponential distribution

Is what you wrote.

Now - if you want to do the honourable thing then edit your original reply rather than making a new post (which would be the "ad-sig" thing).
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
it would go all the way up to 20 mins, but no more than 30 for sure; it is based on an exponential distribution

You clearly have not actually looked at the history as it has taken over an hour to create a new block in the past.

(of course your post was made because of your ad-sig rather than for any actual "real reason" as you didn't even bother doing the slightest bit of research before posting)