I have been spending some time trying to understand how much of what we see on a day-by-day / week-by-week basis is variance, and how much of it is real hash rate being added to or removed from the network.
First, a word of warning... I am no statistician, so all of my observations are just that, with no mathematical foundation to support them. A lot of the thinking and head scratching has been preparation for Phil's guess-the-next-difficulty-increase contest, so thanks to Phil for running it. Anyway, on with the show.
There are plenty of graphs on the net, but having your own helps because you can play with the data and the way it is presented. So here is my graph, very much a work in progress: at each iteration you have some ideas, some of which work and some of which don't. It probably raises more questions than it answers, but it has helped me get to grips with what is happening on a short-term basis.
So, to explain: it covers the current 14-day difficulty period, with the block number across the top and days across the bottom.
The blue line is the block-by-block cumulative hash rate change for the 14-day period, read from the scale on the right, so at the end of the period it will show the overall % increase (a rough sketch of the calculation follows below). The period is not over yet, so don't pay too much attention to the projection...
The purple line is the same, except that it is reset each day, showing just that day's % increase, and is read from the left-hand scale. The first few points of each day are again very wild until the average builds.
The vertical yellow lines are the day markers, and the triangles mark each day's % increase.
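For anyone who wants to build something similar, here is a minimal sketch of the sort of calculation I mean for the blue line. It is only my own rough approach, assuming you have a list of block timestamps for the period; the function and variable names are just illustrative, not any particular tool's API.

```python
# Rough sketch of the cumulative hash rate change behind the blue line.
# Idea: under the current difficulty, blocks "should" arrive every 600 seconds.
# If the blocks found so far in the period have come in faster than that, the
# implied hash rate is above the level the difficulty was set for, and vice versa.

TARGET_SECONDS = 600  # 10-minute block target

def cumulative_change(timestamps):
    """timestamps: block times (in seconds) for the current period, starting
    with the last block of the previous period as the zero reference.
    Returns one cumulative % hash rate change figure per block."""
    start = timestamps[0]
    changes = []
    for n, t in enumerate(timestamps[1:], start=1):
        elapsed = t - start
        expected = n * TARGET_SECONDS
        # blocks arriving faster than expected => implied hash rate above baseline
        changes.append((expected / elapsed - 1.0) * 100.0)
    return changes

# Example: blocks found 9 minutes apart instead of 10 imply roughly +11%
print(cumulative_change([0, 540, 1080, 1620]))
```

The purple line would be the same calculation, just restarted at the beginning of each day, which is why its first few points each day swing about so much.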
I have tried moving averages but chose not to use them, as the data dropping off the back often has more effect than the newer, more interesting data coming in at the front. So here are some of my observations.
There appear to be three phases.
Phase 1, Days 1-5: hash rate drops and then recovers back to where you would expect, around zero.
Phase 2, Days 6-9: hash rate is stable, varying but staying close to zero.
Phase 3, Days 10-14: hash rate increases.
Finally, to kick things off, a few thoughts/possibilities for each of the phases.
Phase 1: So the first question is, is there really a drop-off, or is this just network variance? My feeling is that it is too large to be entirely variance, though some of it will be, as we have very few data points early on, resulting in some wild readings initially, as particularly shown on the first partial day.
So the theory I am going to suggest for Phase 1 is this: if you are managing a large mine and you have some major work to do (removing old miners, making significant infrastructure changes that involve downtime, etc.), then given the choice between doing these tasks at the end of a difficulty period or at the beginning of the next, you would make the most of the lower difficulty and delay the work until the new period.
Phase 2: A period of apparent stability, with small variations in the hash rate which I think can be explained as normal network variance.
Phase 3: Around Day 10, hash is added to the network. Yes, of course there is variance in this, but looking at the daily figure it indicates that as much as 20% has been added to the network hash rate, which then, of course, remains there each day until the end of the period.
Finally, a thought that to some degree contradicts the above, and may be blindingly obvious to some, but was not to me...
If the network was otherwise stable and hash was added at the beginning of a period, say 12%, then that would result in a 12% increase in the measured hash rate for that period, and when the difficulty change was made we would be back at zero.
However, and here is where it contradicts Phase 1: if the 12% was added halfway through the 14-day period, then the average hash rate the retarget sees over the whole period is only about 6% higher, so it would only result in a 6% increase in difficulty.
Finally, and here's the key point: you would be carrying an uncorrected 6% or so into the next difficulty period to kick it off.
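To put rough numbers on that, here is my own back-of-the-envelope arithmetic, assuming the network is otherwise perfectly stable and ignoring variance entirely:

```python
baseline = 100.0   # hash rate the current difficulty was set for (arbitrary units)
added = 0.12       # 12% added halfway through the 14-day period

# First half of the period runs at the baseline, second half at baseline * 1.12,
# so the average hash rate the retarget "sees" is only ~6% above baseline.
average = (baseline + baseline * (1 + added)) / 2
adjustment = average / baseline - 1               # ~0.06 -> difficulty rises ~6%

# But the real hash rate going into the next period is the full 12% above the
# old baseline, so roughly 6% is carried, uncorrected, into the next period.
carryover = (1 + added) / (1 + adjustment) - 1    # ~0.057, call it ~6%

print(f"difficulty adjustment: {adjustment:.1%}, carried into next period: {carryover:.1%}")
```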
And on that thought, I welcome any thoughts you have.
Rich