I'm trying to figure out the hashrate at which mining costs and income balance (for a given BTC exchange rate and the current block reward). That's just a number that depends on many variables, but mostly electricity cost and energy efficiency, and to some extent investment horizon and hardware production cost. There is no timeline for when we will approach this, and no historical data to check against.
This has already happened at least once previously, when difficulty levelled out for a year or so until the exchange rate increased. Does your calculator indicate that would have happened?
What happened in 2012 was different. We mined with GPUs, and GPU pricing is not dependent on Bitcoin profitability. AMD (and nVidia) price their products mostly for gamers. They didn't charge huge premiums when GPU mining was highly profitable, and when Bitcoin mining stopped being profitable, that didn't cause AMD to lower its prices. ASIC pricing will behave very differently, since ASICs serve no market besides mining.
Still, let's see what we get. Take January 2012:
BTC exchange rate: ~$5
Assuming most people were GPU mining, let's take a 5870 @ 350 MH/s @ 200 W at the wall.
Say it cost $250.
If I plug those numbers in, I get a network speed of
In reality it was ~8000 GH/s.
If you take into account FPGA mining, and given the range of possible outcomes of my current spreadsheet, that's close enough in my book.
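For what it's worth, the break-even calculation above can be sketched roughly like this. The block reward (50 BTC), block rate (~144/day), exchange rate (~$5), and 5870 specs are from the post; the $0.12/kWh electricity price and one-year amortization horizon are my assumptions, and changing them shifts the result considerably:

```python
# Break-even network hashrate: the point where one miner's share of
# the daily block rewards exactly covers that miner's daily costs.

def breakeven_network_hashrate(
    price_usd=5.0,          # BTC exchange rate, Jan 2012 (~$5)
    block_reward=50.0,      # BTC per block in early 2012
    blocks_per_day=144,     # one block every ~10 minutes
    miner_hashrate=350e6,   # HD 5870: ~350 MH/s
    miner_watts=200.0,      # power draw at the wall
    kwh_price=0.12,         # assumed electricity cost, $/kWh
    hw_cost=250.0,          # GPU price
    horizon_days=365,       # assumed amortization horizon
):
    # Total USD paid out to all miners per day.
    daily_reward_usd = blocks_per_day * block_reward * price_usd
    # One miner's daily cost: electricity plus amortized hardware.
    daily_cost = miner_watts / 1000 * 24 * kwh_price + hw_cost / horizon_days
    # Income = (miner_hashrate / H) * daily_reward_usd; solve for H
    # where income equals cost.
    return miner_hashrate * daily_reward_usd / daily_cost

print(breakeven_network_hashrate() / 1e9)  # break-even point in GH/s
```

With those assumptions it comes out around 10,000 GH/s, the same order of magnitude as the ~8000 GH/s the network actually did. A longer amortization horizon or cheaper electricity pushes the break-even network hashrate higher, which is why the investment horizon matters so much.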
Points taken: miners were more heterogeneous then, so an average miner's profitability can't be easily determined, especially since those with the largest hashrates were running FPGAs. Even so, your result was at least the correct order of magnitude. If that's as accurate as it gets, it's still good enough to plan with.
Another question: in your example you've used average miner electricity costs of $0.12 per kWh. Do you still think this is a reasonable estimate? I suppose you assume that miners will move rigs to the lowest-cost areas, overseas if necessary?