I read your linked post. I think you're disregarding the one-time cost of the ASIC miner unit. With ASICs, the one-time cost is the larger factor and energy cost less so, at least for now.
So I'm guessing the energy used for mining should be considerably less than $2 to $3 million when the rate is at $1000/BTC.
Difficulty grows with total hash rate; it doesn't care about power efficiency per hash.
Once you have the hardware, it makes economic sense to keep mining as long as your cost per Bitcoin mined is less than one Bitcoin.
And if your cost is, for example, 0.3 coins per coin, it makes sense to expand (in the sense of buying new hardware). Buying new hardware will increase difficulty up to the point, perhaps at 0.5, perhaps higher, where you stop expanding.
Those two incentives create a situation where you will always have some miners that are close to zero profit, and not very many that spend 0.1 BTC to mine 1 BTC.
We can never know the true efficiency distribution of all miners, but imagine a chart that plots the efficiency of every miner, sorted by efficiency. If that curve is a straight line from zero to one, the total area under it is 0.5; if you instead assume the most efficient miner is already at 0.5, the area is 0.75. Hence my assumption of an overall miner efficiency of 0.5 to 0.75. In practice it could be higher, but if it were lower, that would be a temporary situation.
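To make that concrete, here is a minimal numeric sketch of the area argument; the straight-line distributions and the miner count are just illustrative assumptions, not real data:

# Average efficiency (cost in BTC per BTC mined) when miners' efficiencies
# lie on a straight line between `best` and `worst`, sorted along the x-axis.
def average_efficiency(best, worst=1.0, miners=1_000_000):
    step = (worst - best) / (miners - 1)
    return sum(best + i * step for i in range(miners)) / miners

print(average_efficiency(0.0))  # ~0.5  -- straight line from 0 to 1
print(average_efficiency(0.5))  # ~0.75 -- most efficient miner already at 0.5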
Unfortunately, electricity is not the only cost of mining. There are many costs associated with it, and some of them are minor (pun not intended).
However, you are ignoring what is easily the second-greatest cost of mining: hardware depreciation. ASICs are useful for mining and only mining. With the current arms race, which is likely to continue, an ASIC unit becomes obsolete as soon as it can no longer mine efficiently enough to beat its own electrical (and storage and maintenance) costs. Over an ASIC's lifetime its value shrinks, reaching zero as soon as the unit starts losing money.
If miners behaved as you state, the lifetime of an ASIC would be very short. Even assuming a conservative estimate of ASIC speed growth, the cost per coin should double every few months. How much an ASIC unit makes can be modelled with the equation:
Profit = 0.5 × (C−e) × d − $
(C = initial gross earnings per day
e = daily operation costs
d = number of days until the ASIC is worthless
$ = initial cost of the ASIC)
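In code form this is just a direct transcription of the model above; $ is renamed to unit_cost since $ isn't a valid identifier, and the sample values are made up:

def asic_profit(C, e, d, unit_cost):
    # Lifetime profit: the 0.5 factor averages the shrinking daily margin
    # (which starts at C - e and decays towards zero) over the d-day lifetime;
    # unit_cost is the one-time price of the ASIC ($ in the formula above).
    return 0.5 * (C - e) * d - unit_cost

print(asic_profit(C=4.0, e=1.0, d=200, unit_cost=120.0))  # 180.0 BTC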
Miners, as a whole, will always get more ASICs as long as the profit is positive. Assume that we reach an equilibrium state, where buying a new ASIC is equally likely to be profitable or unprofitable. Assuming a conservative rate of difficulty growth (let's use a doubling time of 200 days because it's a nice round number), daily earnings fall from C to e after d = 200 × (log C − log e) days, at which point the ASIC is worthless. Approximating the daily margin C − e as simply C to keep the algebra easy, we can simplify this model thus:
0 = 0.5 × C × d − $
0 = 0.5 × C × 200 (log C − log e) − $
0 = 100C × (log C − log e) − $
$ = 100C × (log C − log e)
(log is to the base 2)
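The same simplification spelled out numerically; C and e here are arbitrary illustrative values, and the 200-day doubling time is the assumption above:

from math import log2

def breakeven_price(C, e, doubling_days=200):
    # d is how long daily earnings stay above the running cost e when they
    # start at C and halve every doubling_days days.
    d = doubling_days * (log2(C) - log2(e))
    # Zero-profit condition 0 = 0.5 * C * d - $, solved for $.
    return 0.5 * C * d

print(breakeven_price(C=4.0, e=2.0))  # 400.0, i.e. 100 * C * (log C - log e)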
Your number, "efficiency", is given as e/C. As C and $ should in theory be related proportionally, we can introduce a new constant Q = $/C. Our equation then becomes:
Q = 100 × [log (C/e)]
2^Q = (C/e)^100
(1/2^Q)^(1/100) = e/C
(1/2)^(Q/100) = e/C
(log is again to the base 2)
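A quick round-trip check of that rearrangement, with made-up numbers chosen only to test the algebra:

from math import log2

C, e = 10.0, 2.5                 # arbitrary illustrative figures
Q = 100 * (log2(C) - log2(e))    # Q = $/C from the equilibrium condition
print(Q)                         # 200.0
print((1 / 2) ** (Q / 100))      # 0.25
print(e / C)                     # 0.25 -- matches, so the inversion holds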
This equation points out that the efficiency constant will decrease with an increase in the one-time cost of the ASIC. An Avalon ASIC costs ~120 BTC at market price and earns approximately 4 BTC every day, so the experimental value for Q is 30 days (note that the formula has units of 2^days on both sides; the right side has the value hidden after the substitution of d). Therefore:
e/C = (1/2)^(Q/100)
= (1/2)^(30/100)
= (1/2)^(3/10)
≈ 0.8
Therefore, miners will actually use significantly less electricity than they generate in BTC, and that's with a zero margin.
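And the arithmetic with the rough Avalon figures quoted above, for anyone who wants to rerun it with current prices (the variable names are mine, the numbers are the approximate ones used in this post):

price_btc = 120.0          # one-time cost of the unit ($), in BTC
earnings_per_day = 4.0     # initial gross earnings (C), in BTC per day

Q = price_btc / earnings_per_day     # 30 days
print(Q, (1 / 2) ** (Q / 100))       # 30.0  ~0.812, the equilibrium e/C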