Has anybody ever done any power calculations for the entire network? Like estimating the current global hash rate and then applying an average power consumption per hash? I think it might be interesting to know that and to see how the efficiency of the network changes over time as mining technology improves.
This page: http://blockchain.info/stats has some estimates based on an assumed power consumption of 650 watts per gigahash and an electricity price of 15 cents per kilowatt-hour.
Feel free to adjust the numbers if you think that is too conservative or too aggressive an estimate.
What is difficult to factor in right now is how much of the network is ASICs. It is likely the most important factor, because ASICs are so much more efficient (<10 W per GH/s) than GPUs that overall network efficiency varies significantly depending on how much of the total network hashing power ASICs contribute.
For example, assume the average GPU is 650 W per GH/s and the average ASIC is 10 W per GH/s; that converts to 650 kW per TH/s and 10 kW per TH/s respectively. The network is roughly 170 TH/s. I am going to assume 10 cents per kWh instead of 15 cents, since marginal miners are pushed out first, which should make the network's average energy cost lower than the global average. $0.10 per kWh = $100,000 per GWh.
Is the network made up of 100 TH/s of GPUs and 70 TH/s of ASICs?
100 * 650 kW + 70 * 10 kW = 65,700 kW = ~575 GWh annually = ~$58M USD annual electrical cost
Is the network made up of 150 TH/s of GPUs and 20 TH/s of ASICs?
150 * 650 kW + 20 * 10 kW = 97,700 kW = ~856 GWh annually = ~$86M USD annual electrical cost
Is the network made up of 50 TH/s of GPUs and 120 TH/s of ASICs?
50 * 650 kW + 120 * 10 kW = 33,700 kW = ~295 GWh annually = ~$30M USD annual electrical cost
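If you want to replay these mixes, or plug in your own split, here is a minimal Python sketch using the same assumed figures (650 kW per TH/s for GPUs, 10 kW per TH/s for ASICs, $0.10 per kWh); the three splits are illustrative guesses, not measurements:

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_elec_cost(gpu_ths, asic_ths, gpu_kw=650.0, asic_kw=10.0, usd_per_kwh=0.10):
    """Annual electricity cost (USD) for a given GPU/ASIC hash power mix."""
    draw_kw = gpu_ths * gpu_kw + asic_ths * asic_kw  # instantaneous network draw, kW
    return draw_kw * HOURS_PER_YEAR * usd_per_kwh    # kWh per year times $/kWh

for gpu, asic in [(100, 70), (150, 20), (50, 120)]:
    print(f"{gpu} TH/s GPU + {asic} TH/s ASIC -> ${annual_elec_cost(gpu, asic) / 1e6:.0f}M/yr")
# -> $58M, $86M, $30M, matching the three cases above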
One thing is certain: at current difficulty, if ASICs replaced all GPUs, the annual electrical cost would be ~$1.5M USD (10 kW per TH/s * 170 TH/s * 24 * 365 * $0.10 per kWh), and at $25,000 per TH/s the deployed capital would be a mere ~$4.25M. Both are much smaller than the value of the total mining reward. If ASICs completely replaced GPUs and the network hash power remained 170 TH/s, the annual return on capital deployed would be in excess of 2,700% [($120M - $1.5M) / $4.25M]. It doesn't take a rocket scientist to see that reward greatly exceeds risk, and as such there will be demand to deploy more hashing power. This means difficulty is going up ... a lot. The only reason difficulty isn't 20x higher already is the slow rate of ASIC deployments. People want more hashing power; they just can't deploy it fast enough.
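A few lines of Python make the all-ASIC arithmetic explicit (the 10 kW per TH/s, $25,000 per TH/s, and $120M gross revenue figures are the assumptions above, not hard data):

HOURS_PER_YEAR = 24 * 365
NETWORK_THS = 170                                # assumed total network hash power
elec = NETWORK_THS * 10 * HOURS_PER_YEAR * 0.10  # 10 kW/TH/s at $0.10/kWh
capital = NETWORK_THS * 25_000                   # hardware outlay at $25,000 per TH/s
revenue = 120_000_000                            # assumed annual gross mining revenue
roi = (revenue - elec) / capital                 # return on capital deployed
print(f"electricity ${elec / 1e6:.2f}M, capital ${capital / 1e6:.2f}M, ROI {roi:.0%}")
# -> electricity $1.49M, capital $4.25M, ROI 2788%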
Let's assume this bottleneck eventually breaks and miners continue to add hashing capacity until they (collectively) feel it is no longer worth the risk. That threshold will be expressed as a return on capital. How much return? Hard to say; it depends on what return miners are willing to accept. Let's assume a 100% annual ROI and that the exchange rate remains the same.
Global Mining Revenue - Electrical Costs = Return on Capital
Given:
Annual Gross Mining Revenue = $120M
Hardware cost = $25,000 per TH/s (recovered annually at the assumed 100% ROI)
Annual electrical cost = $8,760 per TH/s
Then, solving for total network hash power x (in TH/s):
$120,000,000 - $8,760 * x = $25,000 * x
$120,000,000 = $33,760 x
x ≈ 3,554 TH/s. Yes, that is ~3.5 PH/s, or a difficulty roughly 20x higher than today's (difficulty ~400M).
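The same break-even algebra in Python, for anyone who wants to vary the assumptions (revenue, hardware cost, and the 100% ROI target are all the assumed figures from above):

REVENUE = 120_000_000                # assumed annual gross mining revenue, USD
HW_PER_THS = 25_000                  # hardware cost per TH/s, recovered annually at 100% ROI
ELEC_PER_THS = 10 * 24 * 365 * 0.10  # 10 kW/TH/s at $0.10/kWh = $8,760 per TH/s per year

# Capacity grows until revenue just covers electricity plus the required return on hardware:
# REVENUE = (ELEC_PER_THS + HW_PER_THS) * x
x = REVENUE / (ELEC_PER_THS + HW_PER_THS)
print(f"equilibrium hash power ~{x:,.1f} TH/s")  # -> 3,554.5 TH/s, ~20x today's 170 TH/s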
On edit: fixed error ($87,600 vs $8,760 annual electrical cost per TH/s).