Well, the City of NY did. And they shared it, indirectly:
Actually, that's pretty easy to calculate. We know the difficulty, and thus how many hashes are needed, and thus how many ATI 5970s it would take. Since we know how much electricity each 5970 draws, we can estimate fairly accurately how much electricity is consumed (presuming all mining is done on 5970s).
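Here's a minimal sketch of that arithmetic in Python. The 5970 figures (~650 MH/s at ~300 W) are ballpark assumptions, not measurements, and the network hashrate is derived from difficulty via the standard difficulty × 2^32 expected-hashes-per-block relation over the 600-second target block time:

```python
# Back-of-envelope: network power draw implied by difficulty, assuming
# every hash comes from an ATI 5970.

TARGET_BLOCK_TIME = 600      # seconds, Bitcoin's design target
HASHES_PER_DIFF = 2**32      # expected hashes per block per unit of difficulty

GPU_HASHRATE = 650e6         # hashes/second, assumed for a 5970
GPU_POWER = 300.0            # watts at full load, assumed

def network_hashrate(difficulty):
    """Expected network hashrate (H/s) implied by a given difficulty."""
    return difficulty * HASHES_PER_DIFF / TARGET_BLOCK_TIME

def network_power_mw(difficulty):
    """Estimated draw in megawatts if all mining were on 5970s."""
    gpus = network_hashrate(difficulty) / GPU_HASHRATE
    return gpus * GPU_POWER / 1e6

for d in (1.5e6, 3.0e6):
    print(f"difficulty {d:,.0f}: ~{network_power_mw(d):,.1f} MW")
```

Since the estimate is linear in difficulty, the doubling from 1.5 million to 3.0 million doubles the implied power draw, which is the step behind the "one building" vs "two buildings" comparison below.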
Here's the raw time series data on difficulty:
- https://docs.google.com/spreadsheet/ccc?key=0AmcTCtjBoRWUdHVRMHpqWUJValI1RlZiaEtCT1RrQmc
This?
Coincidentally, that one building's consumption would be just a little less than that of all Bitcoin mining combined.
The problem is, we don't know the breakdown between FPGAs and GPUs (and maybe even some ASICs). The "one B of A building" figure was from when difficulty was 1.5 million. At over 3.0 million, Bitcoin is now at "two B of A buildings", assuming every hash is done on power-hungry GPUs.
ASICs could push out GPUs (for all except those with free or really cheap electricity), and the total network consumption could drop well below the "one B of A building" mark once again.
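To see why, here's an illustrative comparison under the same sketch. The joules-per-gigahash figures are hypothetical round numbers, since real ASIC efficiency varies by generation:

```python
# Hypothetical efficiency figures (illustrative, not measured):
GPU_J_PER_GH = 300.0 / 0.65    # ~460 J/GH, from the 5970 numbers above
ASIC_J_PER_GH = 10.0           # assumed early SHA-256 ASIC efficiency

NETWORK_GHS = 21_500           # rough network hashrate at difficulty ~3.0M

for label, joules_per_gh in (("GPUs", GPU_J_PER_GH), ("ASICs", ASIC_J_PER_GH)):
    print(f"all {label}: ~{NETWORK_GHS * joules_per_gh / 1e6:.2f} MW")
```

Under those assumptions the same hashrate takes a few percent of the power, so a GPU-to-ASIC transition shrinks total consumption even as difficulty keeps rising.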