You don't even need to check anything with sensationalist claims like that. Shit article by a shit author.
Light Pollution in the U.S. alone: 120 terawatt-hours/year
Bitcoin Network: 29.05 terawatt-hours/year (confirmation needed? some articles cite 11 TWh/yr)
CERN Large Hadron Collider: 1.3 terawatt-hours/year
Yeah, let's get the global light pollution levels solved first before we start considering Bitcoin energy consumption a problem...
Assuming a network at 10 exahash/s (= 10,000 PH/s = 10 million TH/s = 10 billion GH/s) and an average miner efficiency of 0.3 W per GH/s, I come up with 26.28 terawatt-hours/year. That's roughly the efficiency of an Antminer S7, but the majority of miners are probably S9s or equivalent, so that seems really high. 0.2 W per GH/s would still be high, but closer to reality once you account for other power costs such as cooling. That would bring it down to 17.52 terawatt-hours/year.
10,000,000,000 GH/s × 0.2 W/(GH/s) × 24 hours × 365 days = 17,520,000,000 kWh = 17,520,000 MWh = 17.52 TWh per year
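If you want to play with the numbers yourself, here's a quick sketch of that same arithmetic. The 10 EH/s hashrate and the two efficiency figures are my assumptions from above, not anything the article provides:

```python
# Back-of-the-envelope estimate of annual network energy use.
# Assumptions (mine, not the article's): ~10 EH/s network hashrate
# and a fleet-average efficiency expressed in W per GH/s.

HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_twh(hashrate_ghs: float, watts_per_ghs: float) -> float:
    """Annual consumption in TWh, given hashrate in GH/s and efficiency in W per GH/s."""
    watts = hashrate_ghs * watts_per_ghs    # instantaneous draw in watts
    wh_per_year = watts * HOURS_PER_YEAR    # watt-hours per year
    return wh_per_year / 1e12               # Wh -> TWh

hashrate = 10e9  # 10 EH/s expressed in GH/s

print(annual_twh(hashrate, 0.3))  # ~26.28 TWh/yr (S7-class efficiency)
print(annual_twh(hashrate, 0.2))  # ~17.52 TWh/yr (S9-era fleet plus cooling overhead)
```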
It all comes down to your assumptions, and of course the article didn't say what assumptions it used.