One million dollars can buy 12-15 PB of space, depending on the infrastructure, and running it costs about $500 a day.
So the current 7-8 PB network size would cost roughly $500,000 up front and $250 a day to run.
At this scale, plotting can be done on the storage nodes themselves, and maybe sped up a bit with some GPU plotters, so the space would be fully plotted within 1-2 weeks of deployment.
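Just to sanity-check that scaling with a quick Python sketch (the $/PB and $/day figures are only the rough assumptions quoted above, using the midpoint of 12-15 PB):

```python
# Rough cost scaling from the figures above (assumptions, not measurements)
cost_per_pb = 1_000_000 / 13.5      # ~$1M buys 12-15 PB; use the midpoint
run_per_pb_day = 500 / 13.5         # ~$500/day to run that much space

network_pb = 7.5                    # current ~7-8 PB network
print(f"hardware: ${network_pb * cost_per_pb:,.0f}")        # ~$556,000
print(f"running:  ${network_pb * run_per_pb_day:,.0f}/day") # ~$278/day
```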
Assuming my $100M estimate for bitcoin is correct, that means a correspondingly secure burst network would have about 1200-1500 PB (what $100M buys at the prices above). So the question is how much energy that much storage would consume.
Personally, I think 1200-1500 PB is on the low side. Consumer prices are ~$33/TB, so $100M could buy a total of about 3000 PB of space at that price.
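Quick check of that number (assuming the ~$33/TB consumer price above):

```python
# How much space $100M buys at the assumed ~$33/TB consumer price
capital = 100_000_000        # USD
price_per_tb = 33            # USD
total_pb = capital / price_per_tb / 1000
print(f"{total_pb:,.0f} PB")   # ~3,030 PB, i.e. roughly 3000 PB
```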
As far as energy consumption, I think it's fair to ignore the cost of running the computer that the storage is attached to. Mining that takes place on end-users' desktop computers doesn't use any extra energy to keep the computer on (other than the energy of the hard drives themselves), and large mining operations will likely be able to run pretty efficiently, with one computer powering many hard drives.
Estimates vary, but it seems that the average power consumption for a single hard drive is about 7-13 watts. They can consume more when under heavy load, but mining will typically be very low load, with only a few reads every 4 minutes (especially if the plot files are optimized with a large stagger size). So, I think it's safe to say burst mining will be on the low end, and only consume 7 watts per drive.
Let's assume the entire 3000 PB network is powered by a million 3 TB drives. This makes the entire power consumption for a burst mining network with $100M in capital equal to
7 million watts.
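The arithmetic, for anyone who wants to tweak the assumptions (drive count and per-drive wattage are the guesses above):

```python
# Power draw of the hypothetical 3000 PB network (one million 3 TB drives, 7 W each)
drives = 1_000_000
watts_per_drive = 7
total_mw = drives * watts_per_drive / 1_000_000
print(f"{total_mw:.0f} MW")   # 7 MW
```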
Let's compare this to bitcoin. I'm going to use figures from the bitcoin Mining hardware comparison wiki page. For the sake of easy math, let's assume the entire bitcoin mining network is powered by AntMiner S3s. These cost $382 each, so $100M could theoretically buy 261,780 AntMiner S3s. Each S3 consumes 340 watts, which would put the entire network's energy consumption at
89 million watts.
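Same back-of-the-napkin check for the ASIC side (S3 price and wattage taken from the wiki figures cited above):

```python
# Same $100M spent on AntMiner S3s instead ($382 and 340 W per unit)
capital = 100_000_000
unit_price = 382
unit_watts = 340
units = capital // unit_price                     # 261,780 miners
total_mw = units * unit_watts / 1_000_000
print(f"{units:,} units -> {total_mw:.0f} MW")    # ~89 MW
```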
This is all just back-of-the-napkin math, but it shows pretty clearly that burst is significantly more energy efficient per dollar of capital than typical proof-of-work mining. Ultimately, it comes down to the watts consumed per dollar spent on the main mining instrument. Bitcoin miners (ASICs) tend to consume about 1 watt per dollar (340 W / $382 ≈ 0.9 W/$), whereas burstcoin mining equipment (hard drives) only consumes about 0.1 watts per dollar (7 W for a 3 TB drive that costs ~$99 at $33/TB).
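Here's where those per-dollar figures come from (same hardware prices and wattages assumed above):

```python
# Watts of mining power per dollar of capital, for each instrument
asic_w_per_usd = 340 / 382        # AntMiner S3: ~0.89 W/$
hdd_w_per_usd = 7 / (33 * 3)      # 3 TB drive at $33/TB: ~0.07 W/$
print(f"ASIC: {asic_w_per_usd:.2f} W/$, HDD: {hdd_w_per_usd:.2f} W/$")
```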
Also, I think burst mining can be further optimized for energy consumption, since the mining algorithm only requires reading the plots once per block. Imagine an advanced mining setup with 4096 hard drives. The plots could be specially arranged such that the miner would only have to read from one drive per block. This would allow the miner to leave all the hard drives
completely unpowered the majority of the time. This setup would consume something like 1/4096th of the power of a traditional setup, which brings the watts/dollar figure for burst down to only 0.000024414. This more optimized mining method is
40,000 times more energy efficient than typical PoW mining.
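And a rough sketch of that last comparison (the 4096-drive setup is purely hypothetical, and the 0.1 W/$ baseline is the rounded estimate from above):

```python
# Hypothetical 4096-drive setup where only one drive is powered per block
hdd_w_per_usd = 0.1                    # the round figure used above
optimized = hdd_w_per_usd / 4096       # ~0.0000244 W/$
asic_w_per_usd = 1.0                   # rounded ASIC figure from above
print(f"{optimized:.9f} W/$")                          # 0.000024414
print(f"{asic_w_per_usd / optimized:,.0f}x better")    # ~40,960x, i.e. roughly 40,000x
```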