That... sounds like total nonsense to me. I don't know anyone personally who hasn't encountered a voltage irregularity or a power surge at some point in their lives. In every location I have ever lived, in both the United States and in Canada, I have had power surges. Sometimes the device is protected by a surge protector, sometimes it isn't. Whether the device survives depends on the quality of the protector, hence why I prefer to have more than one line of protection. (I've only ever had one case of a surge protector actually being responsible for killing a device.)
That doesn't make me feel any better. The last time I ran a miner on an ATX PSU, a power surge resulted in the PSU erupting in flames. Thankfully I was there to put it out. I was lucky. It could have burned my house down, and that PSU was brand new. Besides that, a surge protector tends to be less expensive overall than a PSU (not to mention the time delay in getting the miners back online if the PSUs all fried, which represents lost revenue). Obviously I'd rather the protector fail than the miner, but ideally I want neither to fail. That's the whole point of surge protection.
For my 120v equipment, I use voltage regulators, surge protectors, and uninterruptible power supplies that keep me going for an hour in the event of a power outage or, worse, a power flicker. I have NEVER had any problems with power when using such a setup, hence why I want to replicate it at 240v.
Why?
At 240 volts and 15 amps, the max power draw is 3600 watts. Technically, the line gets derated for continuous loads, but even at 20% derating it SHOULD still be sufficient to run 2 miners, assuming the power draw in the specifications is accurate, that is. According to Bitmain, the power draw for its Antminer S9 plugged into its APW3++ (on 240v) is 1323W with a discrepancy of +/- 10%, which means the maximum power draw should be 1455.3 watts each. Assuming two miners were plugged into the same circuit, and both just happened to have the worst power rating, they would draw 2910.6 watts from the wall. At 240v that works out to 12.1275 amps. Derating 15 amps to 12 amps would mean they are 0.1275 amps over, but only in the worst-case power draw. If I were able to actually read the power draw, I'd have a clear idea of what is needed. I could, for example, pair an efficient unit with an inefficient one to balance things out. As long as the two miners together are not drawing more than 2880 watts, they stay within the derating standard and draw only 12 amps. Right now, I only have one miner plugged into a single circuit. There's a reason why I want to read the watt usage.
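Just to make that math easy to check (or to rerun with your own meter readings), here it is as a quick Python sketch. The only inputs are Bitmain's 1323W spec figure, the +/- 10% tolerance from the spec sheet, and the 80% continuous-load derating I'm assuming; everything else is derived from those.

# Rough headroom check for two S9s on one 240 V / 15 A circuit.
CIRCUIT_VOLTS = 240
OUTLET_AMPS   = 15      # the NEMA 6-15 outlet rating, not the 20 A breaker
DERATING      = 0.80    # 80% continuous-load rule of thumb (my assumption)

SPEC_WATTS = 1323       # Bitmain's spec for an S9 on an APW3++ at 240 V
TOLERANCE  = 0.10       # +/- 10% per the spec sheet

worst_case_each = SPEC_WATTS * (1 + TOLERANCE)      # 1455.3 W per miner
worst_case_pair = 2 * worst_case_each               # 2910.6 W for two
amps_drawn      = worst_case_pair / CIRCUIT_VOLTS   # ~12.1275 A

derated_amps  = OUTLET_AMPS * DERATING              # 12 A
derated_watts = derated_amps * CIRCUIT_VOLTS        # 2880 W

print(f"Worst-case draw for two miners: {worst_case_pair:.1f} W ({amps_drawn:.4f} A)")
print(f"Derated circuit budget:         {derated_watts:.0f} W ({derated_amps:.0f} A)")
print(f"Over budget (worst case only):  {amps_drawn - derated_amps:.4f} A")

Swap in actual wall readings for SPEC_WATTS once I can measure them and the same script tells me whether a given pair of miners fits under the 2880W budget.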
In any case, the electrician actually installed 20 amp breakers because that's all they had on hand; it's just that the outlets themselves are rated for 15 amps. Technically, I could swap the NEMA 6-15 outlets out for NEMA 6-20 if that really is a problem and keep the breakers in place, but the catch is that the breaker box and all the breakers are rated for 240v max while the NEMA 6-20 is rated for 250v, so I am unsure how that would affect things, even though I can easily plug a NEMA 6-15 plug into a NEMA 6-20 outlet. (That, and I literally JUST dropped the money to have these outlets installed.)
Even if I did that, I'd still want proper surge protection, and I'd want to know exactly how much power is being drawn. Even an inline watt meter in front of the planned sub panel that will be dedicated to the miners would help if nothing else is available.