I am using 230V.
The PSU is rated for 850W, and I'm currently drawing around 640W on the 12V rail. Let's assume for a moment I'd be drawing the full 850W.
The PSU is typically 88% efficient at that load, so it would pull 850 / 0.88 ≈ 966W from the wall.
At 230V that means the PSU should need a current of about 4.2A (966 / 230). So why does the box say the following:
AC Input: 100-240V~ 11-5.5A 60-50Hz
I know the label says 240V and I'm on 230V, but does a 10V difference really account for an extra 1.3A???
How am I supposed to estimate how much power it actually draws, and how many of them I can connect to a single 13A or 16A outlet? If I go by the 5.5A number, I can only connect two PSUs :/
Does anybody feel like teaching a dork like me some basic electrical skills?
It's more about the total input wattage; the voltage and amperage are just the two parts of that. 120 volts at 11 amps is ... 1320 watts. 240 volts at 5.5 amps is ... still 1320 watts. So if your actual voltage is 230 volts, your max amperage would be ... 5.74 amps for the PSU max and 6.5 amps at the wall.
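
To put rough numbers on that, here's a quick Python sketch of the same label math (the 1320W figure just comes from multiplying out the label as above; it's not something the PSU promises to hold exactly):

# The label gives the worst-case input current at each end of the supported
# voltage range, both working out to roughly the same input power.
rated_input_w = 120 * 11        # 1320 W, from the 11 A end of the label
print(rated_input_w / 240)      # 5.5   -> matches the 5.5 A end of the label
print(rated_input_w / 230)      # ~5.74 -> worst-case current at 230 V mains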
Your total amperage depends on what your actual voltage is, which can fluctuate quite a bit, so always err on the side of caution. Electrical fires are a real bitch.
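
If you want to estimate the realistic draw instead of the worst-case label, something like this gets you in the ballpark (just a sketch; the 88% efficiency and the 640W/850W loads are the numbers from the question, and real efficiency varies with load and unit):

import math

def input_current(dc_load_w, efficiency, mains_v):
    # AC input current = DC load / efficiency / mains voltage
    return dc_load_w / efficiency / mains_v

print(input_current(640, 0.88, 230))  # ~3.2 A at the current 640 W load
print(input_current(850, 0.88, 230))  # ~4.2 A with the PSU fully loaded

# PSUs per circuit, leaving no headroom:
print(math.floor(13 / 5.5))                            # 2 if you go by the label
print(math.floor(13 / input_current(850, 0.88, 230)))  # 3 by estimated actual draw
print(math.floor(16 / input_current(850, 0.88, 230)))  # 3 on a 16 A circuit

Either way, as said above, breakers and wiring care about the worst case, so sizing by the label number is the safe call.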