OK then - an honest question arising from a point made earlier.
Calculating power used by a graphics board should be easy, with my limited understanding. It's a DC circuit running on 12V feeds, so assuming you can measure the current flowing on each of the feeds into the GPU board (two direct feeds from the PSU, plus the feed from the PCIe slot itself), and assuming your 12V feed is a clean 12V (if not, I suppose you could measure the exact '12V' potential at each of the three points), prep school physics says that the power used (converted / dissipated as heat / available to do work / etc.) is simply the voltage multiplied by the total current. If the current is measured in amps, multiplying it by the voltage gives you power in watts.
OK, so take a representative overclocked 5850 mining at full load, then measure the input current and voltage on each feed. (The different inputs could have different potentials: the two PCIe power feeds are usually fat cables straight from the PSU, with minimal resistance and hence minimal voltage drop, but the PCIe *slot* feed of up to 75W has to come from the ATX / logic board connector and then across a bunch of PCB traces, so my guess is that this route, and the thinner-gauge PCIe pins, will have a higher resistance and hence a larger voltage drop.) Multiply voltage by current on each feed, sum the three, and you get the total power draw of *one* typical 5850 card.
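Just to pin down the arithmetic I mean, here's a quick sketch - the feed names are mine and the readings are invented for illustration, not real measurements off a 5850:

```python
# Rough sketch of the sum I have in mind - the measured numbers here are
# made up for illustration, not real readings from a card.
feeds = {
    "pcie_6pin_1": (12.10, 5.8),   # (volts, amps) straight from the PSU
    "pcie_6pin_2": (12.08, 5.9),
    "pcie_slot":   (11.85, 4.6),   # slot feed, probably a bigger voltage drop
}

# DC power per feed is just V * I; total card power is the sum over the feeds.
total_watts = sum(volts * amps for volts, amps in feeds.values())

for name, (volts, amps) in feeds.items():
    print(f"{name}: {volts:.2f} V x {amps:.1f} A = {volts * amps:.1f} W")
print(f"Total card draw: {total_watts:.1f} W")
```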
Let's say the measurement and calculation give 175W. I've got 5 slots, so if I filled them all with 'typical' 5850s, I'd need 875W from the PSU purely on the 12V supply, ignoring the CPU and other loads.
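In other words, the back-of-envelope budget is nothing more than this (using that hypothetical 175W figure):

```python
per_card_watts = 175   # hypothetical per-card figure from the sketch above
cards = 5              # one card per slot
rig_12v_watts = per_card_watts * cards
print(rig_12v_watts)   # 875 W on the 12V side, before the CPU and the rest
```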
My question is - at what point do I have to take the nature of AC into account? Is it only an issue when comparing 'power-from-wall' readings (in the USA, from 'Kill-A-Watt' type devices - in the UK I've not seen specific brand-name power analysers, so I just use Maplin power meters and extension multiplugs with power analysers built in)??
Residential AC is typically single phase, isn't it? Are the mains-electricity power meters measuring power the same way (volts x amps)?? If so, then the RMS issue comes into play, and you can't assume (in the UK) 240V times the current to be the power. Or can you?
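Here's how I *think* the RMS side works - happy to be told this is backwards (the current and power-factor numbers are invented):

```python
import math

# My (possibly wrong) understanding: the quoted 240 V mains figure is
# already an RMS value, and the peak of the sine wave is higher.
v_rms = 240.0
v_peak = v_rms * math.sqrt(2)            # ~339 V at the top of the sine wave

# If a meter only did Vrms * Irms it would get "apparent power" (VA);
# real power in watts needs the load's power factor folded in.
i_rms = 3.0                              # invented current reading, in amps
power_factor = 0.95                      # invented figure for a PFC'd PSU
apparent_va = v_rms * i_rms              # 720 VA
real_watts = apparent_va * power_factor  # ~684 W
print(v_peak, apparent_va, real_watts)
```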
Sine-wave alternating current and RMS voltage make things annoyingly confusing when switching to DC output. Is this a complete non-issue, so that when a PSU says its 12V rail will handle, say, 850W, and my mains power meter says I'm pulling 650W from the wall, I've got a nice fat buffer and am safe and efficient? Or is the rated 850W based on 'peak' power supplied in AC form, whereas in reality only RMS is continuously available to convert to DC, and my 'buffer' may not be anywhere near as large as I think?
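The two readings give very different answers. If the 850W rating means continuous DC output and the wall meter shows AC input including conversion losses, the sum would look like this (the efficiency figure is a guess - it obviously depends on the actual PSU):

```python
# Sketch of the "buffer" question, under the assumption (the one I'm not
# sure about) that the PSU's 850 W rating is continuous DC output, while
# the wall meter shows AC input including conversion losses.
wall_watts = 650.0
efficiency = 0.85                           # invented - depends on the PSU
dc_output_watts = wall_watts * efficiency   # ~552 W actually delivered as DC
rail_rating_watts = 850.0
headroom = rail_rating_watts - dc_output_watts
print(dc_output_watts, headroom)            # ~552 W used, ~298 W of headroom
```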
I'd like someone who really knows the details of AC-to-DC conversion, and how mains power meters take their readings, to educate me here... Haploid23 - you said that reading 1000W from the wall is a measurement of AC power... is that peak (non-continuous) power, needing an RMS calculation, or are 'watts' always 'watts' regardless of AC, DC, single- or three-phase, etc.??
I buy UPS units for my servers, and have an emergency generator, and I notice that *these* appliances often mess about with 'VA' ratings instead of good old *watts*. That suggests 'watts are watts', and that 'VA' is just a scam to make the supply seem more powerful (since it's going to be peak, not RMS, voltage - surely)??
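The other possibility I can think of is that 'VA' is simply apparent power (RMS volts times RMS amps), nothing to do with peak voltage, in which case the usable watts would be the VA rating times the load's power factor - something like this (power factor invented):

```python
# If 'VA' really is apparent power (Vrms * Irms), then real watts are
# apparent power times the load's power factor - so a UPS rated in VA
# could deliver noticeably fewer watts than the headline number.
ups_rating_va = 1000.0
load_power_factor = 0.7          # invented figure for a typical mixed load
usable_watts = ups_rating_va * load_power_factor   # 700 W
print(usable_watts)
```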
But WTFDIK.