Unless the idle draw of the cards represents the same amount of logical work done, you can't just subtract them out to get the "mining" power consumed.
If one card idles hot doing a lot of work that is not mining related, subtracting that idle draw from the final mining number misrepresents how much power the card actually draws in order to mine, especially if it stops doing that work while mining. Conversely, if one card idles cold, it has almost no idle wattage to subtract, making its mining numbers seem to draw more power.
The only numbers that can be compared are 100% mining vs. 100% mining, unless someone wants to map out exactly how much logical work each card does while idle and pro-rate its wattage for the extra work done...
An idle card does no work, i.e. 0 Mhash/s.
Look at it this way.
System at load: 300W
System at idle (including GPU idle wattage): 100W
GPU idle wattage: 10W
The reason we want to subtract the GPU idle wattage is to get the true GPU load wattage.
100W - 10W = 90W (system idle w/o GPU).
300W - 90W = 210W (GPU full wattage at load).
Now we have an apples-to-apples comparison: the GPU wattage at load.
We can also predict other system values.
system w/ 1 GPU = 90W + 1*210W = 300W
system w/ 2 GPUs = 90W + 2*210W = 510W
system w/ 3 GPUs = 90W + 3*210W = 720W
system w/ 4 GPUs = 90W + 4*210W = 930W
system w/ 5 GPUs = 90W + 5*210W = 1140W
system w/ 6 GPUs = 90W + 6*210W = 1350W
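For the sake of illustration, here is the same arithmetic as a small Python sketch. The wattages are the hypothetical figures from the example above, not measurements from a real rig.

```python
# Sketch of the arithmetic above; the wattages are the hypothetical
# example figures, not real measurements.
SYSTEM_LOAD_W = 300    # whole system with one GPU mining
SYSTEM_IDLE_W = 100    # whole system at idle, GPU idling too
GPU_IDLE_W = 10        # the GPU's own idle draw

# Strip the GPU's idle draw out of the system idle figure, then
# subtract that GPU-less baseline from the load figure.
base_idle_w = SYSTEM_IDLE_W - GPU_IDLE_W     # 90W, system idle w/o GPU
gpu_load_w = SYSTEM_LOAD_W - base_idle_w     # 210W, GPU full wattage at load

def predicted_system_watts(num_gpus: int) -> int:
    """Predict total draw for a rig with num_gpus identical cards mining."""
    return base_idle_w + num_gpus * gpu_load_w

for n in range(1, 7):
    print(f"system w/ {n} GPU(s) = {predicted_system_watts(n)}W")
# -> 300W, 510W, 720W, 930W, 1140W, 1350W
```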
Idle cards do no hashing-related work, but they consume watts, therefore they do work. Explain exactly which instructions are executed (monitoring, answering driver polls, servicing DMA channels, etc.). If this work is done only on an idle card and not on a mining card, then we need to account for it. Once it is accounted for, we can compare and see whether one card is providing, say, "full service with frills" to the OS while the other card is "self service" when idle. That condition can skew the measurements, especially with a card designed to go very cold while idle vs. one that runs hot with idle cycles.
Apples to oranges.
Edit: A concrete example follows.
The AMD Phenom II X6 1100T (3.3GHz) consumes 20W while idle, and 109W under full load.
The Intel Core i7 2600K @ 4.4GHz consumes 5W while idle, and 111W under full load.
While under full load (in theory), the CPUs are not executing any idle cycles.
Also, voltage may be stepped down while idle and parts of the chip shut off, further skewing the comparison.
Subtracting those figures, 20 from 109 and 5 from 111, will not give you anything useful.
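To spell out why that subtraction misleads, here is a small Python sketch of the naive calculation, using only the figures quoted above.

```python
# The naive "load minus idle" subtraction for the two CPUs above.
# Figures are the ones quoted in the post.
cpus = {
    "Phenom II X6 1100T @ 3.3GHz": {"idle_w": 20, "load_w": 109},
    "Core i7 2600K @ 4.4GHz":      {"idle_w": 5,  "load_w": 111},
}

for name, w in cpus.items():
    naive = w["load_w"] - w["idle_w"]
    print(f"{name}: naive 'work' wattage = {naive}W")
# Phenom II: 89W, i7: 106W -- the i7 comes out 17W "hungrier" even
# though both chips draw roughly the same total power under full load.
# The idle figures reflect voltage stepping and shut-off chip regions,
# not a baseline that persists under load, so the subtraction tells
# you nothing useful.
```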