IDLE SYSTEM: 86 watts
CPU MINING ONLY (RandomX @ 1500 H/s): 186 watts
GPU MINING ONLY (1x 1060 + 5x 1050 Ti on X16Rv2): 465 watts
CPU + GPU MINING: 550 watts
But yes, your point is totally accurate if I am only CPU mining -- the H/s per watt is not great once all that idle draw is counted against it.
There seems to be some synergy here, though... CPU mining alone adds 100 watts of draw, but if I'm already GPU mining, stacking the CPU on top only adds 85 watts more (this is the number I was recalling).
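For anyone following along, here's the delta arithmetic as a quick Python sketch, using just the four wall readings above:

# Wall-power readings measured on this rig, in watts
idle = 86
cpu_only = 186      # RandomX @ 1500 H/s
gpu_only = 465      # 1x 1060 + 5x 1050 Ti on X16Rv2
cpu_plus_gpu = 550  # both at once

print(cpu_only - idle)          # 100 W: what CPU mining adds starting from idle
print(cpu_plus_gpu - gpu_only)  # 85 W: what CPU mining adds on top of GPU mining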
So at my power cost it would run me $0.3122 per day to ONLY mine on the CPU -- that only jumps to $0.924 per day to do both ($0.61 more)... and worst case I can make $1 a day on NiceHash with this system's GPUs. This is one of my lesser systems in terms of GPU earnings; it's just the one I had a monitor hooked up to this morning.
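Quick sanity check on those dollar figures (the $0.07/kWh rate is an assumption back-solved from my own $0.3122/day number; plug in your own tariff):

RATE = 0.07  # $/kWh -- implied by 0.3122 / (0.186 kW * 24 h); adjust to your rate

def cost_per_day(watts, rate=RATE):
    # Daily cost in dollars for a constant draw at the wall
    return watts / 1000 * 24 * rate

print(cost_per_day(186))                      # ~0.31/day, CPU mining only
print(cost_per_day(550))                      # ~0.92/day, CPU + GPU
print(cost_per_day(550) - cost_per_day(186))  # ~0.61/day extra to run both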
So the way I'll go is to stack the RandomX CPU mining on top of the GPU mining, and my *additional* earnings ring in at the ~16 H/s per watt figure.
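(That ~16 is just hash rate over marginal watts: 1500 H/s / 95 W ≈ 15.8 H/s per watt using the standalone CPU figure from earlier in the thread; against the stacked 85 W delta it comes out a bit better, about 17.6.)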
Is the 1500 H/s maintained when you are also mining on the GPUs, or does it take a hit?
I have just now tested my Dell T5500 with dual X5670s. This is my main workstation and it is powered on 24/7.
At idle the power used is 110 watts.
When mining RandomX it goes to a constant 308 watts. The hash rate is 3124 H/s, or 1562 H/s for each X5670 Xeon. So the power used mining RandomX is 198 watts for the two X5670 Xeons, or 99 watts per Xeon. That makes your original 95 watt number for the X5660 right on the mark.
So my Dell T5500 with dual X5670s, mining RandomX, produces 3124 H/s for an additional 198 watts of draw, which works out to 15.78 H/s per watt.
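Same arithmetic in the Python form used above, with my T5500 numbers:

idle = 110       # watts, workstation at idle (it runs 24/7 anyway)
mining = 308     # watts, steady while mining RandomX
hashrate = 3124  # H/s across both X5670s

marginal = mining - idle    # 198 W attributable to mining
print(marginal / 2)         # 99 W per X5670
print(hashrate / marginal)  # ~15.78 H/s per watt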