2.62 Mh/J is meaningless as it's not comparing like-for-like: you're adding in the unknown variable of the system, which will be vastly different (as you noted) from system to system.
3.74 Mh/J is meaningful, as other people who wish to compare their values can use this number after factoring out their own baseline system power.
Your argument is invalid because 3.74 Mh/J is also influenced by unknown variables, such as the efficiency of the power supply, which varies with load: http://www.anandtech.com/show/2624/3
Here is a thought experiment: yochdog's load/idle power draw is 512/154 Watt. He replaces his power supply with one that is just as efficient at high loads but more efficient at low loads, changing his measurements to 512/130 Watt. Suddenly his mining efficiency drops from 1340/(512-154) = 3.74 Mh/J to 1340/(512-130) = 3.51 Mh/J! Explain to me how a formula in which efficiency gets worse when you use better hardware components is useful.
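To make the arithmetic explicit, here is a quick Python sketch of both numbers across the two PSU scenarios (the figures are yochdog's, the variable names are mine). Note that the whole-system figure is unchanged by the PSU swap, while the load-minus-idle figure gets worse:

    # Same card, same hashrate; only the PSU's efficiency at idle changes.
    hashrate_mhs = 1340.0  # Mh/s (yochdog's rig)

    scenarios = {
        "original PSU":    {"load_w": 512, "idle_w": 154},
        "better-idle PSU": {"load_w": 512, "idle_w": 130},
    }

    for name, s in scenarios.items():
        whole_system = hashrate_mhs / s["load_w"]                     # Mh/J from total wall power
        load_minus_idle = hashrate_mhs / (s["load_w"] - s["idle_w"])  # Mh/J after subtracting idle
        print(f"{name}: whole-system {whole_system:.2f} Mh/J, "
              f"load-minus-idle {load_minus_idle:.2f} Mh/J")

    # original PSU:    whole-system 2.62 Mh/J, load-minus-idle 3.74 Mh/J
    # better-idle PSU: whole-system 2.62 Mh/J, load-minus-idle 3.51 Mh/J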
Of course, if everybody had clamp meters, the ultimate way to measure the efficiency of a card would be to measure current at the PCIe power connectors and PCIe slot, like I demonstrated a while ago: http://blog.zorinaq.com/?e=42
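For anyone who does have a clamp meter, the card-only power is just the sum of volts times amps over the card's supply rails. A rough Python sketch, assuming nominal rail voltages and with made-up current/hashrate placeholders (measuring the actual rail voltages is more accurate; see the blog post for the real procedure):

    # Card-only power from clamp-meter readings. Rail voltages are assumed nominal;
    # the current and hashrate values below are placeholders, not measurements.
    RAIL_VOLTS = {"pcie_6pin_12v": 12.0, "slot_12v": 12.0, "slot_3v3": 3.3}

    readings_amps = {
        "pcie_6pin_12v": 6.5,  # clamp meter on the 6-pin PCIe connector's 12 V wires
        "slot_12v": 3.0,       # 12 V drawn through the PCIe slot
        "slot_3v3": 0.5,       # 3.3 V drawn through the PCIe slot
    }

    card_power_w = sum(RAIL_VOLTS[rail] * amps for rail, amps in readings_amps.items())
    card_hashrate_mhs = 400.0  # placeholder per-card hashrate

    print(f"card power: {card_power_w:.1f} W")                                   # 115.7 W
    print(f"card-only efficiency: {card_hashrate_mhs / card_power_w:.2f} Mh/J")  # 3.46 Mh/J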