Just a little. There are so many of us here who won't admit that AMD is more efficient when it comes to power.
Your biggest bill is always power. You can spend $80 on a 300W card, but you still have to pay for the power every year.
Pay $280 for a 50W card and you've paid more initially, but less in the long run:
0.3 kW * 24 h * 365 days = 2,628 kWh @ $0.10/kWh = $262.80
0.05 kW * 24 h * 365 days = 438 kWh @ $0.10/kWh = $43.80
Year 1:
$80 + $262.80 = $342.80
$280 + $43.80 = $323.80
Year 2:
$80 + $262.80 + $262.80 = $605.60
$280 + $43.80 + $43.80 = $367.60
Difference: $238
It doesn't matter how you say it, or what card you have: if it isn't energy efficient and you pay for power, you lose. Yes, the cheaper card may do more hashing, but over 2 years the power-hungry card would have to do enough extra hashes to cover that $238 gap just to break even.
You can't argue basic math. Now imagine the difference over 10 or 20 GPUs.
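The arithmetic above is easy to check (or scale up to any card, rate, or fleet size) with a quick script. This is just a sketch of the same total-cost-of-ownership calculation; the card prices, wattages, and $0.10/kWh rate are the figures from this post, not universal numbers.

```python
def total_cost(card_price, watts, years, rate_per_kwh=0.10):
    """Purchase price plus electricity cost for running 24/7 over `years`."""
    kwh_per_year = watts / 1000 * 24 * 365  # 300 W -> 2,628 kWh/yr
    return card_price + kwh_per_year * rate_per_kwh * years

# $80 / 300 W card vs. $280 / 50 W card, as in the post
for years in (1, 2):
    cheap = total_cost(80, 300, years)       # year 1: 342.80, year 2: 605.60
    efficient = total_cost(280, 50, years)   # year 1: 323.80, year 2: 367.60
    print(f"Year {years}: ${cheap:.2f} vs ${efficient:.2f}, "
          f"diff ${cheap - efficient:.2f}")
```

Multiply the difference by the number of GPUs in your rig and the efficient card wins by an even wider margin.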