Well, we can get all wet about the power figures, but the prices and performance/price ratio still suck big time AND (biggest of all) we still haven't seen Nvidia's 28nm architecture and products.
I just pray every night they wake up and decide to make a BTC mining card this time around!
Doubtful. AMD went to a more Nvidia-like architecture because the trend is toward more complex shaders. Games aren't growing in pixel or polygon count (or at least not growing exponentially). The push is toward more complex and realistic effects on the same number of pixels/polygons, which means more complex shaders are more efficient.
Nvidia moving the other way, to a less complex but more numerous shader architecture, simply makes no sense.
Also, for everyone except those with free power, what matters is TOTAL LIFECYCLE cost: capital cost + electrical cost.
IMHO the best way to look at that is cost per PH (petahash).
A 5970 runs $300 used, pulls about 250W, and gets ~750 MH/s. For someone (like me) with a ~$0.10/kWh electrical rate, if we estimate the card will have a 36-month effective lifespan, then:
750 MH/s * 60 * 60 * 24 * 30 * 36 = 69,984,000,000 MH, or ~70 PH
Lifecycle cost is $300 (capital cost) + 250/1000 kW * 24 * 30 * 36 hours * $0.10/kWh = $300 + $648 = $948 (electricity is about 2/3 of the total cost).
$948 total cost / 69.98 PH ≈ $13.55 per PH
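The arithmetic above generalizes to any card. Here's a minimal sketch of the same lifecycle-cost calculation (the function name and parameters are my own; the figures plugged in are the 5970 example from this post):

```python
def cost_per_ph(capital_usd, watts, mh_per_s, usd_per_kwh, months):
    """Total lifecycle cost (capital + electricity) per petahash."""
    hours = 24 * 30 * months                 # approximate a month as 30 days
    total_mh = mh_per_s * 3600 * hours       # megahashes produced over the lifespan
    ph = total_mh / 1e9                      # 1 PH = 1e9 MH
    electricity = (watts / 1000) * hours * usd_per_kwh
    return (capital_usd + electricity) / ph

# The 5970 example: $300 used, 250W, 750 MH/s, $0.10/kWh, 36 months
print(round(cost_per_ph(300, 250, 750, 0.10, 36), 2))  # ~13.55
```

Swap in a new card's price, wattage, and hashrate and you can compare it head-to-head against the 5970 on the only number that matters.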
That's all that really matters: can a new product deliver a better cost per petahash?