Topic: 3x7970 Mining Results. - page 11. (Read 61697 times)

legendary
Activity: 1162
Merit: 1000
DiabloMiner author
January 11, 2012, 05:44:01 PM
#63
NICE! So if you factor in the hardware costs... 5970s will rule for a while.

How about the 5870s at bens for $140? I wonder how they will stack up against the 7970 at $550.

Well, remember, slots have a very large premium. 5970s aren't getting any faster, but the 7970 is already essentially as fast as one, and the 7990 will be twice as fast as that.

Any guesses on where the 7970 is headed with optimizations?

Well, before anyone had one, I said it would probably be over 600 MH/s at stock speeds once optimized. We're over the 500 MH/s hurdle, so that's great.
hero member
Activity: 667
Merit: 500
January 11, 2012, 05:28:51 PM
#62
NICE! So if you factor in the hardware costs... 5970s will rule for a while.

How about the 5870s at bens for $140? I wonder how they will stack up against the 7970 at $550.

Well, remember, slots have a very large premium. 5970s aren't getting any faster, but the 7970 is already essentially as fast as one, and the 7990 will be twice as fast as that.

Any guesses on where the 7970 is headed with optimizations?
legendary
Activity: 1162
Merit: 1000
DiabloMiner author
January 11, 2012, 05:27:33 PM
#61
NICE! So if you factor in the hardware costs... 5970s will rule for a while.

How about the 5870s at bens for $140? I wonder how they will stack up against the 7970 at $550.

Well, remember, slots have a very large premium. 5970s aren't getting any faster, but the 7970 is already essentially as fast as one, and the 7990 will be twice as fast as that.
legendary
Activity: 3878
Merit: 1193
January 11, 2012, 05:18:05 PM
#60
That 3-card setup full out putting down over 1.6 GH/s at roughly 650W is pretty damn good.  I'm still not convinced it's better (for mining anyway) than a 5970 setup though.
It's not. Don't compare an underclocked/undervolted 7970 to an overclocked/overvolted 5970 if you want to compare performance per watt.

I can get 1.7 GH/s at 625W with 3x5970 by underclocking/undervolting.
full member
Activity: 131
Merit: 100
January 11, 2012, 05:14:26 PM
#59
If you had a system idle of around 118W (like 1onevvolf in the other 7970 thread), that would put your single-card mining power draw at 240W total!  That's amazing!!  Now if only these cards were about $300-350.

No... My single card is still 217 watts @ default and 122 watts undervolted.
full member
Activity: 131
Merit: 100
January 11, 2012, 05:07:42 PM
#58
fuck, updated the OP again with correct single card wattage.

OP, this may be a silly question, but I haven't figured out how to get the voltage down that low yet. What tool are you using?

thanks!

MSI Afterburner. Make sure you set overclocking mode to 2.
legendary
Activity: 1876
Merit: 1000
January 11, 2012, 05:07:34 PM
#57
IMHO the best way to look at that is cost per PH (petahash).

A 5970 runs $300 used, pulls about 250W, and gets ~750 MH/s. For someone (like me) with a ~$0.10/kWh electrical rate, if we estimate the card will have a 36-month effective lifespan, then:

750 MH/s * 60 * 60 * 24 * 30 * 36 = 69,984,000,000 MH, or ~70 PH
Lifecycle cost is $300 (capital cost) plus 250/1000 kW * 24 * 30 * 36 * $0.10 = $648 in electricity, so electrical consumption is about 2/3 of the total cost.

$948 total cost / 70 PH = $13.55 per PH

That is all that matters: can a new product deliver a better price per petahash?


NICE! So if you factor in the hardware costs... 5970s will rule for a while.

How about the 5870s at bens for $140? I wonder how they will stack up against the 7970 at $550.
hero member
Activity: 896
Merit: 1000
Seal Cub Clubbing Club
January 11, 2012, 05:05:12 PM
#56
If you had a system idle of around 118W (like 1onevvolf in the other 7970 thread), that would put your single-card mining power draw at 240W total!  That's amazing!!  Now if only these cards were about $300-350.
legendary
Activity: 1162
Merit: 1000
DiabloMiner author
January 11, 2012, 05:03:26 PM
#55
Doubtful. AMD went to a more Nvidia-like architecture because the trend is towards more complex shaders. Games aren't growing in pixel or polygon count (or at least not growing exponentially). The push is towards more complex and realistic effects on the same number of pixels/polygons. That means more complex shaders are more efficient.

NVidia moving to a less complex architecture with more shaders simply makes no sense.

That's not the entire thing, though. Nvidia uses a purely streaming setup with all their pipes (i.e., it's like how a CPU executes an instruction stream). AMD on GCN still has something like VLIW's clauses, but it's mainly inside the CUs now and not part of the ALUs anymore* (on VLIW5/4 it was part of the ALUs; each instruction was tagged with which ALU it ran on).

As far as I can tell, the ALU design itself is rather similar (except for the end of it that plugs into the CU, which is obviously different and enhanced). The big feature they added was SIMD execution, to reduce the size of the clause and decouple the CU's manhandling of the ALUs to get shit done.

GCN is really a hybrid of both schools of thought; for highly complex code, AMD did really well. Mining just... isn't exactly highly complex.

* As in, the compiler still chooses which ALUs to use; the GCN CUs just seem to demux the VLIW-like clauses the compiler produces instead of running them as a unified VLIW arch.
hero member
Activity: 667
Merit: 500
January 11, 2012, 05:02:43 PM
#54
fuck, updated the OP again with correct single card wattage.

OP, this may be a silly question, but I haven't figured out how to get the voltage down that low yet. What tool are you using?

thanks!
full member
Activity: 131
Merit: 100
January 11, 2012, 04:58:36 PM
#53
fuck, updated the OP again with correct single card wattage.
sr. member
Activity: 270
Merit: 250
January 11, 2012, 04:57:20 PM
#52
Getting towards impressive. Might be worth buying the 7xxx cards once the price drops.
hero member
Activity: 667
Merit: 500
January 11, 2012, 04:54:11 PM
#51
subbbbbbbbbed

I'm starting to like it more and more... anyone know what the story is on Linux drivers?
hero member
Activity: 518
Merit: 500
January 11, 2012, 04:53:28 PM
#50
Well, we can get all wet about the power figures, but the price and the performance/price ratio still suck big time AND (biggest of all) we still have not seen Nvidia's 28nm arch and products.

I just pray every night that they wake up and decide to make a BTC mining card this time around!

Doubtful. AMD went to a more Nvidia-like architecture because the trend is towards more complex shaders. Games aren't growing in pixel or polygon count (or at least not growing exponentially). The push is towards more complex and realistic effects on the same number of pixels/polygons. That means more complex shaders are more efficient.

NVidia moving to a less complex architecture with more shaders simply makes no sense.


Also, for everyone except those with free power, what matters is TOTAL LIFECYCLE cost: capital cost + electrical cost.

IMHO the best way to look at that is cost per PH (petahash).

A 5970 runs $300 used, pulls about 250W, and gets ~750 MH/s. For someone (like me) with a ~$0.10/kWh electrical rate, if we estimate the card will have a 36-month effective lifespan, then:

750 MH/s * 60 * 60 * 24 * 30 * 36 = 69,984,000,000 MH, or ~70 PH
Lifecycle cost is $300 (capital cost) plus 250/1000 kW * 24 * 30 * 36 * $0.10 = $648 in electricity, so electrical consumption is about 2/3 of the total cost.

$948 total cost / 70 PH = $13.55 per PH

That is all that matters: can a new product deliver a better price per petahash?

Really nice way of putting it. But what about the 7970's price per petahash?
full member
Activity: 131
Merit: 100
January 11, 2012, 04:45:49 PM
#49
I updated the OP with the correct power figures of 270 watts system idle. Either way it doesn't change anything drastically.

EDIT: On water, I wouldn't be amazed if they dip below 100 watts under load.
vip
Activity: 1358
Merit: 1000
AKA: gigavps
January 11, 2012, 04:40:33 PM
#48
That is all that matters: can a new product deliver a better price per petahash?

Amen.
legendary
Activity: 1162
Merit: 1000
DiabloMiner author
January 11, 2012, 04:36:07 PM
#47
Well, we can get all wet about the power figures, but the price and the performance/price ratio still suck big time AND (biggest of all) we still have not seen Nvidia's 28nm arch and products.

I just pray every night that they wake up and decide to make a BTC mining card this time around!

Nvidia still refuses to update their arch (which has been nearly identical since the 8000 series days; zero innovation) to get good integer performance.
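
For context on why integer performance matters here: SHA-256 is essentially 32-bit adds, XORs, and rotates, and the rotate is the operation AMD handles particularly cheaply. Below is a minimal plain-C sketch of that primitive, just for illustration; it is not DiabloMiner's actual kernel code. In an OpenCL kernel the same thing is normally written with rotate(), and on AMD hardware it can compile down to a single bit-align style instruction.

Code:
#include <stdint.h>
#include <stdio.h>

/* 32-bit rotate right, the operation SHA-256's sigma functions are built on. */
static uint32_t rotr32(uint32_t x, unsigned n) {
    return (x >> n) | (x << (32u - n));
}

/* One of SHA-256's big-sigma functions: three rotates and two XORs,
 * repeated across 64 rounds for every single hash attempt. */
static uint32_t big_sigma0(uint32_t x) {
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}

int main(void) {
    printf("Sigma0(0x6a09e667) = 0x%08x\n", big_sigma0(0x6a09e667u));
    return 0;
}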
donator
Activity: 1218
Merit: 1079
Gerald Davis
January 11, 2012, 04:33:51 PM
#46
Well, we can get all wet about the power figures, but the price and the performance/price ratio still suck big time AND (biggest of all) we still have not seen Nvidia's 28nm arch and products.

I just pray every night that they wake up and decide to make a BTC mining card this time around!

Doubtful. AMD went to a more Nvidia-like architecture because the trend is towards more complex shaders. Games aren't growing in pixel or polygon count (or at least not growing exponentially). The push is towards more complex and realistic effects on the same number of pixels/polygons. That means more complex shaders are more efficient.

NVidia moving to a less complex architecture with more shaders simply makes no sense.


Also, for everyone except those with free power, what matters is TOTAL LIFECYCLE cost: capital cost + electrical cost.

IMHO the best way to look at that is cost per PH (petahash).

A 5970 runs $300 used, pulls about 250W, and gets ~750 MH/s. For someone (like me) with a ~$0.10/kWh electrical rate, if we estimate the card will have a 36-month effective lifespan, then:

750 MH/s * 60 * 60 * 24 * 30 * 36 = 69,984,000,000 MH, or ~70 PH
Lifecycle cost is $300 (capital cost) plus 250/1000 kW * 24 * 30 * 36 * $0.10 = $648 in electricity, so electrical consumption is about 2/3 of the total cost.

$948 total cost / 70 PH = $13.55 per PH

That is all that matters: can a new product deliver a better price per petahash?
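
For anyone who wants to plug in their own numbers, the same lifecycle arithmetic fits in a few lines of C. The $300 price, 250 W draw, 750 MH/s, $0.10/kWh rate, and 36-month lifespan below are the figures from the post above, not measurements of mine; swap in your own.

Code:
#include <stdio.h>

/* Lifecycle cost per petahash, using the 5970 figures quoted above. */
int main(void) {
    double price_usd     = 300.0;  /* used 5970 */
    double watts         = 250.0;  /* draw while mining */
    double mhash_per_sec = 750.0;
    double usd_per_kwh   = 0.10;
    double months        = 36.0;   /* assumed effective lifespan */

    double hours = 24.0 * 30.0 * months;
    double elec  = watts / 1000.0 * hours * usd_per_kwh;   /* ~$648  */
    double total = price_usd + elec;                       /* ~$948  */
    double phash = mhash_per_sec * 3600.0 * hours / 1e9;   /* ~70 PH */

    printf("electricity: $%.0f, total: $%.0f, %.1f PH, $%.2f per PH\n",
           elec, total, phash, total / phash);
    return 0;
}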



legendary
Activity: 1876
Merit: 1000
January 11, 2012, 04:33:37 PM
#45

Yeah, correct. But I am re-testing the idle wattage, as I have a GPU that won't go into idle mode.

Even if you change the 290, it will not change the wall number, so the above conclusion stands.
full member
Activity: 131
Merit: 100
January 11, 2012, 04:31:20 PM
#44
Three 7970s:

System idle: 290 watts
Mining: 925/1375 MHz, 1.17 V, 630 watts
Mining: 925/340 MHz, 1.17 V, 625 watts
Mining: 925/340 MHz, 880 mV, 360 watts
Mining: 925/340 MHz, 865 mV, 345 watts

~1650 MH/s


So correct me if I am wrong: we have to add 290 watts to these figures to get the wall numbers:

Mining: 925/1375 MHz, 1.17 V, 630 watts -- 920 watts at the wall
Mining: 925/340 MHz, 1.17 V, 625 watts -- 915 watts at the wall
Mining: 925/340 MHz, 880 mV, 360 watts -- 650 watts at the wall
Mining: 925/340 MHz, 865 mV, 345 watts -- 635 watts at the wall


So, let's take the best case: 635 watts, 1650 MH/s. If you had 6 of them in a rig, to compare against one of my 4x5970 rigs:

4x5970: 2950 MH/s at 1250 watts
6x7970: 3300 MH/s at 1270 watts


So, basically you can get 350 more MH/s for another 20 watts.

In conclusion, it beats the 5970, but just barely, and this does not take into account the price of the hardware!!
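
Just to make the "barely" concrete, here is the same comparison expressed as MH/s per watt, using only the wall numbers quoted above (a throwaway C sketch, nothing more):

Code:
#include <stdio.h>

/* Hashes per watt for the two rigs compared above (wall figures). */
int main(void) {
    double mh_5970 = 2950.0, w_5970 = 1250.0;   /* 4x5970 rig   */
    double mh_7970 = 3300.0, w_7970 = 1270.0;   /* 2x (3x7970)  */

    printf("4x5970: %.2f MH/s per watt\n", mh_5970 / w_5970);   /* ~2.36 */
    printf("6x7970: %.2f MH/s per watt\n", mh_7970 / w_7970);   /* ~2.60 */
    return 0;
}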




Yeah, correct. But I am re-testing the idle wattage, as I have a GPU that won't go into idle mode.