Author

Topic: Water-cooling saved me 50W per 7970? (Read 5328 times)

member
Activity: 98
Merit: 10
April 29, 2012, 02:48:49 AM
#20
Nice results.
Water cooling had the same effect on my 6850s but not to that extent.
With stock cooling I could only overclock to 960MHz for 245MH/s per card, pulling 448W from the wall.
After water cooling with 2x Antec 620 coolers I got for $80 NZD each, I could overclock to 1010MHz for 260MH/s per card, pulling 440W from the wall.
The cards with stock cooling were running at up to 78C; after fitting the 620 water coolers they were running at 45C.
Got some air ducting so the radiators draw in air from outside. That brought the temps down to under 40C and also let me overclock a bit more, to 1030MHz for 272MH/s per card at 450W at the wall.

So I went from
490MH/s @ 448W
To
544MH/s @ 450W
for $180 NZD with two Antec 620 water coolers. Also, the PC is whisper quiet.
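
Worked out as efficiency rather than raw hashrate, the gain is clearer still. A minimal Python sketch using the figures above:

Code:
# Hash-per-watt comparison using the 6850 numbers above (wall power).
air   = (490, 448)   # (MH/s, W) two cards, stock cooling @ 960MHz
water = (544, 450)   # (MH/s, W) water + ducting @ 1030MHz

for label, (mhs, watts) in (("air", air), ("water", water)):
    print(f"{label}: {mhs / watts:.3f} MH/s per watt")
# water works out roughly 10% more efficient despite the higher clocks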
donator
Activity: 1218
Merit: 1079
Gerald Davis
April 27, 2012, 10:01:27 AM
#19
The GPUs are in the garage.
donator
Activity: 1218
Merit: 1079
Gerald Davis
April 26, 2012, 10:09:27 AM
#18
Hmm, wonder how much less power it will use @ 20C. :)

Guess I will find out this winter as my WC radiator will be outside.
member
Activity: 93
Merit: 10
April 25, 2012, 11:25:12 PM
#17
Something stinks... check your Kill-A-Watt! I'd buy 10W each, but 50W is unreal.
This is very real. I did a short test of how temperature affects power consumption, and the difference was 40-50W. The test: a GTX480 was stressed with FurMark with the fan off until it reached 100C (yes, I know that's very hot, but it only sat there for a few seconds), then I measured the power consumption. When I let the fan run at full blast again, power consumption shot up instantly by 10W, which means the fan alone draws about 10W. Then, as it cooled the GPU down to around 60C, the power difference was 40-50W. I forget the exact numbers, but the take-home message was that low temps have a significant effect on power draw.
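
To make the bookkeeping explicit (the split between fan and die temperature is my reading of the numbers above):

Code:
# Decomposing the GTX480 FurMark test. Relative numbers only; I no longer
# have the absolute wall readings.
fan_power = 10           # W: the instant jump when the fan spun up at 100C
delta_hot_to_cool = 45   # W: midpoint of the 40-50W swing

# The fan was running in both end states once switched back on, so the
# 40-50W swing is attributable to die temperature (100C -> ~60C) alone.
print(f"~{fan_power}W for the fan, ~{delta_hot_to_cool}W from die temperature")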
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
April 21, 2012, 06:43:08 PM
#16
I tested using cgminer's auto-fan from 75 degrees (where it sits around 73 most of the time) to doubling the fan speed manually, which brought it down to 65 degrees. It used 2 watts more. So whatever die-leakage savings there are between 65 and 75 degrees are offset by the fan power.
* ckolivas puts it back on default autofan settings
legendary
Activity: 1344
Merit: 1004
April 21, 2012, 06:37:28 PM
#15
FWIW, you can get similar results on air. Leave your GPU fans on auto. Not cgminer auto, but actual auto, y'know, the auto that comes with the GPU BIOS. Mine with that for about 15 minutes, then check temperatures and power (temps will likely be in the 80-95C range). Now fix your fans at 75% and check back in 15 minutes. You should see a significant difference in power. Just on air cooling, I measured 30 watts less across three 5830s (stock volts, stock core, 205MHz memory) at 75% fan versus auto fan.
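
If you want to script that A/B test on a Linux rig, something like this sketch would do it. It assumes the old Catalyst aticonfig commands for fan speed and temperature (check they exist on your setup); wall power still has to be read off the Kill-A-Watt by hand:

Code:
# Sketch of the auto-vs-75% fan comparison. Assumes Catalyst's aticonfig;
# the output parsing may need adjusting for your driver version.
import subprocess, time

def gpu_temp(adapter=0):
    out = subprocess.run(["aticonfig", "--odgt", f"--adapter={adapter}"],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "Temperature" in line:   # e.g. "Sensor 0: Temperature - 78.00 C"
            return float(line.split("-")[-1].split()[0])

time.sleep(15 * 60)                 # phase 1: mine on BIOS auto fan
print("auto fan:", gpu_temp(), "C -- note the Kill-A-Watt reading")

subprocess.run(["aticonfig", "--pplib-cmd", "set fanspeed 0 75"])
time.sleep(15 * 60)                 # phase 2: fan pinned at 75%
print("75% fan:", gpu_temp(), "C -- note the Kill-A-Watt reading")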
full member
Activity: 131
Merit: 100
April 21, 2012, 06:33:56 PM
#14
What you are seeing is true. I lost about 42 watts per 7970 when switching to water cooling.

You can see my results here: https://bitcointalksearch.org/topic/m.683755

Yeah, but he's talking about 50W per card. Do the stock fans really consume that much power? I have a full-size box fan that doesn't even use that much.
It isn't the fans; it's the GPU die leakage at high temperature. The phenomenon is much more pronounced at smaller process nodes such as 28nm.

yep
hero member
Activity: 518
Merit: 500
April 21, 2012, 05:09:41 PM
#13
Lucky for you, I have an even better solution: an in-line valve on my water-cooling loop that stops the inbound flow to my reservoir so I can drain the loop. The power usage jumps by more than 100W between 55C and 90C when I close that valve. A tiny bit is due to the pumps drawing more power, but the majority of it is definitely the graphics cards.

Wow... that's... surprising indeed. Thanks for testing and posting.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
April 21, 2012, 12:50:03 PM
#12
What you are seeing is true. I lost about 42 watts per 7970 when switching to water cooling.

You can see my results here: https://bitcointalksearch.org/topic/m.683755

Yeah, but he's talking about 50W per card. Do the stock fans really consume that much power? I have a full-size box fan that doesn't even use that much.
It isn't the fans; it's the GPU die leakage at high temperature. The phenomenon is much more pronounced at smaller process nodes such as 28nm.
legendary
Activity: 1554
Merit: 1222
brb keeping up with the Kardashians
April 21, 2012, 12:49:05 PM
#11
What you are seeing is true. I lost about 42 watts per 7970 when switching to water cooling.

You can see my results here: https://bitcointalksearch.org/topic/m.683755

Yeah, but he's talking about 50W per card. Do the stock fans really consume that much power? I have a full-size box fan that doesn't even use that much.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
April 21, 2012, 12:48:36 PM
#10
Hmm... interesting. Kinda hard to believe, tho.
Would you mind doing some testing by shutting down the fans on your radiator, letting the water temp go up, and seeing if there is indeed a tangible impact on power consumption?

I think it's more likely you accidentally changed something else, say, something that lowered CPU usage, but it would be worth checking.


Lucky for you, I have an even better solution: an in-line valve on my water-cooling loop that stops the inbound flow to my reservoir so I can drain the loop. The power usage jumps by more than 100W between 55C and 90C when I close that valve. A tiny bit is due to the pumps drawing more power, but the majority of it is definitely the graphics cards.
What would be cool is if you could slow the fans down instead, which would allow a somewhat gradual rise in temperature. While doing that, monitor the temp and power consumption and see if you can find a point at which the efficiency falls off a cliff, or whether it is linear. It would be nice to know that at, say, 82 degrees the efficiency suddenly tanks, so we can tell people to always stay under 80, or something like that.
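
The analysis on the logged data would be simple enough; a sketch with invented readings just to show the shape of it:

Code:
# Looking for the knee: log (temp C, wall W) pairs as the fans slow, then
# check whether watts-per-degree stays flat or suddenly steepens.
# These readings are made up purely to illustrate the analysis.
readings = [(55, 750), (65, 762), (70, 771), (75, 784), (80, 805), (85, 838)]

for (t0, p0), (t1, p1) in zip(readings, readings[1:]):
    slope = (p1 - p0) / (t1 - t0)
    print(f"{t0}-{t1}C: {slope:.1f} W/degree")
# a linear response gives a constant slope; a knee shows up as a jump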
full member
Activity: 131
Merit: 100
April 21, 2012, 10:51:59 AM
#9
What you are seeing is true. I lost about 42 watts per 7970 when switching to water cooling.

You can see my results here: https://bitcointalksearch.org/topic/m.683755
full member
Activity: 238
Merit: 100
★YoBit.Net★ 350+ Coins Exchange & Dice
April 21, 2012, 09:59:01 AM
#8
Hmm... interesting. Kinda hard to believe, tho.
Would you mind doing some testing by shutting down the fans on your radiator, letting the water temp go up, and seeing if there is indeed a tangible impact on power consumption?

I think it's more likely you accidentally changed something else, say, something that lowered CPU usage, but it would be worth checking.


Lucky for you, I have an even better solution: an in-line valve on my water-cooling loop that stops the inbound flow to my reservoir so I can drain the loop. The power usage jumps by more than 100W between 55C and 90C when I close that valve. A tiny bit is due to the pumps drawing more power, but the majority of it is definitely the graphics cards.
member
Activity: 85
Merit: 10
April 21, 2012, 09:07:54 AM
#7
I think it's more likely you accidentally changed something else, say, something that lowered CPU usage, but it would be worth checking.

I for one have observed a great difference in CPU load between cgminer and Phoenix, the latter showing almost 40% load across the cores (Windows).
hero member
Activity: 518
Merit: 500
April 21, 2012, 03:04:44 AM
#6
Hmm... interesting. Kinda hard to believe, tho.
Would you mind doing some testing by shutting down the fans on your radiator, letting the water temp go up, and seeing if there is indeed a tangible impact on power consumption?

I think it's more likely you accidentally changed something else, say, something that lowered CPU usage, but it would be worth checking.
hero member
Activity: 812
Merit: 510
April 21, 2012, 01:57:04 AM
#5
Something stinks... check your Kill-A-Watt! I'd buy 10W each, but 50W is unreal.

However, if the efficiency of the PSU falls off rapidly past its sweet spot, then the power consumption measured at the wall could change drastically even when the actual change on the DC side wasn't that large. The effect would be even more extreme when approaching the maximum load capacity of the PSU.

So what kind of load have you got, and what PSU? :D


It kind of does... the stock fan draws 0.8A at 12V, for a total of 9.6 watts, so 19.2 watts between the two cards? I can't believe that leakage accounts for the remaining 80W across two cards...
full member
Activity: 238
Merit: 100
★YoBit.Net★ 350+ Coins Exchange & Dice
April 20, 2012, 11:41:18 PM
#4
Something stinks... check your Kill-A-Watt! I'd buy 10W each, but 50W is unreal.

However, if the efficiency of the PSU falls off rapidly past its sweet spot, then the power consumption measured at the wall could change drastically even when the actual change on the DC side wasn't that large. The effect would be even more extreme when approaching the maximum load capacity of the PSU.

So what kind of load have you got, and what PSU? :D


Corsair AX1200. The load was 850W on air and became 750W after I put in the water. I checked my APC BX1500G and then looked at the Kill-A-Watt; after factoring in the UPS inefficiency, they both agree.

This is normal if you were running the 7970s at high temperatures when you had fans. If you were running at 80C+ on fans and then dropped the GPUs to 35-40C with water, there will be a pretty significant difference in power consumption. My 7970s seem to have a pretty incredible amount of leakage once they hit the mid/high 70s.

Makes sense. I had 91C under load on air, and water brought them down to 51-55C.
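
That drop also lines up with how leakage usually scales: subthreshold leakage grows roughly exponentially with junction temperature. Back-of-the-envelope, with an assumed doubling interval (not a measured 7970 figure):

Code:
# Rough leakage scaling with temperature; the ~20C doubling interval is an
# assumption, not a 7970 datasheet number.
def leak_ratio(t_hot, t_cool, doubling_c=20):
    return 2 ** ((t_hot - t_cool) / doubling_c)

print(f"91C vs ~53C: leakage roughly x{leak_ratio(91, 53):.1f}")  # ~3.7x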
hero member
Activity: 642
Merit: 500
April 20, 2012, 11:31:48 PM
#3
This is normal if you were running the 7970s at high temperatures when you had fans. If you were running at 80C+ on fans and then dropped the GPUs to 35-40C with water, there will be a pretty significant difference in power consumption. My 7970s seem to have a pretty incredible amount of leakage once they hit the mid/high 70s.
member
Activity: 85
Merit: 10
April 20, 2012, 11:29:14 PM
#2
Something stinks... check your Kill-A-Watt! I'd buy 10W each, but 50W is unreal.

However, if the efficiency of the PSU falls off rapidly past its sweet spot, then the power consumption measured at the wall could change drastically even when the actual change on the DC side wasn't that large. The effect would be even more extreme when approaching the maximum load capacity of the PSU.

So what kind of load have you got, and what PSU? :D
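
To put rough numbers on the PSU point (the efficiency figures here are illustrative, not AX1200 measurements): a modest DC-side drop can read much bigger at the wall if it also moves the PSU back toward its sweet spot.

Code:
# How a DC-side change shows up at the wall. Efficiencies are assumptions.
def wall_watts(dc_watts, efficiency):
    return dc_watts / efficiency

before = wall_watts(740, 0.87)   # heavily loaded at 87% -> ~851W at the wall
after  = wall_watts(675, 0.90)   # lighter load at 90%   -> 750W at the wall
print(f"DC delta: {740 - 675}W, wall delta: {before - after:.0f}W")
# a 65W DC change reads as ~100W on the Kill-A-Watt in this scenario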
full member
Activity: 238
Merit: 100
★YoBit.Net★ 350+ Coins Exchange & Dice
April 20, 2012, 09:39:04 PM
#1
Apparently not having to run those fans, plus the lower temperature (and thus lower leakage currents), was enough to save me 100W across my two cards when I installed my GPU waterblocks today. (Note I already had a water loop in place for my CPU and my GTX580, so the cost of water-cooling these 7970s was basically just the two blocks at $100 each.)
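
For anyone weighing the same upgrade, the electricity-only payback is easy to estimate (the rate below is an assumption; plug in your own):

Code:
# Payback on $200 of waterblocks from the 100W saving alone.
block_cost  = 200.0   # USD: two blocks at $100 each
watts_saved = 100.0
rate        = 0.10    # USD per kWh -- assumed, adjust for your utility

usd_per_day = watts_saved / 1000 * 24 * rate
print(f"${usd_per_day:.2f}/day -> payback in {block_cost / usd_per_day:.0f} days")
# about $0.24/day, i.e. a bit over two years on power savings alone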

Anyone else experience something similar?