Topic: Undervolting a 5870 and a 5770 to achieve better MH/J performance - page 2

donator
Activity: 1218
Merit: 1079
Gerald Davis
I tried some undervolting on my cards. A couple had a VRM chip, so I could change the voltage without any problems. Other cards didn't have any voltage control, so I tried to modify the card's BIOS to change the voltage.
I'm pretty sure all non-ancient graphics cards have voltage control. If they didn't, power management would be non-existent.

In Linux you should be able to use AMDOverdriveCtrl to control the voltage (within the limits set by the BIOS). I'm not sure which Windows programs achieve the same. Here's a list that provides some candidates: https://en.bitcoin.it/wiki/GPU_overclocking_tools

Many modern (mostly cheaper) models lack voltage control. The VRMs are not adjustable. Sure, it wastes power, but it makes the card $5 to $10 cheaper.
legendary
Activity: 980
Merit: 1008
I tried some undervolting on my cards. A couple had a VRM chip, so I could change the voltage without any problems. Other cards didn't have any voltage control, so I tried to modify the card's BIOS to change the voltage.
I'm pretty sure all non-ancient graphics cards have voltage control. If they didn't, power management would be non-existent.

In Linux you should be able to use AMDOverdriveCtrl to control the voltage (within the limits set by the BIOS). I'm not sure which Windows programs achieve the same. Here's a list that provides some candidates: https://en.bitcoin.it/wiki/GPU_overclocking_tools
hero member
Activity: 632
Merit: 500
So,

I tried some undervolting on my cards. A couple had a VRM chip, so I could change the voltage without any problems. Other cards didn't have any voltage control, so I tried to modify the card's BIOS to change the voltage.

My test card was a PowerColor AX6770 with these default settings:
850 MHz
1.2 V

It worked partially: I was able to modify the default clock speed and the default voltage. I created a new BIOS with RBE, with the clock at 700 MHz and the voltage at 0.950 V. The BIOS has several "default" clock profiles, and I modified every one of them to use 0.950 V.

When I booted the card on my mining rig, everything was in order. The card defaulted to 700 MHz, and cgminer showed the voltage at 0.950 V. The thing is, while mining, the card didn't care about the voltage instructions in the BIOS. It was still consuming the same number of watts as before, with the stock BIOS. I could also clock the card at 900 MHz without any problem, with the same power consumption as before.

Overall, the card just used the voltage it needed, ignoring the instructions in the BIOS.

Have any of you tried something similar? I believe we have a deep topic on our hands, and it would be interesting to compile the information on what you guys did.

Here are the results of my testing (a quick efficiency calculation follows below):
Diamond Radeon 5850
http://www.diamondmm.com/5850PE51G.php
Went from 2.68 MH/J to 3.51 MH/J
700 MHz at 0.88 V
260 MH/s

Sapphire Radeon 5830
http://devicegadget.com/hardware/sapphire-radeon-hd-5830-xtreme-review/3760/
From 2.07 MH/J to 3.16 MH/J
725 MHz at 0.95 V
215 MH/s

Sapphire Vapor 5770 (this thing is a beauty!!!)
http://www.sapphiretech.com/presentation/product/?psn=0001&pid=305&lid=1
From 2.13 MH/J to 3.42 MH/J
825 MHz at 0.95 V
178 MH/s

PowerColor 5870 (this one has an Arctic Cooler on it, because I physically broke the stock fan)
http://techiser.com/powercolor-radeon-hd-5870-ax5870-graphic-card-118806.html
From 2.33 MH/J to 3.51 MH/J
750 MHz at 0.95 V
330 MH/s

These cards have no VRM control:
Powercolor AX6770
http://www.shopping.com/power-color-powercolor-ax6770-1gbd5-h-radeon-hd-6770-1gb-128-bit-gddr5-pci-express-2-1-x16-hdcp-ready-video-card/info

Sapphire 6950
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102914
Yeah, this one is surprising. You can use PowerTune to adjust the consumption, but it keeps the same MH/J.

Powercolor 5770
http://www.guru3d.com/article/powercolor-radeon-hd-5770-pcs-review/
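For anyone wanting a quick sanity check on figures like these: since 1 W = 1 J/s, a card's draw is just MH/s divided by MH/J. Below is a minimal Python sketch using the numbers posted above; the wattages it prints are back-calculated rather than independently measured, and whether they are card-only or at-the-wall depends on how the poster measured MH/J.

Code:
# Back-calculate approximate power draw from the posted hashrate (MH/s)
# and efficiency (MH/J): watts = (MH/s) / (MH/J), since 1 J/s = 1 W.
# Numbers are the undervolted results listed above.
results = {
    "Diamond 5850":        {"mhs": 260, "mhj_before": 2.68, "mhj_after": 3.51},
    "Sapphire 5830":       {"mhs": 215, "mhj_before": 2.07, "mhj_after": 3.16},
    "Sapphire Vapor 5770": {"mhs": 178, "mhj_before": 2.13, "mhj_after": 3.42},
    "PowerColor 5870":     {"mhs": 330, "mhj_before": 2.33, "mhj_after": 3.51},
}

for card, r in results.items():
    watts = r["mhs"] / r["mhj_after"]                     # implied draw, undervolted
    gain = (r["mhj_after"] / r["mhj_before"] - 1) * 100   # efficiency improvement
    print(f"{card}: ~{watts:.0f} W undervolted, efficiency up ~{gain:.0f}%")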

hero member
Activity: 535
Merit: 500
Underclocking the memory below 300 MHz even gives some performance!

Here's what I had before on my Sapphire Radeon 5770:
960/300 MHz @ 1.005 V:
temp1: 67
temp2: 72
temp3: 70
fan: 50%
221.39 MHash/s

And here's what I get now:
960/244 MHz @ 1.005 V:
temp1: 66
temp2: 70
temp3: 69
fan: 48%
222.25 MHash/s!!!!
legendary
Activity: 980
Merit: 1008
What would be the theoretical correlation between stable voltage and clock pairs? I mean, if I decrease the voltage by 10%, should I also be able to decrease the clock by 10% and get a stable GPU? Or is it more complex, perhaps so much that trial and error is the only way to know? In any case, it'd be useful to have some rule of thumb to go by, even if it isn't completely accurate.

Currently, my mining rig is placed in a shed, where I would like the temperature to never go below 10C. Because of this, I'd like to be able to change voltage/clock dynamically, in order to control the power that my mining rig dissipates, thus acting as a sort of radiator with a thermostat. So it would be useful to be able to derive stable voltage/clock pairs from a known stable voltage/clock pair.
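Not an answer from this thread, but the usual back-of-the-envelope rule is standard CMOS scaling: dynamic power goes roughly as P ∝ f·V², and the maximum stable clock tends to fall roughly linearly (often more slowly) with core voltage, so a 10% voltage cut usually costs at most around 10% of clock while saving noticeably more than 10% of power. It's only a rule of thumb; per-card variation means trial and error is still needed. A small sketch:

Code:
# Rule-of-thumb CMOS scaling only; silicon lottery means this is a starting
# point for trial and error, not a guarantee of stability.
def scaled_power(p0, f0, v0, f1, v1):
    """Estimate dynamic power at (f1, v1) from a known point (p0 @ f0, v0),
    assuming P is roughly proportional to f * V^2."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Hypothetical example echoing the AX6770 numbers upthread:
# 100 W at 850 MHz / 1.20 V, dropped to 700 MHz / 0.95 V.
print(f"{scaled_power(100.0, 850, 1.20, 700, 0.95):.0f} W")   # roughly 52 W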
hero member
Activity: 896
Merit: 1000
Seal Cub Clubbing Club
I pay .0202 cents a watt

Damn, that's pretty good. I have one of those variable plans: it toggles between $0.05815/kWh during on-peak hours and $0.04273/kWh during off-peak hours. Of course, I also have to pay a flat "demand" charge of something like $10 per kW I use. My power company likes to stack on little rinky-dink charges that add up :(
legendary
Activity: 1190
Merit: 1000
Even with 3x 5970s at 880 MHz (2400 MHash/s) drawing around 900 watts, it's only 18 dollars a month to run them. The current payout is $343 a month, which leaves power at 5.2% of revenue. Outside of keeping temps down, it seems to make little sense to underclock. I realize power costs are higher in some places, but if power were that expensive, mining would be a bad idea to start with; maybe FPGAs would make more sense.

You pay $0.0202 per kilowatt-hour. At home I pay ~$0.32 per kilowatt-hour, so my electricity costs are about 16 times yours. At your power consumption, that would put my power cost at ~80% of the payout. Undervolting can cut that from ~80% to ~40%, which makes it worth doing.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
I pay .0202 cents a watt
You don't pay "per watt"; you pay per watt-hour, or to be precise, per kilowatt-hour ($0.0202 for 1,000 watt-hours). Further, there are likely taxes and generation fees on top of that. That is a good rate, however.
sr. member
Activity: 392
Merit: 250
Even with 3x 5970s at 880 MHz (2400 MHash/s) drawing around 900 watts, it's only 18 dollars a month to run them. The current payout is $343 a month, which leaves power at 5.2% of revenue. Outside of keeping temps down, it seems to make little sense to underclock. I realize power costs are higher in some places, but if power were that expensive, mining would be a bad idea to start with; maybe FPGAs would make more sense.

0.9 kW × 24 × 30 = 648 kWh.

$18 / 648 kWh ≈ $0.028/kWh... huh?

You pay 3 cents a kilowatt-hour?

A more reasonable cost is 10 cents a kilowatt-hour, which works out to around $60 a month in electricity. I think undervolted GPU mining will keep many systems up a while longer, but with FPGAs doing 400+ MH/s for 15 watts, the days of the GPU are coming to an end.
I pay .0202 cents a watt
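The arithmetic above generalizes easily. Here's a small sketch using the 900 W rig and $343/month payout quoted in this exchange, showing the monthly bill and power's share of the payout at a few tariffs (the tariffs beyond the ones mentioned here are just illustrative):

Code:
# Monthly electricity cost and power's share of the payout for a rig
# drawing a constant 0.9 kW, using the $343/month payout quoted above.
RIG_KW = 0.9
PAYOUT_PER_MONTH = 343.0
KWH_PER_MONTH = RIG_KW * 24 * 30        # 648 kWh

for rate in (0.0202, 0.028, 0.10, 0.32):   # $/kWh
    cost = KWH_PER_MONTH * rate
    share = cost / PAYOUT_PER_MONTH * 100
    print(f"${rate:.4f}/kWh -> ${cost:6.2f}/month, {share:4.1f}% of payout")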
sr. member
Activity: 420
Merit: 250
Even with 3x 5970s at 880 MHz (2400 MHash/s) drawing around 900 watts, it's only 18 dollars a month to run them. The current payout is $343 a month, which leaves power at 5.2% of revenue. Outside of keeping temps down, it seems to make little sense to underclock. I realize power costs are higher in some places, but if power were that expensive, mining would be a bad idea to start with; maybe FPGAs would make more sense.

0.9 kW × 24 × 30 = 648 kWh.

$18 / 648 kWh ≈ $0.028/kWh... huh?

You pay 3 cents a kilowatt-hour?

A more reasonable cost is 10 cents a kilowatt-hour, which works out to around $60 a month in electricity. I think undervolted GPU mining will keep many systems up a while longer, but with FPGAs doing 400+ MH/s for 15 watts, the days of the GPU are coming to an end.
sr. member
Activity: 392
Merit: 250
Even with 3x 5970s at 880 MHz (2400 MHash/s) drawing around 900 watts, it's only 18 dollars a month to run them. The current payout is $343 a month, which leaves power at 5.2% of revenue. Outside of keeping temps down, it seems to make little sense to underclock. I realize power costs are higher in some places, but if power were that expensive, mining would be a bad idea to start with; maybe FPGAs would make more sense.
member
Activity: 114
Merit: 10
Hello all

I just recently found out how dramatic an increase in efficiency undervolting can achieve. I'd like to optimize this in my mining rig and preferably get it down to around 250 W in total; right now it's consuming around 265 W. These are my current voltage/clock settings:

5870: 750 MHz / 1.000V / 97W / 343 MH/s / 3.54 MH/J
5770: 750 MHz / 1.010V / 63W / 168 MH/s / 2.67 MH/J

As you can see, the 5870 is significantly more efficient than the 5770. Then again, it's also running at 1 V vs. the 5770's 1.01 V. But I simply can't get the 5770 to run properly at 1 V at 750 MHz; at that voltage it keeps stalling in cgminer ("declared SICK") unless I run it at 700 MHz.

What are people's experiences with 5870s and 5770s and stable voltage/clock combinations for these cards?

By the way, the 5870 is an HD-587X-ZNFV V1.3 5870, and the 5770 is this Sapphire card, though I'm not sure if it has 512 or 1024MB RAM.

Hiho!

I have 2x 5830 and 4x 5770 cards in two rigs. Every card is overclocked and undervolted. My settings are:

"Sapphire HD5830"  core=900 memory=300 vddc=1.140
"XFX 5770"            core=880 memory=600 vddc=1.200
"Sapphire HD5770"  core=905 memory=300 vddc=0.960

"Sapphire HD5830"  core=910 memory=300 vddc=1.080
"ASUS HD5770"      core=925 memory=300 vddc=1.050
"Sapphire HD5770"  core=880 memory=300 vddc=1.010

As you can see, there are many different settings across the 5770 cards. The XFX, for example, needs a 600 MHz memory clock; there's no way to go lower. The Sapphire runs at 905 MHz with only 0.960 vddc. I think you have to test a lot until it's stable. Both rigs need 650 watts, BUT that's with 3.5" HDDs and the 100% bug, so I think I can bring it down to 550 watts @ 1.5 GHash/s. I'm looking forward to my new XFX 5970 Black Edition at 2x 920 MHz @ 1.200 vddc ;-) (tested, but not built into a rig yet). There's a sketch below of how settings like these map onto a cgminer command line.

Btw: the great thing about my 5770s is that they're all used and I got them for around $60 each back in September 2010! They're still a good catch, I think.

Greetz
NetworkerZ
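For reference, per-card settings like the ones above can be passed on the cgminer command line as comma-separated lists (one value per GPU, in device order), assuming a cgminer build with ADL overclocking support. The sketch below just builds such a command from the first rig's settings; the pool URL and worker credentials are placeholders, and the exact option names should be checked against your cgminer version.

Code:
# Build a cgminer invocation from per-card settings (comma-separated,
# one value per GPU in device order). Assumes a cgminer build with ADL
# support and its --gpu-engine / --gpu-memclock / --gpu-vddc options;
# pool URL and worker credentials are placeholders.
cards = [
    {"name": "Sapphire HD5830", "engine": 900, "mem": 300, "vddc": 1.140},
    {"name": "XFX 5770",        "engine": 880, "mem": 600, "vddc": 1.200},
    {"name": "Sapphire HD5770", "engine": 905, "mem": 300, "vddc": 0.960},
]

cmd = [
    "cgminer",
    "-o", "http://pool.example.com:8332", "-u", "worker", "-p", "x",
    "--gpu-engine",   ",".join(str(c["engine"]) for c in cards),
    "--gpu-memclock", ",".join(str(c["mem"])    for c in cards),
    "--gpu-vddc",     ",".join(str(c["vddc"])   for c in cards),
]
print(" ".join(cmd))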
legendary
Activity: 1190
Merit: 1000
Unless the idle draw of the cards represents the same amount of logical work done, you can't just subtract them out to get the "mining" power consumed.
If one card idles hot doing tons of work that is not mining related, subtracting that from the final mining number misrepresents how much power that card actually draws in order to mine. Especially if it stops doing that work while mining. Conversely, if one card idles cold it won't have any idle wattage to subtract making its mining numbers seem like they draw more power.

The only numbers that can be compared are 100% mining vs 100% mining. Unless someone wants to map out exactly how much logical work each card does while idle and pro-rate their wattage for the extra work done...

An idle card does no work, i.e. 0 MHash/s.

Look at it this way.

System at load:  300W
System at idle (including GPU idle wattage): 100W
GPU idle wattage: 10W

The reason we want to subtract the GPU idle wattage is to get the true GPU load wattage.

100W - 10W = 90W (system idle without the GPU).

300W - 90W = 210W (GPU full wattage at load).

Now we have an apples-to-apples comparison: the GPU wattage at load.


We can also predict other system values.
system w/ 1 GPU  = 90W + 1*210W = 300W
system w/ 2 GPUs = 90W + 2*210W = 510W
system w/ 3 GPUs = 90W + 3*210W = 720W
system w/ 4 GPUs = 90W + 4*210W = 930W
system w/ 5 GPUs = 90W + 5*210W = 1140W
system w/ 6 GPUs = 90W + 6*210W = 1350W

Idle cards do no hashing-related work, but they still consume watts; therefore they do work. Explain exactly which instructions are executed (monitoring, answering driver polls, DMA channels, etc.). If that work happens only on an idle card and not on a mining card, then we need to account for it. Once that is accounted for, we can compare to see whether one card is providing, say, "full service with frills" to the OS while idle, and the other card is "self service". That condition can skew the measurements, especially with a card designed to go very cold while idle vs. one that runs hot while executing idle cycles.

Apples to oranges.

Edit: A concrete example follows.

The AMD Phenom II X6 1100T (3.3GHz) consumes 20w while idle, and 109w under full load.
The Intel Core i7 2600K @ 4.4GHz consumes 5w while idle, and 111w under full load.

While under full load (in theory), the CPUs are not executing any idle cycles.
Also, voltage may be stepped down while idle and parts of the chip shut off, further skewing the comparison.
Subtracting 20 from 109 and 5 from 111 will not give you anything useful.
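To make the skew concrete with those CPU numbers (just the arithmetic, no claim about which delta is the "right" one):

Code:
# Idle/load draws quoted above.
cpus = {
    "Phenom II X6 1100T":      {"idle": 20, "load": 109},
    "Core i7 2600K @ 4.4 GHz": {"idle": 5,  "load": 111},
}

# Total draw under full load is nearly identical (109 W vs 111 W), but
# "load minus idle" differs a lot (89 W vs 106 W) -- that gap is entirely
# an artifact of how much each chip powers down when idle.
for name, w in cpus.items():
    print(f"{name}: load {w['load']} W, load - idle = {w['load'] - w['idle']} W")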
donator
Activity: 1218
Merit: 1079
Gerald Davis
Unless the idle draw of the cards represents the same amount of logical work done, you can't just subtract them out to get the "mining" power consumed.
If one card idles hot doing tons of work that is not mining related, subtracting that from the final mining number misrepresents how much power that card actually draws in order to mine. Especially if it stops doing that work while mining. Conversely, if one card idles cold it won't have any idle wattage to subtract making its mining numbers seem like they draw more power.

The only numbers that can be compared are 100% mining vs 100% mining. Unless someone wants to map out exactly how much logical work each card does while idle and pro-rate their wattage for the extra work done...

An idle card does no work, i.e. 0 MHash/s.

Look at it this way.

System at load:  300W
System at idle (including GPU idle wattage): 100W
GPU idle wattage: 10W

The reason we want to subtract the GPU idle wattage is to get the true GPU load wattage.

100W - 10W = 90W (system idle without the GPU).

300W - 90W = 210W (GPU full wattage at load).

Now we have an apples-to-apples comparison: the GPU wattage at load.


We can also predict other system values.
system w/ 1 GPU  = 90W + 1*210W = 300W
system w/ 2 GPUs = 90W + 2*210W = 510W
system w/ 3 GPUs = 90W + 3*210W = 720W
system w/ 4 GPUs = 90W + 4*210W = 930W
system w/ 5 GPUs = 90W + 5*210W = 1140W
system w/ 6 GPUs = 90W + 6*210W = 1350W
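The same bookkeeping as a tiny script, using the example wattages from this post (the GPU idle figure has to come from a separate measurement, e.g. the headless or multi-boot methods discussed elsewhere in the thread):

Code:
# Example wattages from the post above.
system_at_load = 300   # W: whole system with the GPU mining flat out
system_at_idle = 100   # W: whole system idle (includes the GPU's idle draw)
gpu_idle       = 10    # W: the GPU's own idle draw, measured separately

base_idle = system_at_idle - gpu_idle    # 90 W: system idle without the GPU
gpu_load  = system_at_load - base_idle   # 210 W: GPU draw while mining

# Predicted whole-system draw for n identical GPUs all mining.
for n in range(1, 7):
    print(f"system w/ {n} GPU(s) = {base_idle + n * gpu_load} W")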
legendary
Activity: 1190
Merit: 1000
Unless the idle draw of the cards represents the same amount of logical work done, you can't just subtract them out to get the "mining" power consumed.
If one card idles hot doing tons of work that is not mining related, subtracting that from the final mining number misrepresents how much power that card actually draws in order to mine. Especially if it stops doing that work while mining. Conversely, if one card idles cold it won't have any idle wattage to subtract making its mining numbers seem like they draw more power.

The only numbers that can be compared are 100% mining vs 100% mining. Unless someone wants to map out exactly how much logical work each card does while idle and pro-rate their wattage for the extra work done...
vip
Activity: 1358
Merit: 1000
AKA: gigavps
newbie
Activity: 58
Merit: 0
Anyone try this with 5830's yet?
hero member
Activity: 896
Merit: 1000
Seal Cub Clubbing Club
Ohhh yeah I'm an idiot.com/index.php?herp=derp.  I totally misunderstood the definition of "headless"
sr. member
Activity: 406
Merit: 257
I thought about that, and I guess the only way to figure out how much power the card draws at idle is to have two identical cards.

You wouldn't need two identical cards if you can boot headless.  Boot and measure idle w/ no cards installed and with one card installed.

Also, a logic puzzle: figure out the idle wattage of 2 different cards using no other cards and no headless boot (it can be done, but it involves multiple boots). :)



Quote
Of course, this doesn't apply to the 79XX series, since it has the ability to hibernate GPUs that aren't currently active. So the MH/J figures for a multi-card 79XX setup are going to appear unusually low, simply because the system idle draw of a single-7970 setup is very close to that of a quad-7970 setup, as I understand it.

The hibernating 7970 uses much less idle power, but it isn't 0; I think AMD's claim is <3 W. With 3x 7970s one could get the exact idle and hibernating wattage.
Your 2 unknown card idle wattages are A and B; your system idle is X.
Measure: (X + A), (X + B), (X + A + B)
(X + A) + (X + B) - (X + A + B) = X
(X + A + B) - (X + A) = B
(X + A + B) - (X + B) = A
donator
Activity: 1218
Merit: 1079
Gerald Davis
I'm still trying to figure out the answer to the logic puzzle. :-[

Really.

System A = system w/ only card 1
System B = system w/ only card 2
System C = system w/ card 1 & 2

Boot the system 3 times in the configurations above and record the total wattage (Wattage A, B, C).
Calculate the differences:
Wattage C - Wattage A
Wattage C - Wattage B

Wattage A = system idle + card1 idle
Wattage B = system idle + card2 idle
Wattage C = system idle + card1 idle + card2 idle

Thus
(Wattage C - Wattage A) = system idle + card1 idle + card2 idle - ( system idle + card1 idle )
(Wattage C - Wattage A) = system idle + card1 idle + card2 idle - system idle - card1 idle
(Wattage C - Wattage A) =  card2 idle


(Wattage C - Wattage B) = system idle + card1 idle + card2 idle - ( system idle + card2 idle )
(Wattage C - Wattage B) = system idle + card1 idle + card2 idle - system idle - card2 idle
(Wattage C - Wattage B) =  card1 idle

An example (and verification)

Unknown actual values
System Idle = 100W
Card 1 Idle = 15W
Card 2 Idle = 20W

Measured Results
Wattage A = 115W
Wattage B = 120W
Wattage C = 135W

Calculated Results
(Wattage C - Wattage A) = 135W - 115W = 20W  (graphics card 2)
(Wattage C - Wattage B) = 135W - 120W = 15W  (graphics card 1)
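Both versions of the puzzle boil down to the same subtractions. As a quick check against the worked example above (and, as a bonus, recovering the bare system idle from the three measurements alone):

Code:
# Measured whole-system wattages from the worked example above.
wattage_a = 115   # system + card 1 only
wattage_b = 120   # system + card 2 only
wattage_c = 135   # system + both cards

card2_idle  = wattage_c - wattage_a               # 20 W
card1_idle  = wattage_c - wattage_b               # 15 W
system_idle = wattage_a + wattage_b - wattage_c   # 100 W

print(card1_idle, card2_idle, system_idle)        # 15 20 100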