
Topic: New command-line tool for overclocking ATI cards (Linux)

hero member
Activity: 721
Merit: 523
Not sure where this leaves us, but I did manage to get my card running at 1.000 V under Linux, down from 1.188 V (I think I can go a tiny bit lower).

Stats
PowerColor HD 7950, same settings and machine etc., mining Litecoin (the old P4 motherboard uses 120-130 watts on its own)

Before undervolt:

533.6 kH/s using ~388 W at 72 degrees C

After undervolt:

555.5 kH/s using ~330 W at 61 degrees C

The undervolt lets me raise PowerTune to 10 (without the temperature racing up over 90) and bump the intensity, and I was getting 580 kH/s using ~335 W at ~70 degrees C.

Ignoring my motherboard power problem, this gets nice and close to:

Quote
If you are paying for electricity you can get 560 kH/s for 170W (925/1250/.962V) per card plus 55W overhead for the computer via undervolting the core. - Source linky link

Considering that I haven't tried 0.962 V yet.



Now for the part some people won't want to hear: I hex-edited the BIOS.  Tongue

If there's enough interest, perhaps I'll make a thread with huge disclaimers everywhere.  HERE


Update: Got two 7950s doing ~585 kH/s each; the whole rig is pulling 460 W at the wall (same inefficient motherboard), with the voltage at 0.950 V.
newbie
Activity: 47
Merit: 0
There have to be support forums for Afterburner or Trixx where you might be able to get feedback on this.
I wish I could help more, but I'm not sure which calls are made to which DLLs to get accurate voltage information without using the native driver calls (which appear to be incorrect).

The BEST way to resolve this would be to prod the ATI driver devs and have them fix the driver to report correct information.

If there's anything I can try to do to help, let me know; I'm also invested in getting this working better on Linux...
hero member
Activity: 721
Merit: 523

Hmm, interesting. I'm really hoping this will work on Linux; I'm not switching to Windows if I can help it.
Have a 7950, and here are my observations:
--When I set clocks and voltage for performance level 0/1, it appears to work. atitweak seems to be setting the clocks and voltage correctly!
--When I start mining with reaper, the clock settings show up correctly, but voltage does not.
... here's where I get confused.  atitweak -s gives:
---x---
1. AMD Radeon HD 7900 Series  (:0.1)
    engine clock 950MHz, memory clock 1400MHz, core voltage 1.25VDC, performance level 3, utilization 99%
---x---

"performance level 3" ?  I can't set anything for performance level 3.. could that be why the voltage settings aren't being applied?  (If I try to specify -P 3, it doesn't affect anything, and there is no output.. )?

Also, this topic is stale.. anyone know how to contact the developer to see if this might be something easy?  I'd be happy to test!

Edit:  I believe Trixx and MSI Afterburner (both windows) will allow voltage changes... I just happened to notice the voltage change on performance levels 0 and 1, but then when I mine, -s shows level 3...
Edit2:  Shows 4x different clock states here:  http://www.techpowerup.com/vgabios/117750/Gigabyte.HD7970.3072.120308.html
I have hope that this is something fixable...  Smiley


I had a play with this tonight; it seems I'm not the only one: http://devgurus.amd.com/thread/158840

I was able to change the atitweak code to set the voltage for performance level 3 on my PowerColor 7950, but it didn't actually change the voltage; no error, though. I was watching temps as well and nothing changed, so even if it were reporting the wrong voltage I would still see the temps drop, as they do under Windows. I also checked with a wall-plug power monitor.

Digging around in the documentation, it seems there's an ADL_Overdrive6. (I was wrong; the card chooses ADL_Overdrive5.)

(On Microsoft Windows:) Does anyone know of a program like Process Monitor that will actually tell me which calls an .exe makes into a DLL?

If I can get something like that, I can see how PowerUp Tuner works.


I am determined to undervolt my card under Linux, so I will exhaust every possible software way to do it. I have already done extensive research on how to hex-edit my BIOS as a backup; that would be a last resort, though.
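
For anyone who wants to experiment without patching atitweak itself, here's a minimal Python sketch of the Overdrive5 path involved. It assumes the adl3 bindings expose the ADL SDK entry points and structures under their SDK names (ADL_Overdrive5_ODParameters_Get, ADL_Overdrive5_ODPerformanceLevels_Get/Set, ADLODParameters, ADLODPerformanceLevel(s)); the adapter index and the 1000 mV target are placeholders, and on 7xxx the driver may well accept the call and silently ignore iVddc, which is exactly the problem above.

Code:
# Minimal sketch (not a drop-in patch): read the Overdrive5 performance levels
# for one adapter and write a new core voltage into the top level.
from ctypes import sizeof, byref, cast, POINTER, create_string_buffer
from adl3 import *   # ADL_Main_Control_Create, ADL_Overdrive5_*, ADLOD* structures

ADAPTER = 0          # adapter index as shown by "atitweak -l" (placeholder)
NEW_VDDC_MV = 1000   # target core voltage in millivolts (placeholder)

if ADL_Main_Control_Create(ADL_Main_Memory_Alloc, 1) != ADL_OK:
    raise RuntimeError("ADL_Main_Control_Create failed")
try:
    # How many performance levels does the driver expose for this adapter?
    od_params = ADLODParameters()
    od_params.iSize = sizeof(od_params)
    if ADL_Overdrive5_ODParameters_Get(ADAPTER, byref(od_params)) != ADL_OK:
        raise RuntimeError("ODParameters_Get failed")
    nlevels = od_params.iNumberOfPerformanceLevels

    # ADLODPerformanceLevels ends in a flexible array, so size a buffer for
    # nlevels entries and overlay the structure on it.
    buf_size = (sizeof(ADLODPerformanceLevels)
                + sizeof(ADLODPerformanceLevel) * (nlevels - 1))
    buf = create_string_buffer(buf_size)
    levels = cast(buf, POINTER(ADLODPerformanceLevels)).contents
    levels.iSize = buf_size
    # second argument: 0 = current values, 1 = factory defaults
    if ADL_Overdrive5_ODPerformanceLevels_Get(ADAPTER, 0, byref(levels)) != ADL_OK:
        raise RuntimeError("ODPerformanceLevels_Get failed")

    # The top level is the one the card sits in while mining.
    entries = cast(levels.aLevels, POINTER(ADLODPerformanceLevel))
    print("level %d Vddc before: %d mV" % (nlevels - 1, entries[nlevels - 1].iVddc))
    entries[nlevels - 1].iVddc = NEW_VDDC_MV
    if ADL_Overdrive5_ODPerformanceLevels_Set(ADAPTER, byref(levels)) != ADL_OK:
        raise RuntimeError("ODPerformanceLevels_Set failed")
finally:
    ADL_Main_Control_Destroy()

Clocks should come back in units of 10 kHz and Vddc in millivolts, so sanity-check what Get returns before writing anything back, and trust the wall-plug meter rather than the reported voltage.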
newbie
Activity: 15
Merit: 0
Doesn't seem to have any effect on the Radeon 6990 -- the temperature is still the same (around 83 degrees C). Do I have to flip the card's BIOS switch to the overclock position for this to work?
newbie
Activity: 58
Merit: 0
Hi, adl3 developer here. I'm not able to spend any time developing the Python bindings any more (had a kid back in January -- ZERO free time). I'd be happy to turn over the PyPI keys to anyone who's able to maintain adl3.

I'll update the GitHub README this week to let downstream users know. I see a bunch of folks have forked the repo, so I'll put out some feelers -- maybe someone has already solved this problem and hasn't bothered to submit a pull request, or would like to take over maintenance.

cheers
.m.
sr. member
Activity: 280
Merit: 260
Hi, does anybody have any idea what to do?

[test@localhost adl3]$ aticonfig --lsa
* 0. 01:00.0 AMD Radeon HD 7900 Series
  1. 02:00.0 AMD Radeon HD 7800 Series

* - Default adapter
[test@localhost adl3]$ aticonfig --adapter=0 --od-enable
AMD Overdrive(TM) enabled
[test@localhost adl3]$ aticonfig --adapter=1 --od-enable
AMD Overdrive(TM) enabled
[test@localhost adl3]$ aticonfig --adapter=0 --odgt

Adapter 0 - AMD Radeon HD 7900 Series
            Sensor 0: Temperature - 42.00 C
[test@localhost adl3]$ aticonfig --adapter=1 --odgt
ERROR - Get temperature failed for Adapter 1 - AMD Radeon HD 7800 Series
[test@localhost adl3]$ ./atitweak -l
ADL_Adapter_Active_Get failed.

and I get the same error in Xorg.0.log as "bogesman" above. Sad
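
One way to narrow down where ADL_Adapter_Active_Get falls over is to enumerate the adapters yourself instead of going through atitweak. A small diagnostic sketch, assuming the adl3 bindings expose ADL_Adapter_NumberOfAdapters_Get, ADL_Adapter_AdapterInfo_Get, ADL_Adapter_Active_Get and the AdapterInfo structure under their ADL SDK names:

Code:
# Diagnostic sketch: print every adapter index ADL reports and whether the
# driver says it is active. Compare the bus numbers with "aticonfig --lsa".
from ctypes import sizeof, byref, cast, POINTER, c_int
from adl3 import *

if ADL_Main_Control_Create(ADL_Main_Memory_Alloc, 1) != ADL_OK:
    raise RuntimeError("ADL_Main_Control_Create failed")
try:
    num = c_int(-1)
    if ADL_Adapter_NumberOfAdapters_Get(byref(num)) != ADL_OK:
        raise RuntimeError("NumberOfAdapters_Get failed")
    print("ADL reports %d logical adapters" % num.value)

    # One AdapterInfo record per logical adapter (several per physical card).
    infos = (AdapterInfo * num.value)()
    if ADL_Adapter_AdapterInfo_Get(cast(infos, POINTER(AdapterInfo)),
                                   sizeof(infos)) != ADL_OK:
        raise RuntimeError("AdapterInfo_Get failed")

    for info in infos:
        status = c_int(-1)
        rc = ADL_Adapter_Active_Get(info.iAdapterIndex, byref(status))
        print("adapter %2d  bus %2d  active=%s  (rc=%d)"
              % (info.iAdapterIndex, info.iBusNumber,
                 status.value if rc == ADL_OK else "?", rc))
finally:
    ADL_Main_Control_Destroy()

If the second card never shows up as active here either, that points at the driver/X setup (each card usually needs a screen in xorg.conf and a running X session with DISPLAY exported) rather than at atitweak itself.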
newbie
Activity: 26
Merit: 0
Hi, I have BAMT with 3x5850 + 3x7950, and I have a strange issue:
With 5 cards plugged in, atitweak works fine, but when I plug in the 6th card, atitweak doesn't reply at all; it just freezes. All my mining software uses atitweak, so I can't run with 6 cards... Any ideas? The aticonfig command works fine.
newbie
Activity: 47
Merit: 0
Hi,

When I use atitweak -l, it only lists performance levels 0 and 1. Is there no performance level 2?

While mining, I can use atitweak to change the engine and memory clocks, but I have to use performance level 1 for anything to happen. The same goes for changing the voltage, but the voltage never actually changes.

In cgminer I can also change the engine clock and the memory clock, and it says "Driver reports success...", but the voltage never changes.

It is the same experience with both cards:
MSI 7970 Lightning & Sapphire Dual-X OC

Any idea what I can do to change the voltage?

Best Regards
Robert

I think this is the usual ADL issue with Linux. You can't change the voltage on the 6xxx or 7xxx series, and the memory clock has a limit on how far it can differ from the engine clock (engine minus 100 on the 6870, minus 125 on the 69xx, minus 150 on the 7970).

Hmm, interesting. I'm really hoping this will work on Linux; I'm not switching to Windows if I can help it.
Have a 7950, and here are my observations:
--When I set clocks and voltage for performance level 0/1, it appears to work. atitweak seems to be setting the clocks and voltage correctly!
--When I start mining with reaper, the clock settings show up correctly, but voltage does not.
... here's where I get confused.  atitweak -s gives:
---x---
1. AMD Radeon HD 7900 Series  (:0.1)
    engine clock 950MHz, memory clock 1400MHz, core voltage 1.25VDC, performance level 3, utilization 99%
---x---

"performance level 3" ?  I can't set anything for performance level 3.. could that be why the voltage settings aren't being applied?  (If I try to specify -P 3, it doesn't affect anything, and there is no output.. )?

Also, this topic is stale.. anyone know how to contact the developer to see if this might be something easy?  I'd be happy to test!

Edit:  I believe Trixx and MSI Afterburner (both windows) will allow voltage changes... I just happened to notice the voltage change on performance levels 0 and 1, but then when I mine, -s shows level 3...
Edit2:  Shows 4x different clock states here:  http://www.techpowerup.com/vgabios/117750/Gigabyte.HD7970.3072.120308.html
I have hope that this is something fixable...  Smiley
donator
Activity: 798
Merit: 500
Hi,

When I use atitweak -l, it only lists performance levels 0 and 1. Is there no performance level 2?

While mining, I can use atitweak to change the engine and memory clocks, but I have to use performance level 1 for anything to happen. The same goes for changing the voltage, but the voltage never actually changes.

In cgminer I can also change the engine clock and the memory clock, and it says "Driver reports success...", but the voltage never changes.

It is the same experience with both cards:
MSI 7970 Lightning & Sapphire Dual-X OC

Any idea what I can do to change the voltage?

Best Regards
Robert

I think this is the usual ADL issue with Linux. You can't change the voltage on the 6xxx or 7xxx series, and the memory clock has a limit on how far it can differ from the engine clock (engine minus 100 on the 6870, minus 125 on the 69xx, minus 150 on the 7970).
newbie
Activity: 12
Merit: 0
Hi,

When I use atitweak -l, it only lists performance levels 0 and 1. Is there no performance level 2?

While mining, I can use atitweak to change the engine and memory clocks, but I have to use performance level 1 for anything to happen. The same goes for changing the voltage, but the voltage never actually changes.

In cgminer I can also change the engine clock and the memory clock, and it says "Driver reports success...", but the voltage never changes.

It is the same experience with both cards:
MSI 7970 Lightning & Sapphire Dual-X OC

Any idea what I can do to change the voltage?

Best Regards
Robert
legendary
Activity: 1106
Merit: 1006
Lead Blockchain Developer
Is anybody else having issues with the latest driver, 12.6?

driver info
Quote
[9.574] (II) fglrx(0):     Name: fglrx
[9.574] (II) fglrx(0):     Version: 8.98.2
[9.574] (II) fglrx(0):     Date: Jun 11 2012
[9.574] (II) fglrx(0):     Desc: AMD FireGL DRM kernel module

This is the error that I receive

Quote
Traceback (most recent call last):
  File "/usr/local/bin/atitweak", line 24, in
    from adl3 import *
  File "/usr/local/lib/python2.7/dist-packages/adl3/__init__.py", line 1, in
    from .adl_api import *
  File "/usr/local/lib/python2.7/dist-packages/adl3/adl_api.py", line 40, in
    _libadl = CDLL("libatiadlxx.so", mode=RTLD_GLOBAL)
  File "/usr/lib/python2.7/ctypes/__init__.py", line 353, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /usr/lib/fglrx/libatiadlxx.so: undefined symbol: APL_Initialize


I also have a lot of lines like this in Xorg.0.log:
Quote
(WW) fglrx(0): ADL handler failure: PowerPlay library not initialized
(WW) fglrx(0): ADL handler failure: PowerPlay library not initialized


+1, ditto here. 12.6 driver, Ubuntu 10.04.4 LTS, Python 2.6.5.
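
For what it's worth, that traceback is dlopen choking on libatiadlxx.so itself: the library references APL_Initialize, which is supposed to come from elsewhere in the fglrx install, so a mismatched or half-upgraded driver is the usual suspect. A quick check, independent of adl3 (the second path below is just the usual Catalyst location on Ubuntu and may differ on your box):

Code:
# Try to load the ADL library directly and report what the loader says.
from ctypes import CDLL, RTLD_GLOBAL

candidates = [
    "libatiadlxx.so",                 # let the dynamic loader search its default paths
    "/usr/lib/fglrx/libatiadlxx.so",  # common Catalyst location (adjust for your install)
]

for path in candidates:
    try:
        lib = CDLL(path, mode=RTLD_GLOBAL)
    except OSError as err:
        print("%-35s FAILED: %s" % (path, err))   # the message names the unresolved symbol
        continue
    # ADL_Main_Control_Create is the first entry point adl3 resolves.
    found = hasattr(lib, "ADL_Main_Control_Create")
    print("%-35s loaded, ADL_Main_Control_Create %s" % (path, "found" if found else "missing"))

If both paths fail the same way, cleanly reinstalling the 12.6 driver (or dropping back to a version that worked) is probably the realistic fix; the PowerPlay warnings in Xorg.0.log point the same way.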
newbie
Activity: 10
Merit: 0
Is anybody else having issues with the latest driver, 12.6?

driver info
Quote
[9.574] (II) fglrx(0):     Name: fglrx
[9.574] (II) fglrx(0):     Version: 8.98.2
[9.574] (II) fglrx(0):     Date: Jun 11 2012
[9.574] (II) fglrx(0):     Desc: AMD FireGL DRM kernel module

This is the error that I receive

Quote
Traceback (most recent call last):
  File "/usr/local/bin/atitweak", line 24, in
    from adl3 import *
  File "/usr/local/lib/python2.7/dist-packages/adl3/__init__.py", line 1, in
    from .adl_api import *
  File "/usr/local/lib/python2.7/dist-packages/adl3/adl_api.py", line 40, in
    _libadl = CDLL("libatiadlxx.so", mode=RTLD_GLOBAL)
  File "/usr/lib/python2.7/ctypes/__init__.py", line 353, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /usr/lib/fglrx/libatiadlxx.so: undefined symbol: APL_Initialize


I also have a lot of lines like this in Xorg.0.log:
Quote
(WW) fglrx(0): ADL handler failure: PowerPlay library not initialized
(WW) fglrx(0): ADL handler failure: PowerPlay library not initialized
omo
full member
Activity: 147
Merit: 100
I tried this with a 7970 under Linux and it works!
newbie
Activity: 10
Merit: 0
Has anyone tested this with a 7970 or 7950?
Will it work? As far as I know, ATI has locked some things down in the drivers, and RBE does not support the 7xxx series.
sr. member
Activity: 274
Merit: 250
1. Forgive me, I'm a Linux noob.
2. Why is something as important to my GPU as seeing the VRM temperature easy on Windows but not on Linux, where it matters just as much? :>
3. If you don't understand that, I have to admit I'm an English noob too Tongue

EDIT
I still have in mind a nice Windows tool like SetFSB: you could choose which chip you wanted to talk to, and if you picked the wrong one you just rebooted. I don't think there is an unlimited pool of I2C chips; we just need to choose the right one.
newbie
Activity: 58
Merit: 0
I think it would take a lot of work to retrieve values over I2C. Moreover, different cards have different I2C chips, so it's not possible to support every card.

A similar discussion is on the forum, too.

https://bitcointalksearch.org/topic/radeonvolt-hd5850-reference-voltage-tweaking-and-vrm-temp-display-for-linux-10228

I think mjmvisser had tried to read I2C but failed; that's why he commented out the I2C-related code, right?

I spent about a day on it, then gave up. I hate hardware hacking. :-)
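
In case anyone wants to pick the I2C work back up: the ADL side of it goes through ADL_Display_WriteAndReadI2C with an ADLI2C block. Here's a rough sketch of a single register read; the ADLI2C layout and the prototype follow my reading of the ADL SDK headers, the action constant is assumed, and the bus line, device address and register offset in the example are pure placeholders, not known-good VRM values. Poking the wrong device over I2C can hang or damage a card, so treat this as a starting point only.

Code:
# Rough sketch only: one I2C register read through ADL. Struct layout and
# prototype per the ADL SDK headers (assumed); constants and addresses are
# placeholders. adl3 is used just to initialise ADL.
from ctypes import (Structure, c_int, c_char_p, POINTER, CDLL, RTLD_GLOBAL,
                    sizeof, byref, cast, create_string_buffer)
from adl3 import ADL_Main_Control_Create, ADL_Main_Memory_Alloc, ADL_OK

class ADLI2C(Structure):
    _fields_ = [("iSize", c_int),
                ("iLine", c_int),      # which I2C bus on the card
                ("iAddress", c_int),   # device address (check the SDK: 7-bit vs shifted)
                ("iOffset", c_int),    # register offset within the device
                ("iAction", c_int),    # read / write / repeated-start read
                ("iSpeed", c_int),     # bus speed in kHz
                ("iDataSize", c_int),
                ("pcData", c_char_p)]

ADL_DL_I2C_ACTIONREAD = 0x00000001     # assumed value; verify against adl_defines.h

# adl3 doesn't wrap the I2C call, so bind it directly against the same library.
_adl = CDLL("libatiadlxx.so", mode=RTLD_GLOBAL)
_adl.ADL_Display_WriteAndReadI2C.argtypes = [c_int, POINTER(ADLI2C)]
_adl.ADL_Display_WriteAndReadI2C.restype = c_int

def read_i2c_byte(adapter, line, address, offset):
    """Read one byte from (line, address, offset) on the given adapter."""
    data = create_string_buffer(1)
    req = ADLI2C(iSize=sizeof(ADLI2C), iLine=line, iAddress=address,
                 iOffset=offset, iAction=ADL_DL_I2C_ACTIONREAD,
                 iSpeed=100, iDataSize=1, pcData=cast(data, c_char_p))
    rc = _adl.ADL_Display_WriteAndReadI2C(adapter, byref(req))
    if rc != ADL_OK:
        raise RuntimeError("I2C read failed, rc=%d" % rc)
    return bytearray(data.raw)[0]

if ADL_Main_Control_Create(ADL_Main_Memory_Alloc, 1) != ADL_OK:
    raise RuntimeError("ADL_Main_Control_Create failed")
# Example only: hypothetical bus 0, device 0x70, register 0x00 on adapter 0.
# print(hex(read_i2c_byte(0, 0, 0x70, 0x00)))

The hard part is exactly what was said above: knowing which VRM controller a given board uses and which registers hold the temperatures. radeonvolt hard-codes that for the reference 5850, which is why it only supports a handful of cards.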
member
Activity: 61
Merit: 10
Bitcoin believer
I think it would take a lot of work to retrieve values over I2C. Moreover, different cards have different I2C chips, so it's not possible to support every card.

A similar discussion is on the forum, too.

https://bitcointalksearch.org/topic/radeonvolt-hd5850-reference-voltage-tweaking-and-vrm-temp-display-for-linux-10228

I think mjmvisser had tried to read I2C but failed; that's why he commented out the I2C-related code, right?

@lueo
Will it be possible to read VRM temps in the future?

sr. member
Activity: 274
Merit: 250
@lueo
Will it be possible to read VRM temps in the future?
newbie
Activity: 58
Merit: 0
OK, the fix is on GitHub, along with another fix from lueo that shows the fan speed if only one of RPM or percentage is available. Please pull and test, and I'll push the update to PyPI when both issues are confirmed fixed.

thanks!
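
For anyone testing the fan-speed change, the idea is just that the thermal controller reports speed as RPM, as a percentage, or both, so the display code asks for one type and falls back to the other. A rough sketch of that logic, assuming the adl3 bindings expose ADL_Overdrive5_FanSpeed_Get, ADLFanSpeedValue and the two ADL_DL_FANCTRL_SPEED_TYPE_* constants from the ADL SDK:

Code:
# Report fan speed as RPM if available, otherwise as a percentage.
# Assumes ADL_Main_Control_Create() has already been called (as atitweak
# does at startup).
from ctypes import sizeof, byref
from adl3 import *

def fan_speed(adapter, thermal_controller=0):
    """Return (value, unit) for the first speed type the driver will report."""
    for speed_type, unit in ((ADL_DL_FANCTRL_SPEED_TYPE_RPM, "rpm"),
                             (ADL_DL_FANCTRL_SPEED_TYPE_PERCENT, "%")):
        value = ADLFanSpeedValue()
        value.iSize = sizeof(value)
        value.iSpeedType = speed_type
        if ADL_Overdrive5_FanSpeed_Get(adapter, thermal_controller,
                                       byref(value)) == ADL_OK:
            return value.iFanSpeed, unit
    return None, None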
newbie
Activity: 58
Merit: 0
For whatever reason, a 64-bit libXext.so doesn't exist on my server, but libXext.so.6 is there. I'll update GitHub and push out another release to PyPI.
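
In case anyone else packages around this: the change is just a fallback from the unversioned name to the versioned soname when loading with ctypes, since typically only the -dev packages ship the bare libXext.so symlink. A sketch of that fallback (where exactly adl3 loads libXext is an implementation detail and may differ):

Code:
# Fall back from libXext.so (dev installs) to libXext.so.6 (runtime-only installs).
from ctypes import CDLL, RTLD_GLOBAL

def load_libxext():
    for name in ("libXext.so", "libXext.so.6"):
        try:
            return CDLL(name, mode=RTLD_GLOBAL)
        except OSError:
            continue
    raise OSError("could not load libXext (.so or .so.6)")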