Thanks for the how-to on updating the GPU ROMs. I was getting ~550kh/sec before and I'm getting 650kh/sec now.
Still not at the purported 750kh/sec though (see below).
A couple of "gotchas" that I stumbled into along the way:
First, since I have no Windoze machines I used unetbootin to make the FreeDOS USB bootable drive. It worked, but I was dropped into A:, which had neither atiflash.exe nor the ROM on it. This really confused me, and I tried B: but it didn't exist.
After much fiddling about I tried C:, just in case, and lo and behold there were the files. *facepalm* So, after booting from your FreeDOS USB, remember to go to C: to find your BIOS flashing files.
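For reference, once you're on C: the flashing itself is just a couple of atiflash commands. The ROM filename and adapter number below are only placeholders for illustration - use whatever your ROM is actually called and check which adapter is which on your rig:

C:
atiflash -i (lists the adapters atiflash can see, so you know which number is which)
atiflash -s 0 backup.rom (saves the existing BIOS from adapter 0 - do this first!)
atiflash -p 0 new7970.rom (programs the new ROM onto adapter 0)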
Second, despite flashing both GPUs (just got two in there at the moment while waiting for risers) I was only getting about 350kh/sec from each card. WTF! Again, after much fiddling about and head scratching, and just as I had to leave the office (I'd brought the rig in so some of the guys at work could give me a hand), I noticed that one card was a lot hotter than the other. Then I realised I'd only plugged one screen in and hadn't been using the dummy plug. Doh!!
This one was a gotcha because cgminer's screen made it look like both cards were working, but I think the hashrate was being erroneously allocated evenly(ish) to both cards when in fact one was idle because it hadn't detected a screen. Anyway, I hope my n00bish mistakes help someone else.
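For anyone else running this on Linux, the usual sanity checks I've seen suggested (treat this as a sketch rather than gospel) are to make sure X has an entry for every card and that cgminer is pointed at the right display:

sudo aticonfig --list-adapters (should list every card)
sudo aticonfig --adapter=all --initial (writes an xorg.conf section for each GPU; restart X afterwards)
export DISPLAY=:0 (so cgminer can see all the GPUs, not just the one with a monitor/dummy plug attached)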
Regardless, I've only managed to get an average of 650kh/sec/GPU out of the cards using:
./cgminer --shaders 2048 --thread-concurrency 8192 --intensity 13 --worksize 256 --gpu-engine 1100 --gpu-memclock 1800
As an aside, setting the engine/mem speeds to 1080/1500 reduces it to 640kh/sec. Changing -g to 3 (not quite sure what -g does even after reading the docs, but whatever! :p) seemed to make no appreciable difference, or even reduced the hashrate, so given the health warning it comes with I decided to leave it alone.
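(For what it's worth, I believe -g is short for --gpu-threads, i.e. the number of mining threads per GPU, which would explain the health warning.) Also, rather than retyping that monster command line every time, cgminer can apparently load the same settings from a JSON config file passed with --config. Something like this should be equivalent to the above - an untested sketch on my part, with the pool and scrypt options left out:

{
"shaders" : "2048",
"thread-concurrency" : "8192",
"intensity" : "13",
"worksize" : "256",
"gpu-engine" : "1100",
"gpu-memclock" : "1800"
}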
When I try to tweak most things (other than clock speeds) I tend to get "invalid nonce" errors. This applies to many of the proposed settings on the github LTC mining comparison page - any intensity above 13 seems to barf in particular. I've read that you can get 700-750kh/sec from these cards though, and others on this thread have reported the same. Any suggestions? I saw something somewhere about it being better to use older graphics drivers, but I can't find it now.
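One thing I have seen suggested for invalid nonces at higher intensities (I think it's in cgminer's scrypt readme, but I'm going from memory, so check before trusting me) is to export these before launching, so the driver lets cgminer allocate bigger buffers:

export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1

Worth a try, anyway.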
Kate.
PS. The cards are Gigabyte HD 7970 3072MB GDDR5s with the latest BIOS, and I'm using cgminer 3.1.1. Oh, and Xubuntu.
Edit: I've now got three 7950s in the rig, and after reading the consolidated Litecoin mining guide I've found a thread concurrency above 8192 that seems to work and allows greater intensities without those invalid nonce errors. I'm currently trialling this:
./cgminer --thread-concurrency 22392 -I 19 -w 256 -g 2 --gpu-engine 1050 --gpu-memclock 1840 --temp-target 70 --auto-fan -o
It is not stable though.
A quick way I've found to tell if it's unstable is to try to ctrl-C (kill) it soon after starting. If it hangs, it's unstable. The above is tantalizingly squeezing a teeny bit more from the cards; I'm now up to just under 670kh/sec on average. Not much use with the instability though!
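In the meantime, a crude way to live with the instability (just a sketch - you'd need to add your own pool details, which I've left out of the -o bit, and it only helps when cgminer actually dies or gets killed, not while it's sitting there hung) is to wrap it in a restart loop:

#!/bin/sh
# Keep-alive loop: relaunch cgminer whenever it exits.
# Pool details (-o/-u/-p) deliberately omitted - add your own.
while true; do
  ./cgminer --thread-concurrency 22392 -I 19 -w 256 -g 2 --gpu-engine 1050 --gpu-memclock 1840 --temp-target 70 --auto-fan
  echo "cgminer exited, restarting in 5 seconds..."
  sleep 5
done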
With intensity back at 13 and the new thread concurrency I'm getting 630kh/sec. I'm exploring the range above 13 to see if there are any stable settings. I know most people say there aren't, but what the hell. :p Will let you know!