511 KH/s, core one at 84 and core two at 85 degrees. Still a bit too hot for my taste for everyday use.
I have the card in a regular desktop case, though. My final settings are:
cudaminer.exe -H 1 -C 2 -t 1 -i 1 -l K8x32,K8x32
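(For reference: -i 1 turns on interactive mode, and -l K8x32,K8x32 picks the Kepler kernel with an 8-block by 32-warp launch config for each half of the 690, at least as I understand the flags.)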
Any other suggestions? Also, this is with interactive mode on, but I notice my PC becomes very laggy.
Not yet. I finally managed to reproduce your thermal overload problem on my own setup with 2x GTX 690s:
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage      | GPU-Util Compute M. |
| 82% 90C  N/A  N/A / N/A     | 1087MiB / 2047MiB | N/A      Default    |
| 57% 78C  N/A  N/A / N/A     | 1087MiB / 2047MiB | N/A      Default    |
| 57% 79C  N/A  N/A / N/A     | 1087MiB / 2047MiB | N/A      Default    |
| 54% 75C  N/A  N/A / N/A     | 1087MiB / 2047MiB | N/A      Default    |
Toasty. That 90C isn't good unless you're planning on making tea on your computer.
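(In case it's useful: that readout is just nvidia-smi; leaving it looping in a terminal while mining, e.g. nvidia-smi -q -d TEMPERATURE -l 5, reprints the temperature section every 5 seconds.)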
My kernel is going to make your display laggy even in interactive mode, unfortunately. The only thing I can think of for reducing both power and lagginess without changing the code is to try -l K2x32,K2x32 or something similar. Have you given that a shot? It should shorten each kernel launch and increase the relative amount of time interactive mode spends telling the GPU not to mine. Could you let me know how that works?
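To make the idea concrete, here's a toy sketch (assumed names, nothing from the actual cudaminer source): with a smaller grid per launch, each kernel call returns quickly, and the host loop can yield between launches so the display driver gets a turn.

// Toy sketch, not the real scrypt kernel: a smaller launch config means
// shorter kernel calls, so the host can idle the GPU between launches.
#include <cuda_runtime.h>
#include <chrono>
#include <thread>

__global__ void busy_kernel(unsigned int *state, int iters) {
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    unsigned int v = state[idx];
    for (int i = 0; i < iters; ++i)        // stand-in for the hashing work
        v = v * 1664525u + 1013904223u;
    state[idx] = v;
}

int main() {
    const int blocks = 2, warps = 32;      // "K2x32": 2 blocks of 32 warps
    const int threads = warps * 32;        // 32 threads per warp
    unsigned int *d_state;
    cudaMalloc(&d_state, blocks * threads * sizeof(unsigned int));
    cudaMemset(d_state, 0, blocks * threads * sizeof(unsigned int));
    for (int launch = 0; launch < 1000; ++launch) {
        busy_kernel<<<blocks, threads>>>(d_state, 1 << 16);
        cudaDeviceSynchronize();           // a small launch returns quickly...
        std::this_thread::sleep_for(       // ...so the mining loop can leave
            std::chrono::milliseconds(10));//    the GPU idle for the display
    }
    cudaFree(d_state);
    return 0;
}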
I'm tied up for a while, but now that I have my 690 running I'll see if I can figure out any efficiency gains for it. Don't hold your breath, though: the 690 is pretty similar to the Grid K2 that I was optimizing for before. I think there are gains to be had for GK110 devices, but maybe not GK104.
-Dave
I wouldn't dare be disappointed; your help is much appreciated! I'm just trying to find out whether it would be possible for me to mine with my 690.
The setting you provided indeed helped a lot with the display lag (performance dropped to 419 KH/s, but I don't mind that). Unfortunately, temps still climb to 90 degrees after 2 minutes or so, so it's still a no-go :-).
I guess I'm out of luck (unless you find something). Thanks a lot for having a look!