
Topic: CCminer(SP-MOD) Modded NVIDIA Maxwell / Pascal kernels. - page 973. (Read 2347664 times)

sp_
legendary
Activity: 2954
Merit: 1087
Team Black developer
Edit: I forgot to check in something. Will do it now.


I have merged some changes from another coder called Nelson, but some of the code doesn't work, so I had to revert parts of it.

I think the best option will be to do a clean and a full rebuild with the latest source code.
hero member
Activity: 677
Merit: 500
In commit 980 I have a problem with text visualisation in Miner Control (the last correct line is tid xxxxxxxx, that's all...). No hash. No GPU number. But the miner is working. Can't test hashrates...
On NiceHash the miner is faster. I'm going to pay off the credit for my GTX 980 and send a payment for the great job!
legendary
Activity: 938
Merit: 1000
I currently have crossfired R9 270 cards in my computer. I am pulling them out and rebuilding my mining rig. I will be adding 2 R9 290x cards to make a four card rig. The R9 270 is currently the best overall card to mine ETH, getting 18 MH/s with each card, whereas reported hashrates for the R9 290x are about 21 MH/s. So obviously 2 of the R9 270 is better than a single R9 290x for this purpose, but I want my rig to be versatile with regards to mining other algos.

So.... that said, I am writing in this thread to say that I am debating whether or not to get a Nvidia card for my computer to replace the crossfired R9 270 cards.  I had ordered an R9 Fury X, but delays in shipping caused me to cancel. I am now down to deciding on either a single R9 290x Vapor-X or the Gigabyte GTX 970 G1.

This seems like it would give me the best versatility between my mining rig and my computer when mining various algos. Power consumption is not that much of a concern, but keeping temps cool in my computer case is a big plus that has me leaning towards getting the Nvidia card. I am only undecided at this point because from most available comparisons of hashrates that I can find, the R9 290x seems to greatly beat out the 970.

 I probably have said more than I need to, but what say ye? Is the 970 the better choice?


Edit: Okay, wrong thread for this, probably.
full member
Activity: 201
Merit: 100
@dominuspro, I see you're running a newer driver than me. It seems NVIDIA likes to change the method of enabling OC on all GPUs with each new driver release. I had to go through the same research because kopiemtu used to use driver 340.xx, and the OC method was different even back then.

At least it is possible to OC.
Anyway, I feel much more confident overclocking in a Windows environment with Afterburner.

sr. member
Activity: 427
Merit: 250
@dominuspro, I see you're running a newer driver than me. It seems NVIDIA likes to change the method of enabling OC on all GPUs with each new driver release. I had to go through the same research because kopiemtu used to use driver 340.xx, and the OC method was different even back then.
sp_
legendary
Activity: 2954
Merit: 1087
Team Black developer
My EVGA superclocked gtx 950 mines quark at 8.3MHASH on factory clocks. Release 62
I have the same card. How far does yours boost by default?
Mine goes to 1380. I got a shit ASIC, only 66%.

Don't have GPU-Z installed. Not sure. In the latest from github, the EVGA GTX 950 Superclocked is peaking at 8.5 MHASH, release 62+. So I managed to push it up 150-200 KHASH this weekend.
sp_
legendary
Activity: 2954
Merit: 1087
Team Black developer
Not much, but I guess every little bit helps. Wow, it charges a fee every time you send now?
I don't recall it doing that before on BTC, just every now and then if I recall right...

- @sp_ : Transaction ID: 6f3ef10c412df4da57adf4d9d2c685d4c0dfdee610a2440bd2ff9acbc410f1c5-000
- @djm34 : Transaction ID: 438f780019f9788797a58b982b46575daa47e5b6f5b3db8cfe9e254ed723ba57-000
- @pallas : Transaction ID: 88b404d6b9036c6b123d192f54d8accb52f1da347a5a5f5855dc3c229d648b7c-000

Thanks for your support. I have added more hash for the GTX 960 (quark). The other cards are also a bit faster.
If you can build it yourself, please try release 62+ (github).

Keccak512 with fewer instructions

full member
Activity: 201
Merit: 100
linux overclock:
As a complete Linux noob I lost an afternoon messing with the xorg.conf file. I could enable OC for the first GPU only. Then, luckily, I found the nvidia-xconfig command.
This command did the trick:
nvidia-xconfig -a --cool-bits=28

It generates a new xorg.conf.
-a enables 1 screen for each GPU
Coolbits 28 enables NVIDIA OC + fan controls in the X server settings and should also enable voltage control from the terminal. It is meant for Maxwell cards; older ones could need a different approach.


Tested on the latest Lubuntu with the NVIDIA 355.06 driver and some messed-up CUDA SDK 7.5 + 7.0.

I hope it helps somebody.

cuda 7.0 ( .28 ) and 7.5 break just about every hashrate except x11 from what i can see ( and x11 is down by around 200KH compiled with them also ) ....

but i will try this when i get the chance to change settings / systems ...

ill be in full swing again tomorrow - so a great deal needs to be done ... debian based systems ( like ubuntu ) may work differently than fedora - so i will try and have a look at how this functions for fedora tomorrow ...

tanx for the info ...

#crysx

I know about the CUDA issues... I just tried following the tutorial on the Ethereum CUDA thread to start slowly with Linux.
Then I managed to also compile other miners and then started messing with overclocking... I'll probably have to do a new Linux install on a smaller USB stick, and it will be CUDA 6.5 then.

I forgot to say that it does not need any display or dummy plug connected to the second graphics card.
cheers
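The workflow described above can be sketched end to end. This is only a sketch, assuming an X session with the proprietary NVIDIA driver of that era; the nvidia-settings attribute names are the ones that driver generation exposed, and the offset values and gpu index are examples, not recommendations:

```shell
# Regenerate xorg.conf with one X screen per GPU and Coolbits=28
# (bit 2 = manual fan control, bit 3 = clock offsets, bit 4 = overvoltage):
sudo nvidia-xconfig -a --cool-bits=28

# After restarting X, apply per-GPU clock offsets from the terminal.
# "[3]" is the top performance level on Maxwell; offsets here are examples:
nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=100" \
                -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=300"
```

As the post notes, no display or dummy plug is needed on the second card for this to work.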
legendary
Activity: 2940
Merit: 1091
--- ChainWorks Industries ---
linux guys, check out the kopiemtu thread on litecoin.org. I solved the overclocking-multiple-cards issue around page 60-65 or so. I run driver 346.59 and cuda 6.5

tanx hashbrown ...

its a mess when it comes to a 'simple' way of enabling all those features with linux ...

nvidia - as large and equipped as they are - really dont put a great deal of thought and effort into making it a more simplified approach to facilitating the linux community ... amd are a lot worse though ...

your help with this has been priceless ...

tanx again ...

#crysx
legendary
Activity: 2940
Merit: 1091
--- ChainWorks Industries ---
linux overclock:
As a complete Linux noob I lost an afternoon messing with the xorg.conf file. I could enable OC for the first GPU only. Then, luckily, I found the nvidia-xconfig command.
This command did the trick:
nvidia-xconfig -a --cool-bits=28

It generates a new xorg.conf.
-a enables 1 screen for each GPU
Coolbits 28 enables NVIDIA OC + fan controls in the X server settings and should also enable voltage control from the terminal. It is meant for Maxwell cards; older ones could need a different approach.


Tested on the latest Lubuntu with the NVIDIA 355.06 driver and some messed-up CUDA SDK 7.5 + 7.0.

I hope it helps somebody.

cuda 7.0 ( .28 ) and 7.5 break just about every hashrate except x11 from what i can see ( and x11 is down by around 200KH compiled with them also ) ....

but i will try this when i get the chance to change settings / systems ...

ill be in full swing again tomorrow - so a great deal needs to be done ... debian based systems ( like ubuntu ) may work differently than fedora - so i will try and have a look at how this functions for fedora tomorrow ...

tanx for the info ...

#crysx
full member
Activity: 201
Merit: 100
linux overclock:
As a complete Linux noob I lost an afternoon messing with the xorg.conf file. I could enable OC for the first GPU only. Then, luckily, I found the nvidia-xconfig command.
This command did the trick:
nvidia-xconfig -a --cool-bits=28

It generates a new xorg.conf.
-a enables 1 screen for each GPU
Coolbits 28 enables NVIDIA OC + fan controls in the X server settings and should also enable voltage control from the terminal. It is meant for Maxwell cards; older ones could need a different approach.


Tested on the latest Lubuntu with the NVIDIA 355.06 driver and some messed-up CUDA SDK 7.5 + 7.0.

I hope it helps somebody.
sr. member
Activity: 427
Merit: 250
linux guys, check out the kopiemtu thread on litecoin.org. I solved the overclocking-multiple-cards issue around page 60-65 or so. I run driver 346.59 and cuda 6.5
full member
Activity: 231
Merit: 150

You only need to add one line, "12" will get you OC and fan control but only on cards with a monitor attached.
I've seen attempts to fake a monitor on a second card but nothing has worked for me.

http://www.phoronix.com/scan.php?px=MTY1OTM&page=news_item
Not really true, as I found out on my setup: 8 & 12 did nothing; only after I added "Coolbits" "5" was adjusting the fan speed enabled.
So I think it may depend on the card type and/or driver used; some may have to play with different numbers for each setup or driver.
I just never removed the others and pasted those because they did work for other setups.

Edit: have you tried making a dummy plug for those cards without a monitor?

https://rumorscity.com/2013/12/06/how-to-create-dummy-plugs-for-your-graphics-cards/

We had to use these back in the day, before Nvidia got their stuff together in their drivers, for Folding@home rigs with multi video cards.
I still have some lying around here. lol

I had only tried software hacks but this looks interesting.

The coolbits pattern changed, either with the driver version or card generation. I have the latest driver and
maxwell cards and "12" works for me. "5" is an old code, the phoronix article explains it.  Either way you
can set all the necessary bits on one line.

I'm using a pretty old driver, 304.125, because it's the only one I could get folding to work with without errors in Linux.
So that's probably why "5" is the only one that works in this setup. I still fold once a month with my main server rig
for the $10 EVGA Bucks; it saves me 120 dollars a year off my next EVGA buy. I didn't really buy much last year, so
I had enough to fully pay for the GTX 960 SSC with what I had saved up. 750k points gets the 1st $5, 1.5 mil gets the 2nd $5.
http://folding.extremeoverclocking.com/user_summary.php?s=&u=610174

Folding@EVGA http://forums.evga.com/EVGA-Folding-Year-8-m2301653.aspx
I've been folding for over 15 years, way back before we had the type of power we have today to do these things.
I only cut back on my folding after getting into cryptocurrency, and F@H made a 4P rig almost worthless right after I had paid $900
for the motherboard alone. So I put all my hardware to work making money instead of spending it to keep up with folding projects.
It is a good cause.

About the bits: I had read that they could all be on one line but was unsure if they needed spaces or commas in between,
since there were no examples shown and the post wasn't very clear. Thanks for clearing that up.
sr. member
Activity: 292
Merit: 250
lyra2v2 is working here..

(Release 62)


I previously mentioned that R62 didn't run on my machine; I found the problem. It seems CUDA 7 was the cause. I removed it and put in CUDA 6.5, and now everything is OK.
legendary
Activity: 1470
Merit: 1114

You only need to add one line, "12" will get you OC and fan control but only on cards with a monitor attached.
I've seen attempts to fake a monitor on a second card but nothing has worked for me.

http://www.phoronix.com/scan.php?px=MTY1OTM&page=news_item
Not really true, as I found out on my setup: 8 & 12 did nothing; only after I added "Coolbits" "5" was adjusting the fan speed enabled.
So I think it may depend on the card type and/or driver used; some may have to play with different numbers for each setup or driver.
I just never removed the others and pasted those because they did work for other setups.

Edit: have you tried making a dummy plug for those cards without a monitor?

https://rumorscity.com/2013/12/06/how-to-create-dummy-plugs-for-your-graphics-cards/

We had to use these back in the day, before Nvidia got their stuff together in their drivers, for Folding@home rigs with multi video cards.
I still have some lying around here. lol

I had only tried software hacks but this looks interesting.

The coolbits pattern changed, either with the driver version or card generation. I have the latest driver and
maxwell cards and "12" works for me. "5" is an old code, the phoronix article explains it.  Either way you
can set all the necessary bits on one line.
legendary
Activity: 2940
Merit: 1091
--- ChainWorks Industries ---

You only need to add one line, "12" will get you OC and fan control but only on cards with a monitor attached.
I've seen attempts to fake a monitor on a second card but nothing has worked for me.

http://www.phoronix.com/scan.php?px=MTY1OTM&page=news_item
Not really true, as I found out on my setup: 8 & 12 did nothing; only after I added "Coolbits" "5" was adjusting the fan speed enabled.
So I think it may depend on the card type and/or driver used; some may have to play with different numbers for each setup or driver.
I just never removed the others and pasted those because they did work for other setups.

Edit: have you tried making a dummy plug for those cards without a monitor?

https://rumorscity.com/2013/12/06/how-to-create-dummy-plugs-for-your-graphics-cards/

We had to use these back in the day, before Nvidia got their stuff together in their drivers, for Folding@home rigs with multi video cards.
I still have some lying around here. lol

ours have dummy plugs on all the machines - but only need it on the FIRST card that a monitor sits on ...

amd cards never needed those ...

tanx for your info as well ... ill have to have a look at how this works in fedora next week ...

#crysx
full member
Activity: 231
Merit: 150

You only need to add one line, "12" will get you OC and fan control but only on cards with a monitor attached.
I've seen attempts to fake a monitor on a second card but nothing has worked for me.

http://www.phoronix.com/scan.php?px=MTY1OTM&page=news_item
Not really true, as I found out on my setup: 8 & 12 did nothing; only after I added "Coolbits" "5" was adjusting the fan speed enabled.
So I think it may depend on the card type and/or driver used; some may have to play with different numbers for each setup or driver.
I just never removed the others and pasted those because they did work for other setups.

Edit: have you tried making a dummy plug for those cards without a monitor?

https://rumorscity.com/2013/12/06/how-to-create-dummy-plugs-for-your-graphics-cards/

We had to use these back in the day, before Nvidia got their stuff together in their drivers, for Folding@home rigs with multi video cards.
I still have some lying around here. lol
legendary
Activity: 1470
Merit: 1114

Hope this helps. You'll need to add a few lines to /etc/X11/xorg.conf
My .conf looks like so: you'll have to open the /etc/X11 folder as admin before you can edit xorg.conf
Code:
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 304.125  (buildmeister@swio-display-x64-rhel04-14)  Mon Dec  1 21:18:22 PST 2014

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 346.72  (buildd@toyol)  Tue May 19 14:39:51 UTC 2015

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"

    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "NOR LM-965WA"
    HorizSync       30.0 - 83.0
    VertRefresh     50.0 - 76.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 660 Ti"
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-0"
    Option         "metamodes" "nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
you'll need to add these lines:
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
in this section:
Code:
Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 660 Ti"
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
EndSection
just as you see here.
I'm running a pretty old driver 304.125 and it works for adjusting fan speed, but no OC options.
I'd post some pictures but I don't have any programs like in windows for quick upload and links in the Linux setup.

You only need to add one line, "12" will get you OC and fan control but only on cards with a monitor attached.
I've seen attempts to fake a monitor on a second card but nothing has worked for me.

http://www.phoronix.com/scan.php?px=MTY1OTM&page=news_item
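Since Coolbits is a bitmask, the three separate Option lines can be collapsed by OR-ing the bits into one value; a single combined value avoids any ambiguity about which duplicate Option the X server honors. A small sketch (bit meanings as described in the NVIDIA driver README and the Phoronix article above):

```shell
# Coolbits bit values (Maxwell-era NVIDIA driver README):
#   4  - manual fan control
#   8  - clock offsets (overclocking) on Fermi and newer
#   16 - overvoltage control
FAN=4; OC=8; VOLT=16

echo $((FAN | OC))          # 12 -> OC + fan control
echo $((FAN | OC | VOLT))   # 28 -> everything, as nvidia-xconfig --cool-bits=28 sets
```

So `Option "Coolbits" "12"` (or "28") on one line covers what the three stacked lines were trying to do.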
full member
Activity: 231
Merit: 150
RELEASE dot 62--

I am getting better performance on both my 750ti cards and my 970 cards when mining Quark.  The 750ti cards get hash rates between 6.5Mh/s and low 6.7Mh/s.  My 6-card 750ti rig is getting about 39.6Mh/s.  My 970 cards get 14-15Mh/s.  My 960 cards are back up to 10.6Mh/s.

Running in Linux, I get stats on fan speed and temperature readings.  I don't miss the "thread ID" info much, but the temp readings are higher than expected and have led me to research nvidia-smi controls.  I think I need to switch to a newer driver in order to set the fan speed with nvidia-smi.  I'd like to set it at 85-95% for all the cards, just one fixed high speed for all.  My 970 cards are running near 80 deg C.       --scryptr
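For what it's worth, on drivers of that era nvidia-smi generally exposes GeForce fan speed read-only; setting it goes through nvidia-settings once the Coolbits fan-control bit is enabled. A sketch only, assuming a running X server and that fan N pairs with gpu N (the 90% target and the 6-card index range are examples matching the rig described above):

```shell
# Monitoring is fine from nvidia-smi:
nvidia-smi --query-gpu=index,temperature.gpu,fan.speed --format=csv

# One fixed high fan speed for all cards, via nvidia-settings:
for i in 0 1 2 3 4 5; do
    nvidia-settings -a "[gpu:$i]/GPUFanControlState=1" \
                    -a "[fan:$i]/GPUTargetFanSpeed=90"
done
```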

Hope this helps. You'll need to add a few lines to /etc/X11/xorg.conf
My .conf looks like so: you'll have to open the /etc/X11 folder as admin before you can edit xorg.conf
Code:
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 304.125  (buildmeister@swio-display-x64-rhel04-14)  Mon Dec  1 21:18:22 PST 2014

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 346.72  (buildd@toyol)  Tue May 19 14:39:51 UTC 2015

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"

    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "NOR LM-965WA"
    HorizSync       30.0 - 83.0
    VertRefresh     50.0 - 76.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 660 Ti"
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-0"
    Option         "metamodes" "nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
you'll need to add these lines:
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
in this section:
Code:
Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 660 Ti"
    Option "Coolbits" "5"
    Option "Coolbits" "8"
    Option "Coolbits" "12"
EndSection
just as you see here.
I'm running a pretty old driver 304.125 and it works for adjusting fan speed, but no OC options.
I'd post some pictures but I don't have any programs like in windows for quick upload and links in the Linux setup.
full member
Activity: 231
Merit: 150
CPU-BOOST SWITCH--

The only time I have had better hash rates was when there was a CPU-load issue in an earlier version of CCminer.  Currently, Genoil, of the Ethminer dev team, has enabled a CPU-boost switch that places a load on the CPU but results in a higher ether coin hash rate.  This was a result of collaboration with SP_ and Epsylon3, where the CPU-load "bug" was corrected.  In retrospect, I wonder if a "--cpu-boost" command-line switch could be enabled in CCminer, where a load is placed on the CPU if selected, and a higher hash rate results.

I have 2 headless rigs, and their sole task is mining.  I would load the CPU on these rigs if a higher hash rate resulted.  On my work PC, I would omit the switch so that my GTX 960 could mine and I could browse the web.

A properly bred bug might be a prizewinner.         --scryptr
You might be able to change the OS power options to high performance so that the CPU is forced to run at a higher clock rate.
Also, there are some options in the motherboard BIOS for Intel that can be enabled or disabled to force full clock speed at all times.
Not sure about AMD motherboards; I haven't played with any of those much in quite a few years.


It can actually run at a higher clock or at full speed without any load at all on the CPU.
Of course, more power is used because it is running at full speed at all times. That's the only drawback, well, that and extra heat.
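On the Linux side, the same "force full clock speed" idea can be approximated from userspace via the cpufreq governor, without touching the BIOS. A sketch assuming the standard sysfs cpufreq layout (needs root, and the available governors depend on the cpufreq driver in use):

```shell
# Pin every core to the "performance" governor:
for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo performance | sudo tee "$g" > /dev/null
done

# Or, if the cpupower utility is installed:
sudo cpupower frequency-set -g performance

# Check the result on core 0:
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
```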