Big shout out to Phil for sending me the extra molex cable I needed to get my 4th 470 up and running! Thanks also for the extra goodies you included Phil... really appreciate it!
So, as it stands now, I am getting just shy of 82 MH/s on ETH and about 1300 MH/s on DCR. I am happy to finally have this rig running with 4 GPUs, but I am still baffled that these MSI 470s are only getting 20.7 per card on ETH. I have tried pretty much everything suggested in this forum and in the Claymore readme file, but nothing notably affects the speed of ETH mining. I've done the following:
- Added the 5 setx lines to my .bat file
- Added the "-ethi xx" command within the command line of the .bat file. Tried it at 10, 12, 14 & 16.
- Added the "-ethi xx" command to the config file. Tried it at 10, 12, 14 & 16 in various places.
- Increased my Virtual Memory from 16GB to 24GB
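For reference, my .bat looks roughly like this (pool addresses, wallet, and worker name are placeholders; the setx lines are the five AMD environment variables commonly suggested in the Claymore readme, so treat this as a sketch rather than a known-good config):

```shell
setx GPU_FORCE_64BIT_PTR 0
setx GPU_MAX_HEAP_SIZE 100
setx GPU_USE_SYNC_OBJECTS 1
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_SINGLE_ALLOC_PERCENT 100
EthDcrMiner64.exe -epool <eth-pool>:<port> -ewal <wallet>.<worker> -epsw x -dpool <dcr-pool>:<port> -dwal <dcr-wallet> -dcoin dcr -ethi 16
```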
None of these moved the needle on ETH mining at all... still all 4 GPUs averaging 20.7 per card. The only thing that affected ETH speed at all was ratcheting down DCR intensity significantly, but the minimal gain of ~0.5% (from 20.72 to 20.82) would actually be more than offset by the dramatic decrease in DCR effectiveness. On the flip side, bumping DCR up to 35 increased its hash rate by ~15% but only nominally affected ETH mining (less than 0.1%). Bottom line, nothing I've tried so far is getting me even remotely in the ballpark of the ~23 MH/s that others are getting from these GPUs out of the box.
I would almost just think I got a bad batch, but my 4 GPUs came from 2 different vendors (3 from Amazon and 1 from Newegg), spread over 3 orders over 2-3 weeks, so this seems unlikely. I really wanted to close this gap before proceeding with modding, but it's looking bleak at this point. I am putting off purchasing a 5th GPU for this rig until I can figure out exactly how much I can improve its current performance. Frankly, at less than 21 per card, it's barely going to be profitable, especially at the ridiculous power draw it's currently pulling... which reminds me...
So, I have my rig plugged into a Kill A Watt meter. While running the rig with 3 GPUs, it was pulling around 630-640 watts. This seemed high to me because I was figuring a max of 330 for the GPUs + 140 for the i7-6800K, which left about 160 watts for the mobo, SSD, heatsink, etc. Again, it seemed a bit high but not out of the realm of possibility.
Well, here's the perplexing thing... after hooking up the 4th GPU, the Kill A Watt now says I am using 810 watts! That's a 170-watt bump over the 3-GPU setup. So, how exactly can a GPU with a max TDP of 110 watts add 170 watts of power usage? At this rate, I couldn't even add another GPU to this rig, as my PSU is 1 kW. I know I need to really dig into the power settings that have been mentioned here recently, but I was wondering if anyone could explain how a 110W GPU could add 170W to my power draw.
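For what it's worth, here's the rough arithmetic I'm trying to reconcile (the PSU efficiency figure is purely my assumption, not a measurement):

```python
# Rough sketch of where the extra 170 W could be going.
# Assumption (mine): the PSU is roughly 87% efficient at this load,
# so the wall reading includes conversion losses on top of the DC draw.
wall_delta_w = 810 - 640            # extra wall draw after adding the 4th GPU
psu_efficiency = 0.87               # assumed, not measured

dc_draw_w = wall_delta_w * psu_efficiency   # what the card side actually pulls
print(f"Wall delta: {wall_delta_w} W")
print(f"Estimated DC draw: {dc_draw_w:.0f} W")
# Even after subtracting assumed PSU losses, ~148 W DC is well above the
# 110 W TDP, which suggests TDP isn't the card's true max under dual-mining load.
```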
Thanks again for any tips!
I would push the core down 20%
Up the memory to 1850
That will up the MH/s and lower the power without a BIOS mod
120 TDP isn't the max power draw.
Also MSI cards have a higher power limit set than the other 470s
If you want help with a BIOS mod I will help any way I can
If you pay 10 cents or more for power I wouldn't dual mine. You're only talking about maybe $10-20 profit
Pushing the cards harder and hotter for only $20... ehh, I'd save them
Yeah, my electricity is 11 cents, so doing rough math for 110 watts, it costs me about $9/month. At best with 4 GPUs on DCR, I could mine maybe 15 in a month, so at DCR's current rate, maybe $5 profit/month at best, so really not worth it. Now, if it goes to $2 and above, then it's certainly worth looking at, but not really at $1.
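The rough math above, sketched out (the 110 W figure is my estimate of the extra draw from dual mining, not a measurement):

```python
# Monthly cost of ~110 W of extra draw at $0.11/kWh, running 24/7.
extra_kw = 0.110              # estimated extra draw from dual-mining DCR, in kW
rate_per_kwh = 0.11           # my electricity rate, $ per kWh
hours_per_month = 24 * 30

monthly_cost = extra_kw * rate_per_kwh * hours_per_month
print(f"${monthly_cost:.2f}/month")    # ~$8.71, i.e. roughly $9
```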
I am curious about your 2 suggestions... can these settings be changed within the AMD WattMan app? I haven't really dug into that yet, but if that's indeed where they're managed, I will certainly try it ASAP!
Yes, those settings are for WattMan. It'll be the easiest way to play around with them
OK, so I attempted to make the recommended changes this morning but not really seeing any results, so I am probably doing something wrong.
Here is what I tried in Wattman:
- decreased Power Limit (in the Temp section) by 20%
- increased the Memory Frequency from 1650 to 1700 (I know 1850 was recommended, but the max value I could set in WattMan was 1700, and only "State 1" was enabled on all 4 cards, so only 1 state moved, and only up 50 MHz)
I applied these changes on all 4 GPUs. There was NO impact on hash rate. There was an impact on wattage usage, but in the wrong direction... it went from about 695 to 710. When I moved the Power Limit back to anything between -10% and 0 (flat), I got back down into the 695 range.
After this, I opened Afterburner. Also my first experience with it so still early in the learning curve on all of this stuff.
- Did further testing with Power Limit but same results as with Wattman... no impact between -10% & flat but increased wattage pull with -20%
- In this app, Memory Clock was movable all the way up to around 2000, so I tried 1850 and applied it to all 4 GPUs, but alas, again, zero impact on hash rate on any of the 4 GPUs.
The other thing I briefly played with (with scary results) was the GPU frequency in WattMan. I saw a YouTube video where they were pulling the State 6 & 7 frequency max values down to around 970 & 1000 respectively. I tried that on GPU #1 (not a large decrease from the displayed values) and it actually lowered my wattage a bit. But when I did the exact same thing to GPU #2 (this was actually a larger % decrease from the displayed values, as this card seems to run hotter & pull more wattage than the others), it ended up stopping that GPU and crashing my system. After that, I didn't mess with the GPU settings again.
So, given all of this, can anyone provide some advice on where I am going wrong? Where exactly should I be trying to lower the power/core settings? And, am I missing something on the Memory increase side?
Thanks again for the help!