Hey everybody!
After countless hours of research, and obviously a lot of reading about AMD cards holding the Ether mining crown, I decided to go full Nvidia for my first dedicated 6-GPU rig. I figured that even if the ETH hashrate is a bit lower than with RX 4XX cards, Nvidia offers more possibilities on other cryptos. Let's not forget that ETH has been slated to go PoS since last year. Plus, on the power consumption side, there's no contest: Nvidia holds first place by far.
I read as much of this very thread as I could, and checked the results on whattomine.com, but I'm still wondering what hashrates Nvidia users actually get in the real world, outside the theoretical field.
I'm already dual mining ETH+LBC with 2x GTX970 [41 MH/s / 64 MH/s], since it happened to be the most valuable combo so far, but I doubt these results extrapolate accurately to a GTX1070 upgrade.
I'd also like to ask which GTX1070 brand/model you're using, and ideally why. I'm looking for a good balance between performance and longevity. With proper maintenance, I plan to keep these running for at least two years [reboot and dust-off every week + full cleaning and basic maintenance every other month + full check and fan greasing twice a year].
I've decided on the GTX1070, but I've also got a very nice opportunity for an EVGA GTX1080 SC, and was wondering if people had performance feedback on that particular GPU. From what I've read, the 1080s weren't doing very well, apparently because of a lack of miner optimisation [on top of the GDDR5X issue]. Do you know if this has been fixed, or will it just stay incompatible?
Cheers!
Try some high-end R9's, like the Nano, on a low-epoch ETH alt like EXP.
Nah, sorry, I've already made up my mind, and it won't be cheap R9's but, like I said, GTX1070s.
So, besides that useful reply, is anyone here running Nvidia rigs?
Nvidia GPUs are good for mining ZEC, not so good for ETH.
Well, that might have been the case a year ago, but if you look at the whattomine.com calculators, you'll easily see that:
+ GTX1070 ETH hashrates are close to the RX480's
+ GTX1070s have a much wider range of possibilities, with much better results
+ GTX1070s use about 30% less power for similar results
1070 ZEC hashrates are 450-ish sol/s, almost double the 480's.
1070s use 10% less power than 480s.
The 1070 is much more capable, but it needs miner software (devs).
ZEC
+ 6 x RX480 - 1740 sol/s @ 720W
+ 6 x GTX1070 - 2520 sol/s @ 720W
ETH
+ 6x RX480 - 174 MH/s @ 810W
+ 6x GTX1070 - 171 MH/s @ 630W
LBRY
+ 6x RX480 - 630 MH/s @ 1050W
+ 6x GTX1070 - 1650 MH/s @ 720W
SIA
+ 6x RX480 - 6900 MH/s @ 1260W
+ 6x GTX1070 - 9600 MH/s @ 720W
DCR
+ 6x RX480 - 11820 MH/s @ 1140W
+ 6x GTX1070 - 15000 MH/s @ 750W
PASC
+ 6x RX480 - 42004 MH/s @ 810W
+ 6x GTX1070 - 5640 MH/s @ 720W
[Source : whattomine.com]
Except on ETH [-1.8%], the GTX1070 hashes way more than the RX480, for a much lower power draw, often far more than 10% lower. Actually, even on ETH, once you factor in power consumption, the GTX1070 comes out more profitable than the RX480...
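To make the ETH point concrete, here's a quick back-of-the-envelope check in Python using only the 6-card figures from the table above, comparing hashrate-per-watt efficiency:

```python
# Efficiency check on the ETH figures above (6-card rigs, whattomine numbers).
rx480 = {"mhs": 174, "watts": 810}    # 6x RX480
gtx1070 = {"mhs": 171, "watts": 630}  # 6x GTX1070

eff_480 = rx480["mhs"] / rx480["watts"]       # MH/s per watt
eff_1070 = gtx1070["mhs"] / gtx1070["watts"]  # MH/s per watt

print(f"RX480:   {eff_480:.3f} MH/s per watt")
print(f"GTX1070: {eff_1070:.3f} MH/s per watt")
print(f"1070 hashes {100 * (1 - gtx1070['mhs'] / rx480['mhs']):.1f}% less ETH, "
      f"but draws {100 * (1 - gtx1070['watts'] / rx480['watts']):.1f}% less power")
```

That works out to roughly 0.215 vs 0.271 MH/s per watt: about 1.7% less hashrate for about 22% less power, which is why the 1070 still wins on ETH once electricity is counted.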
As far as my experience with my tiny 2x GTX970 test rig goes, dual mining ETH + LBRY is the most profitable: you get about 10% less ETH hashrate than solo mining [44 MH/s solo vs 40 MH/s dual] and about 50% less LBRY hashrate than solo mining [at -dcri 30; of course, LBRY hashrate goes up as you increase the dcri intensity, but so does power draw].
I used these ratios to estimate dual mining on 6x GTX1070 [and maybe I'm wrong here, it may lose less than 50% on LBRY, but it's a reasonable baseline], and here's what I'd get daily at current exchange rates and difficulty, after electricity costs:
ZEC - $14.99
LBRY - $15.38
ETH [-10%] + LBRY [-50%] - $21.97
Best result with 6x RX480 at current exchange rates and difficulty, after electricity costs:
ETH - $13.82
Either I'm missing a whole lot of information here, or whattomine.com urgently needs to debug its calculators, or Nvidia has just ruined it all...