Well, it's a loaded question, so I'll give a loaded answer. On the Titan Xps, right now I've got the power limit set to about 80%, with memory at 6000 MHz and the GPU core at roughly 1733 MHz. That gives me roughly 720-740 Sol/s. Before you scream rip-off, I have a mini rig with a couple of 1080 Tis that can do the same with mild tweaks... but the Titans use a lot less energy at 80%: around 225 W, versus 275-300 W for the Tis to get the same result. I can push the Titans to 810-820 Sol/s by upping the power limit to 110-120%, but consumption jumps to about 300 W and I start to hit diminishing returns. If I had free power it wouldn't matter and I'd leave them at 820 Sol/s.
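For anyone weighing the same trade-off, here's a rough back-of-the-envelope sketch in Python, using the numbers above (mid-range values assumed wherever I quoted a range), showing why the extra Sol/s from the higher power target stop paying for themselves:

```python
# Rough Sol/s-per-watt comparison for the operating points quoted above.
# Mid-range values are assumed wherever a range was given.
operating_points = {
    "Titan Xp @ 80% power":      (730, 225),  # ~720-740 Sol/s at ~225 W
    "Titan Xp @ 110-120% power": (815, 300),  # ~810-820 Sol/s at ~300 W
    "1080 Ti, mild tweaks":      (730, 288),  # same Sol/s at 275-300 W
}

for name, (sols, watts) in operating_points.items():
    print(f"{name:27} {sols} Sol/s / {watts} W = {sols / watts:.2f} Sol/W")
```

That works out to roughly 3.2 Sol/W at 80% power versus about 2.7 Sol/W at 110-120%, which is all "diminishing returns" means here.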
My 1080s average 510-540 Sol/s at 65-75% power. Interestingly enough, I have two cheapo MSI reference cards in their own separate rig doing 545 Sol/s, beating all my other EVGA and Gigabyte 1080s; I can't explain it.
Total hashrate is about 20 kSol/s with what's in the picture.
Hi mate, I'm a bit new to mining but I already have a 6x 1080 Ti rig (Strix), and since energy is expensive here I've been looking for an efficient spot. I settled on 70% TDP at stock clocks and I'm getting around 620-640 Sol/s at 170-180 W and 55-65 °C per card. I'm just curious why a more experienced miner like you ends up around 550 Sol/s? I understand you're going after efficiency too, but your numbers look a bit odd compared to mine, and I wonder whether I'm doing/calculating something wrong.
I'm also running OC tests today and they look good, but only on the core clock; memory doesn't seem to matter much, even at -500, and the stock memory clock gives the best hash/W. Don't you overclock? Too risky?
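If it helps to sanity-check the math, here's a minimal sketch of the Sol/W comparison. The board powers are assumptions based on the reference cards (about 180 W TDP for a GTX 1080 and 250 W for a 1080 Ti), used only to convert a % power limit into watts; factory-OC boards like the Strix will have slightly different limits.

```python
# Minimal Sol/W sanity check. Board powers are reference-card assumptions
# (GTX 1080 ~180 W TDP, GTX 1080 Ti ~250 W TDP); factory-OC boards differ.
def sol_per_watt(sols, tdp_watts, power_limit_pct):
    watts = tdp_watts * power_limit_pct / 100.0
    return sols / watts, watts

cards = [
    # (label, Sol/s per card, assumed reference TDP in W, power limit %)
    ("1080 Ti @ 70% TDP", 630, 250, 70),  # ~620-640 Sol/s, ~175 W measured
    ("1080 @ ~70% power", 525, 180, 70),  # ~510-540 Sol/s
]

for label, sols, tdp, pct in cards:
    eff, watts = sol_per_watt(sols, tdp, pct)
    print(f"{label:20} ~{watts:.0f} W -> {eff:.2f} Sol/W")
```

On those assumptions neither calculation looks off: if the ~550 Sol/s figures above are from the non-Ti 1080s, the gap is just the card, not the math.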