Yes, I agree the MSI Gaming X is not worth the extra ££ if buying for mining. I bought it for gaming initially when they were released; mining wasn't on my radar until around May. Same as Phil's, when pushed it will pull 505 sol/s, which is really good for a 1070, but I can drop 75 W on my wattmeter and still pull around 495 sol/s, and it runs cool and quiet with the massive heatsink. Like you say, the extra 6-pin power plug can be a pain for mining too.
How many sol/s do you get out of the 1080s? I always thought the price gap between the cheapest 1070s and the cheapest 1080s made the 1070 better value. I guess it depends where you buy from and which country you're in, though.
From what I can see the 1060 3GB has the shortest ROI, in the UK at least. But then there's the argument that the 1080 Ti needs fewer PCIe slots, but then it needs bigger PSUs and generates more heat... lots to consider!
The GTX 1080 for a short while was selling for the same as, or a hair LESS than, the GTX 1070 during the peak of the "price gouge" period this summer.
500 sol/s out of a 1080 is pretty trivial, and they seem to provide a few sol/s more at the same wattage settings than a GTX 1070 once you get above 130 watts or so.
Depending on price, ZEC hashrate/$ is generally about the same on a GTX 1070, GTX 1080, and GTX 1080 Ti, especially when you are comparing at the "system" level - individual sale prices, individual card variations, and how you set them up to run (for efficiency, or for max production) all seem to have more impact than the actual card model.
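To make that "system level" point concrete, here's a quick sol/s-per-dollar comparison. All the prices and hashrates below are illustrative assumptions, not real quotes - plug in your own local numbers and the gap between models mostly washes out:

```python
# Rough ZEC hashrate-per-dollar comparison across card models.
# Figures are assumed examples only, not quoted prices or benchmarks.
cards = {
    # model: (approx sol/s, assumed price in USD)
    "GTX 1070":    (470, 450),
    "GTX 1080":    (550, 550),
    "GTX 1080 Ti": (730, 750),
}

for model, (sols, price) in cards.items():
    print(f"{model}: {sols / price:.2f} sol/s per $")
```

With these made-up numbers all three land around 1 sol/s per dollar, which is why the individual deal you find matters more than the model on the box.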
The other thing I have to factor in on my setup is the "infrastructure" of how the existing outlets are set up, as I don't have the option of making significant changes to this place. On the positive side, a lot of the existing outlets are clustered and almost every existing "duplex outlet" is on its own dedicated 15 A circuit, so I don't really HAVE to change things around to make them work viably well.
I do have one cluster running from a subpanel I set up to plug into a 220V 30 Amp "dryer" circuit, but for ease of management I set that up as 4 x 110V 15 Amp circuits near some of the existing outlets.
Basically, I aim each rig (except the ASUS B250 mining one) at 6 amps of power draw at the wall, which ends up being about 540-550 watts of GPU power draw plus the rest of the system. The CPUs do BOINC work, and I have the hard drives set up for BURST mining, so all of my dedicated "mining" rigs are working on at least 3 things at a time - EXCEPT the ASUS and the other "experiment" Intel rig I built, due to Intel drivers not playing well with other drivers for OpenCL usage.
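The 6-amp budget works out like this. The outlet voltage and the system-overhead figure here are assumptions for illustration (actual overhead depends on CPU load and PSU efficiency):

```python
# Back-of-envelope check of a 6 A per-rig budget on a 110 V circuit.
# The 110 W overhead figure is an assumed example, not a measurement.
volts = 110                  # nominal outlet voltage
amps_per_rig = 6             # target draw per rig at the wall
wall_watts = volts * amps_per_rig          # total wall draw
system_overhead = 110        # assumed CPU / drives / fans / PSU losses
gpu_budget = wall_watts - system_overhead  # what's left for the GPUs
print(f"{wall_watts} W at the wall, ~{gpu_budget} W available for GPUs")
```

That puts the wall draw at 660 W, leaving roughly the 540-550 W of GPU draw mentioned above, comfortably under the 15 A circuit rating.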
I may end up switching the CPUs over to mining Monero at some point though - FX-8xxx chips aren't too bad at that, and pay off a LOT faster than Ryzens do (70% or so of the hashrate for about the same power draw on an 8320E vs a Ryzen 1700, but more like 35% of the cost).
Most of my rigs were ORIGINALLY built to run Folding@Home, where using a bottom-end CPU is a "bad thing, kills your production VERY badly" and using anything less than a 16-lane PCI-E 2.0 slot also hits production noticeably, so they're not as optimised for pure mining use as they could be.
That's also why most of my rigs are NVidia based - I do have some AMD rigs as well, but right now they're hammering on the Distributed.Net project via the BOINC "Moo Wrapper" project, and generating enough Gridcoin to pay the power bill for them (and not much more any more).
I sometimes impress myself with the thought of what kind of production I could do on D.Net/MooWrapper if I swapped all of my NVidia rigs over to that for a couple of days....