If you were a real miner, you would be doing research before making a purchase.
You keep ASSuming that all miners "purchase" all of their gear to mine with, as opposed to the LARGE number of us who already had EXISTING gear that we put (or put BACK) into mining service once something came along that it was profitable on.
This is just as applicable to NVidia as to AMD, though NVidia has rarely been competitive with AMD on mining, and as a result there is a lot less "older" NVidia gear around being used for mining. (In my case, 2 x GTX 960, 3 x GTX 950, and 2 x GTX 750 Ti on the NVidia side versus 5 x HD 7750, 2 x 7870, and a 7850 on the AMD side - but I'm probably a rarity in having comparable numbers of GPUs on each side when the current ETH/ETC/XMR/ZEC/etc. period of profitability began.)
On the other hand, NVidia's "new GPU architecture with each generation" was nothing more than a minor rehash, usually just a "more cores on the top end" change. The only REAL difference is that they "officially" changed their silicon across their various 28nm generations - with similar performance gains (small, and mostly due to faster memory becoming available) versus the contemporary AMD generations. Their only BIG change during the GPU mining years was the move to 14/16nm with Pascal - they just did a better job of HIDING that fact than AMD did with the "rebranding" of THEIR GCN 28nm generations.
AMD's move to GCN from the older TeraScale was a much bigger architectural change than anything NVidia did in that period prior to Maxwell.
I never did understand why NVidia labeled the 750 as the same generation as the older 7xx stuff - by rights it should have been the GTX 940.
On the other hand, AMD should have changed their generation numbering when they moved to GCN, instead of having quite a few of the 7xxx series be TeraScale and the rest GCN.
BTW - I make my entire living from mining; if that's not "real" then you have a really weird definition of "real".
I'm fully aware of the GPU bust from a couple of years back - I had quite a bit of gear left over from the Litecoin and X11 days, but instead of selling it off I just started using it for what I'd ORIGINALLY bought most of it for, before I'd ever heard of Bitcoin, much less Litecoin. I "did my research" back THEN, when I was part of that history.
Good amount of unnecessary spacing for the win, right?
Unless you have boxes of 7970s laying around(!?!?!) you're in the extreme minority. That is to say you don't fucking matter. All the other points hold true. The random guy trying to mine on his six year old laptop or random card he upgraded from is not the forefront of development effort for devs on here. You know why? Because legitimate miners are actually using hardware that is recent and they have a lot of hashing power. Whether or not you can make $.10 per day doesn't matter around here. Put your card on eBay and let a big miner swallow it up.
Rarely competitive vs AMD? You mean like NeoScrypt, Lyra2, Quark (before it was ASIC'd), Lbry, and all the other algos you don't mine because AMD hardware isn't competitive on them? Ethereum is a niche case in the world of cryptos. It is literally the one algo so memory-bus restricted that it strangles any GPU put on it and can't be improved. Even Equihash is showing up the individuals who first thought hardware couldn't break 200 sols, and then that Nvidia hardware couldn't break 200 sols - this isn't fucking Ethereum. It's not completely bus limited; you can check your MCU usage on powerful cards... Nvidia hardware still has a lot of headroom as far as that goes.
Now you're just talking semantics. They redesign their chips and it doesn't count? Sorry, but I'm calling bullshit on that one. Visit any real hardware website that's not trying to shovel self-supporting agendas down your throat (BCT) and you know that's not true. Maxwell and Pascal have both had relatively good support for mining on Nvidia's side of things (when devs develop for Nvidia). Anything older than that, people should be happy if miners work for it at all.
If you live off $.10 a day then that's amazing; otherwise we have some discrepancies here. On one hand you're touting your old-ass hardware, which you want to be treated as brand spanking new and just as efficient as recent hardware, as 'stuff you have laying around', and on the other you're now saying you have boxes of it? This goes back to my original point: you made misinformed purchases, and now that returns are approaching electrical cost you're getting pissy and want devs to spend extra time specifically on your shitty hardware.
So curiously, if you have boxes of GPUs lying around, what did you use them for during the time they were unprofitable? Oh, you powered them down for a year? Do you know what opportunity cost is? Although, since you did your research back then, you should probably be pretty happy that the gear has lasted this long and you're ready to sell it to buy new stuff.
I also want to report a weird-ass bug that's been plaguing this miner since the beginning, and it still doesn't look like it's fixed. If you have a six-GPU rig, one of the GPUs will sometimes hash 20% slower than all the other GPUs. It doesn't really make sense what causes it; I can't figure it out even on virtually identical systems that differ only in GPU brand. I have a couple of systems like this.
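For anyone who wants to catch this automatically rather than eyeballing the miner's console, here's a minimal sketch of the check. The function name and the hashrate numbers are made up for illustration; the per-GPU hashrates would come from whatever your miner logs, and the 20% threshold matches the slowdown described above.

```python
# Hypothetical sketch: flag any GPU hashing well below its peers.
from statistics import median

def find_slow_gpus(hashrates, threshold=0.20):
    """Return indices of GPUs hashing more than `threshold`
    (as a fraction) below the median rate of the rig."""
    med = median(hashrates)
    return [i for i, h in enumerate(hashrates)
            if h < med * (1.0 - threshold)]

# Six-GPU rig where GPU 3 is ~20% down, as in the bug report:
rates = [285.0, 290.0, 288.0, 228.0, 287.0, 291.0]
print(find_slow_gpus(rates))  # [3]
```

Comparing against the median rather than the maximum keeps one fast outlier from flagging healthy cards.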
Oh and add the ability to run multiple threads from the same miner.
It's due to processing the monitor signal. I noticed that too: when I disconnect the monitor or turn it off, the hashrate on that card increases. But with this miner the difference is just 1-2 H/s, not that much; it behaves better than many others. You might also check the Nvidia system settings and enable the max performance setting; that also helped in certain cases on my setups. Give it a try if you haven't already.
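One way to confirm this on a live rig is to compare per-GPU SM clocks, which you can dump with `nvidia-smi --query-gpu=index,clocks.sm --format=csv,noheader,nounits` and watch with the monitor attached vs. detached. A small parsing sketch, with made-up sample output (the clock values and the 10% margin are illustrative assumptions, not measurements from this miner):

```python
# Hedged sketch: spot a card held at lower clocks (e.g. the one
# driving a monitor) by parsing nvidia-smi CSV output.
def parse_sm_clocks(csv_text):
    """Map GPU index -> SM clock in MHz from
    `nvidia-smi --query-gpu=index,clocks.sm --format=csv,noheader,nounits`."""
    clocks = {}
    for line in csv_text.strip().splitlines():
        idx, mhz = (field.strip() for field in line.split(','))
        clocks[int(idx)] = int(mhz)
    return clocks

# Made-up sample: GPU 2 is clocked noticeably lower than the rest.
sample = """0, 1911
1, 1923
2, 1607
3, 1911"""

clocks = parse_sm_clocks(sample)
fastest = max(clocks.values())
throttled = [i for i, c in clocks.items() if c < fastest * 0.9]
print(throttled)  # [2]
```

If the low-clocked index matches the card your monitor is plugged into, that points at the monitor-signal explanation rather than a miner bug.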
Yeah, I figured this out last night; it only happens when a monitor is connected. It doesn't happen on other miners like EQM, though.