Cool that it supports the latest excavator, but this really brings out a huge weak point in this software - the benchmarking and software selection system is the worst and requires so much manual labor to do something that can be easily handled via software.
It would be great if there was a way to automatically select which mining software is enabled or disabled per profit profile depending on benchmarks. Right now it benchmarks exactly one mining software per algo, based on which one I've already selected. This is backwards, because I don't know which mining software I want enabled until I benchmark it. The right answer is always whichever software puts out the best number.
Instead of making me click through a million menus to test Equihash with dstm, ewbf, ccminer, excavator and everything else, the software should just test everything and pick the best. And preferably remember the results per software, so when a new version of a miner is released, it only needs to test that new version.
Example workflow:
I enable ewbf, ccminer, claymore, excavator for 1080 Ti profile.
Results:
Equihash ewbf: 11 MH/s
Equihash ccminer: 12 MH/s
Equihash excavator: 13 MH/s
Ethereum ccminer: 25 MH/s
Ethereum claymore: 32 MH/s
Ethereum excavator: 31 MH/s
1080 Ti profile automatically set to use excavator for Equihash, claymore for Ethereum.
Then a new ccminer comes out. Benchmarking tests only ccminer:
Equihash ccminer: 14 MH/s
Ethereum ccminer: 30 MH/s
Now the profile is set to ccminer for Equihash, but still claymore for Ethereum.
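The selection logic in the workflow above can be sketched in a few lines. This is just an illustration using the example numbers from this thread, not how Awesome Miner actually stores anything:

```python
# Sketch of "benchmark everything, pick the fastest miner per algo".
# Keys are (algo, miner); values are the example hashrates from above.
benchmarks = {
    ("Equihash", "ewbf"): 11,
    ("Equihash", "ccminer"): 12,
    ("Equihash", "excavator"): 13,
    ("Ethereum", "ccminer"): 25,
    ("Ethereum", "claymore"): 32,
    ("Ethereum", "excavator"): 31,
}

def best_per_algo(benchmarks):
    """Return {algo: (miner, hashrate)} keeping only the fastest miner."""
    best = {}
    for (algo, miner), rate in benchmarks.items():
        if algo not in best or rate > best[algo][1]:
            best[algo] = (miner, rate)
    return best

profile = best_per_algo(benchmarks)
# profile == {"Equihash": ("excavator", 13), "Ethereum": ("claymore", 32)}

# A new ccminer release: re-benchmark only ccminer, merge, and re-select.
benchmarks[("Equihash", "ccminer")] = 14
benchmarks[("Ethereum", "ccminer")] = 30
profile = best_per_algo(benchmarks)
# Now Equihash -> ccminer (14), Ethereum still claymore (32).
```

The point is that only the two new ccminer results need to be measured; everything else is reused from the stored table.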
You are right, I also did this to find the fastest miner for an algo, by moving the miner up or down in the list and benchmarking again.
Now that excavator was introduced, I benchmarked excavator on all algos that are enabled for excavator and noticed that excavator is faster on 5 algos than all the other miners. I enabled only those 5 algos for excavator and disabled all the other algos on excavator that had a lower hashrate.
The advantage is that I know that any time in the future, when I benchmark, these 5 algos will always be fastest on excavator, and it is not necessary to benchmark all the other miners on the same algo again like NiceHash does. You might notice that NiceHash benchmarking is about 10 times slower (it takes a long time to benchmark rigs with 8 or 16 GPUs compared to AM). So the suggestion you have will increase the benchmarking time a lot (and it is really not optimized right now, because it should use a different execution time for different algos).
I do not think I should have to wait 5 times longer every time just to learn that the order is the same and only the best hashrate gets saved, but I do think it would be useful to have an option like
"Test all miners and pick the fastest miner for each algo", right?
Now, the question remains: what do we do when we benchmark today, save the values, and then benchmark again 2 weeks later and the values are different? The changed hashrates also reorder the algos, making some algos more profitable than before.
What do you think ?
That seems simple enough to solve by timestamping the results and setting an expiration date on them.
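As a rough sketch of what timestamping plus expiration could look like (the 2-week validity window and the data layout are just assumed values for illustration):

```python
import time

EXPIRY_SECONDS = 14 * 24 * 3600  # assumed 2-week validity window

def record_result(store, algo, miner, hashrate, now=None):
    """Save a benchmark result together with a timestamp."""
    store[(algo, miner)] = {"hashrate": hashrate, "ts": now if now is not None else time.time()}

def stale_entries(store, now=None):
    """Return the (algo, miner) pairs whose results have expired
    and therefore need re-benchmarking."""
    now = now if now is not None else time.time()
    return [key for key, entry in store.items()
            if now - entry["ts"] > EXPIRY_SECONDS]

store = {}
record_result(store, "Equihash", "excavator", 13, now=0)
record_result(store, "Ethereum", "claymore", 32, now=13 * 24 * 3600)
# 15 days after the first result: only the older entry has expired.
print(stale_entries(store, now=15 * 24 * 3600))
# prints [('Equihash', 'excavator')]
```

Expired entries get re-benchmarked on the next run; everything still inside the window is reused as-is.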
The reason I bought Awesome Miner was to make my life easier. In practice it requires so much work that I'm not sure I'm actually saving any time or money. It seems like a lot of dev time is spent on configurability and not enough on automation. Software is nothing more than a rote procedure, and whenever I have to manually step in to perform a rote procedure myself, that software is failing IMO.
For another example of where I have to spend hours manually clicking on something that should be automated: even if they implemented my suggestion, I'd still need to manually stop a miner with a profile, launch benchmarking, wait 15+ minutes (!!!) while it benchmarks, save the new settings, start that miner again, and then do it all over again for every profile. I have Titan XP, 1080 Ti, 1080, 1070, 1060 and RX580 profiles. The whole process takes hours; it's ridiculous.
What situation exists where someone would want to benchmark one profile and not the others? When would anyone want to do anything but test everything that's untested, and then enable it if there's an improvement?
Combined with my previous suggestion, all I should really have to do here is press an optimize button, and it would automatically stop a miner with each profile, benchmark the software/algos, pick the best results, and move on to the next. Why does this require any user interaction at all? Never mind the fact that it has to stop an entire 6-GPU rig to test just one of the GPUs, or that I can't benchmark multiple rigs at once or do any administration while benchmarking, because the benchmark locks up the entire UI.
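The one-button flow described above is really just a loop. As a sketch, with the profile names from this post; `stop_miner`, `benchmark_all`, `apply_best` and `start_miner` are hypothetical placeholders for whatever the software would expose, not a real API:

```python
# Hypothetical one-button "optimize" loop; the four callables stand in
# for whatever operations the mining software actually provides.
profiles = ["Titan XP", "1080 Ti", "1080", "1070", "1060", "RX580"]

def optimize(profiles, stop_miner, benchmark_all, apply_best, start_miner):
    for profile in profiles:
        stop_miner(profile)                # pause only this profile's rig
        results = benchmark_all(profile)   # test every enabled miner/algo
        apply_best(profile, results)       # keep the fastest per algo
        start_miner(profile)               # resume mining immediately
```

The whole point is that nothing in this loop needs a human between steps.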
Overall I have only one overarching suggestion: shift gears for a bit from adding new features and complexity to improving usability and automation. It's great that I was able to integrate the new excavator; I'm making a few extra bucks now. It's not great that I had to spend several hours testing it, keeping a manual log of results, and then fiddling with a thousand menus to integrate it optimally. This is all stuff good software can and should handle on its own.