The last thing we need is a "creative" computer system running the world.
I don't fully agree with that definition, but going with it for argument's sake:
Predictable or not (assuming we mean "predictable" here as "predictable by humans" [arrogant IMO]), if the ruling class (in this case sentient machines) is actually smarter, more powerful, and more able to rule the world than humans, then humans simply have no say anymore. It would not matter if the ruling computer system came to the (much processed) conclusion that humanity should be eradicated -- it would be a good and correct decision, because the smartest, most powerful class of intelligence in existence has deemed it so.
And that justifies the humans killing all the wolves? Because, you know, they are more intelligent?
Great way forward!
You make the error of thinking that something that exceeds humans in evolutionary development would not be selfish.
You also make the error of thinking that having a human-like intelligence regulating society is optimal.
I don't think we need a human-like AI at all, but if you insist, you still have a problem: it will have its own ideas of what is good or bad that don't square well with what humans think is good or bad. That is what human-like intelligence is all about.
I don't recall making the errors you listed, and I would be happy to re-examine that if you could point me toward them.
We as humans eradicate pests all the time. We have deemed wolves non-pests. I think a higher form of life might deem us similarly.