Yes, we very much want to avoid doing something that is only reasonable for today and passing the problem along to future generations. Hard forks are not nice, and having a committee decide on an appropriate maximum block size every so often is out of the question (a central point of failure).
Unfortunately, there's no easy way of doing what you'd like. Bitcoin itself has no conception of a market agent; it certainly can't distinguish between them or count them. Bitcoin itself can't know if the system is highly decentralised or if all the addresses and all the hash power are controlled by a single party.
We might be able to come up with a probabilistic or economic solution but no algorithm can measure decentralisation with certainty. Perhaps some blockchain-based metric may suggest that the system is decentralised (or under attack by an economically irrational agent) with high probability. However, I expect that any such algorithm, no matter how subtle, will yield an equally subtle attack where a single agent attempts to appear to be many agents.
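To make the point concrete, here is a minimal sketch (my own illustration, not taken from any proposal) of the kind of on-chain metric one might try: a Herfindahl-style concentration index over the payout addresses of recent block producers. The function and the choice to key on coinbase addresses are assumptions for the sake of the example, and the last few lines show exactly why it fails:

```python
from collections import Counter

def concentration_index(coinbase_addresses):
    """Herfindahl-Hirschman index over payout addresses of recent blocks.
    Roughly 1/N for N equal producers, 1.0 for a single monopoly miner."""
    counts = Counter(coinbase_addresses)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# Ten blocks mined to ten apparently distinct addresses -> "looks" decentralised.
honest_looking = [f"addr{i}" for i in range(10)]
print(concentration_index(honest_looking))   # 0.1

# But a single miner can trivially rotate payout addresses (a Sybil attack),
# producing exactly the same score while controlling 100% of the hash power.
sybil_miner = [f"sock_puppet_{i}" for i in range(10)]
print(concentration_index(sybil_miner))      # also 0.1
```

Any metric of this flavour measures how many identities appear on the chain, not how many independent parties stand behind them, which is the subtle attack I mean.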
There are two attack-resistant proposals under discussion in this thread:
https://bitcointalksearch.org/topic/increasing-the-block-size-is-a-good-idea-50year-is-probably-too-aggressive-815712
Both are self-correcting, based on the block chain, and can serve to avoid centralisation of decision making on this matter for the future.
One is based on block size, the other on TX fees.
Both would require a preponderance of interests in order to "attack" them, in much the same way a 51% attack does.
The key to resilience here lies both in using self-correcting market influence and in limiting the amount of variance achievable. These are also the effective mechanisms of the difficulty adjustment algorithm.
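As a rough illustration of that shape of mechanism (a sketch of my own, not the exact rule from either proposal), the choice of the median as the market signal and the ±10% cap per adjustment period are placeholder assumptions:

```python
from statistics import median

def adjust_max_block_size(current_max, recent_block_sizes,
                          target_ratio=2.0, max_change=0.10):
    """Retarget the block size cap from observed usage, analogous to
    difficulty retargeting.

    - recent_block_sizes: sizes (bytes) of blocks in the last adjustment
      period -- the self-correcting market signal read from the chain.
    - target_ratio: keep the cap at some multiple of the median block size.
    - max_change: clamp each adjustment to +/-10%, limiting the variance
      achievable so no single period can be gamed into a large swing.
    """
    desired = median(recent_block_sizes) * target_ratio
    lower = current_max * (1 - max_change)
    upper = current_max * (1 + max_change)
    return int(min(max(desired, lower), upper))

# Example: blocks averaging ~590 kB under a 1 MB cap nudge the cap up,
# but by no more than 10% in one period.
print(adjust_max_block_size(1_000_000, [550_000, 620_000, 600_000, 580_000]))
# -> 1100000 (desired ~1.18 MB, clamped to +10%)
```

Using a median rather than a mean means a minority of miners stuffing or emptying their own blocks moves the target very little, and the clamp plays the same role as the bounded step in difficulty retargeting: pushing the cap anywhere quickly requires sustained action by a preponderance of the hash power.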