I showed that, in a system without hierarchical structure, failure to converge on a fitness landscape occurs once entropy exceeds a critical level. I demonstrated this with a simplified biological model in which the result can be shown definitively and mathematically. The same model shows that a lack of entropy also results in a failure to converge, and that increasing entropy increases the rate of convergence until an inflection point is reached. Pushing entropy beyond this inflection point slows convergence; beyond the error threshold, the system does not converge at all.
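The qualitative pattern can be reproduced in a toy simulation. This is a minimal sketch, not the model from our original exchange: it evolves bit strings toward an all-ones target under truncation selection, with the per-bit mutation rate standing in for "entropy". The genome length, population size, selection scheme, and the three rates compared are all illustrative assumptions on my part.

```python
import random

def evolve(mutation_rate, genome_len=20, pop_size=100, generations=150, seed=0):
    """Evolve bit-string genomes toward the all-ones target.

    Returns the population's mean fitness (fraction of correct bits)
    after the given number of generations. mutation_rate is the
    per-bit flip probability, our stand-in for entropy.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    fitness = lambda g: sum(g)  # bits matching the all-ones target
    for _ in range(generations):
        # Truncation selection: only the fitter half reproduces.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        new_pop = []
        for _ in range(pop_size):
            child = list(rng.choice(parents))
            for i in range(genome_len):
                if rng.random() < mutation_rate:
                    child[i] ^= 1  # mutation injects entropy
            new_pop.append(child)
        pop = new_pop
    return sum(fitness(g) for g in pop) / (pop_size * genome_len)

# Zero entropy: the population fixes on the best initial genome and
# cannot improve past it. Moderate entropy: near-complete convergence.
# Extreme entropy: selection cannot retain information and mean
# fitness stays near the random baseline of 0.5.
low, mid, high = evolve(0.0), evolve(0.02), evolve(0.4)
```

In this sketch, `mid` exceeds both `low` and `high`: too little entropy stalls below the optimum, a moderate amount converges, and past the error threshold the accumulated information is destroyed faster than selection can rebuild it.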
You have countered that this model is not applicable to real human societies, which are obviously far more complex than simple nucleic acid chains. You also countered that such extreme entropy will never occur in human societies, since we naturally self-organize to follow leaders (thought leaders, religious leaders, politicians, gang leaders, warlords, etc.).
The applicability of the model is a fair challenge. However, I see little reason to think that the same general rules do not apply to our more complex society. You correctly pointed out that there is an asymmetry of systemic risk today, with the risk of collectivism far exceeding the risks of individualism. I agree. However, the reason our current system is dangerous is not that it is excessively rigid. A system that was merely excessively ordered would still eventually (if very slowly) converge on optimal outcomes. The system poses a systemic risk because it is grossly unsustainable.
The true risk arises from the possibility of complete systemic collapse, which raises the specter of the Mad Max scenario. It is possible that an unstable order will break down so completely that entropy rises rapidly and uncontrollably. Our society, like the simple biological model, can also exceed an error threshold. This occurs when knowledge is systemically destroyed rather than created: a worldwide loss of electricity, nuclear holocaust, survival of the fittest with mass starvation. These are all extreme-entropy scenarios in which knowledge would be systemically lost.
I would agree with this, and add that order becomes a cancer the moment it becomes unsustainable and incapable of self-correction. Chaos is the risk that arises from the inevitable death of that cancer.