You have by now clearly demonstrated, in our discussions over the past days, that you do not understand or appreciate the critical importance of the entropic force and the irreversibility of thermodynamic processes.
Well, I have the impression that you deify the notion of entropy. It is certainly an important and fascinating concept, and it sneaks into areas of research where one wouldn't expect it at first sight. But it isn't a magic word. Entropy has a very well-defined meaning, and using the word outside of that meaning sometimes yields speculative theory, but most of the time, charlatanism. There's another word like that, "quantum", which is often abused in the same way. It is not always clear whether an author using such a word is merely being casually speculative while having a clear idea in mind, or is using the word "magically". I've seen you use the word "entropy" in contexts where I cannot link it in any clear way to its definition. I don't know whether, at those moments, you have in mind a clear link to the definition that I fail to see, or whether you use it as a "magic word" because you're somehow fascinated by it: you "feel in your bones" that it must have something to do with the matter at hand, without knowing yourself what it could mean there.
Entropy is just "lack of knowledge about an event". The event has (had, will have, ...) a specific state/value/outcome, and you don't know which one it is; you only know it must be "one of several possibilities".
Entropy is hence a relationship between the entity that carries the event and the entity that is ignorant of it. What is entropy for me is maybe not entropy for you (and in that case, YOUR entropy, for me, is at least as big as the entropy I have about the system, which is the essence of the second law).
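To make that concrete in Shannon's terms: entropy is a property of a probability distribution, and the distribution belongs to whoever is ignorant, not to the system itself. A minimal sketch, with two observers of the same coin (the distributions are made up for illustration):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The same physical coin, two different states of knowledge about it.
alice = [0.5, 0.5]     # Alice knows nothing about the outcome
bob   = [0.99, 0.01]   # Bob peeked and is almost certain

print(entropy_bits(alice))   # 1.0 bit
print(entropy_bits(bob))     # ~0.08 bits
```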
Now, what can we be ignorant of? Almost anything, which is exactly why this notion is so all-pervading. "Physical entropy" is our ignorance of the exact microstate of a physical system that is compatible with a given macrostate.
If I have a cup of hot water, I know its macrostate: it is "hot water". However, I don't know where each water molecule is, exactly, on a microscopic scale. This ignorance is the entropy that hot water has. Note that it depends on what I count as the microstate. I also don't know the nuclear state of the hot water, but usually I don't care, so I don't include it in the physical entropy. When I have to do nuclear things with water, it suddenly has much more entropy than when I do chemistry with it. The heat I could get out of water through nuclear fusion is not accounted for in the usual entropy values one assigns to hot water, because most of the time we don't consider that kind of outcome.
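The thermal part of that ignorance is a textbook computation. A sketch of the entropy I add to the cup by heating it, counting only the thermal microstates, exactly as the usual tables do:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
mass = 0.25             # kg of water in the cup
c_water = 4186.0        # specific heat of liquid water, J/(kg K)

# Heating from room temperature to near boiling:
# dS = m c dT / T  integrates to  m c ln(T2/T1).
dS = mass * c_water * math.log(373.0 / 293.0)   # J/K
print(dS)                                       # ~252 J/K
print(dS / (K_B * math.log(2)))                 # ~2.6e25 bits of extra micro-ignorance
```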
In the same way that I can be ignorant of physical microstates, I can be ignorant of other things, like the memory state of a computer, or the current state of a finite-state machine. Take a whole network as one big finite-state machine; my ignorance of the exact state of that network can then be measured in entropy units (see the sketch after the next paragraph).
I can ignore "complex outcomes", like throwing a dice. I know that if you throw a dice (or if you threw one yesterday, but didn't tell me the outcome), that it must be one of 6 outcomes, but I don't know which one. Entropy.
In the vast majority of cases, the entropy of our ignorance of a physical state is MUCH, MUCH larger than anything "computer-related". There's more entropy in my hot cup of water than in the whole internet.
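One can put rough numbers on that claim. A back-of-the-envelope sketch, where the size of the internet is a loose assumption (about a zettabyte of stored data) and the water value is the standard tabulated molar entropy:

```python
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
S_MOLAR_WATER = 70.0        # standard molar entropy of liquid water, J/(mol K)

cup_moles = 250.0 / 18.0    # a 250 g cup of water
S_cup = cup_moles * S_MOLAR_WATER            # thermodynamic entropy, J/K
S_cup_bits = S_cup / (K_B * math.log(2))     # the same quantity in bits

internet_bits = 8 * 1e21    # assumed: ~1 zettabyte of stored data

print(f"{S_cup_bits:.2e} bits in the cup")          # ~1e26 bits
print(f"{internet_bits:.2e} bits on the internet")  # ~8e21 bits
print(S_cup_bits / internet_bits)                   # the cup wins by ~4 orders of magnitude
```

Even granting the internet a few more orders of magnitude, the cup still wins comfortably.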
But, and this is important, we now come to the complementary notion: information. Information is what offsets entropy (and is hence the same kind of quantity). I need to receive information in order to diminish my entropy about something. If you tell me that the thrown die gave a 4, then my entropy about that die is entirely lifted, by an equal amount of information received.
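In bits, reusing the die: a complete message buys back all of my ignorance, a partial one only some of it. A small sketch:

```python
import math

H_prior = math.log2(6)            # full ignorance about the die: ~2.585 bits

# Complete message: "the outcome is 4" -> one possibility left, entropy 0.
info_full = H_prior - math.log2(1)
print(info_full)                  # ~2.585 bits received, entropy fully lifted

# Partial message: "the outcome is even" -> {2, 4, 6} remain.
H_posterior = math.log2(3)
print(H_prior - H_posterior)      # exactly 1 bit received, ~1.585 bits remain
```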
But information has another aspect: it is tied to "what I care about". Information also has to do with what one calls "macrostates". If I have hot water, its physical entropy is huge, but I don't care; I only need the macrostate to make coffee. So my entropy about the macrostate is essentially 0, because I KNOW the macrostate. I don't need the microstate if I want to make coffee.
So the fact that, when I pour a cup of hot water over my instant coffee, I am "handling huge amounts of entropy" doesn't matter for what I'm doing.
I've seen you fall into this trap several times, as with the "entropy of the network with latencies calculating the digits of Pi". I don't care, because the only macrostate I'm interested in is the value of the digits of pi; the "microstates of the network" don't matter. They are the equivalent of the hot cup of water and its microstates: I don't have to know them to make coffee.
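The distinction is easy to quantify: group the microstates into the macrostates you care about, and compute the entropy of each description. A toy sketch, with invented probabilities and an invented coarse-graining:

```python
import math
from collections import defaultdict

def entropy_bits(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def macro_of(microstate):
    """Invented coarse-graining: which macrostate a microstate belongs to."""
    return "hot" if microstate < 7 else "cold"

micro = {i: 1 / 8 for i in range(8)}      # 8 equally likely microstates

macro = defaultdict(float)
for state, p in micro.items():
    macro[macro_of(state)] += p           # lump microstates into macrostates

print(entropy_bits(micro.values()))   # 3.0 bits of ignorance about the microstate
print(entropy_bits(macro.values()))   # ~0.54 bits about the part I care about
```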
Replication is not the same as a diversity of irreversible paths.
Chaotic irreversibility is nothing but the amplification of a specific microstate up to the macro level. Ignorance of the microstate then results in ignorance of the macrostate.
There are usually two opposing dynamics at work: chaotic dynamics, which "brings microstates up to the macro level", and "statistical averaging" over large populations, which turns individual macrostates into system microstates.
The first means that one can "lose information"; the second means that "this information doesn't really matter".
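Both dynamics fit in a few lines. A sketch using the logistic map as a stand-in for a chaotic system like the turbulence in the water flow (the map and the numbers are illustrative, not a model of actual coffee-making):

```python
import random

def logistic(x, steps=60, r=4.0):
    """Iterate the chaotic logistic map."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Chaotic dynamics: a microscopic difference is amplified to the macro scale.
a = logistic(0.3)
b = logistic(0.3 + 1e-12)          # an unknowably small microscopic difference
print(abs(a - b))                  # order 1: micro-ignorance became macro-ignorance

# Statistical averaging: over a large population the macro description is stable.
random.seed(0)
outcomes = [logistic(random.random()) for _ in range(100_000)]
print(sum(outcomes) / len(outcomes))   # ~0.5 every run: individual chaos averages out
```

The same run shows both faces: any single trajectory is unpredictable from the macro description, while the population average is reproducible.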
If I use hot water and pour it to make coffee, I not only have the physical entropy of the hot water, I also have the chaotic dynamics of the turbulence in the water flow. And in the end, none of it matters: I end up with a cup of coffee.
However, if the turbulence is too violent, say, I can burn myself, have to go to the hospital, and undergo a change in my life path that was difficult to foresee. Yet at the level of humanity, on average, 0.001% of people making coffee burn themselves, so this becomes again a kind of macrostate.
So one cannot just be "amazed at the amount of entropy in a system". There's more entropy in your cup of hot water than there is in the whole internet. It doesn't always matter.