What do you all think of defining time by global sync: the interval in which the smallest amount of information can (if that path is desired) move from everywhere to everywhere?
Had to read it three times before concluding that OP is proposing, I believe, to use global latency as a unit of time.
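If that reading is right, here's a rough sketch of the idea as I understand it: define one "tick" as the worst one-way delay between any pair of nodes, i.e. the shortest interval in which one bit can get from anywhere to anywhere. Node names and latency figures below are made up for illustration.

```python
# Hypothetical one-way delays in seconds between node pairs (made-up numbers).
LATENCY = {
    ("a", "b"): 0.080,
    ("a", "c"): 0.140,
    ("b", "c"): 0.210,
}

def global_tick(latency: dict[tuple[str, str], float]) -> float:
    """One 'tick' = the worst one-way delay between any pair of nodes:
    the shortest interval in which a bit can move from anywhere to anywhere."""
    return max(latency.values())

print(f"1 global tick = {global_tick(LATENCY) * 1000:.0f} ms")  # 210 ms here
```

Whether you measure one-way delay directly or approximate it as half an RTT is a separate question; the point is just that the unit falls out of the topology.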
So rather than starting from global sync, I think we should formalize a definition of time in terms of information-flow syncs: one in which global sync lags behind every smaller, faster piece of the network that agrees on a local sync, all the way down to individual computers as they exist today.
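A toy illustration of that lag, with completely made-up delay figures: if a sync round costs one worst-case delay for its scope, then each smaller scope completes many rounds per global round.

```python
# Assumed worst one-way delay (seconds) inside each scope; numbers are toy values.
SCOPES = {
    "single machine": 1e-9,
    "rack":           1e-6,
    "datacenter":     1e-4,
    "continent":      3e-2,
    "global":         2e-1,
}

def rounds_per_global_round(scopes: dict[str, float]) -> dict[str, float]:
    """How many local sync rounds fit inside one global round,
    assuming a round costs one worst-case delay for that scope."""
    global_tick = scopes["global"]
    return {name: global_tick / delay for name, delay in scopes.items()}

for name, rounds in rounds_per_global_round(SCOPES).items():
    print(f"{name:15s} ~{rounds:.0e} rounds per global round")
```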
A key aspect of a definition of time that multiple parties can agree on is that each party can verify it independently.
If the definition of time derives from the same events you are trying to time, it turns into a mere "event counter" and not anything usable in a scenario where you need to sync.
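For a concrete example of that failure mode, a Lamport clock is the classic "event counter": it's built from the events themselves, so it can order them but says nothing about elapsed time. A minimal sketch:

```python
class LamportClock:
    """Logical clock: a counter bumped by the very events it 'times'."""
    def __init__(self) -> None:
        self.counter = 0

    def local_event(self) -> int:
        self.counter += 1
        return self.counter

    def receive(self, sender_counter: int) -> int:
        # Merge rule: jump past whatever the sender had already seen.
        self.counter = max(self.counter, sender_counter) + 1
        return self.counter

a, b = LamportClock(), LamportClock()
t1 = a.local_event()   # a's counter: 1
t2 = b.receive(t1)     # b's counter: 2, ordered after t1
# These counters capture causal order, but nothing about how much wall
# time passed between t1 and t2, so they can't drive a sync deadline.
```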
Either that or I still have no idea what you are proposing...