In Algorithmic Information Theory [1] one studies the minimal number of bits needed to describe objects.
Normally the objects studied are just binary strings, for instance the first million binary digits of pi.
Random as those digits may look, the fact that a small, fixed-size program can generate them shows that they are anything but (see the sketch after this paragraph).
A truly random string of n bits would take a program of length at least n bits to describe it. Since there are only 1 + 2 + ... + 2^{n-1}
= 2^n - 1 possible descriptions/programs shorter than n bits, but 2^n strings of length n, the pigeonhole principle guarantees that such an incompressible bitstring must exist.
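To make the pi example concrete, here is a minimal sketch in C (matching the C99 choice proposed below) that prints hexadecimal digits of pi via the Bailey-Borwein-Plouffe digit-extraction formula; each hex digit is exactly 4 binary digits. The whole program is a few hundred bytes, yet in principle it describes an unbounded prefix of pi's binary expansion (in this double-precision form rounding limits how far it stays accurate, so treat it as an illustration rather than a million-bit generator):

#include <stdio.h>
#include <math.h>

/* 16^p mod m by binary exponentiation, in doubles (exact for small m). */
static double pow16_mod(int p, double m) {
    double r = 1.0, b = fmod(16.0, m);
    while (p > 0) {
        if (p & 1) r = fmod(r * b, m);
        b = fmod(b * b, m);
        p >>= 1;
    }
    return r;
}

/* Fractional part of sum_{k>=0} 16^(n-k) / (8k+j), one BBP sub-series. */
static double series(int j, int n) {
    double s = 0.0;
    for (int k = 0; k <= n; k++)
        s = fmod(s + pow16_mod(n - k, 8.0 * k + j) / (8.0 * k + j), 1.0);
    for (int k = n + 1; k <= n + 8; k++)   /* a few tiny tail terms */
        s += pow(16.0, n - k) / (8.0 * k + j);
    return fmod(s, 1.0);
}

int main(void) {
    printf("pi = 3.");
    for (int n = 0; n < 32; n++) {         /* hex digit at position n+1 */
        double x = fmod(4.0 * series(1, n) - 2.0 * series(4, n)
                        - series(5, n) - series(6, n), 1.0);
        if (x < 0.0) x += 1.0;
        printf("%X", (int)(x * 16.0));
    }
    printf("\n");
    return 0;
}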
What if we want to extend this notion of descriptional complexity to blockchain protocols?
This would allow us to quantitatively compare the complexity of different coins.
It remains to answer two questions.
1) What programming language do we use to encode the protocol?
2) What is it that the program would do?
For 1) we could pick a common, widely implemented language like C99.
For 2) we could have a program that interacts with the network and with a payer to receive 1 coin and see it confirmed, as in the sketch below.
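To be explicit about 2), here is a hypothetical sketch of that task program, where wallet_create, request_payment and payment_confirmed are invented stub names that merely simulate the interaction so the sketch compiles and runs:

#include <stdio.h>
#include <stdbool.h>

/* Hypothetical protocol API; the stub bodies below only simulate success.
   A real encoding would have to implement everything behind these calls:
   networking, key generation, transaction formats, and the consensus rules
   needed to recognize a confirmation. */
typedef struct { char addr[64]; } Wallet;

static Wallet wallet_create(void)          { Wallet w = {"demo-address"}; return w; }
static void   request_payment(Wallet *w)   { printf("payer -> 1 coin -> %s\n", w->addr); }
static bool   payment_confirmed(Wallet *w) { (void)w; return true; /* stub: poll the chain */ }

int main(void) {
    Wallet w = wallet_create();            /* generate a receiving address     */
    request_payment(&w);                   /* ask the payer for exactly 1 coin */
    while (!payment_confirmed(&w))         /* follow the network until the     */
        ;                                  /* payment is confirmed             */
    printf("1 coin received and confirmed\n");
    return 0;
}

Under the proposed definition, a coin's complexity would then be the bit-length of the shortest complete C99 program of this shape, with the stubs replaced by a real implementation of the protocol.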
Does this seem like a reasonable definition? Does it capture all the complexity of a blockchain protocol? Could it be further simplified?
[1] https://en.wikipedia.org/wiki/Algorithmic_information_theory