Any computation that determines the maximum size of a block must have the same result no matter who calculates it or when they calculate it, and the inputs to the calculation must be indisputable. Otherwise, the chain will fork.
Eventually, mining will be completely dependent on transaction fees, and the maximum size of a block affects the value of those fees. So, a proposal for varying the maximum size should include some analysis of the effect on fees in potential future scenarios.
It can be argued that a 1 MB fixed cap may not be optimal (or even sufficient), but until a clearly better method can be demonstrated, I think it is likely to remain.
Thanks for replying.
I agree that the difficulty isn't really a measure of transaction demand. However (and I might be wrong about this too), I believe some relation might exist, though not necessarily a linear one.
I went looking at some data to get a more visual picture.
I checked these two charts:
https://ycharts.com/indicators/bitcoin_transactions_per_day
https://www.blockchain.com/charts/hash-rate
I noticed two significant things:
1) It appears that there is no linear relation between the hashing rate and the transactions per day.
One thing I think should be taken into account is that we are already facing the consequences of the scalability issue today. I think it's possible to say, with at least some confidence, that if
transaction fees weren't as high we would see many more transactions occurring. So the data we are seeing today may itself be "manipulated" by the very problem under discussion.
2) When major changes in trend occur, they show up in both the hash rate and the transaction 'demand'. Look at the end of June 2021, the end of April 2021, the end of October 2020, the end of May 2020,
and the end of March 2020: although not identical, the downward trends are similar. It should be said that there may be a third factor I didn't take into account that is affecting both of these rates.
So to sum this up: I don't believe the relation between the two factors needs to be linear. It could be inverted, a ratio, etc. (a quick way to check this is sketched below).
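To make that first observation a bit more concrete, here is a minimal Python sketch of how one could compare a linear correlation against a rank correlation between the two daily series. Everything here is illustrative: the function name is mine, and the numbers in the example are synthetic placeholders, not values taken from the charts above.

# Sketch: quantify how (non-)linear the relation between hash rate and
# daily transaction count is, using Pearson (linear) vs Spearman (monotonic)
# correlation. Replace the placeholder arrays with series exported from the
# two charts linked above.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def compare_correlations(tx_per_day, hash_rate):
    """Return (pearson_r, spearman_rho) for two equal-length daily series."""
    tx = np.asarray(tx_per_day, dtype=float)
    hr = np.asarray(hash_rate, dtype=float)
    pearson_r, _ = pearsonr(tx, hr)      # sensitive to a linear relation
    spearman_rho, _ = spearmanr(tx, hr)  # sensitive to any monotonic relation
    return pearson_r, spearman_rho

if __name__ == "__main__":
    # Synthetic placeholder values, NOT real chart data.
    tx_per_day = [240_000, 250_000, 230_000, 260_000, 255_000]
    hash_rate  = [120e18, 125e18, 118e18, 130e18, 128e18]
    print(compare_correlations(tx_per_day, hash_rate))

A large gap between the two coefficients would suggest the relation, if any, is monotonic but not linear; similar values near zero would suggest no meaningful relation at all.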
Regarding your point that the calculation must give matching results regardless of who computes it, where, and when: on the "who" and "where" I agree; on the time factor, that's the whole point here.
Who and where - these shouldn't matter, as long as the formula/algorithm is embedded in the network.
Time - that's what the calculation is all about: producing a specific value (though it doesn't have to be accurate to the decimal point) that represents the relation between demand (transaction count) and network power.
This value has to change over time, or nothing in the network would ever change (and the block size would remain the same).
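As a purely illustrative sketch of what such a value could look like, assuming demand is approximated by the average number of transactions per block over a recent window and network power by the current difficulty, something along these lines could be computed (every constant and reference point below is a placeholder of mine, not a concrete proposal):

# Hypothetical sketch only: a value relating demand (transactions per block,
# averaged over a recent window) to network power (current difficulty),
# recomputed periodically and clamped so the limit cannot run away.

BASE_SIZE = 1_000_000                      # bytes: today's 1 MB limit as the anchor
MIN_SIZE, MAX_SIZE = 1_000_000, 8_000_000  # arbitrary illustrative clamp range

def demand_power_value(avg_tx_per_block, difficulty,
                       ref_tx_per_block=2_000, ref_difficulty=25e12):
    """Ratio of observed demand to observed network power, each normalised
    against a fixed reference point so the value is dimensionless."""
    demand = avg_tx_per_block / ref_tx_per_block
    power = difficulty / ref_difficulty
    return demand / power

def block_size_limit(avg_tx_per_block, difficulty):
    """Map the value to a clamped block size limit, recomputed e.g. once per
    difficulty period so it changes over time but not on every block."""
    value = demand_power_value(avg_tx_per_block, difficulty)
    return max(MIN_SIZE, min(MAX_SIZE, int(BASE_SIZE * value)))

Since all the inputs here (transaction counts in past blocks, difficulty) are already part of the chain, every node recomputing this would get the same answer, which is the determinism requirement from your reply.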
The algorithm might take calculations made by many nodes and average them to get the best all-around estimate; this way there wouldn't be a hard fork. This is only speaking theoretically - I am not sure that, mathematically, it would actually be a good idea.
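Just to illustrate the aggregation idea, one possibility would be to take the median of the estimates rather than the mean, since it is less sensitive to outliers. This sketch is entirely hypothetical and does not claim to solve the consensus problem raised above:

# Sketch: combine block-size estimates from many nodes into one value.
# Using the median rather than the mean limits the influence of outliers,
# but this does not by itself guarantee every node converges on one value.
import statistics

def aggregate_estimates(estimates):
    """Return a single block-size limit from a list of node estimates."""
    if not estimates:
        raise ValueError("no estimates submitted")
    return int(statistics.median(estimates))

# Example: three nodes with slightly different local views.
print(aggregate_estimates([1_900_000, 2_000_000, 2_050_000]))  # -> 2000000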