Thank you.
Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?
Yes. Though you should also recognize that block size and bandwidth use are not so tightly tied.
Network transfer and disk storage can compress the data; the block size is measured after decompression.
Fair enough: block size may not be the best parameter to tweak to maintain the stated "bandwidth and disk space" goal, but it is a technically simple one to adjust.
Do you believe that there exists somewhere in the blockchain a metric, let's call it X, which would serve as a good predictor of "the bandwidth and disk space an average enthusiast can afford"?
I think this is the same question, though you may disagree: Do you believe that this metric X has a causal relationship with "the bandwidth and disk space an average enthusiast can afford"?
The Socratic inquiry is a bit pedantic, don't you think?
Skip to your point please.
Very well.
No metric that can be gleaned from the blockchain has a causal relationship with "the bandwidth and disk space an average enthusiast can afford", and therefore any such predictor has a high danger of being either too restrictive or not restrictive enough.
Using Nielsen's Law also carries a danger of being inaccurate; however, given that it has at least been historically accurate, I find this danger much lower.
Do you disagree? (let's leave ossification out of this just for the moment, if you don't mind)
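To make the comparison concrete, here is a rough sketch of the kind of fixed Nielsen's-Law schedule I mean. The 1 MB base, the start year, and the ~50%-per-year figure are illustrative assumptions only, not a concrete proposal.

```python
# Illustrative only: a fixed schedule in the spirit of Nielsen's Law
# (~50% bandwidth growth per year). The 1 MB base and the start year
# are assumptions for the sake of the example, not a concrete proposal.

BASE_SIZE_BYTES = 1_000_000   # assumed starting limit
ANNUAL_GROWTH = 1.5           # Nielsen's Law: roughly 50% per year
START_YEAR = 2015             # assumed reference year

def max_block_size(year):
    """Max block size under a fixed Nielsen's-Law-style schedule."""
    years_elapsed = max(0, year - START_YEAR)
    return int(BASE_SIZE_BYTES * ANNUAL_GROWTH ** years_elapsed)

for y in (2015, 2020, 2025):
    print(y, max_block_size(y))
```

The point is only that such a schedule grows at a predetermined rate regardless of what actually happens on the network.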
Thank you. You saved yourself a lot of time. I had enough of the Socratic method in law school. And we'll set aside ossification for your benefit, even though it cuts against your position here.
Yes, I disagree.
Both block size and transaction fees may be better tools than Nielsen's Law, and the combination may be better still. Dismissing inquiry on the matter is a missed opportunity.
Having worked in multinational telcos for a few decades designing resilient, scalable systems serving 193+ countries, managing teams of security software engineers, and holding responsibility for security and capacity management, I find these concepts are not so foreign. The ability of something like the block chain to provide consolidated data for right-sizing an application over time for its audience is ripe fruit.
Nielsen's Law is less fit for purpose:
1) It has measured fixed-line connections.
- Usage demographics have changed over the period of history it covers. More connections are mobile now than before, and telco resources and growth have shifted accordingly. There are other shifts to come. These are not accommodated in the historical averages, nor are they factored into the current ones under Nielsen.
2) It is not a measure of the average enthusiast.
- It measures a first-world enthusiast, whose means have improved with age, in a rich nation with good infrastructure in a time of peace. This is not average with respect to place and time through history.
3) Following bandwidth growth is not the only function of the max block size, though tying it to the average enthusiast's capabilities (if that were possible) would be a suitable way of addressing the other functions.
- Ultimately it must accommodate the transactions whose fees are sufficient to maintain the network, and it must not constrain reasonable commerce. Those will be business decisions, which may depend on the capacity and cost of the Bitcoin network and its associated fees, and which may radically bend the curve in one direction or another. A fixed, non-responsive rate cannot flex with a changing environment. Requiring central decision makers to accommodate such changes (or not) puts perverse incentives on Bitcoin developers.
I get that the core devs (and former core devs) do have to deal with a lot of crazies. But what is not needed is the "either you agree with me or you are stupid, crazy, or lazy" dismissal of doing real science instead of mere technicians' work. Science is hard, but it is often worth it.
I recall Gavin's talk in San Jose in 2013 being a lot more nuanced on this matter, and it looked like real solutions were coming, with a future-proof, market-sensitive approach. That conference was better in many ways than the one TBF held this year in Amsterdam.
That earlier stance was optimistic and well founded, yet it was abandoned. The explanations for why it was abandoned don't seem compelling at all.
In my first proposal in this thread, I mirrored Gavin's Nielsen's-Law approach with a simple algorithm that replicated it in effect but took its cues from the block chain (so growth would stop or accelerate if real-world circumstances changed). This was simply an exercise to show that it would be easy enough to do.
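For illustration, a minimal sketch of that kind of rule might look like the following. The 50% annual ceiling, the 2016-block adjustment window, and the utilization thresholds are assumptions chosen for the example, not the parameters of my original proposal.

```python
# Minimal sketch of an adaptive max-block-size rule that takes its cues
# from the chain itself while capping growth at a Nielsen's-Law-like rate.
# All constants here are illustrative assumptions, not a concrete proposal.

BLOCKS_PER_YEAR = 52_560        # ~10-minute blocks
ADJUSTMENT_WINDOW = 2_016       # blocks per adjustment period (assumed)
MAX_ANNUAL_GROWTH = 1.5         # Nielsen's-Law-style ceiling: ~50% per year
GROW_IF_FULLER_THAN = 0.90      # grow only when recent blocks are this full (assumed)
SHRINK_IF_EMPTIER_THAN = 0.50   # back off when demand falls away (assumed)

# Per-adjustment growth factor that compounds to the annual ceiling.
MAX_STEP = MAX_ANNUAL_GROWTH ** (ADJUSTMENT_WINDOW / BLOCKS_PER_YEAR)

def next_limit(current_limit, recent_block_sizes):
    """Return the max block size for the next adjustment window.

    recent_block_sizes: actual sizes (bytes) of the last ADJUSTMENT_WINDOW blocks.
    """
    utilization = sum(recent_block_sizes) / (len(recent_block_sizes) * current_limit)
    if utilization >= GROW_IF_FULLER_THAN:
        return int(current_limit * MAX_STEP)    # demand is there: grow, but capped
    if utilization <= SHRINK_IF_EMPTIER_THAN:
        return int(current_limit / MAX_STEP)    # demand fell off: back off
    return current_limit                        # otherwise hold steady
```

The effect is that the limit tracks roughly a Nielsen's-Law trajectory while demand keeps up, but stops growing (or retreats) if real-world usage does not follow the projection.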