You keep saying that it is opt-in; however, it is only opt-in for the sender, not the receiver.
Not so: the merchant can simply ignore the transaction until it is confirmed, as they already do for all manner of unusual, nonstandard, or unconfirmed-input transactions; otherwise their acceptance of zero-conf is no more secure than RBF (if it ever is...). And doing this is relatively harmless, because opt-in RBF transactions do not need to suffer significant confirmation delays.
My point still stands: as I said before, ignoring a transaction until it is confirmed is not suitable for the retail environment I was using as an example.
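For readers following along: under BIP125 the opt-in signal is carried in the transaction itself, so a receiver can detect it and apply a stricter policy only to replaceable transactions. A minimal sketch of that check (the point-of-sale framing and helper names are illustrative, not from any particular wallet API):

```python
# Sketch: detecting BIP125 opt-in RBF signaling on the receiving side.
# Assumes the input nSequence values have already been parsed out of
# the transaction (e.g. from a decoded raw transaction).

RBF_THRESHOLD = 0xFFFFFFFE  # any input nSequence below this opts in

def signals_rbf(input_sequences):
    """Per BIP125, a transaction opts in to replacement if at least
    one of its inputs has nSequence < 0xFFFFFFFE."""
    return any(seq < RBF_THRESHOLD for seq in input_sequences)

# Hypothetical merchant policy: only replaceable transactions are held
# for a confirmation; others get the usual zero-conf treatment.
tx_sequences = [0xFFFFFFFD, 0xFFFFFFFF]
if signals_rbf(tx_sequences):
    print("opt-in RBF: wait for 1 confirmation")
else:
    print("not signaling replacement: apply normal zero-conf policy")
```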
For me it is not even a question of greater expertise, since it has also become about ideology relating to economics and politics; these are questions that most technical experts are not specifically trained to answer. I think that some of the more fundamental questions, like the blocksize, are more concerned with politics and economics than with computer science.
You are making a strong and unjustified assumption about the skills and background of the people maintaining Bitcoin Core. I think you may be committing the fallacy of assuming that a group excellent in one particular area must necessarily be weak in another.
It is impossible to be an expert in an unlimited number of fields. I think that multidisciplinary approaches do lead to the best understanding in most cases; however, specialization to a certain extent is required. We can be good at many things but masters of only some.
However, with regard to my political arguments, it is irrelevant how qualified Core is in any field. A benign dictatorship is still a dictatorship; we could debate whether it would be an oligarchy, a technocracy, or a form of totalitarianism, but that does not change the underlying nature of what would define governance under Core without significant support for alternative implementations. I can understand that software development needs to be "dictatorial" in its internal decision-making process. This is why distributing development allows it to become more "democratic" and more in line with the ethos of decentralization within Bitcoin.
The community, even the most active segment, is fairly large and diverse in many ways -- much more so than, for example, the persons working on XT*. Beyond the expected CS and distributed-systems PhDs, the community includes people with expertise in mathematics, economics, financial markets, ... Peter Todd has a fine arts degree. Skepticism about the viability of the Bitcoin system absent effective, meaningful block size limits can be found in peer-reviewed academic publications in economics venues. Negative effects on mining fairness are both predicted by simulation and borne out in field trials on test networks.
Like I said before, it is irrelevant how benign or qualified Core might be; I will still vote according to my own conscience. The problem might just be that Core is not communicating this effectively; I am open to that idea. I might not be a technical expert, but I have spent most of my time over the last year learning about cryptocurrency, so if I am simply failing to understand this, then Core might have a problem with communication. However, I suspect that the conclusions of some of your research depend on the ideological understanding of these definitions. For instance, I consider pools to be comparable to a form of representative democracy for the miners, and I suspect that this would affect your conclusions on "miner centralization", if we accept the continued existence of 10-20 pools for miners to freely choose from, which is how Bitcoin functions today.
[*As a vague yardstick, there are ~19 contributors to Bitcoin Core who each had more commit-count activity in the last six months than all contributors to XT had in both XT and Core combined. Commit count is a crappy metric and you can figure it is off by a large factor in either direction; but this isn't really a close comparison; and this is in spite of non-stop attacks that make working on Bitcoin really demoralizing.]
I do hope you keep your spirits up, and I would consider it a shame if you stopped working on Bitcoin, as you said you would, should BIP101 fork the network. I would, however, absolutely respect your right to continue supporting Core if it chose not to adopt BIP101 and continued to exist as the smaller chain; if you are correct in your theories, then it should become the dominant chain again over the long run.
And beyond the expertise, we're speaking about a question where, in the absence of perfect knowledge, we conducted the experiment: we raised the soft blocksize target from 250k to 750k and saw tremendously negative effects: substantial declines in node count (in spite of large growth in the user base and, to brag, somewhat heroic efforts to increase software performance), substantial increases in mining centralization, and substantial increases in Bitcoin businesses relying on third-party APIs rather than running nodes (hugely magnifying systemic risks).
I am somewhat doubtful whether you can definitively and causally link all of these effects to the increased transaction volume; there are many different variables at play, including increased decentralization because of adoption.
It is a tug of war between these different variables, so to speak. I do think that the blocksize should ideally be a balancing act, with the limit acting as a precautionary measure, meaning that blocks should not become consistently full over longer periods of time. I would disagree with such a change in the economic policy of Bitcoin; this was never supposed to be the intention of the limit, and there are also concerns that it would somewhat break the social contract.
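To make the mining-fairness mechanism behind the centralization claim concrete: larger blocks take longer to propagate, a block can be orphaned while it propagates, and a miner never races against its own blocks, so bigger miners lose proportionally less. A toy model, not a field measurement; the per-megabyte delay and every other parameter below are assumptions chosen for illustration:

```python
import math

# Toy model of why larger blocks favor larger miners: a block can be
# orphaned while it propagates, and a miner never races against its
# own blocks.

BLOCK_INTERVAL = 600.0    # seconds, expected time between blocks
SECONDS_PER_MB = 15.0     # assumed network propagation delay per MB

def orphan_probability(block_mb, hashrate_share):
    """Chance the rest of the network finds a competing block while
    ours is still propagating (Poisson block arrivals)."""
    delay = block_mb * SECONDS_PER_MB
    competing_rate = (1.0 - hashrate_share) / BLOCK_INTERVAL
    return 1.0 - math.exp(-competing_rate * delay)

for share in (0.01, 0.10, 0.30):
    for size_mb in (1, 8):
        p = orphan_probability(size_mb, share)
        print(f"hashrate {share:4.0%}, {size_mb} MB blocks: orphan risk {p:6.2%}")
```

Under these assumed numbers the orphan-risk gap between a 1% and a 30% miner widens roughly sevenfold when blocks grow from 1 MB to 8 MB, which is the shape of the centralization pressure the simulations are said to predict.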
And today we are left at a point where the bandwidth consumption of an ordinary Bitcoin node just barely fits within the 350GB/mo transfer cap of a high-end, "best available in most of the US" broadband service.
I have done some research to test this statement, and I do not think it is true. Comcast offers an option with no data limit, while AT&T has a service where they "do not enforce" the data cap; I know that is a bit odd, but that is what I found. Time Warner does not have data caps on its more popular plans, and Verizon does not have data caps either. These are the top four ISPs in the US, and the situation in Europe is improving in the same way. To be fair, these are relatively recent developments, so I can understand how you might have been mistaken about these facts.
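Whatever the data-cap situation, the order of magnitude of the quoted figure can be sanity-checked with back-of-envelope arithmetic. A minimal sketch; every parameter (peer count, relay share, transactions per block, announcement overhead) is an assumption for illustration, not a measurement:

```python
# Rough estimate of a listening node's monthly traffic with full 1 MB
# blocks. All numbers below are assumptions for illustration.

blocks_per_month = 30 * 24 * 6                     # one block per ~10 min
block_mb = 1.0                                     # consistently full blocks
download_gb = blocks_per_month * block_mb / 1000   # ~4.3 GB of block data

peers_served = 40   # assumed inbound peers on a reachable node
relay_share = 0.5   # assumed fraction of peers we actually send each item to

# Upload dominates: the node relays both transactions and the blocks
# containing them, so roughly twice the block data per served peer.
upload_gb = 2 * download_gb * peers_served * relay_share        # ~173 GB

# Announcement overhead: assume ~36 bytes of inv data per tx per peer.
txs_per_month = blocks_per_month * 2000            # ~2000 txs per full block
inv_gb = txs_per_month * peers_served * 36 / 1e9                # ~12 GB

print(f"~{download_gb + upload_gb + inv_gb:.0f} GB/month under these assumptions")
```

The point of the sketch is only that upload to peers, not block download, dominates; pushing the assumed peer count or relay share higher reaches the quoted 350GB/mo range easily.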
We cannot know to what degree the load increase was causative, but none of the metrics had positive outcomes; and this is a reason to proceed only with the greatest care and consideration.
This I can agree with; however, care and consideration can of course also be taken too far.
Especially against a backdrop where Bitcoin's fundamental utility as a money is being attacked by efforts to regulate people's ability to transact and to blacklist coins; efforts that critically depend on the existence of centralized choke points, which scaling beyond the system's capabilities necessarily creates.
This I actually disagree with. I also see a threat of centralized choke points; however, they would more likely arise from an increased reliance on a limited number of third parties because of the presently restricted blocksize.
You're right, though, that the question is substantially political: a fully centralized system could easily handle gigabyte blocks with the work we've done to make megabyte blocks barely viable in a highly decentralized world. Such a system could also happily institute excess inflation, censor transactions, and make other moves "for the good of the system" and "to assure widest adoption".
You are attacking a straw man here: I believe that increasing the blocksize is what will be best for decentralization over the long run. Leaving the one-megabyte restriction in place presents greater risks of centralization and obsolescence.
If Bitcoin is to survive in the long run, we must stand by the principles we believe in, and which make the system valuable in the first place -- even against substantial coercive pressure. Otherwise the transparent system of autonomously enforced rules risks devolving into another politically controlled, trust-based instrument of expedience, like the legacy monetary instruments we have today.
I have the same concern; however, I perceive Core as the most likely point of centralization right now. It seems we share some of the same principles, yet we are on opposite sides of this debate.
Furthermore, when blocks do fill up, we already have child-pays-for-parent (CPFP) for unsticking transactions without the negative consequences.
We do not. CPFP has substantial complexities that prevent it from actually working on the network today; and using it has large overheads. It will be a good additional tool to have, but it does not replace RBF.
I believe I have used CPFP for some very practical reasons, and I thought it was a pretty good feature, at least from the user's perspective. It is one of the many things I can congratulate Core for developing.
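For readers unfamiliar with the mechanism being debated: with child-pays-for-parent, a miner evaluates a stuck low-fee transaction together with a high-fee child that spends it, so what matters is the combined package feerate. A minimal sketch of that arithmetic (the numbers are made up for illustration):

```python
# Sketch of the child-pays-for-parent (CPFP) arithmetic: the child's
# fee subsidizes the parent, and a package-aware miner sorts by the
# combined feerate. All numbers are illustrative.

parent_fee, parent_size = 1_000, 250     # satoshis, bytes: stuck at 4 sat/B
child_fee, child_size   = 20_000, 150    # a deliberately generous child fee

parent_rate = parent_fee / parent_size   # 4.0 sat/B, too low to confirm
package_rate = (parent_fee + child_fee) / (parent_size + child_size)

print(f"parent alone:   {parent_rate:.1f} sat/B")
print(f"parent + child: {package_rate:.1f} sat/B")   # ~52.5 sat/B
```

The overhead mentioned above is visible here too: unsticking the parent costs an entire extra transaction, and the child must pay for the parent's bytes as well as its own.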
With regard to your saying that Gavin is not active in development, I certainly do have a different perspective, considering...
You can have a different perspective, but you cannot have your own facts; this is a question of objective fact. You mistake my comment for an insult; it wasn't intended as one. Who am I to judge what someone else spends their time on? It was rather an observation that it would have been surprising to see a contribution there.
You somewhat missed the point I made about Gavin: I consider him to have contributed a lot to the development of the Bitcoin protocol through his actions to increase the blocksize, which is an important issue to me.
which can only be done significantly by increasing the blocksize.
An action which you could only contemplate thanks to the work of myself and others who believe that the BIP101 approach would be significantly damaging. I think it's likely that the blocksize will be increased in the future, but only in a way that preserves Bitcoin's properties as decentralized P2P electronic cash, rather than disregarding or undermining them.
Saying that it will likely be increased in the future is not good enough, which is why we are at this impasse in the first place. You cannot expect us to simply trust Core to increase the blocksize when we have a fundamental ideological disagreement about allowing the blocks to fill up. If you were to take the smart political approach, you would announce a date for an increased blocksize before January. This would allow Core to maintain control for longer, which I would hope would be for benign reasons, like helping to ease the transition into having multiple implementations that distribute the power of development.
It should be an increase that meets BIP101 in the middle; it would need to be a true compromise. BIP100 might be able to serve this role, even with the thirty-two-megabyte limit that still exists; at least it would set a precedent. I am being generous here because I would prefer to see consensus through compromise rather than the possibility of a split.
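For context on what "meeting BIP101 in the middle" would mean numerically, here is a sketch of BIP101's growth schedule as described in the proposal: 8 MB at the proposal's initial activation time (11 January 2016), doubling every two years for twenty years. The real proposal interpolates linearly between doubling points; this sketch shows only the doubling steps, so treat it as an approximation:

```python
from datetime import datetime, timezone

# Sketch of BIP101's blocksize schedule: 8 MB starting 2016-01-11,
# doubling every two years, capped after ten doublings (8192 MB).
# Linear interpolation between doublings is omitted for brevity.

START = datetime(2016, 1, 11, tzinfo=timezone.utc)
TWO_YEARS = 2 * 365 * 24 * 3600  # the proposal uses 63,072,000 seconds

def bip101_limit_mb(when):
    elapsed = (when - START).total_seconds()
    doublings = min(max(int(elapsed // TWO_YEARS), 0), 10)
    return 8 * 2 ** doublings

for year in (2016, 2020, 2026, 2036):
    t = datetime(year, 6, 1, tzinfo=timezone.utc)
    print(f"{year}: {bip101_limit_mb(t)} MB")   # 8, 32, 256, 8192 MB
```

A "middle" compromise, in the framing above, would presumably sit somewhere below this curve, which is why the 32 MB ceiling mentioned in connection with BIP100 reads as a plausible meeting point.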