You keep saying that it is opt-in; however, it is only opt-in for the sender, not the receiver,
Not so: the merchant can simply ignore the transaction until it is confirmed, as they already do for all manner of unusual, nonstandard, or unconfirmed-input transactions; otherwise their acceptance of zero-conf is no more secure than RBF (if it ever is...). And doing so is relatively harmless, because opt-in RBF transactions do not need to suffer significant confirmation delays.
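For concreteness, here is a minimal sketch (mine, not from any particular wallet or merchant system) of how merchant software might detect BIP125 signaling and treat such payments like any other unconfirmed transaction it waits on. It assumes a transaction decoded into the dict shape returned by Bitcoin Core's decoderawtransaction RPC; the function names are hypothetical.

```python
# Per BIP125, a transaction explicitly signals replaceability when any
# of its inputs has nSequence below 0xfffffffe.
BIP125_NONSIGNALING_FLOOR = 0xFFFFFFFE

def signals_rbf(decoded_tx: dict) -> bool:
    """True if any input's nSequence signals BIP125 opt-in RBF."""
    return any(vin["sequence"] < BIP125_NONSIGNALING_FLOOR
               for vin in decoded_tx.get("vin", []))

def accept_at_zero_conf(decoded_tx: dict) -> bool:
    """Toy policy: never credit an RBF-signaling payment before it
    confirms; treat it like any other unusual unconfirmed payment."""
    return not signals_rbf(decoded_tx)
```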
For me it is not even a question of greater expertise, since it has also become about ideology relating to economics and politics; these are questions that most technical experts are not specifically trained to answer. I think that some of the more fundamental questions, like the blocksize for instance, are more concerned with politics and economics than computer science,
You are making a strong and unjustified assumption about the skills and background of the people maintaining Bitcoin Core. I think you may be committing the fallacy of assuming that a group excellent in one particular area must necessarily be weak in another. The community, even its most active segment, is fairly large and diverse in many ways-- much more so than, for example, the people working on XT*. Beyond the expected CS and distributed-systems PhDs, the community includes people with expertise in mathematics, economics, financial markets, ... Peter Todd has a fine arts degree. Skepticism about the viability of the Bitcoin system absent meaningful, effective block size limits can be found in peer-reviewed academic publications in economics venues. Negative effects on mining fairness are both predicted by simulation and borne out in field trials on test networks.
[*As a vague yardstick, there are ~19 contributors to Bitcoin Core who each have more commit activity in the last six months than all contributors to XT had in both XT and Core combined. Commit count is a crappy metric, and you can figure it is off by a large factor in either direction, but this isn't even a close comparison; and this is in spite of non-stop attacks that make working on Bitcoin really demoralizing.]
And beyond the expertise, we're speaking about a question where, in the absence of perfect knowledge, we conducted the experiment: we raised the soft blocksize target from 250k to 750k and saw tremendously negative effects: substantial declines in node count (in spite of large growth in the userbase and, to brag, somewhat heroic efforts to increase software performance), substantial increases in mining centralization, and substantial increases in Bitcoin businesses relying on third-party APIs rather than running nodes (hugely magnifying systemic risks). We've seen the result and it isn't pretty. And yet this information is ruthlessly attacked whenever it is pointed out-- I am routinely called a "Bitcoin bear", even though I have a significant portion of my net worth tied up in it, simply for believing in Bitcoin enough to be frank about its problems and limitations. Many people less convinced of Bitcoin's power and value than I am, and much more interested in a short-term pump, are unwilling to tolerate any discussion of challenges; this creates a poisonous atmosphere which undermines the system's ability to heal and improve.
And today we are left at a point where the bandwidth consumption of an ordinary Bitcoin node just barely fits within the 350GB/mo transfer cap of a high-end, "best available in most of the US" broadband service. We cannot know to what degree the load increase was causative, but none of the metrics had positive outcomes; this is a reason to proceed only with the greatest care and consideration. Especially against a backdrop where Bitcoin's fundamental utility as money is being attacked by efforts to regulate people's ability to transact and to blacklist coins; efforts that critically depend on the existence of the centralized choke-points which scaling beyond the system's capacity necessarily creates.
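As rough, illustrative arithmetic (the peer counts, overhead factors, and chain size below are assumptions of mine, not measurements), one can see how a well-connected listening node approaches such a cap: the block data itself is small, but relaying blocks and transactions to many peers and serving the historical chain to freshly syncing nodes multiplies it.

```python
# Back-of-envelope sketch of a listening full node's monthly transfer.
# All constants are illustrative assumptions, not measurements.
GB = 1e9

block_size       = 1_000_000   # bytes per block (1 MB)
blocks_per_month = 144 * 30    # ~one block every ten minutes
peers_served     = 25          # peers we relay blocks/txs to (assumed)
relay_overhead   = 1.5         # inv messages, addr gossip, etc. (assumed)
ibd_peers        = 4           # fresh nodes served the chain per month (assumed)
chain_size       = 40 * GB     # approximate historical chain size (assumed)

block_data = block_size * blocks_per_month              # ~4.3 GB downloaded
relay      = block_data * peers_served * relay_overhead # upload to peers
ibd        = ibd_peers * chain_size                     # serving syncing nodes
total      = block_data + relay + ibd

print(f"block data: {block_data / GB:.1f} GB/mo")
print(f"relay to peers: {relay / GB:.0f} GB/mo")
print(f"serving syncing nodes: {ibd / GB:.0f} GB/mo")
print(f"total: {total / GB:.0f} GB/mo")  # ~330 GB/mo under these assumptions
```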
You're right, though, that the question is substantially political: a fully centralized system could easily handle gigabyte blocks, given the work we've done to make megabyte blocks barely viable in a highly decentralized world. Such a system could also happily institute excess inflation, censor transactions, and make other moves "for the good of the system" and "to assure widest adoption". If Bitcoin is to survive in the long run we must stand by the principles we believe in, and which make the system valuable in the first place-- even against substantial coercive pressure. Otherwise the transparent system of autonomously enforced rules risks devolving into another politically controlled, trust-based instrument of expedience, like the legacy monetary instruments we see today.
Furthermore, when blocks do fill up, we already have child-pays-for-parent for unsticking transactions, without the negative consequences.
We do not. CPFP has substantial complexities that prevent it from actually working on the network today, and using it has large overheads. It will be a good additional tool to have, but it does not replace RBF.
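To make the overhead point concrete, here is a minimal sketch (my own illustrative arithmetic, with assumed numbers) of the fee calculation behind CPFP: a miner evaluating the pair sees the combined package fee rate, so the child must pay for its own bytes on top of lifting the stuck parent, whereas an RBF replacement simply rebroadcasts one transaction at a higher rate and adds no extra bytes.

```python
def cpfp_child_fee(parent_size: int, parent_fee: int,
                   child_size: int, target_rate: float) -> float:
    """Fee in satoshis the child must carry so that the parent+child
    package reaches `target_rate` satoshis per byte."""
    package_size = parent_size + child_size
    return target_rate * package_size - parent_fee

# Illustrative numbers: a 300-byte parent that paid 1 sat/B, bumped
# to an effective 50 sat/B via a 200-byte child.
needed = cpfp_child_fee(parent_size=300, parent_fee=300,
                        child_size=200, target_rate=50)
print(f"child must pay {needed:.0f} sat and adds 200 bytes")  # 24700 sat
# An RBF replacement at 50 sat/B would cost 300 * 50 = 15000 sat
# and add no new bytes to the chain.
```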
In regard to your saying that Gavin is not active in development, I certainly do have a different perspective, considering
You can have a different perspective, but you cannot have your own facts; this is a question of objective fact. You mistake my comment for an insult. It wasn't intended as one-- who am I to judge what someone else spends their time on?-- but rather as an observation that it would have been surprising to see a contribution there.
which can only be done significantly by increasing the blocksize.
An action which you could only contemplate due to the work of myself and others who believe that the BIP101 approach would be significantly damaging. I think it's likely that the blocksize will be increased in the future, but only in a way that preserves Bitcoin's properties as decentralized P2P electronic cash, rather than disregarding or undermining them.