Over its history, Bitcoin's node-to-user ratio has declined from >1:1 to now something like <1:700 (from the calcs upthread).
Looking at it another way... the number of nodes contributes an element of security (resilience): if N is the minimum needed to run the network, and N+1 adds one 'hot failover' system for backup, then we have something like N+7000... which so far has been more than enough.
....
As (A) [meaningful transactions] increases, Bitcoin's value increases. As (B) [node quantity] increases, Bitcoin's cost increases.
Thus, if <1:~700 has been sufficient at a market cap of 3-4 billion, one could extrapolate the desired minimum number of full nodes for a more valuable economy, since there is a loose correlation between the incentive to attack a more valuable network and the costs inherent in doing so. So if Bitcoin grows to a market cap of 8 billion, we would want around <1:~350 full nodes; at 16 billion, <1:~175 nodes. This ratio shouldn't scale linearly forever, as supporting that level of security through continued growth isn't realistic even if we keep the block limit at 1MB. Perhaps if we start counting pruned nodes as having a certain amount of value in the total node count, it can be extended much further.
Extrapolation with only these variables is not such a great idea, but I take your meaning. Yes, add in pruned nodes when they materialize, at some value less than their full percentage (alone they are worth nothing).
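For concreteness, the pattern being extrapolated looks like this (a minimal sketch in Python, assuming the <1:~700-at-$4B baseline quoted above and a halve-the-ratio-per-doubling rule; an illustration only, not a security model):

Code:
import math

BASELINE_CAP = 4e9     # ~$4B market cap, where <1:~700 seemed sufficient (assumed from above)
BASELINE_RATIO = 700   # users per full node at that baseline

def target_users_per_node(market_cap):
    # Halve the users-per-node ratio for every doubling of market cap.
    doublings = math.log2(market_cap / BASELINE_CAP)
    return BASELINE_RATIO / (2 ** doublings)

for cap in (4e9, 8e9, 16e9):
    print("$%dB -> 1:%d" % (cap / 1e9, round(target_users_per_node(cap))))
# prints: $4B -> 1:700, $8B -> 1:350, $16B -> 1:175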
I still believe that what matters more is the type, quality, distribution, and control of the nodes rather than just the total node count, but I am fine with Bitcoin being over-engineered so that it addresses both considerations. Additionally, I believe we should consider the value of 1GB pruned full nodes in the context of this discussion about subsidizing security.
Thus we could either incorporate an incentive structure, using price-discovery methods to incentivize nodes, and/or optimize Core and add pruning, or we could use some of the external methods discussed to accomplish these shared goals.
What we cannot do is simply sit idly by and assume nothing needs to be done. To be fair, many people who oppose this fork are actually working on projects, like the 10k extra pogo nodes, to address this issue. I think that if we have a comprehensive plan of action before the hardfork, we can satisfy all our values and objectives.
There are more important things that can be done, like setting up libbitcoin Obelisk nodes/servers to support other implementations, than simply focusing on the total node count for Bitcoin Core; but if the requirements are realistically achievable, like scaling the total node count with market cap as you outlined, then there is no reason we cannot accomplish both at the same time.
So, moving forward: if a plan is in place, with an incentive structure built that will realistically sustain decentralization, and we are achieving said goals before the hard fork, then is anyone opposed to increasing the blocksize limit, and what are their specific objections to doing so?
Sure... but the perfect is the enemy of the good (or the better). You listed some risk-mitigation efforts; there are also other threat vectors. If you are willing to imagine that the USG is anti-Bitcoin, you might posit things like Project BULLRUN, or the Equation Group's capability to take over any computer with storage and an EEPROM based on matching a pattern of data on its storage (say, the genesis block, for example). So this doesn't move the needle much. We still won't know how much security is needed until there isn't enough.
We can also consider recovery modes: a bandwidth attack may not end Bitcoin entirely, given its anti-fragile nature. The biggest total-failure modes come from adding pernicious risks, ones that build over time to eventually kill it. I believe this is part of why Satoshi dropped the limit from 32MB to 1MB in the first instance (and maybe also Tor).
An external incentive structure is a good thing, but being external, it also cannot be trusted by the code (it could disappear, and we'd be left with code that was written assuming it is there).
So, given this impasse, I'll re-propose the progression mentioned above with just a little more detail.
1) A hard fork with a definite increase (8MB maybe, some moderate yet arbitrary number that doesn't fix the core of the problem). Nothing exponentially increasing, but maybe a best guess at providing enough time for (1.5).
1.5) Adding the code needed to measure block size within the block chain, or the number of transactions per block in the chain (something clever with the Merkle tree, maybe?).
2) A dynamic-limit hard fork (indefinite in term, but not inaccurate) which sets a max block size that accommodates transaction growth within a range, thereby preventing spam/attacks while leaving some room to grow (maybe 10x the average size; see the sketch after this list). The goal here is to right-size the block size from within the block chain.
3) No block size limit (because transaction fees support all network costs appropriately). JustusRanvier's discussion papers allude to methods for this. There are practical problems, lots of them, too many to list. This is, however, the end-game for perfect scalability whilst maintaining security using a decentralized, free-market economic incentive structure. No one knows how to do this yet. The problem with an indefinitely, exponentially increasing limit is that it is too close to this and assumes too much.
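To make (1.5) and (2) concrete, here is a minimal sketch of the measure-then-cap idea (hypothetical Python, not Bitcoin Core code; the averaging window, the 10x multiplier, and the 1MB floor are all assumptions drawn from the numbers above):

Code:
WINDOW = 2016      # blocks to average over (one difficulty period; an assumed choice)
MULTIPLIER = 10    # headroom above the trailing average, per step (2)
FLOOR = 1000000    # never let the cap fall below the current 1MB limit

def dynamic_limit(recent_block_sizes):
    # recent_block_sizes: byte sizes of recent blocks, as measured per step (1.5)
    window = recent_block_sizes[-WINDOW:]
    average = sum(window) / len(window)
    return max(FLOOR, int(average * MULTIPLIER))

# e.g. if blocks have averaged 400,000 bytes, the next cap would be 4,000,000 bytes.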
For what it's worth, there may well be a decade between each of these phases; what we need from our scientists is the massive amount of work and planning to get from (1) to (3). Bitcoin ought not skip steps out of fear of its opposition; to do so would play into the hands of that feared opposition.
Gavin's proposal is a great one; it is well reasoned, and I think it is well meaning. I agree with him that it is likely "the simplest that could work". However, it unnecessarily creates more risk than benefit. Please, let us take only necessary risks and not unnecessary ones. Let us keep the highest standards for ourselves, and for posterity. We owe this to all those who have contributed so very much to this project.
============
There are some "no fork for blocksize" folks who see increasing transaction capacity as a sort of inflation, and thus unacceptable.
www.contravex.com/2015/01/12/bitcoin-doesnt-need-a-hard-fork-it-needs-hard-people/
This is what it boils down to: scarcity. There’s no room in Bitcoin for inflation of any kind. Other applications and whatnot can be built on top of it as is. It’s for the world to adapt and conform to Bitcoin, not the other way around.
I don't see transaction volume increase as a change in inflation so much as a change in velocity.
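In quantity-theory terms (my framing, not the contravex author's), the relevant identity is the equation of exchange, M x V = P x Q, where M is the money stock (fixed at 21 million BTC), V its velocity, P the price level, and Q real transaction volume. Doubling on-chain throughput gives V -> 2V and Q -> 2Q, with M and P untouched, which is why it reads as a velocity change rather than inflation.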
Henry Hazlitt, who is credited with bringing Austrian economic theory to the English-speaking world, would, I think, agree that velocity is not something that ought to be artificially influenced, either by constraint or by encouragement. Hitting the limit would be an artificial constraint; removing it ahead of (3) would be an encouragement; but for inflation it is a no-op.
Money is always in someone's hand. For consumers to spend and "circulate" money at a rapid rate, there needs to be a party willing to accept the currency. That is, the average per capita holding of currency will remain the same.