No. He is talking about the 970-980 kB blocks, which are not rare these days. The average is around 300 kB, so against the 1 MB cap we are at roughly 1/3 capacity. And I personally think 1/3 is where we should seriously consider our options for scalability. The way people are split on the subject right now, it is even more urgent to start building a consensus.
But on the other hand there is no hurry and no crisis. The extra transactions can just go through altcoins if Bitcoin chooses the 1 MB block + higher fees 'solution'.
They aren't rare, but they're not the norm. Most blocks aren't near full.
F2Pool recently solved a few that were close to full, above 900 kB, with a good number of freebies (zero-fee transactions) included.
https://blockchain.info/block/0000000000000000175b44859017a5148c48ecba7a67f14012232e9bb6b47a73
https://blockchain.info/block/00000000000000000f9597aed448ce8429c550a65f896b66760381d0c364901e
And then there is this 7 kB block in between:
https://blockchain.info/block/000000000000000014efb22561313ebe3c27780808b5d8939ebc1a850badf9da
There are also a lot of blocks under 200 kB, so we could squeeze more Tx/s out of a minimum block size too, but that would not be a good thing to do.
Emmmm, I believe I said as much. The norm is ~300 kB against a 1 MB cap, so all we can get is about 3x as many transactions. I see that as a bottleneck.
If this were my production server, I'd be looking into getting a new one.
We agree on the problem; we disagree on the solution. Treating it as a 'production server' replacement would be the wrong approach.
How many times do you want to replace this 'production server' for the same reason?
If we are going to a dynamic limit, it should be one that won't need to change later, that can be assured to be fit for purpose, and that doesn't open up new vulnerabilities.
The problem with that... it isn't simple.
If the limit were, say, 10x the average size of the last 1000 blocks, it would still provide anti-spam protection and keep node distribution from getting too centralized. A rough sketch of that rule is below.
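To make the idea concrete, here is a minimal sketch of that rule, assuming the limit is exactly 10x the trailing 1000-block average with today's 1 MB kept as a floor. Names like DynamicMaxBlockSize, kWindow, and kFloorBytes are illustrative, not actual Bitcoin Core identifiers.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <deque>
#include <numeric>

constexpr std::size_t   kWindow     = 1000;     // blocks to average over
constexpr std::uint64_t kMultiplier = 10;       // limit = 10x the average
constexpr std::uint64_t kFloorBytes = 1000000;  // never drop below 1 MB (assumed floor)

// Given the sizes (in bytes) of recent blocks, newest at the back,
// compute the maximum size allowed for the next block.
std::uint64_t DynamicMaxBlockSize(const std::deque<std::uint64_t>& recentSizes)
{
    if (recentSizes.empty())
        return kFloorBytes;

    // Average over at most the last kWindow blocks.
    const std::size_t n = std::min(recentSizes.size(), kWindow);
    const std::uint64_t total = std::accumulate(
        recentSizes.end() - n, recentSizes.end(), std::uint64_t{0});
    const std::uint64_t average = total / n;

    // 10x the trailing average, with a floor so the limit can't be
    // starved down below the current 1 MB cap.
    return std::max(kMultiplier * average, kFloorBytes);
}
```

With the current ~300 kB average this rule would put the cap near 3 MB, and a spammer would have to keep blocks inflated across most of a 1000-block window (roughly a week) to move the limit much, which seems to be where the anti-spam argument comes from.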