Yeah, at first I thought maybe they would come up with something special, but as time has gone on it's become painfully clear that they really are clueless. They don't know what they're doing, and that's dangerous! That's my opinion, of course. NIST did an OK job selecting AES, but that hasn't translated very well into picking PQC encryption and signature schemes. Not at all. I guess they thought it would be just as easy. I guess not! Plus it's taking forever, and you just know that whatever they end up choosing will be outdated by the time quantum computers arrive on the scene anyway and will probably need to be revamped. That is, of course, assuming quantum computers ever do arrive on the scene; I know you're skeptical about that happening. But IBM was in the news lately: https://arstechnica.com/science/2023/12/ibm-adds-error-correction-to-updated-quantum-computing-roadmap/
I think IBM is eventually going to get a usable quantum computer out the door. No doubt about that. I wouldn't know about all these other companies, but when IBM commits to something, they usually see it through.
From the article: The method most commonly tested today (called a "surface code") can require up to 4,000 hardware qubits to host 12 logical qubits; the scheme described in the manuscript can do so using only 288 hardware qubits.
If IBM can get it down to only 288, then maybe there's hope. But it can't keep taking on the order of a thousand physical qubits to form a single logical qubit. That's way too many.
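Doing the quick arithmetic on those numbers myself (just dividing the figures from the article, not anything IBM stated): the surface code works out to roughly 4,000 / 12 ≈ 333 physical qubits per logical qubit, while the new scheme is 288 / 12 = 24 per logical qubit, so about a 14x reduction in overhead.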
Compare it to when IBM built the Deep Blue chess computer. Nothing like it had ever beaten a world champion before, and it beat Garry Kasparov at the chess board, ushering in a new era of chess computing. Today there are far more powerful chess engines, even on your phone, but IBM was the first, proving it could be done. They'll do the same thing with quantum.