I found the paper published in May that put two different qubit chips up against software and hardware solvers on a very specific class of problem. The result: on the kind of problem best suited to the chip, it found a solution in half a second, several thousand times faster than the best traditional methods.
There were a lot of ifs, buts, and exceptions, but the V5 and V6 chips tested, when given the right kind of problem (an annealing problem), were indeed able to solve it in a flash.
A clear-eyed summary of that May paper and the D-Wave devices is here:
http://spectrum.ieee.org/computing/hardware/dwaves-year-of-computing-dangerously

I've no doubt that quantum computing is going to be an arms race, and that it will solve more problems in parallel over time. Whether that includes searching for keys or breaking cryptography I've no idea, but if it does you CAN BET THAT NSA WILL NOT TELL US ABOUT IT.
One thing I have pointed out in the past (and will likely need to keep pointing out for a very long time) is that D-Wave's system implements (or is stated to implement; there is some controversy over exactly what is happening) a quantum annealing algorithm, which, while it does (or may) have some quantum properties, is not the same thing as a general purpose quantum computer.
Attacking public key encryption requires a general purpose quantum computer (GPQC) capable of implementing Shor's algorithm (or one built to implement only Shor's algorithm, a quantum "ASIC" if you will). Even if D-Wave's computer had a quadrillion qubits it would be absolutely useless for attacking public key encryption.
Successfully factoring larger numbers with a GPQC running Shor's algorithm would be a pretty big deal and would bring global recognition. To date nobody has publicly shown the ability to factor numbers larger than 143 (an 8 bit number). Even factoring 32 bit numbers at commercially viable speeds would still leave us probably decades away from breaking 256 bit ECC keys, and we aren't even close to that. While QC is powerful in theory, practical progress has been agonizingly slow:
2001
First successful demonstration of Shor's algorithm
Computer size: 7 qubits
Number factored: 15 (into the factors 5 & 3)
Equivalent RSA key size: 4 bits

2011
First successful factorization of a three digit number (143)
Computer size: 8 qubits (actually 4 qubits using an iterative process)
Number factored: 143 (into the factors 11 & 13)
Equivalent RSA key size: 8 bits
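The quantum part of Shor's algorithm only finds the period r of a^x mod n; turning that period into factors is plain classical arithmetic. A minimal sketch (Python, with the period brute-forced classically as a stand-in for the quantum subroutine) shows why 15 and 143 were the natural first targets:

Code:
from math import gcd
from random import randrange

def find_period(a, n):
    """Brute-force the period r of a^x mod n.
    (This is the step a real quantum computer speeds up with the QFT.)"""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    """Classical pre/post-processing of Shor's algorithm for a small odd composite n."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d           # lucky: a already shares a factor with n
        r = find_period(a, n)
        if r % 2:                      # need an even period
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                 # trivial square root, pick another a
            continue
        return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(15))   # (3, 5) in some order: the 2001 result
print(shor_classical(143))  # (11, 13) in some order: the 2011 result

Everything hard lives inside find_period; the rest is bookkeeping. That is why the number of qubits available, not the classical post-processing, is the bottleneck.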
So the bit strength of "encryption" (I use this term loosely because one could crack 8 bit "encryption" with paper and pencil) that has been broken using QC has gone from 4 bits to 8 bits after a decade of research. Even if this can scale, at the current growth rate breaking Bitcoin would be more than a lifetime away. ECDSA is a little different from RSA in that it doesn't involve factoring large numbers and instead uses the properties of elliptic curves, but both can be broken using Shor's algorithm. In general ECC requires smaller keys than RSA for an equivalent level of security. Achieving 128 bit key strength (which is considered beyond brute force for classical computing) requires 256 bits with ECDSA but 3,072 bits with RSA. Bitcoin could use RSA and be just as secure, but transactions would be up to 12x as large.
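For reference, the key size equivalences usually cited (e.g. in NIST SP 800-57) and the resulting size ratios look roughly like this; the exact pairings are conventions, not hard limits:

Code:
# Commonly cited classical-security equivalences (NIST SP 800-57 style):
# symmetric strength in bits vs. the RSA modulus and ECC key sizes usually
# paired with it.
EQUIV = [
    # (symmetric strength, RSA bits, ECC bits)
    (80,   1024, 160),
    (112,  2048, 224),
    (128,  3072, 256),
    (192,  7680, 384),
    (256, 15360, 512),
]

for strength, rsa, ecc in EQUIV:
    print(f"{strength:>3}-bit strength: RSA {rsa:>5} bits vs ECC {ecc:>3} bits "
          f"(RSA keys ~{rsa // ecc}x larger)")
# At the 128 bit level Bitcoin uses, 3072 / 256 = 12, hence the "12x" above.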
Still, both RSA and ECC can in theory be broken faster than is possible with classical computing by using a large enough (and fast enough) quantum computer implementing Shor's algorithm. I have only found one paper to date which compares the qubit cost of implementing Shor's algorithm against both RSA and ECC:
6.3 Comparison with the quantum factoring algorithm

One of the main points of this paper is that the computational "quantum advantage" is larger for elliptic curve discrete logarithms than for the better known integer factoring problem. With our proposed implementation we have in particular achieved similar space and time requirements. Namely the number of qubits needed is also of O(n) and the number of gates (time) of order O(n^3), although in both cases the coefficient is larger. Note that the input size n is also the key size for RSA resp. ECC public key cryptography. Because the best known classical algorithms for breaking ECC scale worse with n than those for breaking RSA, ECC keys with the same computational security level are shorter. Below is a table with such key sizes of comparable security (see e.g. [25]). The column to the right roughly indicates the classical computing resources necessary in multiples of C, where C is what's barely possible today (see e.g. the RSA challenges [26] or the Certicom challenges [27]). Breaking the keys of the last line [15,360 bit RSA or 512 bit ECDSA] seems to be beyond any conceivable classical computation, at least if the presently used algorithms can't be improved.
Summary:
Breaking 256 bit ECDSA (128 bit key strength) requires 1,800 logical qubits and a time of 6.0*10^9 operations.
Breaking 512 bit ECDSA (256 bit key strength) requires 3,600 logical qubits and a time of 5.0*10^10 operations.
Where f(n) and f'(n) are as in section 6.2 with ε = 10. The time for the quantum algorithms is listed in units of "1-qubit additions", thus the number of quantum gates in an addition network per length of the registers involved. This number is about 9 quantum gates, 3 of which are the (harder to implement) Toffoli gates (see e.g. [5]). Also it seems very probable that for large scale quantum computation error correction or full fault tolerant quantum computation techniques are necessary. Then each of our "logical" qubits has to be encoded into several physical qubits (possibly dozens) and the "logical" quantum gates will consist of many physical ones. Of course this is true for both quantum algorithms and so shouldn't affect the above comparison. The same is true for residual noise (on the logical qubits) which will decrease the success probability of the algorithms. The quantum factoring algorithm (RSA) may have one advantage, namely that it seems to be easier to parallelise.
http://arxiv.org/pdf/quant-ph/0301141.pdf (emphasis of relevant portions is mine)
So we are looking at a general purpose quantum computer which needs at least 1,800 logical qubits. Nobody (AFAIK) has even done research on the amount of error correction required for a commercial general purpose quantum computer (because none exists), but let's take the authors' estimate of "dozens of physical qubits per logical qubit" and use 24x as a low end. This would mean (1,800 * 24) a system with 43,200 or more physical qubits. To put that into perspective, the largest (AFAIK, please correct me if you find a larger example) general purpose quantum computer capable of implementing Shor's algorithm is ... 7 qubits, so we aren't even in striking range yet.
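Putting rough numbers on that back-of-the-envelope estimate (the 24x figure is just the low end of the paper's "possibly dozens"; the 100x figure is purely my own illustrative high end, since nobody knows the real overhead):

Code:
# Back-of-the-envelope resource estimate using the Proos/Zalka figure quoted
# above (1,800 logical qubits for 256 bit ECDSA) and ASSUMED error-correction
# overheads; the real overhead for a fault-tolerant machine is unknown.
LOGICAL_QUBITS_256_ECC = 1800
ASSUMED_OVERHEADS = (24, 100)   # low end of "dozens" / illustrative high end

for overhead in ASSUMED_OVERHEADS:
    physical = LOGICAL_QUBITS_256_ECC * overhead
    print(f"{overhead:>3}x overhead -> {physical:,} physical qubits")
# 24x  ->  43,200 physical qubits
# 100x -> 180,000 physical qubits
# ...versus the ~7 qubit machines that have publicly run Shor's algorithm.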
Still, QC "may" break 256 bit ECDSA within our lifetime, so we should be looking at future solutions. There are a couple of things which can be done to improve the quantum security of the Bitcoin network. The simplest change would be to implement a new address type which uses a larger ECC curve. While larger ECDSA keys can still be broken, they do increase the qubit requirements: moving to 512 bit keys doubles the required qubits and 1,024 bit keys quadruple them. If GPQC proves viable this would buy the network some time for a longer term solution. I don't even know if 1,024 bit ECC is supported by major libraries; outside of quantum computing there is little need for it, because even 256 bit ECC is considered beyond brute force.
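Since the quoted paper puts the qubit count at O(n) in the key size, the effect of bumping the curve size is easy to estimate. A quick sketch, taking the 1,800 logical qubits for 256 bit keys as the baseline and assuming the scaling really is linear:

Code:
# Rough logical-qubit scaling with ECC key size, assuming the O(n) qubit count
# from the quoted paper and a baseline of 1,800 logical qubits at n = 256.
BASE_KEY_BITS, BASE_QUBITS = 256, 1800

for key_bits in (256, 512, 1024):
    qubits = BASE_QUBITS * key_bits // BASE_KEY_BITS
    print(f"{key_bits:>4} bit ECDSA: ~{qubits:,} logical qubits")
# 256  -> ~1,800
# 512  -> ~3,600 (double)
# 1024 -> ~7,200 (quadruple the 256 bit figure)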
A longer term and more complex solution would be moving to an address scheme based on a post-quantum algorithm (http://en.wikipedia.org/wiki/Post-quantum_cryptography). These systems generally have no widespread deployment, so there may be unknown security issues; I am not advocating a change anytime soon, just pointing out that there are algorithms which could protect the network even if large GPQCs become available. The largest obstacle to these post-quantum systems is that they generally involve much, much larger public keys. The bandwidth and storage requirements of Bitcoin are highly correlated with the size of the public key, and we are talking about public keys in the tens of kilobytes (notice the plural), so transactions and blocks would easily be a hundred times larger. Now it is entirely possible that Moore's law will mitigate this somewhat, and that using a 50,000 byte key in 2048 will be no more of a challenge than using a 256 bit key today.
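To make the size concern concrete, here is a rough comparison. The 50,000 byte key is the ballpark mentioned above; the 33 byte compressed public key is standard for 256 bit ECDSA, and the 250 byte "typical transaction" is my own rough assumption:

Code:
# Rough size blow-up if public keys grow to tens of kilobytes. The post-quantum
# key size and the "typical transaction" size are illustrative assumptions.
ECDSA_PUBKEY_BYTES = 33        # compressed 256 bit ECDSA public key
PQ_PUBKEY_BYTES    = 50_000    # "tens of kilobytes" figure from above
TYPICAL_TX_BYTES   = 250       # rough one-input, one-output transaction today

key_ratio = PQ_PUBKEY_BYTES / ECDSA_PUBKEY_BYTES
tx_bytes  = TYPICAL_TX_BYTES - ECDSA_PUBKEY_BYTES + PQ_PUBKEY_BYTES
print(f"Public key blow-up:  ~{key_ratio:,.0f}x")
print(f"Transaction blow-up: ~{tx_bytes / TYPICAL_TX_BYTES:.0f}x")
# Signatures would also grow under most post-quantum schemes, so "easily a
# hundred times larger" is the right order of magnitude.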
For those who are interested in post-quantum cryptography, one scheme (which has an open source implementation) is NTRU. It is likely far too early to consider any such scheme in a production system, but implementing a patched version of Bitcoin on testnet which incorporates NTRU would be an interesting project.
Of the various lattice based cryptographic schemes that have been developed, the NTRU family of cryptographic algorithms appears to be the most practical...smallest key size...highest performance.