
Topic: 512-qubit Quantum Computer acquired, is bitcoin doomed? (Read 12209 times)

legendary
Activity: 868
Merit: 1006
This is basically science fiction entertainment for those who have too much free time. No, SHA256 is still safe, and if it were cracked, the banking system would collapse as well, since everything runs on SHA256 these days, from ATMs to the electronic payment systems each bank has implemented for its internet transactions. PayPal would get hit hard as well.
legendary
Activity: 3248
Merit: 1070
for those who are too busy/lazy to read D&T's statement, my interpretation is:

If QC starts to look like a threat to Bitcoin over the coming decades, a hard fork could be done into a new set of software and a new blockchain that is QC-proof. Basically a migration.

The only thing is that this migration has to be done early enough to prevent a (QC-proof) altcoin from taking over.  Wink

The problem I could see: if they don't disclose a possible working machine and keep it secret, this could be a real issue.
legendary
Activity: 1372
Merit: 1014
for those who are too busy/lazy to read D&T's statement, my interpretation is:

If QC starts to look like a threat to Bitcoin over the coming decades, a hard fork could be done into a new set of software and a new blockchain that is QC-proof. Basically a migration.

The only thing is that this migration has to be done early enough to prevent a (QC-proof) altcoin from taking over.  Wink
legendary
Activity: 2646
Merit: 1137
All paid signature campaigns should be banned.
Here is a great exchange from a "mass storage and other technical stuff" related email list that I subscribe to discussing this IBM breakthrough:

First were the questions:

Quote
As an ex IBM Research guy, this sounds like a product of the IBM Research PR dept. They are really good.  However, maybe some of you can answer the following dumb questions.
 
1. I thought the quandary of practical implementation of high-qubit computers was that the qubits all had to communicate with their environment (that is, other qubits), without communicating with their environment (the thermal sea of other interacting fluctuations leading to quantum decoherence). Does this IBM advance in what seems like ECC resolve this quandary?
2. The press release says: "Once a quantum computer is perfected it will not only be able to crack any encryption code today and make new uncrackable codes..."
I thought that Shor's algorithm, which showed an approach to factoring giant numbers into primes in polynomial time on quantum computers, caused concern about the current approach to asymmetric codes (public/private keys, which are really important), but other approaches to asymmetric codes (someone on this list mentioned knapsack) and also transpositional (hash) codes (Bitcoin being the infamous example) had no known algorithm for solving in polynomial time on quantum computers, so describing this advance as cracking any encryption code is overstated. So, is the press release overstated on cracking any encryption code today?
3. Is it proven that quantum computers can't solve transpositional codes or other codes (besides factoring primes) in polynomial time, or could it just be that we haven't discovered the algorithm?

Then the reply to the questions:

Quote
Hi Bob,

I don't have time right now to give a larger rundown, but:

1. Quantum Error Correction is a very well-researched field. See
Devitt's "QEC for beginners" paper, if you're interested:
http://arxiv.org/abs/0905.2794
Yes, you have the general characterization of the problem right -- you
want qubits that are easy to control but don't interact with the
environment, and those two characteristics are contradictory.
2. I'm actually not aware of any research on using quantum computers
to make better encryption schemes. The usual problem is that they have
confused quantum key distribution (QKD) with quantum computing, and
QKD doesn't exactly solve the problems created by Shor's algorithm.
Shor will impact authentication mechanisms, breaking RSA and friends,
but QKD in fact *still depends on authentication*, so it's not a fix.
As to other specific encryption algorithms, I'm not up on the details,
but there is an irregular series of conferences on post-quantum
cryptography, and I think the next one is here in Japan:
http://pqcrypto.org/
3. Same as above, don't know.

As systems people, a good place for you to start might be my CACM
article from 2013:
http://cacm.acm.org/magazines/2013/10/168172-a-blueprint-for-building-a-quantum-computer/fulltext
it should be open access, you shouldn't need an ACM membership to fetch it.

If you are interested in networks, in particular, in fact last year I
published a book on quantum repeater networks. My apologies for the
price, that wasn't my decision:
http://eu.wiley.com/WileyCDA/WileyTitle/productCd-1848215371.html

I was asked by someone else about the specific numbers in the press
release. Sorry, my answer is kind of technical and relates to the
surface code form of error correction, but:

Are these 8, 13, 17, 49 some kind of magic numbers?

I'm not sure where the 8 comes from (other than that's what they think
they can make), but the 13 and 17 very likely come from Clare's
lattice surgery paper:
http://iopscience.iop.org/1367-2630/14/12/123011/article;jsessionid=B9EB904155288C40375C2DEE81165F77.c2

17 is enough to demonstrate distance 3 (d=3) surface code qubit,
including all of the stabilizer syndrome qubits. If you reuse some of
the syndrome qubits, you can do it with 13 physical qubits instead,
but then you have to wait longer for one cycle of QEC, so it's only a
win if your memory is high fidelity. At distance 3, you can *really,
truly* correct *any* single error that occurs on your qubits.

I'm not sure where the 49 comes from. Shota, is 49 the right number
for some larger lattice?

53 is the number you really want: arrange them as in Fig. 18 in
Clare's paper, and you can do a CNOT between two d=3 logical qubits,
and you would have the world's first true, error-protected quantum
*computation*. I would guess that 53 is the number that both Google
and IBM are really shooting for in the next five years or less...

So the IBM work is a very big advance. The PR is even better.

--Rod

Plus a follow on posting:

Quote
btw, one of the big reasons that quantum error correction is hard is
that extracting syndromes requires touching the qubits. We are
accustomed to thinking about error correction as something that is
done after the error channel itself, e.g. after transmission or
reading from a disk, and that the error correction process itself
doesn't introduce errors beyond the mathematical limitations. But
imagine if the circuit that calculates your syndromes itself has an
error rate of several percent and might accidentally overwrite even
the already error-prone data you are trying to correct. That's what
it's like in quantum.

In the research literature, you sometimes read papers that say that
errors can be corrected up to about 10%, or even 50% in some cases,
but those are hypothetical systems in which the extraction of
syndromes is perfect -- very far from the real world.

--Rod
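
A quick sketch of where those 17 and 13 come from (mine, not from Rod's email, and using the standard rotated surface code bookkeeping as an assumption): a distance-d patch uses d^2 data qubits plus d^2 - 1 syndrome qubits, and can always correct floor((d-1)/2) arbitrary errors, so d = 3 gives 17 qubits and any single error corrected.

Code:
# Rotated surface code bookkeeping (a sketch; not from the quoted email).
# A distance-d patch uses d**2 data qubits plus d**2 - 1 syndrome qubits and
# can always correct up to floor((d - 1) / 2) arbitrary single-qubit errors.

def surface_code_patch(d):
    data_qubits = d * d                  # qubits holding the logical state
    syndrome_qubits = d * d - 1          # ancillas used to extract syndromes
    correctable_errors = (d - 1) // 2    # worst-case errors always correctable
    return data_qubits + syndrome_qubits, correctable_errors

for d in (3, 5, 7):
    total, t = surface_code_patch(d)
    print(f"d={d}: {total} physical qubits per logical qubit, corrects any {t} error(s)")

# d=3 gives the 17 qubits cited above (reusing syndrome qubits trims it to 13
# at the cost of a longer QEC cycle), and "any single error" really means one.
# Note that d=5 gives 49, which may be where that figure comes from.
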
AGD
legendary
Activity: 2070
Merit: 1164
Keeper of the Private Key
Nice article.
I think the Skynet robots will keep using Bitcoins after they have destroyed the human race, because they will think it's perfect. I don't see any problem for Bitcoin here. Chill.
legendary
Activity: 2674
Merit: 2965
Terminated.
IBM reports it works well for almost 15 seconds before bursting into a ball of flames. They expect great results as soon as they work out a way to bury it in the core of the ice planet Neptune.
This just means that the news is useless. They haven't figured out a 'working quantum computer'. How can you say that something is working if it isn't stable for longer than 15 seconds?
I wouldn't worry about quantum computers right now. Maybe in the future we will have to consider changing algorithms.
full member
Activity: 152
Merit: 102
"There is as yet insufficient data for a meaningful answer."
legendary
Activity: 2156
Merit: 1393
You lead and I'll watch you walk away.

IBM reports it works well for almost 15 seconds before bursting into a ball of flames. They expect great results as soon as they work out a way to bury it in the core of the ice planet Neptune.
legendary
Activity: 3976
Merit: 1421
Life, Love and Laughter...
full member
Activity: 193
Merit: 100
Quantum computing = the "hydrogen-powered car" of computer research.

Always "just around the corner", lots of hype and FUD, but never quite moving beyond a technical curiosity. The only way quantum computing could generate more baseless hype is if someone ports the litecoin client to run on a D-Wave box  Smiley

Quantum computing will be big for many things, but cracking bitcoin keys - or running Windows 8 -  are probably not two of them.

But hydrogen cars are already operational in Germany.
They even produce solid hydrogen now.
I think it might catch on in the next decade, as countries like Japan and some in Europe will want to get away from oil.
member
Activity: 84
Merit: 10
No need to nuke anyone.  Just blow up some nukes in space, and have the beautiful EMP burn all technology alive.  Then, we won't have to worry about any form of hacking for many years to come!
full member
Activity: 238
Merit: 100
Wow, he makes Iranian nuclear bombs look like the last straw of hope for the human race.
member
Activity: 84
Merit: 10
No chance.  I don't see the encryption on Bitcoin being beaten in our lifetimes, or our children's.  Maybe in the very distant future.  But if Bitcoin is doomed, EVERY SINGLE secure process on the internet is too.
legendary
Activity: 2156
Merit: 1393
You lead and I'll watch you walk away.
To summarize, the OP is wrong. Bitcoin is not doomed, because 256 bit keys are beyond brute force using anything that can be manufactured, and we could always fork Bitcoin to 512 bit keys, which is beyond beyond brute force, but that would create a blockchain requiring a hard drive the size of a Chevy van. lol
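
For a sense of scale, here is a back-of-the-envelope sketch of that claim. The 10^18 guesses per second figure is my own assumption for illustration; the best known classical attack on a 256 bit ECDSA key costs roughly 2^128 operations, and a raw sweep of the key space costs 2^256.

Code:
# Back-of-the-envelope brute-force times.  Assumption for illustration: an
# attacker testing 1e18 keys per second, far beyond any real hardware today.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
GUESSES_PER_SECOND = 1e18

def years_to_search(bits):
    """Years to try 2**bits possibilities at the assumed rate."""
    return (2 ** bits) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"2^128 work (best known classical attack on 256-bit ECDSA): {years_to_search(128):.1e} years")
print(f"2^256 key space (raw brute force): {years_to_search(256):.1e} years")
# Roughly 1e13 years and 4e51 years respectively -- beyond brute force indeed.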
hero member
Activity: 508
Merit: 500
I found the paper in May that put two different qubit chips up against software and hardware solvers in a very specific class of problem, and the results were that with a problem that is most suitable for the chip it found a solution in half a second, several thousand times faster than the best traditional methods.

There were a lot of ifs, buts, and exceptions; however, the V5 and V6 chips tested, when given the right kind of problem, were indeed able to solve it (it was an annealing problem) in a grand flash.

A clear-eyed summary of that May paper and the D-Wave devices is here: http://spectrum.ieee.org/computing/hardware/dwaves-year-of-computing-dangerously

I've no doubt that quantum computing is going to be an arms race, and it will start to solve more problems in parallel over time. Whether that includes searching for keys or cryptography I've no idea, but if it does you CAN BET THAT THE NSA WILL NOT TELL US ABOUT IT.


One thing I have pointed out in the past (and likely will need to continue to point out for a very long time) is that D-Wave's system implements (or is stated to implement; there is some controversy over exactly what is happening) a quantum annealing algorithm, which, while it does (or may) have some quantum properties, is not the same thing as a general purpose quantum computer.

To attack public key encryption requires a general purpose quantum computer (GPQC) capable of implementing Shor's algorithm (or one built to implement ONLY Shor's algorithm, a quantum "ASIC" if you will).  Even if D-Wave's computer had a quadrillion qubits it would be absolutely useless for attacking public key encryption.

Successfully factoring larger numbers using a GPQC and Shor's algorithm would be a pretty big deal and earn global recognition.  To date nobody has publicly shown the ability to factor a number larger than 143 (an 8 bit number).  Progress has also been agonizingly slow:

2001
First successful demonstration of Shor's algorithm.
Computer size:  7 qubits
Number factored: 15 (into the factors 5 & 3)
Equivalent RSA key size: 4 bits

2011
First successful factorization of a three digit number (143)
Computer size:  8 qubits (actually 4 qubits using an iterative process)
Number factored: 143 (into the factors 11 & 13)
Equivalent RSA key size: 8 bits

So the bit strength of "encryption" (I use this term loosely because one could crack 8 bit encryption with paper and pencil) has roughly doubled after a decade of research.  Even if growth continues it would be 80+ years before even current RSA keys could be compromised using quantum computing, and moving to larger keys is always possible.  ECDSA is a little different from RSA in that it doesn't involve factoring large numbers and instead uses the properties of elliptic curves.  In general ECC based keys have a higher bit strength than equivalent sized RSA keys.  This is one reason that ECC was used over RSA for Bitcoin.  To achieve 128 bit security (which is considered beyond brute force for classical computing) requires a 256 bit ECDSA key but a 3,072 bit RSA key.  Using RSA would be just as secure but transactions would be up to 12x as large.

Still both RSA and ECC in theory can be broken faster than what is possible with classical computing by using a large enough (and fast enough) quantum computer implementing Shor's algorithm. I have only found one paper to date which compares the qubit cost of implementing Shor's algorithm against both RSA & ECC.

Quote
6.3 Comparison with the quantum factoring algorithm
One of the main points of this paper is that the computational “quantum advantage”
is larger for elliptic curve discrete logarithms than for the better known
integer factoring problem. With our proposed implementation we have in particular
achieved similar space and time requirements. Namely the number of qubits needed is
also of O(n) and the number of gates (time) of order O(n^3),
although in both cases
the coefficient is larger. Note that the input size n is also the key size for RSA resp.
ECC public key cryptography. Because the best known classical algorithms for breaking
ECC scale worse with n than those for breaking RSA, ECC keys with the same computational
security level are shorter. Below is a table with such key sizes of comparable security
(see e.g. [25]). The column to the right roughly indicated the classical computing resources
necessary in multiples of C, where C is what’s barely possible today (see. e.g. the RSA
challenges [26] or the Certicom challenges [27]). Breaking the keys of the last
line [15,360 bit RSA or 512 bit ECDSA] seems to be beyond any conceivable classical computation,
at least if the presently used algorithms can’t be improved.


Summary:
Breaking 256 bit ECDSA (128 bit key strength) requires 1800 logical qubits and a time of 6.0*10^9 operations.
Breaking 512 bit ECDSA (256 bit key strength) requires 3600 logical qubits and a time of 5.0*10^9 operations.


Where f(n) and f'(n) are as in section 6.2 with ε = 10. The time for the
quantum algorithms is listed in units of “1-qubit additions”, thus the number
of quantum gates in an addition network per length of the registers involved.
This number is about 9 quantum gates, 3 of which are the (harder to implement)
Toffoli gates (see e.g. [5]). Also it seems very probable that for large scale
quantum computation error correction or full fault tolerant quantum computation
techniques are necessary. Then each of our “logical” qubits has to be encoded
into several physical qubits (possibly dozens) and the “logical” quantum gates
will consist of many physical ones.
Of course this is true for both quantum
algorithms and so shouldn't affect the above comparison. The same is true for
residual noise (on the logical qubits) which will decrease the success probability
of the algorithms. The quantum factoring algorithm (RSA) may have one advantage,
namely that it seems to be easier to parallelise.

I bolded the important portions.  So we are looking at a general purpose quantum computer which needs at least 1,800 logical qubits.  Nobody (AFAIK) has even done research on the amount of error correction required for a commercial general purpose quantum computer (because none exists), but let's take the author's estimate of "dozens of physical qubits per logical qubit" and use 24x as a low end.   That is 1,800 * 24 = 43,200 physical qubits.  To put that into perspective, the largest (AFAIK, please correct me if you find a larger example) general purpose quantum computer capable of implementing Shor's algorithm is ... 7 qubits.

Still this may happen within our lifetime.  There are a couple of things which can be done.  The simplest would be to implement a new address type which uses a larger ECC curve.  512 bit keys double the required qubits and 1,024 bit keys quadruple them.  If GPQC proves viable this would buy the network some time for a longer term solution.  I don't know if 1,024 bit ECC even exists, simply because 256 bit keys are considered beyond brute force and 512 bit is beyond beyond brute force.  A longer term and more complex solution would be moving to address schemes based on post-quantum algorithms (http://en.wikipedia.org/wiki/Post-quantum_cryptography).  The largest issue with these systems, other than the fact that they have no widespread deployment and thus may have unknown flaws, is that they generally involve much, much larger public keys.  The bandwidth and storage requirements of Bitcoin are highly correlated to the size of the public key, and we are talking about public keys in the kilobytes (notice the plural) range, so 80x to 200x larger than current 256 bit public keys.   Now it is entirely possible that in the future the effect of Moore's law on available storage and bandwidth will mean that 50,000 bytes in 2048 is no larger than 256 bytes today on a relative basis.





That's a bit reassuring! Smiley
donator
Activity: 1218
Merit: 1079
Gerald Davis
I found the paper in May that put two different qubit chips up against software and hardware solvers in a very specific class of problem, and the results were that with a problem that is most suitable for the chip it found a solution in half a second, several thousand times faster than the best traditional methods.

There were a lot of ifs, buts, and exceptions; however, the V5 and V6 chips tested, when given the right kind of problem, were indeed able to solve it (it was an annealing problem) in a grand flash.

A clear-eyed summary of that May paper and the D-Wave devices is here: http://spectrum.ieee.org/computing/hardware/dwaves-year-of-computing-dangerously

I've no doubt that quantum computing is going to be an arms race, and it will start to solve more problems in parallel over time. Whether that includes searching for keys or cryptography I've no idea, but if it does you CAN BET THAT THE NSA WILL NOT TELL US ABOUT IT.


One thing I have pointed out in the past (and likely will need to continue to point out for a very long time) is that D-Wave's system implements (or is stated to implement; there is some controversy over exactly what is happening) a quantum annealing algorithm, which, while it does (or may) have some quantum properties, is not the same thing as a general purpose quantum computer.

To attack public key encryption requires a general purpose quantum computer (GPQC) capable of implementing Shor's algorithm (or one built to implement ONLY Shor's algorithm, a quantum "ASIC" if you will).  Even if D-Wave's computer had a quadrillion qubits it would be absolutely useless for attacking public key encryption.

Successfully factoring larger numbers using a GPQC and Shor's algorithm would be a pretty big deal and earn global recognition.  To date nobody has publicly shown the ability to factor a number larger than 143 (an 8 bit number).  Even factoring 32 bit numbers at commercially viable speeds would still leave us decades away from being able to break 256 bit ECC keys, and we aren't even close to that.  While QC is powerful in theory, practical progress has been agonizingly slow:

2001
First successful demonstration of Shor's algorithm.
Computer size:  7 qubits
Number factored: 15 (into the factors 5 & 3)
Equivalent RSA key size: 4 bits

2011
First successful factorization of a three digit number (143)
Computer size:  8 qubits (actually 4 qubits using an iterative process)
Number factored: 143 (into the factors 11 & 13)
Equivalent RSA key size: 8 bits

So the bit strength of "encryption" (I use this term loosely because one could crack 8 bit "encryption" with paper and pencil) that can be broken using QC has gone from 4 bits to 8 bits after a decade of research.  Even if this can scale, at the current growth rate breaking Bitcoin would be more than a lifetime away.  ECDSA is a little different from RSA in that it doesn't involve factoring large numbers and instead uses the properties of elliptic curves, but both can be broken using Shor's algorithm.  In general ECC requires smaller keys than RSA for an equivalent level of security.  To achieve 128 bit key strength (which is considered beyond brute force for classical computing) using ECDSA requires 256 bits, but using RSA requires 3,072 bits. Bitcoin could use RSA and be just as secure; however, transactions would be up to 12x as large.
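
To make that size comparison concrete, here is a minimal sketch using the commonly cited security-level equivalences (the figures below are taken as assumptions from those public tables):

Code:
# Key sizes commonly cited as giving comparable security (used here as
# assumptions), and the resulting size penalty for RSA versus ECC.

EQUIVALENT_KEY_BITS = {
    # security level in bits: (ECC key bits, RSA key bits)
    128: (256, 3072),
    256: (512, 15360),
}

for security, (ecc_bits, rsa_bits) in EQUIVALENT_KEY_BITS.items():
    print(f"{security}-bit security: {ecc_bits}-bit ECC vs {rsa_bits}-bit RSA "
          f"(RSA keys ~{rsa_bits // ecc_bits}x larger)")

# At the 128-bit level RSA keys are 12x larger, which is where the
# "transactions would be up to 12x as large" figure comes from.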

Still both RSA and ECC in theory can be broken faster than what is possible with classical computing by using a large enough (and fast enough) quantum computer implementing Shor's algorithm. I have only found one paper to date which compares the qubit cost of implementing Shor's algorithm against both RSA & ECC.

Quote
6.3 Comparison with the quantum factoring algorithm

One of the main points of this paper is that the computational “quantum advantage”
is larger for elliptic curve discrete logarithms than for the better known
integer factoring problem. With our proposed implementation we have in particular
achieved similar space and time requirements. Namely the number of qubits needed is
also of O(n) and the number of gates (time) of order O(n^3),
although in both cases
the coefficient is larger. Note that the input size n is also the key size for RSA resp.
ECC public key cryptography. Because the best known classical algorithms for breaking
ECC scale worse with n than those for breaking RSA, ECC keys with the same computational
security level are shorter. Below is a table with such key sizes of comparable security
(see e.g. [25]). The column to the right roughly indicated the classical computing resources
necessary in multiples of C, where C is what’s barely possible today (see. e.g. the RSA
challenges [26] or the Certicom challenges [27]). Breaking the keys of the last
line [15,360 bit RSA or 512 bit ECDSA] seems to be beyond any conceivable classical computation,
at least if the presently used algorithms can’t be improved.


Summary:
Breaking 256 bit ECDSA (128 bit key strength) requires 1800 logical qubits and a time of 6.0*10^9 operations.
Breaking 512 bit ECDSA (256 bit key strength) requires 3600 logical qubits and a time of 5.0*10^9 operations.


Where f(n) and f'(n) are as in section 6.2 with ε = 10. The time for the
quantum algorithms is listed in units of “1-qubit additions”, thus the number
of quantum gates in an addition network per length of the registers involved.
This number is about 9 quantum gates, 3 of which are the (harder to implement)
Toffoli gates (see e.g. [5]). Also it seems very probable that for large scale
quantum computation error correction or full fault tolerant quantum computation
techniques are necessary. Then each of our “logical” qubits has to be encoded
into several physical qubits (possibly dozens) and the “logical” quantum gates
will consist of many physical ones.
Of course this is true for both quantum
algorithms and so shouldn't affect the above comparison. The same is true for
residual noise (on the logical qubits) which will decrease the success probability
of the algorithms. The quantum factoring algorithm (RSA) may have one advantage,
namely that it seems to be easier to parallelise.
http://arxiv.org/pdf/quantph/0301141.pdf (emphasis of relevant portions is mine).

So we are looking at a general purpose quantum computer which needs at least 1,800 logical qubits.  Nobody (AFAIK) has even done research on the amount of error correction required for a commercial general purpose quantum computer (because none exists), but let's take the author's estimate of "dozens of physical qubits per logical qubit" and use 24x as a low end.   This would mean (1,800 * 24) a system with 43,200 or more physical qubits.  To put that into perspective, the largest (AFAIK, please correct me if you find a larger example) general purpose quantum computer capable of implementing Shor's algorithm is ... 7 qubits, so we aren't even in striking range yet.

Still, QC may break 256 bit ECDSA within our lifetime, so we should be looking at future solutions.  There are a couple of things which can be done to improve the quantum security of the Bitcoin network.  The simplest change would be to implement a new address type which uses a larger ECC curve.  While larger ECDSA keys can still be broken, they do increase the qubit requirements: moving to 512 bit keys doubles the required qubits and 1,024 bit keys quadruple them.  If GPQC proves viable this would buy the network some time for a longer term solution.  I don't even know if 1,024 bit ECC is supported by major libraries; outside of quantum computing there is little need for it, because even 256 bit ECC is considered beyond brute force.
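
A minimal sketch of that arithmetic, treating the numbers purely as assumptions (1,800 logical qubits at 256 bits per the quoted paper, logical-qubit count scaling linearly with key size, and roughly 24 physical qubits per logical qubit for error correction):

Code:
# Qubit estimates for breaking ECDSA with Shor's algorithm.  Assumptions only:
# 1,800 logical qubits for a 256-bit key (per the quoted paper), O(n) scaling
# with key size, and ~24 physical qubits per logical qubit for error correction.

LOGICAL_QUBITS_AT_256 = 1800
PHYSICAL_PER_LOGICAL = 24          # "dozens" per logical qubit, low-end guess

def qubits_to_break(ecc_key_bits):
    logical = LOGICAL_QUBITS_AT_256 * ecc_key_bits // 256   # linear scaling
    return logical, logical * PHYSICAL_PER_LOGICAL

for bits in (256, 512, 1024):
    logical, physical = qubits_to_break(bits)
    print(f"{bits}-bit ECDSA: ~{logical} logical qubits, ~{physical} physical qubits")

# 256-bit -> 1,800 logical / 43,200 physical; 512-bit keys double that and
# 1,024-bit keys quadruple it.  Compare with the ~7-qubit machines shown so far.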

A longer term and more complex solution would be moving to an address scheme based on a post-quantum algorithm (http://en.wikipedia.org/wiki/Post-quantum_cryptography).  These systems generally have no widespread deployment, so there may be unknown security issues; I am not advocating a change anytime soon, just pointing out that there are algorithms which could be used to protect the network even if large GPQCs become available.   The largest obstacle to these post-quantum systems is that they generally involve much, much larger public keys.  The bandwidth and storage requirements of Bitcoin are highly correlated to the size of the public key, and we are talking about public keys in the tens of kilobytes (notice the plural), so transactions and blocks would easily be a hundred times larger.   Now it is entirely possible that Moore's law will mitigate this somewhat, and that using a 50,000 byte key in 2048 will be no more of a challenge than using a 256 bit key today.
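
For a rough feel for that bandwidth point, here is a tiny sketch. Both numbers are assumptions for illustration: roughly 250 bytes for a simple transaction today, and a hypothetical post-quantum public key of 25 KB (somewhere in the "tens of kilobytes" range, not tied to any specific scheme):

Code:
# Illustrative impact of post-quantum public keys on transaction size.
# Assumptions: ~250 bytes for a typical simple transaction today, and a
# hypothetical ~25 KB post-quantum public key (no specific scheme intended).

TX_BYTES_TODAY = 250
PQ_PUBKEY_BYTES = 25_000

pq_tx_bytes = TX_BYTES_TODAY + PQ_PUBKEY_BYTES     # the key dominates the rest
print(f"Post-quantum transaction: ~{pq_tx_bytes} bytes, "
      f"~{pq_tx_bytes / TX_BYTES_TODAY:.0f}x today's size")

# Roughly a hundred times larger per key-bearing transaction, so blocks,
# bandwidth, and storage would grow in proportion.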

For those who are interested in post-quantum cryptography, one scheme (which has an open source implementation) is NTRU.  It is likely far too early to consider any such scheme in a production system, but implementing a patched version of Bitcoin on testnet which incorporates NTRU would be an interesting project.

Quote from: “Quantum Resistant Public Key Cryptography” - NIST
Of the various lattice based cryptographic schemes that have been developed, the NTRU family of cryptographic algorithms appears to be the most practical...smallest key size...highest performance.
legendary
Activity: 1176
Merit: 1280
May Bitcoin be touched by his Noodly Appendage
I'd like to see papers about that
hero member
Activity: 518
Merit: 521
I will bring my linked blog into play here on my Theory of Everything.

Quantum superposition is due to the emergent derivation that matter is relative to itself.

When we measure something, i.e. form coherence, it is only valid in our local coherent delusion because it is always aliasing error on the universal scale.

Traditional CMOS electronics constrain quantum effects to local measurement components, i.e. transistors, so the potential increased global degrees of freedom due to entanglement across the entire circuit are destroyed, i.e. made coherent, by quantizing each logic gate locally within the circuit. In other words, premature coherence.

A black hole is superpositioned matter. If we could superposition ourselves, we wouldn't be perceivable (not coherent) in a local reality.

Entanglement doesn't mean two quantum particles are joined across great distance, rather that can be an aliasing error effect that we perceive in the local 3D context. It means that quantum particles (actually waves) can interact on universal global scale (i.e. distance is irrelevant). You see in the superpositioned universe, there are infinite dimensions thus 3D is just a local delusion. When I use the terms local and global I am not referring to distance in 3D, rather to limited versus unlimited perspectives (aka samples or measurements). In superpositioned matter there are infinite perspectives thus no one single coherent realization and thus the black hole-- maximally disordered ether. Now you know what anti-matter is.

This isn't hand waving. This is the only description of the universe that can be logically globally coherent. I discussed these concepts in another thread and another one.

So the challenge of quantum computing is to build a circuit that allows computation to proceed in the maximally disordered ether, while quantizing only the result of the computation.

The arguable criticism of the D-Wave system is it may not be enabling entanglement between the qubits, i.e. they are just akin to supercooled transistors which locally have two (finite) superpositioned states each.

Edit: many details omitted for brevity.
hero member
Activity: 508
Merit: 500
That's a bit scary! :/