
Topic: Researcher Claims to Crack RSA-2048 With Quantum Computer - page 3.

sr. member
Activity: 1190
Merit: 469

And if you have any evidence, post it here in this thread for all to see.
he won't even answer my basic questions so good luck getting this guy to post any actual evidence.

Quote
But practically, how would reversible computing work? To avoid burning energy on erasing data, you'd have to keep the result of every operation. How would caching work, a technology that is vital for keeping our computers fast and entirely based on erasing data when it needs space for new data?
yeah he doesn't know how to answer a question like that. he just believes... but i'll give it a shot. one of the issues, it seems to me, is the one you bring up: it would increase the complexity of the computer's hardware architecture to have to "keep the result of every operation". probably not cost effective. you don't get something for nothing but let's hear what Mr. Ph.D. has to say. he's the authority.

Quote
What would your computer do once you've streamed a full movie online and have generated terabytes of raw screen data? Do pixels on the screen not "count" in your energy budgets, since they are outside the CPU? They are still erased dozens of times per second, millions of them.
he won't know how to answer that question i'm pretty sure unless he just links you to some summary of some research paper that they did to avoid losing tenure...

since I'm the OP I wish i could put him on moderation in this thread to kind of temper his enthusiasm for "reversible computing". but i guess we don't have that feature.  Grin
full member
Activity: 162
Merit: 230
Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit; the actual bit-erasure energy is basically insignificant.
That is a good question.

Disclaimer: I am not a physicist or an engineer specializing in reversible computing hardware.

1. Reversible computation will become essential long before the energy efficiency of computation approaches k*T*ln(2). Regardless of what kind of hardware one uses, if one wants to delete a bit of information, one should spend at least 100*k*T energy per bit deletion in order to overcome thermal noise. We should therefore expect reversible computing hardware to outperform conventional computing hardware at the level of >100*k*T.

Wouldn't reversible computation also have to overcome thermal noise? Since the energy not related to bit erasure significantly dominates the energy usage of current technology, I would assume reversible computing would also have such large "hidden" factors.

2. The paper arXiv:1701.08202 indicates that a specific kind of reversible mechanical computation can perform calculations at about 4*10^(-26) Joules per cycle at a rate of 100 MHz. This is several orders of magnitude below Landauer's limit. The catch is that such a device is an engineering nightmare to build. Fortunately, there are other proposals for energy efficient reversible computing hardware that could be manufactured with something similar to our modern photolithography.

Did you cite the right paper? There is nothing in it about doing computation at that energy level or at a rate of 100 MHz; the whole paper instead seems to be about how one can more effectively simulate mechanical computational devices on current hardware.

3. In biology, there are already examples of reversible computation (the biosynthesis of messenger RNA). The logical and thermodynamic reversibility of this process has been described in the paper Logical Reversibility of Computation by Charles Bennett. So reversible computation happens in nature. It should therefore be physically possible to manufacture logically and thermodynamically reversible (and partially reversible) computers.

If this is not enough for you, I can send you evidence that reversible computation is underrated.

-Joseph Van Name Ph.D.

Do you know the power usage of these biological processes? I don't have journal access so I can't read that paper.

And if you have any evidence, post it here in this thread for all to see.


But practically, how would reversible computing work? To avoid burning energy on erasing data, you'd have to keep the result of every operation. How would caching work, a technology that is vital for keeping our computers fast and entirely based on erasing data when it needs space for new data?

What would your computer do once you've streamed a full movie online and have generated terabytes of raw screen data? Do pixels on the screen not "count" in your energy budgets, since they are outside the CPU? They are still erased dozens of times per second, millions of them.
member
Activity: 691
Merit: 51
Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit; the actual bit-erasure energy is basically insignificant.
That is a good question.

Disclaimer: I am not a physicist or an engineer specializing in reversible computing hardware.

1. Reversible computation will become essential long before the energy efficiency of computation approaches k*T*ln(2). Regardless of what kind of hardware one uses, if one wants to delete a bit of information, one should spend at least 100*k*T energy per bit deletion in order to overcome thermal noise. We should therefore expect reversible computing hardware to outperform conventional computing hardware at the level of >100*k*T.

2. The paper arXiv:1701.08202 indicates that a specific kind of reversible mechanical computation can perform calculations at about 4*10^(-26) Joules per cycle at a rate of 100 MHz. This is several orders of magnitude below Landauer's limit. The catch is that such a device is an engineering nightmare to build. Fortunately, there are other proposals for energy efficient reversible computing hardware that could be manufactured with something similar to our modern photolithography.

3. In biology, there are already examples of reversible computation (the biosynthesis of messenger RNA). The logical and thermodynamic reversibility of this process has been described in the paper Logical Reversibility of Computation by Charles Bennett. So reversible computation happens in nature. It should therefore be physically possible to manufacture logically and thermodynamically reversible (and partially reversible) computers.

If this is not enough for you, I can send you evidence that reversible computation is underrated.

-Joseph Van Name Ph.D.
full member
Activity: 162
Merit: 230
For the purposes of this post and all future posts, when I refer to reversible computation, I am referring to classical partially reversible computation in order to perform calculations more energy efficiently.

Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit; the actual bit-erasure energy is basically insignificant.
sr. member
Activity: 1190
Merit: 469
Well, since we do not have energy efficient reversible computers yet, this is theoretical work. Do you have a problem with scientific theories? Where are you going with this? I hope you know that before anything becomes practical it is theoretical. If you do not like scientific theories, then you do not like progress.
yeah i do have a problem with silly theories like people thinking one day they will be able to teleport themselves to anywhere on the planet without getting in a car. people can get some really silly notions based off of silly scientific theories they hear about that will never come to pass.

https://en.wikipedia.org/wiki/Teleportation

And can you give an example of when "deleting bits of information" is actually done in modern computer hardware?

Quote
Furthermore, in a previous comment, I told you and everyone else what I meant by deleting bits of information. By deleting information, I was referring to irreversibly replacing each of those bits of information with zeros. But in a hard drive, when you 'delete' information, you simply remove the pointer to that information, which does not require k*T*ln(2) energy per bit of information in the location the pointer refers to; but when you overwrite the information with other information, then you will need to spend at least k*T*ln(2) energy per bit overwritten.
i'm the one that started this thread you know. you wouldn't be here talking in it if it wasn't for me. just remember that. now, about the overwriting information thing, i'm not disagreeing about that. in fact you just pretty much confirmed my point.

Quote
No. Landauer's principle does come into play because when you overwrite information, you have to spend that >>k*T*ln(2) energy per bit overwritten.
remove that ">>" part and i might be more likely to agree with you but when you put the ">>" thing in there it makes me think you're saying it requires like double the energy. once to erase and once to write, which i'm not sure about. so you're saying that the overwriting process is not one step but two distinct steps. erasing and writing. i don't know about that.

Quote
Oh. And if CPUs and GPUs did not delete any information, we would already have reversible computers.
i would love to hear an explanation about why you think that way. but as you pointed out, this thread was about quantum computers. not imaginary things like reversible ones.


Quote
-You are really starting to )1$$ me off. I have a Ph.D. in Mathematics.
maybe you do but that doesn't mean all of your beliefs are not subject to scrutiny especially if you go around posting them on unrelated threads. this thread was about quantum not reversible but thanks for bringing up the topic since i hadn't heard about it before. Shocked

member
Activity: 691
Merit: 51
and you know that, how? from experience? or from theory?
Well, since we do not have energy efficient reversible computers yet, this is theoretical work. Do you have a problem with scientific theories? Where are you going with this? I hope you know that before anything becomes practical it is theoretical. If you do not like scientific theories, then you do not like progress.

And can you give an example of when "deleting bits of information" is actually done in modern computer hardware?
Hard drives are for long term storage of information instead of for computation. Right now, a typical hard drive may have 1 TB of storage space; deleting all of that information at Landauer's limit would cost 2.4*10^(-8) Joules, and even at 1000 times Landauer's limit (to cover thermal noise), it would still cost only 2.4*10^(-5) Joules. But what about deleting information in the arithmetic logic unit in a CPU? Do you honestly expect that CPUs and GPUs perform all those calculations without throwing away a few bits now and then? Because if that were the case, then those CPUs and GPUs would already be reversible.
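The figures in the paragraph above are easy to sanity-check with a few lines (a sketch assuming room temperature, T = 300 K; the 1000x factor is the thermal-noise margin mentioned in the post):

```python
import math

k = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                           # room temperature, K
landauer = k * T * math.log(2)      # minimum energy to erase one bit, J

bits = 8e12                         # 1 TB = 8 * 10^12 bits
print(f"per bit:        {landauer:.2e} J")                 # ~2.87e-21 J
print(f"whole drive:    {landauer * bits:.2e} J")          # ~2.30e-08 J
print(f"at 1000x limit: {1000 * landauer * bits:.2e} J")   # ~2.30e-05 J
```

This lands close to the 2.4*10^(-8) and 2.4*10^(-5) Joule figures quoted above; the small difference is just the choice of temperature.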

Furthermore, in a previous comment, I told you and everyone else what I meant by deleting bits of information. By deleting information, I was referring to irreversibly replacing each of those bits of information with zeros. But in a hard drive, when you 'delete' information, you simply remove the pointer to that information, which does not require k*T*ln(2) energy per bit of information in the location the pointer refers to; but when you overwrite the information with other information, then you will need to spend at least k*T*ln(2) energy per bit overwritten.

you see, this landauer's principle you're talking about doesn't even really come into play because nothing is being "deleted" it's just being overwritten. you don't have to erase data first before you write over it. news flash.  Shocked
No. Landauer's principle does come into play because when you overwrite information, you have to spend that >>k*T*ln(2) energy per bit overwritten. Please stop denying scientific facts. Oh. And if CPUs and GPUs did not delete any information, we would already have reversible computers. Please try to learn what you are talking about before making a spectacle of yourself.

no its because in practice there is no reason for computations to be reversible. for example, imagine what bitcoin would be like if we required sha256 to be reversible. bitcoin wouldn't even exist. but i guess you think there is some reverse inverse function for sha256. there's not.
-You are really starting to )1$$ me off. I have a Ph.D. in Mathematics, so I @#$%ing know what an invertible function is you @$$hole. You are clearly the one who knows absolutely nothing about anything. SHA-256 is not an invertible function. And that means absolutely nothing at all. I ALREADY TOLD YOU THAT WE DO NOT NEED COMPLETE REVERSIBILITY FOR REVERSIBLE COMPUTATION TO BE USEFUL. WE ONLY NEED PARTIAL REVERSIBILITY FOR REVERSIBLE COMPUTATION TO BE USEFUL AND DOMINANT. Have you even looked at the circuit for SHA-256? Do you know about the properties that cryptographic hash functions are supposed to satisfy? Cryptographic hash functions are supposed to be COLLISION RESISTANT. That means that for a cryptographic hash function H, even though there are in principle distinct inputs w,x where H(w)=H(x), in practice, one should not be able to find two distinct inputs w,x with H(w)=H(x). Collision resistance is a weakened form of invertibility because it means that in practice, we are not able to find a specific example of non-invertibility (though it is easy to establish the non-invertibility without producing an example). Now, how do we build collision resistant and efficient cryptographic hash functions? That is right. We use mostly reversible components or at least components that can be made (mostly) reversible with a moderate computational overhead.
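To make the collision-resistance point concrete, here is a toy sketch (my addition, not from the post): a birthday search finds a collision on a 16-bit *truncation* of SHA-256 after only a few hundred hashes, while the same search on the full 256-bit digest would need on the order of 2^128 hashes, which is why no collision has ever been exhibited in practice.

```python
import hashlib
from itertools import count

def truncated_sha256(data: bytes, bits: int) -> int:
    """First `bits` bits of SHA-256(data), as an integer."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

# Birthday search: a collision in a 16-bit output space shows up after
# roughly 2^8 = 256 hashes on average.
seen = {}
for i in count():
    h = truncated_sha256(str(i).encode(), 16)
    if h in seen:
        a, b = seen[h], i       # two distinct inputs, same truncated hash
        break
    seen[h] = i

assert a != b
assert truncated_sha256(str(a).encode(), 16) == truncated_sha256(str(b).encode(), 16)
print(f"collision found after {b} hashes")
```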

Now, the designers and standardizers of SHA-256 (that is, the NSA and NIST) made a big mistake since they did not design SHA-256 to run on reversible hardware or software, so I have to give the NSA and NIST both an F- on their design of SHA-256. Yes. It was important for the NSA to design hash functions for partially reversible computers, but they did not do this. NIST should have also standardized other cryptographic functions as a preemptive measure for unknown (at the time) applications where a standard function would be needed for a certain piece of technology. Cryptographic hash functions were never designed for cryptocurrency mining, and it turns out that for cryptocurrency mining, you do not really need a function that takes arbitrary input and returns a 256 bit output. A 32 bit keyed permutation would be sufficient as the main part of a cryptocurrency mining algorithm, but such a permutation should be designed for reversible computation. Oh. And NIST should have standardized a cryptographic hash function that is as anti-reversible as possible. After all, reversible computation is a dangerous technology that will power the AI we should be concerned about, and a reasonable way to delay progress in reversible computing technologies would be to design a cryptographic hash function that resists reversible computation to the extent that is feasible.

that's because we have AI like Chatgpt it's pretty smart. we have low level quantum computers and they are getting bigger. what we don't have a need for is something that can reverse its computations. think about it. alot of computations are not based on things that have inverses.
-GPT is a dumbass. And the quantum computers that we have have not performed any useful calculations. But you say that we have no need for anything that can reverse its calculations? Do you even know the basics of how quantum computers work? The quantum gates are UNITARY MATRICES. A matrix X (over the reals, the complex numbers, or even the quaternions) is said to be unitary if X·X* = X*·X = I, which means that X is invertible. Of course, with quantum computation, we do not really care much about the overall energy efficiency of those unitary transformations, but they are still invertible.
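The unitarity claim is easy to check numerically for a standard gate (a sketch using NumPy; the Hadamard gate is just one example of a quantum gate):

```python
import numpy as np

# Hadamard gate: a standard single-qubit quantum gate.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# Unitarity: H @ H* = H* @ H = I, so the gate is invertible.
assert np.allclose(H @ H.conj().T, np.eye(2))
assert np.allclose(H.conj().T @ H, np.eye(2))

# Invertibility in action: applying H twice returns the original state,
# since the Hadamard gate happens to be its own inverse.
state = np.array([1.0, 0.0])        # the |0> state
assert np.allclose(H @ (H @ state), state)
print("H is unitary and self-inverse")
```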

But I am not here to bash AI or quantum computation. I have been developing my own AI algorithms (or are you going to hate that too?), so I cannot be against them. I am just here to bash the hype behind AI and quantum computation because the people hyping up AI and quantum computation display their ignorance by being so oblivious to reversible computation. I am also here to bash the news behind AI and quantum computation because by ignoring reversible computation, the media has been completely one sided.

You have not given any scientific reason why reversible computation is not possible or feasible. I have given a reason why we do not see the technology yet, and I have given a reason why we should expect to see more work on reversible computing technologies. Yes. Developing practical physically reversible computers will be difficult. And so is developing quantum computation. The only reason why quantum computation is more popular than reversible computation is pure hype.

since you're the Ph.D. feel free and make any necessary corrections
-I already have corrected all your factually inaccurate claims. Factual accuracy is more important than grammar, but you at least need to try to communicate using proper English so that I can respond to your claims more effectively.

In order to stay on topic here, if you want to continue to attempt to discuss reversible computation, we should go on another thread.

-Joseph Van Name Ph.D.







legendary
Activity: 3500
Merit: 6320
I do not seem to understand how this challenge works.

In RSA you need the private key to decrypt anything that is encrypted by it. Now I don't know how it works with encryption, but when you're RSA/DSA/ECDSA signing something with the private key, the public key is exposed. But that doesn't really matter here because he also shared the public key by itself.

If something is "encrypted by the public key", that is by keypair and no password, right? Hopefully he did not use the public key as a password, because that would not make sense?

Looks like more or less he is just making the point that he really can't decrypt anything.

If he can decrypt it, he would then have the private key and could sign anything. So if he really did find a way to decrypt something without the private key, then he would have the private key.
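The logic of the challenge can be sketched with textbook RSA (a toy with tiny primes and no padding, for illustration only; real RSA-2048 uses a 2048-bit modulus and proper padding): producing a signature that the public key verifies proves possession of the private exponent.

```python
import hashlib
from math import gcd

# Toy textbook RSA -- NOT secure, just illustrates the sign/verify logic.
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent, known only to the keyholder

def sign(msg: bytes) -> int:
    """Signing requires the private exponent d."""
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)

def verify(msg: bytes, sig: int) -> bool:
    """Anyone can verify using only the public key (n, e)."""
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h

m = b"I hold the private key"
s = sign(m)
assert verify(m, s)
print("signature verified with the public key alone")
```

This is exactly why the challenge is decisive: a valid signature is trivially easy to produce with the private key and (as far as anyone knows) infeasible without it.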

But, since it can't be done, it's more or less a FU to him.


This is the same as the magic carburetor that allowed you to get 200 miles to a gallon of gas in your 2 ton truck, it just can't be done.
You just need to use uranium instead of gasoline.





-Dave
legendary
Activity: 1568
Merit: 6660
Quote
Anton Guzhevskiy, the chief operating officer at Australian cybersecurity firm ThreatDefence, also challenged Gerck to prove his claims. "I've shared an RSA-2048 public key and a corresponding private key encrypted by this public key. If you can decrypt the private key, you can sign some piece of text with it, which will prove that you are in possession of the private key," he said in a response to Gerck's post on LinkedIn. "Can you do it?"

"There is a publication delay, and I do not control that," Gerck responded.
Says it all really.

This sounds like CSW all over again. Ask for a signed message as proof, which would be trivially easy to provide if any of the claims were true, and instead be told that we have to wait for some paper or evidence to be published which will "totally prove it". Roll Eyes

I won't be holding my breath.

I do not seem to understand how this challenge works.

In RSA you need the private key to decrypt anything that is encrypted by it. Now I don't know how it works with encryption, but when you're RSA/DSA/ECDSA signing something with the private key, the public key is exposed. But that doesn't really matter here because he also shared the public key by itself.

If something is "encrypted by the public key", that is by keypair and no password, right? Hopefully he did not use the public key as a password, because that would not make sense?
sr. member
Activity: 1190
Merit: 469
Reversible computation will simply use less energy than irreversible computation in order to perform the same task.
and you know that, how? from experience? or from theory?


Oh ok, you got me. Entire *observable universe then. Tongue
yeah i got you this time  Grin


member
Activity: 691
Merit: 51
For the purposes of this post and all future posts, when I refer to reversible computation, I am referring to classical partially reversible computation in order to perform calculations more energy efficiently.

Quantum computing promises exceptional speedup for some specific problems.
Some, like molecular dynamics are extremely important for humanity.
That is great. One can also simulate molecular dynamics by actually constructing the molecules that one wants to study. This is why I am more in favor of using quantum computation for AI. I have my own kind of quantum AI algorithm that reduces the dimension of quantum channels, but I do not have a quantum computer to run it on. But so far, quantum computing has promised a lot while delivering no practical applications. Both reversible computing and quantum computing promise performance gains. No human has built a useful reversible or quantum computer yet, but there are prototypes of both. Quantum computation is difficult since one will need far more physical qubits than logical qubits, because most of those physical qubits will be needed for error correction. Reversible computation is less difficult because one will need to overcome a moderate but manageable computational-complexity-theoretic overhead, and we can overcome this obstacle gradually using partial reversibility, starting with computation that is more amenable to reversibility.

A rational society would look at reversible and quantum computation and value both types of computation more or less equally since they both have their purposes. A rational society would also value reversible computation because reversible computation is not entirely disjoint from quantum computation. How are people so confident that practical quantum computing will be achieved before practical energy efficient reversible computing? How are people so confident that the innovations in practical energy efficient reversible computing will not spur innovation in quantum computing?

There are plenty of reasons to favor reversible computing over quantum computing right now. Energy efficient reversible computing is easier to understand than quantum computing since with reversible computing, one does not have to take tensor products of many complex inner product spaces, nor does one have to worry about the weirdness of quantum information theory.

Quote from: jvanname
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems.

reversible computing sounds like even more of a fantasy than quantum computers. just because you make a computer that is reversible doesn't guarantee it won't use electricity or even large amounts of it, i would think. i have no idea why something that is reversible would be faster than a conventional computer. so you have some explaining to do there  Shocked got any examples of reversible computers? probably not.
-You need to learn a lot more about the claims of reversible computation before making such bold and arrogant statements about it. You cannot say that something is a 'fantasy' when later you claim that you have 'no idea why' reversible computation will outperform conventional irreversible computation. I never claimed that reversible computation would not use any electricity or gasoline. Reversible computation will simply use less energy than irreversible computation in order to perform the same task.

Landauer's principle states that in order to delete a bit of information (by delete, I mean replace the bit of information with a zero), one must expend more than k*T*ln(2) energy, where k is Boltzmann's constant, T is the temperature, and ln(2) = 0.69314... Here, k = 1.38*10^(-23) Joules/Kelvin. As one approaches Landauer's limit, it becomes difficult to reliably and irreversibly delete information, so one should expect to expend about 100*k*T to delete a bit of information using irreversible computation, regardless of the type of hardware, as long as one deletes bits one at a time.

The reason why we do not have any reversible computers yet (we do not have any practical quantum computers either) is that Landauer's limit is quite small, and there have been other ways of obtaining performance gains that do not require reversibility. These other methods include taking your 10 micrometer transistors and shrinking them down to 2 nm transistors which is about where we are now. There is currently not much room to shrink transistors because atoms have a diameter of about 0.1 nm. In the past, the energy efficiency of computation was too far away from Landauer's limit for very many people to be concerned with energy efficient reversible computation, but those days are over. This means that people need to be looking out for reversible computation because it is coming. Sure. Developing energy efficient reversible computers will be difficult, but so is developing practical quantum computers.

Anyone who knows nothing about reversible computation but instead hypes up stuff like AI, quantum computation, and even impossible things such as anti-matter engines that can propel spacecraft to 90% of the speed of light may be safely ignored.

P.S. You would be much more persuasive or at least respectful if you used proper grammar, capitalization, spelling, and punctuation.


-Joseph Van Name Ph.D.


legendary
Activity: 2268
Merit: 18771
This is the same as the magic carburetor that allowed you to get 200 miles to a gallon of gas in your 2 ton truck, it just can't be done.
You just need to use uranium instead of gasoline.

just a minor thing to point out which is no one knows the size of the entire universe or if it is finite or infinite but i get what you're saying. some numbers are big to write down. for us mere mortals...
Oh ok, you got me. Entire *observable universe then. Tongue
sr. member
Activity: 1190
Merit: 469
even if every digit of such a number was stored in a single Planck volume, the entire universe would be too small to represent such a number. But this man has factorized such a number on a mobile phone? Lol.

just a minor thing to point out which is no one knows the size of the entire universe or if it is finite or infinite but i get what you're saying. some numbers are big to write down. for us mere mortals...

Quote from: jvanname
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems.

reversible computing sounds like even more of a fantasy than quantum computers. just because you make a computer that is reversible doesn't guarantee it won't use electricity or even large amounts of it, i would think. i have no idea why something that is reversible would be faster than a conventional computer. so you have some explaining to do there  Shocked got any examples of reversible computers? probably not.
staff
Activity: 4284
Merit: 8808
Shor's algorithm with noiseless qubits and gates works in theory.
Shor's algorithm with noisy qubits and gates does not work in theory. It needs exponentially small noise.
Both in theory.
That's it.
I thought that this is what error correction fixes, with a huge but only polynomial blow-up.

Quantum computing promises exceptional speedup for some specific problems.
Some, like molecular dynamics are extremely important for humanity.
full member
Activity: 149
Merit: 165
With regards to QC, I always think about these facts:

- Is it easy to run software on QC? Not sure about it. Or develop something to destroy Bitcoin
- If QC is used for destroying Bitcoin, wouldn't it be easier to crack all the banks in the world, and more rewarding? (Unless we get into the fact that destroying Bitcoin is the true target)
- Worst case scenario, if Bitcoin becomes QC vulnerable, and a hard fork has to be implemented, you still keep the coins that you had already... There would be some confusion, like with BCH, but look at how it is now.
member
Activity: 691
Merit: 51
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems. It is important to get better at computing everything and also to try to get an exceptional speedup for specific tasks. But quantum computing has been overwhelmingly hyped while few have heard of reversible computing. This is not good since the old strategy of simply shrinking transistors and making everything smaller is not going to work any more. This means that we will need another strategy for improving performance in computation, and partially reversible computation is a part of the solution.

Heck, even things that are pretty much impossible, like antimatter engines that can propel people to the speed of light, are hyped much more than reversible computation. The reason for this is psychological rather than a result of any deficiency in the promises of reversible computation. People are unwilling to accept that the laws of physics may not always make computation easier, and that in order to compute within the laws of physics and save energy, one must make tradeoffs between energy efficiency per bit operation and time/space. And the word 'reversible' sounds less magical than the word 'quantum', and people cannot accept that.

For this reason, I am going to ignore any news about quantum computation.

-Joseph Van Name Ph.D.
full member
Activity: 206
Merit: 450
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically.
Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.
This is correct only with ideal noiseless qubits and gates.
That's exactly what a "theoretical QC" is. Hence, your claim of "not vulnerable to QC even theoretically" being wrong.

Shor's algorithm with noiseless qubits and gates works in theory.
Shor's algorithm with noisy qubits and gates does not work in theory. It needs exponentially small noise.
Both in theory.
That's it.

legendary
Activity: 990
Merit: 1108
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically.
Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.
This is correct only with ideal noiseless qubits and gates.
That's exactly what a "theoretical QC" is. Hence, your claim of "not vulnerable to QC even theoretically" being wrong.
full member
Activity: 206
Merit: 450
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory.

Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.


This is correct only with ideal noiseless qubits and gates. Shor's algorithm fails unless the noise is exponentially small. That is, it needs noise levels of the order 2^(-n). The same is true for the ECDLP. Good luck achieving noise level 2^(-256).

It is very easy to figure out why that happens. We start with qubits which are independent, each representing only 1 bit of information. But before reaching the final result, it is not yet clear which of the 2^n possibilities are the correct ones. So midway through the quantum computation, the qubits have to represent 2^(2n) states at the same time. This enormous amount of information vanishes with any noise present.

Read the last sentence in this section https://en.wikipedia.org/wiki/Shor%27s_algorithm#Physical_implementation.
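For context on what Shor's algorithm actually does, the classical wrapper around the quantum subroutine can be sketched as follows (a sketch of my own; the quantum part's only job is to find the period r efficiently, so here it is replaced by a brute-force search, which is exponential classically):

```python
from math import gcd

def factor_via_order(N: int, a: int):
    """Shor-style factoring of N, with the period found classically."""
    assert gcd(a, N) == 1
    # Find the multiplicative order r of a mod N: the smallest r with
    # a^r = 1 (mod N). This is the step a quantum computer speeds up.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None                  # odd order: retry with another a
    y = pow(a, r // 2, N)            # y^2 = 1 (mod N), so N divides (y-1)(y+1)
    for candidate in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < candidate < N:
            return candidate, N // candidate
    return None                      # trivial square root: retry

print(factor_via_order(15, 7))       # -> (3, 5)
```

With a = 7 and N = 15 this returns (3, 5); the whole point of Shor's algorithm is that the period r can be found in polynomial time on an ideal quantum computer, while every known classical method is superpolynomial.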

legendary
Activity: 990
Merit: 1108
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory.

Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.

Quote
Current QC hardware struggles with RSA-6 (six bits).

The only thing you got right. The current QC factorization record of 21 = 3 * 7 even used some shortcuts for numbers of a special form. So it's fair to say that we have yet to successfully run Shor's algorithm on a QC.

[1] https://en.wikipedia.org/wiki/Shor%27s_algorithm
full member
Activity: 206
Merit: 450
they might not factor 10^1000 size numbers, but 2048 bit numbers are an entirely different animal and should be vulnerable to quantum computers at some point. very vulnerable. the only question is, when do companies like Atom Computing scale up past 1000 qubits to say 1 million qubits. i'd say in the next 10 years at worst. they went from 100 qubits to 1000 in like 2 years.

You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory. Current QC hardware struggles with RSA-6 (six bits).

Finally someone did put the noise into quantum equations and this is the result:
We consider Shor's quantum factoring algorithm in the setting of noisy quantum gates. Under a generic model of random noise for (controlled) rotation gates, we prove that the algorithm does not factor integers of the form pq when the noise exceeds a vanishingly small level in terms of n - the number of bits of the integer to be factored, where p and q are from a well-defined set of primes of positive density. We further prove that with probability 1−o(1) over random prime pairs (p,q), Shor's factoring algorithm does not factor numbers of the form pq, with the same level of random noise present.

