Author

Topic: Researcher Claims to Crack RSA-2048 With Quantum Computer (Read 1318 times)

sr. member
Activity: 1190
Merit: 469
It seems like NIST really just wants to justify spending a lot of resources with its post-quantum cryptography competition. This is not good for cryptography, since I would rather use a more classically secure and more efficient digital signature algorithm than a sketchy digital signature algorithm. The NIST standardization gives people a false sense of security for cryptographic algorithms (except for hash-based signatures which actually are secure). And the NIST has fallen into the entire quantum hype while completely ignoring reversible computation with this cryptographic contest. Don't give into hype.

yeah at first i thought maybe they would come up with something special but as time has gone on, it's become painfully clear that they really are clueless. they don't know what they're doing and that's dangerous! that's my opinion of course. NIST did an ok job in selecting AES but that hasn't translated over too well to picking a PQC signature scheme. not at all. I guess they thought it would be just as easy. I guess not! Plus it's taking forever, and you just know that whatever they end up choosing will be out of date by the time quantum computers arrive on the scene anyway and will probably need to be revamped  Shocked of course that's assuming QCs ever do arrive on the scene, I know you're skeptical about that happening. But IBM was in the news lately: https://arstechnica.com/science/2023/12/ibm-adds-error-correction-to-updated-quantum-computing-roadmap/

I think IBM is eventually going to get a usable QC out the door. No doubt about that. All these other companies I wouldn't know about, but when IBM commits to something they usually see it through.

The method most commonly tested today (called a "surface code") can require up to 4,000 hardware qubits to host 12 logical qubits; the scheme described in the manuscript can do so using only 288 hardware qubits.

If IBM can get it down to only 288 then maybe there's hope. But it can't be taking 1000 qubits to form a single logical qubit. That's way too many.


Compare it to when IBM built the Deep Blue chess computer. nothing like that had ever beaten a world champion before, and it beat Garry Kasparov at the chess board, ushering in a new era in chess computing. today there are far more powerful chess engines, even on your phone, but IBM was the first, proving it could be done. They'll do the same thing with quantum.
member
Activity: 691
Merit: 51
It seems like NIST really just wants to justify spending a lot of resources with its post-quantum cryptography competition. This is not good for cryptography, since I would rather use a more classically secure and more efficient digital signature algorithm than a sketchy digital signature algorithm. The NIST standardization gives people a false sense of security for cryptographic algorithms (except for hash-based signatures which actually are secure). And the NIST has fallen into the entire quantum hype while completely ignoring reversible computation with this cryptographic contest. Don't give into hype.

Hmm. It looks like I have done more to develop tests for the cryptographic security of functions than the entire Bitcoin community has (I am a @#$%coiner who only values worthless @#$%coins, so I do not count as a part of the Bitcoin community; I will let you think about that).

ecdsa123-Ok. I will try to explain finite fields and doubly stochastic matrices, but if one really wants to understand everything, one should learn about abstract algebra, linear algebra, probability, and stochastic processes. That should not be too hard since your name is ECDSA.

Abstract algebra:

A group is a set G together with a binary operation *, a constant e, and a unary inversion operation ^(-1) that satisfy the following identities.

0. (Closure) The expression x*y only makes sense if x,y belong to G. And if x,y belong to G, then so do x*y and x^(-1).

1. (Associativity) (x*y)*z=x*(y*z)

2. (Identity) x*e=e*x=x.

3. (Inverses) x*x^(-1)=x^(-1)*x=e (x^(-1) is the inverse of x).

A group is said to be abelian if it satisfies the identity x*y=y*x. In an abelian group, we typically write + instead of * and we typically write -x instead of x^(-1).

For example, the set of all integers with the operation + forms an abelian group. Don't think about the axioms too much. Whenever someone says "abelian group" you should think of "an operation on a set that resembles the integers with addition and subtraction".

A ring is an algebraic structure (R,+,0,-,*) where (R,+,0,-) is an abelian group with identity 0 and * is an associative operation that satisfies the distributivity identities (x+y)*z=x*z+y*z,x*(y+z)=x*y+x*z. A ring is said to have a unit 1 if it satisfies the identity x*1=1*x=x.

A ring is said to be commutative if it satisfies the identity x*y=y*x. Do not think about the axioms too hard. When someone talks about commutative rings with unity, just think about the integers where +,* are your standard addition and multiplication.

A field is a commutative ring with unity such that if x is not 0, then there is some y with xy=yx=1; in this case, y is the inverse of x, and we usually denote the inverse of x by x^(-1). In other words, a field is a commutative ring where you can always divide two elements as long as you don't divide by zero. Examples of fields include the rational numbers, the real numbers, the complex numbers, and the field of rational functions (or meromorphic functions).

Finite fields: Suppose that p is a prime number. Then the collection of integers {0,1,2,3,...,p-2,p-1} forms a field that we denote by F_p. Here, addition and multiplication are taken modulo p. For example, if p=13, then
6+8=14=1 mod p, so we would say that 6+8=1. Furthermore, 6*8=48=9 mod 13, so we would say that 6*8=9.
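
To make the arithmetic concrete, here is a minimal sketch in Python of the field F_13 from the example above (the helper names add, mul, and inv are placeholders chosen just for this illustration):

Code:
# Arithmetic in the finite field F_p, with p = 13 as in the example above (sketch)
p = 13

def add(x, y):
    # addition modulo p
    return (x + y) % p

def mul(x, y):
    # multiplication modulo p
    return (x * y) % p

def inv(x):
    # multiplicative inverse via Fermat's little theorem: x^(p-2) = x^(-1) mod p
    assert x % p != 0
    return pow(x, p - 2, p)

print(add(6, 8))       # 1, since 6+8 = 14 = 1 mod 13
print(mul(6, 8))       # 9, since 6*8 = 48 = 9 mod 13
print(mul(6, inv(6)))  # 1, every nonzero element has an inverse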

We can now take polynomials over a finite field, but we will consider two polynomials to be distinct if they have different expressions even if they represent the same function. For example, the polynomials x^p and x represent the same function on the finite field {0,1,2,...,p-1} of integers modulo p. We say that a polynomial over some field is irreducible if it cannot be factored as a product of two polynomials of smaller degree.

Now suppose that p=2. Then the field F_2 of integers modulo 2 is just the collection {0,1} where + is the XOR operation and * is the AND operation. We can extend this field to a field with 256 elements. Consider the polynomial f(x)=x^8+x^4+x^3+x+1; this is the polynomial used to construct the Rijndael finite field. The collection of polynomials of degree at most 7 over the field F_2 then forms a field. The sum of two polynomials of degree at most 7 has degree at most 7, so addition makes sense. If r(x),s(x) are polynomials of degree at most 7, then r(x)*s(x)=f(x)a(x)+b(x) where b(x) has degree at most 7. Here b(x) is the remainder that you get when you divide r(x)*s(x) by f(x). In this case, we would just say r(x)*s(x)=b(x) in the Rijndael finite field.
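
Here is a small sketch of that multiplication in Python, assuming only the description above: a byte stands for a polynomial of degree at most 7 (bit i is the coefficient of x^i), and the product is reduced modulo f(x)=x^8+x^4+x^3+x+1 (the bit pattern 0x11b):

Code:
# Multiplication in the Rijndael finite field GF(2^8) (illustration only)
def gf256_mul(r, s):
    result = 0
    while s:
        if s & 1:        # if the constant term of s is 1, add r (addition in F_2 is XOR)
            result ^= r
        s >>= 1          # divide s by x
        r <<= 1          # multiply r by x
        if r & 0x100:    # the degree reached 8, so subtract (XOR) f(x) to reduce
            r ^= 0x11b
    return result

print(hex(gf256_mul(0x53, 0xCA)))  # 0x1; {53} and {CA} are inverses in this field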

Everything that I did not explain properly will be explained in a proper abstract algebra text.

I am tired of typing, so I will only explain doubly stochastic matrices after the Rijndael finite field is properly understood. I will give a link to the code after the Rijndael finite field and stochastic processes are understood.

-Joseph Van Name Ph.D.




sr. member
Activity: 1190
Merit: 469
Off-topic, but still a valid question.

If quantum computers actually start gaining the ability to crack Bitcoin's encryption, how fast will the Core Developers code a patch, have it merged, have it deployed? Fast enough?
today? they would be caught with their pants pulled down. in 10 years maybe they wouldn't be.

Quote from: jvanname

SPHINCS+ is a stateless hash-based signature scheme, which was submitted to the NIST post-quantum crypto project. Stateless hash-based signatures will bloat up the blockchain. SPHINCS+ signatures will take 15 KB of space if one just optimizes for signature size without optimizing for anything else, and we should expect longer signatures in practice. That is not going to work very well.

Dilithium2 could almost work. Bitcoin needs to seriously code up a protocol that will allow for a drop-in replacement for ECDSA over elliptic curves. And all private keys need to be generated from a backup private-public key pair in case ECDSA gets broken. And there needs to be an automatic procedure for updating the blockchain in case ECDSA is broken. For example, if a whole bunch of long lost coins are suddenly being found, then the automatic system can kick in and tell everyone to join the new chain. We need a system that is hard coded in so that a backup blockchain can be implemented quickly and so that everyone can agree on the right blockchain in case ECDSA is broken. Yes. People will lose their coins and Bitcoin will lose value, but not everything will be lost.

I appreciate you bringing up these things. Your ideas sound very reasonable. The only thing that concerns me is how some of these NIST candidates have been weeded out because they had serious flaws which LUCKILY got discovered before they were branded as standards. it seems like it's not so easy to come up with something that doesn't have holes in it. If we've learned anything from this NIST post-quantum crypto standardization endeavor, it would have to be that.

I'm sure you already heard: https://spectrum.ieee.org/quantum-safe-encryption-hacked but some people might not have.
member
Activity: 691
Merit: 51
Wind_FURY-The post-quantum digital signature algorithms are well on their way to being standardized by NIST. Unfortunately, NIST somehow thinks it is important and awesome to standardize post-quantum algorithms without standardizing reversible counterparts to AES, SHA-256, and SHA-3. To make things worse, one of the digital signature algorithms selected by NIST is a stateless hash-based signature algorithm. Yes. That is right. NIST thinks it is a good idea to standardize hash-based signatures to be safe against the quantum computers of the future, but they are too inept to notice that maybe they should first have the hash functions that are designed for the energy efficient reversible computers of the future. Well, since NIST thinks reversibility is unimportant, we should work on better hash-based signature algorithms that incorporate partial reversibility ourselves, because NIST does not believe that this is important.

A typical Bitcoin digital signature is 71 bytes long. Let's look at the signature algorithms that are being standardized by NIST.

Dilithium2-"Dilithium is a digital signature scheme that is strongly secure under chosen message attacks based on the hardness of lattice problems over module lattices." according to their website.

public key: 1312 bytes
signature size: 2420 bytes
signature verification: 118412 cycles on a Skylake CPU with an optimized implementation.

SPHINCS+ is a stateless hash-based signature scheme, which was submitted to the NIST post-quantum crypto project. Stateless hash-based signatures will bloat up the blockchain. SPHINCS+ signatures will take 15 KB of space if one just optimizes for signature size without optimizing for anything else, and we should expect longer signatures in practice. That is not going to work very well.

Dilithium2 could almost work. Bitcoin needs to seriously code up a protocol that will allow for a drop-in replacement for ECDSA over elliptic curves. And all private keys need to be generated from a backup private-public key pair in case ECDSA gets broken. And there needs to be an automatic procedure for updating the blockchain in case ECDSA is broken. For example, if a whole bunch of long lost coins are suddenly being found, then the automatic system can kick in and tell everyone to join the new chain. We need a system that is hard coded in so that a backup blockchain can be implemented quickly and so that everyone can agree on the right blockchain in case ECDSA is broken. Yes. People will lose their coins and Bitcoin will lose value, but not everything will be lost.


-Joseph Van Name Ph.D.
copper member
Activity: 1330
Merit: 899
🖤😏
Off-topic, but still a valid question.

If quantum computers actually start gaining the ability to crack Bitcoin's encryption, how fast will the Core Developers code a patch, have it merged, have it deployed? Fast enough?

If that wouldn't be a problem, how fast will full nodes run the new software? That I believe would be more "complicated".
What is "Bitcoin's encryption"? Public key cryptography is not encryption. But to answer your question, how do you know it hasn't been cracked already? If it were cracked, people wouldn't go after the obvious coins; they would do it another way.
But since there are no signs of it happening, devs see no reason to cause panic by saying it might happen soon and we need to be prepared, so instead it's left to us to say it, like experimental scientists.
If EC were broken, we'd need another system on top of it. Maybe we could even start implementing a new system right now and store a second proof of ownership on another database, something similar to 2FA authentication: when you want to transact, the other database/chain would first validate your 2FA token/password and generate a ticket for your transaction to be accepted into the main chain's mempool. This is just one idea.
legendary
Activity: 2898
Merit: 1823
Off-topic, but still a valid question.

If quantum computers actually start gaining the ability to crack Bitcoin's encryption, how fast will the Core Developers code a patch, have it merged, have it deployed? Fast enough?

If that wouldn't be a problem, how fast will full nodes run the new software? That I believe would be more "complicated".
member
Activity: 691
Merit: 51
Cryptographic hash functions and encryption functions are designed to be understood as much as possible while still efficiently mixing things up very well, and a good way to ensure the understandability of encryption functions and hash functions is to make these functions as mathematical as possible. I am not too familiar with techniques that one can use to analyze SHA-256 (it is harder for me to analyze SHA-256 since SHA-256 uses 32-bit words while AES uses 8-bit S-boxes), but I can attest to the mathematical nature of AES.

0. The non-linear portion of the AES S-box is simply inversion (where we set 0^(-1)=0 to make inversion bijective) in the finite field F_{256} with 256 elements. Inversion over this finite field has very good mathematical properties. Just the other day, I computed the second largest eigenvalue in magnitude of the doubly stochastic matrix associated with the mapping {x,y}-->{a+x^(-1),a+y^(-1)} where a is a random element of F_{256}, and I got 18/256=9/128. This quantity is low and is much better than what we can get if we used a random S-box. If we used the full S-box with its non-linear portions, we would get a spectral radius of 16.41493572768185/256 which is closer to an ideal value of 16/256. I can analyze AES mathematically, so other people should be able to do this as well. The AES S-box was selected for other mathematical properties as well. (A small sketch of this inversion map appears after this list.)

1. The group generated by the round functions of AES is the alternating group as it should be. This shows that not only does AES mix things up well, but AES also behaves mathematically enough to be analyzed. This is mostly due to AES being an SP-network.

2. I have been able to show that for good SP-networks, a lot of the structure (such as the partition of the message into S-boxes) is definable from the block cipher round function and only the block cipher round function.

3. For cryptography, one needs non-linearity. But there are mathematical ways to quantify the non-linearity of a cryptographic function, and mathematicians have studied maximally non-linear functions otherwise known as bent functions.
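
Here is the sketch promised in point 0: the map x -> x^(-1) on GF(2^8) with the convention 0^(-1)=0. This only illustrates the inversion map itself, not the eigenvalue computation, which needs more machinery; the helper names below are placeholders.

Code:
# The non-linear part of the AES S-box: x -> x^(-1) in GF(2^8), with 0 -> 0 (sketch)
def gf256_mul(r, s):
    result = 0
    while s:
        if s & 1:
            result ^= r
        s >>= 1
        r <<= 1
        if r & 0x100:
            r ^= 0x11b            # reduce modulo the Rijndael polynomial x^8+x^4+x^3+x+1
    return result

def gf256_inv(x):
    # x^254 = x^(-1) for nonzero x, since the multiplicative group has order 255;
    # mapping 0 to 0 makes the map a bijection on all 256 bytes.
    if x == 0:
        return 0
    y = x
    for _ in range(253):          # compute x^254 by repeated multiplication (slow but clear)
        y = gf256_mul(y, x)
    return y

assert all(gf256_mul(x, gf256_inv(x)) == 1 for x in range(1, 256))
assert sorted(gf256_inv(x) for x in range(256)) == list(range(256))  # it is a bijection
print(hex(gf256_inv(0x53)))  # 0xca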

If anything, encryption functions and hash functions need to be studied more mathematically because we need to analyze and standardize alternatives to AES,SHA-2,SHA-3 for reasons that I have explained elsewhere (these functions have weaknesses). Of course, after the cryptographic function has been standardized, most people who use these functions will not analyze their cryptographic security and experience the mathematical nature of these functions. And I do not even need to mention how mathematical elliptic curve cryptography is.

-Joseph Van Name Ph.D.
sr. member
Activity: 1190
Merit: 469
Maybe if the Bitcoin community did not spend so much of their energy shaming and hating mathematicians, they would attract more mathematicians (and yes, mathematicians behave just as badly because they are at universities and universities lack professionalism). There is no excuse for this. Cryptography is one of the primary applications of abstract algebra, number theory, and many other areas of mathematics, so there is no reason why mathematicians would be disinterested in Bitcoin. If the Bitcoin community attracted instead of repelled mathematicians, the Bitcoin community would have made better progress at solving problems such as having a backup in case ECDSA is ever broken.
bitcoin is not really a good example of pure mathematics since it uses magical hash functions that are not really very well understood. anytime it needs a bit of magic, it just pulls a hash out of the hat like a magician pulls out a rabbit and voila, everything is peachy. at some point you have to think that's going to become a problem. but i guess it hasn't happened yet.

Quote
I gave the people on this site an opportunity to improve their social skills so that they would not repel mathematicians any more, but they rejected this opportunity since they are unaware of their own lack of social skills.

i'm trying to improve my social skills towards mathematicians like you because i think bitcoin needs more of them to help it have a better design someday. so i'm glad you are here on the forum; your thoughts are always very interesting and informative. clearly you know a lot about pure mathematics and that's really amazing, to have someone like that here.


member
Activity: 691
Merit: 51
Maybe if the Bitcoin community did not spend so much of their energy shaming and hating mathematicians, they would attract more mathematicians (and yes, mathematicians behave just as badly because they are at universities and universities lack professionalism). There is no excuse for this. Cryptography is one of the primary applications of abstract algebra, number theory, and many other areas of mathematics, so there is no reason why mathematicians would be disinterested in Bitcoin. If the Bitcoin community attracted instead of repelled mathematicians, the Bitcoin community would have made better progress at solving problems such as having a backup in case ECDSA is ever broken.

I gave the people on this site an opportunity to improve their social skills so that they would not repel mathematicians any more, but they rejected this opportunity since they are unaware of their own lack of social skills.

And maybe Bitcoin would attract mathematicians if it had a mining algorithm that was actually designed to advance science.

-Joseph Van Name Ph.D.
sr. member
Activity: 1190
Merit: 469
It will probably take years to take ECDSA down using a classical computer (if ECDSA suffers from such classical weaknesses at all), but an overnight takedown of ECDSA by an unknown entity is something that the Bitcoin developers should have prepared for (I do not know how well they have prepared for this).
there is no contingency plan in place right now, other than rolling back the blockchain to before it got hacked and introducing some new signature algorithm. of course that would be a total disaster for the entire bitcoin ecosystem and bitcoin might not survive. or it might take a huge plunge in price.

Quote
2. Bitcoin users should at least have the option of using hash based signatures if they want to. Not many people will do this since the fees for hash based signatures may be really high.
agreed. but for someone storing their wealth as long term savings and not doing frequent transactions, the fee might be worth the peace of mind.

Quote
3. In the case of a successful attack against ECDSA, Bitcoin and all Bitcoin wallets should have a backup. This means that each private key needs to not just be associated with an ECDSA public key, but each private key needs to be also associated with other secret information that can be used to recover the lost or stolen coins in case that Bitcoin needs to replace its digital signature algorithm due to a mathematical break. In particular, each private key needs to be of the form H(p) where p is a backup public key and H is a cryptographic hash function.
there's nothing stopping someone from creating a private key that way right now, using sha256 for example. the thing is, p has no meaning to the blockchain currently. it's called a "brainwallet". i'm not sure how you would allow p to recover the lost or stolen coins, because that could require rolling back a potentially large number of transactions, thus destroying people's trust in bitcoin.
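
just to make the idea concrete, here's a toy sketch of what "private key = H(p)" might look like with sha256. to be clear, this is NOT an actual bitcoin protocol and the names are made up; a real scheme would need proper backup-scheme key generation plus consensus rules for how p gets revealed and checked.

Code:
# toy sketch of the "ECDSA private key = H(backup public key)" idea -- not a real protocol
import hashlib, secrets

# stand-in for the public key p of some backup (e.g. hash-based) signature scheme;
# in reality this would come from that scheme's own key generation
backup_public_key_p = secrets.token_bytes(32)

# the idea from the quote above: derive the ECDSA private key as H(p) instead of random bytes
ecdsa_private_key = hashlib.sha256(backup_public_key_p).digest()

# if ECDSA were ever broken, revealing p would prove you knew the preimage of the private
# key all along, and a replacement signature scheme could start using p from then on
print(ecdsa_private_key.hex())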

Quote
The Bitcoin developers also need to have a drop-in replacement for ECDSA ready along with the code that allows the Bitcoiners to agree upon an updated blockchain. I do not know how well this will work in practice.
bitcoin developers are too busy working on the lightning network and taproot and "important things" to waste time worrying about these larger issues. i don't think there's anything they can do about them or will do about them until quantum computers force their hand. if something happens sooner than that, it might just mean the end of bitcoin.



member
Activity: 691
Merit: 51
I see that we are willing to sacrifice the safety that comes from hash based signatures for the efficiency of ECDSA. That is fine, but it is still a risk that needs to be acknowledged. New mathematics is being developed every day, and ECDSA may be taken down overnight by a simple but undiscovered algorithm. A dishonest mathematician may want to keep the algorithm a secret for as long as possible in order to steal lots of Bitcoins and wreak havoc. I do not think that this situation is likely, and even if ECDSA does get taken down, it will probably not be taken down overnight. It will probably take years to take ECDSA down using a classical computer (if ECDSA suffers from such classical weaknesses at all), but an overnight takedown of ECDSA by an unknown entity is something that the Bitcoin developers should have prepared for (I do not know how well they have prepared for this). There are several things that the cryptocurrency community can do about this:

1. The cryptocurrency community of course can get more educated about the possibility of ECDSA being broken by a classical mathematical attack either overnight or over the course of some time.

2. Bitcoin users should at least have the option of using hash based signatures if they want to. Not many people will do this since the fees for hash based signatures may be really high.

3. In the case of a successful attack against ECDSA, Bitcoin and all Bitcoin wallets should have a backup. This means that each private key needs to not just be associated with an ECDSA public key, but each private key needs to be also associated with other secret information that can be used to recover the lost or stolen coins in case that Bitcoin needs to replace its digital signature algorithm due to a mathematical break. In particular, each private key needs to be of the form H(p) where p is a backup public key and H is a cryptographic hash function. The Bitcoin developers also need to have a drop-in replacement for ECDSA ready along with the code that allows the Bitcoiners to agree upon an updated blockchain. I do not know how well this will work in practice.

I do not know if the Bitcoin developers have worked on Problem 3 yet.

-Joseph Van Name Ph.D.
sr. member
Activity: 1190
Merit: 469
At this point in time, it is much more plausible that someone breaks RSA-2048 (or any other public key cryptographic algorithm) by discovering a classical algorithm that breaks RSA than by someone building a functional quantum computer that breaks RSA-2048.
history doesn't bear that out though. RSA was invented in 1977. Researchers have had almost 50 years to break it and they still haven't done it. The reason for that is simply that no classical algorithm likely exists. So it is more likely that a new technology like quantum computers will be what does it.

Quote
Yes. This means that Bitcoin is in danger and that people must be vigilant to developments in public key cryptanalysis. In order for Bitcoin to be safe against classical and quantum attacks, Bitcoin should use hash based signatures.

if it were that simple they would have already done it. hash based signatures have drawbacks, one of which i think is they take up a lot of space. that's like saying let's perform all our encryption using a one-time pad. sure, that's secure but it's inefficient too.


member
Activity: 691
Merit: 51
At this point in time, it is much more plausible that someone breaks RSA-2048 (or any other public key cryptographic algorithm) by discovering a classical algorithm that breaks RSA than by someone building a functional quantum computer that breaks RSA-2048. Yes. This means that Bitcoin is in danger and that people must be vigilant to developments in public key cryptanalysis. In order for Bitcoin to be safe against classical and quantum attacks, Bitcoin should use hash based signatures.

-Joseph Van Name Ph.D.
copper member
Activity: 1330
Merit: 899
🖤😏
Can you come up with a crypto system that has no solvable problem? If you don't have the parameters then you can't solve it, but having no parameters also means you can't verify anything; I'm specifically talking about public key crypto systems. So if there is no problem to solve, you can't build a system to verify the authenticity of any data.

Just like finger prints, if there is no fingerprint, how can you identify someone by a fingerprint scanner?
If someone figures out a way to fake/forge the fingerprint, you go build a retina scanner etc.

My point is, the knowledge of forging/faking fingerprints and retina scans already exists. Knowledge has no age, it has existed outside the time-space dimension, so it's only a matter of time before someone accesses such knowledge.

Now we live in a modern world; if there are advanced crypto systems, there will be advanced algos to break them.
For instance, you would never expect people from 1000 years ago to build a nuke, because there was no infrastructure in place to push them toward seeking the required knowledge to build it.

When was the said infrastructure founded, and how long after that did we manage to build a nuke? When were the crypto systems currently in use invented or founded? And how long has it been since?
sr. member
Activity: 1190
Merit: 469
I think the point is that there are human errors regardless of how solid and secure a system is.
human error is not the same thing as a limitation in human understanding. you can't even call it a bug. RSA has held up fairly well to the test of time. Just because the tech becomes available to factor large numbers fast doesn't mean RSA had a bug in it. it was very well understood that one of its assumptions was that factoring was "hard". that's still a pretty solid assumption.

Quote
But in crypto systems it can't be interpreted as a human error, it's just a difficult problem to solve, and we all know all the crypto systems are solvable.
i don't know what you mean by "solvable" but you're not going to be "solving" AES256.  Shocked


Quote
That's the point, if they were not solvable, we couldn't verify the validity of any data.
i think you have a mistaken view. just because you can verify something is a solution doesn't mean there has to be a way to come up with that solution fast. so i'm not sure i agree with you on some of these things.
legendary
Activity: 3822
Merit: 2703
Evil beware: We have waffles!
A bit off track here but regarding:
Quote
https://www.energy.gov/science/articles/department-energy-announces-45-million-inertial-fusion-energy-ife
In the last two years, the U.S. ICF program supported by the National Nuclear Security Administration has produced two significant scientific results. In August 2021, a burning plasma was achieved on NIF with a yield of 1.3 megajoules (MJ). Then, in December 2022, NIF announced a breakthrough result where scientific breakeven (target gain>1) was achieved. More energy from the fusion reactions was produced (3.15 megajoules) than the laser energy that created the burning plasma (2.05 megajoules).


A net gain of 1.1 megajoules is not "too small to generate electricity" is it? But I'm not sure if that's the complete picture. Maybe the laser energy is only part of the total energy input but if that was the case they should have quoted that too...
This summer they beat that level as well BUT -- what is usually omitted from the news being released is that the net gain is over the LASER energy applied to the target, in other words the output pulse energy of the laser. However, considering the NIF laser has a 'wall plug' efficiency of <10%, that is still far short of producing more power out of the system than went into it.

edit. a simple search brought up an article from Physics World
Quote
As such, NIF is extremely inefficient – its 2 MJ flash-lamp pumped laser requiring around 400 MJ of electrical energy, which equates to a “wall-plug” efficiency of just 0.5%. (Jan 20, 2023)
copper member
Activity: 1330
Merit: 899
🖤😏
Any system can be hacked, but it is usually the result of human error.
what's the human error in RSA though?
Larry please accept the lord as your saviour, lol
I think the point is that there are human errors regardless of how solid and secure a system is. But in crypto systems it can't be interpreted as a human error, it's just a difficult problem to solve, and we all know all the crypto systems are solvable. That's the point, if they were not solvable, we couldn't verify the validity of any data.
sr. member
Activity: 1190
Merit: 469
Any system can be hacked, but it is usually the result of human error.
what's the human error in RSA though?
member
Activity: 239
Merit: 53
New ideas will be criticized and then admired.
Any system can be hacked, but it is usually the result of human error.
legendary
Activity: 4326
Merit: 8914
'The right to privacy matters'

Not sure of the usefulness of hydrogen bombs. I guess they could be good / useful against an asteroid .

yeah probably. i think elon musk has this idea to explode h-bombs in the mars atmosphere to "warm up" the planet. thing is, that would be expensive and i doubt it would work. but he's kind of full of strange ideas...

h-bombs' only real use is as a deterrent. hopefully we don't ever have to use them on some poor country like we did with the atomic bombs.  Sad

I'd much rather we use cold fusion and really wander around in space a bit.
sr. member
Activity: 1190
Merit: 469

Not sure of the usefulness of hydrogen bombs. I guess they could be good / useful against an asteroid .

yeah probably. i think elon musk has this idea to explode h-bombs in the mars atmosphere to "warm up" the planet. thing is, that would be expensive and i doubt it would work. but he's kind of full of strange ideas...

h-bombs' only real use is as a deterrent. hopefully we don't ever have to use them on some poor country like we did with the atomic bombs.  Sad
legendary
Activity: 4326
Merit: 8914
'The right to privacy matters'
But we know for a fact that it is exceedingly difficult to get any useful power out of nuclear fusion.
not exactly. hydrogen bombs were invented a long time ago and they use fusion. if you're talking about controlled fusion then yeah, it's been a long and arduous road; they've been working on it for decades. you have to admire that type of dedication from scientists all over the world. and i think they'll get there eventually, but what i don't think is that it will make people's electric bills go down. they'll still charge them the same, which pretty much negates the benefits of fusion power for consumers. no one is going to just give away free energy or even close. they're going to corner the market and charge as much as they possibly can. kind of like nuclear power plants do...

but feel free to correct me if i'm wrong that it won't lower people's power bills. did solar power do that? probably not!

Not sure of the usefulness of hydrogen bombs. I guess they could be good / useful against an asteroid .
sr. member
Activity: 1190
Merit: 469
But we know for a fact that it is exceedingly difficult to get any useful power out of nuclear fusion.
not exactly. hydrogen bombs were invented a long time ago and they use fusion. if you're talking about controlled fusion then yeah, it's been a long and arduous road; they've been working on it for decades. you have to admire that type of dedication from scientists all over the world. and i think they'll get there eventually, but what i don't think is that it will make people's electric bills go down. they'll still charge them the same, which pretty much negates the benefits of fusion power for consumers. no one is going to just give away free energy or even close. they're going to corner the market and charge as much as they possibly can. kind of like nuclear power plants do...

but feel free to correct me if i'm wrong that it won't lower people's power bills. did solar power do that? probably not!
sr. member
Activity: 1190
Merit: 469
larry_vw_1955-Please accept the Lord Jesus Christ as your Lord, Savior, and Saviour. You are completely misrepresenting what I am trying to communicate because you have been completely consumed by your own hatred and anger. I do not want to communicate with you.

-Joseph Van Name Ph.D.

i thought you had me on ignore.  Undecided

Quote from: digaran

They have already heated hydrogen atoms near the temperature of sun's core with the most powerful lasers in the world.
exactly. no superconductors needed as far as I know. Joseph said the only way to do it was using superconductors cooled down to 4 K. wrong. Anyhow...

Quote
Practically humanity has achieved fusion capability, but it's too small to generate electricity.

https://www.energy.gov/science/articles/department-energy-announces-45-million-inertial-fusion-energy-ife
In the last two years, the U.S. ICF program supported by the National Nuclear Security Administration has produced two significant scientific results. In August 2021, a burning plasma was achieved on NIF with a yield of 1.3 megajoules (MJ). Then, in December 2022, NIF announced a breakthrough result where scientific breakeven (target gain>1) was achieved. More energy from the fusion reactions was produced (3.15 megajoules) than the laser energy that created the burning plasma (2.05 megajoules).


A net gain of 1.1 megajoules is not "too small to generate electricity" is it? But I'm not sure if that's the complete picture. Maybe the laser energy is only part of the total energy input but if that was the case they should have quoted that too...
copper member
Activity: 1330
Merit: 899
🖤😏
They have already heated hydrogen atoms near the temperature of sun's core with the most powerful lasers in the world. Practically humanity has achieved fusion capability, but it's too small to generate electricity.

It's the same with quantum computers, we managed to build them, but they are too weak to be used for big calculations.
So, no need to wait billions of years, just a few decades.
sr. member
Activity: 1190
Merit: 469
It is a royal shame that reversible computation is getting nothing but hatred when there are other ideas such as power from nuclear fusion and quantum computation that people tend to gobble up uncritically. In case you did not know this, in order to get power from nuclear fusion, one has to create and sustain for years conditions that are more severe than those at the center of the sun (the sun takes 10 billion years to consume all of its fuel). Oh. And the only way that we have figured out how to do this is to use superconductors that must be cooled to 4 K while sitting right next to those sun-core conditions.
Not exactly. I know what you're talking about though - it's Magnetic Confinement Fusion. There is also another type, Inertial Confinement Fusion. Now, both of those are very big projects and have their own pros and cons, but I have to ask you: if reversible computation is so easy, why would it be harder than making a fusion reactor? because we have those already. prototypes. where's your prototype reversible computer?

legendary
Activity: 2422
Merit: 1191
Privacy Servers. Since 2009.
https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536
As Ed Gerck Readies Research Paper, Security Experts Say They Want to See Proof

 We all knew this day was coming sooner or later but I guess we didn't realize it would be done without Shor's algorithm  Shocked
 Bitcoin should still be good since it doesn't require factoring large numbers

I really doubt something like that is possible using any modern hardware. Let's wait and see how this is going to unfold. I'm pretty sure he's a fake though, and he won't be able to prove he cracked RSA-2048 or crack the keys provided by the security experts.
legendary
Activity: 1932
Merit: 2354
The Alliance Of Bitcointalk Translators - ENG>SPA
The problem with the credibility of the claim discussed here is that not everyone has the technical background you guys have. This specific board of this specific forum (to which I do not belong) is the exception, not the norm, and most of the people interested in Bitcoin nowadays don't have a clue about cryptography or quantum computing.

What I mean is that some communication media publish that this guy may have found a way to crack a very secure cryptosystem, and most people are simply left with the idea that "hey! Bitcoin is not safe anymore, the price will fall to zero...".

I don't know what the goal of Gerck is, but if this is a tactic to draw attention, the potential damage it can cause is not worth it. Nothing comparable with CSW's melodrama, if the rumor spreads.
sr. member
Activity: 1190
Merit: 469

-Invertibility and reversibility are not exactly the same thing. Reversibility means that the optimal algorithm for computing such a function is an algorithm where there is no computational complexity overhead that results from using a reversible computer rather than an irreversible computer.
that's what reversibility means to you maybe. because you want everything to fit into a reversible computer!

Quote
There are many functions that are invertible but which I would not want to compute on a reversible computer.
then maybe your reversible computer is not really useful in the real world?

Quote
I can give many examples of invertible functions that are not reversibility friendly.

one or two examples would have sufficed. you really went over the top with examples.  Shocked and it's completely off topic to the discussion of quantum computers. you really hijacked the thread even though you have your own thread on reversible computing too. talk about hogging up resources...

Quote
I remember that someone asked for the energy efficiency of biological processes. And the energy cost of transcription of DNA into RNA is below Landauer's limit.
and yet no one has invented a general computing device out of it so it's just a theoretical curiosity at most. probably always will be.

member
Activity: 691
Merit: 51
I have to remind all of you that NIST has butchered cryptographic functions such as SHA-256 and AES by standardizing low quality irreversible cryptographic functions rather than the reversible ones (or partially reversible in the case of SHA-256). I have to give NIST a grade of F- for this egregious oversight.

Correct me if I am wrong, but AES, being a symmetric encryption function, is reversible by design.
-Invertibility and reversibility are not exactly the same thing. Reversibility means that the optimal algorithm for computing such a function is an algorithm where there is no computational complexity overhead that results from using a reversible computer rather than an irreversible computer. There are many functions that are invertible but which I would not want to compute on a reversible computer. I can give many examples of invertible functions that are not reversibility friendly.

Example 1: Consider the operation x->Ax where A is an invertible matrix on a finite field. This operation is invertible, but it is much easier to compute the original function than its inverse since the inverse may be way more complicated than the original function. SHA-256 uses invertible matrices where the inverse is much more complicated than the original function. This means that SHA-256 is not reversibility friendly.

Example 2: Let f be a function where f(a) is always an invertible matrix over a finite field. Then the operation (a,x)->(a,f(a)x) is always invertible, but the inverse operation
(a,x)->(a,f(a)^(-1)x) may be much harder to compute than the original function. The problem of finding an inverse of an n by n matrix takes O(n^3) steps using Gauss-Jordan elimination. There are some faster algorithms for matrix inversion such as the Strassen algorithm, but there are no O(n^2.001) algorithms for matrix inversion that we know about.

Example 3: Let q be an integer, and let a be an integer that is invertible modulo q. Consider the operation z->a*z mod q. This operation is invertible, but the inverse may be more complicated than the original function. For example, if q=2^32 and a=3, then the operation z->a*z is much easier to compute than its inverse.

Example 4: If f is an arbitrary function, then the mapping (x,y)->(x,y XOR f(x)) is invertible, but this function is not reversibility friendly. For example, when transforming x to f(x) on a reversible computer, one typically produces garbage information G(x). On a reversible computer, one will need to do the following to compute our function: (x,y)->(G(x),f(x),y)->(G(x),f(x),y XOR f(x))->(x,y XOR f(x)). This construction is used in Feistel ciphers in cryptography. This means that while Feistel ciphers are useful for cryptography, they are not reversibility friendly. (A one-round sketch of this map is given after Example 6 below.)

Example 5: One way permutations are permutations f where x->f(x) can be computed in a reasonable amount of time but where the inversion f(x)->x has no known polynomial time algorithm. This means that the inversion f(x)->x cannot be computed in practice. This kind of function is not reversibility friendly.

Example 6: In the mix-columns step in AES, to encrypt, we perform the transformation f(x)->a(x)f(x) mod x^4+1 where f(x) is a degree 3 polynomial over the field with 256 elements and a(x) is a fixed polynomial for AES. Here, the developers of the AES chose a(x) so that the coefficients of a(x) and a(x)^(-1) are both simple. This means that f(x)->a(x)f(x) mod x^4+1 and f(x)->a(x)^(-1)f(x) mod x^4+1 are both relatively easy to compute. The problem is that we do not simply compute f(x)->a(x)^(-1)f(x) mod x^4+1 by running f(x)->a(x)f(x) mod x^4+1 in reverse. This means that a reversible computer must have some computational complexity overhead when performing the MixColumns step of AES encryption.
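
Here is the sketch promised in Example 4: a single Feistel-style round as plain Python. It only shows that the map (x,y)->(x,y XOR f(x)) is invertible even when f itself is not; the particular f below is made up for illustration, and the sketch says nothing about the garbage-handling cost discussed above.

Code:
# Example 4 as code (sketch): one Feistel-style round
def f(x):
    # an arbitrary, non-invertible byte-to-byte function (chosen only for illustration)
    return (x * x + 7) % 256

def round_forward(x, y):
    return x, y ^ f(x)

def round_inverse(x, z):
    # the round is its own inverse, because (y XOR f(x)) XOR f(x) = y
    return x, z ^ f(x)

x, y = 42, 99
x2, z = round_forward(x, y)
assert round_inverse(x2, z) == (42, 99)
print(x2, z)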

I remember that someone asked for the energy efficiency of biological processes. And the energy cost of transcription of DNA into RNA is below Landauer's limit. Here is an excerpt from Charles Bennett's 1989 paper on time/space tradeoffs in reversible computation: "However, a few thermodynamically efficient data processing systems do exist, notably genetic enzymes such as RNA polymerase, which, under appropriate reactant concentrations, can transcribe information from DNA to RNA at a thermodynamic cost considerably less than kT per step."

-Joseph Van Name Ph.D.

sr. member
Activity: 1190
Merit: 469


Correct me if I am wrong, but AES, being a symmetric encryption function, is reversible by design.
that's correct. it is a symmetric key cipher, so the encryption and decryption keys are the same and it's reversible. no information is lost in the encryption process. that's my understanding anyway. Shocked
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
I have to remind all of you that NIST has butchered cryptographic functions such as SHA-256 and AES by standardizing low quality irreversible cryptographic functions rather than the reversible ones (or partially reversible in the case of SHA-256). I have to give NIST a grade of F- for this egregious oversight.

Correct me if I am wrong, but AES, being a symmetric encryption function, is reversible by design.
sr. member
Activity: 1190
Merit: 469
larry_vw_1955-I cannot respond to the content of your post right now because I am ignoring you. If you want me to respond to the content of your post, you will need to get someone to quote you. In the meantime, you really need to take a good hard look at yourself and try to become a better person. But you won't because you love being the scum that you are.

-Joseph Van Name Ph.D.

very well, i see you've migrated your extensive knowledge base over to another more appropriately titled thread having to do with your subject expertise. i hope we can become better friends in the future.
hero member
Activity: 1078
Merit: 566
20BET - Premium Casino & Sportsbook

The Scientist who has made this claim is not willing to prove it publicly, so there is very little chance that his claim will get acceptance.

Quote
"Quantum computing has become a reality. We broke the RSA-2048 key," Ed Gerck - Scientist making the claim.

"Breaking RSA is usually attempted by using Shor's algorithm in a quantum computer but there are no quantum computers in existence that can produce enough gates to implement Shor's algorithm that would break 2048 keys," Alan Woodward, a professor of computer science at England's University of Surrey

https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536

Quote
Ed Gerck said all his "QC computations were done in a commercial cellphone, or a commercial Linux desktop," at a capital cost of less than $1,000. "No cryogenics or special materials were used."

https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536

Breaking RSA-2048 needs a quantum computer with 4,000 qubits and 100 million gates, which is expected to arrive in the next 20 to 30 years. The scientist has claimed not only that he has broken RSA but also that no special computing devices were used. This makes the claim even more suspicious.
sr. member
Activity: 1190
Merit: 469
I have to remind all of you that NIST has butchered cryptographic functions such as SHA-256 and AES by standardizing low quality irreversible cryptographic functions rather than the reversible ones (or partially reversible in the case of SHA-256). I have to give NIST a grade of F- for this egregious oversight.
imagine that. a Ph.D. who comes onto a bitcoin forum criticizing the people that invented SHA-256 and AES, and seems to not realize that bitcoin would not exist in its current form without SHA-256, if at all.


And if you have any evidence, post it here in this thread for all to see.
-You are an extraordinarily arrogant chlurmcklet. Go away.

i didn't make that demand of you. someone else did but you misattributed it to me. here take a look:




Do you know the power usage of these biological processes? I don't have journal access so I can't read that paper.

And if you have any evidence, post it here in this thread for all to see.

but maybe you SHOULD post any evidence you have.
full member
Activity: 206
Merit: 447
Shor's algorithm with noiseless qubits and gates works in theory.
Shor's algorithm with noisy qubits and gates does not work in theory. It needs exponentially small noise.
Both in theory.
That's it.
I thought that this is what error correction fixes -- with a huge but only polynomial blow-up.
If I understand it correctly, error correction can reduce noise only in a linear fashion - QCs are analog machines - ten times more resources would reduce the error only tenfold at most. For Shor's to work this is not enough at all - it needs exponentially smaller noise.
legendary
Activity: 2254
Merit: 2003
A Bitcoiner chooses. A slave obeys.
He's really bought his own press, this guy. Roll Eyes

I claim total bullshit. All we are seeing is another "ground breaking discovery" with no proof and the guy behind it is pushing this as hard as he can. It's only a matter of time until he tries to sell something.

I consider my own opinion close to Alan Woodward's.
Quote
Alan Woodward, a professor of computer science at England's University of Surrey: "I'll believe they have done this when people can send them RSA modulus to factor and they send back two primes. Until I see that, I'm just confused and not convinced they've done what they claim in the headlines."
source: https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536


Edit: People on the Linkedin post claim he sends malware when asked for the full paper.
legendary
Activity: 3276
Merit: 3537
Nec Recisa Recedit
...molecular dynamics...

it is one of the most fascinating sectors of the pharmaceutical industry. I had the pleasure of working practically at the beginning, studying "conformational analysis of small molecules" Smiley
it's really expensive right now since you need "a lot of GPUs". Moreover it requires a lot of time for simulations (and maybe you are just evaluating a few nanoseconds ...)

Probably OT, but among other things these arguments have pushed some to "fantasize" further by hypothesizing that technically even our entire universe could simply be a simulation Wink below are a couple of references (one in Italian):

https://en.wikipedia.org/wiki/Simulation_hypothesis
https://www.wired.it/article/matrix-universo-simulazione-nuova-teoria-fisica-infodinamica/
member
Activity: 691
Merit: 51
Did you cite the right paper? There is nothing about doing computation at that energy level or at a rate of 100 MHz; the whole paper instead seems to be about how one can more effectively simulate mechanical computational devices on current hardware.

Here is the other (main) paper: https://arxiv.org/pdf/1801.03534.pdf. No. This is not about current devices, because we currently do not have those energy efficient reversible computers. We are instead looking at directions to innovate.


But practically, how would reversible computing work? To avoid burning energy on erasing data, you'd have to keep the result of every operation. How would caching work, a technology that is vital for keeping our computers fast and entirely based on erasing data when it needs space for new data?
Trick 0: When using reversible computation, one should use partial reversibility instead of complete reversibility. Landauer's limit is not a lot of energy, so to get the most out of reversible computation, one needs to find the sweet spot between irreversibility and reversibility.

Trick 1: Suppose that one would like to compute a function f on the input a. Then using a reversible computer, by storing all of the bits generated in the computation, after the computation one would obtain f(a) along with G(a), which is the garbage information that we generate. In other words, our computation performs the transformation a->(f(a),G(a)). Now, we would like to get rid of the garbage information G(a). To do this, we would first copy our desired output f(a) to obtain (f(a),G(a),f(a)). We would then run our computation a->(f(a),G(a)) in reverse to transform (f(a),G(a)) back into a. When we put everything together, we have computed the mapping a->(a,f(a)) reversibly. The function a->(a,f(a)) is injective and it is a restriction of the bijection (a,b)->(a,f(a) XOR b), which can also be performed on a reversible computer quite easily. (A toy simulation of this trick is given after this list of tricks.)

Trick 2: Suppose that f,g are inverse functions. Then as long as we are able to reversibly transform a to (a,f(a)) and b to (b,g(b)), we may also reversibly transform a to f(a) without producing any garbage information. In particular, we perform the following transformations a->(a,f(a))=(g(f(a)),f(a))->f(a).

Trick 3: One can iterate Trick 1 by using Bennett's pebble game (perhaps generalized to partial reversible computation and digraphs) in order to compute with a manageable computational complexity theoretic overhead.

Trick 4: One can also design functions including encryption and hashing functions for (partial) reversibility.
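
Here is the toy simulation of Trick 1 promised above, written as ordinary Python. It is only a sketch of the bookkeeping (compute with garbage, copy the answer, run the steps backwards); the particular chain of steps is made up for illustration, and nothing here models actual reversible hardware.

Code:
# A toy simulation of Trick 1 (Bennett's trick)
steps = [lambda v: v + 3, lambda v: v * 5, lambda v: v ^ 0xFF]   # some invertible toy steps
undo  = [lambda v: v - 3, lambda v: v // 5, lambda v: v ^ 0xFF]  # their inverses

def compute_with_garbage(a):
    tape = [a]                       # the tape keeps a and every intermediate value (the garbage G(a))
    for s in steps:
        tape.append(s(tape[-1]))
    return tape                      # tape[-1] is f(a)

def bennett(a, out=0):
    tape = compute_with_garbage(a)   # a -> (f(a), G(a))
    out ^= tape[-1]                  # copy the answer: (f(a), G(a), f(a))
    for u in reversed(undo):         # run the computation in reverse to uncompute the garbage
        top = tape.pop()
        assert u(top) == tape[-1]    # each intermediate is uncomputed, not erased
    return tape[0], out              # only (a, f(a) XOR out) is left

print(bennett(7))                    # (7, f(7)) with no garbage remaining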

I have to remind all of you that NIST has butchered cryptographic functions such as SHA-256 and AES by standardizing low quality irreversible cryptographic functions rather than the reversible ones (or partially reversible in the case of SHA-256). I have to give NIST a grade of F- for this egregious oversight.

What would your computer do once you've streamed a full movie online, and have generated terabytes of raw screen data? Do pixels on the screen not "count" in your energy budgets, since they are outside the CPU? They are still erased dozens of times per second, millions of them.
-It is called partial reversibility. We do not have to immediately use reversibility everywhere in order for reversibility to be applicable. Besides, the lowest frequency of visible light has an energy of about 75 kT per photon, so if your device is emitting photons, it is losing more than Landauer's limit of energy per photon.
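
As a rough sanity check of that figure (the numbers below are standard physical constants, not something taken from the posts above), one can compare the energy of a red photon to kT at room temperature:

Code:
# Rough check: energy of a ~700 nm photon versus kT at room temperature
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0            # kelvin

E_photon = h * c / 700e-9         # about 2.8e-19 J
print(E_photon / (k * T))         # roughly 69, i.e. on the order of the ~75 kT quoted above
print(math.log(2))                # Landauer's limit is kT*ln(2), about 0.69 kT per erased bit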




And if you have any evidence, post it here in this thread for all to see.
-You are an extraordinarily arrogant chlurmcklet. Go away.


yeah he doesn't know how to answer a question like that. he just believes... but i'll give it a shot. one of the issues it seems to me is the one you bring up that it would increase the complexity of the computer's hardware architecture to have to "keep the result of every operation". probably not cost effective. you don't get something for nothing but let's hear what Mr. Ph.D has to say. he's the authority.
-You are contributing absolutely nothing to the conversation because you are a very low quality specimen.

he won't know how to answer that question i'm pretty sure unless he just links you to some summary of some research paper that they did to avoid losing tenure...

since I'm the OP I wish i could put him on moderation in this thread to kind of temper his enthusiasm for "reversible computing". but i guess we don't have that feature.  Grin
-You have that nasty attitude because you hate science and you hate research. You are contributing absolutely nothing to the conversation. And unlike a good steak, the meat on your bones has been completely wasted. How sad! And partial reversibility is a thing too. Please learn how to read what I have been telling you. But you can't do that because you hate science. I have much more respect for flat-Earthers than I do for specimens like you. Since you are so pathetic, I am pressing the ignore button. I will not respond to your bullshit anymore unless someone quotes you. I am also ignoring digiran because that entity is also a chlurmck who is not worth talking to.

-Joseph Van Name Ph.D.
copper member
Activity: 1330
Merit: 899
🖤😏
I don't want anyone to attack our doc here, he is a special case. But if anyone is interested in discussing reversible whatnot, please visit this topic and ignore my posts there: https://bitcointalksearch.org/topic/m.62179276
I'm just allergic to any coin other than BTC, hence my hostile approach towards their blockchain and devs.

On topic, extraordinary claims require extraordinary evidence, and people usually fail to provide the evidence.
sr. member
Activity: 1190
Merit: 469

And if you have any evidence, post it here in this thread for all to see.
he won't even answer my basic questions so good luck getting this guy to post any actual evidence.

Quote
But practically, how would reversible computing work? To avoid burning energy on erasing data, you'd have to keep the result of every operation. How would caching work, a technology that is vital for keeping our computers fast and entirely based on erasing data when it needs space for new data?
yeah he doesn't know how to answer a question like that. he just believes... but i'll give it a shot. one of the issues it seems to me is the one you bring up that it would increase the complexity of the computer's hardware architecture to have to "keep the result of every operation". probably not cost effective. you don't get something for nothing but let's hear what Mr. Ph.D has to say. he's the authority.

Quote
What would your computer do once you've streamed a full movie online, and have generated terabytes of raw screen data? Do pixels on the screen not "count" in your energy budgets, since they are outside the CPU? They are still erased dozens of times per second, millions of them.
he won't know how to answer that question i'm pretty sure unless he just links you to some summary of some research paper that they did to avoid losing tenure...

since I'm the OP I wish i could put this thread on moderation to kind of temper his enthusiasm for "reversible computing". but i guess we don't have that feature.  Grin
full member
Activity: 161
Merit: 230
Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit, so the actual bit erasure energy is basically insignificant.
That is a good question.

Disclaimer: I am not a physicist or an engineer specializing in reversible computing hardware.

1. Reversible computation will become essential long before the energy efficiency of computation approaches k*T*ln(2). Regardless of what kind of hardware one uses, if one wants to delete a bit of information, one should spend at least 100*k*T energy per bit deletion in order to overcome thermal noise. We should therefore expect for reversible computing hardware to outperform conventional computing hardware at the level of >100kT.

Wouldn't reversible computation also have to overcome thermal noise? Since the energy not related to bit erasure significantly dominates the energy usage of current technology, I would assume reversible computing would also have such large "hidden" factors.

2. The paper arxiv:1701.08202 indicates that a specific kind of reversible mechanical computation can perform calculations at about 4*10^(-26) Joules per cycle at a clock rate of 100 MHz. This is several orders of magnitude below Landauer's limit. The catch is that it is an engineering nightmare to engineer such a device. Fortunately, there are other proposals for producing energy efficient reversible computing hardware that can be manufactured with something similar to our modern photolithography.

Did you cite the right paper? There is nothing about doing computation at that energy level or at a rate of 100 MHz; the whole paper instead seems to be about how one can more effectively simulate mechanical computational devices on current hardware.

3. In biology, there are already examples of reversible computation (the biosynthesis of messenger RNA). The logical and thermodynamic reversibility of this process has been described in the paper Logical Reversibility of Computation by Charles Bennett. So reversible computation happens in nature. It should therefore be physically possible to manufacture logically and thermodynamically reversible (and partially reversible) computers.

If this is not enough for you, I can send you evidence that reversible computation is underrated.

-Joseph Van Name Ph.D.

Do you know the power usage of these biological processes? I don't have journal access so I can't read that paper.

And if you have any evidence, post it here in this thread for all to see.


But practically, how would reversible computing work? To avoid burning energy on erasing data, you'd have to keep the result of every operation. How would caching work, a technology that is vital for keeping our computers fast and entirely based on erasing data when it needs space for new data?

What would your computer do once you've streamed a full movie online, and have generated terabytes of raw screen data? Do pixels on the screen not "count" in your energy budgets, since they are outside the CPU? They are still erased dozens of times per second, millions of them.
member
Activity: 691
Merit: 51
Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit, so the actual bit erasure energy is basically insignificant.
That is a good question.

Disclaimer: I am not a physicist or an engineer specializing in reversible computing hardware.

1. Reversible computation will become essential long before the energy efficiency of computation approaches k*T*ln(2). Regardless of what kind of hardware one uses, if one wants to delete a bit of information, one should spend at least 100*k*T energy per bit deletion in order to overcome thermal noise. We should therefore expect for reversible computing hardware to outperform conventional computing hardware at the level of >100kT.

2. The paper arxiv:1701.08202 indicates that a specific kind of reversible mechanical computation can perform calculations at about 4*10^(-26) Joules per cycle at a clock rate of 100 MHz. This is several orders of magnitude below Landauer's limit. The catch is that it is an engineering nightmare to engineer such a device. Fortunately, there are other proposals for producing energy efficient reversible computing hardware that can be manufactured with something similar to our modern photolithography.

3. In biology, there are already examples of reversible computation (the biosynthesis of messenger RNA). The logical and thermodynamic reversibility of this process has been described in the paper Logical Reversibility of Computation by Charles Bennett. So reversible computation happens in nature. It should therefore be physically possible to manufacture logically and thermodynamically reversible (and partially reversible) computers.

If this is not enough for you, I can send you evidence that reversible computation is underrated.

-Joseph Van Name Ph.D.
full member
Activity: 161
Merit: 230
For the purposes of this post and all future posts, when I refer to reversible computation, I am referring to classical partially reversible computation in order to perform calculations more energy efficiently.

Do you have any references that show reversible computations in classical computers would be more energy efficient in practice? The actual energy used in computer chips is several orders of magnitude more per operation than the Landauer limit, so the actual bit erasure energy is basically insignificant.
sr. member
Activity: 1190
Merit: 469
Well, since we do not have energy efficient reversible computers yet, this is theoretical work. Do you have a problem with scientific theories? Where are you going with this? I hope you know that before anything becomes practical it is theoretical. If you do not like scientific theories, then you do not like progress.
yeah i do have a problem with silly theories like people thinking one day they will be able to teleport themselves to anywhere on the planet without getting in a car. people can get some really silly notions based off of silly scientific theories they hear about that will never come to pass.

https://en.wikipedia.org/wiki/Teleportation

And can you give an example of when "deleting bits of information" is actually done in modern computer hardware?

Quote
Furthermore, in a previous comment, I told you and everyone else what I meant by deleting bits of information. By deleting information, I was referring to irreversibly replacing each of those bits of information with zeros. But in a hard drive, when you 'delete' information, you simply remove the pointer to that information which does not require k*T*ln(2) energy per bit of information in the location that the pointer is referring to, but when you overwrite the information with other information, then you will need to spend at least k*T*ln(2) energy per bit overwritten.
i'm the one that started this thread you know. you wouldn't be here talking in it if it wasn't for me. just remember that. now, about the overwriting information thing, i'm not disagreeing about that. in fact you just pretty much confirmed my point.

Quote
No. Landauer's principle does come into play because when you overwrite information, you have to spend that >>k*T*ln(2) energy per bit overwritten.
remove that ">>" part and i might be more likely to agree with you but when you put the ">>" thing in there it makes me think you're saying it requires like double the energy. once to erase and once to write which i'm not sure about that. so you're saying that the overwriting process is not one step but two discrete distinct steps. erasing and writing. i don't know about that.

Quote
Oh. And if CPUs and GPUs did not delete any information, we would already have reversible computers.
i would love to hear an explanation about why you think that way. but as you pointed out, this thread was about quantum computers. not imaginary things like reversible ones.


Quote
-You are really starting to )1$$ me off. I have a Ph.D. in Mathematics.
maybe you do but that doesn't mean all of your beliefs are not subject to scrutiny especially if you go around posting them on unrelated threads. this thread was about quantum not reversible but thanks for bringing up the topic since i hadn't heard about it before. Shocked

member
Activity: 691
Merit: 51
and you know that, how? from experience? or from theory?
Well, since we do not have energy efficient reversible computers yet, this is theoretical work. Do you have a problem with scientific theories? Where are you going with this? I hope you know that before anything becomes practical it is theoretical. If you do not like scientific theories, then you do not like progress.

And can you give an example of when "deleting bits of information" is actually done in modern computer hardware?
Hard drives are for long term storage of information instead of for computation. Right now, a typical hard drive may have 1 TB of storage space; deleting all of that information at Landauer's limit would cost 2.4*10^(-8) Joules, and even at 1000 times Landauer's limit (to cover thermal noise), it will still cost only 2.4*10^(-5) Joules to delete all of that information. But what about deleting information in the arithmetic logic unit in a CPU? Do you honestly expect that CPUs and GPUs perform all those calculations without throwing away a few bits here and there? Because if that is the case, then those CPUs and GPUs are already reversible.
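A quick check of those hard-drive figures (not part of the original post; it assumes T = 300 K):
Code:
# Cost of erasing 1 TB at Landauer's limit and at 1000 times the limit.
import math

k, T = 1.38e-23, 300.0          # Boltzmann constant (J/K), assumed temperature (K)
bits = 8e12                     # 1 TB = 8 * 10^12 bits

landauer_per_bit = k * T * math.log(2)     # ~2.9e-21 J per bit erased
print(bits * landauer_per_bit)             # ~2.3e-8 J, close to the 2.4*10^(-8) J quoted above
print(bits * 1000 * landauer_per_bit)      # ~2.3e-5 J at 1000x the limit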

Furthermore, in a previous comment, I told you and everyone else what I meant by deleting bits of information. By deleting information, I was referring to irreversibly replacing each of those bits of information with zeros. But in a hard drive, when you 'delete' information, you simply remove the pointer to that information which does not require k*T*ln(2) energy per bit of information in the location that the pointer is referring to, but when you overwrite the information with other information, then you will need to spend at least k*T*ln(2) energy per bit overwritten.

you see, this landauer's principle you're talking about doesn't even really come into play because nothing is being "deleted" it's just being overwritten. you don't have to erase data first before you write over it. news flash.  Shocked
No. Landauer's principle does come into play because when you overwrite information, you have to spend that >>k*T*ln(2) energy per bit overwritten. Please stop denying scientific facts. Oh. And if CPUs and GPUs did not delete any information, we would already have reversible computers. Please try to learn what you are talking about before making a spectacle of yourself.

no it's because in practice there is no reason for computations to be reversible. for example, imagine what bitcoin would be like if we required sha256 to be reversible. bitcoin wouldn't even exist. but i guess you think there is some reverse inverse function for sha256. there's not.
-You are really starting to )1$$ me off. I have a Ph.D. in Mathematics, so I @#$%ing know what an invertible function is you @$$hole. You are clearly the one who knows absolutely nothing about anything. SHA-256 is not an invertible function. And that means absolutely nothing at all. I ALREADY TOLD YOU THAT WE DO NOT NEED COMPLETE REVERSIBILITY FOR REVERSIBLE COMPUTATION TO BE USEFUL. WE ONLY NEED PARTIAL REVERSIBILITY FOR REVERSIBLE COMPUTATION TO BE USEFUL AND DOMINANT. Have you even looked at the circuit for SHA-256? Do you know about the properties that cryptographic hash functions are supposed to satisfy? Cryptographic hash functions are supposed to be COLLISION RESISTANT. That means that for a cryptographic hash function H, even though there are in principle distinct inputs w,x where H(w)=H(x), in practice, one should not be able to find two distinct inputs w,x with H(w)=H(x). Collision resistance is a weakened form of invertibility because it means that in practice, we are not able to find a specific example of non-invertibility (though it is easy to establish the non-invertibility without producing an example). Now, how do we build collision resistant and efficient cryptographic hash functions? That is right. We use mostly reversible components or at least components that can be made (mostly) reversible with a moderate computational overhead.
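To make collision resistance concrete, here is an illustrative sketch (not anything described in this thread): a birthday search finds collisions easily on a truncated SHA-256 output, while the same search on the full 256-bit output would take around 2^128 attempts and is therefore infeasible in practice.
Code:
# Birthday search for collisions on a truncated SHA-256 (illustration only).
import hashlib
import itertools

def truncated_sha256(data: bytes, n_bytes: int) -> bytes:
    """First n_bytes of SHA-256(data); the real hash uses all 32 bytes."""
    return hashlib.sha256(data).digest()[:n_bytes]

def find_collision(n_bytes: int = 3):
    """Expected work is roughly 2**(4 * n_bytes) hash calls (birthday bound)."""
    seen = {}
    for i in itertools.count():
        msg = str(i).encode()
        digest = truncated_sha256(msg, n_bytes)
        if digest in seen:
            return seen[digest], msg, digest
        seen[digest] = msg

w, x, digest = find_collision(3)   # ~2**12 attempts on average for a 24-bit output
print(w, x, digest.hex())          # two distinct inputs, same truncated hash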

Now, the designers and standardizers of SHA-256 (that is the NSA and the NIST) made a big mistake since they did not design SHA-256 to run on reversible hardware or software, so I have to give the NSA and the NIST both an F- on their design of SHA-256. Yes. It was important for the NSA to design hash functions for partially reversible computers, but they did not do this. The NIST should have also standardized other cryptographic functions as a preemptive measure for unknown (at the time) applications where they needed to use a standard function for a certain piece of technology. Cryptographic hash functions were never designed for cryptocurrency mining, and it turns out that for cryptocurrency mining, you do not really need to have a function that takes arbitrary input and returns a 256 bit output. A 32 bit keyed permutation would be sufficient as the main part of a cryptocurrency mining algorithm, but such a permutation should be designed for reversible computation. Oh. And the NIST should have standardized a cryptographic hash function to be as anti-reversible as possible. After all, reversible computation is a dangerous technology that will power the AI that should be concerning, and a reasonable way to delay progress in reversible computing technologies would be to design a cryptographic hash function that avoids reversible computation to the extent that is feasible.

that's because we have AI like ChatGPT and it's pretty smart. we have low level quantum computers and they are getting bigger. what we don't have a need for is something that can reverse its computations. think about it. a lot of computations are not based on things that have inverses.
-GPT is a dumbass. And the quantum computers that we have have not performed any useful calculations. But you say that we have no need for anything that can reverse its calculations? Do you even know the basics of how quantum computers work? The quantum gates are UNITARY MATRICES. A matrix X (over the reals, complex or even the quaternions) is said to be unitary if X\cdot X^*=X^*\cdot X=1 which means that X is invertible. Of course, with quantum computation, we do not really care much about the overall energy efficiency of those unitary transformations, but it is still invertible.
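An illustrative numerical check (standard textbook gates, not anything specific to this thread) that quantum gates are unitary and therefore invertible:
Code:
# Verify X * X^* = X^* * X = I for two common quantum gates.
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)     # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)          # controlled-NOT gate

for name, X in [("H", H), ("CNOT", CNOT)]:
    identity = np.eye(X.shape[0])
    assert np.allclose(X @ X.conj().T, identity)        # X * X^* = I
    assert np.allclose(X.conj().T @ X, identity)        # X^* * X = I
    print(name, "is unitary; its inverse is its conjugate transpose")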

But I am not here to bash AI or quantum computation. I have been developing my own AI algorithms (or are you going to hate that too?), so I cannot be against them. I am just here to bash the hype behind AI and quantum computation because the people hyping up AI and quantum computation display their ignorance by being so oblivious to reversible computation. I am also here to bash the news behind AI and quantum computation because by ignoring reversible computation, the media has been completely one sided.

You have not given any scientific reason why reversible computation is not possible or feasible. I have given a reason why we do not see the technology yet, and I have given a reason why we should expect to see more work on reversible computing technologies. Yes. Developing practical physically reversible computers will be difficult. And so is developing quantum computation. The only reason why quantum computation is more popular than reversible computation is pure hype.

since you're the Ph.D. feel free and make any necessary corrections
-I already have corrected all your factually inaccurate claims. Factual accuracy is more important than grammar, but you at least need to try to communicate using proper English so that I can respond to your claims more effectively.

In order to stay on topic here, if you want to continue to attempt to discuss reversible computation, we should go on another thread.

-Joseph Van Name Ph.D.







legendary
Activity: 3500
Merit: 6320
Crypto Swap Exchange
I do not seem to understand how this challenge works.

In RSA you need the private key to decrypt anything that is encrypted by it. Now I don't know how it works with encryption, but when you're RSA/DSA/ECDSA signing something with the private key, the public key is exposed. But that doesn't really matter here because he also shared the public key by itself.

If something is "encrypted by the public key", that is by keypair and no password, right? Hopefully he did not use the public key as a password, because that would not make sense?

Looks like more or less he is just making the point that he really can't decrypt anything.

If he can decrypt it, he would then have the private key and then could sign anything. So if he really did find out a way to decrypt something without the private key somehow then he would have the private key.
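For anyone unfamiliar with how such a proof of possession is checked, here is a minimal sketch; it assumes the Python `cryptography` package, and the key, message, and names are illustrative only, not the actual challenge data from the LinkedIn exchange.
Code:
# Proving possession of an RSA private key by signing a message that anyone
# can verify with the published public key alone.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()      # this part is what gets published

message = b"I really do hold the RSA-2048 private key"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(message, pss, hashes.SHA256())

# verify() raises InvalidSignature unless the signer held the matching private key.
public_key.verify(signature, message, pss, hashes.SHA256())
print("signature verified")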

But, since it can't be done it's more or less a FU to him.


This is the same as the magic carburetor that allowed you to get 200 miles to a gallon of gas in your 2 ton truck, it just can't be done.
You just need to use uranium instead of gasoline.





-Dave
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
Quote
Anton Guzhevskiy, the chief operating officer at Australian cybersecurity firm ThreatDefence, also challenged Gerck to prove his claims. "I've shared an RSA-2048 public key and a corresponding private key encrypted by this public key. If you can decrypt the private key, you can sign some piece of text with it, which will prove that you are in possession of the private key," he said in a response to Gerck's post on LinkedIn. "Can you do it?"

"There is a publication delay, and I do not control that," Gerck responded.
Says it all really.

This sounds like CSW all over again. Ask for a signed message as proof, which would be trivially easy to provide if any of the claims were true, and instead be told that we have to wait for some paper or evidence to be published which will "totally prove it". Roll Eyes

I won't be holding my breath.

I do not seem to understand how this challenge works.

In RSA you need the private key to decrypt anything that is encrypted by it. Now I don't know how it works with encryption, but when you're RSA/DSA/ECDSA signing something with the private key, the public key is exposed. But that doesn't really matter here because he also shared the public key by itself.

If something is "encrypted by the public key", that is by keypair and no password, right? Hopefully he did not use the public key as a password, because that would not make sense?
sr. member
Activity: 1190
Merit: 469
Reversible computation will simply use less energy than irreversible computation in order to perform the same task.
and you know that, how? from experience? or from theory?


Oh ok, you got me. Entire *observable universe then. Tongue
yeah i got you this time  Grin


member
Activity: 691
Merit: 51
For the purposes of this post and all future posts, when I refer to reversible computation, I am referring to classical partially reversible computation in order to perform calculations more energy efficiently.

Quantum computing promises exceptional speedup for some specific problems.
Some, like molecular dynamics are extremely important for humanity.
That is great. One can also simulate molecular dynamics by actually constructing the molecules that one wants to study. This is why I am more in favor of using quantum computation for AI. I have my own kind of quantum AI algorithm that reduces the dimension of quantum channels, but I do not have a quantum computer to run it on. But so far quantum computing has promised a lot while delivering no practical applications. Both reversible computing and quantum computing promise performance gains. No human has built a useful reversible or quantum computer yet, but there are prototypes of both. Quantum computation is difficult since one will need to use far more physical qubits because most of those physical qubits will be needed for error correction. Reversible computation is less difficult because one will need to overcome a moderate but manageable computational complexity theoretic overhead, but we can overcome this obstacle gradually using partial reversibility and first for computation that is more amenable to reversibility.

A rational society would look at reversible and quantum computation and value both types of computation more or less equally since they both have their purposes. A rational society would also value reversible computation because reversible computation is not entirely disjoint from quantum computation. How are people so confident that practical quantum computing will be achieved before practical energy efficient reversible computing? How are people so confident that the innovations in practical energy efficient reversible computing will not spur innovation in quantum computing?

There are plenty of reasons to favor reversible computing over quantum computing right now. Energy efficient reversible computing is easier to understand than quantum computing since with reversible computing, one does not have to take tensor products of many complex inner product spaces, nor does one have to worry about the weirdness of quantum information theory.

Quote from: jvanname
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems.

reversible computing sounds like even more of a fantasy than quantum computers.  just because you make a computer that is reversible doesn't guarantee it won't use electricity or even large amounts of it, i would think. i have no idea why something that is reversible would be faster than a conventional computer. so you have some explaining to do there  Shocked got any examples of reversible computers? probably not.
-You need to learn a lot more about the claims of reversible computation before making such bold and arrogant statements about it. You cannot say that something is a 'fantasy' when later you claim that you have 'no idea why' reversible computation will outperform conventional irreversible computation. I never claimed that reversible computation would not use any electricity or gasoline. Reversible computation will simply use less energy than irreversible computation in order to perform the same task.

Landauer's principle states that in order to delete a bit of information (by delete, I mean replace the bit of information with a zero), one must expend more than k*T*ln(2) energy where k is Boltzmann's constant, T is the temperature, and ln(2)=0.69314... Here, k=1.38*10^(-23)Joules/Kelvin. As one approaches Landauer's limit, it gets difficult to reliably irreversibly delete information, so one should expect to expend about 100*k*T to delete the bit of information using irreversible computation regardless of the type of hardware one uses as long as one deletes bits of information one at a time.

The reason why we do not have any reversible computers yet (we do not have any practical quantum computers either) is that Landauer's limit is quite small, and there have been other ways of obtaining performance gains that do not require reversibility. These other methods include taking your 10 micrometer transistors and shrinking them down to 2 nm transistors which is about where we are now. There is currently not much room to shrink transistors because atoms have a diameter of about 0.1 nm. In the past, the energy efficiency of computation was too far away from Landauer's limit for very many people to be concerned with energy efficient reversible computation, but those days are over. This means that people need to be looking out for reversible computation because it is coming. Sure. Developing energy efficient reversible computers will be difficult, but so is developing practical quantum computers.

Anyone who knows nothing about reversible computation but instead hypes up stuff like AI, quantum computation, and even impossible things such as anti-matter engines that can propel spacecrafts to 90% of the speed of light may be safely ignored.

P.S. You would be much more persuasive or at least respectful if you used proper grammar, capitalization, spelling, and punctuation.


-Joseph Van Name Ph.D.


legendary
Activity: 2268
Merit: 18748
This is the same as the magic carburetor that allowed you to get 200 miles to a gallon of gas in your 2 ton truck, it just can't be done.
You just need to use uranium instead of gasoline.

just a minor thing to point out which is no one knows the size of the entire universe or if it is finite or infinite but i get what you're saying. some numbers are just too big to write down. for us mere mortals...
Oh ok, you got me. Entire *observable universe then. Tongue
sr. member
Activity: 1190
Merit: 469
even if every digit of such a number was stored in a single Planck volume, the entire universe would be too small to represent such a number. But this man has factorized such a number on a mobile phone? Lol.

just a minor thing to point out which is no one knows the size of the entire universe or if it is finite or infinite but i get what you're saying. some numbers are just too big to write down. for us mere mortals...

Quote from: jvanname
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems.

reversible computing sounds like even more of a fantasy than quantum computers.  just because you make a computer that is reversible doesn't guarantee it won't use electricity or even large amounts of it, i would think. i have no idea why something that is reversible would be faster than a conventional computer. so you have some explaining to do there  Shocked got any examples of reversible computers? probably not.
staff
Activity: 4284
Merit: 8808
Shor's algorithm with noiseless qubits and gates works in theory.
Shor's algorithm with noisy qubits and gates does not work in theory. It needs exponentially small noise.
Both in theory.
That's it.
I thought that this is what error correction fixes-- with a huge but only polynomial blow up.

Quantum computing promises exceptional speedup for some specific problems.
Some, like molecular dynamics are extremely important for humanity.
full member
Activity: 149
Merit: 165
Metal Seed Phrase at the lowest price! From 44.99
With regards to QC, I always think about these facts:

- Is it easy to run software on QC? Not sure about it. Or develop something to destroy Bitcoin
- If QC is used for destroying Bitcoin, wouldn't it be easier to crack all the banks in the world, and more rewarding? (Unless we get into the fact that destroying Bitcoin is the true target)
- Worst case scenario, if Bitcoin becomes QC vulnerable, and a hard fork has to be implemented, you still keep the coins that you had already... There would be some confusion, like with BCH, but look at how it is now.
member
Activity: 691
Merit: 51
I personally ignore most quantum computation because quantum computing is overhyped. Think about it. Quantum computing promises exceptional speedup for some specific problems. Reversible computing promises moderate speedup for all problems. It is important to get better at computing everything and also to try to get an exceptional speedup for specific tasks. But quantum computing has been overwhelmingly hyped while few have heard of reversible computing. This is not good since the old strategy of simply shrinking transistors and making everything smaller is not going to work any more. This means that we will need another strategy for improving performance in computation, and partially reversible computation is a part of the solution.

Heck, even things that are pretty much impossible like antimatter engines that can propel people to the speed of light are hyped much more than reversible computation. The reason for this is psychological rather than a result of any deficiency of the promises of reversible computation. People are unwilling to accept that the laws of physics may not always make computation easier, and in order to compute within the laws of physics and save energy, one must make tradeoffs between energy efficiency per bit operation and time/space. And the word 'reversible' sounds less magical than the word 'quantum', and people cannot accept that.

For this reason, I am going to ignore any news about quantum computation.

-Joseph Van Name Ph.D.
full member
Activity: 206
Merit: 447
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically.
Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.
This is correct only with ideal noiseless qubits and gates.
That's exactly what a "theoretical QC" is. Hence, your claim of "not vulnerable to QC even theoretically" being wrong.

Shor's algorithm with noiseless qubits and gates works in theory.
Shor's algorithm with noisy qubits and gates does not work in theory. It needs exponentially small noise.
Both in theory.
That's it.

legendary
Activity: 990
Merit: 1108
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically.
Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.
This is correct only with ideal noiseless qubits and gates.
That's exactly what a "theoretical QC" is. Hence, your claim of "not vulnerable to QC even theoretically" being wrong.
full member
Activity: 206
Merit: 447
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory.

Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.


This is correct only with ideal noiseless qubits and gates. Shor's algorithm fails when exponentially small noise is present. That is, it needs noise levels of the order 2^(-n). The same is true for ECDLP. Good luck achieving noise level 2^(-256).

It is very easy to figure out why that happens. We start with qubits which are independent, each representing only 1 bit of information. But before reaching the final result it is not yet clear which of the 2^n possibilities are the correct ones. So midway through the quantum computation, the qubits have to represent 2^(2n) states at the same time. This enormous amount of information vanishes with any noise present.

Read the last sentence in this section https://en.wikipedia.org/wiki/Shor%27s_algorithm#Physical_implementation.
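To put rough numbers on both sides of that argument (not from the post itself; it takes the O(n^2 * log n * log log n) gate count quoted above and the 2^(-n) noise requirement at face value):
Code:
# Scale of an ideal Shor run on RSA-2048 versus the claimed noise requirement.
import math

n = 2048                                              # bits of an RSA-2048 modulus
gates = n**2 * math.log2(n) * math.log2(math.log2(n))
print(f"ideal gate count   ~ {gates:.1e}")            # ~1.6e8 operations
print(f"noise level needed ~ 2^-{n} ~= 10^-{n * math.log10(2):.0f}")  # ~10^-617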

legendary
Activity: 990
Merit: 1108
You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory.

Now you're just talking nonsense. Shor's algorithm factorizes n-digit numbers on a theoretical QC in time O(n^2 * log n * log log n) [1]. Which can in theory factorize numbers of tens of thousands of digits.

Quote
Current QC hardware struggles with RSA-6 (six bits).

The only thing you got right. The current QC factorization record of 21 = 3 * 7 even used some shortcuts for numbers of a special form. So it's fair to say that we have yet to successfully run Shor's algorithm on a QC.

[1] https://en.wikipedia.org/wiki/Shor%27s_algorithm
full member
Activity: 206
Merit: 447
they might not factor 10^1000 size numbers but 2048 bit numbers are an entirely different animal and should be vulnerable to quantum computers at some point. very vulnerable. the only question is, when do companies like atom computing scale up past 1000 qubits to say 1 million qubits. i'd say in the next 10 years at worst.
they went from 100 qubits to 1000 in like 2 years.

You got it wrong. RSA-2048 is not vulnerable to QC even theoretically. Neither is RSA-128 - yes only 128 bits are beyond Shor's algorithm even in theory. Current QC hardware struggles with RSA-6 (six bits).

Finally someone did put the noise into quantum equations and this is the result:
We consider Shor's quantum factoring algorithm in the setting of noisy quantum gates. Under a generic model of random noise for (controlled) rotation gates, we prove that the algorithm does not factor integers of the form pq when the noise exceeds a vanishingly small level in terms of n - the number of bits of the integer to be factored, where p and q are from a well-defined set of primes of positive density. We further prove that with probability 1−o(1) over random prime pairs (p,q), Shor's factoring algorithm does not factor numbers of the form pq, with the same level of random noise present.


sr. member
Activity: 1190
Merit: 469

Not even close to that time-frame. As @o_e_l_e_o  pointed out that is an insane amount of data.
i wasn't talking about the 10^1000 number. i was just talking about 2048-bit integers. big difference. 2048-bit numbers are still pretty big but they're only about 600 base 10 digits. that's it! plenty of room to store it on a computer and do a lot of calculations on it.
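A one-line sanity check of that digit count (not part of the original post):
Code:
# How many decimal digits a 2048-bit integer actually has.
print(len(str(2**2048 - 1)))   # 617, i.e. "about 600 base 10 digits"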

Quote
Things like this will keep coming up again and again and yet it never seems to happen. Nor will it.

-Dave
they might not factor 10^1000 size numbers but 2048 bit numbers are an entirely different animal and should be vulnerable to quantum computers at some point. very vulnerable. the only question is, when do companies like atom computing scale up past 1000 qubits to say 1 million qubits. i'd say in the next 10 years at worst.
they went from 100 qubits to 1000 in like 2 years.

https://arstechnica.com/science/2023/10/atom-computing-is-the-first-to-announce-a-1000-qubit-quantum-computer/
For Atom itself, the step up from 100 to 1,000 qubits was done without significantly increasing the laser power required. That will make it easier to keep boosting the qubit count. And, Bloom adds, "We think that the amount of challenge we had to face to go from 100 to 1,000 is probably significantly higher than the amount of challenges we're gonna face when going to whatever we want to go to next—10,000, 100,000
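A naive extrapolation of that growth rate (illustrative arithmetic only; it ignores error-correction overhead, which is the whole sticking point elsewhere in this thread):
Code:
# If qubit counts kept growing 10x every ~2 years (100 -> 1,000 took about 2 years),
# how long until 1 million physical qubits?
import math

current, target = 1_000, 1_000_000
years_per_factor_of_ten = 2.0
factors_needed = math.log10(target / current)          # 3 more factors of ten
print(factors_needed * years_per_factor_of_ten)        # ~6 years on this naive trend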

Quote from: gmaxwell
If you could crack RSA-2048 there are several very well known public challenges that you could break (one of which has a $200k prize, IIRC). 
That challenge ended in 2007. I guess RSA started getting worried they might have to pay out on some of the larger cash prizes like the $100k prize for RSA-1024....
https://en.wikipedia.org/wiki/RSA_Factoring_Challenge
staff
Activity: 4284
Merit: 8808
If you could crack RSA-2048 there are several very well known public challenges that you could break (one of which has a $200k prize, IIRC).  Anyone who claims to be able to crack RSA and doesn't prove it should just be disregarded.  It's not even worth reading the rest of their claims without the proof.

If they were instead saying they could reduce it to 2^80 work through some mathematical breakthrough, making it vulnerable but not *easily* cracked that would be another matter... but these posts talk about cracking it on a cell phone.

Looking at this I don't think the source is a scammer, but his posts are clearly nonsense.  We all need to keep in mind that *everyone* is online today.  Children, scammers, smart people, dumb people, trolls, and people who are delusional too and the lines between the classes aren't always crisp.

In the days before the internet with smaller communities, you'd learn what people tend to be unreliable sources of information.  But now everyone you hear from is a stranger, and you usually don't know what weight to give their claims.

Ironically the specific nonsense here -- path tracing quantum computation running efficiently on a classical computer -- is strikingly similar to VB's old quantum miner scam. In that case it was much more clearly a scam rather than someone who is confused.
hero member
Activity: 714
Merit: 1298
The claim of breaking RSA-2048 with a commercial phone reminded me of the declaration, back in the past, of the achievement of cold nuclear fusion published in one of the reputable scientific journals.

Guess what happened next. Right, it was proved that the authors of that paper had fallen into an illusion.

Both cases have in common their tendency towards wishful thinking.
legendary
Activity: 3500
Merit: 6320
Crypto Swap Exchange

And based on what i found, it looks like that researcher currently is or used to be the owner of that publication. This is what i did,
1. Read the news mentioned by OP which mentions https://www.researchgate.net/publication/373516233_QC_Algorithms_Faster_Calculation_of_Prime_Numbers.

All our computations were done in a commercial cellphone, or a commercial Linux desktop, as our QC devices -- opening the user market to many industries. No cryogenics or special materials were used. A post-quantum, HIPAA compliant, end-to-end, patent-free, export-free, secure online solution, is being created, based on ZSentry as used from 2004 to 2014, to replace RSA.



the first part doesn't even sound believable to anyone that knows about the difficulty of factoring. the second part sounds like someone that had a little too much to drink.

well i guess this was a false alarm. but at some point in the future, we should expect to see a story with this exact title where they actually provide the factorization. and i wouldn't be surprised if it was in the next 5 years. but it could be 10.  Cheesy

and yeah, this guy gives off Craig Wright vibes. for sure.

Not even close to that time-frame. As @o_e_l_e_o  pointed out that is an insane amount of data.
This is the same as the magic carburetor that allowed you to get 200 miles to a gallon of gas in your 2 ton truck, it just can't be done.

Things like this will keep coming up again and again and yet it never seems to happen. Nor will it.

-Dave
legendary
Activity: 990
Merit: 1108
Quote
We factored numbers with more than 10^1000 decimal digits, and the capital cost was less than $1,000.
There is not enough computing power in the entire world to even store a number with 10^1000 digits, let alone even think about beginning to attempt to factorize it.
Here's a number with more than 10^1000 decimal digits:

10^(10^1000)

And here's its factorization:

2^(10^1000) * 5^(10^1000)

My capital cost was not even $1 Wink
legendary
Activity: 2268
Merit: 18748
Wait wait wait. I'm just reading this in a bit more depth.

Quote
We factored numbers with more than 10^1000 decimal digits, and the capital cost was less than $1,000.

Because the formatting is messed up I originally read that as the plain number 101000, but it's actually 10^1000. Cheesy There is not enough computing power in the entire world to even store a number with 10^1000 digits, let alone even think about beginning to attempt to factorize it. In fact, much like Graham's number, even if every digit of such a number was stored in a single Planck volume, the entire universe would be too small to represent such a number. But this man has factorized such a number on a mobile phone? Lol.
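A rough check of that comparison (not part of the original post; the Planck and universe volumes are standard rough figures):
Code:
# One decimal digit per Planck volume still falls absurdly short of 10^1000 digits.
import math

planck_length = 1.6e-35                        # metres
planck_volume = planck_length ** 3             # ~4e-105 m^3
observable_universe_volume = 3.6e80            # m^3, rough figure

slots = observable_universe_volume / planck_volume                 # ~9e184 Planck volumes
print(f"Planck volumes available ~ 10^{math.log10(slots):.0f}")    # ~10^185
print("digits needed ~ 10^1000, short by a factor of ~10^815")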
sr. member
Activity: 1190
Merit: 469

And based on what i found, it looks like that researcher currently is or used to be the owner of that publication. This is what i did,
1. Read the news mentioned by OP which mentions https://www.researchgate.net/publication/373516233_QC_Algorithms_Faster_Calculation_of_Prime_Numbers.

All our computations were done in a commercial cellphone, or a commercial Linux desktop, as our QC devices -- opening the user market to many industries. No cryogenics or special materials were used. A post-quantum, HIPAA compliant, end-to-end, patent-free, export-free, secure online solution, is being created, based on ZSentry as used from 2004 to 2014, to replace RSA.



the first part doesn't even sound believable to anyone that knows about the difficulty of factoring. the second part sounds like someone that had a little too much to drink.

well i guess this was a false alarm. but at some point in the future, we should expect to see a story with this exact title where they actually provide the factorization. and i wouldn't be surprised if it was in the next 5 years. but it could be 10.  Cheesy

and yeah, this guy gives off Craig Wright vibes. for sure.
legendary
Activity: 3500
Merit: 6320
Crypto Swap Exchange
Yeah, it's just another scammer looking to scam.
But, I'm sure some people will follow him and buy the 'non crack-able device' that he will be selling soon.
Or the magic security box that will not allow for this to happen to YOU.

Not worth thinking about since there are better scammers than this out there.

-Dave
legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
Since we're in a Bitcoin forum, IMO the phrase "Don't trust, verify" is appropriate for that researcher's claim.

Quote
Anton Guzhevskiy, the chief operating officer at Australian cybersecurity firm ThreatDefence, also challenged Gerck to prove his claims. "I've shared an RSA-2048 public key and a corresponding private key encrypted by this public key. If you can decrypt the private key, you can sign some piece of text with it, which will prove that you are in possession of the private key," he said in a response to Gerck's post on LinkedIn. "Can you do it?"

"There is a publication delay, and I do not control that," Gerck responded.
Says it all really.

This sounds like CSW all over again. Ask for a signed message as proof, which would be trivially easy to provide if any of the claims were true, and instead be told that we have to wait for some paper or evidence to be published which will "totally prove it". Roll Eyes

I won't be holding my breath.

And based on what i found, it looks like that researcher currently is or used to be the owner of that publication. This is what i did,
1. Read the news mentioned by OP which mentions https://www.researchgate.net/publication/373516233_QC_Algorithms_Faster_Calculation_of_Prime_Numbers.
2. ResearchGate says that the researcher is part of "Planalto Research".
3. From the DuckDuckGo search results, i found this LinkedIn page for "Planalto Research" which can be seen at https://www.linkedin.com/company/planalto-research.
4. That LinkedIn page mentions http://gerck.com .
5. I can't access http://gerck.com , but an archived page on archive.org shows this,

© Ed Gerck, 2001-2015.

So i agree it's like CSW again. CMIIW.
legendary
Activity: 2268
Merit: 18748
Quote
Anton Guzhevskiy, the chief operating officer at Australian cybersecurity firm ThreatDefence, also challenged Gerck to prove his claims. "I've shared an RSA-2048 public key and a corresponding private key encrypted by this public key. If you can decrypt the private key, you can sign some piece of text with it, which will prove that you are in possession of the private key," he said in a response to Gerck's post on LinkedIn. "Can you do it?"

"There is a publication delay, and I do not control that," Gerck responded.
Says it all really.

This sounds like CSW all over again. Ask for a signed message as proof, which would be trivially easy to provide if any of the claims were true, and instead be told that we have to wait for some paper or evidence to be published which will "totally prove it". Roll Eyes

I won't be holding my breath.
full member
Activity: 161
Merit: 230
Current cellphones and desktops can't do quantum computing, so dude's full of ****
legendary
Activity: 3472
Merit: 10611
We all knew this day was coming sooner or later
You mean the day when someone claims they may theoretically crack a cryptography algorithm? That day came thousands of years ago when the concept of cryptography was introduced. Wink
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
Doesn't look like he actually did it:

Quote
Gerck said all his "QC computations were done in a commercial cellphone, or a commercial Linux desktop," at a capital cost of less than $1,000. "No cryogenics or special materials were used."

I mean, if anybody can demonstrably use this method to break RSA-2048 on my Elitebook 6930p, I will be amazed. But it won't happen.
sr. member
Activity: 1190
Merit: 469
https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536
As Ed Gerck Readies Research Paper, Security Experts Say They Want to See Proof

We all knew this day was coming sooner or later but I guess we didn't realize it would be done without Shor's algorithm  Shocked
 Bitcoin should still be good since it doesn't require factoring large numbers