
Topic: Ordinary computers can beat Google’s quantum computer after all (Read 173 times)

legendary
Activity: 4424
Merit: 4794
At the risk of making a completely dumb comment (bearing in mind I've never actually paid much attention to what quantum computing/computers really mean), all I'm getting from here is that quantum computers will be good at quantum computing - classic algos don't apply.

The layman equivalent, to me, would be asking my laptop and an ASIC to do various tasks other than mining, and the poor ASIC will lose?

to dumb it down:

think of 2D maps (x/y axes) and 3D maps (x/y/z axes).

binary is great at 2D. when someone draws a complicated 2D maze that takes 1,500,000 ASICs 10 minutes to solve a path through, that same maze would take a basic desktop binary computer centuries.

quantum is great at 3D mazes and can solve them quite quickly, whereas even 1.5m binary ASICs would take centuries to solve a 3D maze, as they are not designed for 3D maze paths.

however, getting a quantum system to solve a 2D maze while sticking to the rules of 2D logic is not efficient for quantum.
so although quantum might come up with multiple path options using its 3D design, converting 'go x by 5, go y by 3, go z by 2' (3D axes) fails: z just does not convert into binary, so the path gets rejected as a possible answer.

that reduces quantum's solution-finding ability: instead of highly exponential efficiency, it is more like just 2x efficiency at some binary problems.

..
what google suggested years ago was that their quantum computer could solve a 3D maze in minutes, where binary would take decades or centuries, knowing that the maze was never designed for binary.

however, newer research suggests that if you set up a binary system where 1 bit handles x, 1 bit handles y and 1 bit handles z, it might use 3 bits per decision instead of 1, but by doing it that way it can solve 3D mazes a lot quicker in binary than by trying to force a 2D path through a 3D maze.

thus they calculated they could solve a 3D maze A LOT quicker by doing this (a toy version is sketched below).
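a toy version of that encoding idea (just an illustration in Python, not taken from any of the research mentioned): a plain binary program walks a 3D maze fine once each step simply tracks three coordinates instead of two.

Code:
from collections import deque

# Toy 3D maze solver: breadth-first search over (x, y, z) cells.
# Each move changes one of three coordinates - the "3 bits per decision" idea.
SIZE = 4                                     # a 4 x 4 x 4 grid
WALLS = {(1, 0, 0), (1, 1, 0), (0, 1, 1)}    # arbitrary blocked cells
MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def solve(start=(0, 0, 0), goal=(3, 3, 3)):
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        pos, path = queue.popleft()
        if pos == goal:
            return path                      # a shortest x/y/z path
        for dx, dy, dz in MOVES:
            nxt = (pos[0] + dx, pos[1] + dy, pos[2] + dz)
            if all(0 <= c < SIZE for c in nxt) and nxt not in WALLS and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))

print(solve())   # found in milliseconds on ordinary binary hardware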

..
anyway: quantum is great at 3D stuff and not so great at 2D stuff when it has to stay within the rules of 2D so that a 2D system can read, understand and accept the answer.
in short, quantum can't send [x:5, y:3, z:2] to binary, because binary will just error 'what is z?' and reject the answer.
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
One day, when commercial quantum computers actually hit the market, we are going to look back at these races and cringe at how slow they all were, like the lemming DEC Alpha vs. i386 vs. SPARC speed races back in the day.
legendary
Activity: 2968
Merit: 3684
Join the world-leading crypto sportsbook NOW!
At the risk of making a completely dumb comment (bearing in mind I've never actually paid much attention to what quantum computing/computers really mean), all I'm getting from here is that quantum computers will be good at quantum computing - classic algos don't apply.

The layman equivalent, to me, would be asking my laptop and an ASIC to do various tasks other than mining, and the poor ASIC will lose?

full member
Activity: 287
Merit: 159

....

Finally we have some honest commentary on the false claims that quantum computers pose a legitimate threat to the encryption standards cryptocurrencies like bitcoin are built upon. They admit their "quantum technology" was based around a loophole rather than a computational advantage. This acknowledgement voids their later claims that future quantum computers will fare better.

I think the issue with quantum computers is that they lack the basic foundational building blocks necessary to deliver on promises of fundamentally superior processing power. There is no quantum breakthrough analogous to the silicon transistor, one which would allow a quantum computer to be built from basic blocks that are intrinsically superior to silicon semiconductors.

Basic logic gates made of silicon will always be superior to the existing logic gates of quantum computing. Unless a significant breakthrough occurs, that basic fact will not change.
This reminds me of a comment I had made here, back in 2019:

https://bitcointalksearch.org/topic/m.51594701


...

And so, when it comes to building computing machines that will take advantage of this quantum weirdness, the actual devices will simply be employing a complex emergent classical property.  That is, the quantum computers will just be very advanced, very fast classical versions of the computers we have today. (You can see how I find this topic of quantum computing rather silly.)

...
copper member
Activity: 2856
Merit: 3071
https://bit.ly/387FXHi lightning theory
Unless a significant breakthrough occurs, that basic fact will not change.

I think this has been my worry about most encryption on the Internet: once the breakthrough is found, it'll spread and start to be mass-produced fairly quickly imo, and then everyone will have to upgrade their cryptographic algorithms very quickly.

The worrying part is that we don't know when the breakthrough will come, and we just have to hope everyone is given enough warning to adjust to it.



It has also seemed very telling that nothing bigger than 300 qubits has come out of any quantum lab for a few years - a lot of areas of research have stagnated around the same point for now.
legendary
Activity: 2562
Merit: 1441
Quote
If the quantum computing era dawned 3 years ago, its rising sun may have ducked behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their quantum computer Sycamore performed in 200 seconds an abstruse calculation they said would tie up a supercomputer for 10,000 years. Now, scientists in China have done the computation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes a bit of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting to 300 feet from the summit is less exciting than getting to the summit.”

Still, the promise of quantum computing remains undimmed, Kuperberg and others say. And Sergio Boixo, principal scientist for Google Quantum AI, said in an email the Google team knew its edge might not hold for very long. “In our 2019 paper, we said that classical algorithms would improve,” he said. But, “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” Sycamore solved was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or—thanks to quantum mechanics—any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonating electrical circuits made of superconducting metal, can encode any number from 0 to 2^53 (roughly 9 quadrillion)—or even all of them at once.
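To see why those 53 qubits strain a conventional machine: a classical simulation has to store one complex amplitude per basis state, i.e. 2^n numbers for n qubits. A quick back-of-the-envelope sketch (illustrative Python, assuming 16 bytes per complex128 amplitude):

Code:
# Memory needed to hold an n-qubit state vector on a classical machine.
for n in (10, 30, 53):
    amplitudes = 2 ** n                   # one complex amplitude per basis state
    petabytes = amplitudes * 16 / 1e15    # complex128 = 16 bytes
    print(f"{n} qubits: {amplitudes:.2e} amplitudes, {petabytes:.6f} PB")
# 53 qubits -> ~9.0e15 amplitudes -> ~144 PB just to store the state once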

Starting with all the qubits set to 0, Google researchers applied to single qubits and pairs a random but fixed set of logical operations, or gates, over 20 cycles, then read out the qubits. Crudely speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that reinforced some outputs and canceled others. So some should have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
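That procedure can be sketched in miniature (a toy 4-qubit circuit with an invented gate pattern, not Sycamore's actual layout): random single-qubit unitaries plus alternating entangling gates, repeated over 20 cycles, already produce the spiky, far-from-uniform output distribution the article describes.

Code:
import numpy as np

rng = np.random.default_rng(0)
n, cycles = 4, 20            # toy scale; Sycamore used 53 qubits over 20 cycles
dim = 2 ** n

def random_unitary_2x2():
    # Random 2x2 unitary via QR decomposition of a complex Gaussian matrix.
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))   # fix phases so the result is properly random

CZ = np.diag([1, 1, 1, -1]).astype(complex)  # entangling controlled-Z gate

def embed(gate, start):
    """Kron `gate` into the full n-qubit space, acting from qubit `start`."""
    width = int(np.log2(gate.shape[0]))
    full = np.eye(1)
    for _ in range(start):
        full = np.kron(full, np.eye(2))
    full = np.kron(full, gate)
    for _ in range(start + width, n):
        full = np.kron(full, np.eye(2))
    return full

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                               # all qubits start in |0>
for c in range(cycles):
    for q in range(n):                       # random single-qubit gate on each qubit
        state = embed(random_unitary_2x2(), q) @ state
    for q in range(c % 2, n - 1, 2):         # alternating pairs of entangling gates
        state = embed(CZ, q) @ state

probs = np.abs(state) ** 2
print(np.sort(probs)[::-1])   # far from uniform (1/16 each): a spiky "fingerprint"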

The Google researchers argued that simulating those interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory, which has 9216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers with IBM, which developed Summit, quickly countered that if they exploited every bit of hard drive available to the computer, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics at the Chinese Academy of Sciences, and colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.

Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each cycle of gates, with each layer comprising 53 dots, one for each qubit. Lines connected the dots to represent the gates, with each gate encoded in a tensor—a 2D or 4D grid of complex numbers. Running the simulation then reduced to, essentially, multiplying all the tensors. “The advantage of the tensor network method is we can use many GPUs to do the computations in parallel,” Zhang says.
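A minimal illustration of the tensor-network recipe (a two-qubit toy, nowhere near the scale of Zhang's 53-qubit network): the input qubits and the gates each become small tensors, and running the circuit reduces to a single contraction, here done with np.einsum.

Code:
import numpy as np

q0 = np.array([1.0, 0.0])       # qubit 0 in |0>, a rank-1 tensor
q1 = np.array([1.0, 0.0])       # qubit 1 in |0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # 1-qubit gate: rank-2 tensor

CNOT = np.zeros((2, 2, 2, 2))   # 2-qubit gate: rank-4 tensor [out0, out1, in0, in1]
for i0 in range(2):
    for i1 in range(2):
        CNOT[i0, i0 ^ i1, i0, i1] = 1.0

# Contract the whole network in one shot: H acts on q0, then CNOT on (q0, q1).
psi = np.einsum('ijab,ac,c,b->ij', CNOT, H, q0, q1)
print(np.abs(psi.reshape(4)) ** 2)   # [0.5, 0, 0, 0.5]: a Bell state, as expected

Because the contraction can be carried out in any order and split into independent pieces, it maps naturally onto many GPUs working in parallel, which is the advantage Zhang describes.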

Zhang and colleagues also relied on a key insight: Sycamore’s computation was far from exact, so theirs didn’t need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of 0.2%—just enough to distinguish the fingerprintlike spikiness from the noise in the circuitry. So Zhang’s team traded accuracy for speed by cutting some lines in its network and eliminating the corresponding gates. Losing just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.
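The 256x figure matches simple counting (an inference from the article's numbers, not a claim from the paper itself): each line in the network carries a dimension-2 qubit index, so removing a line halves the contraction work, and removing eight gives 2^8 = 256. A toy cost comparison under that assumption:

Code:
import numpy as np

# Two tensor blocks joined by k shared lines, each line a dimension-2 index.
k = 8
a = np.random.rand(2 ** k, 100)
b = np.random.rand(2 ** k, 100)

exact = a.T @ b          # full contraction sums over all 2**k joint line values
cheap = a[:1].T @ b[:1]  # all k lines cut (fixed to one value): 2**k = 256x less work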

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that the Google experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take a few dozen seconds, Zhang says—10 billion times faster than the Google team estimated.

The advance underscores the pitfalls of racing a quantum computer against a conventional one, researchers say. “There’s an urgent need for better quantum supremacy experiments,” Aaronson says. Zhang suggests a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”

Still, the Google demonstration was not just hype, researchers say. Sycamore required far fewer operations and less power than a supercomputer, Zhang notes. And if Sycamore had slightly higher fidelity, he says, his team’s simulation couldn’t have kept up. As Hangleiter puts it, “The Google experiment did what it was meant to do, start this race.”



https://www.science.org/content/article/ordinary-computers-can-beat-google-s-quantum-computer-after-all


....


Finally we have some honest commentary on the false claims that quantum computers pose a legitimate threat to the encryption standards cryptocurrencies like bitcoin are built upon. They admit their "quantum technology" was based around a loophole rather than a computational advantage. This acknowledgement voids their later claims that future quantum computers will fare better.

I think the issue with quantum computers is that they lack the basic foundational building blocks necessary to deliver on promises of fundamentally superior processing power. There is no quantum breakthrough analogous to the silicon transistor, one which would allow a quantum computer to be built from basic blocks that are intrinsically superior to silicon semiconductors.

Basic logic gates made of silicon will always be superior to the existing logic gates of quantum computing. Unless a significant breakthrough occurs, that basic fact will not change.