
Topic: Wall Observer BTC/USD - Bitcoin price movement tracking & discussion - page 115. (Read 26711086 times)

legendary
Activity: 3990
Merit: 4597
OT: Google's quantum computer.
https://techcrunch.com/2024/12/10/google-says-its-new-quantum-chip-indicates-that-multiple-universes-exist/
and
https://blog.google/technology/research/google-willow-quantum-chip/

Quote
Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25 (or 10 septillion) years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

The logic of this is difficult for me to follow: they claim that the fact that their quantum computer is so much faster than a regular computer (it does in 5 minutes what a "regular" supercomputer would need 10^25 years for, roughly a million billion times longer than our Universe has existed) somehow indicates that the quantum computer uses parallel universes to do the calculations. This just does not seem right to me.
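
(Checking that "million billion" figure against the conventional ~13.8-billion-year age estimate - illustrative arithmetic, not part of the quoted post:)

\[
\frac{10^{25}\ \text{years}}{1.38 \times 10^{10}\ \text{years}} \approx 7 \times 10^{14}
\]

which is indeed on the order of 10^15, i.e., a million billion.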

However, I can imagine the quantum computer "spawning" those universes, maybe virtually, to "help" its calculations.
Otherwise, it is not clear to me how it managed to link up with those preexisting universes, as there is nothing there that one might consider an interface.
I read the D. Deutsch book, but the argument there was just as unclear. Perhaps, I am just not getting it.

One could make an algorithm arbitrarily slow, so that it would take several times the age of our universe (itself an estimate based on our current understanding) to finish. Think of using an abacus to solve a complex problem. The fact that a modern computer (quantum or not) can solve the same problem near-infinitely faster does not mean it's somehow using parallel universes to solve it.

I'm not buying it.

I'm pretty sure you are familiar with this, Biodom. And probably AlcoHoDL, too. But I'm laying this down for those who might not be.

The speed or slowness of an algorithm does not depend on the hardware used to run its implementation (i.e., abacus vs. fast CPU). "Time complexity" - that's what it's called - is measured as the number of steps the algorithm must perform as a function of the size of its input. The same number of steps takes different wall-clock time on an abacus than on a CPU, of course, but what's relevant is how fast the step count grows as the input grows. That's why doubling a key's length does not double the number of steps needed to brute-force it, but squares it. For example: if it takes 1,000 steps to break a 100-bit key, a 200-bit key would take 1,000,000. Bring the key length to, say, 1,000 bits and you're set for a few decades. Or centuries.
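
A quick illustrative sketch of that growth (toy Python, my own addition, not from the post - it just makes the squaring-on-doubling rule visible):

Code:
import math

# Brute force tries up to 2**n keys for an n-bit key, so doubling the
# key length squares the number of trials instead of doubling it.
for bits in (100, 200, 1000):
    digits = int(bits * math.log10(2)) + 1
    print(f"{bits}-bit key: up to 2^{bits} trials (a {digits}-digit number)")

# The squaring rule itself: 2^200 == (2^100)^2
assert 2 ** 200 == (2 ** 100) ** 2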

Quantum computing's breakthrough is not about a device faster than an abacus, or faster than the best CPU. It is about a new paradigm of computation that enables different algorithms with a smaller time complexity. For example, if all possible 100-bit keys could be tested in just one operation, that would be constant time, and this could be cleverly exploited to speed up the brute-forcing of longer keys by suitable grouping or whatnot.
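
One real example of a smaller-complexity quantum algorithm (my illustration - it isn't named in the post) is Grover's search, which brute-forces an unstructured n-bit keyspace in on the order of 2^(n/2) iterations instead of 2^n classical trials:

Code:
# Classical unstructured search: ~2^n trials for an n-bit key.
# Grover's quantum search: ~2^(n/2) iterations for the same key.
# Same problem, but a genuinely smaller time complexity.
for n in (56, 128, 256):
    print(f"{n}-bit key: classical ~2^{n} trials, Grover ~2^{n // 2} iterations")

This is also why doubling symmetric key lengths is the standard prescription against quantum search: 2^(256/2) is as hard for Grover as 2^128 is classically.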

That's why I called out to Google, hoping they'll publish a paper stating:

1. The problem to be solved
2. The classical (non quantum) algorithm
3. The quantum algorithm

If/when they do, the scientific community will be able to evaluate the complexity speedup. Until that day, it's hype and uninformed journalists - people who say "exponentially" thinking it means "much".

Thanks, d_eddie, for that reply. I'm well aware of your point, but I don't think the Google scientist frames it that way - or maybe he tries to, but fails. If anything, he seems to do the exact opposite, i.e., he equates the quantum paradigm with the conventional computing paradigm. He says that a conventional computer would take 10 septillion years - longer than the age of the universe - to do the task their quantum computer does in 5 minutes, therefore the quantum computer must somehow be using multiple universes (i.e., a multitude of conventional computers) to manage it so fast. His argument collapses the moment he compares conventional CPU time with the age of the universe.

Thought experiment: imagine a time far in the future when an ultra-fast but still conventional supercomputer (essentially an ultra-fast abacus) has been developed that completes the given task in 26 billion years. That is just under 26.7 billion years, a recently proposed estimate of the age of our universe. Does this mean that the same old quantum computer that was using multiple universes in 2024 has now suddenly switched to using only one?

Quantum computers are exciting in that they can simultaneously evaluate many parameters of a problem, and this may well mean that they are indeed using multiple universes to achieve it (far-fetched as that may sound), but Google's argument that "conventional computers take too long, therefore our QC uses multiple universes" does not make logical sense (to me, at least).

I hope the above explanation is clear enough.

Not to me, sorry.
Do they mean that once you switch it on, it (or just the qubits in it) basically 'floats' in multiple universes, somehow?
Additionally, since different universes are supposed to have different properties, how would that come into play?
I could understand it better if they said it "creates" multiple "mathematical" virtual universes (or computational continuums), but all this talk about universes makes me cringe.



Bitcoin is pulling up because it went to near infinity in multiple universes... we're just overcoming the resistance in this one  Cheesy

donator
Activity: 4760
Merit: 4323
Leading Crypto Sports Betting & Casino Platform
you seem distracted, which might be part of the explanation why you have been in bitcoin since mid-2011 and you still are "accumulating bitcoin."  You either might not know what bitcoin is (or have confidence in it), or you might be distracted into shitcoins and trading... which goes back to not knowing what bitcoin is, even though you've been on the forum since mid-2011.  Who would-a-thunk?

I’m now 100% convinced that you are a moron. Thankfully, I won’t have to pretend to sift through your long-winded nonsensical posts of idiocy anymore. Smiley
legendary
Activity: 4354
Merit: 9201
'The right to privacy matters'
Last time sub 100K.
I'm willing to bet my stupid alpaca on it.

I'm as bullish as the next guy, but it might not be that easy.

The pullback in the next bear could well go under 100k again, just to stay on a midterm viewpoint. I'd like JJG to provide his personal 2-significant-digits-after-the-decimal-point estimate of such probability.

Additionally, as an esteemed and knowledgeable bitcoiner friend of mine recently said, "we must run over 100k and back so many times that it becomes just like any other number". Only then can 100k be properly thrown in the roadkill bucket.

Just pass it and move on to 110k, then 120k by year's end; no one will give a fuck about it at all.

Look at Dec 2020:

we came in at 19.6k

we left at 28.8k

Jan 31 2021 we left at 34.2k

Feb 28 2021 we left at 46.1k

Mar 31 2021 we left at 58.9k
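
(Illustrative arithmetic on the closes quoted above, reading "came in at 19.6k" as the November 2020 close - these are the poster's round numbers, not exchange data:)

Code:
# Month-over-month gains from the closes quoted above.
closes = [("Nov 2020", 19.6), ("Dec 2020", 28.8), ("Jan 2021", 34.2),
          ("Feb 2021", 46.1), ("Mar 2021", 58.9)]
for (m0, p0), (m1, p1) in zip(closes, closes[1:]):
    print(f"{m0} -> {m1}: {p0}k -> {p1}k ({(p1 / p0 - 1) * 100:+.0f}%)")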
legendary
Activity: 2590
Merit: 4839
Addicted to HoDLing!
<snipped - this post, and the posts it quotes, appear in full at the top of the page>
legendary
Activity: 2478
Merit: 1220
Privacy Servers. Since 2009.
Back to six digits! Next target 100k EUR.  Cool
legendary
Activity: 2380
Merit: 1823
1CBuddyxy4FerT3hzMmi1Jz48ESzRw1ZzZ

[chart image]
Chartbuddy thanks talkimg.com
legendary
Activity: 2520
Merit: 3038
<snipped - this post appears in full, quoted, a few posts above>
legendary
Activity: 1526
Merit: 2617
Far, Far, Far Right Thug
Last time sub 100K.
I'm willing to bet my stupid alpaca on it.
legendary
Activity: 2380
Merit: 1823
1CBuddyxy4FerT3hzMmi1Jz48ESzRw1ZzZ

[chart image]
Chartbuddy thanks talkimg.com
legendary
Activity: 1612
Merit: 1608
精神分析的爸


And also: Yay, once again crossed 100k.

Edited to add: I wonder what Scrooge McDuck really said in the English original; I only know the German translation (and the honorable Erika Fuchs took quite some liberty in translating, e.g. Scrooge became Dagobert Duck in Germany).
In German the sentence above was "Wer den Kreuzer nicht ehrt, ist des Talers nicht wert" (roughly: "he who does not honor the Kreuzer is not worth the Taler"), and it was truly something I learned to adhere to when I was like 7 or 8, back in the 70s.
hero member
Activity: 758
Merit: 1844
<snipped - the exchange quoted in full at the top of the page>


Just to also add on to this...

They were using the "random circuit sampling" (RCS) benchmark as the performance test:

https://research.google/blog/validating-random-circuit-sampling-as-a-benchmark-for-measuring-quantum-progress/

"This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes".... this statement is a little misleading as well.... Sure, if you were using SC, but you are not, you are using QC, just as d_eddie stated above "abacus vs CPU"....

I am still reading through it to understand it a little more, but to draw the conclusion that the quantum computer operates in a multiverse because it is fast is a bit of a stretch!
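
For anyone wondering what random circuit sampling actually involves, here is a toy classical simulation of the idea (my own sketch in Python, not Google's code, and it skips the two-qubit entangling gates a real RCS circuit needs): apply layers of random gates to a few qubits, then sample output bitstrings from the final state. The benchmark's whole point is that once there are enough qubits, computing this distribution classically becomes intractable.

Code:
import numpy as np

rng = np.random.default_rng(0)
n = 3                                   # tiny register, so we CAN simulate it
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                          # start in |000>

def random_unitary():
    # QR decomposition of a random complex matrix gives a random 2x2 unitary
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, _ = np.linalg.qr(m)
    return q

# A few layers of random single-qubit gates (real RCS also entangles qubits).
for _ in range(4):
    for qubit in range(n):
        psi = np.moveaxis(state.reshape([2] * n), qubit, 0)
        psi = np.tensordot(random_unitary(), psi, axes=1)
        state = np.moveaxis(psi, 0, qubit).reshape(-1)

probs = np.abs(state) ** 2              # Born rule: |amplitude|^2
probs /= probs.sum()                    # guard against floating-point drift
samples = rng.choice(2 ** n, size=8, p=probs)
print([format(s, "03b") for s in samples])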
legendary
Activity: 2380
Merit: 1823
1CBuddyxy4FerT3hzMmi1Jz48ESzRw1ZzZ

[chart image]
Chartbuddy thanks talkimg.com
legendary
Activity: 2520
Merit: 3038
<snipped - this post, and the posts it quotes, appear in full at the top of the page>
legendary
Activity: 2380
Merit: 1823
1CBuddyxy4FerT3hzMmi1Jz48ESzRw1ZzZ

[chart image]
Chartbuddy thanks talkimg.com