
Topic: I don't believe Quantum Computing will ever threaten Bitcoin (Read 5409 times)

member
Activity: 846
Merit: 22
To get a privkey from a 2^256 pubkey needs about 2^60 public key generations.

You think a quantum computer can't generate 2^60 pubkeys?
legendary
Activity: 1904
Merit: 1277
if given a public key, how would you use AI to tackle the algorithm and solve for the appropriate private key?

You wouldn't. You'd use a quantum computer running Shor's algorithm.

AI might help you to derive a more efficient algorithm, and improve your solution time slightly, but it does next to nothing to address the fundamental issue, which is the sheer number of potential solutions. Whereas a quantum computer does address this, because its processing power scales differently.

A conventional computer can solve a problem 'x' in 'y' seconds, taking 'z' number of steps.
If you use AI to improve your algorithm, then perhaps it can solve problem 'x' in 'y/2' seconds, so twice as fast - but it will still take 'z' number of computational steps to do so.
The advantage of a quantum computer is that it can drastically reduce 'z', the number of steps required. This is why they are 'faster'. 

Where a classical computer with 'n' bits holds only one n-bit state at a time, a quantum computer with 'n' qubits can represent 2^n states simultaneously. This is because the potential outcomes are superposed.
So as we increase complexity, the number of states that can be represented grows as follows:
Classical: 1,2,3,4,5,6,7,8 etc.
Quantum: 1,2,4,8,16,32,64,128 etc.

The upshot is that while a classical computer takes an unimaginably huge ~2^128 operations to derive a bitcoin private key, a QC running Shor takes a mere 128^3.
It doesn't matter how good your algorithm is: a classical computer always faces that huge number of processing steps.
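Using the figures from this post, the gap can be made concrete with a few lines of arithmetic (an illustrative sketch only; constant factors and quantum error-correction overheads are ignored):

```python
# Step counts from the comparison above (illustrative only).
classical_steps = 2 ** 128   # generic classical attack on a 256-bit ECC key
quantum_steps = 128 ** 3     # polynomial scaling cited for Shor's algorithm

print(f"classical ~ {classical_steps:.3e} steps")   # ~3.403e+38
print(f"quantum   ~ {quantum_steps:,} steps")       # 2,097,152
# 128^3 is exactly 2^21, so the ratio is exactly 2^107.
print(f"ratio     ~ 2^{(classical_steps // quantum_steps).bit_length() - 1}")
```

A factor of 2^107 is not something any algorithmic tweak on classical hardware can close, which is the whole point of the comparison.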
full member
Activity: 224
Merit: 120
Today, the main danger for hacking comes from artificial intelligence. And it is no longer theory, it is practice. There are already break-ins based on this technology. Keys and passwords are being compromised again. It was reported this month that more than one billion accounts could be hacked using artificial intelligence. Why? Because there's something to steal...
... Now, back to the topic. Public key cryptography uses randomness generated from entropy sources to generate the private key. Since the generation method involves a large amount of entropy, if given a public key, how would you use AI to tackle the algorithm and solve for the appropriate private key?
-----------------------------
Now back to the subject of the question you asked.
How can we use artificial intelligence to solve the problem of finding a private key if we know the public key?
I could be wrong, but artificial intelligence is at its core algorithms: a program that develops itself.
If there were an algorithm to find the private key from the public key, the cryptography we use would not exist. Makes sense?
It makes sense, except perhaps for cryptography built on elliptic curves, for the reasons described in the last post.
So no solution algorithm is known to us (which is not to say none is known to anyone).
Then I would use artificial intelligence in another way. I would break the whole computation into billions of components (groups of large sets of numbers to check) and covertly force remote networked computers to work on the problem in secret, much as covert cryptocurrency mining distributes its task to every system available to such an attack. Then all that remains is to hope for a result in polynomial time. Naturally, I would apply every known algorithm that reduces the work of solving discrete logarithm or large-number factorization problems.
As for the human social graph and guessing: artificial intelligence will help with passwords if they are not random, but it will not help at all with keys, since the public/private key pair is generated without regard to any peculiarities of a person's personality.
And of course, the best and most effective way to get at the private key with artificial intelligence is banal phishing, theft, covert espionage, Trojan horse programs and other nastiness, with which artificial intelligence will be loaded first of all.
In that sense, it is interesting to discuss: will our security increase or decrease in the age of artificial intelligence?
It is not as simple a question as it seems at first glance...
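The distribution idea above is easy to sketch, and the arithmetic shows why the post is right to pin its hopes on better algorithms rather than raw scale (a hypothetical sketch; `chunk_bounds` is an invented helper, not any real tool):

```python
# Sketch: dividing a 2^128 search space among a billion hijacked machines.
def chunk_bounds(start, end, n_workers, i):
    """Bounds of the i-th of n_workers near-equal sub-ranges of [start, end)."""
    chunk = (end - start) // n_workers
    lo = start + i * chunk
    hi = end if i == n_workers - 1 else lo + chunk
    return lo, hi

keyspace = 2 ** 128   # effective work for a generic classical ECDLP attack
workers = 10 ** 9     # a billion covertly enlisted computers
lo, hi = chunk_bounds(0, keyspace, workers, 0)
print(f"each worker still faces ~2^{(hi - lo).bit_length() - 1} keys")  # ~2^98
```

Even a billion machines leave each one with about 2^98 candidates, so distributing the work, covert or not, changes nothing fundamental about the total cost.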
full member
Activity: 224
Merit: 120
Today, the main danger for hacking comes from artificial intelligence. And it is no longer theory, it is practice. There are already break-ins based on this technology. Keys and passwords are being compromised again. It was reported this month that more than one billion accounts could be hacked using artificial intelligence. Why? Because there's something to steal...
How do you think AI would affect ECDSA or more specifically public key cryptography? AI (or rather machine learning) does analysis based on certain trends and using passwords/dictionary attacks usually results in the algorithm being fed with big data and finding association and possible passwords based on the targets. If you want to bruteforce using this method, you could find success with leaked database but most likely not with sites that are designed to deter such attempts.

Now, back to the topic. Public key cryptography uses randomness generated from entropy sources to generate the private key. Since the generation method involves a large amount of entropy, if given a public key, how would you use AI to tackle the algorithm and solve for the appropriate private key?
--------------------------
I do not see how artificial intelligence technology could break elliptic curve cryptography. The point is that, so far, no polynomial-time algorithm is known for this problem. Any program (and artificial intelligence is a program with feedback on itself) struggles with such a problem: a program needs an algorithm, and for the elliptic curve discrete logarithm no algorithm is known that solves it in polynomial time.
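To illustrate the point about missing polynomial-time algorithms: the best generic methods known run in roughly the square root of the group size, which is still exponential in the key's bit length. Here is a toy baby-step giant-step discrete logarithm in a small multiplicative group (a sketch for intuition, not elliptic curve code):

```python
from math import isqrt

def bsgs(g, h, p):
    """Find x with g^x = h (mod p), in O(sqrt(p)) time and memory."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}  # baby steps: g^j for j < m
    factor = pow(g, -m, p)                      # g^(-m) mod p (Python 3.8+)
    gamma = h
    for i in range(m):                          # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None

x = bsgs(5, 8, 23)   # 5 generates Z_23*; solves 5^x = 8 (mod 23)
print(x)             # 6, and indeed 5**6 % 23 == 8
```

For a 256-bit curve the square root of the group size is about 2^128 steps, which is exactly the wall the thread keeps running into.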
It seems there is none.
However, there are very serious doubts about that.
Not that this problem has solution algorithms hidden from us; rather, the elliptic curves over finite fields may themselves contain hidden loopholes, weak points known only to the initiated. Here, read this analysis and draw your own conclusions.
The analysis, in abbreviated form, on this topic:
-----------------------   

I do not want to stoke the fears of those present here, but if you study the question of security seriously, you need to know this.

This material gives reasoned answers to two important questions:

1. Is cryptography on elliptic curves as safe as we think?

2. Are quantum computations really dangerous for modern public-key cryptosystems?

Since 2015 there has been lively activity in high circles, among official organizations whose work is directly related to cryptography.
Why everything suddenly became so urgent, no one explains to us.
They probably know more than they say. And they are covering their tracks...

The competent organizations that set universal technical standards are very noticeably concerned with the problems of so-called quantum-safe cryptography. Here are the facts that even we non-specialists in cryptography should pay attention to.

The international symposium "ETSI/IQC Workshop on Quantum-Safe Cryptography" (https://www.etsi.org/events/1072-ws-on-quantumsafe) was held on September 19-21, 2016 in Toronto, Canada. To underline the significance of this event: ETSI is the European Telecommunications Standards Institute (the industry counterpart of NIST, the main standardization body in the United States), and IQC is the Institute for Quantum Computing at the University of Waterloo, one of the world's leading research centers, which has studied cryptography in the context of quantum computers for more than a dozen years.

With such solid organizers, the participants included not only leading scientists from academia and industry but also senior figures from transnational corporations and government departments of Europe, North America, Japan, China and South Korea.

And besides, senior chiefs of the intelligence services responsible for information protection in states such as Britain, Canada and Germany.

And all these very busy people gathered in Toronto, back in 2016, to discuss how to strengthen cryptography to withstand technologies that, even according to the most optimistic estimates, will become a real threat in twenty years, at least.

If we take into account the fact that, almost simultaneously, in August 2016, NIST (USA) officially announced the launch of its own large-scale program for the transition from traditional cryptography to “post-quantum” cryptography, then the conclusion will be quite obvious.

In the world of cryptography, big changes have clearly already begun. And they began very hastily, even with signs of panic. Which, of course, raises questions. Here is why.

In the United States, the first official signal that traditional cryptography urgently needed modernizing came in August 2015. It was then that the National Security Agency, the state's main authority on ciphers, issued a statement on significant changes to its basic policy, in connection with the need to develop new standards for post-quantum cryptography, or PQC for short (National Security Agency, "Cryptography Today", August 2015).
The parties involved, and the NSA itself, stated that they considered that moment (still 2015-2016) the right time to get to grips with developing new public-key protocols: cryptography whose strength would not depend on computations performed by quantum computers.

Naturally, the thought arises that someone, somewhere, secretly built a real quantum computer back then. And since the most visible and decisive initiative for an early transition to new, quantum-safe cryptography came from the NSA, it is easy to guess which state comes to mind first: the one with not only the largest budget for such initiatives but all the necessary scientific and technical capabilities. The NSA is a highly classified organization with secret access to the most powerful supercomputers on the planet.

In the open community of cryptographers, puzzled by the haste of the new initiatives, there are naturally plenty of other speculations about what is happening. Perhaps the most informative is the survey that summarizes and compares all such hypotheses and assumptions without reaching a final answer: the well-known article prepared at the end of 2015 by the very famous cryptographers Neal Koblitz and Alfred Menezes, "A Riddle Wrapped in an Enigma".
To make it clearer why it makes sense to focus on the facts from this particular analytical work, two points should be briefly clarified.
First: the place its authors occupy in open academic cryptography.
Second: how closely their own scientific work is intertwined with the NSA's initiatives to move the cryptographic algorithms in use onto other tracks.

The American mathematician and cryptographer Neal Koblitz is (along with Victor Miller) one of the two people who in 1985, simultaneously and independently, invented a new public-key crypto scheme called ECC (an abbreviation, recall, for Elliptic Curve Cryptography).

Without going deep into the technical details of this method and its differences from the earlier RSA scheme, note that ECC has obvious practical advantages: the same theoretical strength of the algorithm is achieved with a much shorter key (for comparison, 256-bit ECC operations are roughly equivalent to working with a 3072-bit modulus in RSA). This greatly simplifies the calculations and significantly improves system performance.
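The 256-bit-ECC vs 3072-bit-RSA figure comes from the commonly cited NIST comparable-strength table, sketched here for reference:

```python
# Commonly cited NIST comparable key strengths (bits of security).
# security level -> (ECC key size, RSA modulus size)
equivalences = {
    80:  (160, 1024),
    112: (224, 2048),
    128: (256, 3072),
    192: (384, 7680),
    256: (512, 15360),
}

ecc_bits, rsa_bits = equivalences[128]
print(f"128-bit security: {ecc_bits}-bit ECC vs {rsa_bits}-bit RSA modulus")
```

The gap widens fast: at the 256-bit security level an RSA modulus needs 15360 bits, thirty times the ECC key size, which is why ECC won out in practice.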
The second important point (almost certainly related to the first) is that the extremely secretive NSA leaned in favor of ECC in its cryptographic preferences from the very beginning. (!)

In the early years and decades this reached academic and industrial circles only implicitly (as when, in 1997, an NSA official, Jerry Solinas, first spoke at the public Crypto conference, with a report on the agency's modification of the famous Koblitz scheme).

Then it was documented. In 2005 the NSA published its recommendations on cryptographic algorithms in the form of the so-called Suite B: a set of openly published ciphers for protecting secret and top-secret information in national communication systems.

All the basic components of this document were built on ECC, while RSA was assigned the auxiliary role of a "first generation" (!) system, needed only for a smooth transition to the new, more efficient cryptography on elliptic curves... (!)
Now we need to recall Alfred Menezes, the second co-author of "A Riddle Wrapped in an Enigma". The Canadian mathematician and cryptographer has spent his whole scientific life, since the mid-1980s, at the University of Waterloo, one of the most famous centers of open academic cryptography. It was there, in the 1980s, that three university professors created Certicom, a company dedicated to the development and commercial promotion of cryptography on elliptic curves.

Accordingly, Alfred Menezes eventually became not only a prominent Certicom developer and the author of several authoritative books on ECC crypto schemes, but also co-author of several important patents covering ECC. The NSA, in turn, when launching its Suite B project, had previously purchased from Certicom a large package (twenty-odd) of patents covering "elliptic" cryptography.

This whole preamble was needed to explain why Koblitz and Menezes are precisely the people who could, for natural reasons, consider themselves well informed about the NSA's current affairs and plans in cryptographic information protection.
Yet for them, the NSA's sharp change of course toward post-quantum algorithms came as a complete surprise. (!)
Back in the summer of 2015 (!), the NSA quietly, without explaining anything to anyone, removed the P-256 ECC algorithm from its suite, while keeping its RSA equivalent with a 3072-bit modulus. Moreover, the NSA's accompanying statements said quite clearly that parties implementing the Suite B algorithms now had no reason to switch to ECC; better simply to increase RSA key lengths and wait for the new post-quantum ciphers to appear...
But why? What is the reason for such a sharp rollback to the old RSA system? I do not think so serious an organization makes such serious decisions for no reason.
Koblitz and Menezes had every reason to consider themselves competent in elliptic curve cryptography, yet they had heard absolutely nothing about new attack methods compromising "their" crypto scheme. So everything happening around ECC astonished the mathematicians.
People with close contacts in this industry know that the large corporations supplying cryptographic equipment and services for the US government always receive some kind of advance warning about changes of plan. In this case there was nothing of the kind.
Even more unexpected, no one from the NSA approached the people at NIST, who are responsible for the state's open cryptographic standards.

And finally, even the NSA's own cryptographic mathematicians in the Information Assurance Directorate (IAD) were taken completely by surprise by the leadership's post-quantum initiative...

It can be concluded that the very influential people in the bowels of the NSA who initiated the public change of course did so without any feedback or consultation, even with their own experts. That is the conclusion Koblitz and Menezes reach in their analysis. And they readily admit that, in the end, no one really understands the technical background of what is happening here.
The conclusion suggests itself that there was some unknown activity, some hidden actors.

For an adequate perception of the intrigue, it is very useful to know that the principles of public-key cryptography were in fact discovered almost simultaneously (in the 1970s) in two fundamentally different places. First, a few years earlier, by secret cryptographers within the walls of the British intelligence service GCHQ, the counterpart and closest partner of the American NSA. As had long been the custom there, everything was done in deep secrecy and kept strictly in-house.

The discovery was made not by GCHQ's full-time staff but by the mathematicians of its CESG unit, responsible for national ciphers and the protection of government communication systems in the UK. The close interaction between GCHQ and the NSA runs primarily along the lines of joint intelligence activity. In other words, although the NSA has its own IAD (Information Assurance Directorate) department specializing in cryptographic algorithms and information protection, the British discovery was a complete surprise for that unit's mathematicians, who first learned of it from their fellow spies who work closely with the British...

And when essentially the same algorithms, based on factorization and discrete logarithms, were soon invented in the USA, independently of the special services, by researchers in the open community (Diffie, Hellman, Merkle, Rivest, Shamir, Adleman), the NSA made a huge effort to cram the genie back into the bottle.

Without revealing that the special service already had this mathematics, the NSA chiefs simply tried in every possible way to prevent the scientists from publishing it widely. The national-security advocates pressed the argument that strong cryptography is too serious a weapon: the new public-key algorithms allow anyone, even people and parties who have never met, to hide from control.

As everyone knows, the NSA's attempt to ban knowledge and gag scientists achieved absolutely nothing. As a result, the open scientific community was very angry with the NSA. Moreover, under pressure from scientists and industry, it was not the spy service but the civilian structure, NIST, that came to lead the development and implementation of commercial cryptography in the country.

And although this story is very old, it is now quite clearly repeating itself, provided, of course, that you watch carefully.

The ETSI/IQC international symposium on quantum-safe cryptography (in 2016), with which this story began, has several notable features.
First, it was very solidly attended by the heads of important structures in the intelligence services of Great Britain, Canada and Germany, all counterparts of the American NSA. Yet absolutely no one from the NSA was mentioned explicitly. And that, of course, is no accident.

There is plenty of evidence, from business leaders and directly from the heads of intelligence agencies, that after Edward Snowden's revelations almost the entire US IT industry (not to mention other countries) reacts extremely negatively to NSA activity. In other words, at international forums discussing ways to strengthen cryptography against new threats, it is now prudent for the NSA simply to keep a low profile.

Another notable feature is that this workshop in Toronto was not the first but the fourth in a row. The first was held in 2013 in Paris, and the second, especially interesting for us, took place in the fall of 2014 in Ottawa, the capital of Canada.
That event is interesting because it featured a highly unusual report on behalf of the secret British intelligence service GCHQ (P. Campbell, M. Groves, D. Shepherd, "Soliloquy: A Cautionary Tale"): a report from the CESG information-security division, delivered personally by Michael Groves, who leads cryptographic research at that agency.

It must be emphasized here that it is completely uncharacteristic for people from the British special services to talk about their secret developments at open conferences. But this case was truly exceptional.

In his report, Groves revealed not only that British cryptographers had been developing quantum-safe algorithms for a long time, since the early 2000s, but also that their own such scheme, Soliloquy, had been abandoned.

Importantly, the decision to abandon it outright (rather than strengthen and modernize the old design) was driven mainly by a very powerful attack that greatly impressed the British, developed back in 2013 (!) by a group of researchers from the open academic community. Their paper (K. Eisentraeger, S. Hallgren, A. Kitaev, and F. Song, "A quantum algorithm for computing the unit group of an arbitrary degree number field", STOC, ACM, 2014) describes an essentially new quantum attack of a very general type, covering in particular a wide range of "post-quantum" crypto schemes, including Soliloquy, unknown to anyone at that time...

The effect of this half-open speech by a senior cryptographer of the British secret service was exactly as evidently intended. The information-security industry and academia readily accepted the CESG people as very knowledgeable consultants (who had clearly demonstrated not only leading competence but also a willingness to share even the experience of their failures). At the forum in Toronto, two CESG bosses were even entrusted with chairing sessions and moderating discussions. (!)

A completely different effect also immediately showed itself, of the kind that usually accompanies any cooperation with special services: excessive secrecy, and attempts to muffle even already published research results.

The story of the CESG chief cryptographer's appearance at the open symposium received extremely sparse media coverage, and the Soliloquy article and presentation slides can be found on the Web only by those who know very precisely what they are looking for (the files sit exclusively on the ETSI website, and no direct links to them are detectable).

But the most unpleasant is otherwise.

If anyone wants to read for themselves the open-community article that so impressed the British intelligence service, it quickly becomes clear that it is not easy to find. The article is absent from Arxiv.org, the scientific preprint site where computer scientists and cryptographers have long published alongside physicists and mathematicians. It is also absent from Eprint.iacr.org, the specialized cryptographic preprint site of the IACR, the International Association for Cryptologic Research. Yet each of the article's authors has many other publications on one or even both of these sites.

But the very work we need is not there. Strange, but true.
Worse, if you set off to search for the file on the researchers' personal web pages on university sites, an ambush awaits there too. The most famous co-author, Alexei Kitaev, is renowned as a superstar of quantum computing, has only a tangential relation to cryptography, and does not collect links to his publication files anywhere.

Another co-author, Sean Hallgren, genuinely known as a cryptographer, used, like many other researchers, to post links to his publications on his university web page. But with precisely the article we are interested in, that practice suddenly stopped. For all previous articles the files are available; for the one we need, only the title. For subsequent 2015-2016 publications, not even a title, although those works can be found in the preprint archives...

A truly complete list of everything done, or even planned (with the appropriate links to files), is found only for the youngest of the co-authors, Fang Song: significantly, not on his university web pages but on his personal site FangSong.info. And even here strange losses come to light. A PDF of one variant of the article we are looking for is still there, but links to roughly the same file labeled "full version" and "Arxiv.org" turn out to be broken, looping back to the main page. That is, the author clearly uploaded the files, but even here, as on the arXiv site, they inexplicably disappeared...
All "disappearances" of this kind (and there are quite a few similar cases) can be taken at face value only on a very naive and superficial view of things. Most often the explanation is already contained in the headers of the articles, where the authors (following rules long instituted by scientists) are obliged to acknowledge the funding sources and grants that paid for the research.

Specifically, in our case, the sponsor of this uniquely outstanding article on a new method of quantum cryptographic attack is (surprise!) the US National Security Agency. And he who pays the piper calls the tune, as you know. The authors of a study are always interested in the wide dissemination of their results, but their sponsors often have directly opposite goals...

The only dark and really important point that has not yet been covered in this entire story is this.

What can connect the new, very effective (and, to the special services, very impressive) algorithm for breaking all kinds of cryptosystems with a hypothetical quantum computer, and the NSA's hasty steps (back in 2015-2016) to remove elliptic curve cryptography from circulation? The connection, it turns out, is quite direct. But to notice it, once again, one must watch carefully what is happening.

When, at the turn of 2014-2015, the open community first learned of the British intelligence service's post-quantum Soliloquy algorithm, its subsequent compromise, and the parallel invention of the quantum attack, one very competent and knowledgeable cryptographer, Dan Bernstein, made an interesting generalization:
https://groups.google.com/forum/#!topic/cryptanalytic-algorithms/GdVfp5Kbdb8

Comparing all the facts known at the time, Bernstein advanced the hypothesis that the new quantum algorithm of Hallgren, Fang Song and colleagues in fact also points the way to significantly more powerful attacks using traditional classical computers.

Moreover, on the basis of well-known, but very vague comments by the British, Bernstein concluded that the British special services know this, but prefer to keep it secret from everyone ...

And we know what happened afterwards. A few months later, in August 2015, the NSA suddenly surprised the whole cryptographic world with its sharp rejection of ECC cryptography with a relatively short key length.

The only ones who were hardly surprised were probably the cryptographers of the British intelligence service.

Well, six months later, at the beginning of 2016, at least two independent publications from researchers in the open cryptographic community appeared, which in the most general terms confirmed Dan Bernstein's assumption:

1) Ronald Cramer, Léo Ducas, Chris Peikert, Oded Regev, "Recovering Short Generators of Principal Ideals in Cyclotomic Rings", Eurocrypt 2016;

2) Jean-François Biasse and Fang Song, "Efficient quantum algorithms for computing class groups and solving the principal ideal problem in arbitrary degree number fields", 27th ACM-SIAM Symposium on Discrete Algorithms.

In other words, it has now been rigorously shown, for all to see, that yes, indeed, the new purely "quantum" approaches to solving hard cryptographic problems can in fact significantly reduce the cost of breaking crypto schemes with classical computers.

Specifically, nothing has been openly announced yet about compromising the ECC scheme.

Or maybe no announcement is needed?
Let's think together: would disclosure benefit those in the know?

But this, it seems, is only a matter of time.
legendary
Activity: 2954
Merit: 4158
Today, the main danger for hacking comes from artificial intelligence. And it is no longer theory, it is practice. There are already break-ins based on this technology. Keys and passwords are being compromised again. It was reported this month that more than one billion accounts could be hacked using artificial intelligence. Why? Because there's something to steal...
How do you think AI would affect ECDSA or more specifically public key cryptography? AI (or rather machine learning) does analysis based on certain trends and using passwords/dictionary attacks usually results in the algorithm being fed with big data and finding association and possible passwords based on the targets. If you want to bruteforce using this method, you could find success with leaked database but most likely not with sites that are designed to deter such attempts.

Now, back to the topic. Public key cryptography uses randomness generated from entropy sources to generate the private key. Since the generation method involves a large amount of entropy, if given a public key, how would you use AI to tackle the algorithm and solve for the appropriate private key?
full member
Activity: 224
Merit: 120
OP's "don't believe" is a pure speculation at the moment. We know nothing about potentials of future technologies. If quantum computing become power enough to break the current ECDSA scheme and other algos involved then quantum resistant   cryptography will  take the place. The biggest problem  for scientists in the 17th century was how to clean the Earth from a layer of manure that (as they believe)  will cover it in 100 years. That  problem disappeared after the horses (- the main means of locomotions in that time) were replaced by steam and an internal combustion engines.
I think post-quantum cryptography will take its place whether or not quantum computers ever appear. The problem with elliptic curve cryptography is not that it has been broken, but that it is impossible to verify the trustworthiness of the elliptic curves we are forced to use. There is a lot of information on this subject from specialized sources, the main takeaway being that some elliptic curves have proved unreliable even though they were recommended by very influential, world-renowned organizations.
In addition, existing cryptography on elliptic curves rests on an unproven statement: an assumption, a hypothesis.
Another problem is that hackers do not break cryptography; they steal keys by cracking the key infrastructure.
No one pays attention to this, as long as it doesn't affect them personally.
And here quantum cryptography solves, on the one hand, all the problems of elliptic curve cryptography, but on the other does not solve the problem of key infrastructure compromise at all.
The solution of the future is keyless encryption technology. Such technologies, as far as I know, are already being developed.
Today, the main danger for hacking comes from artificial intelligence. And it is no longer theory, it is practice. There are already break-ins based on this technology. Keys and passwords are being compromised again. It was reported this month that more than one billion accounts could be hacked using artificial intelligence. Why? Because there's something to steal...
full member
Activity: 224
Merit: 120
I don't know how dangerous a quantum computer is, but I know how dangerous artificial intelligence already is as a password-guessing aid!
Cybercriminals use artificial intelligence and neural networks to improve password-guessing algorithms. More traditional tools, such as HashCat and John the Ripper, already exist; they compare different candidate password hashes to identify the password that matches a target hash. Using neural networks and Generative Adversarial Networks (GANs), however, cybercriminals can analyze vast sets of leaked password data and generate password variations that match their statistical distribution. In the future, this will lead to more accurate and targeted password guessing and a higher chance of profit.

In a February 2020 clandestine forum post, we found a GitHub repository that has a password analysis tool with the ability to parse 1.4 billion accounts and generate password variation rules.
In addition, we also saw a post listing a collection of open-source hacking tools that have been cracked. Among these tools is AI-based software that can analyze a large set of password data from data leaks. This software ensures that it extends its ability to guess passwords by teaching GAN how people tend to change and update passwords.
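For readers unfamiliar with these tools, the rule-based half of this is easy to illustrate. Below is a toy mutation generator in the spirit of HashCat's rule engine, not a GAN; the function name and rule set are my own invention, chosen to mimic how people typically tweak passwords.

```python
def mutate(password):
    """Generate common human-style variations of a base password."""
    # Classic "leetspeak" letter substitutions.
    leet = str.maketrans({"a": "4", "e": "3", "i": "1", "o": "0", "s": "5"})
    variants = {password, password.capitalize(), password.translate(leet)}
    # People often append a digit run, a year, or punctuation when forced to update.
    for suffix in ("!", "123", "2020"):
        variants.add(password + suffix)
        variants.add(password.capitalize() + suffix)
    return variants

print(sorted(mutate("hunter")))
```

A GAN-based tool does the same job statistically: instead of hand-written rules, it learns which mutations people actually make from billions of leaked examples.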
legendary
Activity: 1904
Merit: 1277
Cross-posting from ...

How hard would it be to brute force an address. (Numerically)
- https://bitcointalksearch.org/topic/how-hard-would-it-be-to-brute-force-an-address-numerically-5267859

Thanks for that. I've made a quick post on that thread now, briefly summarising how much more effective a QC is at breaking bitcoin's cryptography, and outlining why QCs have such vast potential. Hopefully this is of some use to the discussion!
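To put rough numbers on it, here is a back-of-the-envelope calculation echoing the 2^128 vs. 128^3 comparison made elsewhere in this thread. The 10^12 checks-per-second rate is an arbitrary assumption for a very fast classical attacker.

```python
# Classical brute force against ~2^128 ECDSA security, assuming (generously)
# a rig that checks 10^12 candidate keys per second.
operations = 2 ** 128
rate = 10 ** 12                       # checks per second (hypothetical)
seconds_per_year = 60 * 60 * 24 * 365
years = operations / (rate * seconds_per_year)
print(f"classical: ~{years:.2e} years")   # on the order of 10^19 years

# Shor's algorithm on an ideal quantum computer needs roughly 128^3 steps
# for a 128-bit security level: polynomial, not exponential.
print(f"quantum (Shor): ~{128 ** 3:,} steps")
```

The contrast is the whole story: no improvement to the classical rate closes a gap between ~10^19 years and a few million steps.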
full member
Activity: 224
Merit: 120
I think that anonymity of a bitcoin owner and guessing or calculating a bitcoin address are different things.
No matter the bitcoin address, I am far from thinking that a self-respecting hacker would brute-force the code to recover the hash. If I wanted to identify the owner of some bitcoin, I would do it by finding the client's IP. If I knew one, or the required set, of the bitcoin owner's IP addresses, I would attack the owner with special software; I don't want to advertise bad things, so I won't name which one.
Hi.
Such turnkey software is already widely available, from buying components right on the net to purchasing ready-made complete solutions that even a child can use. Probably within 24 hours I would see which keys (both physical and on-screen) my target presses, and even where he moves the mouse on the screen. I think, though I don't know, that the whole financial side of the attack would cost me $1,500. If the target is of financial interest to the hacker, then it is a matter of technique and ingenuity, not of quantum computers and brute-force attacks on the code.
Hackers are thinking people, unlike many network users. If only there were a point...
legendary
Activity: 2646
Merit: 1720
https://youtu.be/DsAVx0u9Cw4 ... Dr. WHO < KLF
Interesting thread!  Smiley

Cross-posting from ...

How hard would it be to brute force an address. (Numerically)
- https://bitcointalksearch.org/topic/how-hard-would-it-be-to-brute-force-an-address-numerically-5267859

In my opinion, sooner or later computers will be so efficient that they will perform these calculations so quickly that the whole process will take no more than 30 minutes, instead of an infinite number of years, as indicated in the author's post.
Maybe. I am somewhat confident that quantum computers could do that. However, I don't think hackers could afford to have one just for brute forcing wallet addresses Grin. Now I know why they use other, more efficient means of hacking and keep this as a last resort lol.
Talk connecting quantum computers and Bitcoin has started to pop up again; I've also heard before that quantum computers will make Bitcoin, or cryptocurrency itself, disappear. I still don't believe it, it's kind of a myth, lol.
For sure, if people let this happen, I don't think cryptocurrency will be the only thing in danger.

Quantum computers are not a myth. Lots of companies already exist in the field of quantum simulation, both software and hardware development.

- https://www.rigetti.com/
- https://www.zapatacomputing.com/
- https://strangeworks.com/
- https://www.riverlane.com
- https://qcware.com/
- https://otilumionics.com/quantum-computing/
- http://horizonquantum.com/
- https://quantumsimulations.de/
- https://entropicalabs.com/
- https://1qbit.com/
- https://www.dwavesys.com/

...

Post-quantum cryptography: Bitcoin can move with the times when necessary; both the signing algorithm and the hashing algorithm can be upgraded to be quantum-proof, quantum-safe, quantum-resistant and quantum-enabled.

Post-quantum cryptography
- https://en.wikipedia.org/wiki/Post-quantum_cryptography

Bitcoin Q&A: Migrating to post-quantum cryptography
- https://youtu.be/dkXKpMku5QY

Bitcoin Q&A: Is Quantum Computing a Threat?
- https://youtu.be/wlzJyp3Qm7s

Christian Schaffner: Quantum Cryptography
- https://youtu.be/Lh8OGDNJZQk?t=1238

...

Quantum supremacy
- https://en.wikipedia.org/wiki/Quantum_supremacy

Bitcoin Q&A: "Quantum Supremacy"
- https://youtu.be/eo7mwcsUbdo

...

Quantum simulation, problem solving and mathematical discovery utilizing blockchain / timechain technology, now there's a thought.

Cool
full member
Activity: 224
Merit: 120
Yes, I don't believe that quantum computing is dangerous right now. However, progress does not stand still. Bitcoin is now a powerful force behind many networks, since it allows you to transfer sums anywhere while keeping control of your money, ensuring your complete safety. But this does not mean that in 20-30 years it will be as safe. Things we could not even imagine could happen. Even now, ever more powerful and advanced models are being created, which (maybe) will be further refined and evolved in the future. So there is a risk that in the future bitcoin will not be so secure. Huh Huh Roll Eyes
-----------------
Bitcoin will always be safe because it is based on good cryptography. In other words, cryptography, as a science and as a practice, is already 100 years ahead of technical progress. But those achievements have not yet been used; there has been no need. As soon as there is a need, these new cryptographic systems will be introduced into bitcoin immediately. The real danger for bitcoin is completely different: it is not anonymous at all. If desired, all bitcoin owners can be identified. And once you are identified, mounting a targeted attack against you is not a problem. A trained attacker will no doubt steal your keys; he'll take your bitcoins without breaking the cryptography. You don't need to worry about the security of the cryptography (no one is fool enough to attack it), but about your anonymity on the network. That is the big problem, and I don't know how to solve it. VPN or Tor don't solve it. Only an anonymous operating system...
full member
Activity: 224
Merit: 120
Yes, I agree, the video is understandable; the idea of the curvature of our space-time is as old as the world. I don't understand why, passing near a massive object, a ray of light curves exactly as in the picture - deviating away from the mass. By the way, that completely contradicts the behavior of the objects in the video you provided.
About the mass.
Curvature is good, but it's only a way to talk about it.
I cannot but confirm that these questions are better known to the creator of our world, if he himself has not forgotten what he did.
But the photon beam itself has mass, completely independent of the so-called curvature of space-time. That is why heated gas in a closed measuring system has its own weight, which increases with temperature. As the temperature increases, the flow of infrared photons inside the closed system increases. A closed system is one from which the photons do not escape, but are reflected and remain inside (a Thermos). A temperature rise is an increase in the flow of photons, an increase in their number. That is why the mass of such a system increases when heated.
I think a photon has a mass of motion...
The fact that time and space are one continuum is only a hypothesis.  That hypothesis has a lot of evidence. But the opposite hypothesis that time and space may sometimes be not in phase, not in such a single and indivisible continuum as we think, also has no evidence to disprove it.
An electromagnetic wave is also a continuum of electric and magnetic fields. But, in the absence of oscillations, an electric field can successfully exist without a magnetic one.  And there is no continuum! It's broken.
This indirectly confirms that the continuum also has its own time-space (a more correct definition than space-time, in my humble opinion) oscillations which we do not notice while we are inside this medium. Probably, this continuum can be as broken as any other. Nature is infinite and does not like the limitations of its manifestation.
An example in support of this view.
The theory of black hole existence. A place where gravity is enormous. Time increases (let us define that when time slows down in relation to our reference system, it means that time as a parameter of the length of events increases (!!!), and not vice versa, it is very important not to confuse and not to give in to the opinion of one's "common sense", science has often proved that it is "common sense" that is false), and space in these conditions decreases.
Let's check it out.
The rate at which any object falls under these conditions seems, relative to us, to be decreasing. The object is slowing down. That's because there is more time there than we have. Let's see: velocity V is distance / time. That's right; in this way of evaluating time and space, the speed V, by its own formula, tends to zero relative to our reference system. The object will never fall to the surface of the black hole; to us it will seem to have stopped. Yes, I know a black hole has no surface; it's just a way of speaking.
On the other hand, if we fell to the center of the black hole and were alive, we would see our universe moving faster and faster and all the stars flying apart at increasing speed, our solar system dying, new ones forming... Here's the continuum, clap and it's gone.

It turns out that the places where there is no gravity are places where time flows as slowly as possible and space is enormous. That's why there's a constant of maximum speed in this environment - the speed of light in a vacuum, but now you have to add it immediately:
1) in the place with the least gravity;
2) and immediately add - for our reference system, which is also in the place with the smallest gravity.
Here is what is not in the formula for the speed of light - no relativity itself.
If we were in a place with strong gravity (in a black hole), the speed of the same light - would be for us completely different, larger, huge, any.

Old Einstein was right to say that speed is relative. He was right to say that no other object in our world can move faster than the speed of light. But he did not add the condition: that this holds for the gravitational field in the place where the light moves, and from where we observe and measure it.
After all, speed is the ratio of two components of the continuum time/space: and space (distance) / time (length of event) - both there, in the formula for speed. And both of these parameters are not constant in nature.

And there are suspicions that gravity not only curves space-time, to be more precise, in my terms, violates the conditions of their inviolable continuum, but is also a clear characteristic of our world and, therefore, when it changes - a passage to other worlds.
legendary
Activity: 1904
Merit: 1277
Please specify one thing. In the picture in the past, you can see that the beam from the star, passing near the star (the sun probably) - repulses. Is it?
I've always thought that large gravity objects attract a flow of photons to them, that's how gravity works in our everyday experience. That is why an electromagnetic wave (a flow of photons) cannot break out of the horizon of the black hole events.
I think the essence of space curvature by gravity in the picture is wrongly depicted.
No, the star doesn't repulse the light. Gravity is always attractive, never repulsive. The light travels in a straight line in spacetime, it's just that spacetime is curved around objects that have mass. A common way to visualise this is to place heavy objects on a rubber sheet. This is probably a better picture. If you imagine someone rolling a small ball from one side of the sheet to the other, the ball's path will curve as it passes close to the heavy object in the centre of the sheet. The ball itself travels in a straight line, it's just that the thing it's travelling on is curved.


If in electromagnetic interaction there is a rule of attraction of differently charged particles and a rule of repulsion of equally charged particles, we intuitively want to use the discovered effect - on gravity.
But gravity is a different field, with different properties. Gravity is always attractive, never repulsive.

large gravity objects attract a flow of photons to them, that's how gravity works in our everyday experience. That is why an electromagnetic wave (a flow of photons) cannot break out of the horizon of the black hole events.
If you look at that sheet above, the effect of a black hole is to produce such extreme curvature that eventually, in the centre, it drops vertically. Nothing can escape. Everything, photons included, has to follow the curves of spacetime, the only difference with a photon is that because it is massless, it doesn't create its own small curvature. A photon in the above image might be an effectively weightless ping pong ball - but it still has to follow the contours of the sheet. When it passes through the event horizon of a black hole, it still falls in. And we have to remember also that we are talking about spacetime rather than space and time as separate things. Gravity doesn't just cause curvature of space, it also slows down time, which is why for a distant observer, something that falls towards the event horizon never actually seems to go through it and disappear, because the spacetime curvature is so great.

Actually, better than that screengrab, have a look at this video. The balls that are thrown at around 2:45 all travel in a straight line - across a curved surface. This is how gravity works.

Think also of the moon orbiting the Earth. What is happening is that the moon is actually falling towards the Earth (in a straight line), it's just that its speed is sufficient to keep it moving forever around the lip of the gravity well.
full member
Activity: 224
Merit: 120
Please specify one thing. In the picture in the past, you can see that the beam from the star, passing near the star (the sun probably) - repulses. Is it?
I've always thought that large gravity objects attract a flow of photons to them, that's how gravity works in our everyday experience. That is why an electromagnetic wave (a flow of photons) cannot break out of the horizon of the black hole events.
I think the essence of space curvature by gravity in the picture is wrongly depicted.

And I want to note that the substitution of words:
1. two objects having mass - are attracted to each other or 2. the space around the massive object is curved and therefore the straight beam of light is also curved = identical, and do not explain the essence of the phenomenon of gravity. It's just a way of saying things differently, no more.

If in electromagnetic interaction there is a rule of attraction of differently charged particles and a rule of repulsion of equally charged particles, we intuitively want to use the discovered effect - on gravity. It's not only that objects absolutely identical to the atom can be attracted, but also different physical essence of the physical value "object mass" and gravitational attraction between objects - is present!  Mass is not identical to gravity, but these two phenomena always go hand in hand. Plus there is no possibility to make a gravitational insulator, and in electromagnetism it is possible.

And what's more interesting is that gravity reigns in the macro world.
In the microcosm, at the level of the atom, electromagnetism rules; gravity counts for nothing there. The whole substance surrounding us is electromagnetic in nature, plus the virtual (I call them so here; this is my opinion) forces of the weak and strong interactions, which are also just a way to discuss what is observed, not tools to understand it or notions that explain anything. A scientist sees that the atomic nucleus is held together by something, so there must be a "force". Call it whatever you like.

We transfer Newton's macrocosm laws to the microcosm. Force, acceleration, and speed itself are not very convenient concepts for the microcosm, where all objects are blurred in space, are in essence not defined by coordinates, and where there is no possibility of simultaneously measuring both a coordinate and a physical parameter. One or the other. How can we call the observed effect of holding the components of the atomic nucleus together "the strong or weak interaction", and add the term "force" to that? This is not a case where there is an object to which one can apply a force and get an acceleration. It's the microcosm. Everything there is dual (has two or more meanings; nothing is unambiguous) and uncertain.

Most importantly, space (distance) itself is discrete, as are energy (Planck's constant), mass (a multiple of God's Higgs boson), spin, electric charge, and, I suspect, time. Whatever you take, it has no smooth nature. There are stairs everywhere!!!

That's why I didn't study the physics of the microcosm, that I didn't agree with the approach that science takes from the beginning. It seems to me that the microcosm is much thinner and more intellectual than the laws of our macrocosm and its view of nature. That's right, philosophy...
legendary
Activity: 1904
Merit: 1277
I'm thrilled. You have such a deep understanding of quantum mechanics, and you can explain these complicated things so clearly
Thanks Smiley I'm trying my best, but whilst I do have some background in theoretical physics, I am not an expert. Please don't assume I am getting everything correct!

the Higgs boson, a particle whose presence in matter determines its mass of rest.  So it's a gateway to the world of particles. A world in which one can exist - without having to move at the speed of light?
The Higgs boson is the manifestation of an excitation of the Higgs field. The Higgs field, as with other fields, permeates spacetime, is everywhere in spacetime, and may be thought of as a property of spacetime. So all particles in spacetime interact with this Higgs field to a greater or lesser degree. The extent of the interaction determines a particle's 'mass'. We are talking about rest mass at the moment, because we are talking about elementary particle physics and ignoring relativity. Technically, as far as I understand it, the Higgs boson manifests at the moment that electroweak symmetry is broken, where the W and Z bosons are created.
I don't think any of this has any impact on the speed of light; we are not talking about the Higgs boson travelling around and imparting mass, it is just a manifestation of an underlying field that is everywhere all the time.

If a photon has no resting mass, it means it has no Higgs boson, so I understand. It turns out that he (the photon) is doomed, has to move only at the speed of light and no less than (!), precisely because he would have grounds to participate in the gravitational interaction.
Yes, photons do not interact with the Higgs field.

So gravity is not a property of our world, but our world itself. The property of gravity is our macro and micro world, not the other way around.
I would say that gravity is an underlying field, which is a property of spacetime. Relativity expresses this as spacetime curvature. When we talk about light from a distant star 'bending around' a nearby star due to gravity, this could be misleading. It is better understood as light travelling in a straight line across a curved space; it is the star's gravity well that creates the curvature.


https://astronomy.com/-/media/Images/Magazine%20Articles/2019/October/sunandearth.png?mw=600

A good analogy is the flight path below. The shortest distance between Madrid and New York is the upper line, not the lower one. The upper line represents travelling in a straight line around the curved surface of the Earth.

https://gisgeography.com/wp-content/uploads/2016/11/RhumbLine-GreatCircle-2-678x421.png

everything in our world that cannot participate in the gravitational interaction with its objects - for our world does not exist.
As a result of this reasoning, there is a question.
And are there real mass-free particles, which exist without the obligation to move at only the speed of light in a vacuum?
If such particles are known, I am completely wrong.
The known massless particles are photons and gluons. Photons travel at the speed of light. Gluons, I suppose, technically, travel at the speed of light. However we come back to what a particle 'is'. Gluons are virtual particles bound within nucleons, and when expressed in quantum chromodynamics we talk rather of the gluon field. Beyond this, there is the possibility that gravitons exist as mediators of the gravitational field. Again, it becomes complex, because we don't yet have a proper theory of quantum gravity.
full member
Activity: 224
Merit: 120
I'm thrilled. You have such a deep understanding of quantum mechanics, and you can explain these complicated things so clearly, that I can't understand what you are doing on this forum. From my observations, the most popular topics here are the ones promising quick practical results for readers.

Let's get back to our topic.
Physicists have long wanted to bury the old particle theory of the structure of the universe. The modern trend in science: a particle is a particular state of the wave nature of matter. Simply put, it's a standing electromagnetic wave.
The particle theory - the so-called Standard Model of the universe - has to date, at the layer of the most elementary particles, found 6 quarks, 6 leptons, the gluon, the photon, the Z boson and the W boson.
And that could have been the end of this model.

But recently (at the large collider in Switzerland, it seems) the main find of our time was made - the "God particle": the Higgs boson, a particle whose presence in matter determines its rest mass. So it's a gateway to the world of particles. A world in which one can exist - without having to move at the speed of light?

If a photon has no resting mass, it means it has no Higgs boson, so I understand. It turns out that he (the photon) is doomed, has to move only at the speed of light and no less than (!), precisely because he would have grounds to participate in the gravitational interaction.
This participation is the main law of our world, isn't it?
So gravity is not a property of our world, but our world itself. The property of gravity is our macro and micro world, not the other way around.
The idea is that its (photon) gravitational interaction with the surrounding world is not a consequence of its movement at the speed of light, as it is proved in science, but exactly the opposite. Due to the fact that he is deprived of the Higgs boson, he (poor) is forced to fly at only such a speed. Otherwise he will have no mass of movement (impulse). Hence the conclusion - everything in our world that cannot participate in the gravitational interaction with its objects - for our world does not exist.
As a result of this reasoning, there is a question.
And are there real mass-free particles, which exist without the obligation to move at only the speed of light in a vacuum?
If such particles are known, I am completely wrong.
legendary
Activity: 1904
Merit: 1277
About the 1000-qubit quantum computer - you're right, maybe that's not what it says. But then it's completely unclear why the US Department of Defense (DARPA) not only signed such a contract, but also made this information public.
A quantum annealer - if it is indeed an annealer, I am only speculating - is still very useful. It is perfect for solving certain types of problems, and doing so exponentially faster than a classical computer. This 1000 qubit annealer could be an important advancement... it's just that it's no threat to asymmetric cryptography.


I wonder what is the speed of interaction of two connected photons from the viewpoint of our macro-world, our three-dimensional in space and one-dimensional in time world? If it were the speed of light, then this interaction would be tied to the distance between the photons. But I read that there's no difference in the distance between the photons in our world.
I specifically wrote "in our world", meaning that perhaps there is another world in which these same linked photons look completely different.
I think the speed of interaction would always be light speed. The speed of a photon from a normal human perspective is c. As for the speed of a photon from the viewpoint of another photon, well, this is problematic. I don't think we can say that a photon sees another photon moving at c, or at a proportion of c, or even, for two photons travelling in the same direction, that they each see the other as having zero relative velocity. The reason we can't say this is because the perspective of a thing travelling at c is the limiting case. No time passes for a photon. A photon regarding another photon sees simply a thing, not a thing that moves in time.


A photon is [...] The term "mass-free" does not accurately reflect the nature of this particle. Due to the principle of equivalence of inert and gravitational masses, all mass-free particles participate in the gravitational interaction.
I agree with your explanation of a photon, which I have truncated here. Yes, when we say it is 'massless', we are referring to it having zero rest mass. The problem is complicated because we have no proper relativistic theory of quantum mechanics. For QM, a photon is massless. For relativity, a photon has zero rest mass, but does have relativistic mass.
E = mc^2 expands out to: E^2 = p^2·c^2 + m_rest^2·c^4, where p = m_rel·v

But we can't talk about relativity from the perspective of quantum mechanics, because we don't yet have a proper marriage of the two theories.
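For what it's worth, the full energy-momentum relation above is easy to sanity-check numerically. A small sketch (plain Python; the momentum and mass values are arbitrary examples):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def energy(p, m_rest):
    """Relativistic energy-momentum relation: E^2 = (p*c)^2 + (m_rest*c^2)^2."""
    return math.sqrt((p * C) ** 2 + (m_rest * C ** 2) ** 2)

# For a photon (zero rest mass), the relation collapses to E = p*c.
p = 1e-27  # an arbitrary momentum, kg*m/s
assert math.isclose(energy(p, 0.0), p * C)

# For a particle at rest (p = 0), it collapses to E = m*c^2.
m = 9.109e-31  # electron rest mass, kg
assert math.isclose(energy(0.0, m), m * C ** 2)
```

The two limiting cases show why a photon can carry energy and momentum while having zero rest mass.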


And what photon (what spectrum of electromagnetic waves oscillations) is used in quantum computers - nowhere I have found.
Boson sampling is one approach:

https://www.scientificamerican.com/article/quantum-computer-made-from-photons-achieves-a-new-record/
Quote
Boson sampling can be thought of as a quantum version of a classical device called the bean machine. In that device, balls are dropped onto rows of pegs, which they bounce off of, landing in slots at the bottom. The random motion of the balls typically leads to a normal distribution in the slots: most balls fall near the center, and fewer fall toward the sides, tapering off at the edges. Classical computers can easily simulate random motion to predict this result.

Boson sampling replaces the balls with photons and the pegs with optical devices such as mirrors and prisms. Photons are fired through the array and land in a “slot” at the end, where detectors register their presence. Because of photons’ quantum properties, a device with just 50 or 60 photons could produce so many different distributions that classical computers would take billions and billions of years to forecast them.

But boson sampling can predict the results by carrying out the task itself. In this way, the technique is both the computational problem and the quantum computer that can solve it.
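The classical bean machine described in the quote is simple enough to simulate. A quick sketch (Python; the ball and row counts are arbitrary) showing the normal-like distribution the balls settle into, which a classical computer can predict easily, unlike its photonic counterpart:

```python
import random
from collections import Counter

def bean_machine(balls, rows, seed=42):
    """Drop balls through `rows` of pegs; each peg deflects left or right."""
    rng = random.Random(seed)
    slots = Counter()
    for _ in range(balls):
        # The final slot is just the number of rightward bounces.
        slots[sum(rng.choice((0, 1)) for _ in range(rows))] += 1
    return slots

counts = bean_machine(balls=10_000, rows=10)
# Most balls land near the middle slot, tapering off toward the edges.
for slot in range(11):
    print(f"slot {slot:2d}: {'#' * (counts[slot] // 100)}")
```

The photonic version replaces the coin flips with quantum interference, which is what makes the output distribution classically intractable.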
full member
Activity: 224
Merit: 120
About the 1000-qubit quantum computer - you're right, maybe that's not what it says. But then it's completely unclear why the US Department of Defense (DARPA) not only signed such a contract, but also made this information public.

But the photon, how it is arranged, why it is arranged in such a way and why finding the optimal way to control its nature for solving our computational problems - it captures me and I can continue the discussion.

I wonder what is the speed of interaction of two connected photons from the viewpoint of our macro-world, our three-dimensional in space and one-dimensional in time world? If it were the speed of light, then this interaction would be tied to the distance between the photons. But I read that there's no difference in the distance between the photons in our world.
I specifically wrote "in our world", meaning that perhaps there is another world in which these same linked photons look completely different.

A photon is not only a particle of light, a standing electromagnetic wave in the visible spectrum, but also thermal radiation, another part of the electromagnetic spectrum. Similar or not, they are photons everywhere.
And which photon (which part of the electromagnetic spectrum) is used in quantum computers, I have found nowhere.
This massless particle has no rest mass, but has mass when moving. From the relativistic formulas for energy and momentum, the speed v of a particle is determined by its momentum p, its energy E and the speed of light through the ratio v = p·c^2 / E, where E^2 = p^2·c^2 + m^2·c^4 is the energy of the particle. It follows that such particles cannot be in a state of zero energy.
It also follows that the spin values of massless particles can only be integer or half-integer.

Therefore all "massless" particles must move at exactly the speed of light, and that is why they all appear to have mass: because of that motion.
That is why light, and any electromagnetic radiation, participates in gravitational interaction and is therefore attracted by massive objects of the macro-world. That is how old Einstein became famous: he received confirmation of his theory when, during a solar eclipse, the deflection of a ray of light emitted by a star and passing near the sun was recorded.
History writes that he woke up famous that day. And then spent his whole life proving, almost without success, that he was right.
That's what I meant when I wrote "one against all".
But in reality there were 3 people (like him then supported by Lorenz and Poincaré).
For example, the thermal radiation inside a one-litre container weighs approximately as much as one carbon atom.
The mass of radiation grows rapidly with temperature, but only at one billion degrees does it compare in density with our ordinary matter.
The term "massless" does not accurately reflect the nature of this particle. Due to the principle of equivalence of inertial and gravitational mass, all "massless" particles participate in gravitational interaction.

For our topic it is interesting that any object in a free state in a superposition of two of its states can have the properties of a qubit. It is interesting why a photon was chosen for a quantum computer. I understand why photon spin was chosen as the measured parameter: in "massless" particles this parameter can take only integer values. But why the photon? It is very difficult to work with: it must be cooled and shielded, the decoherence time is short, and so on.
legendary
Activity: 1904
Merit: 1277
 - according to Bo Ewald, CEO of ColdQuanta, within the next 40 months, under the terms of this contract, a machine consisting of 1,000 (one thousand!!!!) qubits will be built, and it will be able to perform the necessary calculations... to create medicines and... (not interesting, and not true) to crack ciphers.

All this suggests that fans and users of modern key cryptography have no more than 40 months
I can't find much detail on the proposed ColdQuanta machine, but a couple of things lead me to think this will pose zero threat to cryptography. First is the name Bo Ewald: he is famous as the president of D-Wave, which builds quantum annealers rather than general-purpose QCs. Second is the quote below:

And even the postulate that nothing can exceed the speed of light in a vacuum is only a temporary mistake. This ban was established by one man, Einstein: one genius against all the others, who stubbornly followed only the official scientific line.
You can understand them. It is convenient and prestigious: scientific titles, respect, certainty. But one man stood against them all and won that battle. Now everyone, as they always have, clings to this official line of science. It is the same with quantum technology, which is not as real as we think it is.
There will be another madman who wins the next battle, one against all, and gives mankind a speed greater than the speed of light, much greater...
Relativity is experimentally verifiable. We can't really say it's the word of one man, when the effects are proven and reproducible. I'm not saying we'll definitely never find a way to travel faster than light, but the evidence at the moment is that this is the absolute limit. The effects on time and space are well known and, as I say, verifiable. It would require infinite energy for a thing with mass to hit light speed. Additionally, if object A and object B are moving away from a stationary observer in opposite directions, both travelling at c, then the relative speed of one to the other is not c+c=2c, it's c... the answer to the apparent discrepancy lying with time dilation.
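The velocity-addition point can be checked directly with the relativistic composition formula (a minimal sketch; the 30 m/s case is an arbitrary everyday-speed illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def add_velocities(u, v):
    """Relativistic velocity addition: (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1 + u * v / C**2)

# c "plus" c is still c: (2c) / (1 + 1) = c.
print(add_velocities(C, C))
# At everyday speeds the relativistic correction is immeasurably small:
print(add_velocities(30.0, 30.0))   # ~60 m/s
```

The denominator is what prevents any composition of sub-light speeds from ever reaching c, which is the resolution of the apparent c+c=2c discrepancy.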

By the way, what is the speed of interaction between entangled photons? After all, this phenomenon is used to build closed communication channels protected by the methods of quantum cryptography. Isn't that speed greater than the speed of light? I have heard that the speed of this interaction between entangled photons does not depend at all on the distance between them. Isn't that proof of a speed greater than the speed of light?
Photons are massless and travel at the speed of light. This means that from the perspective of the photon, time does not pass. The distance from the Sun to the Earth is around 8 light minutes. This means that when you see the Sun, you are seeing it as it was 8 minutes ago, because the photons from the Sun travel at the speed of light. If the Sun were suddenly to go out, we would not notice this for 8 minutes. If instead we consider photons moving between two points on Earth, the distance is so small as to make it effectively instantaneous.
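The 8-minute figure above is simple arithmetic (a quick sketch; the distance used is the mean Sun-Earth distance, one astronomical unit):

```python
AU = 1.495978707e11       # mean Sun-Earth distance, m (one astronomical unit)
C = 299_792_458.0         # speed of light, m/s

minutes = AU / C / 60     # light travel time from Sun to Earth
print(round(minutes, 1))  # about 8.3 minutes
```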
full member
Activity: 224
Merit: 120
Development in quantum technology is moving forward much faster than we think. A powerful quantum computer will be built soon. There is even a contract with a serious organisation, as it turns out.
Information about real achievements in this field is not fully disclosed to us; it is partly hidden from the wider readership. That is what the following news item in the media makes me think.
The U.S. Defense Advanced Research Projects Agency (DARPA) has signed a contract with ColdQuanta to build a new quantum computer.
We had been told that building a 1,000-qubit quantum computer would only become possible in the coming decades. But judging by this report, the deadline for creating such a computer has already arrived. Here is what is reported about the details of this new quantum project:
 - according to Bo Ewald, CEO of ColdQuanta, within the next 40 months, under the terms of this contract, a machine consisting of 1,000 (one thousand!!!!) qubits will be built, and it will be able to perform the necessary calculations... to create medicines and... (not interesting, and not true) to crack ciphers.

All this suggests that fans and users of modern key cryptography have no more than 40 months (just over three years) left to change all their software, from operating systems to bitcoin. And I am not even talking about asymmetric encryption itself. This applies to any end-to-end encryption model we all use, in almost every communications product, because all of these technologies rely on asymmetric methods in the phase of agreeing the shared key for symmetric encryption with a changing session key.
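That key-agreement phase can be illustrated with a toy Diffie-Hellman exchange (a sketch with deliberately small, insecure parameters; real systems use 2048+ bit groups or elliptic curves). Its security rests on the discrete-logarithm problem, which is exactly what Shor's algorithm on a large quantum computer would break:

```python
import secrets

P = 2**127 - 1   # a Mersenne prime, far too small for real-world use
G = 3            # group element used as generator (demo assumption)

a = secrets.randbelow(P - 2) + 2     # Alice's secret exponent
b = secrets.randbelow(P - 2) + 2     # Bob's secret exponent

A = pow(G, a, P)                     # Alice sends A over the open channel
B = pow(G, b, P)                     # Bob sends B over the open channel

key_alice = pow(B, a, P)             # both sides derive the same value,
key_bob = pow(A, b, P)               # used as the symmetric session key
print(key_alice == key_bob)          # True
```

An eavesdropper sees G, P, A and B; recovering the shared key classically requires solving a discrete logarithm, which is infeasible at real-world sizes but efficient for Shor's algorithm.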

In 40 months, the era of quantum cryptography for the powerful of this world, and of keyless encryption for ordinary people, will begin.
While much is said about quantum cryptography, keyless encryption methods are treated as fiction, unworthy of public attention.

However, I don't think so. For those who want to take a journey into the possible future of keyless encryption methods, I recommend this forum thread: https://bitcointalksearch.org/topic/keyless-encryption-and-passwordless-authentication-5204368 (many of my posts there were removed by the administration, so the sequence of thought is broken),
or this project: https://toxic.chat/.

The fact is that once a 1,000-qubit quantum computer is created, the growth of the computing power of new quantum computers will not stop. The next one may have 10,000 qubits, and so on.

Everything is moving much faster than we think.
And even the postulate that nothing can exceed the speed of light in a vacuum is only a temporary mistake. This ban was established by one man, Einstein: one genius against all the others, who stubbornly followed only the official scientific line.
You can understand them. It is convenient and prestigious: scientific titles, respect, certainty. But one man stood against them all and won that battle. Now everyone, as they always have, clings to this official line of science. It is the same with quantum technology, which is not as real as we think it is.
There will be another madman who wins the next battle, one against all, and gives mankind a speed greater than the speed of light, much greater...
By the way, what is the speed of interaction between entangled photons? After all, this phenomenon is used to build closed communication channels protected by the methods of quantum cryptography. Isn't that speed greater than the speed of light? I have heard that the speed of this interaction between entangled photons does not depend at all on the distance between them. Isn't that proof of a speed greater than the speed of light?

Later I will tell you more about the quantum paradoxes of our world, which we are successfully learning to understand.