Topic: If we find DLP solution for EC, what is the alternative to replace ECC?

copper member
Activity: 1330
Merit: 899
🖤😏
How can you divide a number by an unknown number?  Tongue
You can't. There could be closed-source curve parameters, but it would be useless, because G can reveal the start and range of the group order.

If you hide that piece of information, how can you be sure that a private key is within the group?
I was thinking about a provable algorithm that could be placed on a computer to randomly generate a curve and then produce a closed-source library; we could then burn that computer so that nobody can access any data from it.

But as I said, it won't work, and I'm not sure doing that is even possible. I know this sounds stupid, hence the need to ask those who'd know better.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
I was thinking about a provable algorithm that could be placed on a computer to randomly generate a curve and then produce a closed-source library
Being closed-source and being unknown aren't synonyms. More like antonyms. First of all, algorithms are deterministic. You cannot produce randomness out of an algorithm, because randomness is a lack of predictability, and an algorithm is completely predictable. That's why we use sources from the outside world in RNGs. Secondly, there is no way to prove a number was generated randomly.

But, forget about these. A closed-source library, on a low level, is transparent. Binaries are transparent if you're competent at reverse engineering. There's a good saying that if you know Assembly, then every piece of software is open-source.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
Is there any possible way to hide the group order of a curve? Because the only way I can think of solving DLP in polynomial time is by knowing the group order.
If you hide that piece of information, how can you be sure that a private key is within the group?

Edit: I doubt even hiding N could help, because you could still work with points mod n.
How can you divide a number by an unknown number?  Tongue



Elliptic curve cryptography, which is probably the most widely used asymmetric cryptographic system, is based on the knowledge of constants like α, β (y^2 = x^3 + αx + β), N, G, etc.
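
For reference, here is a minimal Sage sketch (mine, not from the post, written in the same style as the Sage snippet further down this page) that pins down those constants for secp256k1; the values are the published SEC 2 parameters.
Code:
# The published secp256k1 constants (SEC 2), shown for reference.
p = 0xfffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc2f   # field prime
K = GF(p)
a = K(0x0)                                                               # alpha
b = K(0x7)                                                               # beta
E = EllipticCurve(K, (a, b))                                             # y^2 = x^3 + 7
G = E(0x79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798,
      0x483ada7726a3c4655da4fbfc0e1108a8fd17b448a68554199c47d08ffb10d4b8) # base point G
n = 0xfffffffffffffffffffffffffffffffebaaedce6af48a03bbfd25e8cd0364141   # group order N
E.set_order(n)
print(n * G)   # point at infinity, printed as (0 : 1 : 0)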
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
The world is full of mysteries, and we will probably never know, because even if we understand quantum things, we may just reach the next stage of knowledge and find another problem that we will classify as "random".
That's the spirit. So, it'd be arrogant to say that P = NP merely because you've observed that all of your problems whose solutions are easy to verify are also mathematically "easy" to solve.

They could exist. But as long as our machines are quite deterministic, we don't have good enough tools to play with that.
I agree that P = NP may be true in our deterministic machines, when it comes to problems that are fundamentally deterministic like hash functions. Practically, though, being constant isn't the same as being easy, and P versus NP is about easiness IIRC.
copper member
Activity: 821
Merit: 1992
Quote
So, by "breaking" a curve you mean brute forcing it, expecting to finish in a reasonable time?
Yes. You can always take some O(2^n) problem and increase your constants, and then it will turn into an O(n) problem. Or maybe even an O(1) problem, if you just use even bigger constants. It is just abusing the fact that O-notation ignores constants, so if you can use "for all" instead of "for N", you can always simplify it into one and go on, until you reach a practically useless, but mathematically correct, O(1).

Quote
I suppose that includes finding an algorithm that can do it in a better than linear time complexity for large curves?
Better than linear? It is constant! Funny thing: you can always measure the time of execution and put "sleep(maxTime-time)" at the end of your algorithm; then you will always reach a "constant execution time". But of course, it will be useless, even if correct.
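
A minimal Python sketch of that padding trick (the time budget and the helper name are mine, for illustration only): measure the elapsed time and sleep away the rest of a fixed budget, so every call takes the same wall-clock time regardless of the input.
Code:
# Toy sketch of the "sleep(maxTime - time)" trick: pad every call to a fixed
# wall-clock duration, so the measured running time is the same for all inputs.
import time

MAX_SECONDS = 2.0          # chosen upper bound; must exceed the worst case

def constant_time(work, *args):
    start = time.monotonic()
    result = work(*args)                         # the real (variable-time) computation
    elapsed = time.monotonic() - start
    time.sleep(max(0.0, MAX_SECONDS - elapsed))  # burn the remaining time budget
    return result

# Example: the search may finish early or late, but the caller always waits MAX_SECONDS.
print(constant_time(lambda xs: 7 in xs, list(range(10))))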

Quote
But, what about non-deterministic things, like quantum phenomena?
If we don't know something, we can perceive it as "random". But if we knew all of it, then maybe we could look at it in a fully deterministic way? Who knows; you can always model some kind of game where you, as the programmer, know all the rules, and they are deterministic, but all the NPCs inside can still perceive some changes as "quantum" or "random". The world is full of mysteries, and we will probably never know, because even if we understand quantum things, we may just reach the next stage of knowledge and find another problem that we will classify as "random".

Quote
Couldn't there be problems which cannot be solved in a constant time, as their complete randomness changes them fundamentally?
They could exist. But as long as our machines are quite deterministic, we don't have good enough tools to play with that. For example, some people thought that bitaddress could generate random enough keys based on user input, when moving the mouse or typing some characters. But it turned out that if you do a "spider" on the keyboard, or if you move your mouse "randomly", it can still be predicted to some extent. For example, "34hr807wyr9awehgt978w43qr87wyer7" looks random, but is definitely non-random. You can easily notice which characters are close to each other, and based on that input, you can determine that it was done on a QWERTY keyboard, and create a "spider generator" that will try to guess such patterns without bruteforcing. The same is true for mouse movements.
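
As a rough illustration of how such a "spider generator" could flag keyboard walks, here is a small Python sketch; the adjacency map and the scoring rule are my own simplification, not something from the post.
Code:
# Toy QWERTY-adjacency score: a high value suggests the string was produced by
# "walking" over neighbouring keys rather than by a real random number generator.
ROWS = ["1234567890", "qwertyuiop", "asdfghjkl", "zxcvbnm"]

def adjacent(a, b):
    for r1, row1 in enumerate(ROWS):
        for r2, row2 in enumerate(ROWS):
            if a in row1 and b in row2 and abs(r1 - r2) <= 1:
                if abs(row1.index(a) - row2.index(b)) <= 1:
                    return True
    return False

def walk_score(s):
    s = s.lower()
    pairs = list(zip(s, s[1:]))
    return sum(adjacent(a, b) for a, b in pairs) / max(1, len(pairs))

print(walk_score("34hr807wyr9awehgt978w43qr87wyer7"))   # keyboard-walk candidate from the post
print(walk_score("c3a91f0b7de2458a"))                   # made-up hex string, for comparison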

Quote
But, isn't philosophy pretty much suggesting that you can't know everything?
It is true to some extent. Yes, you cannot know absolutely everything and answer all questions about the world. But if you narrow the scope, you can reach the point of full knowledge. For example, you can say: "I want to find any SHA-1 collision". And then you can complete that goal, and think "so, now people will move to SHA-256 and other hash functions". And it is true, some people moved there. But because of backward compatibility, we still have "hardened SHA-1". And you can say, again: "I want to find any hardened SHA-1 collision". It is a never-ending story of asking questions and finding answers. Life is full of mysteries, and many people have a lot of fun when they play such games. And in many such problems, there are some rewards for those efforts; for example, there was a reward for generating a SHA-1 collision. This story never ends, because you can still say: "well, that solution is good, but now I want to receive exactly a two-block SHA-1 collision, not that five-block collision from this famous PDF". And you can use OP_SIZE to enforce that.

So, even if you don't know everything, the knowledge you have can still provide you a lot of fun. And you can probably still reach full knowledge, if you define your problem correctly and narrow the question to some achievable state.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
1. Is it brute-force? Yes, of course.
So, by "breaking" a curve you mean brute forcing it, expecting to finish in a reasonable time? I suppose that includes finding an algorithm that can do it in a better than linear time complexity for large curves?

It is not "complete randomness involved". Hash functions are deterministic.
My response goes to your philosophical conclusion that every computation can be performed in a constant time no matter what. Hash functions are deterministic, and thus fall into your generality. But, what about non-deterministic things, like quantum phenomena? Couldn't there be problems which cannot be solved in a constant time, as their complete randomness changes them fundamentally?

I understand your assertion as "We can solve anything in constant time if we are able to try out everything in the worst case". But, isn't philosophy pretty much suggesting that you can't know everything?
copper member
Activity: 906
Merit: 2258
Quote
How can you be certain that a problem can be solved in O(1) if for example there is complete randomness involved?
It is not "complete randomness involved". Hash functions are deterministic. We exactly know, which operations are used inside. Which means, you exactly know, that for example "a[1]=rol5(a[0])+f[0]+e[0]+k[0]+w[0]". You exactly know that "a[1]=0x9fb498b3+w[0]" is always the case, no matter which message you will pass into SHA-1. It is always true, it is always the case, it is not random. It is just deterministic, and uses avalanche effect, to look randomly. But it is not, because you can reproduce the same outputs for the same inputs.

Note that for small enough hashes, you can empirically prove all of that (for example, if you replace the 32-bit internal values with 1-bit internal values and reach a 5-bit SHA-1-like function). Then you have, for example, 2^5 possible input hashes and 2^16 possible message blocks, and by collecting 2^21 inputs, you can deterministically explore all of them, and count how many, and exactly which, messages lead to any given 5-bit hash. And then you can just take your algorithm, increase your numbers, and mathematically argue that it applies to the real SHA-1, but you don't have enough resources to compute all of that, just because you work with bigger numbers.
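
As a toy version of that "shrink it until you can enumerate everything" idea, here is a Python sketch; the 5-bit compression function below is made up for illustration, it is not the reduced SHA-1 described above.
Code:
# Exhaustive exploration of a made-up 5-bit "hash" over all 2^5 states and
# 2^16 message blocks (2^21 inputs in total), counting preimages per output.
from collections import Counter

def toy_hash(state5, msg16):
    # arbitrary mixing; the only point is that the domain is small enough
    # to enumerate completely, not that this resembles SHA-1
    x = (state5 * 0x1f) ^ msg16
    x ^= x >> 7
    return (x * 31 + state5) & 0x1f

counts = Counter(toy_hash(s, m) for s in range(2**5) for m in range(2**16))
print(len(counts), min(counts.values()), max(counts.values()))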

Quote
Also, doesn't this theory rely on determinism?
Yes, but our computers are deterministic machines. Or at least they should be, for all we know. Maybe quantum ones will produce some noise instead, but our currently used machines, based on the binary system, are definitely as deterministic as they can be, and when they are not, it is usually reported as a bug, and people then try to fix it and make it deterministic again in all of those weird cases.

Also note that you both need a deterministic machine and want one. You always want to compute block hashes in the same way. You always want to compute public keys and signatures and reach the same result on all of your machines. Bitcoin and the blockchain rely on that determinism; when you don't have it, you can reach a chain split.
copper member
Activity: 821
Merit: 1992
Quote
If both curves share the same properties except size, can you break the 32-bit curve without breaking the 256-bit one? If yes, can you please explain to me how?
Why not? You can find my list of smaller curves here: https://bitcointalksearch.org/topic/smaller-elliptic-curves-y2-x37-based-on-secp256k1-5459153

We pick some 32-bit curve: p=0xfffff9af, n=0xfffe390b, base=(0x1,0x3cad5d2d)

We start from a private key equal to one and iterate through all values, up to 0xfffe390b. We always iterate through all values, no matter what. Let's say your goal is to find the lambda value, and you have this public key: (0x25934cd2,0x3cad5d2d). So, how to do that? This Sage code should do it:
Code:
solution = 0x0
p = 0xfffff9af                  # field prime of the 32-bit curve
K = GF(p)
a = K(0x0)
b = K(0x7)
E = EllipticCurve(K, (a, b))    # y^2 = x^3 + 7 over GF(p)
G = E(0x1, 0x3cad5d2d)          # base point
E.set_order(0xfffe390b * 0x1)   # group order n, cofactor 1
# Iterate over every private key from 1 to n and never stop early,
# so the running time is the same for every target point.
while(solution<0xfffe390b):
    solution+=1
    P = solution*G
    if((P[0]==0x25934cd2) and (P[1]==0x3cad5d2d)):
        print("solution="+hex(solution))

1. Is it brute-force? Yes, of course.
2. Is it O(1)? Yes, it always goes through all points, even if the solution is just the second point after the base point.
3. Could you put bigger numbers here and reach an algorithm for secp256k1? Yes, go on and change the p-value, the n-value, the base point, and the point you are trying to find. It will work fine.
4. Could you use that in practice for bigger curves? Of course not; even for 32-bit ones, it is much more complex than it should be, because you should get it after 2^16 operations, not 2^32. But hey, it has O(1) time complexity. It works! And it is useless for secp256k1. All conditions are met.

Edit: Will the Sage server stop your script before it finds a proper solution? Yes, of course. But you can try the same thing on smaller curves from my topic, and for some of them, it will compute all of that before reaching the timeout. Or you can run it locally, remove all time-based restrictions, and wait longer to get the answer.
copper member
Activity: 906
Merit: 2258
Quote
So all these examples you quote of hash functions and curves are only amenable to asymptotic complexity analysis if you assume that they are defined on arbitrarily large sizes.
They are never defined "on arbitrarily large sizes" in any standard. For example, both SHA-1 and SHA-256 contain a field called "message size" that takes 64 bits, which means you can only have messages from 0 to 2^64-1 bits in size. You cannot pass a bigger message into any of those hash functions; you have to patch it, for example by taking your size modulo 2^64, if you want that.
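
For reference, a minimal Python sketch of the SHA-256 padding step (following FIPS 180-4, not code from the post), where that 64-bit message-size field is appended:
Code:
# SHA-256 message padding: the last 8 bytes are the 64-bit big-endian bit length,
# which is why messages are capped at 2^64 - 1 bits.
import struct

def sha256_pad(msg: bytes) -> bytes:
    bit_len = len(msg) * 8
    zeros = (55 - len(msg)) % 64                   # pad so the length is 56 mod 64 bytes
    return msg + b"\x80" + b"\x00" * zeros + struct.pack(">Q", bit_len)

padded = sha256_pad(b"abc")
print(len(padded) % 64, padded[-8:].hex())         # 0 0000000000000018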

Quote
When people say that finding a hash with d leading 0s takes 2^d time, they specifically *ignore* the fact that d is bounded by 256, and instead pretend that the hash function somehow generalizes to arbitrarily large inputs.
Or, you can say that they just include the fact that computations are not instant. Which means "2^32" is just a constant, but we assume it is "2^d", and 2^d grows exponentially in d.

So, probably we are both right; it is just semantics, saying the same things using different words. But still, I argue we could have some algorithm running with lower complexity but with a bigger constant, and mathematically it could be correct, but practically, everyone will turn it off, just by not being patient enough to wait for those 2^256 computations to complete.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
It depends. Because I can imagine a successful attack, that works on one curve, but does not work on another. The simplest case is about the size of the curve: if you can break some 32-bit curve, it doesn't mean you can break some 256-bit one.
If both curves share the same properties except size, can you break the 32-bit curve without breaking the 256-bit one? If yes, can you please explain to me how?

I believe that P = NP, but of course I have no proof. However, if you think philosophically about it, each and every computation that can happen in practice can always be performed in constant time.
But P versus NP isn't just about algorithm complexity. It's about whether a problem whose solution is easy to verify is also easy to solve. It's objectively difficult to find the solution to something like a SHA-256 collision by brute forcing. Also, doesn't this theory rely on determinism? How can you be certain that a problem can be solved in O(1) if for example there is complete randomness involved?
copper member
Activity: 821
Merit: 1992
I agree with vjudeu: you can find a confirmation by reading about the Hutter Prize:

Quote
Losslessly compress the 1GB file enwik9 to less than 114MB. More precisely:

* Create a Linux or Windows compressor comp.exe of size S1 that compresses enwik9 to archive.exe of size S2 such that S:=S1+S2 < L := 114'156'155 (previous record).
* If run, archive.exe produces (without input from other sources) a 10^9 byte file that is identical to enwik9.
* If we can verify your claim, you are eligible for a prize of 500'000€×(1-S/L). Minimum claim is 5'000€ (1% improvement).
* Restrictions: Must run in ≲50 hours using a single CPU core and <10GB RAM and <100GB HDD on our test machine.

Why are those restrictions needed?

1. The size-based restriction, and the requirement for a decompressor, are needed to avoid the "extreme decompressor" solution:

Quote
consider an extreme "decompressor" of size 1GB that simply outputs enwik9 byte by byte (from a zero byte archive), thus achieving a compressed size of 0

2. The time-based restriction is needed, because you can "teleport" many size-based things into time-based things. For example, you can create an "extreme contest solution" by writing a program that tries every possible Turing-complete program until reaching better results than the competitors. It is mathematically guaranteed that your program will eventually beat the competition, but nobody knows exactly when, or what the winning program will look like. So, a time restriction is needed to avoid that, and to say "mathematically, you can be right, but we waited 50 hours, and your program lost the battle, even if it could have won had it been left running 10 years longer".

3. The system-based restriction is needed because even if you want to invent your own CPU architecture, with your own machine instruction set, you still have to translate it, for example into x86, to make it comparable with the other competitors' entries.

4. The other restrictions are also well-justified, but the ones above are probably the most important, and there are many obvious solutions that could be submitted if those restrictions were missing.
legendary
Activity: 990
Merit: 1108
Quote
Of course the latter runs in some fixed time, since you just fixed n.
But "n" is always fixed.
No! Not if you want asymptotic time complexity to have any meaning at all.

Asymptotic analysis aims to study how running time changes as a function of (ever-growing) input size.

So all these examples you quote of hash functions and curves are only amenable to asymptotic complexity analysis if you assume that they are defined on arbitrarily large sizes.

When people say that finding a hash with d leading 0s takes 2^d time, they specifically *ignore* the fact that d is bounded by 256, and instead pretend that the hash function somehow generalizes to arbitrarily large inputs.

You can argue that any running time on Earth is limited to the time until the heat death of the Universe and therefore is O(1), but that just makes the notion of time complexity completely useless.
copper member
Activity: 906
Merit: 2258
Quote
since you cannot even examine all of the input in constant time
Why not? For example, the input to SHA-256 is just a single 512-bit block and some 256-bit initialization vector. Which means you have only 2^768 possible input states for SHA-256, and that covers exactly all cases, because every chunk can be described by such a model, no matter whether it is at the beginning, in the middle, or at the end of the Merkle–Damgård construction.

In ECDSA, it is even simpler: for secp256k1, you have exactly N valid inputs, for which you get exactly N valid public keys. This is guaranteed to be exhaustive, because the n-value is not "picked" but "calculated": it is mathematically determined by the p-value and the curve equation, so for such an input you cannot use any different n-value (it is always deterministic). Which means you can just always iterate from 1 to N (and not stop your computation if you find a solution faster), and then your algorithm is guaranteed to run in O(1) time. For smaller curves, with a sufficiently small n-value, you can even empirically prove that it reaches O(1) complexity.

Quote
Of course the latter runs in some fixed time, since you just fixed n.
But "n" is always fixed. If you have 1 TB SSD, and 8 GB RAM, you can mathematically prove, that for any algorithm running on that device, your "n" is never going to be "infinite", but you have finite limitations on top of everything you want to execute. And if you want to argue, that "I can just process some data, and forget about them", then you can add for example 4 GHz CPU speed, and count all of your processors, to exactly calculate, what is the upper bound, when it comes to speed (so, how many steps of the algorithm per second you can compute). It was demonstrated for example in hashing, where you have O(1) brute forcing of the block hash, but you have for example "petahashes per second" to actually compare the same bruteforce-based algorithms on different machines (even if they have the same computational complexity, they have different performance).
legendary
Activity: 990
Merit: 1108
each and every computation that can happen in practice can always be performed in constant time. Which means every single algorithm can be written with O(1) complexity
You're confusing two things. An algorithm in general works for arbitrary input size n. A computation would be running an algorithm on a specific input. Of course the latter runs in some fixed time, since you just fixed n.

But when talking about the time complexity of an algorithm, you need to consider the running time as a function of input size n. Almost no algorithms run in time O(1), since you cannot even examine all of the input in constant time.
copper member
Activity: 906
Merit: 2258
Quote
There are none if it turns out that P = NP.
I believe that P = NP, but of course I have no proof. However, if you think philosophically about it, each and every computation that can happen in practice can always be performed in constant time. Which means every single algorithm can be written with O(1) complexity, but then your constant will be huge. For example, if you want to break a 256-bit hash function in O(1), then you can just take every single hash and check it by using brute force. Then, for all inputs, you have a constant execution time, even if it is useless.
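
A toy Python sketch of that "check every single input, never stop early" brute force, shrunk to a 16-bit search space so it actually finishes; the truncation to 16 bits is mine, purely for illustration.
Code:
# Toy "O(1)" preimage search: scan the entire 16-bit space for every query and
# never exit early, so the running time is the same constant for every target.
import hashlib

def hash16(x: int) -> int:
    return int.from_bytes(hashlib.sha256(x.to_bytes(2, "big")).digest()[:2], "big")

def find_preimages(target: int):
    hits = []
    for candidate in range(2**16):       # always all 65536 candidates
        if hash16(candidate) == target:
            hits.append(candidate)       # record, but keep scanning anyway
    return hits

print(find_preimages(hash16(12345)))     # contains 12345 (plus any colliding inputs)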

Also, in general, if you think about any resource (disk space, memory size, network speed), everything has some limits. We can roughly estimate how many particles exist in our universe, and then we can calculate how many possible states there are and make that a huge constant. Which means everything can always be done in O(1), but of course that will always be some kind of "useless bruteforce".

So, my answer to that is: even if you assume that not only P = NP, but also that everything can always be solved in O(1), then still, those constant values in your algorithm will make it unreachable in practice, so we can still be safe. Also, guess what: O(n^n^n^n) looks scary, but if n=2, then it is only 2^2^2^2=2^2^4=2^16=65536. Terrible complexity with low constants can be good enough for many problems.
legendary
Activity: 990
Merit: 1108
Are there any alternatives to replace ECC?
There are none if it turns out that P = NP.
In that case nearly all cryptography is broken (only information theoretically secure constructs like one-time pads will survive).

That also means the end of SSL, TLS, HTTPS and all forms of secure online communication.
That will impact society in far bigger ways than just the downfall of all cryptocurrencies.

So let's hope and pray that P != NP ...

(Some people may argue that P=NP has no practical effect if solving NP-hard problems like SAT requires Ω(n^100) time, but most complexity theory experts believe that if SAT is in P, then it will be solvable in a much more reasonable time like O(n^6).)
legendary
Activity: 1568
Merit: 6660
bitcoincleanup.com / bitmixlist.org
Are there any alternatives to replace ECC?

|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Name        | PrivK |  PubK |   Sig | Security |  PubK |   Sig | block |  average |
|             | bytes | bytes | bytes |     bits | *secp | *secp | *secp | block MB |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| secp256k1   |    32 |    32 |    64 |      128 |   1.0 |   1.0 |   1.0 |      1.3 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| RSA 2048    |   512 |   256 |   256 |      112 |   8.0 |   4.0 |   6.0 |      7.8 |
| RSA 3072    |   768 |   384 |   384 |      128 |  12.0 |   6.0 |   9.0 |     11.7 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Dilithium2  |  2528 |  1312 |  2420 |      128 |  41.0 |  37.8 |  39.4 |     51.2 |
| Dilithium3  |  4000 |  1952 |  3293 |      192 |  61.0 |  51.5 |  56.2 |     73.1 |
| Dilithium5  |  4864 |  2592 |  4595 |      256 |  81.0 |  71.8 |  76.4 |     99.3 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Falcon-512  |  1281 |   897 |   666 |      128 |  28.0 |  10.4 |  19.2 |     25.0 |
| Falcon-1024 |  2305 |  1793 |  1280 |      256 |  56.0 |  20.0 |  38.0 |     49.4 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Sphincs+    |    64 |    32 |  7856 |      128 |   1.0 | 122.8 |  61.9 |     80.5 |
| Sphincs+    |    96 |    48 | 16224 |      192 |   1.5 | 253.5 | 127.5 |    165.8 |
| Sphincs+    |   128 |    64 | 29792 |      256 |   2.0 | 465.5 | 233.8 |    303.9 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|


If you look at the average block MB, I can at least guess why Satoshi chose secp256k1 keypairs for including their signatures in transactions: the rest of the cryptosystems produce large block sizes that resemble Bitcoin Cash and SV, most likely because of their ballooning signature sizes.

These two:

Quote

| Falcon-512  |  1281 |   897 |   666 |      128 |  28.0 |  10.4 |  19.2 |     25.0 |
| Falcon-1024 |  2305 |  1793 |  1280 |      256 |  56.0 |  20.0 |  38.0 |     49.4 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Sphincs+    |    64 |    32 |  7856 |      128 |   1.0 | 122.8 |  61.9 |     80.5 |
| Sphincs+    |    96 |    48 | 16224 |      192 |   1.5 | 253.5 | 127.5 |    165.8 |
| Sphincs+    |   128 |    64 | 29792 |      256 |   2.0 | 465.5 | 233.8 |    303.9 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|


are the post-quantum algorithms, right? Or is Dilithium also one of them?

Is there any post-quantum digital signature algorithm that does not make humongous signatures?
full member
Activity: 206
Merit: 447
What happens if such a solution is found?
Nothing. Anybody finding it would keep silent. The incentive not to spill the secret is just too big.

Are there any alternatives to replace ECC?

|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Name        | PrivK |  PubK |   Sig | Security |  PubK |   Sig | block |  average |
|             | bytes | bytes | bytes |     bits | *secp | *secp | *secp | block MB |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| secp256k1   |    32 |    32 |    64 |      128 |   1.0 |   1.0 |   1.0 |      1.3 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| RSA 2048    |   512 |   256 |   256 |      112 |   8.0 |   4.0 |   6.0 |      7.8 |
| RSA 3072    |   768 |   384 |   384 |      128 |  12.0 |   6.0 |   9.0 |     11.7 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Dilithium2  |  2528 |  1312 |  2420 |      128 |  41.0 |  37.8 |  39.4 |     51.2 |
| Dilithium3  |  4000 |  1952 |  3293 |      192 |  61.0 |  51.5 |  56.2 |     73.1 |
| Dilithium5  |  4864 |  2592 |  4595 |      256 |  81.0 |  71.8 |  76.4 |     99.3 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Falcon-512  |  1281 |   897 |   666 |      128 |  28.0 |  10.4 |  19.2 |     25.0 |
| Falcon-1024 |  2305 |  1793 |  1280 |      256 |  56.0 |  20.0 |  38.0 |     49.4 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|
| Sphincs+    |    64 |    32 |  7856 |      128 |   1.0 | 122.8 |  61.9 |     80.5 |
| Sphincs+    |    96 |    48 | 16224 |      192 |   1.5 | 253.5 | 127.5 |    165.8 |
| Sphincs+    |   128 |    64 | 29792 |      256 |   2.0 | 465.5 | 233.8 |    303.9 |
|-------------+-------+-------+-------+----------+-------+-------+-------+----------|


copper member
Activity: 821
Merit: 1992
Quote
I don't know how to find a working alternative if elliptic curve cryptography is broken by solving DLP.
https://en.wikipedia.org/wiki/Post-quantum_cryptography

Quote
I'd imagine if we find the solution for one curve, all the curves could be solved as well.
It depends. Because I can imagine a successful attack, that works on one curve, but does not work on another. The simplest case is about the size of the curve: if you can break some 32-bit curve, it doesn't mean you can break some 256-bit one.

Quote
What happens if such a solution is found?
Then, the next steps will depend on the attack. I assume people will try to construct some "hardened" version of secp256k1, exactly as was done with SHA-1. Of course, over time, people will move to another solution, for example NTRU, but the first steps will probably focus on patching the network in a backward-compatible way and blocking just that particular attack, to buy some time to implement the new solution properly.

Quote
Are there any alternatives to replace ECC?
Of course there are alternatives. As long as there are some unsolved mathematical problems, you can just take your mathematical question, share it with others, and, if it remains unsolved for a long time, try to build a new cryptosystem around that specific problem.

Quote
This is a serious discussion, please don't try saying "devs will find a way".
You don't have to trust that there is a way. You can read some history of cryptography and see how it was done in the past. In ancient times, using encryption similar to ROT13, but with letters shifted by three positions (the Caesar cipher), was sufficient. A century ago or so, doing simple replacements like "A=T", "H=E" was sufficient (see: Enigma). When computers became more popular, more advanced methods were used, and currently used cryptography is so complex that you can no longer conveniently compute it with pencil and paper. So, what will happen when another crypto problem is solved? Well, we will pick a new one, because mathematicians have more than one question, and ECDLP is not the only unsolved math problem.
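
For illustration, the ancient scheme mentioned above (letters shifted by three positions, the Caesar cipher) fits in a few lines of Python; the alphabet handling here is my own simplification.
Code:
# The Caesar cipher: ROT13-style shifting, but by three positions.
def caesar(text, shift=3):
    out = []
    for ch in text.upper():
        if "A" <= ch <= "Z":
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)               # leave non-letters untouched
    return "".join(out)

print(caesar("ATTACK AT DAWN"))                     # DWWDFN DW GDZQ
print(caesar(caesar("ATTACK AT DAWN"), shift=23))   # shifting 3 + 23 = 26 decrypts it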
copper member
Activity: 1330
Merit: 899
🖤😏
This has been my concern for a long time; I don't know how to find a working alternative if elliptic curve cryptography is broken by solving DLP.

I'd imagine if we find the solution for one curve, all the curves could be solved as well.

What happens if such a solution is found?
Are there any alternatives to replace ECC?

This is a serious discussion, please don't try saying "devs will find a way".  I want to know if finding a way is even possible or not.