
Topic: Relaunched Completely - page 2. (Read 11506 times)

newbie
Activity: 56
Merit: 0
February 21, 2016, 09:30:23 PM
Well, given QC is possible, you could use Shor's Algorithm (https://en.wikipedia.org/wiki/Shor%27s_algorithm) to solve the discrete logarithm problem in polynomial time.

DL is not believed to be NP-complete, to my knowledge.  The best known classical algorithms for it, like those for integer factorisation, run in sub-exponential (but super-polynomial) time.  However NP-complete problems are believed to require exponential time.

It's been a long time since I studied this, and I can't recall whether NP-complete problems were proven to require exponential time (assuming P ≠ NP), rather than being merely assumed to do so.  If proved, then it follows that DL is provably not NP-complete (unless P = NP).  However if it was merely a belief, then beliefs can change.

Or perhaps I'm remembering it wrong, and all that was ever said was that the only known algorithms for known NP-complete problems were exponential.  It really is a long time ago that I took that class.
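
To make the problem concrete for anyone who hasn't met it: the discrete logarithm problem is, given g, h and a prime p, find x with g^x ≡ h (mod p).  Here is a toy classical solver (purely my own illustration, nothing to do with Elastic Coin).  Baby-step/giant-step takes on the order of √p group operations, which is still exponential in the bit length of p; the sub-exponential classical attacks are index-calculus methods, and Shor's algorithm would solve the same problem in polynomial time on a quantum computer.

Code:
# Toy baby-step/giant-step discrete log solver, for illustration only.
from math import isqrt

def discrete_log_bsgs(g, h, p):
    """Return x with pow(g, x, p) == h (p prime, g not divisible by p), or None."""
    m = isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j for j < m
    step = pow(g, -m, p)                         # giant-step multiplier g^(-m)
    gamma = h % p
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * step) % p
    return None

# Example: recover x = 10 from g = 2, h = 2^10 mod 100003
assert discrete_log_bsgs(2, pow(2, 10, 100003), 100003) == 10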
newbie
Activity: 56
Merit: 0
February 21, 2016, 09:01:20 PM
Is there any consideration for protection against quantum computation? If not, I suggest there should be; it may not be too long from now.

Lionel's original white paper had another vulnerability which I haven't discussed in this thread because it was only theoretical: the computational resources needed to exploit it would have been enormous.  A quantum computer might possibly have busted it wide open.  The vulnerability is easier to fix than it is to explain.

The rule of thumb I use is that QC effectively halves the sizes of the crypto primitives you're using, so SHA-256 will be as strong as a 128-bit hash is now, which should be good for a long time, if not forever.  RIPEMD-160, used to turn ECDSA public keys into bitcoin addresses, looks very vulnerable.  I don't honestly know whether the ECDSA keys are big enough to resist QC.  I do know that only last year the NSA recommended moving away from elliptic curve cryptosystems, which include ECDSA, or at least not migrating to them, in anticipation of QC.  Make of that what you will; this is the NSA, after all.
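
To make that rule of thumb concrete, here is a back-of-the-envelope sketch (my own heuristic arithmetic, not a formal post-quantum security analysis): Grover-style search roughly halves effective hash strength, while Shor's algorithm breaks the elliptic-curve discrete log outright, so no "halving" applies to ECDSA at all.

Code:
# Back-of-the-envelope sketch of the halving rule of thumb above.
# The numbers are heuristics, not a formal post-quantum security analysis.
primitives = {
    "SHA-256 (block hashing)": 256,
    "RIPEMD-160 (address hashing)": 160,
}

for name, bits in primitives.items():
    print(f"{name}: ~{bits // 2}-bit effective strength against Grover search")

# ECDSA over secp256k1 falls to Shor's algorithm (polynomial time),
# so the halving heuristic does not apply to it at all.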

IMO it is a good design decision to make the keys compatible with bitcoin.  There will be plenty of time for us, and for bitcoin to move to something better later.  Moreover, there is not yet a consensus among crypto experts as to what "something better" might be.

Edited to add:  I notice EK has posted a contrary opinion.  I cannot vouch for his expertise in this field.  I can only vouch for my own lack thereof.
full member
Activity: 124
Merit: 100
February 21, 2016, 08:11:59 PM
Is there any consideration for protection against quantum computation? If not, I suggest there should be; it may not be too long from now.
newbie
Activity: 56
Merit: 0
February 21, 2016, 08:01:26 PM
#99
If we really do accept programs with undecidable termination time, how do we make sure that the algorithm still produces some feedback which is (a) deterministic and (b) can be used to measure the amount of work that has already been put into the calculation of such a function? At least on average, there should be some feedback every 10 minutes.

Every 10 milliseconds would be better.  Bitcoin generates a new block every ten minutes on average, but individual miners produce millions of hashes per second.

Quote
Another problem with this scheme is that the verification of such PoW proofs will take forever. I mean, if you have programs that take minutes to terminate for a random input, then verification will take that long as well. So if we support arbitrary program execution times, this has to be thought of as well.

It's a tradeoff.  The greater the hashrate, the greater the computational overhead, and the less useful work that gets done.  On the other hand, the longer the interval between hashes, the greater the confirmation overhead, so there is a cost either way.

There will be a sweet spot, which will probably have to be found by experimentation, and which may depend on different miners' hardware.  It will have to be a tunable (and if necessary, retunable) parameter.

Quote
Thinking about this one for an hour already, but all I come up with is to check the CPU registers / memory / etc. at a deterministic interval of instructions.

That won't work because it's not machine independent.  You're on the right track though.
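
One machine-independent variant of that idea (just my own sketch of the direction, not anything from the white paper) would be to run the submitted program inside a deterministic interpreter and hash the interpreter's own state every N interpreted instructions, so that every node gets identical checkpoints for the same program and input regardless of host CPU.

Code:
# Hedged sketch only: a toy deterministic stack machine that emits a state
# digest every `checkpoint_every` interpreted instructions.  The instruction
# set and all names here are invented for illustration.
import hashlib

def run_with_checkpoints(program, inputs, checkpoint_every=10_000, max_steps=1_000_000):
    """Interpret `program`; yield (steps, digest) every checkpoint_every steps.
    Each instruction is an (opcode, argument) pair; argument is None if unused."""
    stack, pc, steps = list(inputs), 0, 0
    while pc < len(program) and steps < max_steps:
        op, arg = program[pc]
        pc += 1
        if op == "push":
            stack.append(arg)
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "jnz":                # jump if top of stack is non-zero
            if stack.pop() != 0:
                pc = arg
        steps += 1
        if steps % checkpoint_every == 0:
            # The digest depends only on interpreted state, never on the host CPU.
            yield steps, hashlib.sha256(repr((pc, stack)).encode()).hexdigest()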
newbie
Activity: 56
Merit: 0
February 21, 2016, 06:15:31 PM
#98
Finally my construction works even if you don't cripple the language.  Arbitrary loops are allowed, as are programs which never halt.  You just abort them if they run too long; the PoW still works.

Thanks for your valuable comments so far.
Well, at some point I'm afraid we must cripple the language so that we can reliably determine the runtime beforehand.
This is required imho to set fees that are "in some way related" to the computational power required to evaluate such a proof-of-work function.

You don't need to know in advance how long it will run.

The buyer commits a number of ELC, and specifies how much he will pay the winning miner for each block solved while running his program, and what bounties he will pay for what results.  This is written into the blockchain.  Once committed, the network won't let him spend those coins in any other way, until the offer terminates or is withdrawn (Edit: by which I mean the buyer withdraws the offer.  There should be a short time before the withdrawal becomes effective, in order to allow the miners to notice the withdrawal and stop work).  It is then up to each miner to decide for himself if he accepts the deal or not.  There's no need for him to notify the buyer of his acceptance (Edit: or anyone else, until he actually solves a block). He just starts running the program.  The network confirms each block in the usual way and pays the winning miner out of the committed funds.  Similarly the network confirms the bounties, and pays the solving miners.  When the funds are exhausted, or when the offer terminates or is withdrawn, then the miners will abandon work on that program and run another.  Any ELC left over are returned to the buyer.
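
To make the mechanics above concrete, here is a purely illustrative sketch of what such a committed offer might record.  None of these field names come from the white paper; they just restate the description above.

Code:
# Hypothetical offer record, for illustration only.
from dataclasses import dataclass, field

@dataclass
class WorkOffer:
    buyer: str                        # address that committed the funds
    program_hash: str                 # identifies the user-supplied program
    committed_elc: int                # coins locked until the offer ends
    reward_per_block: int             # paid to the winning miner of each block
    bounties: dict = field(default_factory=dict)  # result pattern -> payout
    withdrawal_delay_blocks: int = 6  # grace period so miners can stop work

    def pay_block_reward(self) -> int:
        """Network-side accounting: release one block reward from the committed funds."""
        amount = min(self.reward_per_block, self.committed_elc)
        self.committed_elc -= amount
        return amount                 # whatever is left eventually returns to the buyer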

Quote
This again is required to discourage functions that run "too long" (see thoughts below your second quote).
Otherwise an attacker might create a proof-of-work function that only terminates when the result he wants is returned, and otherwise runs indefinitely until aborted.

It runs indefinitely until the funds are exhausted, at which point the miners say "thank you and goodbye".  There is no attack here.

Quote
This again would mean that only blocks which are relevant to him (i.e., contain relevant information such as the to-be-cracked hash) would be solved. Depending on the complexity of the problem, he might this way be allocated much more computation power than wanted. Blocks must be solved on a regular basis, so termination is required here.

It's not required with my scheme.  A user-supplied program could run for hours or days, while generating proof-of-work blocks every ten minutes on average (or whatever other period is specified).

Quote
I personally think that either there should be a hard limit on the execution time (which is not my favorite) or the fees must increase asymptotically faster than the execution time. Going past the 10-minute execution mark should be expensive enough that users are discouraged from submitting such PoW functions.
Here, we should calibrate the fees and other parameters so that typical applications (what counts as "typical" is yet to be defined) are cheap, and everything that deviates from this standard gets more and more expensive.
(In this scheme, btw., the crippled language is required again.)

With these limitations, you are foreclosing on one particular application which is ideal for the kind of distributed network which we are talking about.  Integer factorisation using the Elliptic Curve Method (ECM) uses the mathematics behind elliptic curve cryptography, but to a different purpose.  ECM is currently the fastest known algorithm for finding factors larger than about 20 decimal digits and smaller than about one-third of the number of digits of the number to be factored.  (For factors larger than one-third of the length, sieving algorithms are better.  Don't even think of trying to use ECM to factor an RSA modulus.)

With ECM, the payload would be the target integer, a depth parameter B1, and possibly a few other switches and parameters.  The algorithm takes a random parameter s which defines the specific elliptic curve.  This would be the random input supplied by the miners.

For each s, the algorithm has a small chance of finding a factor.  The usual process, given a target whose factor lengths are unknown, is to start with a modest B1 and run a few curves.  If no factor is found, increase B1 and run more curves, and so on.  There are published tables of recommended B1 levels and the corresponding recommended number of curves to run at each level.
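
To show what "one curve per random s" looks like, here is a rough Python sketch of a single ECM stage-1 attempt.  This is my own toy illustration, nothing like the optimised GMP-ECM code; the parametrisation and all names are simplified for clarity.

Code:
# Toy sketch of one ECM stage-1 curve attempt.  Given n, a bound B1 and a
# miner-supplied random parameter s, it either stumbles on a factor of n or
# reports that this curve found nothing.  Real ECM uses Montgomery curves,
# a stage 2, and far better arithmetic; this is illustration only.
import math
import random

def ecm_one_curve(n, B1, s):
    rng = random.Random(s)            # s selects the curve and start point
    x0 = rng.randrange(1, n)
    y0 = rng.randrange(1, n)
    a = rng.randrange(1, n)
    # b = (y0^2 - x0^3 - a*x0) mod n is implied by putting (x0, y0) on the
    # curve y^2 = x^3 + a*x + b; the addition formulas never need b explicitly.
    P = (x0, y0)

    def ec_add(P, Q):
        """Add points mod n; a non-invertible denominator exposes a factor."""
        if P is None:
            return Q, None
        if Q is None:
            return P, None
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % n == 0:
            return None, None         # P + (-P) = point at infinity
        if (x1, y1) == (x2, y2):
            num, den = (3 * x1 * x1 + a) % n, (2 * y1) % n
        else:
            num, den = (y2 - y1) % n, (x2 - x1) % n
        g = math.gcd(den, n)
        if g == n:
            return None, None         # denominator vanished mod n
        if g > 1:
            return None, g            # the "lucky failure": g divides n
        lam = num * pow(den, -1, n) % n
        x3 = (lam * lam - x1 - x2) % n
        return (x3, (lam * (x1 - x3) - y1) % n), None

    def ec_mul(k, P):
        R = None
        while k:
            if k & 1:
                R, g = ec_add(R, P)
                if g:
                    return None, g
            P, g = ec_add(P, P)
            if g:
                return None, g
            k >>= 1
        return R, None

    # Stage 1: multiply P by every prime power up to B1.
    for p in range(2, B1 + 1):
        if all(p % q for q in range(2, math.isqrt(p) + 1)):   # p is prime
            pe = p
            while pe * p <= B1:
                pe *= p
            P, g = ec_mul(pe, P)
            if g:
                return g              # non-trivial factor of n
            if P is None:
                return None           # hit the point at infinity; curve wasted
    return None

At B1 = 110,000,000 this interpreter-speed Python would be hopeless for a 187-digit target, which is exactly why GMP-ECM exists; the point is only to show where the miner-supplied random input s enters.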

Currently I am trying to factor a 187 digit number.  By my own efforts, I have reached the B1=110,000,000 level, and each curve is taking about eight minutes to run using GMP-ECM.  A python implementation would surely be slower, and of course there will certainly be some computational overhead associated with the Elastic Coin infrastructure.

Just as I reach the point where I think a cluster would be useful, are you seriously telling me that the cluster you're building can't hack it?
full member
Activity: 124
Merit: 100
February 21, 2016, 05:52:24 PM
#97
Funding is not the main concern here. I really believe that 23 bitcoins could eventually finish this project using a p2p automated software creation service; codevalley is coming soon, for instance. With Evil-Knievel redoing the white paper, among other things, it may begin to look a lot more appealing to potential investors. I see the current rate of funding improving to the point of selling out before it ends.
I decided to invest a few more BTC just in case it increases my computational wealth. I don't need to own a whole percentage, though I may change my mind.
legendary
Activity: 1073
Merit: 1000
February 21, 2016, 03:29:38 PM
#96

I took part in several Poloniex ICOs (and a couple were really profitable), a couple of small-scale non-exchange ones (and even 1-2 without escrow), and you probably notice my signature is another ICO coin.

I'm fine with ICOs and have taken part in probably half a dozen or so of them. That's why I have asked questions here and pointed out some odd things ... usually the best run ICOs aren't managed like things have been here. If the fellow running the ICO states he will post his CV, for instance, it's sort of nice if he actually does ... and provides info on who is doing the coding, his other devs, past experience and so on. It's also a valid question to ask what Lionel planned to do if you didn't come aboard, or how he planned things out. And it'd be nice for a change if he answered questions about his own coin.

I completely agree with this. If Lionel makes himself known the way most developers do in a successful IPO, he will get a lot more funding.
hero member
Activity: 1204
Merit: 509
February 21, 2016, 02:58:24 PM
#95


From all the posts you have written so far, I get the impression that IPOs in general might not be the right thing for you. Maybe you should just sit this one out.

I took part in several Poloniex ICOs (and a couple were really profitable), a couple of small-scale non-exchange ones (and even 1-2 without escrow), and you probably notice my signature is another ICO coin.

I'm fine with ICOs and have taken part in probably half a dozen or so of them. That's why I have asked questions here and pointed out some odd things ... usually the best run ICOs aren't managed like things have been here. If the fellow running the ICO states he will post his CV, for instance, it's sort of nice if he actually does ... and provides info on who is doing the coding, his other devs, past experience and so on. It's also a valid question to ask what Lionel planned to do if you didn't come aboard, or how he planned things out. And it'd be nice for a change if he answered questions about his own coin.
hero member
Activity: 1204
Merit: 509
February 21, 2016, 01:48:12 PM
#94


I would consider myself the main dev at the moment.
Of course this can change quickly as more and more interested and motivated devs jump on board.
Just to give you a bit more information about me:


I admit I am a little confused here.

Lionel started the ICO, said he had some devs working on the project... then a short while later someone in the forums (you), who invested, all of a sudden becomes the main dev on the project? What was Lionel planning to do if he didn't find someone to help out here?

You do seem to be qualified (at least tech/experience-wise, not sure about coding-wise), but it just seems like a weird ICO setup. There are definitely things that could have been done early on to maximize investment in the project -- I won't go into the escrow issue again, but how about some background info on the other team members, including Lionel?

member
Activity: 86
Merit: 10
February 21, 2016, 12:58:18 PM
#93
22.39 BTC funded now
37 buyers
legendary
Activity: 1092
Merit: 1001
February 21, 2016, 12:56:15 PM
#92
I have a few questions:
- elastic.pro still shows 0 BTC funded; is that correct?
- Is there any information available about the main dev besides his name?
- Can you estimate how much time is left until BTC block 400000 is mined?

Thank you for the responses.
hero member
Activity: 924
Merit: 1000
February 21, 2016, 06:07:19 AM
#91
It looks like a really cool project, but I don't really know what all the computation power will be used for. Why would a regular person want to use this?

The regular person probably doesn't have much interest in getting something calculated.
But there are many researchers and companies who have a high demand for computational power to solve complex mathematical, cryptographic, or other problems which are very hard to calculate.
Now, if you own the coins and they want their calculations done, then they must buy these coins from you, right? This is what gives them value.

Imagine Elastic Coin has 5000 miners with more-or-less modern computers that work on the proof-of-work functions.
Now, let us say someone currently uses a well-known (I'm leaving out the name on purpose) cloud computing service and purchases 5000 nodes there. Typically the price for one node is on the order of $0.10 per hour.
Renting 5000 nodes would therefore cost him $500/hour. Using Elastic Coin can save him a lot of money here.

Since many people try to optimize their costs, I am pretty confident that dozens of users of current cloud-computing solutions will consider switching to Elastic Coin.
I mean, if they can get their work done for 10 ELC/hour instead of $500/hour, that would be great for them, wouldn't it? I'm not sure how the market will decide on the price of 1 ELC after the IPO,
but I believe that there are some good opportunities here.


Also, it effectively makes it possible for any individual to mine any crypto coin, with any algorithm, more cheaply than they can now.
sr. member
Activity: 462
Merit: 250
February 21, 2016, 05:30:01 AM
#90
It looks like a really cool project, but I don't really know what all the computation power will be used for. Why would a regular person want to use this?
sr. member
Activity: 597
Merit: 253
... and the swarm is headed towards us
February 21, 2016, 05:26:52 AM
#89
poornamelessme, just to announce:
I have been in touch with Lionel and I made the decision that I will start working on this coin.



Thanks for that. It does seem like you know what you are doing there. The way you had been posting (more than Lionel even) led me to believe something like this, or that you had invested most of the current ICO amount (I guess it's a bit of both).

Lionel still plans to post his CV, as well as the other team members', though, right? I understand it can be tricky, as nobody here wants to post too much info about themselves ... but if going the ICO route, some info is needed. Usually when I invest in a coin (and I expect many others do this too), it's not even necessarily about the coin ... it's about the devs.

I agree with poornamelessme. The main reason I'm not investing so far is not that the code isn't fully developed, but that Lionel has not provided his CV.

+1

OrsonJ, I will contact you in person and answer your questions in detail. For the public CVs, we first have to decide how "detailed" they should be. Maybe you have some answers ... you will get a PM soon.

I've received no contact, and I don't know why questions like this shouldn't be discussed publicly. The premise of Elastic Coin seemed interesting, but I had misgivings about the non-escrowed ICO. Based on the suspicious behaviour of another forum member in this thread, I actually have more now. Again, if you want to get the maximum amount of funding for this, you will need to reveal some information about yourself publicly, Lionel.
hero member
Activity: 690
Merit: 505
Cryptorials.io
February 21, 2016, 03:31:13 AM
#88
I believe I have a solution to the faster algorithm attack.

Specifically, I believe I have a modification of your PoW function such that the faster algorithm attack can achieve at best a linear speedup for the attacker.  I believe I can provide a security proof of this fact.  Moreover, in any practical implementation it should be possible to determine and indeed control the multiplier.

Finally my construction works even if you don't cripple the language.  Arbitrary loops are allowed, as are programs which never halt.  You just abort them if they run too long; the PoW still works.

There is a price to pay for this: It's computationally expensive.

I withdraw the above.

It's funny.  I've been thinking about this stuff in general terms for the past couple of days.  I've had this specific construction in mind for the past day or so.  Then within a few minutes of publicly committing myself, I see a flaw in my reasoning.

Damn this stuff is hard.  I will think about it some more.

Lol, I often get a similar thing when I'm writing: I turn it over in my head for ages, and it's not until I write it down that I realize my mistake.

I've even been thinking about this myself, which is stupid since I'm not a programmer, but it's still interesting to think about; I'm looking forward to seeing how they solve it. My only solution was to force a certain percentage of the blocks to be solved from a specified sub-set of problems, like the PoW from another coin, so that any attack cannot win enough blocks to control the consensus.
legendary
Activity: 1775
Merit: 1032
Value will be measured in sats
February 21, 2016, 01:51:55 AM
#87
I just invested 1 BTC

https://blockchain.info/tx-index/df71cd7e97c66a899db7f7c66965856323c4b9fdf801eea2fa9b629a2dd47770

The project has promise ... he who dares wins.

How do you decide which computation request gets priority? In the order they are submitted?


Thank you very much and welcome on board. I am sure you won't regret your decision.
The priority of computation requests is a good point: this will also be covered in the white paper revision. I will write that part next.

Looking forward to reading the white paper; keep on trucking, fellas. Will you launch a second ICO later on?
newbie
Activity: 56
Merit: 0
February 20, 2016, 10:03:36 PM
#86
I believe I have a solution to the faster algorithm attack.

Specifically, I believe I have a modification of your PoW function such that the faster algorithm attack can achieve at best a linear speedup for the attacker.  I believe I can provide a security proof of this fact.  Moreover, in any practical implementation it should be possible to determine and indeed control the multiplier.

Finally my construction works even if you don't cripple the language.  Arbitrary loops are allowed, as are programs which never halt.  You just abort them if they run too long; the PoW still works.

There is a price to pay for this: It's computationally expensive.

I withdraw the above.

It's funny.  I've been thinking about this stuff in general terms for the past couple of days.  I've had this specific construction in mind for the past day or so.  Then within a few minutes of publicly committing myself, I see a flaw in my reasoning.

Damn this stuff is hard.  I will think about it some more.
newbie
Activity: 56
Merit: 0
February 20, 2016, 09:17:52 PM
#85
I believe I have a solution to the faster algorithm attack.

Specifically, I believe I have a modification of your PoW function such that the faster algorithm attack can achieve at best a linear speedup for the attacker.  I believe I can provide a security proof of this fact.  Moreover, in any practical implementation it should be possible to determine and indeed control the multiplier.

Finally my construction works even if you don't cripple the language.  Arbitrary loops are allowed, as are programs which never halt.  You just abort them if they run too long; the PoW still works.

There is a price to pay for this: It's computationally expensive.
legendary
Activity: 1092
Merit: 1001
February 20, 2016, 02:33:25 PM
#84
When is the IPO going to end?
newbie
Activity: 56
Merit: 0
February 20, 2016, 02:28:38 PM
#83
I have a question of my own.  Is there a limit to the execution time for a program run?  For example, suppose the desired time between blocks is ten minutes.  Does the program need to finish within that time, in order to make its output available to be hashed?