
Topic: Least action principle as an alternative proof of work system (Read 2725 times)

newbie
Activity: 32
Merit: 0
My apologies for raising such an old thread from the grave, but I could find no more recent discussion.


The point of offline work is that it lets you store up computing power and unleash it all at once, so even a tiny fraction like 1% of the network could run a double-spend every few weeks, or engage in other naughty behavior like that. None of your 3 points seem to eliminate the possibility of offline work, and so your scheme is wildly insecure.

Okay, so instead of the small list of Lagrangians that we're trying to minimize, let's have a nigh-infinitely large list, so large that even with huge computing power no one will be able to pre-compute a meaningful portion of it.  We will randomly pull Lagrangians from this list and minimize to a degree set by the difficulty; the first person to present a solution below this threshold gets the block.

This does mean that we probably won't ever compute any useful minima (only minima for random, useless systems); however, we will create a market for better software and hardware for computing Lagrangian minima in general.  Still a good thing, I think!


Yes, but it also means that the difficulty is going to be wildly unpredictable, unlike cryptographic hashes which smoothly vary in their difficulty. (Miner X finds solution Y to random lagrangian Z after ten minutes of processing; how much proof-of-work does a new solution to Z represent? Fuck if we know, maybe a better heuristic will find another path in 1 minute - all depends on details like what heuristics X used and how good Y is compared to the unknown global optimum and niggling details of Z like maybe it's an easy instance.)

We could present a number of Lagrangians to solve each round to smooth out the effect of outliers.  The number could also vary with difficulty, perhaps.
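For concreteness, here is a minimal sketch of what I mean (my own illustration: a toy quadratic family of Lagrangians and a stubbed-out solver stand in for the real minimization). The round's Lagrangians are derived deterministically from the previous block hash, so they can't be pre-computed, and the score is summed over several of them so that no single easy or hard instance dominates.

```python
# Sketch: draw this round's Lagrangians pseudo-randomly from a huge parametric
# family, seeded by the previous block hash, and aggregate the score over all
# of them to smooth out outliers.  The quadratic family and the stubbed-out
# solver are illustrative placeholders.
import hashlib
import random

def round_lagrangians(prev_block_hash, count):
    """Deterministically derive `count` Lagrangian parameter sets for this round."""
    rng = random.Random(hashlib.sha256(prev_block_hash).digest())
    # Each tuple parameterizes a toy Lagrangian L(q, qdot) = a*qdot^2 + b*q^2 + c*q.
    return [(rng.uniform(0.5, 2.0), rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
            for _ in range(count)]

def best_action_found(params):
    """Stub for the real work: minimize the action of this Lagrangian and return
    the best (lowest) action found.  A real miner would run an optimizer here."""
    a, b, c = params
    return a + abs(b) + abs(c)              # placeholder value only

def wins_block(prev_block_hash, count, target):
    """First miner whose summed score drops below the difficulty target wins."""
    return sum(best_action_found(p)
               for p in round_lagrangians(prev_block_hash, count)) < target

print(wins_block(b'previous block hash', count=64, target=100.0))
```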



Does this properly address the concerns?  Are there other issues preventing this idea from working?
newbie
Activity: 47
Merit: 0
That's not too bad a way to find a global minimum, because whoever eventually finds it will necessarily have found a better path than the existing record.

Yes, but it also means that the difficulty is going to be wildly unpredictable, unlike cryptographic hashes which smoothly vary in their difficulty. (Miner X finds solution Y to random lagrangian Z after ten minutes of processing; how much proof-of-work does a new solution to Z represent? Fuck if we know, maybe a better heuristic will find another path in 1 minute - all depends on details like what heuristics X used and how good Y is compared to the unknown global optimum and niggling details of Z like maybe it's an easy instance.)

Quote
I'm not convinced by this "offline processing" argument.  It's not too hard to evaluate the pertinence of a Lagrangian, for instance from the performance of the paths proposed so far.  Say you do your offline processing on your secretly chosen Lagrangians.  Once you've found the global minima, you publish the Lagrangians but keep the paths secret.

1.  since you keep the optimal paths secret, some other miner can beat the paths you proposed at any time.
2.  your set of Lagrangians is probably only a tiny subset of all the accepted Lagrangians (unless you have really huge computing power, or your Lagrangians are really easy to solve, in which case honest miners will ignore them).
3.  whenever you publish your paths, assuming your Lagrangians are interesting, you will provide valuable information, and if you are rewarded for your offline work, that's totally fair.

But more generally offline work is probably not a big threat against any cryptocurrency.  It's quite unlikely that a lone miner could outperform the rest of the collaborative network and come out with a better chain made on his own.

The point of offline work is that it lets you store up computing power and unleash it all at once, so even a tiny fraction like 1% of the network could run a double-spend every few weeks, or engage in other naughty behavior like that. None of your 3 points seem to eliminate the possibility of offline work, and so your scheme is wildly insecure.
legendary
Activity: 1288
Merit: 1080
Computing the action of a Lagrangian for a given path is fast and straightforward.  Finding the path that gives a local or global minimum is the hard part.

Basically there would be a public database of known optimals for a whole bunch of Lagrangians.  Whenever someone breaks the currently known record, he gets the point.

So you're not actually trying to compute a global minimum, you're merely trying to compute a better path than the existing record.

That's not too bad a way to find a global minimum, because whoever eventually finds it will necessarily have found a better path than the existing record.

Quote
This still sounds vulnerable to offline processing, and it seems like one could easily cheat and expend very little computation when a new Lagrangian is introduced (and if new Lagrangians can't be introduced, then difficulty will behave very weirdly as optimal or close-to-optimal solutions are eventually found, and it also won't be very useful in non-coin applications).

I'm not convinced by this "offline processing" argument.  It's not too hard to evaluate the pertinence of a Lagrangian, for instance from the performance of the paths proposed so far.  Say you do your offline processing on your secretly chosen Lagrangians.  Once you've found the global minima, you publish the Lagrangians but keep the paths secret.

1.  since you keep the optimal paths secret, some other miner can beat the paths you proposed at any time.
2.  your set of Lagrangians is probably only a tiny subset of all the accepted Lagrangians (unless you have really huge computing power, or your Lagrangians are really easy to solve, in which case honest miners will ignore them).
3.  whenever you publish your paths, assuming your Lagrangians are interesting, you will provide valuable information, and if you are rewarded for your offline work, that's totally fair.

But more generally offline work is probably not a big threat against any cryptocurrency.  It's quite unlikely that a lone miner could outperform the rest of the collaborative network and come out with a better chain made on his own.
newbie
Activity: 47
Merit: 0
I don't see how that is a reply to the offline problem, and actually that's what it sounds like from your original post:

"Basically, all nodes try to find an geometrical structure of the molecule that minimizes a certain function.  They use more or less a brute force method:  they try plenty of structures until they find one that gives an action that is considered low enough.'
legendary
Activity: 1288
Merit: 1080
So you're not actually trying to compute a global minimum, you're merely trying to compute a better path than the existing record. This still sounds vulnerable to offline processing, and it seems like one could easily cheat and expend very little computation when a new Lagrangian is introduced (and if new Lagrangians can't be introduced, then difficulty will behave very weirdly as optimal or close-to-optimal solutions are eventually found, and it also won't be very useful in non-coin applications).

There is not just one Lagrangian in my proposal.
newbie
Activity: 47
Merit: 0
Computing the action of a Lagrangian for a given path is fast and straightforward.  Finding the path that gives a local or global minimum is the hard part.

Basically there would be a public database of known optimals for a whole bunch of Lagrangians.  Whenever someone breaks the currently known record, he gets the point.

So you're not actually trying to compute a global minimum, you're merely trying to compute a better path than the existing record. This still sounds vulnerable to offline processing, and it seems like one could easily cheat and expend very little computation when a new Lagrangian is introduced (and if new Lagrangians can't be introduced, then difficulty will behave very weirdly as optimal or close-to-optimal solutions are eventually found, and it also won't be very useful in non-coin applications).
legendary
Activity: 1288
Merit: 1080
2. how does using a Lagrangian allow a solution to be very efficiently verified as an optimal solution? It sounds like the only way to verify that some solution is really globally optimal is to redo all the computation that went into it, in which case it cannot serve as a proof-of-work.

Computing the action of a Lagrangian for a given path is fast and straightforward.  Finding the path that gives a local or global minimum is the hard part.

Basically there would be a public database of known optimals for a whole bunch of Lagrangians.  Whenever someone breaks the currently known record, he gets the point.
newbie
Activity: 47
Merit: 0
Evaluating arbitrary Lagrangians could make the proof-of-work quite useful, sure, but I don't see how this satisfies at least two necessary properties:

1. what stops someone from evaluating a bunch of Lagrangians offline over the years and then spending them all at once?
2. how does using a Lagrangian allow a solution to be very efficiently verified as an optimal solution? It sounds like the only way to verify that some solution is really globally optimal is to redo all the computation that went into it, in which case it cannot serve as a proof-of-work.
legendary
Activity: 1288
Merit: 1080
Attaching folding to a bitcoin-like system would be huge.

I think so as well.

I'll try to make a more accurate description here.

Everything would work as in bitcoin, except the proof-of-work part.  ECDSA, bitcoin addresses, the transaction mechanism and all: it all works the same way.

A block header would have the same fields, apart from nNonce and nBits.  We'll use two other fields:  hashWorkBench and difficulty.  I'll describe them later.

- version
- hashPrev
- hashMerkleRoot
- nTime
- nNonce, replaced by hashWorkBench
- nBits, replaced by difficulty

The hash of the block is a double SHA-256 of this header, just as in bitcoin.  But it doesn't have to begin with zeros anymore, so it does not contain any "work".  The work and its proof are somewhere else, in some additional data that has to be given along with the block in order for the block to be considered valid.  

I'll try to describe what the work consists of now.  I'll be generic so that it can be applied to anything, not just molecules.

The work consists of finding paths of least action for Lagrangians.  Any Lagrangian will do.  They are submitted by miners, and as the block chain grows, nodes maintain a database of known, already-studied Lagrangians and paths.

A Lagrangian is nothing but a mathematical function of a set of parameters.  It returns a real value.  A path is an ordered sequence of such parameter sets.  The action is the integral of the Lagrangian along a path.
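To make this concrete, here is a toy sketch (a placeholder Lagrangian and a crude rectangle-rule integral, just for illustration) of how the action of a discretized path would be computed:

```python
# Toy illustration of the definitions above (my own example, not part of the
# proposal): a Lagrangian is a real-valued function of the parameters, a path
# is an ordered list of parameter values, and the action is the integral of
# the Lagrangian along the path, approximated here with a rectangle rule.
def lagrangian(q, qdot):
    """Placeholder Lagrangian, e.g. kinetic minus potential energy."""
    return 0.5 * qdot ** 2 - 0.5 * q ** 2

def action(path, dt=0.01):
    """Approximate the action integral along a discretized path."""
    total = 0.0
    for q0, q1 in zip(path, path[1:]):
        qdot = (q1 - q0) / dt          # finite-difference velocity
        total += lagrangian(q0, qdot) * dt
    return total

# Action of a straight-line path from 0.0 to 1.0 sampled at 101 points.
print(action([i / 100 for i in range(101)]))
```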

We'll assume that there is a unique, unambiguous way to serialize the mathematical expression of Lagrangians and paths into bit strings.

A miner can work on any Lagrangian he wants.  Even one that has never been studied before in the block chain.  That's how new Lagrangians are introduced in the system.

So a miner must tell other nodes which Lagrangians he's working on, and on which paths.  That's called his workbench.  Say he works on four Lagrangians A, B, C, D.  (In fact, he'll need many more Lagrangians than that, but this is just a showcase.)  We need an unambiguous way to serialize those four Lagrangians, since B, D, A, C, for instance, would be the same set.

So he first serializes those mathematical expressions (by the serialization hypothesis, he can) into four bit strings a, b, c, d, and then he sorts those bit strings.  He then serializes the list, and that gives him a long bit string.  He does the exact same thing for the paths he found, which gives him another bit string.  Well, actually he cannot use the full paths; he can only use the extremities of the paths.  The full paths must be disclosed somewhere else.  It's a tricky part to explain; I'll explain it later.

He concatenates the Lagrangians bit string with the paths bit string, computes the double SHA-256 of this, and that is the hashWorkBench.  It's an index into the workbench database (EDIT: see note 1), which consists of records of the form (Lagrangian)+(path extremities).
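As a sketch, and under the serialization hypothesis above, the hashWorkBench computation could look like this (the helper names and the example byte strings are mine):

```python
# Sketch of the hashWorkBench computation described above.  The serialized
# Lagrangian expressions and path extremities are placeholder byte strings;
# any unique, unambiguous serialization would do.
import hashlib

def double_sha256(data):
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def hash_workbench(lagrangian_exprs, path_extremities):
    """Sort the serialized Lagrangians (so that order doesn't matter),
    do the same for the path extremities, concatenate, double SHA-256."""
    lagrangian_blob = b''.join(sorted(lagrangian_exprs))
    extremities_blob = b''.join(sorted(path_extremities))
    return double_sha256(lagrangian_blob + extremities_blob)

wb = hash_workbench([b'L_A', b'L_B', b'L_C', b'L_D'],
                    [b'(q_start=0,q_end=1)', b'(q_start=0,q_end=2)'])
print(wb.hex())
```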

The difficulty is evaluated by comparison with previously known results.  If a Lagrangian has never been used before, or if the extremities of the associated path have never been used either, it does not contribute to the difficulty.

Otherwise, we use the relative decrease of the action.

(Action(previousBestPath) - Action(newPath)) / Action(previousBestPath)

where Action(p) is the integral of the Lagrangian along the path p.

That is only for one Lagrangian.  The total for the set of Lagrangians is the sum of these values.
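In code, the difficulty contribution described above might be computed roughly like this (a sketch only; the entry format is my own):

```python
# Sketch of the difficulty defined above: the relative decrease of the action
# for each Lagrangian, summed over the whole workbench.  Entries whose
# Lagrangian or path extremities have never been seen before contribute
# nothing, since there is no previous best to compare against.
def difficulty(entries):
    """entries: iterable of (previous_best_action, new_action) pairs,
    with previous_best_action set to None for never-seen entries."""
    total = 0.0
    for prev_best, new in entries:
        if prev_best is None:          # new Lagrangian or new extremities
            continue
        total += (prev_best - new) / prev_best
    return total

print(difficulty([(10.0, 8.0), (None, 3.0), (5.0, 4.5)]))   # 0.2 + 0 + 0.1 ≈ 0.3
```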

Now, the tricky part.  We have to make it impossible for another node to just take this work and reuse it for another block, i.e. one with another hashMerkleRoot or another block hash.  Somehow, we need this work to be tied to the hash of the block.

To do so, the miner will work on those Lagrangians in a preference order.  He will work a lot on some of them, and much less on others.  Sorting the resulting difficulties gives a permutation of the workbench.  This permutation will have to be tied to the hash of the block.

The hash of the block is a large integer less than 2^256.

2^256 is a large number, but it is less than 58! (in fact, 58 is the smallest n for which n! exceeds 2^256).

This means that we can encode any number up to 2^256 with a permutation of 58 workbench entries.

So a miner just has to choose 58 pairs of Lagrangians/path extremities, and work on them so that the difficulties are in decreasing order under the permutation corresponding to the hash of the block.  In the example of four Lagrangians A B C D, say the permutation corresponding to the number 13 is:

C A D B

Then the miner works first on C; then he works on A until he finds a better result (achieving a better difficulty) than the one found for C, and so on until he reaches B.
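The post doesn't fix a particular bijection between integers and permutations; one standard choice is lexicographic unranking via the factorial number system, which reproduces the example above (13 over A B C D gives C A D B):

```python
# One standard way to turn an integer (here, the block hash) into a
# permutation: lexicographic unranking via the factorial number system.
# With 58 workbench entries, 58! > 2^256, so any block hash fits.
from math import factorial

def unrank_permutation(n, items):
    """Return the n-th permutation of `items` in lexicographic order."""
    items, result = list(items), []
    for i in range(len(items), 0, -1):
        index, n = divmod(n, factorial(i - 1))
        result.append(items.pop(index))
    return result

print(unrank_permutation(13, ['A', 'B', 'C', 'D']))   # ['C', 'A', 'D', 'B']
assert factorial(58) > 2 ** 256                       # a 256-bit hash always fits
```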

The full paths he computed cannot be published in the workbench, because the workbench was needed to compute the hash of the block, which in turn was needed to compute the full paths.  So to avoid circularity we can only publish the extremities of the paths in the workbench.

So in addition to the transaction database, i.e. the block chain, there would be two additional databases:

- a workbench database, which contains Lagrangians and path extremities; this would be linked to the block chain database via the hashWorkBench foreign key;
- a full-path database, which contains the full paths giving the least known actions for workbench entries.



Note 1:  I've made a mistake here.  There is an intermediate database, because what I call a workbench here is not made of just one Lagrangian, but a series of them.
hero member
Activity: 686
Merit: 500
I think it's worth pursuing as a thought experiment. Attaching folding to a bitcoin-like system would be huge.
legendary
Activity: 1288
Merit: 1080
I don't quite understand your question.  Please rephrase.

I guess that's because I have no idea what folding is really doing.

But if reaching a result for a folding problem using today's existing methods takes, let's say, 10,000 operations, and getting the same result through this "least action" alternative takes 15,000 operations, then that is a 50% increase in work.

That's what I was getting at: is the loss minimal, or might it be large?

[Edit: fix typo on math.]

Computing the action is very fast for a given path.  That's one reason why I think it's a good candidate to be used instead of a digest algorithm.

Basically, the action uses a function called the Lagrangian, which is easy to compute for a given set of parameters.
But the action is the integral of the Lagrangian along a path of parameters, and we want the path that gives the least action.  So the only way is to try a huge number of possible paths, and there are really a lot of them.

But for a given path, and a given Lagrangian, the action is very quick and easy to compute.

It's really, just like in bitcoin, a brute-force computation, and the result with the least value is the winner.
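In the same spirit, here is a toy brute-force search (my own placeholder Lagrangian and parameters): evaluating the action of any given path is cheap, and the work is in trying a huge number of paths and keeping the best one.

```python
# Toy brute-force minimization: evaluating the action of a given path is cheap,
# so we just try many random paths between fixed endpoints and keep the one
# with the least action.  The quadratic Lagrangian is a placeholder.
import random

def lagrangian(q, qdot):
    return 0.5 * qdot ** 2 + 0.5 * q ** 2

def action(path, dt=0.01):
    return sum(lagrangian(q0, (q1 - q0) / dt) * dt
               for q0, q1 in zip(path, path[1:]))

def brute_force_least_action(q_start, q_end, steps=100, tries=2000):
    best_path, best_action = None, float('inf')
    for _ in range(tries):
        interior = [random.uniform(-2.0, 2.0) for _ in range(steps - 1)]
        path = [q_start] + interior + [q_end]
        a = action(path)
        if a < best_action:                 # keep the least action seen so far
            best_path, best_action = path, a
    return best_path, best_action

_, least = brute_force_least_action(0.0, 1.0)
print("least action found:", least)
```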


Edit:  Oops.  Sorry, I'm not sure I answered your question, actually.

I read your question again.  The method I propose here is not much different from the standard method currently used in BOINC (I don't think so, anyway).  If there is a loss, I think it would be compensated by the fact that miners would get rewarded for their work.  folding@home already has far less computing power behind it than bitcoin, and that's kind of a shame, imho.
legendary
Activity: 2506
Merit: 1010
I don't quite understand your question.  Please rephrase.

I guess that's because I have no idea what folding is really doing.

But if reaching a result for a folding problem using today's existing methods takes, let's say, 10,000 operations, and getting the same result through this "least action" alternative takes 15,000 operations, then that is a 50% increase in work.

That's what I was getting at: is the loss minimal, or might it be large?

[Edit: fix typo on math.]
legendary
Activity: 1288
Merit: 1080
The overall sum of all values is used to compare the work of different miners.

Would this cause significantly more computing work to be expended to support this alternate proof of work approach?

I don't quite understand your question.  Please rephrase.
legendary
Activity: 2506
Merit: 1010
The overall sum of all values is used to compare the work of different miners.

Would this cause significantly more computing work to be expended to support this alternate proof of work approach?
legendary
Activity: 1288
Merit: 1080
I've just had an idea about how to insert information, thus avoiding the whole tournament scheme and switching back to a more bitcoin-like process.

Let's say we have 40 molecules that we want to fold.  We can encode information using the permutations of these molecules.  There are 40! permutations, which is about 2^159, nearly enough to encode a bitcoin address (which is precisely a RIPEMD-160 hash; strictly, 41 molecules would be needed to cover all 2^160 values).

So a miner must compute minimal values for each molecule, in decreasing order, assuming there is a way to compare minima across different molecules (maybe with an energy/molecular-mass ratio or something).  The overall sum of all values is used to compare the work of different miners.

By doing so, a miner can publish his results as soon as he finds them, pretty much as with bitcoin.  I'm pretty sure this information cannot be used by other miners to forge their own results.  Yet it's not obvious and should be studied more seriously.  I like the idea, though.


PS.  Mapping integers to permutations is not an easy task, but it seems to be well known in maths.  Also, we don't have to use a fixed set of molecules.  As long as a molecule has a unique, unambiguous way to write its chemical composition, we'll be fine:  we just hash this expression and sort the resulting numbers.  Then the miner just has to encode his bitcoin address using a permutation of this list of molecules.
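For illustration, here is how that canonical ordering and the address encoding could fit together (my own sketch: the molecule formulas are placeholders, a SHA-256 prefix stands in for the RIPEMD-160 address, and I use 41 molecules since 40! falls just short of 2^160):

```python
# Sketch of the PS above: hash each molecule's (unambiguous) chemical formula,
# sort the hashes to get a canonical ordering, then encode the miner's 160-bit
# address as a permutation of that list, using the same factorial-number-system
# unranking as in the 58-entry scheme.  Formulas and the address are placeholders.
import hashlib
from math import factorial

def canonical_order(formulas):
    return sorted(formulas, key=lambda f: hashlib.sha256(f.encode()).digest())

def unrank_permutation(n, items):
    items, result = list(items), []
    for i in range(len(items), 0, -1):
        index, n = divmod(n, factorial(i - 1))
        result.append(items.pop(index))
    return result

molecules = canonical_order(['MOL%d' % i for i in range(41)])    # 41! > 2^160 > 40!
address = int.from_bytes(hashlib.sha256(b'miner public key').digest()[:20], 'big')
work_order = unrank_permutation(address, molecules)              # the miner's preference order
print(work_order[:5])
```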
legendary
Activity: 1288
Merit: 1080
It is often objected that the current computing task performed by miners is a waste of CPU.  Though many good answers have been given to this objection, the most basic is still that we just don't know any other way of electing nodes in an anonymous, decentralized and verifiable way.  Sure, it would be great if the mining process could have a useful side effect.  But so far we just don't know any way to do that.

I've been thinking about it, and here is a rough idea: using the least action principle to solve physical problems.  As you may know, finding the correct folding process for long molecules is a very tough problem to solve with a computer.  It is so tough that it often requires a large number of computers working together in a framework such as BOINC (see folding).  IIRC, this problem is solved using the least action principle (or the least configurational energy; I confess I don't know the details well, but you get the idea), which is at the root of both classical and quantum mechanics.  Basically, all nodes try to find a geometrical structure of the molecule that minimizes a certain function. They use more or less a brute-force method: they try plenty of structures until they find one that gives an action that is considered low enough.

See the parallel with the proof of work system used in bitcoin?

Now, of course, replacing the cryptographic proof of work with a least-action calculation would not be easy, because there are still some crucial differences.

In a cryptographic proof, you can put information in your work, so that nobody can "steal" your work.

Let's call "PAYLOAD" the message you must work on (in bitcoin, that would be the Merkle root of the transactions).

If you want people to have an incentive to work on this payload, they have to be allowed to add some way to identify themselves, so instead of computing a hash of the form:

PAYLOAD:NONCE

They will compute a hash of the form:

PAYLOAD:USER_ID:NONCE

In bitcoin, the USER_ID is basically the equivalent of the coinbase transaction.
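In miniature, the trick looks like this (a sketch only; real bitcoin hashes a structured block header with double SHA-256, while a single SHA-256 over a simple string is enough to show the idea):

```python
# Because the miner's identity is hashed together with the payload, nobody can
# take a published proof of work and re-attach it to their own identity.
# Single SHA-256 over a simple string, for illustration only.
import hashlib

def proof_of_work(payload, user_id, difficulty_bits=16):
    """Find a nonce such that SHA-256(PAYLOAD:USER_ID:NONCE) has at least
    `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(b'%s:%s:%d' % (payload, user_id, nonce)).digest()
        if int.from_bytes(digest, 'big') < target:
            return nonce
        nonce += 1

print(proof_of_work(b'merkle_root_of_transactions', b'miner_coinbase_address'))
```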

With the least action, it is much more difficult to do something like that, because the action depends only on the sequence of amino acids and their geometrical structure.   The sequence is the equivalent of the payload, and the geometrical structure is the equivalent of the nonce.  There is no way to include any extra information.

This means that as soon as someone publishes a particularly efficient folding, there is no way to prevent anyone from just quickly copying it and publishing it to the network, pretending he found it himself.  Trying to use timing considerations would probably be useless, as the inevitable latency in the network makes it impossible to be sure of the exact time a message was introduced.  That's basically the reason why the double-spending problem existed and why bitcoin had to solve it.

Isn't there a way to overcome this difficulty, though?  I believe there has to be.  It would work in the same way as sporting events (I was inspired here by chess tournaments): you need to find a way to force players to admit their defeat.

##
EDIT:  This scheme is probably not the best way to do this.  See next message for a better idea.
##

Here is the basic idea.

First, we share a list of molecules that we want to work on.  Then we organize the "competition", pretty much as we would for a sporting event:

- We register all players.  They are of course identified by a public key.  Decentralizing this step might not be easy, but I doubt it's impossible;
- the "round" begins: players compute actions for each molecule.  They can focus on a particular molecule, or work on all of them; it doesn't matter.  If they want to work on only one, they just pick a random geometric structure for all the others, but in the end they publish a result for each molecule;
- they don't publish cleartext results: they publish encrypted, signed results.  The encryption scheme is chosen by the player, and only he knows the key (either an asymmetric key, or a symmetric key, in which case it must be a different one for each round); a minimal commit-and-reveal sketch is given after this list;
- when a player receives the (encrypted) results of another player, he must sign them and publish his signed version to the rest of the network;
- once a certain fraction of players have submitted their results, or once all players that have published results agree that the time limit is over, the round is finished;
- at this moment, all results are available in their encrypted version, and they are all signed by all players who have published results;
- now all players publish their decryption keys.  Everyone can see who found the minimal action, and nobody can pretend he did it if he actually did not;
- we can choose to assign a different crypto-currency to each molecule, or measure the relative progress for each molecule and select the player who has made the best progress on any molecule; it does not matter much;
- the winner has the right to publish a block of transactions for this crypto-currency, the first transaction being a coin-creation transaction, just as in bitcoin.
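Here is a minimal sketch of the publish-then-reveal step (I approximate the "encrypt now, disclose the key later" idea with a salted hash commitment and leave out the signatures, just to show the binding property):

```python
# Minimal commit-and-reveal sketch: a player publishes a binding commitment to
# his result during the round, and reveals the opening afterwards.  The thread
# proposes real encryption plus cross-signatures; a salted hash commitment is
# used here only to show that results can't be altered or copied once committed.
import hashlib
import os

def commit(result):
    """Return (commitment, salt).  Publish the commitment during the round."""
    salt = os.urandom(16)
    return hashlib.sha256(salt + result).digest(), salt

def reveal_ok(commitment, salt, result):
    """After the round, everyone reveals (salt, result); anyone can verify."""
    return hashlib.sha256(salt + result).digest() == commitment

result = b'molecule=M;action=1.2345;path=...'
c, salt = commit(result)
assert reveal_ok(c, salt, result)           # honest reveal verifies
assert not reveal_ok(c, salt, b'forged')    # a copied or altered result does not
```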

There are certainly lots of difficulties with this scheme.  One that I see is that in order to verify the block chain, one must know the whole history of all the tournaments that occurred.  That can require a lot of disk space.  But hopefully, with increasing disk capacities, it should not be that big of a problem.

Well, here it is.  I know it can look far-fetched, but I believe the "bitcoin is a waste of CPU" argument is not a stupid one.  So I really wanted to imagine an alternative protocol that would solve this issue.  Or at least I tried.
