Quoted for future reference. I'm not going to bold the "eventually" in the last paragraph for emphasis. It is a raw quote.
I just wanted to point out that gmaxwell is again spreading FUD and false information.
Only the decoder portion of this software is normative, though a significant amount of code is shared by both the encoder and decoder.
You appear to be claiming that I misquoted RFC 6716. You do realize that I
wrote (and revised through many subsequent drafts) that section of the RFC, right? It's not clear to me specifically how you're misunderstanding that sentence, but what it's saying is no different there than what we might say about Bitcoin: only the code that defines how a node responds to a block is normative, not the code that actually authors a new block. I'll consider accepting an apology, but after all the other times you've been a jerk spewing ad hominem attacks and vague allegations about subject matter you were totally wrong about, you're going to have to phrase your apology very nicely.
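To make the distinction concrete, here is a toy sketch of that split (every name here is hypothetical; nothing is taken from the Bitcoin or Opus sources):

    #include <stdbool.h>
    #include <stdio.h>

    struct block { int version; unsigned nonce; };

    /* NORMATIVE side: how a node responds to a block it receives.
     * Every implementation must agree exactly on accept/reject here,
     * just as every Opus implementation must agree on decoder output. */
    static bool validate_block(const struct block *b)
    {
        return b->version >= 1;  /* stand-in for the real consensus rules */
    }

    /* NON-NORMATIVE side: how a block gets authored in the first place.
     * Implementations are free to differ here, like Opus encoders, so
     * long as whatever they produce passes validation on every node. */
    static struct block author_block(void)
    {
        struct block b = { .version = 2, .nonce = 0 };
        return b;
    }

    int main(void)
    {
        struct block b = author_block();
        printf("accepted: %d\n", validate_block(&b));
        return 0;
    }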
And would you cut it out with the allegations that you need to quote me for any particular reason? If for some bizarre reason I wanted to edit my comments, I could edit them right inside your posts. As far as I can tell, you're doing that to imply that I'm dishonest in order to piss me off. Congratulations, you've been successful. I'm wondering if you behave like this in person, and if so, do you get a discount on cold compresses for all the black eyes you must inspire people to give you?
Algorithms and protocols are typically specified in machine-checkable form using languages such as +CAL/TLA+ and PROMELA. Not C code. The implementation should follow the specification: "a program is an algorithm plus an implementation of its data operations".
An argument can be made for that, but I am very skeptical. Bitcoin is a distributed consensus system. The deepest and most fundamental definition of correctness, above _all_ other considerations, is that the nodes actually achieve global consistency. It is more important that Bitcoin nodes achieve consensus than that they behave in accordance with any other particular definition of "correct" (other forms of correctness matter too, of course, but they are a dim flicker next to the furnace of consistency). A specification can say many things, but if the installed nodes behave differently from the specification yet consistently with each other, and the difference is over a matter which would cause a failure to achieve consensus, then it is the specification which is wrong in any practical sense, not the nodes, and it is the specification which must be changed. In effect, any not-practically-executable specification is a dead letter: pure fodder for rules-lawyers to debate, and not something that would exert influence on the real world.
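To illustrate with a toy example (the rules and numbers here are made up; nothing is from any real codebase):

    #include <stdbool.h>
    #include <stdio.h>

    /* The written rule versus what every installed node actually runs. */
    static bool spec_says_valid(int tx)     { return tx <= 100; }
    static bool deployed_says_valid(int tx) { return tx <= 1000; }

    int main(void)
    {
        int tx = 500;  /* falls in the gap between spec and deployment */

        /* Every installed node agrees with every other installed node,
         * so the network stays consistent and keeps working... */
        printf("node A accepts: %d\n", deployed_says_valid(tx));
        printf("node B accepts: %d\n", deployed_says_valid(tx));

        /* ...even though the paper spec disagrees. A node patched to
         * match the paper would simply fork itself off the network,
         * which is why it's the spec that has to change. */
        printf("paper spec accepts: %d\n", spec_says_valid(tx));
        return 0;
    }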
A parallel formal description may be good for additional confidence, but unless it can be extracted (e.g. like the software that extracts code from Coq proofs) into reasonably performant, production-grade code, it cannot practically be the normative specification. Sure, you could call it that, but the claim would be a lie. It's worth noting that suitably authored (e.g. ACSL-annotated) C is machine checkable, and also runnable, allowing a whole layer of additional tests (e.g. catching some cases where a checker can prove that the algorithm will complete, but fails to warn you that it would take 2^64 operations or words of memory to get there...), and also allowing it to be an actual normative specification which could functionally define the behavior on the network, not just on rules-lawyer paper but also in practice, by actually defining the behavior of real nodes which must be exactly matched.
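For a flavor of what that looks like, here is a minimal ACSL-annotated function (a hypothetical example, not from the Opus sources) which a Frama-C-style checker can verify and an ordinary compiler can run:

    #include <stddef.h>
    #include <stdio.h>

    /*@ requires n > 0;
      @ requires \valid_read(a + (0 .. n-1));
      @ assigns \nothing;
      @ ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
      @ ensures \exists integer k; 0 <= k < n && \result == a[k];
      @*/
    int max_element(const int *a, size_t n)
    {
        int best = a[0];
        size_t i;
        /*@ loop invariant 1 <= i <= n;
          @ loop invariant \forall integer k; 0 <= k < i ==> best >= a[k];
          @ loop invariant \exists integer k; 0 <= k < i && best == a[k];
          @ loop assigns i, best;
          @ loop variant n - i;
          @*/
        for (i = 1; i < n; i++) {
            if (a[i] > best)
                best = a[i];
        }
        return best;
    }

    int main(void)
    {
        const int xs[] = { 3, 9, 2 };
        printf("%d\n", max_element(xs, 3));  /* prints 9 */
        return 0;
    }

The annotations are ordinary C comments, so the very same file serves both as a machine-checkable specification and as executable code.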