There is absolutely _no difference_ between bootstrapping from the hash of the genesis block and from the hash of a UTXO set at a given height in terms of resistance to forgery. Both are immune to forgery as long as the node we are bootstrapping from is able to convince us about the raw data each hash is committed to.
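The point is that a hash commitment is self-certifying: given the raw data, anyone can recompute the hash and compare, regardless of whether the trusted value is the genesis hash or a UTXO-set hash. A minimal sketch (the double SHA-256 mirrors Bitcoin's hashing convention; the function names are mine):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_commitment(raw_data: bytes, trusted_hash: bytes) -> bool:
    """A hash commitment cannot be forged: the peer must supply raw data
    that actually hashes to the value we already trust, whether that
    value is the genesis block hash or a UTXO-set hash."""
    return sha256d(raw_data) == trusted_hash
```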
That is completely wrong. First of all, nodes do not just have a hard coded hash of the genesis block. The genesis block is small, so it is hard coded into the software itself.
That is irrelevant; there is no need for a hardcoded copy of the raw genesis block. It is a very small optimization, not important at the protocol level.
But a UTXO set as the initial state, that's very different. Hardcoding a UTXO set is infeasible because it is large. Because it is large, you can't visually inspect it and make sure there are no unexpected UTXOs in it.
There is no need to hardcode either the raw genesis block or a hypothetical UTXO set. For the latter we don't even need the hash. Please check the link to SDUC above.
According to my proposed algorithm, nodes do not check any initial UTXO set; they just need an arbitrary number of blocks as their pruning length, and they can tune how many commitments a UTXO set must accumulate to be considered secure. A straightforward calculation shows that an SDUC-compatible node with a pruning length of around 10,000 blocks is secure against attacks that cost up to half a billion USD at current prices and hashrates. Far more than enough for most ordinary applications.
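As a rough illustration of that calculation (the subsidy and price figures below are assumptions for the sake of the example, not protocol constants): rewriting a pruning window of N blocks costs the attacker at least the block rewards of the honest chain they must out-mine.

```python
# Back-of-the-envelope cost of rewriting a pruning window of N blocks.
# Both constants are illustrative assumptions, not protocol values.
BLOCK_SUBSIDY_BTC = 12.5   # assumed current subsidy per block
BTC_PRICE_USD = 4_000      # assumed market price

def rewrite_attack_cost(pruned_blocks: int) -> float:
    """Lower bound: the attacker must out-mine the honest chain over
    the whole window, so the attack costs at least as much as the
    rewards earned by honest miners over that window."""
    return pruned_blocks * BLOCK_SUBSIDY_BTC * BTC_PRICE_USD

print(f"{rewrite_attack_cost(10_000):,.0f} USD")  # -> 500,000,000 USD
```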
Miners couldn't collude to insert a malicious UTXO commitment into blocks, for the same reason they couldn't do it with any other form of malicious data: full nodes will reject such blocks and commit to the alternative healthy chain.
New nodes coming online won't know. Under this scheme, a new node that comes online is unable to verify that its initial UTXO set is correct. It can be forked off onto a separate chain which has an invalid initial UTXO set, and the new node would have no idea.
Not necessarily; even current SPV nodes do not commit to an invalid chain. It is absolutely possible for nodes to download block headers before starting to count the number of commitments made by the latest blocks to a candidate UTXO set. Note that in this scheme I have proposed a reverse verification algorithm in which we verify the blockchain from the latest blocks downward.
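A sketch of that reverse verification, under my reading of the scheme (the helper names and the `utxo_commitment` field are hypothetical):

```python
def count_utxo_commitments(recent_blocks, candidate_utxo_hash: bytes,
                           required: int) -> bool:
    """Hypothetical sketch of SDUC-style reverse (tip-down) verification.

    Assumes the header chain has already been downloaded and validated
    for proof of work, exactly as SPV nodes do, so `recent_blocks`
    (full blocks inside the pruning window, newest first) belong to the
    heaviest chain. Each block is assumed to expose the UTXO-set hash
    it commits to as `block.utxo_commitment`.
    """
    confirmations = 0
    for block in recent_blocks:              # walk from the tip downward
        if block.utxo_commitment == candidate_utxo_hash:
            confirmations += 1
            if confirmations >= required:    # enough PoW buried on top
                return True
    return False
```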
Speaking of centralization, let's take a look at the trade-offs involved:
Current situation
pros:
- secure against complete chain rewrite attacks
- nothing else
You forgot a few:
- Almost no trust in any third parties required. The only trust necessary is in developers for software whose code can also be independently audited
- The entire history from the beginning of Bitcoin is verifiable
OK, but the first one is obvious and common to both methods, and the second one is not an advantage; it is a disaster that I already pointed out as a limitation.
- downward compatibility and software bloat, because of the need to re-run the protocol from the root
Have you looked at Bitcoin Core with regard to the consensus rules? There's very little bloat because the protocol really hasn't changed that much. Those things that would cause bloat have been reduced to simple if statements or outright removed (BIP 34 style activation has been entirely removed, and those soft forks are activated by block height/block hash or enforced from genesis). See the sketch below.
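For instance, a buried deployment collapses into a single height comparison. The heights below are the real mainnet activation heights, but the Python itself is only a sketch of what Bitcoin Core expresses in C++:

```python
# Buried soft-fork deployments: activation reduces to a height check.
BIP34_HEIGHT = 227_931   # coinbase must include the block height
BIP66_HEIGHT = 363_725   # strict DER signatures
BIP65_HEIGHT = 388_381   # OP_CHECKLOCKTIMEVERIFY

def enforce_bip34(height: int) -> bool:
    """Once buried, the rule is unconditional above its height."""
    return height >= BIP34_HEIGHT
```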
The activation of forks based on block height/hash is itself an example of software bloat, and your argument is not correct anyway, because we are not speaking of just the last ten years; think more strategically.
- inherent resistance to change and evolution over time
Why is that a con? The blockchain is supposed to be immutable after all. Not only that, but Bitcoin does evolve, just slowly and cautiously, as you should when dealing with a system that handles large amounts of money.
Securing money needs evolution and adaptation. When you impose backward compatibility on nodes, it acts like an artificial inertia that complicates things and increases the risks; it is why Bitcoin has evolved so slowly. To be honest, it hasn't happened intentionally or because you guys have been cautious; it has always been about downward compatibility. I understand there is a natural inertia as well, and there should be, but it should only be about the need to guarantee a few things: 1) inflation policy, 2) security, 3) immutability (of data, not the protocol nor the code), 4) decentralization, and 5) censorship resistance.
- Great decentralization effects, because fast and cheap fully functional nodes become a reality
There's more to decentralization than just more nodes. More nodes does not mean more decentralization. Take Ethereum for example. It has many light nodes, but I would not say it is decentralized. They are centralized under the Ethereum developers who can just say "we're having a hard fork at this time" and then say "sorry guys, hard fork canceled" and the entire network just does as they say. That is not decentralization, but they have many nodes.
Ethereum nodes are irrelevant; typically they are not full nodes, and the whole Ethereum thing with its governance issues is a joke, I admit, but it is still irrelevant.
More full nodes means more decentralization, not absolute decentralization, because we have pools and miners to take care of as well.
- Improved software quality, because of a significant reduction in the need for downward compatibility
Just about everything that is currently needed for consensus validation would be needed in the UTXO commitment model. In fact, this would cause more bloat because now we have to handle a new rule: UTXO commitments. All of the other things needed for backwards compatibility like non-segwit outputs still need to be maintained.
Handling a UTXO commitment is trivial, and it opens the door to a much cleaner approach to downward compatibility, because obsolete transaction types no longer need to be validated. Spending old UTXOs still needs a downward-compatible script processing engine, I accept that, but it is not true for the validation routine.
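To be concrete about why handling the commitment itself is trivial: a node only has to serialize its UTXO set deterministically, hash it, and compare against the committed value. A naive sketch (real proposals use incrementally updatable structures such as Merkle trees or rolling hashes, which this toy version ignores):

```python
import hashlib

def utxo_set_commitment(utxos: dict) -> bytes:
    """Naive UTXO-set commitment: hash the canonically sorted set.
    `utxos` maps (txid: bytes, vout: int) -> serialized output bytes.
    Real designs use incrementally updatable accumulators so the
    commitment doesn't require re-hashing the whole set every block."""
    h = hashlib.sha256()
    for (txid, vout) in sorted(utxos):       # canonical ordering
        h.update(txid)
        h.update(vout.to_bytes(4, "little"))
        h.update(utxos[(txid, vout)])
    return h.digest()
```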
cons:
- vulnerability to ultra-long/complete chain rewrite attacks, which are not practical anyway
- nothing else
You forgot a few:
- Increased trust in third parties
And if you are talking about UTXO commitments in every block, there are more cons:
- Additional time to validate and produce blocks due to the need to hash the UTXO set
- Issues with the commitments themselves, where they do not necessarily commit to all UTXOs, or collide and commit to UTXOs that do not exist
Now I'm not saying the UTXO commitments are universally bad. It is an active area of research and someone may come up with a breakthrough which allows us to do this kind of pruning trustlessly. However, as it is now, UTXO commitments introduce more trust in others, and pruning the blockchain so that it is just an initial UTXO set from which blocks are built on is simply introducing too much trust to the system.
But UTXO commitments will certainly be useful even if they introduce more trust. Instead of pruning the blockchain to some initial state, they would allow for a faster sync. You can do a sort of hybrid sync where you use a UTXO set initially and check it against a UTXO commitment in blocks so that you can be synced very quickly. Then in the background the blockchain can be downloaded and checked. This would allow the best of both worlds: you can have a faster sync and reduce your node's resource usage, and the full history is still verifiable, so you can verify that the UTXO set you received initially is correct. This would be optional: for those who want a faster sync and don't care that they are trusting others, they can use the UTXO commitments. For those who want a faster sync and are okay with initially trusting but later verifying the UTXO set, they can use the hybrid sync. And for those who want the least amount of trust in others and to verify everything, they can do the full sync of the blockchain.
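That hybrid flow could look roughly like this; everything is injected as a callable because the peer and validation machinery are hypothetical, not an actual Bitcoin Core interface:

```python
from typing import Callable

def hybrid_sync(fetch_snapshot: Callable[[], dict],
                committed_hash: bytes,
                hash_snapshot: Callable[[dict], bytes],
                start_serving: Callable[[dict], None],
                full_validate_in_background: Callable[[], None]) -> None:
    """Hypothetical trust-but-verify hybrid sync.

    1. Fetch a UTXO snapshot and check it against the UTXO commitment
       found in a recent, PoW-buried block, so the node is usable
       almost immediately.
    2. Kick off a background full validation from genesis that later
       confirms (or rejects) the snapshot trustlessly.
    """
    snapshot = fetch_snapshot()
    if hash_snapshot(snapshot) != committed_hash:
        raise ValueError("snapshot does not match the committed UTXO hash")
    start_serving(snapshot)            # fast sync: node is live now
    full_validate_in_background()      # full audit completes later
```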
Now you are speaking a bit more rigorously. I maintain that your "need for trust" issue is not correct, though I admit there is a price to pay for such a huge utility. Unlike what you say, it is not about trust; it is about the desired security level against long-range attacks: when users feel there is a more-than-one-billion-dollar incentive for an adversary to commit to an attack, they need to tune for a 20K+ commitment count and pruning height.