
Topic: Blockchain Compression - page 3. (Read 8657 times)

legendary
Activity: 905
Merit: 1012
July 03, 2013, 04:32:19 PM
#28
Quote
You can't have full node security by bootstrapping off a miner-majority consensus, that's SPV level security by definition. To bootstrap a full node in a trust free manner you have to process the entire chain, or get a copy of the database from somewhere you "know" is correct (i.e. another one of your own nodes).

Right now the choices are SPV or full node. The misnamed “ultimate blockchain compression” proposal would enable a wide spectrum of security modes in between the two. Initial synchronization requires an SPV level of trust in the checkpointed UTXO set. However, from that point forward full validation can be performed, and history can be incrementally validated as far back as required (to the last checkpoint, for example, or the full history). The ability to construct fraud proofs provides an additional layer of security, as it gives a compact way to inform other peers of an attack in progress. These are valuable options that enable secure limited-trust operation on constrained devices, or a graduated level of security as synchronization is performed.

Quote
The second claim that it helps lightweight clients also isn't really right. I've actually implemented lightweight mode (the one described by Satoshi at least) and UTXO commitments don't really change much, at least not fundamentally. You still need all the block headers to track the consensus, and at that point you may as well ask for transactions that match your keys at the same time using the existing merkle trees. A UTXO commitment in the coinbase might make things faster if you were to import a private key, at the cost of significantly increasing the resource usage of every single node on the network. Given that private key import is (or should be) rare, this isn't a good tradeoff.

Mike, you seem content to trust that peers will send you transactions matching your bloom filter. I do not find that reasoning compelling. The UTXO index will give a short, verifiable proof that matching transactions are or are not reported.
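To make the idea of a "short, verifiable proof" concrete, here is a minimal Python sketch of a Merkle branch check (a simplified toy, not the actual proposal's tree format or serialization; exclusion proofs additionally require a sorted, authenticated tree, so only inclusion is shown). A verifier holding nothing but a committed root can check a log-sized branch to confirm a transaction really is in the set:

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    # Bitcoin-style double SHA-256
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(leaves):
    # Binary Merkle tree; duplicate the last node on odd levels,
    # as Bitcoin's block merkle tree does.
    level = [sha256d(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves, index):
    # Collect the sibling hashes needed to recompute the root from one leaf.
    level = [sha256d(x) for x in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        branch.append(level[index ^ 1])
        level = [sha256d(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify_branch(leaf, branch, index, root):
    # The verifier needs only the leaf, the O(log n) branch, and the root.
    h = sha256d(leaf)
    for sibling in branch:
        h = sha256d(sibling + h) if index % 2 else sha256d(h + sibling)
        index //= 2
    return h == root

txs = [b'tx-a', b'tx-b', b'tx-c', b'tx-d', b'tx-e']
root = merkle_root(txs)
proof = merkle_branch(txs, 2)
assert verify_branch(b'tx-c', proof, 2, root)       # included
assert not verify_branch(b'tx-z', proof, 2, root)   # not included
```

The proof is a handful of hashes regardless of how many transactions the set holds, which is what makes it cheap for a peer to demonstrate that matching transactions were or were not reported.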
legendary
Activity: 1526
Merit: 1134
July 03, 2013, 04:31:49 PM
#27
Well Gavin, at least for now, isn't too worried about disk space usage.  It'd only take a couple of months to fully implement the last parts of pruning if he worked on it seriously, and it's hard to imagine the block chain doubling in size in only a few months. That'd be a nice problem to have. Even then, 16 GB isn't all that much. I have 182 GB free on my computer and it's a pretty ordinary laptop.

At the moment Gavin's biggest worry is security. The payment protocol is designed to complement the Trezor project and give us better security than the best banks. Also, as I said, Pieter has said he wants to work on pruning, so that might be another reason Gavin's not prioritising it at the moment. Pieter has a full time job these days so it's harder for him to find the time, but even so, he's done amazing work on scalability and I'm sure he'll be able to finish off the pruning work some time this year. I'd guess the block chain will be under 12 gigs by the time pruning starts. So no risk of a crisis.

In short, yes it's important and will happen, but we're not going to see Bitcoin collapse if it doesn't happen this year. The numbers just don't work out that way.

Quote
Awesome job. However, how does the use of SPV clients prevent centralization of the Bitcoin network? Are SPV clients equally sovereign when it comes to deciding which blockchain, which new blocks and transactions are valid?

I guess that's a rhetorical question? Anyway, at least for other people who are reading: no, as described in Satoshi's paper they only validate the block headers. So they pick the hardest chain and then assume the contents are likely to be correct. You can't mine with them, but in practice "best chain is correct" is usually good enough, at least for normal individual users.

If you can afford a full node, that's better. But most casual users won't want to run one. The next most decentralised after that is SPV. No central or trusted servers. You place your trust in the majority consensus instead.
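The "hardest chain" rule can be sketched in a few lines of Python (a simplified model, not actual client code): each header's contribution of work is derived from its difficulty target, and the client follows whichever chain of headers has the greatest cumulative work, not the greatest length.

```python
# Work contributed by a block is roughly 2^256 / (target + 1); an SPV
# client sums this over each candidate header chain.
def work_from_target(target: int) -> int:
    return (1 << 256) // (target + 1)

def chain_work(targets) -> int:
    # Total proof-of-work of a header chain, given each header's target.
    return sum(work_from_target(t) for t in targets)

def best_chain(chains):
    # "Hardest chain" = greatest cumulative work, not most blocks.
    return max(chains, key=chain_work)

# A short chain of hard blocks beats a longer chain of easy ones.
easy = [2**240] * 5   # five low-difficulty headers
hard = [2**230] * 3   # three much harder headers
assert best_chain([easy, hard]) is hard
```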

If we didn't have SPV clients then Bitcoin usage would be dominated by services like blockchain.info or Coinbase. Both great companies and sites, but ultimately people would end up putting their money into "BitBanks" and relying on third party companies. Those organisations would then either get regulated out of existence, or become real banks and start lending out their bitcoins at interest. In some ways Bitcoin usage is already dominated by these sorts of wallets and it's not even hard to run bitcoin-qt today, so you can imagine how common it'd be in future with much larger blocks. That's why my no 1 priority is to try and build SPV wallets that are competitive with centralised bank-like services, to avoid that future and keep people using the P2P network directly without any middlemen (or more technically, with only p2p nodes and miners as middlemen).

Re: conspiracy theories, yes, sorry. Quite often people on this forum claim that because I work for Google, I must have some incentive to try and centralise Bitcoin. It's nonsense of course, Google is focused on Google Wallet and hardly cares about Bitcoin. In fact Google not only employs me but also Pieter, who has done more work on pruning than anyone. It's about as relevant as the fact that Jeff once worked for Red Hat.
legendary
Activity: 1078
Merit: 1003
July 03, 2013, 03:49:04 PM
#26
I apologize for my rudeness, my frustration got the best of me.

Quote
I already explained that block chain pruning will allow you to run a full node that uses an arbitrary (user specified) amount of disk space.

My complaint was that implementing it is not a priority when it should be and that implementing it too late might have grave consequences.

Quote
By the way, I've spent over two years implementing SPV mode so people can use the P2P network directly without having to download the block chain - and without relying on any trusted servers. Only a handful of people have put as much effort into keeping Bitcoin decentralised as I have.

Awesome job. However, how does the use of SPV clients prevent centralization of the Bitcoin network? Are SPV clients equally sovereign when it comes to deciding which blockchain, which new blocks and transactions are valid?

If you read back my post, you will notice that I called you a brilliant programmer because I really think you are one. And I never said you are not putting in a significant effort in order to improve Bitcoin. My complaint was that the development that you are contributing isn't the development that Bitcoin crucially needs in order to remain decentralized and prosper. I mean unless I'm mistaken about SPV clients in which case please correct me.


p.s.: while I admit that the tone of my post was not entirely constructive, you definitely didn't rise above me by calling me a conspiracy theorist
legendary
Activity: 1526
Merit: 1134
July 03, 2013, 03:03:38 PM
#25
hazek, you are not a developer so you really shouldn't get angry about technical topics. I already explained that block chain pruning will allow you to run a full node that uses an arbitrary (user specified) amount of disk space. So when the chain gets to be 55 gigs, OK, if you can spare 55 gigs then keep all of it. Otherwise keep 30 gigs of it. Or 10. Or none. Regardless of what you can afford, you will still be able to run a full node. You just won't be able to serve the whole chain to a new node that's starting from scratch (so someone, somewhere will have to keep a full copy of the chain, but that's fundamental to how Bitcoin works).
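The disk-budget behaviour described here can be sketched roughly in Python (a hypothetical helper, not Bitcoin-Qt's actual pruning logic): given per-block sizes and a user-specified budget, keep the most recent blocks that fit and delete the raw block data below that height, while validation state is kept separately.

```python
def prune_to_budget(block_sizes, budget_bytes):
    """Return the lowest block height to keep so that the most recent
    blocks fit in the user-specified budget.  block_sizes[h] is the
    size of block h.  Oldest blocks are discarded first; the UTXO set
    is maintained separately and is never pruned."""
    used = 0
    keep_from = len(block_sizes)            # keep nothing if budget is 0
    for height in range(len(block_sizes) - 1, -1, -1):
        if used + block_sizes[height] > budget_bytes:
            break
        used += block_sizes[height]
        keep_from = height
    return keep_from  # raw blocks below this height may be deleted

sizes = [100, 200, 300, 400, 500]    # toy per-block sizes in bytes
assert prune_to_budget(sizes, 1000) == 3    # keeps blocks 3..4 (400+500)
assert prune_to_budget(sizes, 10_000) == 0  # budget covers everything
```

Whatever budget the user picks, the node still validates everything; the budget only limits how much of the historical chain it can serve to others.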

If you don't understand what I'm talking about, go do some research and come back when you do understand. Flaming people because you prefer to believe in conspiracy theories than learn hard technical facts just makes you come across as immature.

By the way, I've spent over two years implementing SPV mode so people can use the P2P network directly without having to download the block chain - and without relying on any trusted servers. Only a handful of people have put as much effort into keeping Bitcoin decentralised as I have. So by saying that I don't care about decentralisation you just sound even more uninformed.

Quote
Is there a summary of what is being considered here?

It'll be written up in BIP form soon. Until then, this link is the first result on Google for "bitcoin payment protocol":

https://github.com/bitcoin/bitcoin/pull/2539

Quote
I'd love to hear your technical arguments regarding that.

The idea was to put a commitment to the UTXO set into the coinbase transactions. The claim was this lets you run a full node without processing the full chain, and that it helps lightweight clients. The first claim is not true, which is why it's now crossed out in the original post.

https://bitcointalksearch.org/topic/ultimate-blockchain-compression-w-trust-free-lite-nodes-88208
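For readers unfamiliar with the proposal, here is a toy Python sketch of what "a commitment to the UTXO set in the coinbase" means (the entry format is invented for illustration; the real proposal uses an authenticated search tree so that compact membership proofs are possible, whereas this flat fold only shows the commitment idea itself):

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def utxo_commitment(utxos):
    """Hash each (txid, vout, amount) entry and fold the sorted hashes
    pairwise into a single root that miners could place in the coinbase.
    Sorting makes the commitment independent of insertion order."""
    level = sorted(sha256d(f"{txid}:{vout}:{amount}".encode())
                   for txid, vout, amount in utxos)
    if not level:
        return sha256d(b"")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

set_a = [("aa", 0, 50), ("bb", 1, 25)]
set_b = [("bb", 1, 25), ("aa", 0, 50)]   # same set, different order
assert utxo_commitment(set_a) == utxo_commitment(set_b)
assert utxo_commitment(set_a) != utxo_commitment([("aa", 0, 50)])
```

Two miners with the same UTXO set commit to the same 32-byte root, so a disagreement in the commitment exposes a disagreement about the ledger state.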

You can't have full node security by bootstrapping off a miner-majority consensus, that's SPV level security by definition. To bootstrap a full node in a trust free manner you have to process the entire chain, or get a copy of the database from somewhere you "know" is correct (i.e. another one of your own nodes).

The second claim that it helps lightweight clients also isn't really right. I've actually implemented lightweight mode (the one described by Satoshi at least) and UTXO commitments don't really change much, at least not fundamentally. You still need all the block headers to track the consensus, and at that point you may as well ask for transactions that match your keys at the same time using the existing merkle trees. A UTXO commitment in the coinbase might make things faster if you were to import a private key, at the cost of significantly increasing the resource usage of every single node on the network. Given that private key import is (or should be) rare, this isn't a good tradeoff.

Fortunately, none of that is needed. Satoshi already described everything that is required to make Bitcoin scale in his white paper nearly 5 years ago. Most of it is already implemented. The ability to delete old blocks from the chain when you start running out of disk space is waiting merely for the protocol to be upgraded so nodes can advertise how many blocks they have, and for wallet apps to be upgraded so they find nodes that still have the range of blocks they need to catch up with.
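The advertise-and-discover step described above can be sketched as follows (hypothetical data shapes; the actual protocol change had not been specified at the time): each peer advertises how many recent blocks it still retains, and a catching-up wallet picks a peer that can serve the range it is missing.

```python
def find_peer_with_blocks(peers, start_height, tip_height):
    """peers: list of (peer_id, blocks_retained), where blocks_retained
    is how many of the most recent blocks that peer still stores.
    Return a peer able to serve every block from start_height up to the
    tip, or None if only a fuller node can help."""
    needed = tip_height - start_height + 1
    for peer_id, retained in peers:
        if retained >= needed:
            return peer_id
    return None

peers = [("pruned-a", 288), ("pruned-b", 1000), ("archival", 10**9)]
# A wallet 500 blocks behind needs a peer retaining at least 501 blocks.
assert find_peer_with_blocks(peers, 99_500, 100_000) == "pruned-b"
# A brand-new wallet needs the whole chain: only the archival node will do.
assert find_peer_with_blocks(peers, 0, 100_000) == "archival"
```

This is why, as the post notes, someone somewhere must keep a full copy: pruned peers can serve recent catch-up traffic, but bootstrapping from scratch still needs an archival node.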
sr. member
Activity: 461
Merit: 251
July 03, 2013, 03:02:08 PM
#24
Quote
Again, the "ultimate blockchain compression" thing doesn't make any technical sense, that's why it's not happening.

Quote
I'd love to hear your technical arguments regarding that.
He described them here once: https://bitcointalksearch.org/topic/m.1607170
legendary
Activity: 905
Merit: 1012
July 03, 2013, 02:31:57 PM
#23
Quote
Again, the "ultimate blockchain compression" thing doesn't make any technical sense, that's why it's not happening.

I'd love to hear your technical arguments regarding that.
legendary
Activity: 1232
Merit: 1094
July 03, 2013, 08:45:40 AM
#22
Quote
The payment protocol is foundational work, it is required for a lot of other features.

Is there a summary of what is being considered here?
legendary
Activity: 1078
Merit: 1003
July 03, 2013, 08:29:56 AM
#21
Quote
The payment protocol is foundational work, it is required for a lot of other features. That's why it's important. Pruning is a needed scalability fix but we're far from having some kind of crisis because people can't afford the disk space. A lot of people blow more than 8 gigs of disk space on apps they simply forgot about.

Man, for a brilliant programmer I can't believe what utter shit you sometimes post.

Yes, the disk space currently isn't a problem. But you know damn well that it only takes a little over 2 more doublings of transaction volume in terms of space and we're going to reach the hard limit of 1 MB. And then what, chief? Not to mention that this will mean a 55 GB blockchain size increase per year, but undoubtedly you and Gavin will both lobby for a reckless hardfork then instead of trying to find a different solution now.

But I'm perfectly aware of where your Google-spoiled, centrally-planning head is coming from. You want more control because it's easier from a technical standpoint; you don't care one bit about the decentralization part of Bitcoin, and you don't care what sort of consequences not focusing on this issue now, and forcing a hardfork later, will have.
legendary
Activity: 1078
Merit: 1003
July 03, 2013, 08:23:47 AM
#20
Quote
The solution to both problems is clear and Gavin's prioritization is correct. The block chain is only 8 gigabytes today. That's nothing. If you can't keep up with that then just switch to MultiBit and you'll not need to store the chain any more. It'll be much faster as well.

Yes, let's screw decentralization while we're at it, right? I know you wouldn't mind Bitcoin becoming controlled by a handful of super nodes...
legendary
Activity: 1526
Merit: 1134
July 03, 2013, 07:59:07 AM
#19
The solution to both problems is clear and Gavin's prioritization is correct. The block chain is only 8 gigabytes today. That's nothing. If you can't keep up with that then just switch to MultiBit and you'll not need to store the chain any more. It'll be much faster as well.

Again, the "ultimate blockchain compression" thing doesn't make any technical sense, that's why it's not happening. Pruning is something that Pieter Wuille has said he wants to work on. Most of the work was already done. It requires some protocol upgrades and some other tricky work, then you will probably be able to tell Bitcoin-Qt/bitcoind how much disk space you want to give it. It'll store as much of the chain as it can within that space.

The payment protocol is foundational work, it is required for a lot of other features. That's why it's important. Pruning is a needed scalability fix but we're far from having some kind of crisis because people can't afford the disk space. A lot of people blow more than 8 gigs of disk space on apps they simply forgot about.
kjj
legendary
Activity: 1302
Merit: 1026
July 03, 2013, 07:17:37 AM
#18
Quote
You know, I really don't care if that's true. Because what I do know is that it's wasting time on a problem that we might have instead of spending vital time we don't have on a problem that we have RIGHT NOW, which frustrates me to no end.

A CEO with his prioritization skills would have been fired long ago.

There is certainly room for honest disagreement here.  One could argue that the lack of authentication for addresses and transactions is a problem right now, while the size of the blockchain does not appear to be a problem today, but might become one in the future.

At any rate, Gavin isn't a CEO, he is an engineer.  The payment protocol is something that can surely be solved, while the solution to the blockchain size is unclear.  You can hardly fault an engineer who wishes to solve a problem he knows he can solve in preference to spinning his wheels on something vague.
legendary
Activity: 1078
Merit: 1003
July 03, 2013, 06:50:52 AM
#17
Quote
He intends to solve a problem that was already solved by private companies by implementing it into the reference client

Quote
So, what are they working on?

Quote
An integrated secure invoice system. Something the functionality of which Bitpay and all the other similar platforms already offer.

Quote
I think your understanding of the payment protocol is incorrect.

You know, I really don't care if that's true. Because what I do know is that it's wasting time on a problem that we might have instead of spending vital time we don't have on a problem that we have RIGHT NOW, which frustrates me to no end.

A CEO with his prioritization skills would have been fired long ago.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
July 03, 2013, 06:32:39 AM
#16
Quote
"Ultimate blockchain compression" does not compress the block chain. It's misnamed.

The most important piece of the chain is the unspent outputs set (UTXO set). This is already highly compressed using a custom compression algorithm. The rest of the chain is not compressed, but the solution to growth there is called pruning, which allows for deletion of old data.

Yes, you are technically right, but "compression" captures the thrust of the Reiner-Maaku project for public consumption. I see this as the single most important piece of work being done on Bitcoin at the moment as it partly addresses the long-term scalability problem.
hero member
Activity: 826
Merit: 500
Crypto Somnium
July 03, 2013, 06:24:06 AM
#15
Quote
Are there any plans at some point to add blockchain compression technology to the bitcoin client? The blockchain size will continue to increase exponentially as adoption increases, and the system will become a victim of its own success. It would seem to me adding some compression algorithms to the bitcoin-qt client would be of significant benefit, and eventually an outright necessity to maintain the integrity of the system.


+1 running out of space here :(

Also, I have a question that's a bit off topic, but can a wallet.dat file get corrupted?
kjj
legendary
Activity: 1302
Merit: 1026
July 03, 2013, 05:53:57 AM
#14
Quote
He intends to solve a problem that was already solved by private companies by implementing it into the reference client

Quote
So, what are they working on?

Quote
An integrated secure invoice system. Something the functionality of which Bitpay and all the other similar platforms already offer.

I think your understanding of the payment protocol is incorrect.
legendary
Activity: 1232
Merit: 1094
July 03, 2013, 04:21:48 AM
#13
Quote
An integrated secure invoice system. Something the functionality of which Bitpay and all the other similar platforms already offer.

Hmm, I would tend to agree that focusing on "core" functionality of bitcoin should be highest priority.  Would the intention be that this includes some kind of protocol change?
legendary
Activity: 1078
Merit: 1003
July 03, 2013, 04:20:05 AM
#12
Quote
He intends to solve a problem that was already solved by private companies by implementing it into the reference client

Quote
So, what are they working on?

An integrated secure invoice system. Something the functionality of which Bitpay and all the other similar platforms already offer.
legendary
Activity: 1232
Merit: 1094
July 03, 2013, 04:18:11 AM
#11
Quote
He intends to solve a problem that was already solved by private companies by implementing it into the reference client

So, what are they working on?
legendary
Activity: 1078
Merit: 1003
July 02, 2013, 07:10:04 PM
#10
Quote
Has the main dev team discussed these ideas at all? I think we all see the blockchain size will increase exponentially as acceptance of bitcoin grows.

Unfortunately Gavin has different priorities. He intends to solve a problem that was already solved by private companies by implementing it into the reference client, instead of focusing on scalability, because he knows that when push comes to shove he will likely be able to lobby and get enough support for a hard fork that removes the blocksize limit. He basically doesn't care how big the blockchain gets or how few full nodes we have.
hero member
Activity: 907
Merit: 1003
July 02, 2013, 05:41:18 PM
#9
Would pruning delete bitcoin ledger entries if the person hadn't used it before the date of the pruning?

I guess the obvious answer is of course they wouldn't just make people's bitcoin disappear... but how would you avoid that if you were "pruning" the blockchain??
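To answer the question: pruning deletes old raw block data, not the ledger of who can spend what. The set of unspent outputs is maintained separately and survives pruning, so an old coin never disappears just because the block that created it was pruned. A toy Python sketch (a simplified model, not actual client code):

```python
def apply_block(utxo, spends, creates):
    """Process one block against the UTXO set: spent outputs are
    removed, newly created outputs are added.  Pruning later deletes
    the raw block bytes, but this map is kept forever, so old unspent
    coins remain spendable."""
    for outpoint in spends:
        del utxo[outpoint]
    utxo.update(creates)
    return utxo

utxo = {("old-tx", 0): 50}                  # a coin untouched for years
apply_block(utxo, spends=[], creates={("new-tx", 0): 25})
assert ("old-tx", 0) in utxo                # still spendable after pruning
apply_block(utxo, spends=[("new-tx", 0)], creates={("new-tx2", 0): 25})
assert ("new-tx", 0) not in utxo            # only spent outputs are dropped
```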