
Topic: New paper: Accelerating Bitcoin's Transaction Processing - page 8. (Read 36352 times)

newbie
Activity: 21
Merit: 0
Thank you!

One thing that makes it a little hard for me to follow your notation is that I am unclear on the definitions of beta and lambda. Beta is the rate of block additions to the "main chain" and lambda is the rate of block additions which have some potential to wind up in the main chain if they are accepted. At least, that is my reading of it.

However, there is no point at which a block becomes 100% canonized in the main chain (well, other than checkpointing). It is therefore hard to make a rigorous definition of beta. After all, the block chain is a probabilistic, not a deterministic, solution to the distributed consensus problem. It seems that the difference between lambda and beta is also called the orphan rate, a term you chose not to use, possibly because you are afraid people will accuse you of suggesting we put orphans to work in the mines Wink

You are right, the chain is never really fixed.  But think of it this way: under the standard protocol, a chain only gets replaced by a longer one. We thus only look at the length of the longest chain (ignoring which one it actually is) and check the rate of events that make it grow (this could include a switch to a different chain).
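
To make the distinction concrete, here is a crude simulation sketch (Python; the rate and delay numbers are made up, and the model pessimistically assumes every block found while the latest tip is still propagating lands off the longest chain):

Code:
import random

LAMBDA = 1.0   # total block creation rate, blocks/sec (hypothetical)
DELAY  = 2.0   # time for a block to reach the whole network, sec (hypothetical)
N      = 100_000

length = 0          # length of the longest chain
tip_seen_at = 0.0   # time the current tip finished propagating
t = 0.0
for _ in range(N):
    t += random.expovariate(LAMBDA)   # next block found anywhere in the network
    if t >= tip_seen_at:              # its miner had already seen the current tip
        length += 1                   # the longest chain grows
        tip_seen_at = t + DELAY       # the new tip must now propagate
    # else: the block extends a stale tip -> a fork; the length is unchanged

beta = length / t
print(f"lambda={LAMBDA}, beta={beta:.3f}, orphan rate={1 - beta/LAMBDA:.1%}")

In this toy model beta comes out near lambda/(1 + lambda*DELAY), and the gap between the two rates is exactly the orphan rate you mention.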

Another comment is that some of the maths you develop could also be useful in analysis of shares in a mining pool.        

Thanks! That's an interesting comment that we did not consider. I suppose you refer to issues related to stale shares?
legendary
Activity: 1264
Merit: 1008
Thank you!

One thing that makes it a little hard for me to follow your notation is that I am unclear on the definitions of beta and lambda. Beta is the rate of block additions to the "main chain" and lambda is the rate of block additions which have some potential to wind up in the main chain if they are accepted. At least, that is my reading of it.

However, there is no point at which a block becomes 100% canonized in the main chain (well, other than checkpointing). It is therefore hard to make a rigorous definition of beta. After all, the block chain is a probabilistic, not a deterministic, solution to the distributed consensus problem. It seems that the difference between lambda and beta is also called the orphan rate, a term you chose not to use, possibly because you are afraid people will accuse you of suggesting we put orphans to work in the mines Wink

Another comment is that some of the maths you develop could also be useful in analysis of shares in a mining pool.         
newbie
Activity: 21
Merit: 0
I haven't read the paper yet, but I have some problems even understanding the basic concept.

If the idea is to also accept transactions in orphaned blocks (or forks), then how can one even be sure all nodes are aware of the transactions contained in the orphans? With the current longest-chain scheme this is easy: I can just walk back through all the blocks and get a list of all transactions, which is exactly the list of all valid transactions. With the proposed change I don't see how I can get a provably complete list of all valid transactions.

Please someone enlighten me.


The idea is to still accept transactions only from a single chain, but to allow blocks off the chain to also contribute confirmations to the part of the chain they reference.
Think of it this way:
Each block gives 1 confirmation to the block it references directly, and to all preceding blocks.
If there is a fork at level h, and you need to decide which one to consider as the right path to take, simply pick the block with the most confirmations (rather than the block that supports the longest chain).
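
Here is a minimal sketch of that selection rule (Python; the tree representation is invented for illustration). A block's confirmation count is just the size of the subtree rooted at it, and at each fork we descend into the child with the heaviest subtree rather than the one heading the longest chain:

Code:
def subtree_size(block, children):
    """Blocks confirming `block`: itself plus all blocks referencing it, recursively."""
    return 1 + sum(subtree_size(c, children) for c in children.get(block, []))

def pick_tip(genesis, children):
    """Walk down from genesis, at each fork picking the most-confirmed child."""
    block = genesis
    while children.get(block):
        block = max(children[block], key=lambda c: subtree_size(c, children))
    return block

# children maps a block to the list of blocks that reference it directly
children = {"G": ["A", "B"], "A": ["D", "E"], "B": ["C"]}
print(pick_tip("G", children))  # A's subtree has 3 blocks vs. B's 2, so A's side wins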


RE altcoins based on this improvement:

1) I think it's more important to see a testnet that is stress-tested with large transaction loads. This is something worth doing that others have mentioned on countless occasions.

2) IMO, if crypto-currencies succeed as we hope they do (and Bitcoin is probably the most likely to grow to large sizes at this point) they will end up with high transaction rates. So we think eventually something will have to be done. Half of our paper is dedicated to what happens to Bitcoin without our protocol improvement. There is room for growth there, but we worry that after a while security becomes an issue (the 50% attack is easier, and you need to wait more blocks for the same level of certainty when facing weaker attackers).
newbie
Activity: 13
Merit: 0
I will be carefully monitoring this technology; if it proves successful on paper and in test runs, and the Bitcoin developers fail to implement it, I'll be investing in an alt-coin which uses it.
hero member
Activity: 772
Merit: 501
This is incredibly serendipitous. I began writing a paper in Google Docs this week, titled:

"Disposable, non-orphaning, merge-mined blockchains for rapid transaction confirmation"

The idea is to have a parallel merge-mined mini-block 'grove' (since there could be more than one tree) to determine which transaction is chosen in the event of a double spend.

I'll describe it in case it can contribute to the discussion.

Disposable, non-orphaning, merge-mined blockchains for rapid transaction confirmation

The proposal is to have parallel chains of mini-blocks (3 seconds) that begin at every new 'real block' (10 minutes); rather than orphaning lower-difficulty forks of these mini-block chains, nodes save them alongside the greatest-difficulty chain.

This parallel 'tree' or 'grove' of mini-block chains would be merge-mined with the main blockchain, to conserve hashing power, and would give a higher resolution picture of when a transaction was first propagated into the network.

Upon the creation of a new 10-minute block, nodes would check whether any of the transactions contained in it (in-block transactions) has a double spend in the parallel mini-block chains with at least 10 more mini-blocks backing it than the in-block transaction has. The mini-blocks counted toward this comparison can be in any of the chains, including lower-difficulty forks. If the block does contain such a transaction, it is invalid and not accepted by other nodes.

Under this rule, a transaction could be confirmed in an average of 30 seconds. Once a particular transaction is in a mini-block with 10 mini-blocks (in any chain) built on top of it, with no double spends detected, a person can be confident that the transaction cannot be replaced by a double spend.
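
A rough sketch of that acceptance check (Python; all the data structures here are hypothetical stand-ins, with mini-blocks as sets of transaction ids):

Code:
BACKING_MARGIN = 10

def backing(tx, mini_chains):
    """Mini-blocks built on top of tx, summed across all saved chains and forks."""
    total = 0
    for chain in mini_chains:                 # a chain is an ordered list of mini-blocks
        for i, mini_block in enumerate(chain):
            if tx in mini_block:
                total += len(chain) - i - 1   # mini-blocks after the one holding tx
                break
    return total

def real_block_valid(block_txs, mini_chains, conflicts):
    """Reject the block if a known double spend out-backs an in-block tx by >= 10."""
    for tx in block_txs:
        for rival in conflicts.get(tx, []):
            if backing(rival, mini_chains) >= backing(tx, mini_chains) + BACKING_MARGIN:
                return False
    return True

# toy example: "a1" sits under 11 mini-blocks, its double spend "a2" under only 1
chains = [[{"a1"}] + [set() for _ in range(11)], [{"a2"}, set()]]
print(real_block_valid({"a1"}, chains, {"a1": ["a2"]}))  # True: "a1" is safe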

There are two further rules required to make this work:

1) Require that a new real block contain all transactions with at least 10 mini-blocks backing them. Without such a rule, mining nodes could simply refuse to include any transactions in the main block they're hashing on, or refuse to include those transactions on which they've detected a double-spend attempt, to avoid the risk of a block they generate being invalid under the mini-blockchain rule and orphaned.

To prevent an attacker from taking advantage of this rule to bloat the blockchain with zero- or low-fee transactions, it can be qualified so that the '>=10 mini-block-backed transactions that blocks are required to include' are only those whose fee-to-kB ratio is at least 10 percent greater than the median fee-to-kB ratio of all fee-paying transactions over the last 100 real blocks.

The list of fee-paying transactions used to determine the median fee-to-kB ratio would be weighted by the 'bitcoin days destroyed' of each transaction, to prevent manipulation of the median through the generation of a large number of small-value, low-input-age transactions (see the sketch after this list).

2) Nodes refuse to build on mini-block chains that contain a transaction conflicting with (i.e., double-spending) any transaction that has at least 3 mini-blocks backing it. This ensures that any double spend generated more than an average of 9 seconds after the initial transaction will have very few mini-blocks built on top of it, virtually guaranteeing that the first transaction will be included in the next real block, provided enough time passes before that block is found for 7 more mini-blocks to be generated.
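
For the fee qualification in rule 1, the threshold could be computed along these lines (Python; a hypothetical sketch with made-up numbers in the example):

Code:
def weighted_median(pairs):
    """Median of fee-per-kB values, weighted by bitcoin days destroyed."""
    pairs = sorted(pairs)                     # (fee_per_kb, days_destroyed)
    half = sum(w for _, w in pairs) / 2.0
    acc = 0.0
    for value, weight in pairs:
        acc += weight
        if acc >= half:
            return value

def required_fee_per_kb(recent_fee_txs):
    """Fee-to-kB ratio a tx must clear to be a 'must include': 10% over the median."""
    return 1.10 * weighted_median(recent_fee_txs)

# fee-paying txs from the last 100 real blocks: (fee/kB in satoshis, days destroyed)
print(required_fee_per_kb([(10_000, 1.0), (20_000, 5.0), (50_000, 0.5)]))  # 22000.0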

Edit, a couple more thoughts:

  • Using these rules, there could be cases where a network split occurs because a block is accepted by some nodes and rejected by others: the block contains a double spend, and nodes have conflicting local views of how many mini-blocks back the original transaction (one set of nodes might see the first transaction out-backing the second by only 9 mini-blocks, while another sees it out-backing the second by the 10 mini-blocks required to be secure from a double spend). In this case, an additional rule can resolve the split: nodes accept a new block, even if it contains an 'illegal' double spend, once the number of mini-blocks backing the new block exceeds the number of mini-blocks generated on top of the old block after the fork by a particular margin (e.g. 5 blocks).
  • An additional rule could be that new blocks must publish in their headers the 'median transaction fee-to-kB ratio' for the last 100 blocks. This would allow SPV clients to operate with lower trust requirements, since they would know just from the block headers what transaction fee they need to attach to guarantee their transaction will be secure from a double-spend attempt at/after 10 mini-blocks

The strengths of this proposal are:

Like the GHOST protocol additions proposed in the linked paper, this provides faster confirmations without the rise in orphans that comes from simply shortening block times, where propagation latency reduces the effective network hashrate.

There would be no extra sets of block headers in the blockchain as would be required with more frequent blocks.

It's a soft-fork that is backward compatible with non-mining nodes that don't upgrade.
hero member
Activity: 644
Merit: 500
P2P The Planet!
You guys have made the biggest contribution yet to the biggest problem with cryptocurrencies: transaction times.


Now we just need to implement it + zerocoin support. Ah, how I love open source.

If the bitcoin devs sit on their ass and don't look into it, you guys should make your own altcoin with the innovation and prove it works to the world.
sr. member
Activity: 266
Merit: 250
Hi Aviv,

Has your paper already been peer-reviewed?

We are peer-reviewing it now ;-). No better review mechanism than a btctalk thread.

Firstly: it's awesome to see such research.

Now, is litecoin too conservative to try to incorporate this?

I think yes, so let's make TreeCoin?



One of the reasons none of the altcoins have ever appealed to me is their lame ass names. No offense, but I can't see the whole world using something called Treecoins.

Would it be possible to implement this directly from the Bitcoin blockchain, creating a split where the current holders of bitcoins are also the holders of Bitcoin 2.0? Then people would be vested in both, with a market between the two eventually finding out which one "wins."

This is how I would go about it when making an altcoin, too. But for most altcoiners the appeal is that they can be early adopters with their new coin. You don't have that with the bitcoin 2.0 approach.


Yeah, but this is for the good of Bitcoin, not for the altcoiners.

Still, I'm not so sure how good it would be to fragment the Bitcoin economy.

If the peer review goes well, it seems to me that this is more of a candidate for the dev team to work on than an alt-coin.
donator
Activity: 2772
Merit: 1019
Hi Aviv,

Has your paper already been peer-reviewed?

We are peer-reviewing it now ;-). No better review mechanism than a btctalk thread.

Firstly: it's awesome to see such research.

Now, is litecoin too conservative to try to incorporate this?

I think yes, so let's make TreeCoin?



One of the reasons none of the altcoins have ever appealed to me is their lame ass names. No offense, but I can't see the whole world using something called Treecoins.

Would it be possible to implement this directly from the Bitcoin blockchain, creating a split where the current holders of bitcoins are also the holders of Bitcoin 2.0? Then people would be vested in both, with a market between the two eventually finding out which one "wins."

This is how I would go about it when making an altcoin, too. But for most altcoiners the appeal is that they can be early adopters with their new coin. You don't have that with the bitcoin 2.0 approach.
donator
Activity: 2772
Merit: 1019
I haven't read the paper yet, but I have some problems even understanding the basic concept.

If the idea is to also accept transactions in orphaned blocks (or forks), then how can one even be sure all nodes are aware of the transactions contained in the orphans? With the current longest-chain scheme this is easy: I can just walk back through all the blocks and get a list of all transactions, which is exactly the list of all valid transactions. With the proposed change I don't see how I can get a provably complete list of all valid transactions.

Please someone enlighten me.
hero member
Activity: 614
Merit: 500
Hi Aviv,

Has your paper already been peer-reviewed?

We are peer-reviewing it now ;-). No better review mechanism than a btctalk thread.

Firstly: it's awesome to see such research.

Now, is litecoin too conservative to try to incorporate this?

I think yes, so let's make TreeCoin?



One of the reasons none of the altcoins have ever appealed to me is their lame ass names. No offense, but I can't see the whole world using something called Treecoins.

Would it be possible to implement this directly from the Bitcoin blockchain, creating a split where the current holders of bitcoins are also the holders of Bitcoin 2.0? Then people would be vested in both, with a market between the two eventually finding out which one "wins."
legendary
Activity: 1120
Merit: 1152
Hi Aviv,

Has your paper already been peer-reviewed?

We are peer-reviewing it now ;-). No better review mechanism than a btctalk thread.

+1


Firstly: it's awesome to see such research.

Now, is litecoin too conservative to try to incorporate this?

I think yes, so let's make TreeCoin?

Note how this idea can easily be implemented as a merge-mined coin, and if merge-mined with Bitcoin, that can naturally lead towards an eventual "pure" Bitcoin implementation; I'd recommend considering using the Bitcoin UTXO set as the basis for the coin's initial distribution. Of course, until the merge-mined chain gets >50% of Bitcoin's total hashing power it's in theory insecure, but that's IMO fine for an experiment that may lead to something concrete. (Do it merge-mined with testnet first.)
donator
Activity: 2772
Merit: 1019
Hi Aviv,

Has your paper already been peer-reviewed?

We are peer-reviewing it now ;-). No better review mechanism than a btctalk thread.

Firstly: it's awesome to see such research.

Now, is litecoin too conservative to try to incorporate this?

I think yes, so let's make TreeCoin?

legendary
Activity: 1120
Merit: 1152
Speaking of, while the paper presents a solution preserving security guarantees, a quick skim of it doesn't seem to indicate they take into account the incentives around block propagation. If you wind up with a situation where large, centralized mining pools earn more money as part of this high-speed propagation game, even though in theory all the work being done contributes towards 51% security, the overall result may be a serious negative due to the incentives towards centralization. Lately I've done some work (pdf) on that topic; it's a very important crypto-currency design consideration that I'd like to see other people analyzing as well.

You make a really good point. Decentralization is hurt at higher transaction rates, and better incentives are needed. We do mention it as a problem that remains at the very end of the paper (issues related to decentralization). The problem seems inherent to both our suggestion and to Bitcoin's current implementation.

Let me just say that with small block sizes (i.e., when full bloom filters are implemented) it will be very hard to distribute blocks faster than the network does already. IMO, this is less of a problem than people think.

Oh good to hear your thinking about this! I agree with you that there aren't easy answers yet.

Something to keep in mind is that the network distributing blocks != miners distributing blocks. Large miners and mining pools can and do peer to each other directly, so propagation delays on the global P2P bitcoin network don't affect them the same way as smaller miners who are not in a position to do that. When the block "interval" is getting down to just a few seconds, even stuff like physically locating your mining pool hashing power closer to other concentrations of hashing power may net you more revenue, with obvious problems for decentralization. You might have a small mining setup, and want to contribute that decentralized hashing power, but find you can't profitably because all the big mining pools already have peering agreements with each other and dedicated high-speed connections.

Also I'm not convinced that bloom filters are the way to do block distribution with only txid hashes; peers simply maintaining lists of what transactions each peer knows about and transmitting short tags instead is probably more efficient in terms of bandwidth as it is not probabilistic.
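
For what it's worth, a minimal sketch of that non-probabilistic alternative (Python; the tag length and message layout are invented): each side remembers which txids the peer has announced, and a block is relayed as short tags for known transactions plus full ids for the rest.

Code:
TAG_BYTES = 6

def tag(txid: bytes) -> bytes:
    return txid[:TAG_BYTES]   # short tag; a rare collision just forces a re-request

class PeerLink:
    """Tracks which txids a peer has announced so blocks can be sent as short tags."""
    def __init__(self):
        self.known = {}       # tag -> full txid the peer told us about

    def note_inv(self, txid: bytes):
        self.known[tag(txid)] = txid

    def encode_block(self, block_txids):
        """Short tag when the peer already has the tx, full txid otherwise."""
        return [("tag", tag(t)) if self.known.get(tag(t)) == t else ("full", t)
                for t in block_txids]

link = PeerLink()
link.note_inv(b"\x11" * 32)
print(link.encode_block([b"\x11" * 32, b"\x22" * 32]))  # one short tag, one full id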

Finally, I've got a proposal for a blockchain where blocks themselves could only contain tx hashes at a fundamental level; transactions don't get verified at all in the scheme and the blockchain is strictly for proof-of-publication. Seems to me that it'd be well-suited to your idea as the lack of verification makes it easy to have actual tx publication be something done after the fact. (though note my comments about issues when there is a lack of incentives to actually publish the data that is being committed)
legendary
Activity: 1106
Merit: 1004
We've got some incredibly naive ideas floating around the Bitcoin space right now - like the idea that just removing the blocksize limit entirely will work out fine - and we need research into scalability solutions that considers incentives and the resulting security carefully. That kind of reasoning tends to involve a lot of math, rather than platitudes about "market forces".

Wow... This presumption of knowledge of yours reminded me of this brilliant quote, perfectly applicable here:

Quote from: Friedrich von Hayek
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

The subjective reasoning of millions can't involve "lots of math", and can't be modeled by it. You can only reach meaningful conclusions through praxeology. And if you were to actually try that, you'd probably recognize one day that you should not try to impose top-down rulings in place of organic bottom-up organization, as if people were pawns on a chessboard to be manipulated by enlightened "planners". Anyways... that's not the topic here, and I don't want to derail this very interesting thread, so I'll stop. It would also be nice if you could try not to throw insults around like this every time you get the chance...
legendary
Activity: 1120
Merit: 1152
I'm going to have to sit down and read the paper more carefully to have a solid opinion on it, but my meta-opinion is that I think it's great to see people taking scalability and blocksize seriously on an academic level. We've got some incredibly naive ideas floating around the Bitcoin space right now - like the idea that just removing the blocksize limit entirely will work out fine - and we need research into scalability solutions that considers incentives and the resulting security carefully. That kind of reasoning tends to involve a lot of math, rather than platitudes about "market forces".


Speaking of, while the paper presents a solution preserving security guarantees, a quick skim of it doesn't seem to indicate they take into account the incentives around block propagation. If you wind up with a situation where large, centralized mining pools earn more money as part of this high-speed propagation game, even though in theory all the work being done contributes towards 51% security, the overall result may be a serious negative due to the incentives towards centralization. Lately I've done some work (pdf) on that topic; it's a very important crypto-currency design consideration that I'd like to see other people analyzing as well.

I'm not a developer, so excuse my ignorance. But wouldn't 1-second blocks essentially make solo mining much more plausible? It seems at that rate even lowly CPU miners could pull off solo mining. Might not be profitable per se, but the chances of at least mining a block solo would be greatly increased.

Yeah, but if the result of the solution is such that solo-mining earns you only 50% of the revenue that hashing in a large, 25%-hashing-power pool gets you, your incentive would be to mine in the pool instead, leading to centralization. The problem is we know that pools already have higher revenue per unit of hashing power; my suspicion is the solution proposed by the paper makes that problem even worse. But I've got a flight to catch in an hour or two and will get a chance to think about it all more carefully later. Smiley
newbie
Activity: 21
Merit: 0
Speaking of, while the paper presents a solution preserving security guarantees, a quick skim of it doesn't seem to indicate they take into account the incentives around block propagation. If you wind up with a situation where large, centralized mining pools earn more money as part of this high-speed propagation game, even though in theory all the work being done contributes towards 51% security, the overall result may be a serious negative due to the incentives towards centralization. Lately I've done some work (pdf) on that topic; it's a very important crypto-currency design consideration that I'd like to see other people analyzing as well.

You make a really good point. Decentralization is hurt at higher transaction rates, and better incentives are needed. We do mention it as a problem that remains at the very end of the paper (issues related to decentralization). The problem seems inherent to both our suggestion and to Bitcoin's current implementation.

Let me just say that with small block sizes (i.e., when full bloom filters are implemented) it will be very hard to distribute blocks faster than the network does already. IMO, this is less of a problem than people think.
hero member
Activity: 614
Merit: 500
I'm going to have to sit down and read the paper more carefully to have a solid opinion on it, but my meta-opinion is that I think it's great to see people taking scalability and blocksize seriously on an academic level. We've got some incredibly naive ideas floating around the Bitcoin space right now - like the idea that just removing the blocksize limit entirely will work out fine - and we need research into scalability solutions that considers incentives and the resulting security carefully. That kind of reasoning tends to involve a lot of math, rather than platitudes about "market forces".


Speaking of, while the paper presents a solution preserving security guarantees, a quick skim of it doesn't seem to indicate they take into account the incentives around block propagation. If you wind up with a situation where large, centralized mining pools earn more money as part of this high-speed propagation game, even though in theory all the work being done contributes towards 51% security, the overall result may be a serious negative due to the incentives towards centralization. Lately I've done some work (pdf) on that topic; it's a very important crypto-currency design consideration that I'd like to see other people analyzing as well.

I'm not a developer, so excuse my ignorance. But wouldn't 1-second blocks essentially make solo mining much more plausible? It seems at that rate even lowly CPU miners could pull off solo mining. Might not be profitable per se, but the chances of at least mining a block solo would be greatly increased.
newbie
Activity: 24
Merit: 0
From what I understand, the main idea of the paper is a change in how the "valid" blockchain is selected.

Currently, when a new block arrives, figuring out whether the current valid blockchain changes costs O(1) processing time: for each block you need to remember the length (or total work) of the chain up to this block. In your scheme, I do not see a way to improve on Theta(depth) processing time for each block that comes in. Just wondering: do you see a better way to do this? I wouldn't be surprised if an O(log(depth)) data structure exists.
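
To illustrate why I think the naive bookkeeping is Theta(depth): in the tree scheme each arriving block adds one confirmation to every ancestor, so keeping the subtree weights current means walking the whole path back to genesis (Python; hypothetical structures):

Code:
def add_block(block, parent, weight):
    """Naive bookkeeping: Theta(depth) counter updates per arriving block."""
    weight[block] = 1
    node = parent.get(block)
    while node is not None:   # walk every ancestor back to genesis
        weight[node] += 1     # one more block now confirms this ancestor
        node = parent.get(node)

weight = {}
parent = {"A": "G", "B": "A"}
for b in ("G", "A", "B"):
    add_block(b, parent, weight)
print(weight)   # {'G': 3, 'A': 2, 'B': 1}

An O(log(depth)) structure would have to batch these path increments; something like a link-cut tree, which supports root-to-node path updates in amortized O(log n), looks like a natural candidate, though I haven't worked out the details.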
hero member
Activity: 630
Merit: 500
OP, I haven't read your academic paper, but I'm already interested in the summary you just made.

Could you tell me if your solution has anything to do with what is proposed by Maged in this thread: https://bitcointalk.org/index.php?topic=57647.0;all ?

A block-tree could eventually make freezing attacks more difficult. That advantage alone already makes it interesting. If, as a plus, it also improves performance significantly, then it's definitely something that should be studied more seriously.
legendary
Activity: 1120
Merit: 1152
I'm going to have to sit down and read the paper more carefully to have a solid opinion on it, but my meta-opinion is that I think it's great to see people taking scalability and blocksize seriously on an academic level. We've got some incredibly naive ideas floating around the Bitcoin space right now - like the idea that just removing the blocksize limit entirely will work out fine - and we need research into scalability solutions that considers incentives and the resulting security carefully. That kind of reasoning tends to involve a lot of math, rather than platitudes about "market forces".


Speaking of, while the paper presents a solution preserving security guarantees, a quick skim of it doesn't seem to indicate they take into account the incentives around block propagation. If you wind up with a situation where large, centralized mining pools earn more money as part of this high-speed propagation game, even though in theory all the work being done contributes towards 51% security, the overall result may be a serious negative due to the incentives towards centralization. Lately I've done some work (pdf) on that topic; it's a very important crypto-currency design consideration that I'd like to see other people analyzing as well.