
Topic: Increasing blocksize dynamically w/economic safeguards - the ideal compromise? (Read 1599 times)

staff
Activity: 4270
Merit: 1209
I support freedom of choice
Again, you haven't proven anywhere that your "tragedy of the commons" problem applies to Bitcoin, whereas I have shown evidence of the opposite:

http://www.coindesk.com/bitcoin-miners-ditch-ghash-io-pool-51-attack

You can keep putting it at the base of your logic, but that still won't make it true or proven.
newbie
Activity: 57
Merit: 0
Quote
Maybe, but I don't see how we can avoid this situation; even if miners aren't malicious, my scenario still applies.
There are already some dynamic proposals Smiley

https://zander.github.io/posts/Blocksize%20Consensus/

Apart from this one... many things are moving to make on-chain scaling possible.

I know about the dynamic proposals; even Core apparently has one in their cardboard box. They are commonly called "FlexCap", but they all still fall into the tragedy of the commons in the long term. It's the same as unlimited: the cap is dynamic but has no upper bound (unless it has a hard upper bound of, let's say, 8 MB).

We are back to square one: small nodes will disappear and Bitcoin will become PayPal running in Bitmain data centers...  Cry

Let's face it, blockchain tech isn't meant to be VISA; blockchains are highly inefficient, and their value obviously derives from being a censorship-resistant settlement network...

I don't see any viable solution for massive on-chain scaling without sacrificing the core values and ethos of BTC...

staff
Activity: 4270
Merit: 1209
I support freedom of choice
Quote
Maybe, but I don't see how we can avoid this situation; even if miners aren't malicious, my scenario still applies.
There are already some dynamic proposals Smiley

https://zander.github.io/posts/Blocksize%20Consensus/

Apart from this one... many things are moving to make on-chain scaling possible.
newbie
Activity: 57
Merit: 0
@Nicolas Tesla
You, like many others, are currently missing one important thing.

What happens if the network loses a large number of nodes? The decentralisation of the network starts to go missing.
What happens to the market if the decentralisation of the network starts to go missing? Confidence in Bitcoin gets lower, and so does the price.
What happens to the miners if the price gets lower? They earn less money.

So, even with an unlimited block size, there could perhaps be problems from a possible attack by a malicious entity, and so some dynamic barriers are needed, but there is no way miners will start making huge blocks, because that goes directly against their monetary interest.

Again, not because they have a good heart, but simply because of their greed; that is how Nakamoto's consensus works. (6. Incentive)

And they have millions invested that they need to recover over the coming months/years.

Good past example of this economic behavior:
http://www.coindesk.com/bitcoin-miners-ditch-ghash-io-pool-51-attack/

Maybe, but I don't see how we can avoid this situation; even if miners aren't malicious, my scenario still applies. The price could still go up, because most people don't care about decentralization and financial sovereignty. This PayPal-like coin could still survive with enough big businesses and governments supporting it!

BU or any big-block proposal will fall into that tragedy of the commons, price rising or not.

If BUcoin becomes successful we are back to square one: a centralized system in the hands of governments and big businesses, the exact reason I left fiat in the first place!

A settlement system with tens of thousands of nodes, a good portion of them run over Tor by the average Joe, and with a decentralized-trust Layer 2, is far better in terms of censorship resistance than another pale and ineffective copy of PayPal.

In my view BUcoin could be successful, if by successful you mean a price rise, but at the cost of selling your soul again to governments, data centers and big businesses; in terms of revolution and financial sovereignty it would have failed miserably!
staff
Activity: 4270
Merit: 1209
I support freedom of choice
@Nicolas Tesla
You, like many others, are currently missing one important thing.

What happens if the network loses a large number of nodes? The decentralisation of the network starts to go missing.
What happens to the market if the decentralisation of the network starts to go missing? Confidence in Bitcoin gets lower, and so does the price.
What happens to the miners if the price gets lower? They earn less money.

So, even with an unlimited block size, there could perhaps be problems from a possible attack by a malicious entity, and so some dynamic barriers are needed, but there is no way miners will start making huge blocks, because that goes directly against their monetary interest.

Again, not because they have a good heart, but simply because of their greed; that is how Nakamoto's consensus works. (6. Incentive)

And they have millions invested that they need to recover over the coming months/years.

Good past example of this economic behavior:
http://www.coindesk.com/bitcoin-miners-ditch-ghash-io-pool-51-attack/
newbie
Activity: 57
Merit: 0
gmaxwell, as an XMR supporter and holder (I think), what do you think of Monero's way of dealing with the blocksize problem? They are running a dynamic blocksize solution; what do you think about it? What are the differences between it and BU's proposal?

Do you think Monero could end up in a tricky situation if transaction volume starts growing, where the blocksize becomes too big and it becomes too centralized (which would make it less secure than just using Bitcoin plus additional anonymity features)? I heard some guy talking about how they have a method of raising the fees if that happens, but then it's the same problem again (fees too high), so what's the point?

I'm trying to work out whether I want to take a long-term position in Monero, but since I'm not a coder, only an investor, I can't understand all the details  Cry

Can anybody explain the pros and cons in an ELI5 way, or is that not possible?

I just see a lot of people here saying that a dynamic solution is the best and that "Blockstream don't want it so they can profit from sending most of the transaction volume through LN" and so on; especially that franky1 guy is saying that here all day, and as a non-coder I don't know who to believe anymore  Cry

Yes, any flex cap can either be gamed or it just falls into the tragedy of the commons:
Say you have 1,000 nodes.
First you raise the limit to 2 MB.
10% give up because of the higher cost.
You still have 900 nodes.
Then you raise it to 4 MB.
20% drop out.
You still have 720 nodes...
Now you say, "See, we only lost 28%."
Let's raise it to 10 MB.
Half now drop out; this is unbearable for the average Joe.
You have 360 nodes...
You have lost 64% of your nodes...
This is the tragedy of the commons.
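
To make the compounding explicit, here is a quick back-of-the-envelope sketch of that attrition (the per-step drop-out percentages are just the assumptions from the example above, not measurements):

Code:
# Toy model of cumulative node attrition as the block size limit is raised.
# The per-step drop-out rates are illustrative assumptions from the example.
nodes = 1000
steps = [
    ("2 MB", 0.10),   # 10% give up because of the higher cost
    ("4 MB", 0.20),   # 20% of the remainder drop out
    ("10 MB", 0.50),  # half of the remainder drop out
]
for size, dropout in steps:
    nodes = round(nodes * (1 - dropout))
    print(f"after raising to {size}: {nodes} nodes left")
print(f"total loss: {100 - nodes / 1000 * 100:.0f}%")   # -> 64%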
newbie
Activity: 57
Merit: 0
Here is a sequence from Blockchain.info - why would increasing the blocksize help with this situation?

Quote
Height   Time   Relayed By   Block Hash   Size (kB)
448492 (Main Chain)   2017-01-16 14:11:08   BW.COM   000000000000000000da57d09bae60aad857065282df71d390e19fc37b888db3   399.24
448491 (Main Chain)   2017-01-16 14:06:48   BitFury   00000000000000000302e906a677bd35944940aa9612ac3191583eef95a7e969   478.14
448490 (Main Chain)   2017-01-16 14:04:00   BW.COM   0000000000000000012ee8ec9caea10946f5310f60a47191038e47e43bd5eeed   342.04
448489 (Main Chain)   2017-01-16 14:01:57   BW.COM   00000000000000000197f8a1e665a11461bee7a29a73011bcf5b0fbf3aa20bce   496.43
448488 (Main Chain)   2017-01-16 13:57:47   SlushPool   0000000000000000003fdcae2df726d493136831ad7fc7de1378a2d24e0ba1d3   195.67
448487 (Main Chain)   2017-01-16 13:56:29   BTCC Pool   000000000000000001edfa42ad5754deba9df7359e920a23c849b97df05aa56c   998.86
448486 (Main Chain)   2017-01-16 13:51:53   BW.COM   0000000000000000032ad47bb82c250652cefd086408c14fc4c02feab9e7ceab   998.03
448485 (Main Chain)   2017-01-16 13:38:09   AntPool   0000000000000000016095498c650309869c2e327ba5f4783ce6a3bfc24a8e7b   472.42

In short, blocks aren't full? Some are not even at half the limit! WTF?
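
For what it's worth, averaging just those eight quoted sizes (in kB, against the roughly 1,000 kB limit) gives about 55% fullness - though eight blocks is far too small a sample to conclude much from:

Code:
# Average fullness of the eight blocks quoted above (sizes in kB).
sizes_kb = [399.24, 478.14, 342.04, 496.43, 195.67, 998.86, 998.03, 472.42]
limit_kb = 1000.0  # ~1 MB limit
avg = sum(sizes_kb) / len(sizes_kb)
print(f"average size: {avg:.1f} kB ({avg / limit_kb:.0%} of the limit)")  # ~547.6 kB, ~55%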
legendary
Activity: 2814
Merit: 2472
https://JetCash.com
Here is a sequence from Blockchain.info - why would increasing the blocksize help with this situation?

Quote
Height   Time   Relayed By   Block Hash   Size (kB)
448492 (Main Chain)   2017-01-16 14:11:08   BW.COM   000000000000000000da57d09bae60aad857065282df71d390e19fc37b888db3   399.24
448491 (Main Chain)   2017-01-16 14:06:48   BitFury   00000000000000000302e906a677bd35944940aa9612ac3191583eef95a7e969   478.14
448490 (Main Chain)   2017-01-16 14:04:00   BW.COM   0000000000000000012ee8ec9caea10946f5310f60a47191038e47e43bd5eeed   342.04
448489 (Main Chain)   2017-01-16 14:01:57   BW.COM   00000000000000000197f8a1e665a11461bee7a29a73011bcf5b0fbf3aa20bce   496.43
448488 (Main Chain)   2017-01-16 13:57:47   SlushPool   0000000000000000003fdcae2df726d493136831ad7fc7de1378a2d24e0ba1d3   195.67
448487 (Main Chain)   2017-01-16 13:56:29   BTCC Pool   000000000000000001edfa42ad5754deba9df7359e920a23c849b97df05aa56c   998.86
448486 (Main Chain)   2017-01-16 13:51:53   BW.COM   0000000000000000032ad47bb82c250652cefd086408c14fc4c02feab9e7ceab   998.03
448485 (Main Chain)   2017-01-16 13:38:09   AntPool   0000000000000000016095498c650309869c2e327ba5f4783ce6a3bfc24a8e7b   472.42
legendary
Activity: 868
Merit: 1006
gmaxwell, as an XMR supporter and holder (I think), what do you think of Monero's way of dealing with the blocksize problem? They are running a dynamic blocksize solution; what do you think about it? What are the differences between it and BU's proposal?

Do you think Monero could end up in a tricky situation if transaction volume starts growing, where the blocksize becomes too big and it becomes too centralized (which would make it less secure than just using Bitcoin plus additional anonymity features)? I heard some guy talking about how they have a method of raising the fees if that happens, but then it's the same problem again (fees too high), so what's the point?

I'm trying to work out whether I want to take a long-term position in Monero, but since I'm not a coder, only an investor, I can't understand all the details  Cry

Can anybody explain the pros and cons in an ELI5 way, or is that not possible?

I just see a lot of people here saying that a dynamic solution is the best and that "Blockstream don't want it so they can profit from sending most of the transaction volume through LN" and so on; especially that franky1 guy is saying that here all day, and as a non-coder I don't know who to believe anymore  Cry
newbie
Activity: 21
Merit: 1
Out-of-band fees can work in both directions, e.g. including rebates (and, in fact, rebates can be done in-band with coinjoins, with no trust).

Also consider what your scheme does when a majority of hashpower censors any transaction paying a high in-band fee level.

I'm not sure I understand how you're suggesting a rebate would work?

What do you mean by a majority of hashpower - a 51% attack? If a transaction has a high fee, surely any miner is incentivised to include it in a block?
staff
Activity: 4284
Merit: 8808
Out-of-band fees can work in both directions, e.g. including rebates (and, in fact, rebates can be done in-band with coinjoins, with no trust).

Also consider what your scheme does when a majority of hashpower censors any transaction paying a high in-band fee level.
newbie
Activity: 21
Merit: 1
There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

I previously tried to rescue this class of proposal by having the change apply not to fees but to the lowness of the required hash (effective difficulty), but it's difficult to do that in the presence of subsidy.

Unrelatedly, as you note, your proposal is no constraint if miners agree-- this is also why it fails to address the conflict of interest between miners (really mining pools), who are paid to include transactions, and everyone else-- who experiences them as an externality except to the extent that they contribute to economic growth (not at all a necessity: e.g. many companies want to use the Bitcoin blockchain without using the Bitcoin currency at all).  Still, better to solve one issue even if all of them can't be solved.

I did some analysis of the transaction fees, and used the data to properly demonstrate how all the risks you identify can be mitigated!

Out of band fees can be completely disincentivised.

Please have a read and let me know if you have any thoughts.

https://seebitcoin.com/2017/01/i-analysed-24h-worth-of-transaction-fee-data-and-this-is-what-i-discovered/
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
Accept in what way? Spoofing nodes is only a social manipulation strategy; it does not achieve anything in terms of votes or consensus. The whole point is that node counts or signalling can't be trusted. Period. There is no way to assess a node's contribution to the network. #2 cannot happen. Only miners or transactions can have any sway over the blockchain; as far as the blockchain is concerned, nodes are read-only.

Accept in the sense that we have no alternative.  No one can think of a way to prevent dishonest nodes.  Social manipulation or otherwise, it's still a danger to the overall health of the network and needs to be considered.  Whether node counts can be trusted or not, the fact remains it is being measured and will likely have an impact on whether or not SegWit activates.  Miners won't risk activation if they perceive a chance of the network not relaying their new blocks.  They'll want to see nodes running compatible software before it goes live.  Even though there's no way of knowing for sure if the figures are accurate, it's all we have to go by.  I'm suggesting that similar figures could be used for blocksize.

What we're discussing here isn't a problem with factoring node statistics into a dynamic blocksize proposal, but with an inherent flaw in the consensus mechanism itself.  One we can't easily get rid of, because there's currently nothing to stop dishonest node operators from spoofing their software version.  Similarly, there's nothing to stop them spoofing a preference for blocksize if that could be listed and quantified in the same way we list and quantify the software version.  If we can do one, we can obviously do the other.  It's not perfect, granted, but at least it would give a rough indication of what size adjustment the network would generally be prepared to allow before the miners started their signalling for larger or smaller blocks.  

If we can find a way to implement algorithmic safeguards to limit the damage that could potentially be caused by dishonest participants, perhaps those could also be extended to future fork proposals and kill two birds with one stone.  Effectively making the entire system more resilient.
newbie
Activity: 21
Merit: 1
Aside from the "fuzzy consensus" bit, would the same shortcomings not apply to nodes signalling for a softfork as is happening currently?  That's measurable, so this should be equally so.

Nodes don't signal for SegWit activation, only miners do.

I assumed nodes in BU signalled their max block size and the depth at which they were willing to be overridden, but perhaps it is signalled in blocks too - I haven't thoroughly examined the implementation; I just know that there is no consensus on what consensus even is.

Plus, we're apparently prepared to accept shenanigans like client spoofing, when that could have a significantly adverse effect on consensus, because there's no easy way to prevent it.  It's the nature of the beast.  So while I'm of the mindset "never say never", it's highly unlikely we're going to find a completely fool-proof solution.  Hence #1 and #3 being necessary as well.

Accept in what way? Spoofing nodes is only a social manipulation strategy; it does not achieve anything in terms of votes or consensus. The whole point is that node counts or signalling can't be trusted. Period. There is no way to assess a node's contribution to the network. #2 cannot happen. Only miners or transactions can have any sway over the blockchain; as far as the blockchain is concerned, nodes are read-only.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
The insurmountable problem with #2, beyond BU's implementation making something absolute like consensus into something fuzzy, is that it is impossible to avoid Sybil manipulation. Also, it isn't really easy to take a poll of all nodes on the network. The closest you could get is asking individual transactions to signal, but that adds extra bloat on chain, and gives the power to users instead of nodes, when really it is a decision for the latter.

Also, a node is just an IP in terms of measuring 'support'.

If you look at nodes you might end up with a super high bandwidth node that serves many concurrent connections, and somebody's tiny Raspberry Pi hobbyist setup over dodgy wifi. Giving each node equal say is ripe for gaming.

Aside from the "fuzzy consensus" bit, would the same shortcomings not apply to nodes signalling for a softfork as is happening currently?  That's measurable, so this should be equally so.  Plus, we're apparently prepared to accept shenanigans like client spoofing, when that could have a significantly adverse effect on consensus, because there's no easy way to prevent it.  It's the nature of the beast.  So while I'm of the mindset "never say never", it's highly unlikely we're going to find a completely fool-proof solution.  Hence #1 and #3 being necessary as well.

As an additional safety precaution, is there a way to measure the maturity of a node?  Something like a node isn't permitted a vote until it has relayed X amount of blocks?  Or is that just my brain drifting into the realms of fantasy?  The more algorithmic checks and balances that can be added, the better.

I certainly wouldn't advocate transaction-based signalling while we're in the process of trying to optimise the space they use up, not add to it.  Plus, as you mention, SPV clients and other non-load-bearing entities would then get equal say without equal contribution, which hardly seems fair.

The natural order of things is that large blocks favour miners because there's greater potential for profit and small blocks favour nodes because there's less externalised cost.  The only solution people will be prepared to accept is to strike a fair balance between the two.  It has to be solved somehow. 
newbie
Activity: 21
Merit: 1
For me, the ideal compromise should tick three boxes:

    1) An algorithmic element based on transaction volumes, so change only happens when required
    2) A way for both miners *and* nodes to signal the size of the adjustment they are willing to accept, to maintain equilibrium and to ensure miners can't dictate to the rest of the network
    3) Another algorithmic element taking into consideration the average total fees received per block over all the blocks in the previous difficulty period, to ensure economic viability and to avoid rigging by any single pool or entity.

No easy task, for sure.  But it feels like all the elements are there and just need putting together somehow.  BIP106 came very close, but left some important elements missing.  Mainly #2, a way for full nodes to set a "this far and no further" threshold.  It's almost ironic that for all the complaints on this forum about the BU client, #2 is almost exactly what it does (although again, it tends to encourage whole numbers rather than decimals; adjustments should be in fractions of a MB).

The insurmountable problem with #2, beyond BU's implementation making something absolute like consensus into something fuzzy, is that it is impossible to avoid Sybil manipulation. Also, it isn't really easy to take a poll of all nodes on the network. The closest you could get is asking individual transactions to signal, but that adds extra bloat on chain, and gives the power to users instead of nodes, when really it is a decision for the latter.

Also, a node is just an IP in terms of measuring 'support'.

If you look at nodes you might end up with a super high bandwidth node that serves many concurrent connections, and somebody's tiny Raspberry Pi hobbyist setup over dodgy wifi. Giving each node equal say is ripe for gaming.

In practice, would it not be a case of whoever blinks first loses money?  How else would miners know what to agree on unless someone starts signalling first?

It would introduce some interesting game theory, for sure. It is possible that if there was clear consensus to increase the block size, miners would avoid 'paying' to signal for an increase at the beginning of a cycle rather than at the end; it would depend on the strength of consensus and how desperate they were to get it 'passed' - a little like lawmakers.

The stakes are low enough that the real cost is incurred for signalling against consensus over a sustained period. Getting 'caught' occasionally by having to pay to signal because somebody else who shares your goal found a block before you and didn't is not a big deal and would average itself over time.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
Dynamic is absolutely my personal preferred solution.  The tricky part is not just how to implement it, but also to get people enthusiastic about the idea.  "Trying to find a compromise we can all get behind" is great, but we can only do that if the idea catches on.  Other proposals seem to garner so much more attention and I'm always at a loss to know why that is.  Maybe it's simply down to the fact that it would be inherently more complex than static limits.  "Blocksize = X" is just easier for everyone to comprehend than when it starts being about percentage increases.  There's no technical reason why the blocksize has to be an integer, but maybe since that's how it's always been, people have grown accustomed to it.

For me, the ideal compromise should tick three boxes:

    1) An algorithmic element based on transaction volumes, so change only happens when required
    2) A way for both miners *and* nodes to signal the size of the adjustment they are willing to accept, to maintain equilibrium and to ensure miners can't dictate to the rest of the network
    3) Another algorithmic element taking into consideration the average total fees received per block over all the blocks in the previous difficulty period, to ensure economic viability and to avoid rigging by any single pool or entity.

No easy task, for sure.  But it feels like all the elements are there and just need putting together somehow.  BIP106 came very close, but left some important elements missing.  Mainly #2, a way for full nodes to set a "this far and no further" threshold.  It's almost ironic that for all the complaints on this forum about the BU client, #2 is almost exactly what it does (although again, it tends to encourage whole numbers rather than decimals; adjustments should be in fractions of a MB).  But on its own, BU isn't the whole solution either, because it doesn't tick boxes #1 and #3.  It absolutely must contain algorithmic elements, or it's no better than "democracy" and would be just as easily corrupted.  If someone can put it all together, there's no reason I can see why it won't work.
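
To make the three boxes a little more concrete, here is a rough sketch of how the pieces might fit together. It is purely illustrative - all the thresholds, caps and parameter names are invented for the example, and none of this is a worked-out proposal:

Code:
# Illustrative sketch of a dynamic block size adjustment combining the three
# elements above. All thresholds and limits are made up for the example.
def next_block_size_limit(current_limit_mb,
                          avg_fullness,         # 1) avg fraction of the limit used last period
                          miner_max_step_mb,    # 2) largest step miners signal they will accept
                          node_max_step_mb,     # 2) largest step nodes signal they will accept
                          avg_fees_per_block,   # 3) avg total fees per block last period (BTC)
                          min_fees_per_block=1.0):
    # 1) Only adjust when transaction volume justifies it.
    if avg_fullness > 0.9:
        proposed_step = 0.1 * current_limit_mb      # grow by at most 10%
    elif avg_fullness < 0.5:
        proposed_step = -0.1 * current_limit_mb     # shrink by at most 10%
    else:
        return current_limit_mb
    # 2) Neither miners nor nodes can be pushed past what they signalled.
    allowed_step = min(abs(proposed_step), miner_max_step_mb, node_max_step_mb)
    # 3) Growth is allowed only if fee income stays economically viable.
    if proposed_step > 0 and avg_fees_per_block < min_fees_per_block:
        return current_limit_mb
    return current_limit_mb + (allowed_step if proposed_step > 0 else -allowed_step)

# Busy network, cautious nodes: the limit creeps up by a fraction of a MB.
print(next_block_size_limit(1.0, 0.95, 0.25, 0.05, 1.8))   # -> 1.05

Note that the adjustment comes out in fractions of a MB rather than jumping in whole numbers, which fits the point above about blocksizes not needing to be integers.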

Your own addition to the mix of miners paying a percentage of transaction fees to the next block may also have merit, but I'm struggling a bit with this part:

If miners are in unanimous agreement that the block size needs to increase, the fees would average out and all miners should still be equally rewarded. Only miners trying to increase the block size when consensus is not there would incur a cost.

In practice, would it not be a case of whoever blinks first loses money?  How else would miners know what to agree on unless someone starts signalling first?
newbie
Activity: 21
Merit: 1
Another way to do it, instead of averaging out the votes (say 5 votes: 0 + 0 + 1.35% + 2.7% + 2.7%, which averages to a +1.35% increase):

You could have the option with the most votes win. That way, miners who are signalling +2.7% don't get their higher investment diluted down as long as they have consensus. Once the threshold is reached, all miners would drop back to not voting until the next round.
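
Roughly, the difference between the two tallying rules, using those five votes (just a sketch - a real rule would need tie-breakers and an activation threshold):

Code:
# Two ways of tallying block size votes: averaging vs. most-votes-wins.
from collections import Counter
from statistics import mean

votes = [0.0, 0.0, 1.35, 2.7, 2.7]   # signalled increases per block, in %

print(f"average: +{mean(votes):.2f}%")   # -> +1.35%

# "Most votes wins": pick the option with the highest count. Note that this
# particular example is a tie (0.0% and 2.7% both have two votes), so a real
# rule would need a tie-breaker or a minimum share before anything changes.
winner, count = Counter(votes).most_common(1)[0]
print(f"most votes ({count} of {len(votes)}): +{winner}%")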
newbie
Activity: 21
Merit: 1
There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

Hi Greg, thanks for taking a look at the idea.

The issue you highlight is why I had the idea of excluding zero-fee and far-below-mean-fee transactions from the calculation of block fullness used to justify an increase.

Therefore there would be no benefit to miners who accept fees out of band in an effort to avoid paying for a vote to increase the block size, as increasing the block size would not be possible unless blocks are sufficiently full of transactions that are paying an in-band fee.

This proposal doesn't change the fundamentals of Bitcoin - fees are still not mandated, and there is no penalty for including zero-fee transactions. It's just that if a large number of the transactions in blocks are paying fees substantially below the mean, the block size cannot grow, just as it can't anyway at the moment.

I hope what I'm trying to say makes sense. Do you think this could help mitigate the out of band fees problem?
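
Something along these lines is what I have in mind - a toy sketch with an invented threshold (half the mean fee rate), just to show the filtering step:

Code:
# Toy sketch: measure block "fullness" while ignoring transactions whose fee
# rate is far below the mean, so padding blocks with self-paid zero/low-fee
# transactions doesn't count towards justifying a block size increase.
# The 0.5 * mean threshold is an arbitrary choice for illustration.
def effective_fullness(txs, max_block_bytes):
    """txs: list of (size_bytes, fee_satoshis) tuples for one block."""
    if not txs:
        return 0.0
    mean_feerate = sum(fee for _, fee in txs) / sum(size for size, _ in txs)
    counted = sum(size for size, fee in txs
                  if size and fee / size >= 0.5 * mean_feerate)
    return counted / max_block_bytes

# A block padded with zero-fee transactions barely counts as full.
block = [(250, 12500)] * 1500 + [(250, 0)] * 2000   # 1,500 paying, 2,000 free
print(f"raw fullness:       {sum(s for s, _ in block) / 1_000_000:.0%}")   # 88%
print(f"effective fullness: {effective_fullness(block, 1_000_000):.0%}")   # 38%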
staff
Activity: 4284
Merit: 8808
101 = vote increase 1.35%, pay 10% of transaction fees to next block
111 = vote increase 2.7%, pay 25% of transaction fees to next block
There are no mandated fees in the Bitcoin protocol so the natural response to schemes like this is for miners to simply accept fees via other means (such as the direct txout method supported by eligius since 2011, or via outputs with empty scriptpubkeys, or via out of band fees) and give users a discount for using them. The expected result would be fees migrating out of the fee area, and protocols that depend on them (like your suggestion) becoming dysfunctional.  Sad

I previously tried to rescue this class of proposal by having the change apply not to fees but to the lowness of the required hash (effective difficulty), but it's difficult to do that in the presence of subsidy.

Unrelatedly, as you note, your proposal is no constraint if miners agree-- this is also why it fails to address the conflict of interest between miners (really mining pools), who are paid to include transactions, and everyone else-- who experiences them as an externality except to the extent that they contribute to economic growth (not at all a necessity: e.g. many companies want to use the Bitcoin blockchain without using the Bitcoin currency at all).  Still, better to solve one issue even if all of them can't be solved.
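
To make the migration concrete, here is a toy illustration (invented numbers; the point is only that the protocol-visible fee and the miner's actual revenue are two different things):

Code:
# Toy illustration: the protocol only "sees" fee = sum(inputs) - sum(outputs),
# so a payment routed to the miner as an extra output (or handed over out of
# band entirely) is invisible to any consensus rule keyed on in-band fees.
def visible_fee(input_sats, output_sats):
    return sum(input_sats) - sum(output_sats)

# Normal transaction: 50,000 satoshis left on the table as an in-band fee.
print(visible_fee([100_000_000], [99_950_000]))            # -> 50000

# Same economics, but the 50,000 satoshis are paid as a direct output to the
# miner's own address instead: the visible fee is now zero, yet the miner
# earns exactly the same amount once they mine the block.
print(visible_fee([100_000_000], [99_950_000, 50_000]))    # -> 0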
