Topic: A simple solution to the Blocksize war, but you won't like it. (Read 1362 times)

hero member
Activity: 1082
Merit: 505
A Digital Universe with Endless Possibilities.
The biggest problem is that some will agree and some will disagree, meaning there will always be two forks of Bitcoin.
sr. member
Activity: 362
Merit: 262
Implementation aside, I think it's better to increase the time between blocks instead of the size of the blocks as the number of transactions grows. Would be more convenient, don't you think?
The problem with this statement is that you are making a claim about the implementation while prefacing it with "Implementation aside".  That makes it kind of difficult to argue with you.

Would be more convenient, don't you think?

Why do you think it's more convenient? 

In my lay understanding, increasing the block size or decreasing the time between blocks results in some of the same (ahem) implementation issues: block propagation becomes an issue, miners work on the wrong block tip, there's an effective reduction in the mining power backing Bitcoin, etc.
legendary
Activity: 1666
Merit: 1057
Marketing manager - GO MP
The size of the blocks since the last adjustment is already there and can easily be verified. As for the unconfirmed transactions, that was a somewhat redundant thing I put in there so the amount of congestion can be better taken into account. Perhaps that wasn't the best approach to the problem, and the amount of congestion is probably directly related to the fullness of the blocks anyway.
Fine, if you don't like the term "reduced difficulty", make it a "difficulty modifier" instead.

Implementation aside, I think it's better to increase the time between blocks instead of the size of the blocks as the number of transactions grows. Would be more convenient, don't you think?
donator
Activity: 1218
Merit: 1079
Gerald Davis
Ok say the code is in effect.  A miner solves a block with reduced difficulty.  How can I verify that reduced difficulty was warranted?  Remember not all nodes are going to be online at the same time.  Say I am a new node that is bootstrapping.  What is the proper difficulty for block X if I don't know the size of the memory pool at the time the block was created?

Even the idea of the 'size of the memory pool' for the entire network is itself an abstraction.  There is no one 'memory pool', but rather the memory pools of individual nodes, and their size and contents are never consistent.  Most of these kinds of solutions come from thinking of the Bitcoin network as a single unified network with consistent state and code.  The reality is that Bitcoin is more like a number of independent nodes which share a standardized way to exchange information.   The nodes do not share a perfectly synchronized view of the state of the network.

The contents of the memory pool are neither deterministic nor provable.  The current blockchain is completely deterministic and provable.   I didn't need to be online when block 1 was mined to know it is valid.  I don't need any data other than the data from previous blocks to know that any given block is valid.  I can prove block 1 is valid (including the correct difficulty) with only the contents of block 0, I can prove block 2 is valid with only the contents of blocks 0 & 1, and I can prove block x is valid with only the contents of blocks 0 ... x-1.  This is what I mean when I say the blockchain is self-verifying.
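The point about deterministic verification can be made concrete. Below is a simplified, hypothetical Python sketch of Bitcoin's 2016-block retarget rule (the interval, spacing, and 4x clamp match the deployed rule; the function name and interface are illustrative only): every input to the calculation comes from earlier blocks, which is exactly what makes the chain self-verifying and exactly what a mempool-based rule lacks.

```python
# Simplified sketch of Bitcoin's deterministic difficulty retarget.
# Every value needed to check a block's target comes from earlier blocks,
# so a node that was offline for a day/week/year can still verify the chain.

RETARGET_INTERVAL = 2016                              # blocks between adjustments
TARGET_SPACING = 10 * 60                              # seconds per block
TARGET_TIMESPAN = RETARGET_INTERVAL * TARGET_SPACING  # two weeks, in seconds

def expected_target(prev_target, first_timestamp, last_timestamp):
    """Target for the next interval, derived only from on-chain data."""
    actual = last_timestamp - first_timestamp
    # Clamp the measured timespan to a factor of 4, as Bitcoin does,
    # to limit how far the target can swing in one adjustment.
    actual = max(TARGET_TIMESPAN // 4, min(actual, TARGET_TIMESPAN * 4))
    return prev_target * actual // TARGET_TIMESPAN

# A bootstrapping node replays this rule from the genesis block onward;
# a mempool-based modifier has no such replayable input.
```

Note the direction: blocks arriving faster than intended shrink the target (raising difficulty), and vice versa, with no reference to anything outside the chain.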
legendary
Activity: 1666
Merit: 1057
Marketing manager - GO MP
The blockchain is self verifying.  Everything to verify the blockchain is in the blockchain.   There is no way to 'prove' the size of the memory pool at any time.  Imagine your idea was implemented and you were offline for a day/week/year.  You receive a block which has its difficulty reduced.  Is it valid?  What was the size of the memory pool at the time the block was created?    

What? I'm simply proposing a more complex difficulty adjustment algorithm that can be activated via a hard fork after block X, just like any of the other alternative solutions. Storing the additional parameters in the blockchain can be done easily; the blockchain does not care what data is in it.
legendary
Activity: 1666
Merit: 1057
Marketing manager - GO MP
If (the amount of unconfirmed transactions > some limit && the last 2016 blocks contain 9x% of 1MB worth of data)
decrease the difficulty by x %


This means:
Faster transactions if needed
More transactions if needed
More inflation if needed
The block reward driving mining costs will be obsolete sooner and the 21 mil coin limit can remain *only it will happen sooner


This method provides an economic incentive for large miners to spam the blockchain with many very small transactions, which is one reason why it will never work well.

Interesting, I hadn't thought about that. But I think this incentive can be removed if the difficulty reduction is tied to the transaction fees somehow, e.g. by only counting unconfirmed transactions that pay some set amount of fees. This would still push out zero-fee transactions, but to be honest those are only there because of the block reward in the first place and would become unsustainable sooner or later anyway.
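The fee-filtered variant suggested here could be sketched as follows (purely illustrative: MIN_FEE, CONGESTION_LIMIT, and the mempool representation are assumptions, not part of any real proposal). Only transactions paying at least a minimum fee count toward the congestion trigger, so zero-fee spam can no longer lower the difficulty.

```python
# Hypothetical fee-filtered congestion trigger. Thresholds are placeholders.
MIN_FEE = 10_000          # satoshis; assumed fee cutoff
CONGESTION_LIMIT = 3      # assumed number of qualifying transactions

def qualifying_count(mempool):
    """Count mempool entries paying at least MIN_FEE.

    mempool: iterable of (txid, fee_in_satoshis) pairs.
    """
    return sum(1 for _txid, fee in mempool if fee >= MIN_FEE)

def congestion_trigger(mempool):
    """True when enough fee-paying transactions are waiting."""
    return qualifying_count(mempool) > CONGESTION_LIMIT
```

This removes the free-spam incentive, but note it does not address the deeper objection in the thread: the mempool contents still differ from node to node, so the trigger is not verifiable from the blockchain alone.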
donator
Activity: 1218
Merit: 1079
Gerald Davis
The blockchain is self verifying.  Everything to verify the blockchain is in the blockchain.   There is no way to 'prove' the size of the memory pool at any time.  Imagine your idea was implemented and you were offline for a day/week/year.  You receive a block which has its difficulty reduced.  Is it valid?  What was the size of the memory pool at the time the block was created?   
sr. member
Activity: 462
Merit: 250
I can draw your avatar!
Shouldn't we just wait it out till all the blocks have been mined and see how it went? And then come up with alternative ways to improve?
The OP's ideas sound plausible, but not as sturdy and well thought out as the Satoshi paper. The math is too simple to be taken seriously, and besides the plus points he should consider the risks and downsides, as others have already pointed out. It would be nice to contribute to Bitcoin's stability, but the way I see it this does no good for the chain or the coin. Inflation is the one thing that undermines the whole concept of the coin; we should never take that path, as it will be the downfall of the coin and open up a whole can of worms that throws us back to where we came from, leaving us at the wrong end of the economy and empty-handed once more.  Sad
legendary
Activity: 1100
Merit: 1032
OP's proposal is to reduce difficulty if there are more transactions.
How would that produce fewer blocks per day?

Oops, read it wrong Smiley

But then instead of a cheap DOS it becomes a cheap way to create a fork: pile up transactions until the diff goes low enough that a pool can solo-mine faster than the blocks can be propagated.
You just have to send as many low-value large transactions as possible (i.e. with lots of small inputs and outputs) to your pool node, so there is no propagation issue for that pool, but slow propagation to the other pools. If the other pools do not use the same tactic, you are guaranteed to have all of them lag, regardless of their hash power, if only because they will receive the new blocks and transactions later due to propagation times. You can speed up the process by throttling the pool's outgoing bandwidth.

(this btw is a general vulnerability of faster blockchains whenever blocks can be generated faster than they can be propagated on the network)
legendary
Activity: 1876
Merit: 1475
This means a cheap way of denial of service: just spam with cheap transactions until diff goes so high transactions will pile up, which will further raise the diff, at which point blocks will take days, then months, then years...

OP's proposal is to reduce difficulty if there are more transactions.
How would that produce fewer blocks per day?

legendary
Activity: 1100
Merit: 1032
This means a cheap way of denial of service: just spam with cheap transactions until diff goes so high transactions will pile up, which will further raise the diff, at which point blocks will take days, then months, then years...
full member
Activity: 128
Merit: 103
I've asked this before on other threads but didn't get much in the way of replies, so once again...

What do people think about the recent proposals by Justus Ranvier on this issue?


https://bitcoinism.liberty.me/2015/01/21/economic-fallacies-and-the-block-size-limit-part-1-scarcity/

https://bitcoinism.liberty.me/2015/02/09/economic-fallacies-and-the-block-size-limit-part-2-price-discovery/


hero member
Activity: 742
Merit: 500
If (the amount of unconfirmed transactions > some limit && the last 2016 blocks contain 9x% of 1MB worth of data)
decrease the difficulty by x %


This means:
Faster transactions if needed
More transactions if needed
More inflation if needed
The block reward driving mining costs will be obsolete sooner and the 21 mil coin limit can remain *only it will happen sooner


This method provides an economic incentive for large miners to spam the blockchain with many very small transactions, which is one reason why it will never work well.
legendary
Activity: 1876
Merit: 1475
If (the amount of unconfirmed transactions > some limit && the last 2016 blocks contain 9x% of 1MB worth of data)
decrease the difficulty by x %

This is in effect just removing the blocksize limit altogether.  It can just grow to whatever size it wants.

No other comment, I fence sit & go back and forth.

Except that inflation could be too high and the block reward could end too soon, before there are enough fees to support the network.
Actually I think removing the block limit altogether would be a better solution than this.

(So yeah, OP was right about "but you won't like it")
legendary
Activity: 896
Merit: 1000
If (the amount of unconfirmed transactions > some limit && the last 2016 blocks contain 9x% of 1MB worth of data)
decrease the difficulty by x %

This is in effect just removing the blocksize limit altogether.  It can just grow to whatever size it wants.

No other comment, I fence sit & go back and forth.
legendary
Activity: 1666
Merit: 1057
Marketing manager - GO MP
If (the amount of unconfirmed transactions > some limit && the last 2016 blocks contain 9x% of 1MB worth of data)
decrease the difficulty by x %


This means:
Faster transactions if needed
More transactions if needed
More inflation if needed
The block reward driving mining costs will be obsolete sooner and the 21 mil coin limit can remain *only it will happen sooner
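For reference, a literal reading of the rule proposed in this post can be sketched as follows. LIMIT, FULLNESS, and CUT stand in for the "some limit", "9x%", and "x %" placeholders above; all values are assumptions for illustration, and nothing here corresponds to deployed Bitcoin code.

```python
# Hypothetical sketch of the proposed difficulty-modifier rule.
MAX_BLOCK_BYTES = 1_000_000  # the 1MB block size limit
LIMIT = 10_000               # assumed "some limit" on unconfirmed transactions
FULLNESS = 0.95              # assumed "9x%" average fullness of the last 2016 blocks
CUT = 0.05                   # assumed "x %" difficulty reduction

def adjusted_difficulty(difficulty, unconfirmed_count, last_2016_sizes):
    """Apply the proposed cut when the mempool is large AND recent blocks are full.

    last_2016_sizes: sizes in bytes of the last 2016 blocks.
    """
    avg_fullness = sum(last_2016_sizes) / (len(last_2016_sizes) * MAX_BLOCK_BYTES)
    if unconfirmed_count > LIMIT and avg_fullness >= FULLNESS:
        return difficulty * (1 - CUT)
    return difficulty
```

The block-fullness half of the condition is verifiable from the chain itself; the `unconfirmed_count` half is the part the thread's objections focus on, since no two nodes agree on it.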