
Topic: The MAX_BLOCK_SIZE fork - page 4.

legendary
Activity: 1722
Merit: 1217
February 10, 2013, 05:08:37 PM
legendary
Activity: 1904
Merit: 1002
February 10, 2013, 04:44:48 PM
Yeah.  If there's a major split, the <1MB blockchain will probably continue for a while.  It will just get slower and slower with transactions not confirming.  It would be better if there is a clear upgrade path so we don't end up with a lot of people in that situation.

You are assuming miners want to switch.  They have a very strong incentive to keep the limit in place (higher fees, lower storage costs).
Ari
member
Activity: 75
Merit: 10
February 10, 2013, 04:42:06 PM
Yeah.  If there's a major split, the <1MB blockchain will probably continue for a while.  It will just get slower and slower with transactions not confirming.  It would be better if there is a clear upgrade path so we don't end up with a lot of people in that situation.
legendary
Activity: 1904
Merit: 1002
February 10, 2013, 04:07:09 PM
We will eventually hit the limit, and the limit will be raised.  People running old versions will get disconnected from the new block chain and continue on the old, causing confusion as to which chain is the official "bitcoin" chain when that happens.  The only question is how close we will come to a 50% network split.  It would be good to reach some consensus well in advance, so that this is minimally disruptive.

FTFY
Ari
member
Activity: 75
Merit: 10
February 10, 2013, 09:38:33 AM
We will eventually hit the limit, and the limit will be raised.  People running old versions will get disconnected from the network when that happens.  The only question is how close we will come to a 50% network split.  It would be good to reach some consensus well in advance, so that this is minimally disruptive.
member
Activity: 113
Merit: 11
February 09, 2013, 11:00:39 AM
I haven't read the entire thread so this may have been covered. I've been stewing over this problem for a while and would just like to think aloud here....

I very much think the block size should be network-regulated, much like difficulty is used to regulate the propagation window based on the computation cycles spent finding hashes for a particular difficulty target. To clarify, when I say CPU I mean CPUs, GPUs, and ASICs collectively.

Difficulty is very much focused on the network's collective CPU cycles: it controls the propagation window (one block every 10 minutes), helps avoid 51% attacks, and distributes new coins.

However, max_blocksize is not related to the computing resources needed to validate transactions and propagate blocks. It is geared much more to network speed, the storage capacity of miners (and even of non-mining full nodes), and the verification of transactions (which, as I understand it, means hammering the disk). What we need to determine is whether the nodes supporting the network can quickly and easily propagate blocks without this affecting the propagation window.

Interestingly, there is a connection between CPU resources, the calculation of the propagation window via difficulty targets, and the health of network propagation. If we have no max_blocksize limit in place, the network is left open to a special type of difficulty manipulation.

As I see it, the propagation window can be manipulated in two ways. The first is the classic one: throw more CPUs at block creation and more blocks get produced; more computation power = more blocks. The difficulty ensures the propagation window doesn't get manipulated this way: it is adjusted using the timestamps in the blocks, which show whether more or fewer blocks were created in a given period, and difficulty moves up or down accordingly. All taken care of.

The propagation window could also be manipulated in a more subtle way, though: by transmitting large blocks (huge blocks, in fact). Large blocks take longer to transmit, longer to verify, and longer to write to disk, and this kind of manipulation is unlikely to be noticed until a monster block gets pushed across the network (in a situation where there is no limit on block size, that is). Now, because there is only a 10-minute window, the block can't take longer than that to propagate, I'm guessing. If it does, difficulty will sink and we have a whole new problem: manipulation of the difficulty through massive blocks. Massive blocks could mess with difficulty and push out smaller miners, causing all sorts of undesirable centralisation. In short, it would probably destroy the Bitcoin network.

So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, yet not so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the help of the propagation window maintained through its difficulty, we may be able to determine whether the propagation of blocks is slowing and whether the max_blocksize should be adjusted down to keep the propagation window stable.

Because the difficulty can potentially be manipulated this way, we may have a means of knowing what the Bitcoin network is comfortable propagating. It could be determined thusly:

If the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize, say within 20% of it (the median being chosen to avoid situations where one malicious entity, or several, tries to arbitrarily push up the limit), and the difficulty is "stable", increase the max_blocksize by, say, 10% for the next difficulty period. But if the median block size for the last period is much lower, say less than half the current limit, then lower the max_blocksize by 20% instead.

However, if the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize and the difficulty is NOT stable, don't increase the max_blocksize, since there is a possibility that the network is not currently healthy and changing the limit either way is a bad idea. Or, alternatively, in those situations lower the max_blocksize by 10% for the next difficulty period anyway (not sure if this is a good idea or not, though).

In either case, 1MB should be the floor for the max_blocksize if it continues to shrink.

Checking the stability of the last difficulty period against the next one is what determines whether the network is spitting out blocks at a regular rate. If the median size of blocks transmitted in the last difficulty period is bumping up against the limit and difficulty is going down, it could mean a significant number of nodes can't keep up: blocks aren't getting to all the nodes in time, and hashing capacity is being cut off because nodes are too busy verifying the blocks they received. If the difficulty is going up while the median block size is bumping up against the limit, that is a strong indication that nodes are all processing the blocks they receive easily, so raising the max_blocksize limit a little should be OK. The one thing I'm not sure of is how to determine whether the difficulty is "stable"; I'm very much open to suggestions on the best way of doing that. The argument that any definition of "stable" is arbitrary, and could still allow manipulation of the max_blocksize over a longer and more sustained period, seems possible to me too, so I'm not entirely sure this approach can be made foolproof. How does the calculation of difficulty targets take these things into consideration?
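To make this concrete, here is a rough sketch of the retarget rule in Python (illustration only; the thresholds — within 20% of the limit, +10%, under half, -20% — are the tentative numbers from above, and the "stable difficulty" test is left as a stub for the reasons just given):

Code:
import statistics

MIN_BLOCKSIZE = 1_000_000  # the 1MB floor

def next_max_blocksize(block_sizes, current_max, difficulty_stable):
    # block_sizes: byte sizes of every block in the last difficulty period.
    median = statistics.median(block_sizes)

    if median >= 0.8 * current_max:
        # Median is bumping against the limit (within 20% of it).
        if difficulty_stable:
            current_max = int(current_max * 1.10)  # grow by 10%
        # If difficulty is NOT stable, leave the limit alone.
    elif median < 0.5 * current_max:
        # Blocks are well under half the limit; shrink by 20%.
        current_max = int(current_max * 0.80)

    return max(current_max, MIN_BLOCKSIZE)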

OK, guys, tear it apart. Wink
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
February 06, 2013, 03:31:22 PM

Well, indeed there is a problem if the only option you have is to use a fractional reserve system because of the hard limit of 1MB. Especially if the 'bank' cannot give you back your bitcoins because everybody else has withdrawn theirs for whatever reason Cheesy


Agreed 100%

If Bitcoin cripples itself at such an early stage then the central bankers will be laughing as they quaff bourbon in their gentlemen's clubs. Bitcoin is a very disruptive technology which has a huge first-mover advantage, potentially returning a lot of power from governments and TBTF banks to the people. If this is thrown away then expect the next major cryptocurrency to be FedCoin or ECBcoin or even IMFcoin which will be designed to integrate somehow with the existing fiat systems. (Oh. And expect attempts to ban community-based alternatives when "official" ones are up and running.)
sr. member
Activity: 527
Merit: 250
February 06, 2013, 01:20:33 PM
eventually this will drive us to a fractional reserve system again.

There's nothing wrong with a fractional reserve system; just look at systems of self-issued credit (like Ripple). The problem is with legal tender laws that force you to use a particular debt instrument. With voluntary exchange, competition between systems would keep them honest.


Well, indeed there is a problem if the only option you have is to use a fractional reserve system because of the hard limit of 1MB. Especially if the 'bank' cannot give you back your bitcoins because everybody else has withdrawn theirs for whatever reason Cheesy
legendary
Activity: 1064
Merit: 1001
February 06, 2013, 01:14:01 PM
eventually this will drive us to a fractional reserve system again.

There's nothing wrong with a fractional reserve system; just look at systems of self-issued credit (like Ripple). The problem is with legal tender laws that force you to use a particular debt instrument. With voluntary exchange, competition between systems would keep them honest.
sr. member
Activity: 527
Merit: 250
February 06, 2013, 01:10:05 PM
I think that if we reach the 1MB limit and don't upgrade with a solution, then spontaneous order will create fiat currencies backed by bitcoins, in order to reduce the number of transactions on the Bitcoin network.

I'm not so sure this is a bad thing. These ad-hoc "fiat" currencies may be created with unique properties that make them better suited to the task at hand than Bitcoin. For example, a private payment network that provides instant confirmation and requires no mining (relying on trust in a central authority).


It is a bad thing because even though you may know the amount of deposits they have (since you can audit the blockchain), you don't actually know the amount of notes they will have emitted, so eventually this will drive us to a fractional reserve system again.

That's what I think.
legendary
Activity: 1064
Merit: 1001
February 06, 2013, 11:22:53 AM
Any system which relies on trivial input can be easily gamed.  I (and other merchants) could buy/rent enough hashing power to solve 1% of blocks and fill them with massive fees (which come right back to us) and inflate the average fee per block.

Hmm...this was a problem in my first idea, but I fixed it in the last two. Do you see an exploitable problem with the most recent proposal?

...you propose that the size will only increase but never decrease? Anyway, the major problem is the choice of total reward amount, because changes in purchasing power are unpredictable.

Yes, size would only increase. If we allowed the size to decrease, it could cause fees to skyrocket. I proposed a new scheme that does not depend on the total reward amount; I believe it addresses your concerns.
legendary
Activity: 1792
Merit: 1111
February 06, 2013, 11:09:11 AM
Also, requiring a total reward of 50 BTC means requiring 25 BTC in fees NOW. As the typical total tx fee in a block is about 0.25 BTC, fees would have to increase 100x, and obviously this will kill the system.

How and why would the system be "killed"? The max block size would simply not increase.


So you propose that the size will only increase but never decrease? Anyway, the major problem is the choice of total reward amount, because changes in purchasing power are unpredictable.
donator
Activity: 1218
Merit: 1079
Gerald Davis
February 06, 2013, 11:02:28 AM
There's a delay regardless of whether or not two different blocks are solved at the same time?

Yes, but if a miner knew that no other block would be found in the next x seconds, they wouldn't care.  Since that can never be known, the longer the propagation delay, the higher the probability that a competing block will be found before propagation completes and potentially wins the race.

Quote
You mean that when two different blocks are solved at the same time, the smaller block will propagate faster and therefore more miners will start building on it versus the larger block?

Yes, although it is more that the smaller block has a higher probability of winning the race.  A miner can never know if he will be in a race condition or which races he will lose, but over the long run, everything else being equal, a miner with a longer propagation delay will suffer a higher rate of orphaned blocks.

Quote
Is there a straightforward way to estimate the risk of an orphan?

Not that I know of.  I do know pools have looked into this to improve their orphan rates and remain competitive.  My guess is any analysis is crude, because it would be difficult to model, so testing needs to be done with real blocks = real earnings. A pool at least wants to ensure its orphan rate isn't significantly higher than its peers' (or the global average).
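For a crude first-order feel (definitely not a pool-grade model): if you treat block discovery by the rest of the network as a Poisson process with a 600-second mean interval, the chance that a competing block appears while yours is still propagating for t seconds is 1 - e^(-t/600). A toy version:

Code:
import math

def orphan_risk(propagation_seconds, mean_interval=600.0):
    # Probability a competing block appears during the propagation delay,
    # assuming Poisson block arrivals; ignores topology and hashrate shifts.
    return 1.0 - math.exp(-propagation_seconds / mean_interval)

print(orphan_risk(2))   # ~0.0033, i.e. ~0.33% for a 2-second delay
print(orphan_risk(10))  # ~0.0165, i.e. ~1.65% for a 10-second delay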


Quote
Even with a separate overlay, two blocks solved at the same time is a problem. And I would imagine that adding a new overlay is an extreme solution to be considered as a last resort only.

True, but if it became a large enough problem, a mining network would allow for direct transmission to other miners.  A block notification superhighway of sorts.  Blocks could be digitally signed by a miner, and if that miner is trusted by other miners (based on prior submitted work), those miners could start mining the next block immediately.  Think of it as a WOT for miners, but instead of rating financial transactions, miners trust other miners based on prior "good work".

The propagation delay on large blocks is a combination of the relay -> verify -> relay nature of the bitcoin network, relatively slow block verification (a large fraction of a second), and the potential need for multiple hops.  All combined, this can result in a delay of multiple seconds before a majority of miners start work on the block.

A single hop, with enough trust to start work on the next block and verify after the fact, would make the "cost" of a larger block negligible.  It is just an idea.  I care less about mining these days, so that is something for major miners to work out.  Even if this network never became "official", I would imagine some sort of private high-speed data network would emerge.  It would allow participating miners to gain a small but real competitive advantage over other miners.  Fewer orphans and the ability to include more tx (and thus higher fees) = more net revenue for miners.

Quote
What are your thoughts on the last scheme I described?

Any system which relies on trivial input can be gamed.  I (and other merchants) could buy/rent enough hashing power to solve 1% of blocks and fill them with tx containing massive fees (which come right back to us) and inflate the average fee per block.

I would point out that a fixed money supply and static inflation curve are non-optimal. In theory a central bank should be able to do a better job.  By matching the growth of the money supply to economic growth (or contraction), prices never need to rise or fall (in aggregate).  A can of soup which cost $0.05 in 1905 would cost $0.05 today, at least as far as inflation is concerned.  The actual price may vary for non-inflationary reasons such as improved productivity or true scarcity of resources.

The problem with central banks isn't the theory ... it is the humans.  The models of monetary policy rely on flawed humans making perfect decisions, and that is why they are doomed to failure: flawed humans must choose the benefit of the many (the value of price stability) over the increased benefit of the few (direct profit from manipulation of the money supply).  Maybe someday when we create a utopia such ideas will work, but until then they will be manipulated for personal benefit.

The value of Bitcoin comes from the inability to manipulate the money supply.  Sure, a fixed money supply and static inflation curve are often non-optimal, but they can't be manipulated, and thus this non-optimal system has the potential to outperform systems which are superior in theory but have the flaw of needing perfect humans to run them.

On edit:
On rereading I noticed you proposed using the median block reward, not the average.  That is harder to manipulate.  It probably is better than a fixed block size, but I would expect 50 BTC per block isn't necessary at a very large tx volume, so it may result in higher fees than needed (although still not as bad as 1MB fixed).  It is worth considering.  Not sure if a consensus for a hard fork can ever be reached, though.

Quote
Hmm...this seems problematic. If the transaction volume doesn't grow sufficiently, this could kill fees. But if the transaction volume grows too much, fees will become exorbitant. IF we accept that max block size needs to change, I believe it should be done in a way that decreases scarcity in response to a rise in average transaction fees.

Likewise, a fixed subsidy reduction schedule is non-optimal.  What if tx fees don't cover the drop in subsidy value in 2016?  Network security will be reduced.  Should we also make the money supply adjustable? Smiley  (Kidding, but I hope it illustrates the point.)

TL/DR: Fixed but non-optimal vs adjustable but manipulable? Your choice.
legendary
Activity: 1064
Merit: 1001
February 06, 2013, 10:53:47 AM
Also, requiring a total reward of 50 BTC means requiring 25 BTC in fees NOW. As the typical total tx fee in a block is about 0.25 BTC, fees would have to increase 100x, and obviously this will kill the system.

How and why would the system be "killed"? The max block size would simply not increase.

There is no reason to stick to a total reward of 50 BTC, because you need to consider the purchasing power.

Here's yet another alternative scheme:

1) Block size adjustments happen at the same time that network difficulty adjusts

2) On a block size adjustment, the size either stays the same or is increased by a fixed percentage (say, 10%). This percentage is a baked-in constant requiring a hard fork to change.

3) The block size is increased if more than 50% of the blocks in the previous interval have a size greater than or equal to 90% of the max block size. Both of the percentage thresholds are baked in.

Example:

A block size adjustment arrives, and the current max block size is 1024KB. The last 2,016 blocks (one difficulty period) are analyzed, and it is determined that 1,200 of them are at least 922KB in size. Since that is more than 50%, the maximum block size is increased by 10%.

Instead of targeting a fixed block reward, this scheme tries to determine if miners are consistently reaching the max block limit when filling a block with transactions (the 90% percentage should be tuned based on historical transaction data). This creates scarcity (in proportion to the 50% figure) while remaining independent of the purchasing power.
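In Python the whole rule is just a few lines (a sketch; the constants are the baked-in parameters from rules 2 and 3):

Code:
GROWTH = 1.10           # fixed growth percentage (rule 2)
FULL_THRESHOLD = 0.90   # a block "reaches the limit" at >= 90% of max (rule 3)
REQUIRED_FRACTION = 0.50

def adjust_max_block_size(block_sizes, max_block_size):
    # block_sizes: sizes of all blocks since the last adjustment.
    nearly_full = sum(1 for s in block_sizes
                      if s >= FULL_THRESHOLD * max_block_size)
    if nearly_full > REQUIRED_FRACTION * len(block_sizes):
        return int(max_block_size * GROWTH)  # increase by 10%
    return max_block_size                    # otherwise unchanged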
legendary
Activity: 1792
Merit: 1111
February 06, 2013, 10:45:21 AM
legendary
Activity: 1064
Merit: 1001
February 06, 2013, 10:29:51 AM
Yes there is a propagation delay for larger blocks

There's a delay regardless of whether or not two different blocks are solved at the same time?

...any attempt to find an "optimal" block size is likely doomed, because it can be gamed and "optimal" is hard to quantify.

What are your thoughts on the last scheme I described?

...
2016 - 2MB block =~ 720K daily transactions (262M annually)
...

Hmm...this seems problematic. If the transaction volume doesn't grow sufficiently, this could kill fees. But if the transaction volume grows too much, fees will become exorbitant. IF we accept that max block size needs to change, I believe it should be done in a way that decreases scarcity in response to a rise in average transaction fees.

There would be some variety, surely. In the blocks they produce themselves, miners will seek to optimize the ratio (time to propagate / revenue in fees), while for blocks they receive from other miners, they would rather they be as small as possible.

Sure, a miner might "rather" received blocks be as small as possible, but since there's no way to refuse to receive a block from a peer, this point is moot. They could drop a block that is too big once they get it, but this doesn't help them much beyond not having to forward it to the remaining peers. And even this has little global effect, since those other peers will just receive it from someone else.

These parameters are not the same for different miners, particularly the "time to propagate" one, as it strongly depends on how many connections you can keep established and on your bandwidth/network lag.

Bandwidth will be the limiting factor in determining the number of connections that may be maintained. For purposes of analysis we should assume that miners choose their degree (number of peers) such that bandwidth is not fully saturated, because doing otherwise would mean failing to collect the largest number of transactions possible for the available bandwidth, limiting revenue.

Do people in mining pools even need to run a full node?
donator
Activity: 1218
Merit: 1079
Gerald Davis
February 06, 2013, 10:15:46 AM
I don't understand this aspect of the network. Why do miners want smaller blocks from other miners? Do blocks take a long time to propagate? Are you saying that newly solved blocks are sent around on the same peer connections used to transmit messages, and that while a connection is being used to send a block (which can be large relative to the size of a transaction) it holds up the queue for individual tx?

If this is the case, perhaps an easier way to deal with the propagation of blocks is to have two overlays, one for tx and the other for blocks.

Yes, there is a propagation delay for larger blocks; when two blocks are produced by different miners at roughly the same time, the larger block is more likely to be orphaned. The subsidy distorts the market effect.  Say you know that by making the block 4x as large you can gain 20% more fees.  If this increases the risk of an orphan by 20%, then the larger block is break-even.  However, the subsidy distorts the revenue-to-size ratio: 20% more fees may only mean 0.4% more total revenue if fees make up only 2% of revenue (i.e. 25 BTC subsidy + 0.5 BTC fees).  As a result, a 20% increase in orphan rates isn't worth a 0.4% increase in total revenue.

As the subsidy becomes a smaller % of total miner compensation, the effect of the distortion will lessen.  There has been some brainstorming on methods to remove the "large block penalty".  It would likely require a separate mining overlay.
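The distortion is easy to see with a toy expected-revenue calculation (numbers made up for illustration):

Code:
def expected_revenue(subsidy_btc, fees_btc, orphan_prob):
    # An orphaned block pays nothing, so weight the reward by survival odds.
    return (1.0 - orphan_prob) * (subsidy_btc + fees_btc)

# Fees-only world: +20% fees for an extra point of orphan risk is a clear win.
print(expected_revenue(0.0, 0.5, 0.01))   # 0.495 BTC  (small block)
print(expected_revenue(0.0, 0.6, 0.02))   # 0.588 BTC  (4x block)

# With a 25 BTC subsidy, the identical trade loses expected revenue.
print(expected_revenue(25.0, 0.5, 0.01))  # 25.245 BTC (small block)
print(expected_revenue(25.0, 0.6, 0.02))  # 25.088 BTC (4x block)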
donator
Activity: 1218
Merit: 1079
Gerald Davis
February 06, 2013, 10:03:48 AM
Sadly, any attempt to find an "optimal" block size is likely doomed, because it can be gamed and "optimal" is hard to quantify.

Optimal for the short-term thinking non-miner - block size large enough that fees are driven down to zero or close to it.

Optimal for the network - block size large enough to create sufficient competition that fees can support the network relative to true economic value.

Optimal for the short-term thinking miner - never rising above 1MB, to maximize fee revenue.

However, I would point out that the blockchain may eventually become the equivalent of bank wire transactions.  FedWire, for example, transferred ~$663 trillion USD in 2011 using 127 million transactions.  If FedWire used a 10-minute block, that would be ~2,500 transactions per block; for Bitcoin, 2,500 transactions in a 1MB block works out to roughly 400 bytes per tx.  So Bitcoin can support a pretty massive fund-transfer network even with a 1MB block limit.

Some would dismiss this as too centralized, but I would point out that direct access to FedWire is impossible for anyone without a banking charter.  Direct access to the blockchain simply requires payment of a fee and computing resources capable of running a node.  This means the blockchain will always remain far more open.

I think one modest change (which is unlikely to make anyone happy but would allow higher tx volume) would be to take it out of everyone's hands.  The block subsidy follows a specific, exact path for a reason: if it were open to human control, Bitcoin would likely be hyperinflated and nearly worthless today.  A proposal could be made for a hard fork to double (or increase by some other factor) the max size of a block on every subsidy cut.

This would allow for example (assuming avg tx is 400 bytes):
2012 - 1MB block =~ 360K daily transactions (131M annually)
2016 - 2MB block =~ 720K daily transactions (262M annually)
2020 - 4MB block =~ 1.44M daily transactions (525M annually)
2024 - 8MB block =~ 2.88M daily transactions (1B annually)
2028 - 16MB block =~ 5.76M daily transactions (2B annually)
2032 - 32MB block =~ 11.52M daily transactions (4B annually)

Moore's law should ensure that processing a 32MB block in 2032 is computationally less of a challenge than processing a 1MB block today.
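A quick sanity check of that table (assuming 400-byte transactions and 144 blocks per day, doubling at each subsidy halving):

Code:
AVG_TX_BYTES = 400
BLOCKS_PER_DAY = 144  # one block every 10 minutes

size_mb = 1
for year in (2012, 2016, 2020, 2024, 2028, 2032):
    tx_per_block = size_mb * 1_000_000 // AVG_TX_BYTES
    daily = tx_per_block * BLOCKS_PER_DAY  # e.g. 360,000/day for 1MB
    print(year, f"{size_mb}MB", f"{daily:,} tx/day",
          f"({daily * 365:,} tx/year)")
    size_mb *= 2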
legendary
Activity: 1064
Merit: 1001
February 06, 2013, 09:51:53 AM
if there is a "global optimal max size", it's quite pretentious to claim you can come up with the "optimal formula" to calculate it.

Definitely, but I was talking about an optimum strategy for prioritizing transactions, not an optimum choice of max block size. Typically the strategy will be either:

1) Include all known pending transactions with fees

or

2) Choose the pending transactions with the highest fees per kilobyte ratio and fill the block up to a certain size

There would be some variety, surely. In the blocks they produce themselves, miners will seek to optimize the ratio (time to propagate / revenue in fees), while for blocks they receive from other miners, they would rather they be as small as possible. These parameters are not the same for different miners, particularly the "time to propagate" one, as it strongly depends on how many connections you can keep established and on your bandwidth/network lag.

I don't understand this aspect of the network. Why do miners want smaller blocks from other miners? Do blocks take a long time to propagate? Are you saying that newly solved blocks are sent around on the same peer connections used to transmit messages, and that while a connection is being used to send a block (which can be large relative to the size of a transaction) it holds up the queue for individual tx?

If this is the case, perhaps an easier way to deal with the propagation of blocks is to have two overlays, one for tx and the other for blocks.

I think that if we reach the 1MB limit and don't upgrade with a solution, then spontaneous order will create fiat currencies backed by bitcoins, in order to reduce the number of transactions on the Bitcoin network.

I'm not so sure this is a bad thing. These ad-hoc "fiat" currencies may be created with unique properties that make them better suited to the task at hand than Bitcoin. For example, a private payment network that provides instant confirmation and requires no mining (relying on trust in a central authority).

Quote
So, this would also lead to less revenue for the miners (plus a loss of reputation for the Bitcoin network).

The average transaction fee per kilobyte is inversely proportional to the block size, so leaving the block size at 1MB will cause fees to increase once blocks are regularly full. The rate of increase in fees will be proportional to the growth in the number of transactions.

Miners would love it if all blocks had a one-transaction maximum; this would maximize fees (assuming people didn't leave the Bitcoin network due to the high fees).
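To be precise about "inversely proportional": under the simplifying assumption that users collectively bid a roughly fixed total amount of fees per block, the per-kilobyte clearing fee is that total divided by capacity, so halving capacity doubles the fee. A toy version:

Code:
def fee_per_kb(total_fee_demand_btc, capacity_kb):
    # Toy model (an assumption, not established fact): a fixed total fee
    # pool per block spread across whatever capacity exists.
    return total_fee_demand_btc / capacity_kb

print(fee_per_kb(0.5, 1024))  # ~0.00049 BTC/kB at a 1MB limit
print(fee_per_kb(0.5, 512))   # ~0.00098 BTC/kB -> doubles when capacity halves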

sr. member
Activity: 527
Merit: 250
February 06, 2013, 09:21:57 AM
Plus, if there is a "global optimal max size", it's quite pretentious to claim you can come up with the "optimal formula" to calculate it. Even if you could, individual peers would never have all the necessary data to feed to this formula, as it would have to take into consideration the hardware resources of all miners and the network as a whole. That's impracticable. Such a maximum size must be established via a decentralized/spontaneous order. It's pretty much like economic central planning versus free markets, actually.

I think that if we reach the 1MB limit and don't upgrade with a solution, then spontaneous order will create fiat currencies backed by bitcoins, in order to reduce the number of transactions on the Bitcoin network. So, this would also lead to less revenue for the miners (plus a loss of reputation for the Bitcoin network).

The block size hard limit is nothing but a protectionist policy.

Even if misterbigg's approach might not be the optimal solution, at least it's an idea.