
Topic: Share your ideas on what to replace the 1 MB block size limit with - page 3. (Read 7003 times)

hero member
Activity: 602
Merit: 500
In math we trust.
Well, I'm not a bitcoin expert, but here's the idea.

I think the limit should be removed to allow mass adoption of the network.
Here's an interesting idea.

Bitcoin devs should:
Set a date, like January 1, 2015, and declare that a new client with no limit will be released on that date.
Also, make every client released from now on have a timestamp-based limit (i.e., the limit changes automatically depending on the date).
As the clock hits 1/1/2015, hopefully more than 50% of users will have updated to timestamp-based clients.

Well, I hope you get the point.
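The scheme above could be sketched roughly as follows. This is only an illustration of the idea, not real Bitcoin code: the switch date comes from the post, while the function name and the choice to return None for "no limit" are my own assumptions.

```python
import datetime
from typing import Optional

LEGACY_LIMIT = 1_000_000               # current 1 MB limit, in bytes
SWITCH_DATE = datetime.date(2015, 1, 1)  # activation date proposed in the post

def max_block_size(block_date: datetime.date) -> Optional[int]:
    """Return the block size limit in effect on a given date.

    Before the switch date, the legacy 1 MB cap applies; from the
    switch date onward, this sketch removes the cap entirely
    (returns None), as the post proposes.
    """
    if block_date < SWITCH_DATE:
        return LEGACY_LIMIT
    return None  # no protocol-level limit after the switch
```

Because every client computes the limit from the block's date, nodes that upgrade early still enforce the old cap until the agreed moment, which is the point of the proposal.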
legendary
Activity: 2282
Merit: 1050
Monero Core Team
My suggestion, to get this discussion started, is a dynamic block size limit that is based on the following two parameters:

1) The median size of a set of previous blocks. A good example is the model used by the CryptoNote coins, for example Monero (XMR): https://cryptonote.org/inside.php.
2) We could also require that the block size limit may only increase while the difficulty is rising, and may only decrease while the difficulty is falling.
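A minimal sketch of how those two parameters might combine. The window size, the 2x multiplier over the median, and the 1 MB floor are all illustrative choices of mine, not values from CryptoNote or the post:

```python
from statistics import median

WINDOW = 100       # number of recent blocks to look at (illustrative)
FLOOR = 1_000_000  # never shrink below the current 1 MB limit (illustrative)

def next_block_size_limit(recent_sizes, prev_limit,
                          prev_difficulty, curr_difficulty):
    """Dynamic limit: twice the median of recent block sizes, but only
    allowed to move in the same direction as the difficulty.

    recent_sizes: sizes in bytes of the last WINDOW blocks
    """
    target = max(FLOOR, 2 * int(median(recent_sizes[-WINDOW:])))
    if target > prev_limit and curr_difficulty > prev_difficulty:
        return target        # grow only while difficulty is rising
    if target < prev_limit and curr_difficulty < prev_difficulty:
        return target        # shrink only while difficulty is falling
    return prev_limit        # otherwise hold steady
```

The difficulty gate is what ties limit growth to real investment in the network: a spammer filling blocks can push the median up, but cannot raise the limit unless hashing power is rising at the same time.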
hero member
Activity: 772
Merit: 501
I think the biggest obstacle to mass adoption of Bitcoin is the 1 MB block size limit.

It's a major source of uncertainty for the Bitcoin economy because, on the one hand:

Bitcoin can't achieve mass adoption and global currency status until the limit is lifted, because 7 transactions per second (the transaction rate at which blocks reach the 1 MB size limit) is minuscule for any global payment network or currency

and on the other hand, a change in the protocol to lift the 1 MB block size limit portends many risks, the biggest two of which are:

  • a split in the community leading to the Bitcoin blockchain being forked
  • poor bloat control leading to garbage being dumped into the blockchain by malicious actors, making it too costly to run a full node for all but the largest players

I think we should have more discussion about potential replacements for the current block size limit, in order to get us closer to a solution.

Some might argue that we should wait until we are closer to the 1 MB block limit before discussing it, but consider that from May 2012 to May 2013, Bitcoin's transaction volume increased almost 10 fold.

If we see similar growth in transaction volume, we would reach the block limit in well under a year (the average block size is currently around 240 KB, meaning volume can grow roughly 4-fold before hitting the limit). And then what happens? The uncertainty hangs over future Bitcoin development.
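As a rough sanity check on that timeline (assuming volume keeps compounding at the same constant annual rate, which is a strong assumption):

```python
import math

annual_growth = 10                # volume multiplied 10-fold per year (from the post)
headroom = 1_000_000 / 240_000    # ~4.17x between ~240 KB blocks and the 1 MB cap

# Under constant exponential growth, solve annual_growth ** t == headroom for t (years).
years_to_limit = math.log(headroom) / math.log(annual_growth)
months_to_limit = 12 * years_to_limit   # roughly 7-8 months of headroom
```

The exact figure depends entirely on whether past growth continues, but the point stands: the limit is within reach on a timescale of months, not years.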

For my part, I think the best solution is a two-part one.

For the first part, we should eliminate the block size limit altogether, as Gavin Andresen and Mike Hearn advocate. If a miner creates a block that is too big, the other miners will simply reject it. This would not be a protocol-level rule, but it would be enforced as if it were, because any miner whose default block size limit is not accepted by at least 50% of the network hashing rate will eventually see all of their blocks and block rewards orphaned, and so would have an incentive to conform to the most common limit.

For the second part, miners should adopt a rule whereby their block size limit tracks the difficulty. This is a simple construction that will allow Bitcoin to scale as the economic value of the network increases. It's not perfect, but then no solution is, and between imperfect solutions, simpler ones are better.
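The post leaves the exact relationship between difficulty and the limit open. As one hypothetical reading, the limit could scale linearly with difficulty relative to a fixed baseline; the baseline difficulty and the linear form below are my assumptions, not part of the proposal:

```python
BASE_LIMIT = 1_000_000         # 1 MB starting point (illustrative)
BASE_DIFFICULTY = 50_000_000   # difficulty when the rule activates (hypothetical)

def difficulty_tracking_limit(current_difficulty):
    """Miner-policy block size limit that scales linearly with
    difficulty relative to a fixed baseline, never dropping below
    the original 1 MB. Linear scaling is one possible choice; the
    post does not specify the function.
    """
    scale = current_difficulty / BASE_DIFFICULTY
    return int(BASE_LIMIT * max(1.0, scale))  # never below 1 MB
```

Since difficulty tracks the resources miners are willing to spend, which in turn tracks the economic value of the network, this gives a limit that grows with adoption without requiring further protocol changes.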

If you have an idea on what to replace the 1 MB block size limit with, please post it here.

Edit:

Gmaxwell makes some great points, which I'll include in the OP for visibility:

Imagine— you want your message to be read by dozens or hundreds of people— consuming a few minutes of their valuable time each. It makes sense to spend quite a few minutes making sure you are well informed first, considering how much of other people's time your message will consume.

In particular, I think it's especially unhelpful when people make posts which make it clear that they don't understand that there isn't a free lunch here. In particular, I think any productive post will have been made understanding the following points:

  • Blocksize has a trade-off with decentralization. If verifying the blockchain is made expensive (relative to hardware and bandwidth costs), then past some limit Bitcoin becomes a centralized system where everyone is economically forced to trust some consortium of large miners— which are themselves more efficient if centralized since they can just verify once, instead of verifying for themselves. (If the economic majority is trusting and not verifying, you need to also do the same so you don't end up split from the other users of the system.)
  • Bitcoin isn't secure unless there is income to pay to apply computation to the honest chain (and thus far the alternatives appear not clearly workable), we argue that once the subsidy is gone transaction fees will support the security. But the existence of a market for transaction fees requires a degree of scarcity to make the rational price non-zero and to encourage efficient use. Just like Bitcoin itself wouldn't be valuable if everyone had access to infinite Bitcoins, our incentives require a degree of scarcity of blockspace.
  • Bitcoin currency throughput can be increased to arbitrary levels without increasing blocksizes, especially if you're willing to make decentralization tradeoffs. Importantly, handling high volume transactions in other ways than expressing each and every one in the global bitcoin ledger can help avoid pulling down the available security for all transactions just because a large volume of low value transactions need the throughput and can accept the lower security. Work in this space has been under-developed, but I'm not aware of anyone disagreeing with the broad possibilities here. Because of the lack of need until now it's only recently become possible to raise substantial funding for work in this space.

None of this is to say that it's not an interesting subject to discuss (though it has been discussed in depth before), but it's at least my view that posts which are unaware of these points are unlikely to be productive. If you don't understand what I'm saying in these points, you need to read up more (or even feel free to contact me in PM to talk them through one on one before taking the stage yourself).

The Bitcoin system exists in a careful and somewhat subtle balance between two extremes: one where it is too costly to transact in, thus not valuable— or one where it is too costly to verify and so it offers little to no trustlessness advantage over traditional systems (which have a much more efficient and scalable design, made possible in part because they are not attempting to be trustless). Like most engineering tradeoff discussions every choice has ups and downs.

Also, you can review some previous discussions on the 1 MB block size limit in these links:

https://bitcointalksearch.org/topic/block-size-limit-automatic-adjustment-1865  Block size limit automatic adjustment (one of the earliest discussions on it, from 11/2010)

https://bitcointalksearch.org/topic/the-maxblocksize-fork-140233  The MAX_BLOCK_SIZE fork

https://bitcointalksearch.org/topic/how-a-floating-blocksize-limit-inevitably-leads-towards-centralization-144895  How a floating blocksize limit inevitably leads towards centralization

https://bitcointalksearch.org/topic/max-block-size-and-transaction-fees-96097  Max block size and transaction fees