
Topic: What is the optimal block size?

legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
May 09, 2015, 02:01:23 PM
#4
Although I am not really against increasing the block size, I think that Bitcoin will always be limited by its confirmation time, which is suitable for large txs but not small ones.

It seems that quite a lot of people think that we should only have one blockchain (the Bitcoin one) and to me that has never made much sense (unless you are heavily invested in it).

To be decentralised means to not have a single point of failure. If the world was to rely upon Bitcoin as its only blockchain then it would have a "single point of failure".

Although I also acknowledge that most "alt coins" offer little innovation I still think that the future will be many blockchains that can work together (this is why I invented Automated Transactions or AT which is a blockchain "agnostic" smart contract system).
legendary
Activity: 1792
Merit: 1111
May 09, 2015, 01:53:37 PM
#3
Quote from: #2
I am starting to believe that we need some fee pressure to limit block size and to push away transactions that shouldn't be on the blockchain. A fee-driven dynamic maxblocksize would really make sense.

A Reddit post from Adam Back is quite insightful on that matter:
http://www.reddit.com/r/Bitcoin/comments/354qbm/bitcoin_devs_do_not_have_consensus_on_blocksize/cr15oke

A tweet from Jeff Garzik is also going in this direction: https://twitter.com/jgarzik/status/595606257936957440

I also recommend "The Economics of Bitcoin Transaction Fees" by Nicolas Houy: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2400519



I never argued for a limitless block size. My point is that the current 1MB block size is not enough as a settlement system even if we have all the Level-2 solutions implemented.
member
Activity: 554
Merit: 11
CurioInvest [IEO Live]
May 09, 2015, 01:38:54 PM
#2
I am starting to believe that we need some fee pressure to limit block size and to push away transactions that shouldn't be on the blockchain. A fee-driven dynamic maxblocksize would really make sense.

A Reddit post from Adam Back is quite insightful on that matter:
http://www.reddit.com/r/Bitcoin/comments/354qbm/bitcoin_devs_do_not_have_consensus_on_blocksize/cr15oke

A tweet from Jeff Garzik is also going in this direction: https://twitter.com/jgarzik/status/595606257936957440

I also recommend "The Economics of Bitcoin Transaction Fees" by Nicolas Houy: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2400519

legendary
Activity: 1792
Merit: 1111
May 08, 2015, 11:02:10 AM
#1
Let's have some thought experiments.

m = Max Block Size

--------------------------
The very minimal size of a bitcoin block would be around 250 bytes, which would allow nothing but a standard coinbase transaction. No one would agree to have m=250B, because a Bitcoin like this could only be a mechanism to timestamp a hash.

How about m=500B? That would allow one real transaction in each block, in addition to the reward. Maybe it would be enough to allow transactions between 2 or 3 users. But if there were only 2 or 3 users, we wouldn't even have a Byzantine Generals problem; mining would just be a waste of resources.

Similarly, no one would agree with m=1kB, 2kB, or 10kB. Bitcoin in such a form might support transactions within a very small group, but it would have no practical utility as a global payment network.
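The ladder of tiny m values above can be sketched numerically. The ~250 bytes per simple transaction and the 600-second block interval below are my own working assumptions, not exact protocol constants:

```python
# Back-of-envelope capacity for various max block sizes m.
# Assumed figures: ~250 B per simple tx, ~250 B coinbase, 600 s blocks.
TX_SIZE = 250          # bytes, rough size of a simple transaction
COINBASE_SIZE = 250    # bytes, roughly one coinbase transaction
BLOCK_INTERVAL = 600   # target seconds between blocks

def capacity(m):
    """Return (transactions per block, tps) for a max block size of m bytes."""
    txs = max(0, (m - COINBASE_SIZE) // TX_SIZE)
    return txs, txs / BLOCK_INTERVAL

for m in (250, 500, 1_000, 10_000, 1_000_000):
    txs, tps = capacity(m)
    print(f"m = {m:>9,} B -> {txs:>5,} tx/block, {tps:.3f} tps")
```

Even at m = 1MB this yields only about 7 tps, which is the ceiling the rest of the thread argues about.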

Conclusion 1: there must exist a size x such that, when m < x, the Bitcoin system has no practical value and people would use something else. Different people may have a different value of x in mind, but there should be a reasonable range. For example, Satoshi thought that x would be <= 1MB for a certain period of time, so he set the cap at 1MB.
-------------------------

On the other hand, we could have a Bitcoin network recording all transactions of all creatures in this galaxy, with m = 1 googolplex bytes. However, only an Inter-galactic Payment Network Federation would be able to run it, exploiting the Hawking radiation from the central black hole of the Andromeda Galaxy.

By reducing m, and thus limiting tps, running a full node becomes more affordable.

Conclusion 2: there must exist a size y such that, when m > y, the level of centralization makes Bitcoin no better (or even worse) than a centralized payment system. Again, different people may have a different value of y in mind, but there should be a reasonable range. For example, Satoshi thought that y was >= 1MB, so he set the cap at 1MB.
------------------------

Conclusion 3: x will increase as the number of users increases, since the number of txs, and thus the demand for block space, increases.

------------------------

Conclusion 4: y will increase as time passes, with time as a proxy for technological advance. y might have been 10kB in 1989 and 1MB in 2009.

------------------------

Conclusion 5: if x > y, there is no possible value for m, and Bitcoin is nothing but an academic experiment. That's why Bitcoin was not possible in 1989 but was created in 2009. Satoshi is a genius, but timing is equally important.

------------------------

So the question is: what are the values of x and y in the coming few years?

I think it's easier to determine y. I think no one would consider $1/month unaffordable. As Gavin# suggests that $5/month could run a pruning^ full node with 20MB blocks, $1 should be enough for 4MB.* And if we allow it to be $10/month, which I think many Bitcoin enthusiasts are willing to pay, y is 40MB.

However, for the sake of consensus, let's say the very minimum value of y is 4MB at this moment. Anyone disagree?
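The cost arithmetic behind the y estimate can be sketched under the (loose) assumption that node cost scales linearly with block size; the $5/month for 20MB figure is Gavin's, the rest is my extrapolation:

```python
# Sketch of the y estimate, assuming node cost scales linearly with
# max block size (an assumption, not an established fact).
# Gavin's figure: $5/month runs a pruning full node with 20MB blocks.
COST_PER_MB = 5.0 / 20          # dollars per month per MB of block size

def affordable_block_size(monthly_budget):
    """Max block size (in MB) a given monthly budget supports."""
    return monthly_budget / COST_PER_MB

print(affordable_block_size(1))    # $1/month  -> 4.0 MB
print(affordable_block_size(10))   # $10/month -> 40.0 MB
```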

-----------------------

Determining x is more difficult. It is a function of the number of users, while the number of users is a function of transaction cost, and transaction cost is a function of the maximum block size. So it is sort of a loop.

x is also a function of the available off-chain capacity. If we had a good off-chain system, x would be smaller.

Without being too aggressive: if we want to handle 50% of Paypal, which processes 11.6 million tx/day or 134tps, that will be roughly 70tps, which means 10MB##.

I'd say 50% of Paypal is really humble. Even if we have all the alternative scalability solutions implemented, we still need enough room for on-chain txs.

So I would say x is 10MB, at least for a few years. Anyone disagree?
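The 10MB figure follows from simple arithmetic; a sketch, again assuming ~250 bytes per transaction (my assumption):

```python
# Block size needed to handle 50% of Paypal's volume.
PAYPAL_TX_PER_DAY = 11_600_000
TX_SIZE = 250            # bytes per transaction (assumed)
BLOCK_INTERVAL = 600     # seconds between blocks on average

paypal_tps = PAYPAL_TX_PER_DAY / 86_400      # seconds per day -> ~134 tps
target_tps = paypal_tps / 2                  # 50% of Paypal -> ~67 tps
block_bytes = target_tps * TX_SIZE * BLOCK_INTERVAL

print(f"{paypal_tps:.0f} tps total, {target_tps:.0f} tps target")
print(f"needed block size: {block_bytes / 1e6:.1f} MB")  # ~10 MB
```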

----------------------

WAIT! So x > y and Bitcoin is a joke? Well, we are still far away from 50% of Paypal. Actually, the 1MB cap still works well at this moment, with no one complaining about confirmation delays. This suggests the current x is smaller than 1MB. In this regard, Satoshi made a good choice.

---------------------

The final questions:

Should we schedule a block size increase in 2016? My answer is yes, because we can see that x is increasing and will reach 1MB soon, while we are still well below y (4MB). As the hardfork process will take many months to complete, it would be too late to initiate the increase after x goes beyond 1MB.^^

What is the new size? I will support any proposal between 1 and 20MB, but I don't think it should be smaller than 4MB, the y estimated above. Actually, there is no point in making it between 1 and 4MB, because the "pain" of running a full node in this range would be similar.


--------------------
#http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized
^ Well, you may argue that a REAL full node should not prune, but a pruning node still monitors the network for any malicious miner activity and could warn all SPV clients.
* Of course you can't buy a VPS for $1, but as the machine is not fully utilized you may do other things with it.
## Don't forget that 10MB is actually not enough for 70tps, as blocks are created at random intervals.
^^ We did not have a problem when blocks hit the 250kB soft-limit in 2013, simply because that was a soft limit and miners could easily respond to it.
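Footnote ## can be illustrated with a toy simulation: block intervals are roughly exponential, and 10MB at ~250B/tx gives only about 66.7 tps of average capacity, so feeding it 70 tps builds an ever-growing backlog. All figures here are illustrative assumptions, not measurements:

```python
import random

# Toy illustration: blocks arrive as a Poisson process (exponential
# intervals), so a 10MB chain fed 70 tps -- above its ~66.7 tps
# average capacity at ~250 B/tx -- accumulates an unbounded backlog.
random.seed(1)

TPS_IN = 70              # incoming transactions per second (assumed)
BLOCK_CAPACITY = 40_000  # txs per 10MB block at ~250 B/tx
MEAN_INTERVAL = 600      # seconds; actual intervals are exponential

backlog, worst = 0.0, 0.0
for _ in range(10_000):                   # simulate 10,000 blocks
    interval = random.expovariate(1 / MEAN_INTERVAL)
    backlog += TPS_IN * interval          # txs arriving in the meantime
    backlog = max(0.0, backlog - BLOCK_CAPACITY)
    worst = max(worst, backlog)

print(f"worst backlog over 10,000 blocks: {worst:,.0f} txs")
```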