
Topic: Size of BTC blockchain centuries from now...

sr. member
Activity: 462
Merit: 250
It's good to know that problems have solutions. Hopefully, Bitcoin software will keep up with the ever-increasing load.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Verification is the most difficult function of nodes. Every input of every transaction has to be matched with an output of a previous transaction. We need to look up that previous transaction in a database of unspent transactions. If the UTXO DB does not fit in RAM, lookups involve disk I/O, which is relatively slow. Over time, both transaction volume and the UTXO DB grow. Nodes will have to verify more transactions, and every verification will take more time on average.

Well, the good news is the UTXO set grows much more slowly.  For example, in the last 6 months the full blockchain has nearly doubled, yet the UTXO set has only grown from ~200MB to ~250MB.  This is because currency is eventually reused: new transactions require older outputs to be spent.  The one exception is "dust", where the economic value of the output is less than the cost to spend it.  About a quarter of the ~250MB UTXO set is dust, and most of that will never be spent; if it ever is, it is only because time and a rising exchange rate turned the dust into non-dust.  This is why the changes in 0.8.2 are so important.  By preventing worthless dust, they limit the portion of the UTXO set which is never spent.  If the majority of the UTXO set is regularly spent, its size will grow even more slowly.  The size of the UTXO set is related more to the number of unique users (with full control over their private keys) than to the number of transactions.

Also, the current method of caching the UTXO set is relatively inefficient.  The entire transaction is stored; however, since the inputs are already verified (which they are), the UTXO set could contain only the already-verified outputs, plus possibly the full transaction hash for lookup against the full db.  Outputs are much smaller than inputs, with an average output size of 34 bytes versus an average input size of 72 bytes.  This suggests a roughly 66% reduction in UTXO size is possible.  The vast majority of outputs are "standard" (OP_DUP OP_HASH160 OP_EQUALVERIFY OP_CHECKSIG); for the purposes of the UTXO set they could be simplified to a ledger entry like this:

TxHash - 32 bytes
OutIndex - 2 bytes
Amount - 8 bytes    
PubKeyHash - 20 bytes
Total: 62 bytes

The current UTXO set has ~1.1 million unspent outputs, so we are talking about ~68 MB of data (before db/disk overhead).   Obviously this isn't a pressing concern, but given that the UTXO set grows relatively linearly, available memory grows exponentially, and the current set can be reduced to ~68MB, the risk of the UTXO set spilling out of available memory and slowing tx validation is minimal.  The one threat was dust.  Since dust is almost never spent, it causes the UTXO set to grow much faster (the dust just keeps accumulating rather than cycling in and out of the UTXO set).  With 0.8.2 that threat is removed.
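The back-of-envelope numbers above can be checked in a few lines. The per-entry layout and the ~1.1 million output count are taken straight from the post; nothing here is measured:

```python
# Estimate of an output-only UTXO ledger using the per-entry layout
# proposed above. Field sizes and entry count are the post's figures.
TX_HASH = 32       # bytes
OUT_INDEX = 2
AMOUNT = 8
PUBKEY_HASH = 20

entry_size = TX_HASH + OUT_INDEX + AMOUNT + PUBKEY_HASH
unspent_outputs = 1_100_000

total_mb = entry_size * unspent_outputs / 1_000_000
print(entry_size)       # 62 bytes per entry
print(round(total_mb))  # 68 MB, before db/disk overhead
```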

This is a significant refactoring, so don't expect it to appear anytime soon.  Eventually I think you will see the blockchain stored in three different formats.

1) The full chain.  Nodes keeping the full chain (containing all spent tx back to the genesis block) will be archive nodes.  It isn't important for every node, or even most nodes, to be archive nodes, but we would want a robust number of them to keep this archival copy of the blockchain decentralized.

2) The pruned chain & memory pool.  Most nodes will likely only retain the pruned blockchain, replacing spent tx with placeholders in the blockchain.  The memory pool contains the set of unconfirmed transactions.

3) In-memory UTXO ledger.  Nodes will build this from the pruned database.  Since txs are already validated (or the tx and/or block would be rejected), it isn't necessary to store the entire tx in the UTXO set, just the output portion.  Obviously you can't "only" have the UTXO set, or fetch the output-only portion from other nodes, as it can't be independently validated, but this doesn't stop nodes from building their OWN output-only ledger from the pruned database.

As for CPU being a limitation, I don't see that anytime soon.  A single 2.0 GHz i7 core can perform 4,300 256-bit ECDSA signature validations per second.  Now that is just the ECDSA portion, but it is the bulk of the computation needed to validate a transaction.  Let's assume the Bitcoin implementation is no worse than half as fast and the average tx has two inputs; that gives a minimum of 2,150 tps.  I would caution that this is probably a massive understatement of what is possible, and it assumes only one core is in use, but take it as a worst-case scenario.  Catching up on 1 million transactions (roughly 28 hours of downtime @ 10 tps) would take about 8 minutes.  Remember, this is just the CPU "bottleneck" (or lack thereof).  For historical usage, the first x years at 10 tps are a rounding error (the blockchain today has ~3 million transactions, or about 3 days at a maxed-out 10 tps).  Bootstrapping from nothing, a node could validate (again, just the CPU bottleneck, assuming network and disk can keep up) 185 million transactions per day, or about 215 days' worth of 10 tps history per day of syncing.  So for the near future the CPU isn't really a bottleneck.
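The throughput arithmetic above can be reproduced directly. The 4,300 sig/sec figure, the halving for implementation overhead, and the single-core restriction are all the post's assumptions:

```python
# Worst-case single-core validation throughput, per the post's figures.
worst_case_tps = 4300 // 2             # 2,150 tps after halving for overhead

backlog = 1_000_000                    # ~28 hours offline at 10 tps
catch_up_min = backlog / worst_case_tps / 60
print(round(catch_up_min))             # ~8 minutes to catch up

per_day = worst_case_tps * 86_400
print(per_day)                         # 185,760,000 tx validated per day
print(round(per_day / (10 * 86_400)))  # each sync day covers ~215 days at 10 tps
```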

Now, at 1,000 tps it is a different story, which is why IMHO the first increase to the block limit should be a single manual upgrade (to say 5MB or 10MB) to provide time to come up with more robust handling of high transaction volumes.  Those who wish to jump to an unlimited block size underestimate the risk that, through negligence or malice, the blockchain grows so fast that while existing nodes can "keep up", it kills off the ability of new nodes to bootstrap. Given that today we are at only 0.4 tps, I just don't see the point of accepting the risks of going from 1MB to unlimited overnight.  A modest one-time increase (when necessary) would provide "breathing room" to come up with a more comprehensive solution.
sr. member
Activity: 462
Merit: 250
I do not see the actual problem.
Problems may not be obvious now, but we need to be ready to scale. Right now we have about 0.48 transactions per second in Bitcoin. That is a tiny volume compared to popular payment networks. If Bitcoin adoption continues at a good rate, it will not be long until we need to scale to something like 10 TPS.

computational problem
Verification is the most difficult function of nodes. Every input of every transaction has to be matched with an output of a previous transaction. We need to look up that previous transaction in a database of unspent transactions. If the UTXO DB does not fit in RAM, lookups involve disk I/O, which is relatively slow. Over time, both transaction volume and the UTXO DB grow. Nodes will have to verify more transactions, and every verification will take more time on average.

The main difficulty with all this is synchronization. If a node goes offline for some time, it will have to catch up. At 10 TPS, 28 hours offline results in a backlog of 1 million transactions. If you think that's not much, consider that the UTXO DB will be much larger than it is now and an average transaction will take more time to verify than it does now. During that time a big share of the computer's resources, and maybe downstream network bandwidth, will be occupied by Bitcoin.

Initial synchronization in its current form will simply be infeasible for most users. Nodes will have to start in SPV mode and download and verify the full transaction history in the background, again eating some share of computer resources.
legendary
Activity: 980
Merit: 1008
A limit that is always above the average actual block size at least prevents an attacker from creating one enormous block.
This is a fair point.

But perhaps it would be wiser to include a built-in limit of, say, a rolling median or average of the last 1000 blocks instead.

On the other hand that would just introduce added complexity. Perhaps it's a good idea to transition to that eventually, but start out by manually increasing the limit.
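The rolling-median idea can be sketched in a few lines. This is a minimal illustration only: the 1000-block window comes from the post, but the 2x multiplier and the 1 MB floor are my own placeholder values, not part of any actual proposal:

```python
from statistics import median

# Hypothetical adaptive block size limit: some multiple of the median
# size of the last 1000 blocks, never dropping below a fixed floor.
WINDOW = 1000
MULTIPLIER = 2          # illustrative headroom factor
FLOOR = 1_000_000       # never below the current 1 MB limit

def next_block_limit(recent_sizes):
    """Limit for the next block, derived from recent block sizes (bytes)."""
    window = recent_sizes[-WINDOW:]
    return max(FLOOR, MULTIPLIER * int(median(window)))

# With ~100 KB average blocks the limit stays at the 1 MB floor;
# if typical blocks grew to 800 KB it would rise to 1.6 MB on its own.
print(next_block_limit([100_000] * 1000))   # 1000000
print(next_block_limit([800_000] * 1000))   # 1600000
```

One appeal of a median over a mean is that a single attacker-sized enormous block barely moves it, which matches the concern quoted above.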
sr. member
Activity: 437
Merit: 255
I do not see the actual problem.

True, the blockchain has reached an impressive 10GB by now, and some BTC users might not want to handle that.

But ..

1. anybody is free to use a lightweight client

2. compression is part of any modern file or operating system

3. according to the statistics it is currently growing by 5GB every 6 months. As long as this is not a traffic or computational problem, it should not be a storage problem given current hard drive sizes.

Where the heck is the problem? Eventually there will be a group of miners, secondly a group of heavyweight-client users, and thirdly the crowd of lightweight-client users. Maybe from a security point of view the situation becomes serious if the number of blockchain users drops below a certain level, but I do not see that now.

I use a heavyweight client myself and still have no problem with the size.
sr. member
Activity: 252
Merit: 250
Still the Best 1973
I thought that was the only intent behind block size.
sr. member
Activity: 462
Merit: 250
A limit that is always above the average actual block size at least prevents an attacker from creating one enormous block.
legendary
Activity: 980
Merit: 1008
So one has to make the best compromise. Clearly, if the block size is too low (e.g. only ONE transaction per block, to take an extreme example), the network will become useless, people will lose interest, nobody will use it, and bitcoin dies. So in that sense we should be worried about Centralization of USERS.
But the current block size limit of 1 MB corresponds to ca. 1000 transactions per block (maybe 2000...), which is far from that point, as the current network already proves. So Centralization of USERS is not something to worry about.
One thing I don't get is the following:

You say that the current block size limit of 1 MB is not a problem, because we're only seeing ~100KB blocks on average. But how is that an argument? If we just raise the size limit every time there is demand for it, why even have it in the first place? What purpose does it serve if we're going to adjust it manually anyway whenever transaction volume starts pushing against the limit? A continually adjusted limit is not a limit.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
First, the "20 USD" that you state is very far-fetched, way too high in my impression...

The track record of banks charging as much as they can for wire transfers is shameful.  Between $15 and $45 is cited here:
http://en.wikipedia.org/wiki/Wire_transfer#Regulation_and_price
The standard fee for international wire transfers is generally $25. There is no reason to expect the profit motive to weaken for Bitcoin banks. Also, if people pay this much now, then blockchain fees could reach this level too.

The wire business should be bread and butter for Bitcoin to steal, but millions of such fiat transfers are done every day, well outside the current network capacity (even if SatoshiDice-like volume were zero, instead of 50% of current volume). Note that gamblers have a high tolerance to fees and Bitcoin may well prove to be a huge hit for the whole online gambling industry which could easily saturate the blocks by itself.

I have not yet advocated infinite blocks, just supported the idea of a flexible limit that might help a fee market develop through block-space rationing. Until recently Bitcoin operated with a block limit that was effectively infinite, because the number of transactions was far too small to trouble it. Then a soft limit of 250KB was tried (to see how the network behaved), and 500KB later. As soon as blocks approached the 250KB soft limit, it needed raising because users were experiencing unreasonable delays.
Fees are rising but probably due to client changes rather than block space rationing.
https://blockchain.info/charts/transaction-fees?showDataPoints=false&timespan=&show_header=true&daysAverageString=7&scale=0&address=

You make more points, but, have you read these threads?
The MAX_BLOCK_SIZE fork
https://bitcointalksearch.org/topic/the-maxblocksize-fork-140233
How a floating blocksize limit inevitably leads towards centralization
https://bitcointalksearch.org/topic/how-a-floating-blocksize-limit-inevitably-leads-towards-centralization-144895
Funding of network security with infinite block sizes
https://bitcointalksearch.org/topic/funding-of-network-security-with-infinite-block-sizes-157141

To be clear, I am all for decentralization as well. It is not monolithic: miners are one aspect, but non-mining peering/propagating full nodes are another, and the latter is hard to measure well. If there are 6,000 such full nodes, this seems much healthier than the mining situation, where one miner has 25% of the network hashing power: http://www.asicminercharts.com/, let alone that only a few pools hash most of the blocks.


This is a good part of the answer to block size concerns, and becoming more urgent.
sr. member
Activity: 462
Merit: 250
That is a project that promises to make secure verification of new transactions and blocks possible without having the full block chain stored locally. It is also planned to use additional channels (BitTorrent) for distributing old blocks to syncing nodes. Thus, the need for full nodes in their current form may be reduced. However, it will require some motivation for maintaining an additional block chain and distributing old data. Am I right?

runeks, I missed a point of your message.
legendary
Activity: 980
Merit: 1008
Even though their expenses increase (not significantly), they are still paid for the information they process. Additional gigabytes of required storage will not be a problem for miners. Most of a typical miner's expenses go to SHA2 bruteforce, which has no relation to the transaction volume.
Why would you say this?

My ancient 2.83 GHz Core 2 Quad can currently handle around 3000 transactions per second. If we assume an average transaction size of 250 bytes, that's a data rate of 730 KB/s, or 1.8 TB per month.

So your average old CPU can chug along for years without an upgrade, but you will have to buy a new 2 TB HDD every month.
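The arithmetic behind those figures, with the post's assumed ~250-byte average transaction size:

```python
# Data-rate arithmetic for a CPU-bound node: 3,000 tps at an assumed
# average transaction size of 250 bytes (the post's figure).
tps = 3000
tx_bytes = 250

rate = tps * tx_bytes                   # 750,000 bytes/s
print(round(rate / 1024))               # ~732 KB/s
per_month = rate * 86_400 * 30          # bytes in a 30-day month
print(round(per_month / 1024**4, 1))    # ~1.8 TB per month
```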
sr. member
Activity: 462
Merit: 250
In the other case (no block size limit), we have the opposite effect: short-term, USERS will continue to be able to use the blockchain at low tx fees due to low competition between transactions. So miners will have lower revenues, which means that many miners will switch off.
I guess that larger blocks would have more total fees, despite the lower average fee. Consider block chain transactions as a service: the cheaper the service, the greater the turnover, because more people pay the Bitcoin network and fewer pay other payment processors. Of course, that holds true only down to a certain extreme; a free service has zero turnover in terms of money. So, as long as fees are above some threshold, transaction volume is not a concern for miners. Even though their expenses increase (not significantly), they are still paid for the information they process. Additional gigabytes of required storage will not be a problem for miners. Most of a typical miner's expenses go to SHA2 bruteforce, which has no relation to the transaction volume.

Do we need a hard block size limit to keep fees above the profitability threshold? I think not. Miners are free to choose which transactions to include in their blocks. A reasonable miner will simply drop any transaction that isn't worth including. Unreasonable miners and attackers can produce blocks with lots of trash, but hopefully these will be too rare to affect other miners.

So, miners are unlikely to suffer from increased transaction volume. And who does suffer? Non-miner nodes do. They receive no compensation for their processing power and occupied storage. The block size limit is for the benefit of users who run their own nodes. Setting up new nodes becomes harder with time, and an increase in transaction volume makes it considerably worse. If a large number of nodes is important for the health of the Bitcoin network, we will need to motivate full-node operators somehow. Until we have a solution, we had better keep transaction volume low. By the way, some foresee that most full nodes will be run by miners and by businesses that profit from Bitcoin in some other way.
legendary
Activity: 1120
Merit: 1152
If anything, if you want to attack Peter Todd and friends on this basis you should be criticizing them for spending too much time promoting tools as solutions to scaling relative to the amount of time they've spent actually creating them.  Lots of yap yap, but if this stuff is to be the great salvation against uncomfortable tradeoffs between scale and decentralization... show us the code!

With enemies like you, who needs friends? Wink

Of course, I can respond in turn that advocacy rather than code is a reaction to being attacked for promoting my specific off-chain systems, like fidelity-bonded banks; the more people who realize this is a problem and work on solutions, the better. If they come up with solutions that are drastically different from what I've proposed, all the better, so long as they actually work. Personal attacks just make this inherently highly political argument even worse.

That's neither here nor there, as I don't think the argument is that there is currently a problem... I point it out because I don't think the discussion is served by factually weak arguments.

Yup. It's funny listening to the "but we're decentralized now!" argument, because that's exactly the conversation I had with the people at stonecanoe when I was making the video. Specifically, we thought it was hilarious that we were essentially making an advocacy video where the "call to action" for much of the target audience was "do nothing and everything will be ok".

SPV clients and big mining pools are a concern, but overall Bitcoin is fairly decentralized now. I want to keep it that way.
sr. member
Activity: 278
Merit: 251
Bitcoin-Note-and-Voucher-Printing-Empowerer
PS: Moreover, the long-term movement of transactions to off-blockchain systems is not a bad thing, I think, because these operators would serve the non-tech-savvy masses (who don't know how to create safe cold storage etc.), while the minority of tech-savvy people (like >95% in this forum) may continue using the "proper" block chain, at least to a reasonable extent.
Unfortunately Michael, that's the problem I have with a low, inflexible block limit (like 1MB), because it is fees which will determine who gets their transactions onto the blockchain. In the keepbitcoinfree vision it will only be the large 3rd-party banks and services which can afford the high fees (on the order of $20 per transaction at least). The >95% of this forum, the tech-savvy types, will be effectively banned from the blockchain. Then we can ask: "Who will run a Bitcoin node to maintain the decentralized network, when they can't afford to use the blockchain for everyday transactions?" Answer: no one. So the network becomes very centralized, because only the bitcoin banks and 3rd-party systems will run full nodes.
Solex, your arguments do not convince me at all.

First, the "20 USD" that you state is very far-fetched, way too high in my impression. If we consider that the "3rd party banks" (or "online wallet providers", as I would call them) mainly use the blockchain to transfer funds between them, just like in the fiat money system today private banks carry out electronic money transfers of real central bank money between each other, then the capacity of the blockchain would not even be close to exhausted from this sort of transaction. Private banks have to make such transaction today only once per day, so the "bandwidth" for such transactions is enormously low, and enough bandwidth would still be left for normal business bitcoin transactions.

But even if all business and private transactions were practically "banned" from the blockchain by these 20 USD fees, I would still see a big difference and advantage compared to today's fiat world: in today's world, private people and businesses have no access to electronic central bank money at all (only non-electronic "cash" is central bank money). The rest of today's electronic "money" is actually just liabilities of banks towards their customers. In the "new world", private people, businesses and banks would have equal access to "real" money (the "primary blockchain-bitcoins"), and no private person would need to put his life savings into bank accounts, with consequences such as we have recently seen in Cyprus.

Secondly, and more importantly, I would draw exactly the opposite conclusion from yours w.r.t. decentralization of miners (=bitcoin network nodes)! Think about it, please. You seem to confuse the concepts of bitcoin USERS and bitcoin MINERS, or at least not to separate the two. If transaction fees grow to an equivalent of 20 USD/transaction as you suggest, then this tends towards a centralization of USERS, i.e. mainly those who transfer large amounts of bitcoins (like online wallet service providers, private persons buying real estate, or big companies) will have the "privilege" of using the blockchain instead of off-blockchain transfers of liabilities (=equivalent to today's bank transfers). But this is of no relevance to the miners! From the MINER's point of view, there is a certain amount of traffic going on; the miner does not care who is generating this traffic, whether a bank or a private person. Thanks to the high TX fees, it is really profitable for many people to set up their own miner. And since the block size limit is relatively low, the bitcoin network is still scalable, i.e. HW expenses for setting up a miner are low, so many people will decide to set up miners --> we get the desired DEcentralization effect in terms of Miners=bitcoin network nodes.

In the other case (no block size limit), we have the opposite effect: short-term, USERS will continue to be able to use the blockchain at low tx fees due to low competition between transactions. So miners will have lower revenues, which means that many miners will switch off. As time proceeds, the blockchain grows quickly, meaning higher and higher storage requirements for miners. This is another cost factor that will cause even more miners to switch off. --> we get the UNdesired centralization effect in terms of Miners.

To summarize:
  (*) Limited Block Size promotes...
  • ...Decentralization of MINERS (due to higher TX fees and lower blockchain size, making mining more profitable), but
  • ...Centralization of USERS (due to higher TX fees, more users or services get incentive to go "off-blockchain")

  (*) Unlimited Block Size promotes...
  • ...Centralization of MINERS (due to exploding blockchain size=higher mining hardware costs and less profitable mining due to lower TX fees), but
  • ...Decentralization of USERS (due to lower TX fees more users will continue using the blockchain directly)

So one has to make the best compromise. Clearly, if the block size is too low (e.g. only ONE transaction per block, to take an extreme example), the network will become useless, people will lose interest, nobody will use it, and bitcoin dies. So in that sense we should be worried about Centralization of USERS.
But the current block size limit of 1 MB corresponds to ca. 1000 transactions per block (maybe 2000...), which is far from that point, as the current network already proves. So Centralization of USERS is not something to worry about.

Rather we should be worried about the Centralization of MINERS, which appears to be the main threat to the bitcoin network right now.

As long as tx fees are so low that services like SatoshiDice still operate over the blockchain, there is certainly no need to increase the block size limit. Once we approach the block size limit, SatoshiDice is "squeezed out", and we hit the limit again, people will start to be more careful about their transactions! I know it from myself: I used to make lots of "fun" and "test" transactions in the past, because they were so cheap. If tx costs rise from, let's say, 0.5 cents to 10 or 20 cents, I will certainly reduce my unnecessary transactions substantially (which certainly make up more than 90% of my transactions today) without really feeling that the bitcoin network has become less valuable to me. Other users will do the same. So there will be a natural evolution towards using transactions only where really needed, and this is good. There is absolutely nothing wrong with the vision that in the very far future (assuming bitcoin really does become the dominant currency in the world one day, which is far from certain) the vast majority of payments (like shopping etc.) are done off-blockchain via service providers. However, if we destroy the network first through miner scalability and decentralization problems, bitcoin will certainly never reach that point.


(update: PS: a 20 USD fee per transaction means ca. 30,000 USD in fees per block, ca. 4 million USD in fees per day, or ca. 1.5 billion USD per year. Just to get an idea. The current block reward (which currently dominates fees) is ca. 3,000 USD/block.)
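The PS arithmetic checks out. The ~1,500 tx/block figure is my reading of the "ca. 1000 ... maybe 2000" capacity mentioned earlier in this post, and 144 is simply the expected number of 10-minute blocks per day:

```python
# Fee-revenue arithmetic under the hypothetical $20/tx fee.
fee_per_tx = 20          # USD, the post's hypothetical fee
tx_per_block = 1500      # rough 1 MB block capacity (assumption)
blocks_per_day = 144     # one block every 10 minutes

per_block = fee_per_tx * tx_per_block
per_day = per_block * blocks_per_day
per_year = per_day * 365
print(per_block)                 # 30000 USD per block
print(per_day)                   # 4320000 -> ca. 4 million USD per day
print(round(per_year / 1e9, 2))  # 1.58 -> ca. 1.5 billion USD per year
```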
staff
Activity: 4242
Merit: 8672
Withdrawn.
I made the comment out of frustration that the debate on this crucial issue was moved from a civilized thread to a YouTube video. I respect Peter's intellect, yet remain completely puzzled by the naive viewpoint the video presents. He makes no attempt to measure decentralization, yet this is crucial to its message. It is only pieces of information, such as what you posted just now, which aid in building an overall picture within which decisions on the block size can be made.
Thank you!  And indeed. Lots more information is required all around. Some will be easier to get than others.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
further a personal agenda which is to be the renowned architect of the 3rd-party systems
As far as I can tell this is a slanderous personal attack. It is unsupported by any information available to me. I must insist that you actually substantiate it or withdraw it.

Withdrawn.
I made the comment out of frustration that the debate on this crucial issue was moved from a civilized thread to a YouTube video. I respect Peter's intellect, yet remain completely puzzled by the naive viewpoint the video presents. He makes no attempt to measure decentralization, yet this is crucial to its message. It is only pieces of information, such as what you posted just now, which aid in building an overall picture within which decisions on the block size can be made.


staff
Activity: 4242
Merit: 8672
further a personal agenda which is to be the renowned architect of the 3rd-party systems
As far as I can tell this is a slanderous personal attack. It is unsupported by any information available to me. I must insist that you actually substantiate it or withdraw it.

Seriously, this is very uncool. If you don't believe your arguments are strong enough to stand on their own without hypothesizing evil on the part of your counterparties, then you should keep your views to yourself.  If people are attacked in this manner we _cannot_ have a civil conversation on this stuff.  No one but pure politicians will be willing to earnestly debate the subject when there exists a threat of character attacks like this.

If anything, if you want to attack Peter Todd and friends on this basis you should be criticizing them for spending too much time promoting tools as solutions to scaling relative to the amount of time they've spent actually creating them.  Lots of yap yap, but if this stuff is to be the great salvation against uncomfortable tradeoffs between scale and decentralization... show us the code!

Quote
The essence of the video is that decentralization is at risk. Evidence is showing otherwise:
Bitcoin has a record number of active nodes, 350,000+, and this is increasing even as average block size is increasing.
http://bitnodes.io/
The number of stable listening nodes is constant-to-decreasing based on the actual data available (the data collected by the DNS seed collectors).  I can only imagine that they're getting 350,000 by counting up addr messages, which is a methodology that doesn't work and gives gibberish results.  Their data from actually connecting to nodes shows numbers like 6,000, which sounds more realistic.  I certainly wish the big numbers were true, but I don't think they are.

That's neither here nor there, as I don't think the argument is that there is currently a problem... I point it out because I don't think the discussion is served by factually weak arguments.

That said, I personally think there currently are decentralization problems. For one, bitcoin.org will soon no longer recommend bitcoin-qt: there is tremendous pressure to only recommend SPV clients, and the new site did so when launched, only stopping because multibit was constantly locking up at the time and had many other issues.  Blockchain sync stats appear to suggest, by the upward slope for the oldest blocks, that a significant fraction of all nodes that attempt syncing never finish.   Those issues are indirect, but when you consider that compromising _two_ computers (or their operators) is currently enough to control >>50% of the hash power, you simply cannot say that we don't currently have severe decentralization problems.

Though it's not clear to me to what degree this is historical accident vs scaling induced, we'll know more once some new tools for improving mining decentralization come out in the not too distant future.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
PS: Moreover, the long-term movement of transactions to off-blockchain systems is not a bad thing, I think, because these operators would serve the non-tech-savvy masses (who don't know how to create safe cold storage etc.), while the minority of tech-savvy people (like >95% in this forum) may continue using the "proper" block chain, at least to a reasonable extent.

Unfortunately Michael, that's the problem I have with a low, inflexible block limit (like 1MB), because it is fees which will determine who gets their transactions onto the blockchain. In the keepbitcoinfree vision it will only be the large 3rd-party banks and services which can afford the high fees (on the order of $20 per transaction at least). The >95% of this forum, the tech-savvy types, will be effectively banned from the blockchain. Then we can ask: "Who will run a Bitcoin node to maintain the decentralized network, when they can't afford to use the blockchain for everyday transactions?" Answer: no one. So the network becomes very centralized, because only the bitcoin banks and 3rd-party systems will run full nodes.
sr. member
Activity: 278
Merit: 251
Bitcoin-Note-and-Voucher-Printing-Empowerer
Hmm, maybe the optimum block size limit is not 1 MB, but "infinity" is certainly the worst choice of all. It is also clear that at some point it will have to become limited... I actually came to this conclusion myself, without a personal agenda, before I saw http://keepbitcoinfree.org, so it seemed quite logical to me.
Would you care to argue for your position, instead of just stating it?

It's not at all obvious to me why no block size limit is "certainly the worst choice of all". Nor is it clear to me "that at some point it will have to become limited".
First of all, a lot of arguments (good arguments, I think) are given at http://keepbitcoinfree.org. It is easy to read and understand, so there is no point in reiterating the arguments here. Everybody can read it and think about it for oneself.

In my own words: in essence, if the block size is unlimited, everyone can arbitrarily spam the blockchain at no risk and negligible cost, driving up the cost of operating a node of the bitcoin network and dramatically accelerating the growth of the blockchain. As a result, only big players will remain as miners, creating a strong drive towards monopolizing the bitcoin network. This is the worst that could happen. Proponents of an infinite or very large block size limit either do not see this, or have an interest in destroying bitcoin.

So I think an unlimited block size (or a too-high limit) looks attractive short-term, but is not sustainable (i.e. not viable long-term), and since this can be seen well in advance, mid-term adoption of bitcoin by the people is also at risk, because people would see that bitcoin is at risk long-term.


PS: Moreover, the long-term movement of transactions to off-blockchain systems is not a bad thing, I think, because these operators would serve the non-tech-savvy masses (who don't know how to create safe cold storage etc.), while the minority of tech-savvy people (like >95% in this forum) may continue using the "proper" block chain, at least to a reasonable extent.