
Topic: Permanently keeping the 1MB (anti-spam) restriction is a great idea ... - page 15. (Read 105082 times)

donator
Activity: 1617
Merit: 1012
Heck, were the cap completely removed, and some major pools concerned about spam (aren't we all?) stated that, for their own values of X, Y and Z, they'd not relay blocks larger than (say) 500KB that pay total fees of less than X satoshis per kilobyte, and would not even build on blocks paying fees of less than Y per kilobyte unless those blocks had managed to become Z blocks deep, it would have a huge deterrent effect, making it expensive to try to spam the network.  Not many people are willing to risk 25 BTC to make a point, never mind be willing to do so repeatedly.  X, Y and Z wouldn't need to be uniform across pools, and of course could change with time and technology.  An equilibrium would be found and blocks would achieve a natural growth rate that no central planner can properly plan.

Yes, it makes sense to remove/raise the hard limit from the protocol and let individual miners set their own limits since they are the ones most in touch with what the optimal parameters would be for their individual setups. If we went to a 20MB cap tomorrow I'd guess that no pool would even try to build blocks anywhere near that size given the high risk of getting an orphaned block with the current state of the network.
donator
Activity: 668
Merit: 500
For the entirety of Bitcoin's history, it has produced blocks smaller than the protocol limit.

Why didn't the average size of blocks shoot up to 1 MB and stay there the instant Satoshi added a block size limit to the protocol?

I'm not sure what you're getting at. Clearly there just hasn't been demand for 1 MB worth of transactions per block thus far, but that could change relatively soon, hence the debate over lifting the 1 MB cap before we get to that point. If the block limit were suddenly to drop to 50kb, I think we'd start seeing a whole lot of 50kb blocks, no?
Justus is, I believe, pointing out that until very recently bitcoin has effectively had no block size limit, as blocks near the protocol limit were almost non-existent.  More recently we tend to get a few a day, mostly from F2Pool.

Those claiming we'll have massive runaway blocks full of one satoshi / free transactions have never adequately explained why it wasn't true historically when the average block size was 70k, and why people still felt the need to pay fees then.

Anyone trying to send free / very low fee transactions recently will know from having it backfire that they have to think long and hard about taking the risk if they want confirmation in a reasonable time, and that's the way it should be and likely always will be.   Each incremental transaction increases miner risk, and therefore has a cost, and that's natural and good, and enough for an equilibrium to be found.

Heck, were the cap completely removed, and some major pools concerned about spam (aren't we all?) stated that, for their own values of X, Y and Z, they'd not relay blocks larger than (say) 500KB that pay total fees of less than X satoshis per kilobyte, and would not even build on blocks paying fees of less than Y per kilobyte unless those blocks had managed to become Z blocks deep, it would have a huge deterrent effect, making it expensive to try to spam the network.  Not many people are willing to risk 25 BTC to make a point, never mind be willing to do so repeatedly.  X, Y and Z wouldn't need to be uniform across pools, and of course could change with time and technology.  An equilibrium would be found and blocks would achieve a natural growth rate that no central planner can properly plan.
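A sketch of how one pool's version of that policy might look, with placeholder thresholds (X, Y, Z and the helper names here are invented for illustration, not anyone's actual policy):

```python
# Per-pool relay/build policy sketch for the X, Y, Z idea above.
# Each pool picks its own thresholds; the values here are placeholders.
X = 1000   # min satoshis/kB to relay a block larger than 500 kB
Y = 500    # min satoshis/kB to build on a block at all...
Z = 6      # ...unless it is already buried this many blocks deep

def fee_density(block):
    """Total fees per kilobyte of block data, in satoshis."""
    return block["total_fees"] / (block["size"] / 1000)

def should_relay(block):
    # Small blocks always relay; large ones must pay their way.
    if block["size"] <= 500_000:
        return True
    return fee_density(block) >= X

def should_build_on(block, depth):
    # Low-fee blocks are ignored until they are Z confirmations deep.
    return fee_density(block) >= Y or depth >= Z
```

A spammy 600KB block paying 100 sat/kB would not be relayed and would not be built on until six blocks deep, putting its 25 BTC reward at real risk of orphaning.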
legendary
Activity: 3990
Merit: 1385
How about keeping a main blockchain, housed in several thousand, major repositories around the world, with mirrors and backups. Everything would go into the main blockchain.

There would be a smaller, secondary, "practical" everyday blockchain that would house the first record of every address and the last two, eliminating every use of an address between the first and the second-to-last.

This way we would always have the full record when needed, but we would have easy access for everyday use.

hero member
Activity: 772
Merit: 501
So people who oppose the change are basically saying, BTC transactions have to be severely limited (at around 7 TPS) forever, just to stick to the 1MB limit forever. That is nuts.

It actually could be as few as 2 tps, meaning that if there are over 60 million Bitcoin users, each one can use the blockchain only about once per year, and with a billion users (1 out of 7 people in the world), each person can use the blockchain less than once a decade:

The numbers below are for 2tps.  Double the numbers if you think 4tps is more appropriate but it doesn't materially change the insignificant upper limit.

Code:
Maximum supported users based on transaction frequency.
Assumptions: 1MB block, 821 bytes per txn
Throughput:  2.03 tps, 64,000,000 transactions annually

Total #         Transactions per   Transaction
direct users     user annually     frequency
       <8,000       8760           Once an hour
      178,000        365           Once a day
      500,000        128           A few (2.4) times a week
    1,200,000         52           Once a week
    2,600,000         24           Twice a month
    5,300,000         12           Once a month
   16,000,000          4           Once a quarter
   64,000,000          1           Once a year
  200,000,000          0.3         Less than once every few years
1,000,000,000          0.06        Less than once a decade
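The table can be double-checked with a short script using the same assumptions (1MB blocks, 821 bytes per transaction, one block per ten minutes):

```python
# Reproduce the capacity figures above from the stated assumptions.
BLOCK_SIZE = 1_000_000          # bytes per block (1 MB cap)
AVG_TXN_SIZE = 821              # assumed average bytes per transaction
BLOCK_INTERVAL = 600            # seconds between blocks (~10 minutes)

tps = BLOCK_SIZE / AVG_TXN_SIZE / BLOCK_INTERVAL
txns_per_year = tps * 60 * 60 * 24 * 365

print(f"{tps:.2f} tps, {txns_per_year:,.0f} transactions annually")

# Maximum direct users for a given per-user transaction frequency:
for txns_per_user, label in [(8760, "once an hour"), (365, "once a day"),
                             (52, "once a week"), (12, "once a month"),
                             (1, "once a year")]:
    print(f"{txns_per_year / txns_per_user:>13,.0f} users at {label}")
```

This lands on roughly 2.03 tps and ~64 million transactions per year, matching the table; double everything for 4 tps and the ceiling is still tiny.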

This is totally unrealistic, and it's not going to work. Running a block-space scarcity experiment on Bitcoin to see what happens when people can no longer use the blockchain for real-world transactions, when the block size can still be significantly increased without making the network centralized, is dangerous and irresponsible. The idea that they'll opt for a Bitcoin micropayment channel hub, rather than just giving up on Bitcoin, is pure speculation, and one that I don't think will be borne out.
legendary
Activity: 1904
Merit: 1007
I cannot believe people seriously oppose the increase!

"arguing to ignorance"

the rest of the post was distorted trash based on false assumptions too

Troll and useless post, bringing nothing to the discussion. Move along.
member
Activity: 84
Merit: 10
I cannot believe people seriously oppose the increase!

"arguing to ignorance"

the rest of the post was distorted trash based on false assumptions too
legendary
Activity: 1372
Merit: 1014
Why couldn't MAX_BLOCK_SIZE be self-adjusting?

It certainly could be.   The point of the post wasn't to arrogantly state what we must do, or even that we must do something now.   I would point out that planning a hard fork is no trivial matter, so the discussion needs to start now even if the final switch to a new block version won't actually occur for 9-12 months.   The point was just to show that a permanent 1MB cap is simply a non-starter.  It allows roughly 1 million direct users to make less than 1 transaction per month.  That isn't a backbone; it is a technological dead end.

For the record I disagree with Gavin on whether Bitcoin can (or even should) scale to VISA levels.   It is not optimal that someone in Africa needs transaction data on the daily coffee habit of a guy in San Francisco they will never meet.   I do believe that Bitcoin can be used as a core backbone to link a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.  The point of the post was that whatever the future of Bitcoin ends up being, it won't happen with a permanent 1 MB cap.

There are always altcoins to pay for the coffees, but still there can be no doubt that 1MB is not enough. I cannot believe people seriously oppose the increase! After all it will not make the blockchain 20x bigger, it will only make the blockchain as large as needed to include all transactions, once the 1MB limit is not sufficient anymore.

So people who oppose the change are basically saying, BTC transactions have to be severely limited (at around 7 TPS) forever, just to stick to the 1MB limit forever. That is nuts.
legendary
Activity: 1652
Merit: 1029
legendary
Activity: 924
Merit: 1132
Okay, I'm going to start by saying that in the short run (next four or so years) there is no escaping an increase in maximum block size.  So yes, definitely do that.

However, in the longer term, that's still supralinear scaling and still potentially faces scaling problems.  So we need another solution.  We don't need altcoins whatsoever.  It's possible for a single cryptocurrency to exist as multiple blockchains.   There can be a central blockchain that does little more than mediate automatic exchanges between address spaces managed by dozens of side chains.

Transaction times where you have coins on one side chain and need to pay into an address that's on a different side chain would become longer, because now two transactions that mutually depend on one another must appear in two separate blockchains.   So that's annoying if your wallet can't find coins that are in the same chain as the address you are paying to. 

Within chain A, a cross-chain tx might appear as "Han paid Chain B in tx foo-a", and in chain B it appears as "Chain A paid Chewie in tx foo-b".  And if that result would cause chain A to have a negative balance, it triggers "Chain B paid Chain A half of chain B's positive balance in tx foo-central" in the central blockchain.

Anyway, you'd have to at least temporarily track the central chain, any chain into which you're paying, and any from which you're being paid, like a lightweight client.  Beyond that, you'd have the option of actually having the full download and proof all the way back to an origin block of any subset of chains that interest you.
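A toy model of that cross-chain bookkeeping, just to make the flow concrete (the class, function, and field names here are all invented; real sidechain settlement would of course be far more involved):

```python
# Toy model of the cross-chain accounting described above (hypothetical names).
# Each side chain tracks a reserve it can pay out cross-chain; a payment from
# chain A to chain B appears as two linked entries, and a central-chain
# settlement fires only when A's reserve would go negative.

class Chain:
    def __init__(self, name, reserve):
        self.name = name
        self.reserve = reserve      # coins this chain can pay out cross-chain
        self.ledger = []            # human-readable record, as in the example

def cross_chain_pay(src, dst, payer, payee, amount, central_ledger):
    # "Han paid Chain B ..." on the source chain ...
    src.ledger.append(f"{payer} paid {dst.name} {amount}")
    # ... and "Chain A paid Chewie ..." on the destination chain.
    dst.ledger.append(f"{src.name} paid {payee} {amount}")
    src.reserve -= amount
    if src.reserve < 0:
        # Source chain would go negative: settle on the central chain by
        # moving half of the destination chain's positive reserve across.
        transfer = dst.reserve / 2
        dst.reserve -= transfer
        src.reserve += transfer
        central_ledger.append(f"{dst.name} paid {src.name} {transfer}")

central = []
a, b = Chain("Chain A", reserve=10), Chain("Chain B", reserve=40)
cross_chain_pay(a, b, "Han", "Chewie", 25, central)
print(a.ledger, b.ledger, central)
```

Note how the central chain only ever sees the net settlement between chains, not the individual Han-to-Chewie payment, which is the whole point.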


legendary
Activity: 1092
Merit: 1001
Touchdown
Late to the thread but wanted to lodge my Yes vote for increasing the block size.  Great post D&T.

It seems nonsensical to me to risk adoption itself - and as D&T points out, give up the financial/transactional freedoms Bitcoin provides - simply to test a high scarcity/high fee model.  I can see that the upcoming halving adds some weight to the "miners will leave, threatening the security of Bitcoin, if fees don't compensate" argument, but I think the impact (reduced hashrate, centralisation of same) is being overplayed by some.  I don't buy the doomsday scenario.

Increase the block size and either the numbers work (adoption -> more transactions -> more small fees -> happy miners) or they don't and miners leave/fees rise.
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
I agree.  I didn't have the time to respond in your thread, but I don't believe there is "one blockchain to rule them all"; however, even for those who believe Bitcoin can exist as an interchange between a diverse ecosystem of networks, the block size will need to be raised.  1MB blocks prevent even infrequent access by users between the Bitcoin network and side networks without "trusted" intermediaries.  If the backbone is centrally controlled then the supporting ecosystem is built on a foundation of sand.

Thanks (it is nice to have your response rather than all the trolling I've been getting with people thinking I am trying to flog an alt-coin or whatever).

And I do agree 100% that the current Bitcoin TPS is not going to be very useful in a few years (if it stays where it is, Bitcoin could practically end up being only a "store of value" system).
donator
Activity: 1218
Merit: 1079
Gerald Davis
I do believe that Bitcoin can be used as a core backbone to link a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.

And this is a point that I have been trying to make (we don't need to put every single transaction in the world into one blockchain).

I agree.  I didn't have the time to respond in your thread, but I don't believe there is "one blockchain to rule them all"; however, even for those who believe Bitcoin can exist as an interchange between a diverse ecosystem of networks, the block size will need to be raised.  1MB blocks prevent even infrequent access by users between the Bitcoin network and side networks without "trusted" intermediaries.  If the backbone is centrally controlled then the supporting ecosystem is built on a foundation of sand.

We may see most of the overall transaction data occurring on side networks, or we may see sidechains used only in niche applications where they provide a tangible benefit over Bitcoin "core".  I don't know how it is going to play out; there are advantages and disadvantages either way.   How, when, by what method, and to what ultimate end the transaction capacity will be increased are all good questions.
hero member
Activity: 658
Merit: 500
I do believe that Bitcoin can be used as a core backbone to link a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.

And this is a point that I have been trying to make (we don't need to put every single transaction in the world into one blockchain).


I agree, and that's why we will have sidechains.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Also, with a hard limit on size, won't the conversation on manual adjustment resurface indefinitely with increasing political difficulty?

Probably, and it may be infeasible to raise the cap further in the future even if that would be optimal.  However, with a 20MB or 33.6 MB* (2^25) block size, Bitcoin at least has the potential to exist as a backbone of sorts, and it provides us breathing room.  Once again I am not advocating any particular size limit, timeline, or method to raise the cap, just that the idea of permanently keeping a 1MB block size fails basic math and reasoning.   If the post convinces some people to go from "never raise the limit" to "I have concerns; how can we raise the limit in a way which addresses them?" then it has served its purpose.


* The only "limit" definitively set by Satoshi.
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
I do believe that Bitcoin can be used as a core backbone to link a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.

And this is a point that I have been trying to make (we don't need to put every single transaction in the world into one blockchain).
donator
Activity: 1218
Merit: 1079
Gerald Davis
Why couldn't MAX_BLOCK_SIZE be self-adjusting?

It certainly could be.   The point of the post wasn't to arrogantly state what we must do, or even that we must do something now.   I would point out that planning a hard fork is no trivial matter, so the discussion needs to start now even if the final switch to a new block version won't actually occur for 9-12 months.   The point was just to show that a permanent 1MB cap is simply a non-starter.  It allows roughly 1 million direct users to make less than 1 transaction per month.  That isn't a backbone; it is a technological dead end.

For the record I disagree with Gavin on whether Bitcoin can (or even should) scale to VISA levels.   It is not optimal that someone in Africa needs transaction data on the daily coffee habit of a guy in San Francisco they will never meet.   I do believe that Bitcoin can be used as a core backbone to link a variety of other more specialized (maybe even localized) systems via the use of sidechains and other technologies.  The point of the post was that whatever the future of Bitcoin ends up being, it won't happen with a permanent 1 MB cap.
member
Activity: 63
Merit: 10
Why couldn't MAX_BLOCK_SIZE be self-adjusting?
That's very vague... based on what?

Max block size could be retargeted periodically alongside difficulty adjustments using average block size and the frequency of full blocks in a period with the hard-coded value as a floor.
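One possible shape for such a retargeting rule, purely as illustration (the retarget interval, the grow/shrink percentages, and the fullness thresholds are all made-up values, not a concrete proposal):

```python
# Illustrative retarget rule: adjust the max block size each difficulty
# period from observed demand, never dropping below a hard-coded floor.
FLOOR = 1_000_000            # hard-coded minimum cap, in bytes (assumed)
PERIOD = 2016                # blocks per retarget, as with difficulty

def retarget_max_size(current_max, block_sizes):
    """block_sizes: sizes (bytes) of the last PERIOD blocks."""
    avg = sum(block_sizes) / len(block_sizes)
    full = sum(1 for s in block_sizes if s >= 0.95 * current_max)
    if full / len(block_sizes) > 0.5:     # mostly-full blocks: grow 20%
        new_max = int(current_max * 1.2)
    elif avg < 0.25 * current_max:        # mostly-empty blocks: shrink 10%
        new_max = int(current_max * 0.9)
    else:
        new_max = current_max
    return max(new_max, FLOOR)            # floor at the hard-coded value
```

So a period of near-full blocks grows the cap, a quiet period shrinks it, and the floor means the cap can never be squeezed below today's limit.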

Imagining the necessity of 20MB blocks relies on the assumption that a massive increase in transaction volume develops, but what if it slows significantly?  Bloated block broadcast delay might temporarily even be useful as a competitive advantage.  When fees outweigh reward, doesn't the mining market encourage bloat?

Also, with a hard limit on size, won't the conversation on manual adjustment resurface indefinitely with increasing political difficulty?
hero member
Activity: 772
Merit: 501
Quote
What I meant is not a vote by hash rate, that would be a cartel. Difficulty increase is also not a vote but a consequence of market forces and technology.

Miners cartelizing to create a sensible de facto block size limit doesn't seem like a bad thing to me. Anyway, I don't want to take this discussion too far off-topic so I'll save it for a different thread.

hero member
Activity: 836
Merit: 1030
bits of proof
The pace of increase has to be algorithmic, driven by market forces and advances in technology, not central planning or cartels.
The algorithm we have now that fits the above is the difficulty adjustment.

This was also my first preference. I'd add that if it was up to me, the hard limit would be removed altogether, and it would be up to miners to create a soft limit. As long as over 50 percent of the network hashrate enforces a particular rule on the block size, it will be as binding as a protocol rule.

Perhaps the 40% per year hard limit increase can co-exist with a dynamic soft limit that tracks difficulty. That way the hard limit acts as a failsafe, while the soft limit imposes the real size constraints.

What I meant is not a vote by hash rate, that would be a cartel. Difficulty increase is also not a vote but a consequence of market forces and technology.

It is important to preserve the proposed role of fees, which is twofold: to contain spam, and to replace inflation in the long run.
Eagerly increasing the block size would be contrary to both.
hero member
Activity: 772
Merit: 501
The pace of increase has to be algorithmic, driven by market forces and advances in technology, not central planning or cartels.
The algorithm we have now that fits the above is the difficulty adjustment.

This was also my first preference. I'd add that if it was up to me, the hard limit would be removed altogether, and it would be up to miners to create a soft limit. As long as over 50 percent of the network hashrate enforces a particular rule on the block size, it will be as binding as a protocol rule.

Perhaps the 40% per year hard limit increase can co-exist with a dynamic soft limit that tracks difficulty. That way the hard limit acts as a failsafe, while the soft limit imposes the real size constraints.
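The two-tier idea could be sketched like this (all constants are illustrative assumptions, including the starting hard cap and the difficulty-tracking rule for the soft limit):

```python
# Sketch of the two-tier scheme above: a hard cap growing 40% per year as a
# failsafe, with the binding soft limit scaled by difficulty growth.
BASE_HARD_CAP = 20_000_000      # bytes, hypothetical starting hard limit
BLOCKS_PER_YEAR = 52_560        # ~144 blocks/day * 365

def hard_cap(height):
    """Hard limit grows 40% per year from the base, independent of usage."""
    years = height / BLOCKS_PER_YEAR
    return int(round(BASE_HARD_CAP * 1.4 ** years))

def soft_cap(prev_soft, old_difficulty, new_difficulty, height):
    """Soft limit tracks difficulty growth but can never exceed the hard cap."""
    scaled = int(prev_soft * new_difficulty / old_difficulty)
    return min(scaled, hard_cap(height))
```

The soft limit does the real constraining day to day, while the slowly growing hard cap only matters if the soft limit ever runs away.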