
Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive

sr. member
Activity: 252
Merit: 250
Coin Developer - CrunchPool.com operator
It's a free attack for a miner, and it can arbitrarily kick anyone who doesn't have sufficient bandwidth, or ultimately enough disk space, off the network (even if only temporarily).

It's not free. The larger the block, the higher the chance it will be orphaned. No miner is going to inflate his blocks and reduce his chance of winning a block race.
Yes... and orphan risk also decreases as bandwidth availability increases.
I continue to maintain that market forces can rightsize MAX_BLOCK_SIZE if an algorithm with a feedback mechanism can be introduced, and that doing so introduces both less centralization risk than an arbitrary patch, and less risk of future manual arbitrary adjustments.
Fix it right, fix it once.
Yeah, something that looks at block sizes over a past period to determine the next max block size for the following period would be OK, for example.
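For instance, a minimal sketch of such a rule (the interval, multiplier, and floor are invented parameters, not a proposal):

Code:
# Hypothetical feedback rule for the next period's MAX_BLOCK_SIZE.
ADJUSTMENT_INTERVAL = 2016   # blocks; how often the rule would run (invented)
SIZE_MULTIPLIER = 2          # next cap = 2x the median observed size (invented)
FLOOR = 1_000_000            # never drop below the current 1 MB

def next_max_block_size(recent_block_sizes):
    # Median of the last ADJUSTMENT_INTERVAL blocks' sizes; harder for a
    # single miner to game than the mean, since padding a few of his own
    # blocks barely moves it.
    ordered = sorted(recent_block_sizes)
    median = ordered[len(ordered) // 2]
    return max(FLOOR, SIZE_MULTIPLIER * median)

print(next_max_block_size([400_000, 600_000, 900_000]))  # -> 1200000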
sr. member
Activity: 252
Merit: 250
Coin Developer - CrunchPool.com operator
I don't know if this has been addressed before in this thread
I don't think adjusting the block size up or down or keeping it the same will have any effect on whether or not transaction fees will be enough to secure the network as the block subsidy goes to zero (and, as I said, I'll ask professional economists what they think).
But it's pretty simple: miners won't mine unless the transaction fees in a block cover what it costs them to mine it. You'll have to raise fees, or increase adoption (more txs), enough to sustain a hashrate that keeps the network reasonably secure.
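As a back-of-the-envelope sketch (every number below is invented), the break-even fee is simply the uncovered mining cost spread across a block's transactions:

Code:
# Break-even fee per transaction once the subsidy is gone; numbers invented.
cost_per_block = 25.0    # a miner's cost to produce one block, in BTC terms
subsidy = 0.0            # block subsidy after it has gone to zero
txs_per_block = 2500     # transactions that fit in one block

required_fee = (cost_per_block - subsidy) / txs_per_block
print(required_fee)      # 0.01 BTC/tx; 10x more txs per block => 10x lower fee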

Another side effect is that miners will have more power than they have now: they'll be able to form cartels and charge premiums for fast transactions. Right now we're used to transactions being treated more or less equally and fees being essentially zero; that equal treatment will come to an end. There is no question the block size will need to have been increased by then to fit a larger number of txs; without many more txs than we have now, fees would be pretty high.

Ideally, mass adoption would be enough to keep txs cheap and the net hashrate high. But I don't think we can expect that to happen while other issues remain unresolved. Bitcoin is not a very good payment system, and the average person has no reason to adopt it yet. Some libertarians call for ignoring that and recognising Bitcoin for what it is now and for its only strength: circumventing state and bank control over our money. But that is short-sighted too, because Bitcoin is not designed to survive that way: without mass adoption it is just a very expensive and clunky trustless way of irreversibly sending money.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
It's a free attack for a miner, and it can arbitrarily kick anyone who doesn't have sufficient bandwidth, or ultimately enough disk space, off the network (even if only temporarily).

It's not free. The larger the block, the higher the chance it will be orphaned. No miner is going to inflate his blocks and reduce his chance of winning a block race.
Yes... and orphan risk also decreases as bandwidth availability increases.
I continue to maintain that market forces can rightsize MAX_BLOCK_SIZE if an algorithm with a feedback mechanism can be introduced, and that doing so introduces both less centralization risk than an arbitrary patch, and less risk of future manual arbitrary adjustments.
Fix it right, fix it once.
legendary
Activity: 3878
Merit: 1193
It's a free attack for a miner, and it can arbitrarily kick anyone who doesn't have sufficient bandwidth, or ultimately enough disk space, off the network (even if only temporarily).

It's not free. The larger the block, the higher the chance it will be orphaned. No miner is going to inflate his blocks and reduce his chance of winning a block race.
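A rough model shows the size of the effect. Assuming rival blocks arrive as a Poisson process, and assuming (purely for illustration) a 100 kB/s propagation rate:

Code:
import math

BLOCK_INTERVAL = 600.0   # average seconds between blocks
REWARD = 25.0            # block reward in BTC; fees ignored for simplicity

def expected_reward(block_bytes, bytes_per_second=100_000):
    delay = block_bytes / bytes_per_second             # extra propagation time
    p_orphan = 1 - math.exp(-delay / BLOCK_INTERVAL)   # a rival block wins first
    return REWARD * (1 - p_orphan)

print(expected_reward(1_000_000))    # ~24.6 BTC expected for a 1 MB block
print(expected_reward(20_000_000))   # ~17.9 BTC expected for a 20 MB block

Note that the same model makes the bandwidth point above: double bytes_per_second and the orphan penalty roughly halves, so big blocks cost well-connected miners far less.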
legendary
Activity: 1050
Merit: 1002
More structure can cut both ways. If done well (big if), it can reduce centralization by better distributing "votes." But you're right that you can't convince everybody.

Agreed.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
This I know, but I tend to think it's not the case. Anyone in this part of the forum, on this thread, probably doesn't hold some obscure minority point of view; whatever the basis for their reasoning, it's probably not isolated. I think if we can get some sort of consensus on a thread like this, we can in the wider community too. If we can't, it would be harder, maybe not impossible, but harder depending on how dug in people's positions are. The longer the wait, the harder. If this were mid-2010 there would likely be zero problem: high-profile devs (like Satoshi/Gavin) would simply explain the necessary way forward and the small community would move along. If we're talking 2019, I don't see that happening so easily, or at all actually.

Except for that last clause perhaps, no arguments here.

I'd be in favor of more structure for progress, but you won't convince everybody. There will be purists that cry centralization.

More structure can cut both ways. If done well (big if), it can reduce centralization by better distributing "votes." But you're right that you can't convince everybody.

There is always a long tail of technology out in the marketplace. Just because our community is at the cutting edge of technology doesn't mean everyone is. For example, I was surprised to learn of a story from the community I came from (Hacker News) about a very successful business that still ran BASIC. This was used for order fulfillment, accounting, you name it. The business was profitable in the millions, if I recall correctly, but completely reliant on that workhorse infrastructure. It wasn't cutting edge, but it worked, and that's all that mattered. A similar story exists in the banking industry.

My first assignment after college (20ish years ago) was with a defense contractor maintaining a codebase written in... BASIC. In any case, point taken.
legendary
Activity: 1050
Merit: 1002
Fears over lack of consensus are rhetoric to encourage people to accept shoddy work out of a mistaken sense of urgency.

You didn't answer this:

Quote
Please tell me if you agree that an ossifying of the protocol - the fact that it will become increasingly hard, probably impossible, to make changes as adoption grows - is what we'll likely see.
legendary
Activity: 1050
Merit: 1002
Aside: did you preview your post at any point? It's in your draft history if so...

Thanks! Never noticed that before. Here is the full reply:

Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

This I know, but I tend to think it's not the case. Anyone in this part of the forum, on this thread, probably doesn't hold some obscure minority point of view; whatever the basis for their reasoning, it's probably not isolated. I think if we can get some sort of consensus on a thread like this, we can in the wider community too. If we can't, it would be harder, maybe not impossible, but harder depending on how dug in people's positions are. The longer the wait, the harder. If this were mid-2010 there would likely be zero problem: high-profile devs (like Satoshi/Gavin) would simply explain the necessary way forward and the small community would move along. If we're talking 2019, I don't see that happening so easily, or at all actually.

This is exactly where a more formal governance model (as I mentioned) could help. It too would surely be imperfect, but just about anything would be better than determining consensus based on who writes the most posts, thoughtful though they may be.

I'd be in favor of more structure for progress, but you won't convince everybody. There will be purists that cry centralization.

If, for example, I knew that there was little support for Gavin's version, I for one would be much more willing to compromise. But I simply don't know....

Yes, I think some sort of poll will be in order at some point. I haven't pushed that yet because I think people still need time to stew on their positions.

I'm having trouble imagining a use case where embedded hardware with difficult-to-update software would connect to the P2P network, much less have anything to do with handling the blockchain, but my imagination isn't all that great. I also have trouble in general with any device whose purpose is highly security-related that isn't software-upgradeable. (Such things do exist today, and they're equally ill-advised.)

There is always a long tail of technology out in the marketplace. Just because our community is at the cutting edge of technology doesn't mean everyone is. For example, I was surprised to learn of a story from the community I came from (Hacker News) about a very successful business that still ran BASIC. This was used for order fulfillment, accounting, you name it. The business was profitable in the millions, if I recall correctly, but completely reliant on that workhorse infrastructure. It wasn't cutting edge, but it worked, and that's all that mattered. A similar story exists in the banking industry.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

I had a complete reply typed out for all your points but my browser closed before I sent it  Cry

Ah well, I'm not re-typing it. The gist is I'm aware of the above, but don't think that's the case. I tend to think those in this part of the forum on this thread have sentiments which are not isolated. If we can gain consensus here we have a good chance in the wider community; if not then who knows, but it would become ever harder with passing time.

Fair enough.

Aside: did you preview your post at any point? It's in your draft history if so...
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
We have less of a crisis than a request for comment.
Fears over lack of consensus are rhetoric to encourage people to accept shoddy work out of a mistaken sense of urgency.
So far there have been some well-considered comments, though the responses by the requesters seem presumptuous and tersely dismissive, as if comment were not honestly sought?
Frankly I'd expected better.  This is how science happens.  Peers review proposals, and both experts and consensus must be challenged to accomplish this.
It takes a mere engineer or technician to craft a patch; science produces novel results.  If we reach a crisis ahead of the science happening, we can always patch.  Consensus congeals in crisis, but crisis decisions are often also ill-considered.  For this reason I am grateful for Gavin's proposal.  I see it as a back-up plan in case these better solutions do not mature.

Richard Feynman in 1966 taught us that:
"Science is the belief in the ignorance of experts."

Essentially an "expert" may have well formed opinions, and ignore options due to confirmation bias.  Revisiting assumptions often proves valuable.

It is not time that hinders consensus, so much as quality.  The BIPs flow like water when they are solid improvements.
legendary
Activity: 1050
Merit: 1002
Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

I had a complete reply typed out for all your points but my browser closed before I sent it  Cry

Ah well, I'm not re-typing it. The gist is I'm aware of the above, but don't think that's the case. I tend to think those in this part of the forum on this thread have sentiments which are not isolated. If we can gain consensus here we have a good chance in the wider community; if not then who knows, but it would become ever harder with passing time.
hero member
Activity: 709
Merit: 503
Another nice big block https://blockchain.info/block-height/326505 came through while we discussed the topic, yet the backlog of transactions https://blockchain.info/unconfirmed-transactions wasn't really huge, at just over 4,000 (or are we just getting used to such big backlogs?).

We are bumping into the ceiling, gentlemen.  It is safe to say we will begin to accumulate a bigger backlog pretty soon, when we start getting multiple blocks in a row near the current 1MB limit.  In my experience with queueing theory http://en.wikipedia.org/wiki/Queueing_theory, we can expect real signs of trouble once the average block size over a reasonable period of time, e.g. an hour or maybe more like a day, begins to exceed 70% of the maximum.  I'm going to try to build a model using JMT http://jmt.sourceforge.net/.
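Under the simplest queueing model, an M/M/1 server (a crude fit for Bitcoin, but illustrative), the mean number of waiting transactions grows as rho/(1-rho) with utilization rho, which is why trouble appears well before 100%:

Code:
# M/M/1 sketch: mean queue size rho/(1-rho) blows up near full utilization.
for rho in (0.5, 0.7, 0.9, 0.95, 0.99):
    print(rho, rho / (1 - rho))
# 0.5 -> 1.0, 0.7 -> 2.3, 0.9 -> 9.0, 0.95 -> 19.0, 0.99 -> 99.0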

Perhaps we could two-step our way to the functional maximum, which we need to find via testing.  To give ourselves some time to find it, perhaps we could increase MAX_BLOCK_SIZE to the proposed 20MB right away (or as soon as is reasonable), then work diligently to find the greatest workable maximum, and jump to it when we're ready.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
At worst harder, but not impossible.

LOL are you not following this thread? What easy way forward do you see emerging for the block size issue?

Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).

This is exactly where a more formal governance model (as I mentioned) could help. It too would surely be imperfect, but just about anything would be better than determining consensus based on who writes the most posts, thoughtful though they may be.

A formal governance model could draw distinct conclusions: yes, the BIP passed, or no, it didn't. If it didn't, that can lead to compromise. If, for example, I knew that there was little support for Gavin's version, I for one would be much more willing to compromise. But I simply don't know.... Instead, I choose to assume that people who support Bitcoin do so because they support the ideals of a free market, but I could be wrong.

If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO.

That's for a programming language, not a protocol. Also see Andreas Antonopoulos's comment on ossification with respect to hardware, which I also agree with.

I'm having trouble imagining a use case where embedded hardware with difficult-to-update software would connect to the P2P network, much less have anything to do with handling the blockchain, but my imagination isn't all that great. I also have trouble in general with any device whose purpose is highly security-related that isn't software-upgradeable. (Such things do exist today, and they're equally ill-advised.)
legendary
Activity: 1050
Merit: 1002
At worst harder, but not impossible.

LOL are you not following this thread? What easy way forward do you see emerging for the block size issue?

If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO.

That's for a programming language, not a protocol. Also see Andreas Antonopoulos's comment on ossification with respect to hardware, which I also agree with.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
At the risk of putting words into his mouth (for which I apologize if I'm wrong), Gavin sees it as a technical anti-DOS measure: to prevent miners from DOSing voting enthusiasts out of the network.

But that's a very costly attack, and it doesn't accomplish anything.

It's a free attack for a miner, and it can arbitrarily kick anyone who doesn't have sufficient bandwidth, or ultimately enough disk space, off the network (even if only temporarily).
legendary
Activity: 3878
Merit: 1193
At the risk of putting words into his mouth (for which I apologize if I'm wrong), Gavin sees it as a technical anti-DOS measure: to prevent miners from DOSing voting enthusiasts out of the network.

But that's a very costly attack, and it doesn't accomplish anything.
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
Please tell me if you agree that an ossifying of the protocol - the fact that it will become increasingly hard, probably impossible, to make changes as adoption grows - is what we'll likely see.

Not that I was asked, but I'll offer an opinion anyways.

At worst harder, but not impossible.

Today we have a sort of self-enforced (by the core devs) consensus system, plus of course the ultimate ability to vote with your node and with your mining capacity. I wouldn't expect the latter to ever change (indeed some blocksize limit is required to maintain this goal). For the former, however, I doubt that having this little governance around important changes to Bitcoin will last forever -- 20 years hence I would expect a much more regimented procedure, somewhat more akin to a standards organization than what we have today (perhaps with a combination of academic, miner, and corporate interests represented, but that'd be an argument for a different thread).

More governance is both bad and good-- in particular on the good side, bright lines can be drawn when it comes to voting in a way that doesn't happen so much today. If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO.

If you're that worried about ossification, perhaps you'd prefer a dead man's switch: in 20 years, the blocksize reverts to its current 1 MB.
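Mechanically such a switch would be trivial. A sketch, extrapolating 10-minute blocks from around block 326505 and assuming Gavin's proposed 20MB as the interim cap:

Code:
# Dead man's switch sketch: the cap reverts to 1 MB after ~20 years.
BLOCKS_PER_YEAR = 52_560                          # assuming 10-minute blocks
SWITCH_HEIGHT = 326_505 + 20 * BLOCKS_PER_YEAR    # ~20 years from late 2014

def max_block_size(height, interim_cap=20_000_000):
    return interim_cap if height < SWITCH_HEIGHT else 1_000_000

print(max_block_size(400_000))   # 20 MB while the switch is still armed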
hero member
Activity: 672
Merit: 504
a.k.a. gurnec on GitHub
We all agree that the current max block size is too restrictive.

What seems obvious to me is that different people have different opinions on the underlying purpose of any blocksize limit.

At the risk of putting words into his mouth (for which I apologize if I'm wrong), Gavin sees it as a technical anti-DOS measure: to prevent miners from DOSing voting enthusiasts out of the network. If this is true, the best solution would be an automatically adjusting limit that tracked the speed of residential connections and the capacity of residential hard drives (enthusiast residences). Since that seems impossible, Gavin's limited-exponential growth seems like the best educated guess that can be made today.
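For scale, here is a sketch of that limited-exponential schedule using the 50%/year rate from this thread's title; the start size and horizon are illustrative, not a spec:

Code:
# Limited-exponential cap: start at 20 MB, grow 50%/year, stop after 20 years.
START_MB = 20
GROWTH_PER_YEAR = 0.50    # the rate this thread's title calls too aggressive
YEARS_OF_GROWTH = 20      # the "limited" part: growth halts after this

def max_block_mb(years_elapsed):
    t = min(years_elapsed, YEARS_OF_GROWTH)
    return START_MB * (1 + GROWTH_PER_YEAR) ** t

print(round(max_block_mb(10)))   # ~1153 MB after a decade at 50%/year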

Others see it as an economic issue, and would like to tie the limit to some economic aspect of Bitcoin to solve this perceived economic threat. I'm no economist, and I certainly don't know if they're right. But guess what: I personally don't care if they're right.

Any restriction on blocksize is an artificial one, a regulation determined by some authority who believes (perhaps correctly) they know better than I. I'm OK with technical restrictions, and those that improve the ability to vote, but I am completely against any restrictions whose purpose is to alter an otherwise free-market system.

To put it bluntly, I would rather see a restriction-free Bitcoin crash and burn in the hands of a free-market system than participate in a regulated Bitcoin. To me, Bitcoin should be a free-market experiment (as much as is technically feasible), even if this leads to its eventual failure. Of course, that's just my personal opinion, but it's the basis for my dislike of more-limited blocksizes.

I mean no disrespect to some of the clever alternatives presented in this thread-- but I personally wouldn't want any of these "regulations" in Bitcoin.

Let me ask a question: is there anyone here who both (a) favors additional blocksize restrictions (more strict than Gavin's), and also (b) believes such restrictions are not additional "regulations" that subtract from an otherwise more-free-market system?
hero member
Activity: 709
Merit: 503
I have an idea. Why not ask everyone in Bitcoin what they think we should do, then just do all of them! Or, we can just debate each idea until it no longer matters since the year will be 2150.
Essentially a lot of ideas are being tried out via altcoins.

Rushing to do anything just to get something done does not seem prudent.  Hesitating forever will lead naturally to real consequences.

Waiting for the MAX_BLOCK_SIZE to become an emergency is waiting too long.  https://bitcointalksearch.org/topic/m.4552409 was an attempt to find the biggest queue to date.
hero member
Activity: 709
Merit: 503
https://en.bitcoin.it/wiki/Weaknesses#Spamming_transactions talks about spamming transactions but does not connect that attack with a miner, who can stuff his own blocks with spam without paying fees to anyone else.