
Topic: The MAX_BLOCK_SIZE fork - page 9.

hero member
Activity: 756
Merit: 522
January 31, 2013, 05:02:34 PM
#33
To recap, this is my issue with Hearn: he makes a false claim (quoted above) to prop up a false generalization of his. If that doesn't work, he just ignores the point. Intellectual dishonesty at its finest, and not quite the first time either.

Any hard forks in that list of many changes, you weasel you!

And now sorry for the free advertising...
Fortunately we have Freicoin which doesn't suffer from this potential problem even if there's no block limit at all.
Freicoin has perpetual reward for miners financed through demurrage fees on holdings.
Before everybody starts complaining about savings and demanding mercy for their grandmas: Freicoin is purposely designed to be a medium of exchange and NOT a store of value.
That's the beauty of a free monetary market: different monies can have different qualities and purposes.

Nothing to be sorry about (I had no idea it existed, for one) and good luck with it.
legendary
Activity: 1106
Merit: 1004
January 31, 2013, 03:52:49 PM
#32
2) The economics of providing for network security when block inclusion is free and inflation has dwindled

For (2), I feel like there's a factor I never see mentioned. In the short run (12+ years), the block rewards are more than enough to incentivize mining, especially as we're moving to a world where the variable cost (electricity) of mining is plummeting. Over that same timeframe, the cost of ASICs should also plummet to the marginal cost of production at the same time Moore's law is increasing their power. Hashing power is going to be cheap. Very cheap.

Hashing power being cheap is not relevant to the security of the network, since it would be equally cheap to an attacker. It's the total amount of resources employed by honest miners relative to those of a potential attacker that actually matters.
legendary
Activity: 1190
Merit: 1004
January 31, 2013, 03:08:57 PM
#31
I disagree with those who say this is bad because it breaks the trust of bitcoin. People who use bitcoin want to retain ownership and usability of their money and to be assured that the inflation rate won't change. More space in blocks would be a benefit due to lower transaction fees. The integrity of people's bitcoins remains as it ever was.

The problem is that the fork may not go seamlessly. If a fork is to be made, as many issues as possible should be resolved. For instance, the block timestamp could be made a 64-bit integer, as it always should have been, so that disturbances like this are as rare as possible.

Considerations must be made for how bitcoin can scale to much larger block sizes. I have mentioned before that once block sizes reach a certain point it will be beneficial to relay blocks in separate parts.
legendary
Activity: 1372
Merit: 1002
January 31, 2013, 02:25:45 PM
#30
About Mike's problem two, this has been discussed many times.
The bitcoin protocol as it is needs a block size limit (not necessarily the one we have today) to avoid a tragedy of the commons on mining when subsidies are gone.
I remember some people advocating for proof of stake (I think that's how that concept started) and me alone advocating for demurrage.

And now sorry for the free advertising...
Fortunately we have Freicoin which doesn't suffer from this potential problem even if there's no block limit at all.
Freicoin has perpetual reward for miners financed through demurrage fees on holdings.
Before everybody starts complaining about savings and demanding mercy for their grandmas: Freicoin is purposely designed to be a medium of exchange and NOT a store of value.
That's the beauty of a free monetary market: different monies can have different qualities and purposes.
bpd
member
Activity: 114
Merit: 10
January 31, 2013, 02:02:39 PM
#29
legendary
Activity: 1526
Merit: 1129
January 31, 2013, 01:23:28 PM
#28
What's interesting is that there seem to be two fairly strongly divergent viewpoints on this matter: some people assume the transaction network will continue to grow to rival PayPal or even credit cards, and see the block size limit as an unimportant detail that will be quickly changed when needed.  Others see the limit as a fundamental requirement, or even dogma, of the bitcoin project, and view the long-term network as mainly an international high-value payment system, or the backing of derivative currencies.  Both views seem reasonable, yet mutually exclusive.

It's an issue that will become clearer with time, I think.

By the way, I'm not assuming that Bitcoin will grow to rival PayPal or credit cards. That would be a wild, runaway success that would mark a major turning point in the history of money itself. And the internet is littered with the carcasses of dead attempts to revolutionize payments. It'd be presumptuous to assume future success.

However, if transaction volumes do grow to reach the block size limits, I would (for now) be advocating for a move to a floating limit based on chain history.
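
For concreteness, here's a minimal sketch of how such a floating limit might be computed; the window length, multiplier and floor are placeholders I made up, not a concrete proposal:

Code:
from statistics import median

WINDOW = 2016        # look back roughly one difficulty period of blocks
MULTIPLIER = 2.0     # allow blocks up to 2x the recent median size
FLOOR = 1000000      # never drop below the current 1 MB limit

def floating_max_block_size(recent_block_sizes):
    """Derive the next max block size (in bytes) from the sizes of
    the last WINDOW blocks already in the chain."""
    m = median(recent_block_sizes[-WINDOW:])
    return max(FLOOR, int(m * MULTIPLIER))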

To recap: the primary arguments against are

1) Rising requirements for running a node make Bitcoin more centralized
2) The economics of providing for network security when block inclusion is free and inflation has dwindled

For (1), Satoshi always took the position that Moore's law would accommodate us. I wrote the Scalability page on the wiki to flesh out his belief with real numbers. As you can see, even with no improvements to today's technology at all, Bitcoin can scale beyond PayPal... by orders of magnitude. It requires nodes to run on server-class machines rather than laptops, but many already do, so I don't see that as a big deal. If Bitcoin ever reaches high traffic levels, CPU time, bandwidth, storage capacity... all very likely to be cheaper than today. I don't think the "only banks and megacorps run full nodes" scenario will ever happen.

For (2) I have proposed building a separate p2p network on which participants take part in automatically negotiated assurance contracts (sketched below). I think it would work, but it won't be possible to be truly convincing until it's tried out for real. That in turn requires:

a) That there be some real incentive to boost network security, like semi-frequent re-orgs leading to spends being reversed and merchants/exchanges losing money. Inflation will likely provide us enough security for the medium-term future.
b) That somebody actually build the tool and get people using it.

Then you have to wait and see if community participants step up and use it.
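
To make the assurance-contract mechanic concrete, here is a toy sketch: participants pledge toward network security, and the pledges only take effect once a target total is reached, so nobody pays unless enough others do too. The names and numbers are invented for illustration:

Code:
def contract_succeeds(pledges, target_btc):
    """pledges: {participant: amount}. An assurance contract only goes
    through if the pledged total reaches the target; otherwise nobody
    pays anything."""
    return sum(pledges.values()) >= target_btc

pledges = {"exchange_a": 20.0, "merchant_b": 15.0, "pool_c": 10.0}
print(contract_succeeds(pledges, 50.0))  # False: 45 < 50, nobody pays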

In short, I can't see this question being resolved before we actually run up against the limit, which is unfortunate. I wish Satoshi had put a floating limit in place right from the start. But there were many issues for him to consider and only limited time for each one; a fixed size limit probably didn't seem like a big deal when he wrote it.
hero member
Activity: 756
Merit: 522
January 31, 2013, 11:49:15 AM
#27
You could argue against any change to Bitcoin ever based on "those are the protocol rules that were signed up for", but obviously, the protocol has changed many times since it was first created.

All this thread says to me is we need a better FAQ page. The topic comes up repeatedly and no new insight is gained by doing it 11 times rather than 10.

Any hard forks in that list of many changes?
newbie
Activity: 24
Merit: 1
January 31, 2013, 11:20:15 AM
#26
Mike Hearn - Sorry if this feels like a redundant question, or that it's decreasing the signal-to-noise ratio here in any way.  I suppose at its base it's not really an answerable question: what's the future of bitcoin?  We'll have to see.

What's interesting is that there seem to be two fairly strongly divergent viewpoints on this matter: some people assume the transaction network will continue to grow to rival PayPal or even credit cards, and see the block size limit as an unimportant detail that will be quickly changed when needed.  Others see the limit as a fundamental requirement, or even dogma, of the bitcoin project, and view the long-term network as mainly an international high-value payment system, or the backing of derivative currencies.  Both views seem reasonable, yet mutually exclusive.

I don't see this kind of disagreement with other often-brought-up, redundant issues, such as "satoshis aren't small enough", "people losing coins means eventually there won't be anything left" and so on.  Those aren't real problems.  I'm not saying the 1MB limit is a "problem" though, I just want to know what people are going to do, and what's going to happen.  Regardless of anyone's opinion on the issue, given the large number of people using bitcoin, the ease with which the change can be made, and the impending demand for more transactions, someone will compile a client with a larger block limit.  What if people want to start using it?

I can see this issue limiting bitcoin acceptance as a payment method for websites: why go to all the trouble of implementing a high-security bitcoin processing system for your e-commerce site if in a couple of years bitcoin won't be usable for small transactions?  Maybe it will in fact scale up, but without any clear path for how that would happen, many will choose to wait and see what evolves rather than adopt bitcoin for their organization.

Sorry if I'm being dense -- from the wiki this is indeed classified as "Probably not a problem", and if some developers came on here and told me, "Quiet, you, it's being worked on," I would concede the point to them.  To me though, the uncertainty itself of whether the 1MB limit will remain gives me pause.  The threads from 3 years ago debating the same topic perhaps make this conversation redundant, but they don't settle the issue for me: this was a tricky problem 3 years ago, and it still is.  The only thing that's changed with regard to the block size is that we're getting ever closer to hitting the limit.

Perhaps this is a conversation we'll just need to have in a year or so when the blocks are saturated.
legendary
Activity: 1526
Merit: 1129
January 31, 2013, 10:36:10 AM
#25
You could argue against any change to Bitcoin ever based on "those are the protocol rules that were signed up for", but obviously, the protocol has changed many times since it was first created.

All this thread says to me is we need a better FAQ page. The topic comes up repeatedly and no new insight is gained by doing it 11 times rather than 10.
hero member
Activity: 756
Merit: 522
January 31, 2013, 09:34:29 AM
#24
That said, 1MB is really small.  I'm trying to envision a world-finance-dominating network with 1MB blocks every 10 minutes and it's tough.

What makes you suspect it's tough because of the blocksize? Maybe it's tough because it's just not something you'd be very good at, for a multitude of unrelated reasons.

Perhaps a better question, then, to ask people here is: The year is 2015.  Every block is a megabyte.  Someone wrote a new big-block client fork, and people are switching.  What will you do?

I've asked MP. While nobody can really know the future, it turns out what we'll likely do is start an entirely new coin, this time guaranteed to never be hard-forked; not by a bunch of coder nobodies, but by MP himself. In practice that'll most likely work out to simply staying with the old version and replacing some code monkeys. This, mind you, not because we really care all that much whether it's 1 MB or 100 GB, but because the precedent of hardforking "for change" is intolerable. We'll find ourselves in due time under a lot more pressure to fuck up Bitcoin than some vague "I can't manage to envision the future" sort of bs.

An alternative theory I present is: if some hardforking change is so valuable, why couldn't an altcoin prove that value and earn its place in the free market and eventually supplant the inferior alternative?

An excellent question. If y'all are going to be dicking around with hardforks, you might as well do it on whatever devcoin nobody cares about, watch it continue to be resoundingly not cared about, and draw the logical inference from there.

The transition to ASIC mining should represent the last step increase in hashing power

Nonsense. The only group that shipped one (some?) ASICs is using 110nm tech. We are at the beginning of a curve, not at the end of it.
legendary
Activity: 1400
Merit: 1009
January 31, 2013, 08:31:44 AM
#23
I don't remember who proposed it, but the best proposal I've heard is to make the maximum block size scale based on the difficulty.

The transition to ASIC mining should represent the last step increase in hashing power, so after that's done would be a good time to establish a baseline for whatever formula gets used.
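
For illustration only, one way such a formula might look, assuming a simple linear scaling; the baseline value and the linear relationship are placeholders, not part of the proposal:

Code:
BASELINE_DIFFICULTY = 3000000.0  # hypothetical post-ASIC baseline
BASE_MAX_SIZE = 1000000          # today's 1 MB limit, in bytes

def max_block_size(current_difficulty):
    # Scale the limit with difficulty relative to the baseline,
    # never dropping below the current 1 MB.
    scale = current_difficulty / BASELINE_DIFFICULTY
    return max(BASE_MAX_SIZE, int(BASE_MAX_SIZE * scale))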
newbie
Activity: 24
Merit: 1
January 31, 2013, 08:13:41 AM
#22
caveden - Thanks for the link to the thread from 2010.  It's interesting that many people, including Satoshi, were discussing this long before the limit was approached.  And I definitely agree that having it hard-coded in will make it much harder to change in 2014 than in 2010.

da2ce7 - I understand your opposition to any protocol change.  Makes sense; we signed up for 1MB blocks, so that's what we stay with.  What I'd like to know is: what would your response be if there were a widespread protocol change?  If version 0.9 of the qt-client had some type of increased or floating max block size (presumably with something like what solex proposes), would you:

- refuse the change and go with a small-block client
- grudgingly accept it and upgrade to a big-block client
- give up on bitcoin altogether?

I worry about this scenario from a store-of-value point of view.  Bitcoins are worth something because of the decentralized consensus of the block chain.  To me, anything that threatens that consensus threatens the value of my bitcoins.  So in my case, whether I'm on the big-block side or the small-block side, I'm actually just going to side with whichever side is bigger, because I feel the maintenance of consensus is more valuable than any benefits / problems based on the details of the protocol.  Saying you reject it on "moral" terms though makes me think you might not be willing to make that kind of pragmatic compromise.

That said, 1MB is really small.  I'm trying to envision a world-finance-dominating network with 1MB blocks every 10 minutes and it's tough.  While there are lots of great ideas, it does seem to defeat the purpose a little bit to have the vast majority of transactions taking place outside the blockchain.
And if the 1MB limit does stay, it calls into question the need for client improvements in terms of efficiency and so on.  If the blocks never get appreciably bigger than they are now, any half-decent laptop made in the past few years can handle being a full node with no problem.

Perhaps a better question, then, to ask people here is: The year is 2015.  Every block is a megabyte.  Someone wrote a new big-block client fork, and people are switching.  What will you do?
staff
Activity: 4200
Merit: 8441
January 31, 2013, 07:56:22 AM
#21
Couldn't the block versioning be used as already described below regarding the introduction of version 2?
[...]
The result is close to a "soft fork" with minimum risk and disruption. This would prevent some of the worst blockchain forking scenarios described above.
In our normal language a softforking change is one which is fully backwards compatible. They are changes which never produce behavior that an original bitcoin node would recognize as a violation of the rules which make bitcoin ... bitcoin.  What you're trying to describe is a coordinated hardfork, which is what you'd need to do to change any of the system fundamentals, e.g. change the supply of coins or the time between blocks— something we've never done— and something that isn't easily made safe.

Softforking changes are safe so long as a sufficient super-majority of mining power is on board... to older nodes they just look like some txns are indefinitely delayed and some blocks are surprisingly orphaned, but no rules are violated.

A hardforking change requires almost universal adoption by bitcoin users (note: not miners; miners are irrelevant for a hardforking change: a miner that doesn't follow one that is followed by all the users simply isn't a miner anymore), so taking a count of miners is not a good or safe way to go about it.  The obvious way to implement one would be to achieve sufficient consensus, and then strike a fixed switchover time at some point in the future. ... though the savvy analyst is asking themselves what happens when the next revision of the rules is prejudicial to their interests?...
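
As a trivial illustration of such a fixed switchover: once consensus were reached, the rule change would simply be keyed to an agreed-upon point in the chain. The height and sizes below are made-up examples:

Code:
FORK_HEIGHT = 300000  # hypothetical agreed-upon flag day

def max_block_size(height):
    # Old rule before the switchover point, new rule after it.
    return 1000000 if height < FORK_HEIGHT else 10000000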

When Bitcoin's behavior is merely a system of computer rules you can trust it because you (or people you trust who read code) can point to the rules and say "it is so because of cryptographic proof, the mathematics of the program make it thusly".  If the rules are up for revision by popularity contest or whatever system you like— then you have a much more complicated trust equation where you have to ask if that process will make decisions which are not only wise but also respect your needs. Who will cast the ballots, who will count them? Even if the process is democratically fair— is it something that lets the wolves vote to eat the sheep or does the process somehow respect personal liberty and autonomy?  All the blockchain distributed consensus stuff starts sounding easy by comparison.

An alternative theory I present is: if some hardforking change is so valuable, why couldn't an altcoin prove that value and earn its place in the free market and eventually supplant the inferior alternative? Why is that inferior to changing the immutable (within the context of the system) rules when doing so is against the will of any of its users[1]?  Or to use the language of libertarian dogma: Must change only come by force?   Can any blockchain cryptocurrency survive if it becomes a practice and perception that the underlying software contract will be changed?

Hardforks: There be technological and philosophical dragons.


[1] if the rules are subtly broken and ~everyone agrees that they /must/ be changed that is another matter and not the subject I'm talking about.
legendary
Activity: 1428
Merit: 1000
January 31, 2013, 06:34:27 AM
#20
What do you think about a dynamic block size based on the number of transactions in the last blocks?

That's easily exploited. The limit shouldn't depend entirely on the block chain.

Here's one idea:
The block size limit doesn't need to be centrally determined. Each node could automatically set its max block size to a calculated value based on disk space and bandwidth: "I have 100 GB disk space available, 10 MB per 10 minutes download speed and 1 MB per 10 minutes upload speed, so I'll stop relaying blocks [discouraging them] if they're near 1/8 MB [enough for each peer] and stop accepting them at all if they're over 2MB because I'd run out of disk space in less than a year at that rate". If Bitcoin ends up rejecting a long chain due to its max block size, it can ask the user whether he wants to switch to a lightweight mode.

Users could also specify target difficulty levels that they'd like the network to have and reduce their max block size when the network's actual difficulty level drops below that. A default target difficulty level could maybe be calculated based on how fast the user's computer is -- as users' computers get faster, you'd expect mining to also get faster.

I don't like that approach very much, because I think it gives too much influence to nodes.
What about this one: block size is determined by median transaction fees?

This is not very easy to game (unless you are a big pool, which should want to reduce the block size anyway, so there is no incentive).
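
One way to read that (my interpretation; the target fee, step size and bounds are invented): let the limit drift up when median fees run above a target and drift down when they run below it:

Code:
from statistics import median

TARGET_FEE = 0.0005  # BTC per transaction, hypothetical target
STEP = 1.10          # adjust the limit 10% at a time
MIN_SIZE, MAX_SIZE = 1000000, 100000000

def next_max_block_size(current_limit, recent_tx_fees):
    m = median(recent_tx_fees)
    if m > TARGET_FEE:   # fee pressure: allow bigger blocks
        return min(MAX_SIZE, int(current_limit * STEP))
    if m < TARGET_FEE:   # slack: tighten the limit again
        return max(MIN_SIZE, int(current_limit / STEP))
    return current_limit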
hero member
Activity: 756
Merit: 522
January 31, 2013, 06:22:57 AM
#19
Changing this limit needs to be discussed now, before we start hitting it. 

This has been discussed for a while.

I used to support the idea of an algorithm to recalculate the limit, as it's done for the difficulty. But currently I just think miners should be able to create their own limits together with multiple "tolerance levels", like "I won't accept chains containing blocks larger than X unless it's already N blocks deeper than mine". Each miner should set their own limits. That would push towards a consensus. Miners with limits too different from the average would end up losing work. The point is that this way the consensus is achieved through "spontaneous order" (decentralized), and not via a top-down decision.

That said, I do have the feeling that this change will only be scheduled once we start hitting the limit.

Probably the most sensible approach.
legendary
Activity: 1106
Merit: 1004
January 31, 2013, 05:55:26 AM
#18
Changing this limit needs to be discussed now, before we start hitting it. 

This has been discussed for a while.

I used to support the idea of an algorithm to recalculate the limit, as it's done for the difficulty. But currently I just think miners should be able to create their own limits together with multiple "tolerance levels", like "I won't accept chains containing blocks larger than X unless it's already N blocks deeper than mine". Each miner should set their own limits. That would push towards a consensus. Miners with limits too different from the average would end up losing work. The point is that this way the consensus is achieved through "spontaneous order" (decentralized), and not via a top-down decision.

That said, I do have the feeling that this change will only be scheduled once we start hitting the limit.
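
For illustration, here's a sketch of what one miner's tolerance table might look like; the size thresholds and depths are example values, and each miner would pick their own:

Code:
TOLERANCES = [
    (1000000, 0),    # up to 1 MB: build on it immediately
    (2000000, 6),    # up to 2 MB: only once it's 6+ blocks deeper
    (5000000, 30),   # up to 5 MB: only once it's 30+ blocks deeper
]

def accepts_chain(largest_block_size, depth_beyond_my_tip):
    """Would this miner switch to a chain whose largest block is
    largest_block_size bytes and which is depth_beyond_my_tip blocks
    deeper than the chain it is currently mining on?"""
    for max_size, required_depth in TOLERANCES:
        if largest_block_size <= max_size:
            return depth_beyond_my_tip >= required_depth
    return False  # bigger than any tolerance: never accept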
legendary
Activity: 1078
Merit: 1002
100 satoshis -> ISO code
January 31, 2013, 05:53:14 AM
#17
The max block size seems to me to be a very important issue, because 1MB is certainly too small to support a global currency with a significant user base, even if bitcoin serves just a core function as a currency rather than an all-singing, all-dancing payment system. Like everyone here I would very much like to see bitcoin one day replace the disastrously managed fiat currencies.

My question is: Does increasing the max block size really need to be a hard fork?

Couldn't the block versioning be used as already described below regarding the introduction of version 2?

"As of version 0.7.0, a new block version number has been introduced. The network now uses version 2 blocks, which include the blockheight in the coinbase, to prevent same-generation-hash problems. As soon as a supermajority, defined as 95% of the last 1000 blocks, uses this new block version number, this version will be automatically enforced, rejecting any new block not using version 2."  (source http://blockorigin.pfoe.be/top.php)

Let's say a block size solution is determined, such as a variable limit or a simple increase to a new fixed value, and it is planned for block version 3.

The new software change could be inactive until a supermajority of the last 1000 blocks are version 3. Then the change to the max block size becomes active. The result is close to a "soft fork" with minimum risk and disruption. This would prevent some of the worst blockchain forking scenarios described above.
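
In code terms the trigger would mirror the version-2 rule quoted above; a sketch, with the version number and threshold as placeholders:

Code:
WINDOW = 1000     # look at the last 1000 blocks
THRESHOLD = 950   # 95% supermajority, as with version 2
NEW_VERSION = 3

def new_size_rules_active(last_block_versions):
    # last_block_versions: versions of the most recent WINDOW blocks.
    recent = last_block_versions[-WINDOW:]
    return sum(1 for v in recent if v >= NEW_VERSION) >= THRESHOLD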
hero member
Activity: 756
Merit: 522
January 31, 2013, 05:19:23 AM
#16
Unspent outputs at the time of the fork can be spent once on each new chain.  Mass confusion.

No, this is actually great insurance for Bitcoin users. Practically it says that if you get Bitcoins now and Bitcoin later forks, you will have your Bitcoins in each and every individual fork. You can never "lose" your Bitcoins for being "on the wrong side" of the fork, because you'll be on all sides.

This incidentally also offers a very efficient market mechanism for handling the issue: people will probably be interested in selling fork-x Bitcoins they own to buy more fork-y Bitcoins if they believe fork-y is good or fork-x bad. This imbalance of supply and demand will quickly bring the respective price ratios to a point where continuing the "bad" fork is economically unfeasible (sure, miners could continue mining forever from a technical standpoint, but in reality people with infinite bank accounts are rare).

Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.

Bitcoin is valuable because of scarcity. One of the important scarcities is the limited supply of coins, another is the limited supply of block-space: Limited blockspace creates a market for transaction fees, the fees fund the mining needed to make the chain robust against hostile reorganization.

This is actually true.

(And the worst thing that can possibly happen to a distributed consensus system is that it fails to achieve consensus. A substantial persistently forked network is the worst possible failure mode for Bitcoin: spend all your own coins twice!  No hardfork can be tolerated that wouldn't result in a thoroughly dominant chain with near certain probability)

This is significantly overstated.

Surely from an "I want to be THE BITCOIN DEV!!!" perspective that scenario is the very avatar of complete and unmitigated disaster. The fact is however that most everyone currently propping their ego and answering the overwhelming "what is your point in this world and what are you doing with your life" existentialist questions with "I r Bitcoin Dev herp" will be out before the decade is out, and that includes you. Whether Bitcoin forks persistently or not, you still won't be "in charge" for very much longer.

Knowing that I guess you can view the matter a little closer to what it is: who cares? People do whatever they want. If they want eight different Bitcoin forks, more power to them. It will be even more decentralized that way, it will be even more difficult for "government" to "stop it" - heck, it'd be even impossible to know what the fuck anyone's talking about anymore. That failure mode of horror can very well be a survival mode of greatness, in the end. Who knows? Not me. Not you either, for that matter.

It's important to distinguish Bitcoin the currency and Bitcoin the payment network.  The currency is worthwhile because of the highly trustworthy extreme decentralization which we only know how to create through a highly distributed and decentralized public blockchain.  But the properties of the blockchain that make it a good basis for an ultimately trustworthy worldwide currency do _not_ make it a good payment network.  Bitcoin is only as much of a payment network as it must be in order to be a currency and in order to integrate other payment networks.

This is also very true. Bitcoin is not a payment network any more than a girl who went to Stanford and graduated top of her class is a cook: only for that limited interval where she's stuck with it. Course I've been saying that for a year now and pretty much everyone just glazes over and goes into derpmode. I guess it's a distinction whose time has not yet come or something.
administrator
Activity: 5222
Merit: 13032
January 31, 2013, 05:12:09 AM
#15
What do you think about a dynamic block size based on the number of transactions in the last blocks?

That's easily exploited. The limit shouldn't depend entirely on the block chain.

Here's one idea:
The block size limit doesn't need to be centrally determined. Each node could automatically set its max block size to a calculated value based on disk space and bandwidth: "I have 100 GB disk space available, 10 MB per 10 minutes download speed and 1 MB per 10 minutes upload speed, so I'll stop relaying blocks [discouraging them] if they're near 1/8 MB [enough for each peer] and stop accepting them at all if they're over 2MB because I'd run out of disk space in less than a year at that rate". If Bitcoin ends up rejecting a long chain due to its max block size, it can ask the user whether he wants to switch to a lightweight mode.

Users could also specify target difficulty levels that they'd like the network to have and reduce their max block size when the network's actual difficulty level drops below that. A default target difficulty level could maybe be calculated based on how fast the user's computer is -- as users' computers get faster, you'd expect mining to also get faster.
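
Working through the arithmetic in the disk/bandwidth example above (my reconstruction of the numbers, not a spec): with 1 MB per 10 minutes of upload shared among 8 peers, about 1/8 MB per block is what can be relayed in time, and with 100 GB free, blocks much over ~1.9 MB would fill the disk within a year:

Code:
BLOCKS_PER_YEAR = 52560  # one block every 10 minutes

def node_limits(free_disk_bytes, upload_bytes_per_10min, peers=8):
    # Soft limit: stop relaying blocks too big to upload to every
    # peer within one block interval.
    relay_limit = upload_bytes_per_10min // peers
    # Hard limit: stop accepting blocks that would fill the disk in
    # under a year at one block per 10 minutes.
    accept_limit = free_disk_bytes // BLOCKS_PER_YEAR
    return relay_limit, accept_limit

# The example numbers from the post: 100 GB disk, 1 MB/10min upload.
print(node_limits(100 * 10**9, 10**6))  # -> (125000, 1902587)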
legendary
Activity: 1428
Merit: 1000
January 31, 2013, 05:04:52 AM
#14
What do you think about a dynamic block size based on the number of transactions in the last blocks?