
Topic: How a floating blocksize limit inevitably leads towards centralization

sr. member
Activity: 340
Merit: 250
GO http://bitcointa.lk !!! My new nick: jurov
Ok, so a 10GiB block is unacceptably large. What about a 5Gib block? Or a 1GiB block? Or a 500MiB block? At some point the block will be confirmed by a large fraction of the hashing power, but not all the hashing power. The hashing power that couldn't process that gigantic block in time has effectively dropped off of the network, and is no longer contributing to the security of the network.

So repeat the process again. It's now easier to push an even bigger block through, because the remaining hashing power is now less. Maybe the hashing power has just given up on Bitcoin mining, maybe they've redirected their miners to one of the remaining pools that can process such huge blocks; either way, bit by bit the process inevitably leads to centralization.
This requires a conspiracy of the biggest pool owners commanding a majority of the hash power. But if such a conspiracy is going to happen, it will result in abuse of the current rules (and any updated rules) anyway. So I consider this argument a straw man. With any competition, issuing a 5GiB block out of the blue carries a serious risk of it becoming orphaned, and Gavin's 5-second proposal supports exactly that.
legendary
Activity: 1232
Merit: 1094
So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, and isn't so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the help of the propagation window being maintained through its difficulty adjustment, we may be able to determine whether the propagation of blocks is slowing and whether the max_blocksize should be adjusted down to ensure the propagation window remains stable.

A measure of how fast blocks are propagating is the number of orphans.  If it takes 1 minute for all miners to be notified of a new block, then on average the orphan rate would be about 10%.
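To sanity-check that 10% figure, here is a quick back-of-the-envelope Python calculation; the Poisson arrival model and the fixed 60-second propagation delay are my own simplifying assumptions, not anything from the thread:

Code:
import math

BLOCK_INTERVAL = 600.0    # mean seconds between blocks
PROPAGATION_DELAY = 60.0  # assumed time for a block to reach all miners

# Probability that at least one competing block is found somewhere on the
# network while the first block is still propagating (Poisson arrivals).
orphan_rate = 1.0 - math.exp(-PROPAGATION_DELAY / BLOCK_INTERVAL)
print(f"expected orphan rate: {orphan_rate:.1%}")  # ~9.5%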

However, a core of miners on high speed connections could keep that figure down, and orphans are by definition not part of the block chain, so there is no direct way to count them.

Maybe add an orphan link as part of the header field.  If included, the block links back to 2 previous blocks, the "real" parent and the orphan (this has no effect other than proving the link).  This would allow counting of orphans.  Only orphans one block off the main chain would be acceptable.  Also, the header of the orphan block is sufficient; the actual block itself can be discarded.

Only allowing max_block_size upward modification if the difficulty increases seems like a good idea too.

A 5% orphan rate probably wouldn't knock small miners out of things.  Economies of scale are likely to be more than that anyway.

Capping the change by 10% per 2 week interval gives a potential growth of 10X per year, which is likely to be at least as fast as the network can scale.

So, something like

Code:
if ((median of last 2016 blocks < 1/3 of the max size && difficulty_decreased) || orphan_rate > 5%)
 max_block_size /= 8th root of 2
else if(median of last 2016 blocks > 2/3 of the max size && difficulty_increased)
 max_block_size *= 8th root of 2 (= 1.09)
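A runnable Python sketch of that rule, just to make it concrete; the function signature and variable names are mine, while the 1/3 and 2/3 thresholds, the 5% orphan rate, and the eighth-root-of-2 step come from the pseudocode above:

Code:
from statistics import median

EIGHTH_ROOT_OF_2 = 2 ** (1 / 8)  # ~1.09; 26 periods/year allows at most ~9.5x annual growth

def adjust_max_block_size(max_block_size, last_2016_block_sizes,
                          old_difficulty, new_difficulty, orphan_rate):
    """Apply the rule above once per 2016-block retarget period."""
    med = median(last_2016_block_sizes)
    if (med < max_block_size / 3 and new_difficulty < old_difficulty) or orphan_rate > 0.05:
        return max_block_size / EIGHTH_ROOT_OF_2
    if med > 2 * max_block_size / 3 and new_difficulty > old_difficulty:
        return max_block_size * EIGHTH_ROOT_OF_2
    return max_block_size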

The issue is that if you knock out small miners, a cartel could keep the orphan rate low, and thus prevent the size from being reduced.
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
Making the system less usable by artificial restrictions while allowing more people to maintain it. This is what we're basically talking about. Just making sure everyone knows.

As I said though, the block size limit is a difficult question because there are other issues with it besides the requirements of being a full node and the status of the mining market. Personally I think that the incentive of mining itself is in question long term if there is no scarcity at all in the block sizes, but I could be wrong. That is basically the final test of proof of work, and a different issue in itself, but still.
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
Very interesting discussion. Similar debates have been going on for a long time for sure, and there is clearly some conflict on what Bitcoin is supposed to be.

Personally I find it ridiculous that we would even be thinking of handicapping Bitcoin simply because it allows us to be "maximally decentralized". I have long since accepted the fact that not everyone will be able to run a full node, nor can everyone achieve profitability in mining. Being less decentralized does not mean it's not decentralized. I believe there will have to be some sort of a compromise.

We should not accept that Bitcoin is going to become this network for high value transactions alone. That is basically the last option. Bitcoin is about decentralization, yes, but it's also a tool for making _small_ international payments affordably. With that being said, I will comment on some specific issues relating to the debate.

First of all, being a full node and being a miner is already more restricted than it was before. Running a full node is already a pain in the ass (0.8 helps, true, but still), GPU mining is becoming a thing of the past, and so on. Even now the Bitcoin backbone isn't something everyone can help maintain; it's quite understandable that it will be even less so in the future. This doesn't mean Bitcoin will become centralized, that is simply not true. It will become more centralized, but that is different.

I find it ridiculous that we would intentionally handicap the technology so that everyone can fully help maintain the backbone of the network, or to keep miners on an equal footing. Miners are never on an equal footing. Some people have more know-how and they tweak their hardware and software for better performance. Some have lower electricity costs. In the future, bandwidth will also be a factor; so what?

The whole bandwidth and storage question is mind-bogglingly stupid. I've had a 10 Mbps+ connection for over 10 years. In fact almost everyone in this country can get a 100 Mbps connection fairly cheaply, which is something I have now. I used to have 200 Mbps. Gigabit home connections are basically around the corner. Storage space is getting cheaper every day. CPU is apparently not the problem, so what is the problem? I have not seen any proof that Bitcoin is becoming a "bank only" technology, I have not seen that these higher requirements amount to needing a supercomputer.

So in conclusion, running a full node, or being a competitive miner, might have increased requirements in the future. So fucking what? Mining is a highly competitive market, isn't that supposed to happen? If Bitcoin becomes as big as PayPal, is running a full node something everyone should be able to do? Of course not, that is ridiculous. Handicapping Bitcoin so that everyone can, is even more ridiculous.

All of this being said, I'm not sure the block limit should be lifted entirely. There are many issues with it, one being the transaction fee structure and the entire profitability of mining (which is crucial). I believe making conservative raises in the limit should be the goal for now, until a robust method of automatically adjusting block limit has been agreed upon. An automatically adjusting block limit is the best solution long term, but until we have a good method for doing that, it should not be done.

Finally, I have a few things to say about the profit motive. Zeilat is the only one who got it right in his first post. Miners have a great incentive to not fuck with the transactions and screw with the blocks. The more they do that, the more it would handicap Bitcoin usage, thus likely leading to a decrease in Bitcoin price, which would in turn decrease profits. That's profit motive for you guys.

The whole profit question is often like the blind leading the blind. The only situation where there would likely be significant problems is if miners can 1) get very significant short term advantage by screwing things up and 2) not get caught and 3) have a low IQ. If this is the case, I can admit there is a problem. Otherwise I don't see it. Especially when we're talking about a more centralized mining network. That means there will be more and more actual mining companies and major projects, which are concerned for their longer term profits as well.

Making more in the short term just to fuck up the long term doesn't make much business sense, does it? Especially if the short-term advantage isn't enormous and if the miner is big enough to actually know that his own actions will affect the network itself. It would not be rational; it would in fact be against the profit motive.

From what I've seen, a large majority of miners are thinking long term and are fairly rational, which leads me to the conclusion that short term profit exploits are not such a major concern that we should put too much weight on them when thinking about how we develop the technology.
Ari
member
Activity: 75
Merit: 10
So...  I start from "more transactions == more success"

I strongly feel that we shouldn't aim for Bitcoin topping out as a "high power money" system that can process only 7 transactions per second.

If bitcoin operated at the scale of PayPal, ~87 transactions per second, that would require roughly 20MB blocks.  That's large, but not unmanageable.
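The 20MB figure is simple arithmetic; here is the back-of-the-envelope version in Python (the ~400-byte average transaction size is my own assumption, the throughput and block interval come from the post):

Code:
TPS = 87               # PayPal-scale transactions per second, per the post
BLOCK_INTERVAL = 600   # seconds per block
AVG_TX_BYTES = 400     # assumed average transaction size

txs_per_block = TPS * BLOCK_INTERVAL        # 52,200 transactions
block_bytes = txs_per_block * AVG_TX_BYTES  # ~20.9 million bytes
print(f"{txs_per_block:,} txs per block, ~{block_bytes / 1e6:.0f} MB")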

People paid PayPal US$ 5.6 billion in fees last year.  There's no way the bitcoin network is going to cost that much.
staff
Activity: 4284
Merit: 8808
In this thread I'm a bit disappointed in Gavin. I used to see him as a very conservative project leader, only including changes when there's community consensus about it and no doubt about its security implications.
I'm not disappointed. Although I don't currently agree with Gavin on this, I believe he has clearly behaved in a praiseworthy manner: his responses are understated and considerate— which must be difficult when the opposition (e.g. me) might be read as saying "OMG YOU'RE CENTRALIZING BITCOIN YOU MONSTER". This is hard stuff with long-term consequences that none of us can truthfully say we completely understand and, at the moment, without the kind of immediate urgency (or even running code) which would help crystallize thinking.

Above all other things, I'm glad that he's taking some time to add his thoughts to this when it isn't an immediate issue.  A lesser, not-consensus-driven developer would just rest quietly, comfortable that there would eventually be an emergency that would allow pushing their way through... Gavin isn't like that, which is why you even get to hear what he's thinking on this while it's still just a philosophical debate.
staff
Activity: 4284
Merit: 8808
... and at this point your (and gmaxwell's) imagination seems to run out, and you throw up your hands and say "We Must Limit Either T or P."
Really?
This may, in fact, be the first time in my life that I've been accused of not having enough imagination. First the grey hairs, now this... I guess I'm finally getting old. Smiley

But perhaps you've not been paying attention where I've been throwing ideas against the wall. My imagination has far from run out. (Nor has PeterTodd's, as he's independently come up with some of the same ideas— as well as many that I haven't thought of...)

…Not to mention a zillion other less workable ideas that are waiting for some breakthrough or another to make them viable… mostly I try to spare the public the more inane stuff... and certainly I've had no shortage of imagination for all the technical and economic failure modes that can result from removing part of the scarcity that drives a cryptocurrency.  We've even had non-blockchain-scarce altchains— Liquidcoin (for speculation)— perhaps you don't remember because it imploded into complete non-convergence before half the people who wanted to run it even got it installed (though its 'innovation' was uncapping the block rate instead of uncapping the blocksize).

Generally the ideas to improve Bitcoin's scalability seem to only change the constant factors in the cost equation— N users each creating ∝N transactions is still quadratic even if only a fraction of them validate. Or at least, so far, the in-blockchain ideas don't allow unbounded scaling (as blockchain-external things like fidelity chaum-banks, however, do). As far as I can tell, if we're to have a system which is practically decentralized and secure— then there must be limits for technical and economic reasons.  I _hope_ that if we're clever enough we can improve scale without compromising decentralization, but examples from existing currencies and the state of the technology make me more confident in both centralized and decentralized off-chain approaches— which we'll have to have no matter what happens with blockchain scaling.   (And it bears pointing out: Off-chain DOES NOT MANDATE CENTRALIZED, though it seems that plenty of people love centralization, and it's hard to argue against it for low-value things).

Quote
I think we should put users first. What do users want? They want low transaction fees and fast confirmations. Lets design for that case, because THE USERS are who ultimately give Bitcoin value.
I want a pony. Made of cake. Special healthy cake that makes you lose weight. If I can choose without compromise of course I'll want all the features and none of the costs.

The users of Bitcoin— with their understandable desire for fast transactions and low fees— are, above all, users of _BITCOIN_.  If fast transactions and low fees were their utmost priority they would be using VISA and the USD, which, among numerous other advantages, have deep and fundamental advantages over direct Bitcoin transactions for achieving low transaction costs and high speed.  Decentralization is costly as heck, and if you punt on it you can scale much better (go look at what Ripple is doing— even though it has gone the rather uninspired Bitcoin-like blockchain-ish global consensus route— it has replaced the scarce-resource-consumption consensus with trusted verifiers that produce blocks as fast as they achieve a majority).

Blockchains are just simply bad at fast and cheap. They are a zero-information-hiding global flooding network (==costly) that depends on achieving a worldwide consensus (==slow). When a butterfly in China buys a dumpling with Bitcoin, processors in Nebraska must whirl, and everyone must wait until we've had time to hear back about all the whirling or risk having their transactions reversed. And in spite of being slow, costly, and with a heaping helping of the market-failure-maker of externalized costs... it turns out that blockchains can actually do a whole lot. An O(N^2) algorithm can be quite fast when N is small. And with enough thrust (compromise) even a pig can fly (scale). But we shouldn't mistake that for scaling without compromise— because, at least as of yet, no one has figured out how.

I believe that Bitcoin plus a nice mixture of off-chain transaction processing can be all things to all people: It can provide unbreakable trust based on cryptographic proof, and it can provide infinitely scalable transactions. The chain doesn't scale great, but it can scale enough to handle high value transactions (I pay $30 for an international wire today) and scale enough to interlink infinitely scalable additional mechanisms.  People can pick their blend of security vs cost by choosing how they transact.  If, instead, we try to make the blockchain do it all directly, we get nothing: we lose the zero-trust property but would still not be as fast or cost effective as systems which don't pretend to be decentralized. In doing so we'd force people— against their will and interests— to accept the system changing out from under them and forcing them to trust other parties. They may rightfully ask— "When fast and cheap transactions 'require' pegging the subsidy and creating inflation, will respecting the promises that were made to me once again be considered negotiable for the greater good because I'm not a core developer or big mining bank?"

... But at the same time, this is all greatly benefited by making Bitcoin scale better.  Any scaling improvement is equally an improvement of decentralization at a given level of scale, and I absolutely am working my brain overtime on how to make it better, and am open to listening to every idea presented.

Decentralization— offering an alternative monetary system which doesn't require all that messy and misplaced trust— is the fundamental purpose of Bitcoin: it is the one thing Bitcoin does which nothing else can hold a candle to.  It is messy and political and, at times, morally ambiguous. It is a fantastic experiment. It is the BIG IDEA that makes this all worth doing. This is why we're all here. But it doesn't always neatly translate into quarterly profits for Bitcoin startups or simply explained day-to-day benefits, and it drags in messy politics when you just want to shut up and code. Of course, Bitcoin actually has to _work_ for people, but if we can't make it work and be decentralized we might as well give up and go home.  The trust in classical currencies isn't eroded because of evil— it's eroded by the successive application of pragmatic local decision making which leaves people with nothing they can count on to hold true for long.  Bitcoin might show us a way out of that— with its awful suicide pact of inhuman razor-sharp mathematics made nearly immutable through decentralization— but only if we're ready to deal with the consequences.

Quote from: Mike Hearn
At the same time, as evidenced by the disagreement on this thread, there are too many unknown variables for us to figure out what will happen ahead of time. The only way to really find out is to try it and see what happens
That I can agree with— but I think not in the way you intended it. Some of the arguments about the uncapping risk are mooted if you only increase the maximum blocksize once it's apparent to everyone that the transaction load is some multiple of it. This way you don't get the fee race to the bottom.   You argue that removing the limits is okay because hopefully we won't commit suicide; I counter that it's even more okay not to back them off until it's apparent to all that they must be backed off.  Without a credible claim for long-term decentralization Bitcoin will still retain many of its flaws but have no selling point to the ordinary people you wish to attract. We can't have both by uncapping the chain, but we can have both with the additional systems.

Quote from: caveden
Only miners (and by that I mean solo miners + pool operators, not pooled miners) need to be a full node. They're your "everyone" there.
What you're describing there is a distributed system, not a decentralized system. If only some small number of wealthy nodes do all the validation— instead of pools, why not call them Central Banks?— then everyone else is forced by economic reality to trust them to behave honestly, and there is no economic force beyond a consensus of a small number of like-motivated parties to stop them from making helpful "optimizations" according to earnest beliefs about the greater good, or in response to coercion.

In Bitcoin, as it exists today, running a full node is dirt cheap and would remain dirt cheap forever— unless and until we have a hard fork with the consent of the users— and so you're not forced to trust anyone or anything beyond your ability to audit source code and your understanding of software and mathematics.  All modern banking is distributed, and Bitcoin as a distributed system isn't especially inspired: it's fragile and has high overhead.  In spite of this, because of decentralization we have something better. I hope we don't lose it because the rush to make MAD MONEY with speculation has left us with too large a community who doesn't realize how powerful the ideas behind the system actually are— and how those ideas are the fundamentals that make Bitcoin valuable, not how many transactions per second the system can handle (how many transactions per second does a gold bar process?).
legendary
Activity: 1120
Merit: 1152
Please people, understand one thing: you can't run a full payment network used by millions as a hobby. The day Bitcoin becomes something really serious (i.e., thousands of transactions per second), being a full node will necessarily be a professional task. It won't be something you can do on your laptop, "just for fun".

Right now the capacity of Bitcoin is about half a million transactions per day, so you can participate in that level of transactions as a hobby. The value of those transactions can be as high as required. If Bitcoin does become a widespread store of value, blocks will probably be transferring hundreds of millions of dollars' worth of value each, tens of billions every day.
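For reference, the half-a-million figure falls out of the current limits roughly like this (the ~250-byte average transaction size in this Python snippet is my own assumption):

Code:
MAX_BLOCK_BYTES = 1_000_000  # current 1 MB block size limit
AVG_TX_BYTES = 250           # assumed average transaction size
BLOCKS_PER_DAY = 144         # one block roughly every 10 minutes

txs_per_day = (MAX_BLOCK_BYTES // AVG_TX_BYTES) * BLOCKS_PER_DAY
print(f"~{txs_per_day:,} transactions per day")  # ~576,000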

But after all, it's just information, so yes, participating will be perfectly possible as a hobby, and for a fairly affordable fee, you'll be able to even make transactions directly on the world's decentralized value transfer service, the same system big banks will use.

EDIT: And, as Mike said, the idea of converting Bitcoin into some replacement for SWIFT with $20 fees for transactions, which would force people to use bank-like institutions for daily transfers, just because you want "ordinary people to verify transactions", totally turns me off. Bitcoin can be much more than that. If you actually want it to remain the censorship-resistant currency that it is, it has to remain suitable for small transactions like buying some plugin from Wordpress. If you want Bitcoin to remain an alternative for those seeking financial privacy, you have to keep it suitable for SR users and the like - otherwise all these "bank-like" payment processors would ruin your privacy. If you want Bitcoin to remain an alternative for those trying to protect their purchasing power from inflation, you have to keep it suitable for those who want to protect their daily money on their own, without having to use a bank just for storage purposes, which would recreate the incentive for fractional reserves. The list can go on. Bitcoin has the potential to be much more than SWIFT 2.0. But for that, processing transactions will have to become a professional activity (it kinda already is, actually).

Absolutely. But the solution isn't to make access to the core Bitcoin network, the thing that actually keeps Bitcoin inflation free and secure, require such a huge investment in computer hardware that only big banks and other large institutions can afford access. The solution is to keep blocks small, and build payment systems that work on top of the block chain.

Remember that if the blockchain is kept small enough that validating it is affordable, you don't have to trust the payment processors very much. The protocols will be designed in ways that allow anyone to prove fraud automatically and warn the whole world. The client software people use will see these fraud proofs, and immediately stop using the payment processor, putting them out of business. Yet at the same time, using technologies like chaum tokens, those payment processors can't even know where payments are going to; your privacy is even more protected than with on-chain transactions, because the links connecting one transaction to another are severed with unbreakable mathematics.

Do you think the banking crisis would have happened if banks were forced to have all their bank-to-bank transactions publicly recorded for the whole world to see? Keeping the blocksize limited does exactly that.

But maybe not, and if just one miner starts creating gigabyte blocks, while all the rest agrees on 10 MiB blocks, ugly block-shunning rules will be necessary to avoid such blocks from filling everyone's hard drive (yes, larger blocks' slower relay will make them unlikely to be accepted, but it just requires one lucky fool to succeed...).

Succeed in what? Killing everybody else? Do you realize that would likely require more than 50% of the network processing power, otherwise the "unacceptably-gigantic" block would always be an orphan? Miners would likely reject blocks that are way too large, especially if they're filled with transactions never seen before (i.e., a likely attempt at flooding).

Ok, so a 10GiB block is unacceptably large. What about a 5Gib block? Or a 1GiB block? Or a 500MiB block? At some point the block will be confirmed by a large fraction of the hashing power, but not all the hashing power. The hashing power that couldn't process that gigantic block in time has effectively dropped off of the network, and is no longer contributing to the security of the network.

So repeat the process again. It's now easier to push an even bigger block through, because the remaining hashing power is now less. Maybe the hashing power has just given up on Bitcoin mining, maybe they've redirected their miners to one of the remaining pools that can process such huge blocks; either way, bit by bit the process inevitably leads to centralization.

EDIT: And also, as a general comment on the discussion, you people fearing "too much centralization", as in "too few market participants", should realize that, at most, what would happen would be a few pool operators, like we have now. Pool operators do not own the processing power. Such processing power will remain scattered among thousands of people, who may easily migrate to different pools if they feel like it. Pretty much like what already happens. Current pools need to have some "professional bandwidth" if only for protecting against DDoS; it already requires professional resources to run a mining pool.

Pool operators do own hashing power if the miners contributing the hashing power can't effectively validate the blocks they mine.

If running a validating node requires thousands, or even tens of thousands, of dollars' worth of expensive equipment, how exactly do you expect to even find out that you've been mining at a dishonest pool? If >50% of the people mining and running validating pools decide to get together and create bogus transactions that create coins out of thin air, you won't even know they've been defrauding everyone.(1) If running a node requires tens of thousands of dollars worth of equipment, and it will to support Visa-scale transaction volumes, only a small handful of large banks are going to run nodes. I think you can see how collusion between half a dozen large banks becomes not just possible, but likely.

1) Yes, you can try to create automated fraud proof mechanisms to detect it - I wrote about the idea here - but implementing the software to process fraud proofs is extremely complex, much more complex than applying the same idea to keeping off-chain banking services honest. I also have little hope that those mechanisms will actually get written and tested before the much more simple step of just lifting the block limit is taken.

In this thread I'm a bit disappointed in Gavin. I used to see him as a very conservative project leader, only including changes when there's community consensus about it and no doubt about its security implications. And I liked that, even though it meant that some of the changes I support are not going to be included. For a monetary system, trust and stability are essential, and I hope Gavin will continue to provide that trust and stability, so hopefully he just considers abandoning the transaction limit as an academic "thought experiment", and not something he is planning to actually put into the code in the near term.

I agree 100%. Increasing the block limit seems like a conservative change - it's just one little number - but the long-term implications are enormous and have the potential to drastically change what Bitcoin is. It may be a conservative change for the small number of big businesses that are heavily invested in the current system, and can afford the network power to process large blocks, but it's not a conservative change for the rest of us.
legendary
Activity: 1526
Merit: 1134
I think there are some things we can all agree on.

We're all keen to see efficient protocols built on top of Bitcoin for things like micropayment channels (which allow lots of fast repetitive satoshi-sized payments without impacting the block chain), or trusted computing (which allows offline transactions to be carried around in long chains until final resolution). Also, the payment protocol should eliminate the most absurd abuses of micropayments, like SD's messaging system. These things fall into the class of "no brainers" and have been discussed for a long time already.

Other more exotic ideas like Ripple-style networks of payment routers using contracts don't seem against the spirit of Bitcoin if they keep the low trust aspects of the system.

At the same time, as evidenced by the disagreement on this thread, there are too many unknown variables for us to figure out what will happen ahead of time. The only way to really find out is to try it and see what happens. If Bitcoin does fail to scale then the end result will be a smaller number of full nodes but lots of people using the system - this is still better than Bitcoin being deliberately crippled so it never gets popular, because even if the number of full nodes collapses down to less than 1000, unknown future advances in technology might make it cheap enough for everyone to run a full node again. In the absence of a hard-coded limit the number of full nodes can flex up and down as supply and demand change. But with a hard-coded limit Bitcoin will fail to achieve popularity amongst ordinary people and will eventually be forgotten.

And now some specific replies:

Quote
Heck, while we're playing this game, find me a single major O(n^2) internet scaling problem that's actually been solved by "just throwing more hardware at it", because I sure can't.

I already gave you one. The internet backbone is a great analogue for Bitcoin because you have billions of users who connect up hierarchically through to the backbone which is a P2P broadcast network - exactly like what we have where users will run SPV nodes and connect up to a backbone of full nodes.

You say, oh, but people were worried about routing table scalability. They took measures to improve things. Yes, of course they did, just like we're doing with Bitcoin. Nobody is arguing that scalability improvements can be ignored, far from it. Along the way there will be problems and bottlenecks and rewrites and protocol improvements to change the slopes of the graph, but fundamentally the backbone scaled because routers got a lot more powerful (in particular, high speed RAM got a lot cheaper).

And when the internet backbone hit its own "block size limit" (the 16 bit ASN width), they expanded it to 32 bits. There was no talk of backbone scalability being an O(n^2) problem and who needs more than 65k networks anyway. They just bit the bullet, grew the system and here we all are today sending tiny messages over it. Success!

Quote
It's interesting you say that, I'm guessing you don't think there was a need for a Bitcoin Foundation then do you?

For funding development? Not really. But I am glad Gavin has the certainty of a salary rather than having to raise money all the time via assurance contracts. Also, giving him a salary means he has free rein to work on what he thinks is most important rather than what community members feel like funding, which is important when so much of the work he does is somewhat intangible (like testing infrastructure).

The Foundation is however very important for presenting a professional and united front to organizations that are genetically programmed to require one, like governments or the media. It's also a good organizational focal point. So I am a member for that reason.
cjp
full member
Activity: 210
Merit: 124
I mean, I'm not totally against a one-time increase if we really need it. But what I don't want to see is an increase used as a way to avoid the harder issue of creating alternatives to on-chain transactions. For one thing, Bitcoin will never make for a good micropayments system, yet people want Bitcoin to be one. We're much better off if people work on off-chain payment systems that complement Bitcoin, and there are plenty of ways, like remote attestation capable trusted hardware and fidelity bonds, that allow such systems to be made without requiring trust in central authorities.
This is exactly how I see it too!

In this thread I'm a bit disappointed in Gavin. I used to see him as a very conservative project leader, only including changes when there's community consensus about it and no doubt about its security implications. And I liked that, even though it meant that some of the changes I support are not going to be included. For a monetary system, trust and stability are essential, and I hope Gavin will continue to provide that trust and stability, so hopefully he just considers abandoning the transaction limit as an academic "thought experiment", and not something he is planning to actually put into the code in the near term.

I'm afraid Gavin still has this "low-fee micropayment" idea of Bitcoin. Gavin, if you read this: please abandon that idea. It stops you from seeing clearly where Bitcoin is useful and where it is not, and so it stops you from developing Bitcoin in the direction that will make it most successful. If you steer towards "low-fee micropayments", you will find yourself in a niche with more efficient competing technologies, where Bitcoin will die. Bitcoin's strength is as a store of value; the bulk of microtransactions can and should be done off-chain with other systems.
legendary
Activity: 1106
Merit: 1004
However, with no limit on block size, it effectively becomes miners who are in control of _everyone_'s block size. As a non-miner, this is not something I want them to decide for me.

Only miners (and by that I mean solo miners + pool operators, not pooled miners) need to be a full node. They're your "everyone" there.

Please people, understand one thing: you can't run a full payment network used by millions as a hobby. The day Bitcoin becomes something really serious (i.e., thousands of transactions per second), being a full node will necessarily be a professional task. It won't be something you can do on your laptop, "just for fun".

EDIT: And, as Mike said, the idea of converting Bitcoin into some replacement for SWIFT with $20 fees for transactions, which would force people to use bank-like institutions for daily transfers, just because you want "ordinary people to verify transactions", totally turns me off. Bitcoin can be much more than that. If you actually want it to remain the censorship-resistant currency that it is, it has to remain suitable for small transactions like buying some plugin from Wordpress. If you want Bitcoin to remain an alternative for those seeking financial privacy, you have to keep it suitable for SR users and the like - otherwise all these "bank-like" payment processors would ruin your privacy. If you want Bitcoin to remain an alternative for those trying to protect their purchasing power from inflation, you have to keep it suitable for those who want to protect their daily money on their own, without having to use a bank just for storage purposes, which would recreate the incentive for fractional reserves. The list can go on. Bitcoin has the potential to be much more than SWIFT 2.0. But for that, processing transactions will have to become a professional activity (it kinda already is, actually).

But maybe not, and if just one miner starts creating gigabyte blocks, while all the rest agrees on 10 MiB blocks, ugly block-shunning rules will be necessary to avoid such blocks from filling everyone's hard drive (yes, larger blocks' slower relay will make them unlikely to be accepted, but it just requires one lucky fool to succeed...).

Succeed in what? Killing everybody else? Do you realize that would likely require more than 50% of the network processing power, otherwise the "unacceptably-gigantic" block would always be an orphan? Miners would likely reject blocks that are way too large, especially if they're filled with transactions never seen before (i.e., a likely attempt at flooding).

There is of course a wide spectrum between "I can download the entire chain on my phone" and "Only 5 bank companies in the world can run a fully verifying node", but I think it's important that we choose what point in between is acceptable.

I'm glad you see it's not so black-and-white. I'm sad though that you think you can actually "choose what point in between is acceptable". Such a point cannot be chosen arbitrarily, not least because it is not fixed: it varies all the time according to different demands for security, transaction space, speed etc., and different supplies of resources. Basically, you'll never even be able to list all the data necessary to make such a decision, as it involves the subjective opinions of thousands and eventually millions of people. Let alone calculate anything.
Then I think you misunderstand what a hard fork entails. The only way a hard fork can succeed is when _everyone_ agrees to it.

Only if you want to totally avoid the fork, actually. Otherwise, you may have both chains living in parallel for some time. Then eventually one might "beat" the other, as people migrate. Or both stay alive "forever"; that's a possibility too.
The more I see these discussions, the less I believe in totally avoiding the fork.

EDIT: And also, as a general comment on the discussion, you people fearing "too much centralization", as in "too few market participants", should realize that, at most, what would happen would be a few pool operators, like we have now. Pool operators do not own the processing power. Such processing power will remain scattered among thousands of people, who may easily migrate to different pools if they feel like it. Pretty much like what already happens. Current pools need to have some "professional bandwidth" if only for protecting against DDoS; it already requires professional resources to run a mining pool.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Great posts from Mike and Gavin in this thread. There's indeed no reason to panic over "too much centralization". Actually, setting an arbitrary limit (or an arbitrary formula to set the limit) is the very definition of "central planning", while letting it get spontaneously set is the very definition of "decentralized order".

Also, having fewer participants in a market because these participants are good enough to keep aspiring competitors at bay is not a bad thing. The problem arises when barriers of entry are artificial (legal, bureaucratic etc), not when they're part of the business itself. Barriers of entry as part of the business means that the current market's participants are so advanced that everybody else wanting to enter will have to get at least as good as the current participants for a start.

Removing the block cap means a hard fork, and once we decided to do that we may as well throw in some "no brainer" upgrades as well, like supporting ed25519 which is orders of magnitude faster than ECDSA+secp256k1. Then a single strong machine can go up to hundreds of thousands of transactions per second.

That's cool. Please, core devs, consider studying what other hard fork changes would be interesting to put in, because we risk hitting the 1MB limit quite soon.

+1
Fully agree.

Further. In the OP an example of slow bandwidth for a political reason was given. We can't let bitcoin be handicapped because of local problems affecting just a small part of the network. It can't operate like a kindergarten going only at the speed of the slowest pupil.
legendary
Activity: 1120
Merit: 1152
However, with no limit on block size, it effectively becomes miners who are in control of _everyone_'s block size. As a non-miner, this is not something I want them to decide for me. Perhaps the tragedy of the commons can be avoided, and long-term rational thinking will kick in, and miners can be trusted with choosing an appropriate block size. But maybe not, and if just one miner starts creating gigabyte blocks, while all the rest agrees on 10 MiB blocks, ugly block-shunning rules will be necessary to avoid such blocks from filling everyone's hard drive (yes, larger blocks' slower relay will make them unlikely to be accepted, but it just requires one lucky fool to succeed...).

Well, read my initial post at the top; the fact that larger blocks don't propagate to the network as a whole actually benefits the miner, because provided the blocks propagate to more than 50% of the effective hashing power, the part that doesn't get the block is effectively wasting its mining effort and is taken out of the competition.
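A toy Python illustration of that effect; the 25% and 60% figures are mine, chosen only to make the point visible:

Code:
my_share = 0.25           # fraction of total hashpower the block producer controls (illustrative)
others_keeping_up = 0.60  # fraction of the remaining hashpower that received the block in time

# While the stragglers grind away on a stale tip, only the hashpower that has
# the new block is effectively competing for the next one.
effective_competition = my_share + others_keeping_up * (1 - my_share)
effective_share = my_share / effective_competition
print(f"nominal share {my_share:.0%} -> effective share {effective_share:.0%}")  # 25% -> ~36%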

Additionally, even if miners see a rational reason to keep block sizes low, which I already doubt, allowing them to control the size gives irrational miners who are trying to actively damage Bitcoin another way to do so. Right now all that an evil miner can do is either help double-spend attempts, easily defeated with confirmations, or launch a 51% attack, so far defeated with large amounts of hashing power. We don't want to give malicious people yet more ways to damage Bitcoin, especially ones which are actually profitable in the short term.

My suggestion would be a one-time increase to perhaps 10 MiB or 100 MiB blocks (to be debated), and after that an at-most slow exponential further growth. This would mean no for-eternity limited size, but also no way for miners to push up block sizes to the point where they are in sole control of the network. I realize that some people will consider this an arbitrary and unnecessary limit, but others will probably consider it dangerous already. In any case, it's a compromise and I believe one will be necessary.

I mean, I'm not totally against a one-time increase if we really need it. But what I don't want to see is an increase used as a way to avoid the harder issue of creating alternatives to on-chain transactions. For one thing, Bitcoin will never make for a good micropayments system, yet people want Bitcoin to be one. We're much better off if people work on off-chain payment systems that complement Bitcoin, and there are plenty of ways, like remote attestation capable trusted hardware and fidelity bonds, that allow such systems to be made without requiring trust in central authorities.

I would hate to see the limit raised before the most inefficient uses of blockchain space, like satoshidice and coinad, change the way they operate. In addition I would hate to see alternatives to raising the limit fail to be developed because everyone assumes the limit will be raised. I also get the sense that Gavin's mind is already made up and the question to him isn't if the limit will be raised, but when and how. That may or may not be actually true, but as long as he gives that impression, and the Bitcoin Foundation keeps promoting the idea that Bitcoin transactions are always going to be almost free, raising the block limit is inevitable.

Anyway, there are plenty of good reasons to have off-chain transactions regardless of what the block limit is. In particular they can confirm instantly, so no waiting 6 blocks, and they can use chaum tokens to truly guarantee that your transactions are private and your personal financial information stays personal.
donator
Activity: 994
Merit: 1000
I think we should put users first. What do users want? They want low transaction fees and fast confirmations.
This comes down to Bitcoin as a payment network versus Bitcoin as a store of value. I thought it was already determined that there will always be better payment networks that function as alternatives to Bitcoin. A user who cares about the store-of-value use case is going to want the network hash rate to be as high as possible. This is at odds with low transaction fees and fast confirmations.
This!

Bitcoin is about citizen empowerment. When ordinary citizens can't run their own validating nodes anymore, you lose that feature (independent of the question of hashing). Then Bitcoin is commercialized. The Bitcoin devs need to keep that in mind. (If you need to freshen up on your brainwash, here's a great presentation from Rick Falkvinge: http://www.youtube.com/watch?v=mjmuPqkVwWc)
member
Activity: 113
Merit: 11
I actually posted the below in the max_block_size fork thread but got absolutely no feedback on it, so rather than create a new thread to get exposure for it, I am reposting it here in full as something to think about with regard to moving towards a fairly simple process for creating a floating blocksize for the network, one that is conservative enough to avoid abuse and works in tandem with difficulty so no new mechanisms need to be made. I know there are probably a number of holes in the idea, but I think it's a start and could be made viable, so that we get a system that allows blocks to get bigger, but doesn't run out of control such that only large miners can participate, and also avoids situations where manipulations of difficulty could occur if there were no max blocksize limit. Ok, here goes.

I've been stewing over this problem for a while and would just like to think aloud here....

I very much think the blocksize should be network regulated much like difficulty is used to regulate propagation windows based on the amount of computation cycles used to find hashes for particular difficulty targets. To clarify, when I say CPU I mean CPUs, GPUs, and ASICs collectively.

Difficulty is very much focused on the network's collective CPU cycles to control propagation windows (1 block every 10 mins), avoid 51% attacks, and distribute new coins.

However, the max_blocksize is not related to the computing resources used to validate transactions and propagate regular blocks; it is geared much more to network speed, the storage capacity of miners (and even of non-mining full nodes), and verification of transactions (which as I understand it means hammering the disk). What we need to determine is whether the nodes supporting the network can quickly and easily propagate blocks while not having this affect the propagation window.

Interestingly there is a connection between CPU resources, the calculation of the propagation window with difficulty targets, and network propagation health. If we have no max_blocksize limit in place, it leaves the network open to a special type of manipulation of the difficulty.

The propagation window can be manipulated in two ways as I see it. One is creating more blocks, as we classically know it: throw more CPUs at block creation and we transmit more blocks; more computation power = more blocks produced, and the difficulty ensures the propagation window doesn't get manipulated this way. The difficulty is measured by timestamps in the blocks to determine whether more or fewer blocks were created in a certain period and whether difficulty goes up or down. All taken care of.

The propagation window could also be manipulated in a more subtle way though, that being transmission of large blocks (huge blocks in fact). Large blocks take longer to transmit, longer to verify, and longer to write to disk, though this manipulation of the number of blocks being produced is unlikely to be noticed until a monster block gets pushed across the network (in a situation where there is no limit on blocksize, that is). Now, because there is only a 10 minute window, the block can't take longer than that to propagate, I'm guessing. If it does, difficulty will sink and we have a whole new problem, that being manipulation of the difficulty through massive blocks. Massive blocks could mess with difficulty and push out smaller miners, causing all sorts of undesirable centralisations. In short, it would probably destroy the Bitcoin network.

So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, and isn't so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the help of the propagation window being maintained through its difficulty adjustment, we may be able to determine whether the propagation of blocks is slowing and whether the max_blocksize should be adjusted down to ensure the propagation window remains stable.

Because the difficulty can be potentially manipulated this way we could possibly have a means of knowing what the Bitcoin network is comfortable with propagating. And it could be determined thusly:

If the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize (median being chosen to avoid situations where one malicious entity, or several, tries to arbitrarily push up the max_blocksize limit), and the difficulty is "stable", increase the max_blocksize (say by 10%) for the next difficulty period (say, when the median is within 20% of the max_blocksize). But if the median size of blocks for the last period is much lower (say less than half the current blocksize_limit), then lower the size by 20% instead.

However, if the median size of the blocks transmitted in the last difficulty period is bumping up against the max_blocksize and the difficulty is NOT stable, don't increase the max_blocksize, since there is a possibility that the network is not currently healthy and increasing or decreasing the max_blocksize is a bad idea. Or, alternatively, in those situations lower the max_blocksize by 10% for the next difficulty period anyway (not sure if this is a good idea or not though).

In either case, the 1MB max_blocksize should be the lowest the blocksize ever goes if it continued to shrink. Condensing all that down to pseudocode...

Code:
IF (median blocksize of last difficulty period is within 10% of current max_block_size
    AND new difficulty is **higher** than previous period's difficulty)
        THEN raise max_block_size for next difficulty period by 10%
ELSE IF (median blocksize of last difficulty period is within 10% of current max_block_size
    AND new difficulty is **lower** than previous period's difficulty)
        THEN lower max_block_size for next difficulty period by 10%, UNLESS it would fall below the 1MB minimum
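A self-contained Python sketch of that pseudocode, just to make the moving parts explicit (the function and variable names are mine, and I've read "within 10%" as "median at least 90% of the current limit"):

Code:
from statistics import median

MIN_BLOCK_SIZE = 1_000_000  # the proposed 1MB floor

def adjust_max_block_size(max_block_size, period_block_sizes,
                          old_difficulty, new_difficulty):
    """Run once per difficulty period, alongside the normal retarget."""
    med = median(period_block_sizes)
    near_limit = med >= 0.9 * max_block_size  # median "within 10%" of the limit
    if near_limit and new_difficulty > old_difficulty:
        return max_block_size * 1.10
    if near_limit and new_difficulty < old_difficulty:
        return max(max_block_size * 0.90, MIN_BLOCK_SIZE)
    return max_block_size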


Checking the stability of the last difficulty period against the next one is what determines whether the network is spitting out blocks at a regular rate or not. If the median size of blocks transmitted in the last difficulty period is bumping up against the limit, and difficulty is going down, it could mean a significant number of nodes can't keep up; especially if the difficulty needs to be moved down, that means blocks aren't getting to all the nodes in time and hashing capacity is getting cut off because nodes are too busy verifying the blocks they received. If the difficulty is going up and the median block size is bumping up against the limit, then there's a strong indication that nodes are all processing the blocks they receive easily, and so raising the max_blocksize limit a little should be OK. The one thing I'm not sure of, though, is determining whether the difficulty is "stable" or not; I'm very much open to suggestions on the best way of doing that. I think the argument that what is deemed "stable" is arbitrary, and could still lead to manipulation of the max_blocksize just over a longer and more sustained period, is possible too, so I'm not entirely sure this approach can be made foolproof. How does the calculation of difficulty targets take these things into consideration?

OK, guys, tear it apart.
newbie
Activity: 58
Merit: 0
Interesting debate.

First of all, my opinion: I'm in favor of increasing the block size limit in a hard fork, but very much against removing the limit entirely. Bitcoin is a consensus of its users, who all agreed (or will need to agree) to a very strict set of rules that would allow people to build a global decentralized payment system. I think very few people understand a forever-limited block size to be part of these rules.

...

My suggestion would be a one-time increase to perhaps 10 MiB or 100 MiB blocks (to be debated), and after that an at-most slow exponential further growth. This would mean no for-eternity limited size, but also no way for miners to push up block sizes to the point where they are in sole control of the network. I realize that some people will consider this an arbitrary and unnecessary limit, but others will probably consider it dangerous already. In any case, it's a compromise and I believe one will be necessary.

Realize that Bitcoin's decentralization only comes from very strict - and sometimes arbitrary - rules (why this particular 50/25/12.5 payout scheme, why ECDSA, why only those opcodes in scripts, ...) that were set right from the start and agreed upon by everyone who ever used the system. Were those rules "central planning" too?

I tend to agree with Pieter.

First of all, the true nature of Bitcoin seems to be its rigid protocol, as that helps its credibility among the masses. Otherwise one day you remove the block size limit, the next day you remove ECDSA, then change block frequency to 1 per minute, then print more coins. It actually sounds more appropriate to do such changes under a different implementation.

Then I can't help this: with such a floating block limit, isn't everyone afraid of chain splits? I can imagine a split occurring from a big block being accepted by 60% of the nodes and rejected by the rest.

How about tying the maximum block size to mining difficulty?
...
The difficulty also goes up with increasing hardware capabilities; I'd expect that the difficulty increase due to this factor will track the increase in the technical capabilities of computers in general.

This sounds interesting.
member
Activity: 97
Merit: 10
How about tying the maximum block size to mining difficulty?

This way, if the fees start to drop, this is counteracted by the shrinking block size. The only time this counteracting won't be effective is when usage is actually dwindling at the same time.
If the fees start to increase, this is likewise counteracted by increasing the block size as more mining power comes online.

The difficulty also goes up with increasing hardware capabilities; I'd expect that the difficulty increase due to this factor will track the increase in the technical capabilities of computers in general.
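The post doesn't spell out the exact mapping, but one naive reading is a limit that scales linearly with difficulty relative to some reference point. Everything in this Python sketch (the linear relation, the reference difficulty, the 1MB floor) is my own illustrative assumption, not part of the proposal:

Code:
BASE_MAX_BLOCK_SIZE = 1_000_000   # 1 MB at the reference difficulty (assumed floor)
REFERENCE_DIFFICULTY = 3_000_000  # difficulty at which the base size applies (made up)

def max_block_size_for(difficulty):
    """Illustration only: the block size limit scales with difficulty."""
    scale = difficulty / REFERENCE_DIFFICULTY
    return max(int(BASE_MAX_BLOCK_SIZE * scale), BASE_MAX_BLOCK_SIZE)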
legendary
Activity: 1400
Merit: 1013
However, with no limit on block size, it effectively becomes miners who are in control of _everyone_'s block size. As a non-miner, this is not something I want them to decide for me. Perhaps the tragedy of the commons can be avoided, and long-term rational thinking will kick in, and miners can be trusted with choosing an appropriate block size. But maybe not, and if just one miner starts creating gigabyte blocks, while all the rest agrees on 10 MiB blocks, ugly block-shunning rules will be necessary to avoid such blocks from filling everyone's hard drive (yes, larger blocks' slower relay will make them unlikely to be accepted, but it just requires one lucky fool to succeed...).
In a different thread Gavin proposed removing the hard limit on block size and adding code to the nodes that would reject any blocks that take too long to verify.

That would give control over the size of the blocks to the people who run full nodes.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
Enjoyed reading many great posts here!

If there will be a hard fork from time to time, then I prefer to keep the changes as gradual/small as possible.

If there will be only ONE hard fork ever, then it is a high-risk gamble.
hero member
Activity: 991
Merit: 1011
My suggestion would be a one-time increase to perhaps 10 MiB or 100 MiB blocks (to be debated), and after that an at-most slow exponential further growth.

I totally agree with this. Let the network run for a little longer, make a conservative one-time increase in block size, then use the time to analyse the final phase of the 1MB limit and the effect of the increased limit, and plan another hard fork, possibly with an - again conservatively - adjusting limit.
I am all for radical action if it's necessary. But right now transaction fees are still very low and there is really no need to be radical. It's just an unnecessary risk.
