
Topic: Funding of network security with infinite block sizes - page 3. (Read 24569 times)

legendary
Activity: 1470
Merit: 1006
Bringing Legendary Har® to you since 1952
So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

I support this.

Let's try the simple solution and if it doesn't work, we can try something else.
Bitcoin is all about the free market, right? So let's allow the market to decide for us.

A configuration setting can be added to both client and miner applications, so users will be able to choose.
legendary
Activity: 1526
Merit: 1134
One option would be to change the "full" block message so that it only includes the hashes of the included transactions.

Transactions would then have to be sent separately.

Bloom filtering support already provides most of what's needed here. Matt checked in an optimization for full-match filters a short while ago; it makes sense to continue in that vein.
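
For illustration, a minimal sketch of what a hashes-only block message could look like (CThinBlock and its field names are hypothetical, not the actual p2p message definitions; CBlockHeader and uint256 are the reference client's types):

Code:
#include <vector>

// Hypothetical "thin block" relay message: the 80-byte header plus one
// 32-byte txid per transaction, in block order. A peer reconstructs the
// block from its mempool and requests only the transactions it lacks.
struct CThinBlock {
    CBlockHeader header;             // standard block header
    std::vector<uint256> vTxHashes;  // txids of the included transactions
};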
legendary
Activity: 1232
Merit: 1094
That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

That effectively sets the block size at 32MB, since that is the max message size.  Would your intention be to allow splitting blocks across multiple messages?

One option would be to change the "full" block message so that it only includes the hashes of the included transactions.

Transactions would then have to be sent separately. 

This allows the 32MB to be used entirely for transaction hashes, but that still limits a block to 1 million transactions, which (famous last words) should be enough.
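
Spelling out that arithmetic: at 32 bytes per transaction hash, a 32MB message holds 32,000,000 / 32 = 1,000,000 hashes, hence the one-million-transaction ceiling.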

Quote
What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Ideally, there would be miner rules and user rules.  For example, normal users might accept blocks of any size.  However, when working out which block to build on, miners might reject certain blocks.  Once those blocks are buried deeply enough, the miner might then accept them.  This would help to heal forks.

This means that only one hard fork is required to remove the limit entirely.  However, the majority of the hashing power then gets to pick the max block size.
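
A rough sketch of that miner-side rule (all names and thresholds here are hypothetical, not code from any client):

Code:
#include <cstddef>

// A miner refuses to build on an "obviously too big" block, unless that
// block is already buried under enough confirmations that continuing to
// reject it would only prolong a fork.
bool ShouldBuildOn(int nBlockHeight, std::size_t nBlockSize, int nChainHeight,
                   std::size_t nSoftLimit, int nBurialDepth)
{
    if (nBlockSize <= nSoftLimit)
        return true;                                       // within this miner's tolerance
    return (nChainHeight - nBlockHeight) >= nBurialDepth;  // deeply buried: accept, heal the fork
}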
legendary
Activity: 1400
Merit: 1013
How are any of those limitations relevant to the current situation?
I don't even know where to begin. Maybe you could find an entry-level textbook on economics and start with the definition of scarcity, then read up on price.

You've already identified the solution, except that you erroneously identified it as the problem.

people don't want one piece of software to occupy whole hard disks
Users have neither the ability nor the desire to support a blockchain that grows without bound. That's why no hard-coded limit on the block size is necessary.
hero member
Activity: 501
Merit: 500
I like the floating (miner-voted) block size idea. This is because I think:

1) Transactions require real resources (bandwidth, CPU time, storage), hence transactions must be expensive. They require these things from every full node, not just miners, so there's necessarily a kind of tragedy-of-the-commons situation in play. I cannot think of a solution that wouldn't discourage non-miners from running a full node. After trust-free lite nodes are implemented, I can't think of a reason why a non-miner would run a full node.
2) Bitcoin ceases to be useful if transactions are TOO expensive. I think the upper limit for the minimum mandatory fee (currently zero) is around $5. Above that, people just migrate to cheaper ways to transfer value. This is why keeping the 1MB limit won't work.

Of course, I have never considered Bitcoin to be particularly useful for e-commerce.
hero member
Activity: 991
Merit: 1011
How are any of those limitations relevant to the current situation?

legendary
Activity: 1400
Merit: 1013
there is very little upside to having no scarcity at all
That word does not mean what you think it means.

Blocks do not need a protocol limit on their maximum size in order for the space in them to be scarce.

Until computers start shipping with an infinite amount of RAM, CPUs that can process an infinite amount of data in zero time, and internet connections that can transfer an unlimited amount of data in zero time, space in the blockchain is scarce.

If you think an artificial limit on the size of a block is necessary to make the space scarce, you either don't understand physics, or economics, or both.
hero member
Activity: 991
Merit: 1011
Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

Maybe next time come up with an argument that doesn't entirely rely on speculation about me and every other bitcoin user, and I might consider thinking about it  Wink

@realpra
I can't find anything in your post that relates to mine. I think you read a lot more into my post than I actually said.

@nubarius
there is very little upside to having no scarcity at all and a very concrete worst-case scenario: blockchain growth might be too quick for the current state of the technology. People don't want one piece of software to occupy whole hard disks, and lightweight clients are still at a rather early stage of development. Just imagine some sort of major microtransaction or gambling service started using Bitcoin and it grew by a factor of 100. It's legitimate use, all right; still, Bitcoin can't possibly handle it at this point. The technology needs to be developed further before it can handle any amount of transactions. I wish the free-market enthusiasts would at least see that right now there are limits, regardless of what we want or decide.

I think it's very unfortunate that both the "full node for everyone!" advocates and the free-market advocates basically act like fanatics, with very little concern for real issues (including finding a solution a majority can agree on) and a strong tendency toward black-or-white thinking.
sr. member
Activity: 310
Merit: 253
I fully support Gavin Andresen on this.

It is the best solution possible: no risk of overengineering, no kicking the can down the road, and no half-baked attempts to solve a would-be problem that hasn't been proved to exist.

I think it is a mistake to apply the scarcity-as-economic-incentive logic to physical resources like storage or bandwidth. How scarce those resources are can only be determined by technology and the free market, not by a whimsical hard-coded little number. It is utterly wrong to draw any analogies with the 21-million-bitcoin limit, which simply affects the unit of account and doesn't relate to any physical resource. Bitcoins are neither scarce nor abundant; there's simply a predictable number of them. Extending the logic of the unit of account to actual physical resources is simply muddled economic thinking, in my opinion.

[...]
Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).

Your list is a very good summary, but I think you're exaggerating the cons of option 4. Besides what justusranvier and RealPra have said above, the need for a plan B applies to the other options too. Also, it is the other options that create a constant need for consent, since the block size limit will be a recurring issue as long as there is an arbitrary cap that we're about to hit at any moment. With option 4 the debate will fizzle out and will only reappear if and when we come across an actual problem (and then the discussion will be based on facts, not on speculation about human behaviour).

This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

An unlimited number of times this^.
sr. member
Activity: 310
Merit: 250
Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

Bitcoin with a fixed protocol limit of 1 MB blocks is like building a long distance telephone network that had a global limit of 10 calls in progress at once, and refusing to ever increase this limit no matter how much demand there was to make long distance phone calls, or how much the technology improved.

I am 100% sure that a bitcoin would not be worth $144.00 USD right now if the majority of people purchasing at that price (or even half that price) knew or believed that in the very near future it would cost more than a few cents to use Bitcoin, or that the network would permit a maximum of 7-20 transactions per second forever. That is not what people using, adopting, investing, or proselytizing are selling in their elevator speeches, and if they were aware of it they would be disgusted.

Any alt chain that chooses to scale will quickly replace Bitcoin in terms of both usage and fiat exchange rate, because, as should be quite obvious to anyone without ulterior motives, the two are correlated. Without usage that presents a superior alternative to existing systems, which in this case means the ability to instantly and cheaply send money anywhere without third-party risk, Bitcoin is irrelevant, worthless, and soon to be obsolete.

hero member
Activity: 815
Merit: 1000
Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.

the short list:

....

4) remove limit entirely:

pro

- takes no time at all
- might allow for a natural balance without artificial rules
- allows for a plan B

con

- needs a plan B  Wink
- relies on people to do the right thing in a situation where
 - a) it's very hard to say what the right thing actually is
 - b) money is involved. People are evil, stupid, and completely unable to be objective when money is involved.
- bonus con: with a constant need for consent, it actually allows for even more discussions...


Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).
4 is the correct option because your con list is flawed:

1. In the case of a very large block artificially created to destroy Bitcoin, most of the network would simply end up rejecting it automatically, because by the time they had downloaded it there would be a longer chain/fork without that block.

2. In the case of increasing block sizes squeezing out smaller nodes, the solution is to create "swarm clients" that do collaborative validation.
This would allow the network to scale to any size for "infinite" time when combined with the "ledger solution".
(This is not up to choice or opinion: Bitcoin presently does not scale, and if you don't fix this, some other cryptocurrency will. Forum-searching both quoted concepts should be easy enough.)


The vision of Bitcoin believers is to take over the world; you won't do that with a 1MB block limit. It's pathetic.
legendary
Activity: 1400
Merit: 1013
Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).
This forum is not representative of bitcoin users as a whole. The majority of bitcoin users don't realize there is a fixed limit on the transaction rate and would be appalled if they knew. Move outside the circle of people who have some kind of vested interest in limiting the capabilities of the blockchain and the response would be, "Of course there shouldn't be an arbitrary limit on the transaction rate."

Bitcoin with a fixed protocol limit of 1 MB blocks is like building a long distance telephone network that had a global limit of 10 calls in progress at once, and refusing to ever increase this limit no matter how much demand there was to make long distance phone calls, or how much the technology improved.
hero member
Activity: 991
Merit: 1011
Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.

the short list:

1) no increase:

pro

- takes no time at all
- predictable consequences
- allows the largest number of people to run a full node

con

- pretty much limits Bitcoin to a store of value (possibly as the base for an off-chain transaction system). If it fails as a store of value it will be dead or a total niche product very soon.

-----------------------------------------------

2) fixed increase:

pro

- very simple implementation
- easy to predict consequences
- no major worst case scenarios

con

- short term solution
- discrete jump in block size with no regard to the current transaction volume

-----------------------------------------------

3) floating block size:

pro

- most likely the best option to successfully balance low transaction fees and small block size
- no more pros, but that's a really big one

con

- most complex implementation (though it might still not be very complex)
- hard to predict exact behavior
- has to undergo more scrutiny and fine tuning to avoid an exploitable algorithm

-----------------------------------------------

4) remove limit entirely:

pro

- takes no time at all
- might allow for a natural balance without artificial rules
- allows for a plan B

con

- needs a plan B  Wink
- relies on people to do the right thing in a situation where
 - a) it's very hard to say what the right thing actually is
 - b) money is involved. People are evil, stupid, and completely unable to be objective when money is involved.
- bonus con: with a constant need for consent, it actually allows for even more discussions...


Honestly, I have no hope for 4). It's the least likely option to get a majority behind it. Plus it's too optimistic. My money is on 2) with an eventual 3).
legendary
Activity: 1652
Merit: 2301
Chief Scientist
Your "plan" reminds me of an old joke...

Okey dokey.

If you want to be helpful, please write up a list of pros and cons for the various plans that have been proposed, including your own (last time I asked you, you waffled and didn't have any plan).

I've been pretty busy dealing with the avalanche of press and working on the payment protocol.
legendary
Activity: 1120
Merit: 1152
What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Your "plan" reminds me of an old joke about the difference between programmers and engineers:

Quote
A mechanical engineer, an electrical engineer and a computer programmer were on their way to a meeting in Switzerland. They had just passed the crest of a high mountain pass when suddenly the brakes on their car failed. The car careened almost out of control down the road, bouncing off the crash barriers, until it miraculously ground to a halt scraping along the guardrail.

The car's occupants, shaken but unhurt, now had a problem: they were still stuck at the top of a mountain in a car with no brakes. What were they to do?

"I know," said the mechanical manager, "Let's check under the car for leaks - chances are a hydraulic line burst."

"No, no," said the electrical engineer, "This is a new car with sensors everywhere  - let me hook up my multimeter to the hydraulic line pressure sensor output and see if it reads low first."

"Well," said the programmer, "Before we do anything, I think we should push the car back up the road and see if it happens again."
legendary
Activity: 1050
Merit: 1002
So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.

Well, thanks for weighing in. Under the circumstances you're the closest to a leader this leaderless grand experiment has, so it means a lot. We simply must get some resolution to this so we can move forward.

As I alluded to earlier, no action could ever satisfy everyone or be provably correct.

I'm a bit torn on this avenue, because on one hand I believe strongly in the ability of the market to work effectively, so your rationale that it will self-regulate block size makes sense to me. On the other hand, I consider this the riskiest route in terms of what can go wrong and the consequences if it does.

However, this is one of the reasons I began pushing alt-coins. The more options a free market has, the better it can work. In the end, because of that, I can live with any decision for Satoshi's version of Bitcoin that resolves block size without an economically and confidence-damaging fork.

So I say let's start pushing forward with such plans. It will be better for all interests if we do.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
So the longer I think about the block size issue, the more I'm reminded of this Hayek quote:

Quote
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.

F.A. Hayek, The Fatal Conceit

We can speculate all we want about what is going to happen in the future, but we don't really know.

So, what should we do if we don't know? My default answer is "do the simplest thing that could possibly work, but make sure there is a Plan B just in case it doesn't work."

In the case of the block size debate, what is the simplest thing that just might possibly work?

That's easy!  Eliminate the block size limit as a network rule entirely, and trust that miners and merchants and users will reject blocks that are "obviously too big." Where what is "obviously too big" will change over time as technology changes.

What is Plan B if just trusting miners/merchants/users to do the right thing doesn't work?

Big-picture it is easy:  Schedule a soft-fork that imposes some network-rule-upper-limit, with whatever formula seems right to correct whatever problem crops up.
Small-picture: hard to see what the "right" formula would be, but I think it will be much easier to define after we run into some actual practical problem rather than guessing where problems might crop up.
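
Purely as an illustration of the shape of such a Plan B (PLAN_B_FLAG_HEIGHT and UpperLimit are hypothetical placeholders; the formula itself is exactly the open question):

Code:
#include <cstddef>

extern const int PLAN_B_FLAG_HEIGHT;  // hypothetical scheduled activation height
std::size_t UpperLimit(int nHeight);  // hypothetical formula, to be chosen later

// After the flag height, nodes reject blocks above the formula-driven
// ceiling; before it, the old rule applies unchanged. Tightening the
// rules this way is a soft fork.
bool CheckBlockSizeRule(int nHeight, std::size_t nBlockSize)
{
    if (nHeight < PLAN_B_FLAG_HEIGHT)
        return true;                           // pre-fork: no network-rule limit
    return nBlockSize <= UpperLimit(nHeight);  // whatever formula "seems right"
}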


legendary
Activity: 1050
Merit: 1002
Okay, here is a proposal which IMO is "safe" enough to win support, but first a quick question. Unlimited block sizes appear dangerous not only for centralization, but also for DoS attacks and chain bloat from transactions with no or trivial fees, which size limits and higher fees help prevent. Has there been a good answer for the latter? (Apologies for having missed it if so.)

This new proposal builds on Gavin's model, but IMO makes it "safer":

Increases of 4MB every 26,400 blocks (every 6 months) if block.timestamp is after 1-Jan-2014 and 95% of the last 4,400 blocks (1 month) are new-version.
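
A sketch of that schedule as code (a hypothetical helper, not part of any client; 26,400 blocks is roughly six months at 144 blocks per day, and 4,400 blocks is roughly one month):

Code:
#include <cstddef>

static const std::size_t BASE_SIZE   = 1000000;  // current 1MB limit
static const std::size_t STEP_SIZE   = 4000000;  // +4MB per step
static const int         STEP_BLOCKS = 26400;    // ~6 months of blocks per step

// nStepsLockedIn = number of 26,400-block periods in which both the
// timestamp condition (after 1-Jan-2014) and the 95%-of-the-last-4,400-
// blocks supermajority condition were met. Worst case: 2 steps = 8MB/year.
std::size_t MaxBlockSizeAt(int nStepsLockedIn)
{
    return BASE_SIZE + STEP_SIZE * static_cast<std::size_t>(nStepsLockedIn);
}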

Pros:

- no action until a future date gives the market time to respond gracefully
- no change as the default means no fork risk
- allows a possible increase of 16MB in 2 years and has no final limit
- puts increases in the hands of miners; even the small ones have a say
- should not be prone to gaming due to the large sample size, and the worst case is an 8MB increase per year

Cons

- advocates of infinite size will likely find it conservative
- ?

Please note that in my view Bitcoin could survive with a 1MB limit and not be "crippled". One thing to remember is that Bitcoin would not exist if government were completely out of money. Bitcoin only has value because it allows people to control their stored wealth and lets the free market work. Regulation is the reason payments come with fees and inefficiency. A truly free market would have long since given users better and cheaper options. Indeed, payments existing outside the "system" was PayPal's original idea, believe it or not.

So what Bitcoin allows is for services to be built on top of it, e.g. ones which rely less on the blockchain, to extend its user empowerment. What I'm saying is that Bitcoin is not only the blockchain; it's the idea of giving free-market efficiency the chance to work, so off-chain Bitcoin transactions can be considered Bitcoin too, as Bitcoin makes them efficiently possible.
full member
Activity: 154
Merit: 100
As an example, consider your (and Mike's) proposal to propagate headers and/or transaction hash information, rather than full transactions. Can you think of a way to attack such a system? Can you think of a way that such a system could make network forking events more likely? I'll give you a hint for the latter: reorganizations.

Are you talking about the case where I mine a block containing one of my own transactions which I haven't announced, so you have to make another request for the missing transactions before you can verify it? If I refuse to provide the missing transactions, then no one will accept my block and I lose. If I delay announcing the missing transactions, competing blocks may be found in the meantime, so this is no different from delaying the announcement of the block itself, which has no benefit for me. What am I missing?
legendary
Activity: 1120
Merit: 1152
I think this shows great consideration and judgement because I note and emphasize the following:

Quote
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different ...  since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.

What I think is of greatest value in Gavin's quote is that it's inclusive of data from the field. It's not him unilaterally saying the new size will be X, deal with it. Instead he essentially says the new size can be X if Y and Z are also true. It appears he has a regard for the ability of the market to decide. Indeed no change remains an option and is actually the default.

Think through this voting proposal carefully: can you think of a way to game the vote?

I completely disagree. Think how easily this issue could have been solved if in 2009 Satoshi had implemented a rule such as the one Jeff suggests here:

Quote
My off-the-cuff guess (may be wrong) for a solution was:  if (todays_date > SOME_FUTURE_DATE) { MAX_BLOCK_SIZE *= 2, every 1 years }  [Other devs comment: too fast!]  That might be too fast, but the point is, not feedback based nor directly miner controlled.

I think the above could be a great solution (though I tend to agree it might be too fast). However, implementing it now will meet resistance from those who feel it misses their views. If Satoshi had implemented it then, it wouldn't be an issue now; we would simply be dealing with it and the market would be working around it. Now, however, there is a lot of money tied up in protocol changes and many more views about what should or shouldn't be done. That will only increase, meaning the economic/financial damage possible from ungraceful changes increases as well.

There was some heated IRC discussion on this point - gmaxwell made the excellent comment that any kind of exponentially increasing blocksize function runs the enormous risk that the constant is either too large, and the blocksize blows up, or too small, and you need the off-chain transactions it was supposed to avoid anyway. There is very little room for error, yet changing it later if you get it wrong is impossible due to centralization.
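
To make the exponential schedule concrete, here is a sketch of the rule Jeff describes (SOME_FUTURE_DATE is given an arbitrary placeholder value purely so the sketch compiles; his quote leaves it unspecified):

Code:
#include <cstddef>
#include <cstdint>

static const int64_t SOME_FUTURE_DATE = 1388534400;       // placeholder value only
static const int64_t ONE_YEAR         = 365LL * 24 * 60 * 60;

// The limit doubles every year after the flag date - no feedback, no
// miner control. The risk gmaxwell points out: the doubling constant
// must be guessed correctly up front.
std::size_t MaxBlockSizeDoubling(int64_t nBlockTime)
{
    std::size_t nMax = 1000000;                            // 1MB starting point
    for (int64_t t = SOME_FUTURE_DATE; t + ONE_YEAR <= nBlockTime; t += ONE_YEAR)
        nMax *= 2;                                         // MAX_BLOCK_SIZE *= 2 each year
    return nMax;
}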

I also note that early in Jeff's post he says he reversed his earlier stance, my point here being that people are not infallible. I actually agree with his updated views, but what if they too are wrong? Who is to say? The same could apply to Gavin. That's why I think it's wise that he appears to include a response from the market in any change, with no change as the default.

Same here. I read Satoshi's predictions about achieving scalability through large blocks about two years ago, and simply accepted it as inevitable and OK. It was only much, much later that I thought about the issue carefully and realized how dangerous the implications would be to Bitcoin's decentralization.


I am already concerned about the centralization seen in mining. Only a handful of pools are mining most of the blocks, so decentralization is already being lost there. Work is needed in two areas before the argument for off-chain solutions becomes strong: first, blockchain pruning; second, initial propagation of headers (presumably with the associated UTXO data) so that hashing can begin immediately while the last block propagates and is verified in parallel. These would help greatly to preserve decentralization.

(snip)

You mention your background in passing, so I will just mention mine. I spent many years at the heart of one of the largest TBTF banks working on its equities proprietary trading system. For a while 1% of the shares traded globally (by value) was our execution flow. On average every three months we encountered limitations of one sort or another (software, hardware, network, satellite systems), yet every one of them was solved by scaling, rewriting or upgrading. We could not stand still as the never-ending arms race for market-share meant that to accept a limitation was to throw in the towel.

It's good that you have had experience with such environments, but remember that centrally run systems, even if the architecture itself is distributed, are far, far easier to manage than truly decentralized systems. When your decentralized system must be resistant to attack, the problem is even worse.

As an example, consider your (and Mike's) proposal to propagate headers and/or transaction hash information, rather than full transactions. Can you think of a way to attack such a system? Can you think of a way that such a system could make network forking events more likely? I'll give you a hint for the latter: reorganizations.


It's my impression that the 2009 Satoshi implementation didn't have a block size limit - that it was a later addition to the reference client as a temporary anti-spam measure, which was left in until it became the norm.

Is this impression incorrect?

Bitcoin began with a 32MiB block size limit, which Satoshi later reduced to 1MB.