
Topic: Block size limit automatic adjustment (Read 14572 times)

donator
Activity: 2058
Merit: 1054
December 05, 2011, 02:21:26 PM
#97
This still gives big miners too much power to demand draconian tx fees.
Please keep in mind that there's nothing we can do to prevent this kind of "cartel boycott" from happening right now.
If we don't need to rely on a cartel to maintain the difficulty equilibrium, we can try to come up with a solution to the problem of a cartel forming.

While individual people might have a slight interest in protecting Bitcoin from attacks, that interest might not be very strong. In an overall group, that means that the protection is pretty weak. For example, think of your average 1-rig miner out there right now. If Bitcoin collapsed today, how much of an impact would that really have on them? How much would they really lose?
Yes, this is the problem. No, I don't agree that a cartel is a desirable solution.
legendary
Activity: 1204
Merit: 1015
December 05, 2011, 01:33:07 PM
#96
This still gives big miners too much power to demand draconian tx fees.

Please keep in mind that there's nothing we can do to prevent this kind of "cartel boycott" from happening right now.
This is an important point to keep in mind. You also must consider that a cartel might even be desirable. While individual people might have a slight interest in protecting Bitcoin from attacks, that interest might not be very strong. In an overall group, that means that the protection is pretty weak. For example, think of your average 1-rig miner out there right now. If Bitcoin collapsed today, how much of an impact would that really have on them? How much would they really lose? A strong cartel, on the other hand, would have a significant vested interest in keeping Bitcoin safe. That is because each individual member has invested a lot of money into equipment that wouldn't be worth much for any other application.

As for the worries about large blocks, you have to remember that large blocks already carry a significant cost, at least in terms of risk, for the miner who mined them. The larger the block...

1) the longer it takes to initially transmit to all of the miner's peers.
2) the longer it takes peers to individually verify the block.
3) the longer it takes those peers to then broadcast the block.

The longer that process takes to propagate the block throughout the network, the more likely it is for the block to be ultimately rejected. In fact, once a miner accepts such a large block, they might be willing to switch to building off of a block with fewer transactions if one comes in before they are confident that the large block has fully propagated. That is especially true if the small block collects far fewer fees than the large one.
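The propagation risk described above can be put in rough numbers. Block arrivals are approximately Poisson with a mean interval of 600 seconds, so a block that takes t seconds to propagate faces roughly a 1 − e^(−t/600) chance that a competing block appears in the meantime. The sketch below illustrates this; the propagation times are made-up round numbers, not measurements.

```python
import math

MEAN_BLOCK_INTERVAL = 600.0  # seconds; Bitcoin's ~10-minute target

def stale_risk(propagation_seconds: float) -> float:
    """Probability that a competing block is found while ours propagates,
    assuming Poisson block arrivals at the network's average rate."""
    return 1.0 - math.exp(-propagation_seconds / MEAN_BLOCK_INTERVAL)

# Illustrative: a small block reaching peers quickly vs. a large, slow one.
for t in (2, 30, 120):
    print(f"{t:>4}s propagation -> ~{stale_risk(t):.1%} risk of being orphaned")
```

So a block that takes two minutes to propagate runs an order of magnitude more orphan risk than one that propagates in a couple of seconds, which is exactly the cost to the miner being described.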
legendary
Activity: 1106
Merit: 1004
December 05, 2011, 03:32:49 AM
#95
Sorry, I'm just not seeing it. Relying on these insurers to counter any potential attack seems only one step removed from dropping the whole proof of work thing and just letting a few trusted servers synchronize transactions.

Oh, but mining will be a professional thing regardless. Actually, it already is: those who mine solo or operate pools are quite few.
I wouldn't compare that to a centralized solution though. Anyone with the skills and resources can enter the market, and most importantly, pool operators don't own the resources that they use to mine. Even if a pool operator is shut down by force, the actual miners, those who own the cards, can just switch to another pool or start mining solo. Somebody may also try to create a new pool to gather all those pool-less miners. Anyway, it's not comparable to a centralized solution where you kill everything by killing the servers.

I'd much rather see a carefully planned incentive structure / branch selection criterion (which IMO should involve some combination of proof-of-stake, cementing, Bitcoin days destroyed and proof-of-work) which naturally leads to an efficient decentralized market.

Don't you think that's the political way of solving problems? We're more likely to create new problems than to solve the existing one.

This still gives big miners too much power to demand draconian tx fees.

Please keep in mind that there's nothing we can do to prevent this kind of "cartel boycott" from happening right now. I'd argue that not even CPU-friendly proof-of-work algorithms prevent that, since in the end there would be big pools anyway (and while they don't prevent cartels, they create new vulnerabilities).

And I don't think transaction fees will ever be draconian, for the reason you also note:

(Fees too high will reduce Bitcoin tx volume and thus the total fees collected. But I see no reason why the point with the max collected fees is the point best for Bitcoin in general. Efficiency is when you compete with someone other than yourself.)

The fact that it's not that difficult to transact outside of the blockchain already prevents miners from abusing their position. And they are competing with someone else: with other miners, with e-wallets which take transactions off the chain, with offline means of payment like Casascius coins and Bitbills, etc.
legendary
Activity: 1106
Merit: 1004
December 05, 2011, 03:04:14 AM
#94
All of this assumes that there is a proxy service giving miners the historic data.

I didn't understand this. I'm not making any such assumption.

Otherwise, the proposed situation is not stable: the miner consensus would continuously kick out miners with too little storage space, until the weaker half of miners in terms of storage are the stronger half in processing power.

I'm not sure I follow. You think the "miner consensus" would artificially create huge blocks with fake transactions in them, just to kick out those who do not have the resources to handle them? Because otherwise, if they are not faking anything and the blocks are big, it's the Bitcoin network itself that's kicking them out.
I don't think it is clever for pool operators to try to fake large blocks.

I really hope nobody includes some major mistake in dynamics when changing the protocol.

That's not a change in the protocol, other than eventually eliminating the 1MB limit, which will have to be done anyway.
Miners are the only ones who should be concerned with huge blocks, so why not let them work that out?
donator
Activity: 2058
Merit: 1054
December 04, 2011, 04:50:11 PM
#93
I will just give a quick sketch of this insurance idea for those who haven't heard about it yet. Basically, people interested in not being the target of double-spends, as well as in being able to spend at all (the "freezing the network" attack scenario), could buy insurance against that. Say, for example, some organization wants to freeze Bitcoin's network with a >50% attack. If that happens, insurers would have to pay a huge amount to their clients. They have a financial interest in renting enough processing power to overcome such an attack as quickly as possible.
Sorry, I'm just not seeing it. Relying on these insurers to counter any potential attack seems only one step removed from dropping the whole proof of work thing and just letting a few trusted servers synchronize transactions.

I'd much rather see a carefully planned incentive structure / branch selection criterion (which IMO should involve some combination of proof-of-stake, cementing, Bitcoin days destroyed and proof-of-work) which naturally leads to an efficient decentralized market.

(actually, I would expect most bitcoin users to collaborate... not only for ideological reasons, but simply to be able to spend their money again)
The effect each user's mining has on his own ability to spend bitcoins is negligible and not much of an incentive. That's pretty much what "tragedy of the commons" means.

If you want a more widely discussed scenario that can be compared to Bitcoin's "transaction fee tragedy of the commons", I'd suggest stateless defense. It's obviously not exactly the same thing, but I think it's the closest one in the economic literature. For example, half of the book Chaos Theory, by Bob Murphy, is about stateless defense.
Thanks, sounds interesting, I'll try to have a look.

If your solution relies on a cartel of miners boycotting competitors who undercut them, with nobody having a clear idea what they need to do to have their blocks accepted, I'd say you already lost.
And what do you mean by "nobody having a clear idea what they need to do to have their blocks accepted"? Nothing needs to be done in secret; in fact, pool operators had better announce everything they do quite clearly, since they are using other people's resources, after all.
This still gives big miners too much power to demand draconian tx fees. Which solves the difficulty equilibrium problem, but creates a new problem. (Fees too high will reduce Bitcoin tx volume and thus the total fees collected. But I see no reason why the point with the max collected fees is the point best for Bitcoin in general. Efficiency is when you compete with someone other than yourself.)

And, also, what did I lose?
Lost in your efforts to bring about a decentralized (as in, not run by a cartel) currency.
legendary
Activity: 1036
Merit: 1002
December 04, 2011, 03:42:56 PM
#92
It's in the interest of miners not to have huge blocks which they didn't mine occupying space on their hard drives. I also believe it's in their interest not to split the network. So they can create rules of the kind "I won't build on top of blocks larger than X unless they are already Y blocks deep". They can actually have multiple rules with different values for X and Y. If most miners roughly agree on these parameters, it would be really hard to keep a chain with a block larger than such limits.

Plus, if they want, they can actually use the same mechanism to try to boycott miners who accept transactions with "too low fees": "I won't mine on top of blocks containing transactions paying less than X/KB unless they are already Y blocks deep". That would be another "decentralized" way to deal with the incentives issue, besides the possible insurers.

All of this assumes that there is a proxy service giving miners the historic data. Otherwise, the proposed situation is not stable: the miner consensus would continuously kick out miners with too little storage space, until the weaker half of miners in terms of storage are the stronger half in processing power. Who knows when that would happen?

Always keep in mind that miners have no say once they're kicked out of the market for whatever reason! Neglecting that was the major reason people did not acknowledge the low difficulty equilibrium for so long.

I really hope nobody includes some major mistake in dynamics when changing the protocol. Make noise if any protocol change is upcoming, and let's make sure the dynamics are right. We might not be able to deflate the block chain again once it's enormous.
legendary
Activity: 1106
Merit: 1004
December 04, 2011, 03:20:00 PM
#91
And the transaction fee "tragedy of the commons" scenario would probably be better solved by market agents (like insurers)
Can you explain exactly how this will work? Back then I was not at all convinced by Mike's vision of the mechanics, and I still consider this an open problem.

Yes, I guess Mike Hearn was the first to come up with this insurance idea a while ago, but at the time maybe I didn't give it the attention it deserved, or I just didn't think it through well enough; I don't remember.

Obviously I can't "explain exactly how this will work", since I have no crystal ball. But I tend to trust spontaneous order more than centrally planned order, particularly when it comes to economic incentives. And that's what Stefan made me see in those talks: by arguing for an arbitrary formula to set a moving block size limit, I was in a sense arguing for central planning instead of spontaneous order.

I will just give a quick sketch of this insurance idea for those who haven't heard about it yet. Basically, people interested in not being the target of double-spends, as well as in being able to spend at all (the "freezing the network" attack scenario), could buy insurance against that. Say, for example, some organization wants to freeze Bitcoin's network with a >50% attack. If that happens, insurers would have to pay a huge amount to their clients. They have a financial interest in renting enough processing power to overcome such an attack as quickly as possible. (Actually, I would expect most bitcoin users to collaborate... not only for ideological reasons, but simply to be able to spend their money again.)
If you want a more widely discussed scenario that can be compared to Bitcoin's "transaction fee tragedy of the commons", I'd suggest stateless defense. It's obviously not exactly the same thing, but I think it's the closest one in the economic literature. For example, half of the book Chaos Theory, by Bob Murphy, is about stateless defense.

If your solution relies on a cartel of miners boycotting competitors who undercut them, with nobody having a clear idea what they need to do to have their blocks accepted, I'd say you already lost.

It's not "my solution". And what do you mean by "nobody having a clear idea what they need to do to have their blocks accepted"? Nothing needs to be done in secret; in fact, pool operators had better announce everything they do quite clearly, since they are using other people's resources, after all.
And, also, what did I lose?
donator
Activity: 2058
Merit: 1054
December 04, 2011, 02:29:34 PM
#90
And the transaction fee "tragedy of the commons" scenario would probably be better solved by market agents (like insurers)
Can you explain exactly how this will work? Back then I was not at all convinced by Mike's vision of the mechanics, and I still consider this an open problem.

Plus, if they want, they can actually use the same mechanism to try to boycott miners who accept transactions with "too low fees": "I won't mine on top of blocks containing transactions paying less than X/KB unless they are already Y blocks deep". That would be another "decentralized" way to deal with the incentives issue, besides the possible insurers.
If your solution relies on a cartel of miners boycotting competitors who undercut them, with nobody having a clear idea what they need to do to have their blocks accepted, I'd say you already lost.
legendary
Activity: 1106
Merit: 1004
December 04, 2011, 01:53:00 PM
#89
But I'm still a little uncertain about the block size limit because of storage. A single miner with access to a lot of storage might flood the block chain to get rid of competitors with less storage capacity, so some mechanism against arbitrarily large blocks might be good. "The spam problem is a problem for pool operators", I agree, so what do we do about it?

It's in the interest of miners not to have huge blocks which they didn't mine occupying space on their hard drives. I also believe it's in their interest not to split the network. So they can create rules of the kind "I won't build on top of blocks larger than X unless they are already Y blocks deep". They can actually have multiple rules with different values for X and Y. If most miners roughly agree on these parameters, it would be really hard to keep a chain with a block larger than such limits.

Plus, if they want, they can actually use the same mechanism to try to boycott miners who accept transactions with "too low fees": "I won't mine on top of blocks containing transactions paying less than X/KB unless they are already Y blocks deep". That would be another "decentralized" way to deal with the incentives issue, besides the possible insurers.
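The informal "X unless Y deep" rules above can be sketched as a simple acceptance predicate. This is a hypothetical illustration; the size and depth values are made up, not proposed parameters.

```python
# Each rule says: tolerate a block violating the limit only once it is
# buried deep enough that refusing it would mean abandoning too much work.
SIZE_RULES = [  # (max_size_bytes X, required_depth Y) -- illustrative values
    (1_000_000, 3),   # blocks over 1 MB: build on them only if 3+ deep
    (5_000_000, 20),  # blocks over 5 MB: only if 20+ deep
]

def build_on(block_size: int, depth: int) -> bool:
    """Return True if a miner following SIZE_RULES would extend this chain."""
    return all(block_size <= max_size or depth >= min_depth
               for max_size, min_depth in SIZE_RULES)

print(build_on(800_000, 0))    # small block: always fine
print(build_on(2_000_000, 1))  # oversized and shallow: refused
print(build_on(2_000_000, 5))  # oversized but buried: accepted
```

The depth escape hatch is what avoids a permanent network split: a miner with stricter parameters eventually gives in once the rest of the network has buried the oversized block.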

Also, CAN the block size limit be changed easily, can we reach a consensus without splitting the network?

It would be a backward-incompatible change, so it would need to be scheduled well in advance to minimize this risk.

@zellfaze: scaling exponentially is almost the same as having no limit in the first place. If a large mining network wants to start abusing it, this limit would make no difference; it would just delay the attack by a few hours or maybe days.

I'm not entirely sure. Since it would be faking transactions, only its blocks would push the limit up. Every other "true block" would be way below the limit, pushing the average down. The amount of waste this abusive network would manage to create would be proportional to the amount of processing power it has, and I don't think you can gather that many people for such a pointless attack...
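This claim, that waste stays proportional to the attacker's hashpower, can be checked with a toy simulation of the proposed moving limit (1 MB plus the average of the last 144 blocks). All the numbers below are illustrative assumptions: the attacker controls some fraction of hashpower and always fills blocks to the limit, while honest miners mine small blocks.

```python
from collections import deque

BASE = 1_000_000  # constant term in the proposed formula
N = 144           # averaging window (one day of blocks)

def simulate(attacker_share: float, honest_size: int, blocks: int = 20_000) -> int:
    """Simulate the moving limit when a fraction of hashpower always mines
    maximal blocks and the rest mine small ones (deterministic interleaving)."""
    window = deque([honest_size] * N, maxlen=N)
    limit = BASE + sum(window) // N
    acc = 0.0
    for _ in range(blocks):
        acc += attacker_share
        if acc >= 1.0:            # attacker's turn, proportional to hashpower
            acc -= 1.0
            window.append(limit)  # attacker fills the block to the limit
        else:
            window.append(honest_size)
        limit = BASE + sum(window) // N
    return limit

print(simulate(0.3, 100_000))  # converges near (BASE + 0.7*100_000)/0.7, ~1.53 MB
```

The limit settles at a finite fixed point rather than growing without bound, and the fixed point rises with the attacker's share, which matches the "proportional to processing power" intuition above.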
legendary
Activity: 1036
Merit: 1002
December 02, 2011, 08:23:07 PM
#88
Reviving this old topic just to say that my opinion on this subject has mostly changed, mainly after some very interesting talks with Stefan Thomas and others (even the great economist Detlev Schlichter was there) during the Bitcoin conference in Prague.

We probably don't need to bother with fixing a max block size in the protocol. The spam problem is a problem for pool operators and solo miners only, as already said in this thread. And the transaction fee "tragedy of the commons" scenario would probably be better solved by market agents (like insurers) than by arbitrary rules in the protocol. By fixing arbitrary rules, we will either not create the amount of incentives needed or, more likely, we will create more incentives than are actually needed, provoking unnecessary waste of resources.

I was already worried that nobody would discuss the topic when it turned out I didn't have time to come to Prague. Thanks for keeping it up.

Concerning the difficulty equilibrium: this is also the conclusion I'm currently at. Insurers are the way to go; this will keep tx fees very low, which is a very good thing.

But I'm still a little uncertain about the block size limit because of storage. A single miner with access to a lot of storage might flood the block chain to get rid of competitors with less storage capacity, so some mechanism against arbitrarily large blocks might be good. "The spam problem is a problem for pool operators", I agree, so what do we do about it?

Also, CAN the block size limit be changed easily, can we reach a consensus without splitting the network?


@zellfaze: scaling exponentially is almost the same as having no limit in the first place. If a large mining network wants to start abusing it, this limit would make no difference; it would just delay the attack by a few hours or maybe days.
full member
Activity: 141
Merit: 101
Security Enthusiast
November 30, 2011, 12:48:46 AM
#87
max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smoothing over 24 hours of transactions)

With this formula, asymptotically, the block size cannot increase by more than 2MB in 24 hours.
That is roughly 300,000 transactions a day.
(What about Visa spikes? Probably similar.)

This is a hard limit, so if bitcoins are still in use in a hundred years, maybe it would be better to scale exponentially. For example:
max block size = 1000000 + 1.01 × (average size of last N blocks in the best chain)
and the block size would scale up by about 2% per 24 hours.
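The two proposed adjustment rules can be sketched in code. This is only an illustration of the formulas quoted above; the sample block sizes are made-up numbers.

```python
def linear_limit(recent_sizes):
    """max block size = 1,000,000 + average size of the last N blocks."""
    return 1_000_000 + sum(recent_sizes) // len(recent_sizes)

def exponential_limit(recent_sizes, factor=1.01):
    """Variant where the average term is scaled by a factor, letting the
    limit compound over time instead of growing by a fixed step."""
    return 1_000_000 + int(factor * sum(recent_sizes) / len(recent_sizes))

# With a day's worth of ~500 KB blocks (N = 144):
recent = [500_000] * 144
print(linear_limit(recent))       # 1,500,000
print(exponential_limit(recent))  # 1,505,000
```

Note that spammers can only move the limit through the average term, so a burst of full blocks raises the ceiling slowly, over the whole N-block window.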


This is the way that I would do it personally. I don't really see much of a problem with it.
legendary
Activity: 1106
Merit: 1004
November 29, 2011, 05:09:38 PM
#86
Reviving this old topic just to say that my opinion on this subject has mostly changed, mainly after some very interesting talks with Stefan Thomas and others (even the great economist Detlev Schlichter was there) during the Bitcoin conference in Prague.

We probably don't need to bother with fixing a max block size in the protocol. The spam problem is a problem for pool operators and solo miners only, as already said in this thread. And the transaction fee "tragedy of the commons" scenario would probably be better solved by market agents (like insurers) than by arbitrary rules in the protocol. By fixing arbitrary rules, we will either not create the amount of incentives needed or, more likely, we will create more incentives than are actually needed, provoking unnecessary waste of resources.
legendary
Activity: 1526
Merit: 1134
I already explained in the "disturbingly low tx fee equilibrium" thread how I think fees will be set in future. It does not rely on altruism.
legendary
Activity: 1106
Merit: 1004
I don't believe artificial scarcity is a good plan nor necessary in the long run,
Forgot to say I fully agree, but it was probably obvious.

so requiring end-user software to enforce these sorts of rules makes me nervous. I don't plan on adding max size checks to BitCoinJ at least; they aren't even enforceable, since future SPV clients probably won't request full blocks.
Yes, as long as most miners perform size checks and refuse to build on top of obviously oversized blocks, everything should be fine.
There's no reason for light clients to check that.

So are you both among those who think miners would charitably work at a net loss for the "common good" in the distant future (not entirely impossible, but I'd rather not rely on that), or do you have another idea on how transaction fees will remain above 0.01 µBTC?
Or maybe you believe the entire user base can expect miners to set up an agreement regarding such artificial scarcity? It's true it may happen, but I feel uneasy about it... I'd feel more comfortable if this were set by a "client consensus" instead of a "miner consensus"...
gim
member
Activity: 90
Merit: 10
I don't believe artificial scarcity is a good plan nor necessary in the long run,
Forgot to say I fully agree, but it was probably obvious.

so requiring end-user software to enforce these sorts of rules makes me nervous. I don't plan on adding max size checks to BitCoinJ at least; they aren't even enforceable, since future SPV clients probably won't request full blocks.
Yes, as long as most miners perform size checks and refuse to build on top of obviously oversized blocks, everything should be fine.
There's no reason for light clients to check that.
gim
member
Activity: 90
Merit: 10
Visa handles around 8,000 transactions per second during holiday shopping and has burst capacity up to 10,000tps.
Ugh... I'm realizing I underestimated it by a lot in my first post.
Assuming the daily transaction volume is, let's say, 1/100 of this peak, the proposed linear adjustment is definitely far too small for that scale.
The 1MB constant should be multiplied by at least 20.

Exponential adjustment is not an option, IMO. Localized spammers can't exploit it anyway (even with a sloppy 5%, 10%, or even 100% per-day adjustment).
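A quick back-of-the-envelope check of that "at least 20×" multiplier, assuming a sustained rate of 1/100 of the 8,000 tps peak and an average transaction size of ~300 bytes (the transaction size is an assumption for illustration, not a figure from the thread):

```python
PEAK_TPS = 8_000                 # Visa's reported holiday peak
SUSTAINED_TPS = PEAK_TPS / 100   # the 1/100-of-peak assumption above
TX_BYTES = 300                   # assumed average transaction size
BLOCK_INTERVAL = 600             # seconds between blocks

txs_per_block = SUSTAINED_TPS * BLOCK_INTERVAL  # 48,000 transactions
bytes_per_block = txs_per_block * TX_BYTES      # 14,400,000 bytes
print(bytes_per_block / 1_000_000)  # ~14.4 MB per block
```

At roughly 14 MB per block under these assumptions, a 1 MB constant term is indeed off by more than an order of magnitude, consistent with the ×20 estimate.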
legendary
Activity: 1526
Merit: 1134
The problem is that we will never reach the point at which the block size "needs" to be increased because by the time this is regularly happening people will be trying BitCoin, deciding that transactions are expensive and slow, then leaving again. It's a self-fulfilling prophecy in that sense.
member
Activity: 98
Merit: 13
And the only reason BitCoin cares about wire-size is that we're afraid of scaling up the system, pretty much.

Well, it's still pretty cheap to dump tons of useless data into the block chain.

Satoshi didn't seem to think the block size limit should be changed... until it needed to be.  Right now, we are nowhere near the limit, so his rationale still seems sound.

legendary
Activity: 1708
Merit: 1010
Why would it matter? BitCoin is the only financial system I know of that cares about wire-size. Any serious processing company just builds a few datacenters and you're done. They don't even have to be very big.

And the only reason BitCoin cares about wire-size is that we're afraid of scaling up the system, pretty much.

Wire-size?

Scaling isn't really an issue if the system is suited to compensate the network for the resources; that's what I'm concerned about. If we choose an algorithm that just permits limitless growth, then we might as well just remove the block size limit altogether and cross our fingers, because the result is the same. We don't have the option of "just build a few datacenters", because this is the method by which we must pay for those datacenters, and ours need to be bigger and faster than anyone else's.
legendary
Activity: 1526
Merit: 1134
Why would it matter? BitCoin is the only financial system I know of that cares about wire-size. Any serious processing company just builds a few datacenters and you're done. They don't even have to be very big.

And the only reason BitCoin cares about wire-size is that we're afraid of scaling up the system, pretty much.