
Topic: How a floating blocksize limit inevitably leads towards centralization

legendary
Activity: 1064
Merit: 1001
How about something as simple as Satoshi's reward-halving method?

Simply double the block size when you halve the block-reward size.

This doesn't respond directly to scarcity: it will produce either too little scarcity or too much, and for long stretches at a time, since the reward changes only once every four years. Furthermore, we don't know whether the maximum block size would cross the threshold that pushes smaller miners out. It breaks both of the guidelines I established earlier:

Quote
1) React to scarcity
2) Prevent centralization by not forcing out marginal miners

I think that leaving the maximum block size as it is would be preferable to doubling it periodically, because at least with the current scheme of a fixed size we are guaranteed not to marginalize small miners (though it fails to react to scarcity).
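For reference, the schedule under discussion amounts to the following, as a minimal Python sketch; it assumes the historical constants (a 210,000-block halving interval and the current 1 MB cap), and the names are illustrative rather than taken from any client:

Code:
HALVING_INTERVAL = 210000          # blocks between subsidy halvings
INITIAL_SUBSIDY = 50 * 100000000   # 50 BTC, in satoshis
INITIAL_MAX_SIZE = 1000000         # 1 MB hard limit, in bytes

def subsidy_at(height):
    """Block subsidy in satoshis, halving every 210,000 blocks."""
    return INITIAL_SUBSIDY >> (height // HALVING_INTERVAL)

def max_block_size_at(height):
    """The proposal: the cap doubles on the same schedule the subsidy halves."""
    return INITIAL_MAX_SIZE << (height // HALVING_INTERVAL)

# e.g. at height 420000 (two halvings): subsidy 12.5 BTC, cap 4 MB

Note that the cap grows geometrically regardless of demand, which is exactly the failure to react to scarcity described above.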
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
The cost would scale directly with how many transactions are sent and how much is paid in fees. We have no way of knowing how much protection is enough. We can't define it in a fixed fashion. The best way is to have some sort of market: more transactions and more fees would lead to more protection; less would lead to less protection.

What is enough and what is not enough? That is a problematic fundamental question.
hero member
Activity: 504
Merit: 500
WTF???
How about something as simple as Satoshi's reward-halving method?

Simply double the block size when you halve the block-reward size.

Voila, all done for the next 136 years, nice and predictable, plenty of time to prepare for etc etc etc.

-MarkM-


Bingo. And we should do the first doubling within the next 12 months, because we missed the first halving.

Dynamic, not open-ended, and completely predictable. Probably not what either the yays or the nays want, but a simple middle ground.
legendary
Activity: 2940
Merit: 1090
The point is that if this system were in place, and the fixed reward were small, there would be no "ASIC boom 2". There would be no "massive investments in next gen ASIC". This is because there would be a super incentive to not have too much mining power. Not only would the difficulty increase massively, but so would the block size, leading to lower fees from the users. Justifying the investments in massive amounts of new mining would be quite impossible.

And that would make carrying out a 51% attack that much cheaper for attackers.

-MarkM-
legendary
Activity: 1064
Merit: 1001
there would be a super incentive to not have too much mining power.

So your tradeoff would be to sacrifice network hash rate in favor of more transactions? I can't imagine anyone would accept that.
legendary
Activity: 2940
Merit: 1090
How about something as simple as Satoshi's reward-halving method?

Simply double the block size when you halve the block-reward size.

Voila, all done for the next 136 years, nice and predictable, plenty of time to prepare for etc etc etc.

-MarkM-
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
But it does, and I gave an example. The transition from FPGA/GPU to ASIC will cause the network hashing rate to skyrocket in a way that is totally unconnected to the value of Bitcoin or the amount collected in fees. This alone should tell you that the connection between hash rate and transaction scarcity is tenuous at best (non-existent at worst, as is the case currently). If we had this system in place now, it would cause the block size to grow despite the absence of scarcity, resulting in less miner revenue not more.

I hope that we can put to rest the idea of tying the block size to the network hash rate.

I don't think we can put it to rest quite yet. Remember that currently we have a fixed reward of 25 BTC per block for mining. If this reward were in place forever, block size scarcity in terms of miner incentive would be irrelevant. Well, not completely: even with a fixed reward, it would eventually be such a small part of the monetary base that it wouldn't be enough. But that would take a very long time.

The point is that if this system were in place, and the fixed reward were small, there would be no "ASIC boom 2". There would be no "massive investments in next gen ASIC". This is because there would be a super incentive to not have too much mining power. Not only would the difficulty increase massively, but so would the block size, leading to lower fees from the users. Justifying the investments in massive amounts of new mining would be quite impossible.

As a compromise, though, there could be a limit on how much the block size can grow in any one adjustment (which is what MoonShadow proposed).
legendary
Activity: 924
Merit: 1004
Firstbits: 1pirata
we should see what happens as we run into the soft blocksize limits...what do you predict will happen?

In this order:

1. Most blocks are at or near the 250 kilobyte soft limit.
2. The memory pool of transactions grows due to insufficient space in blocks.
3. Users notice a trend of transactions taking longer to confirm, or not confirming at all.
4. Fees increase as users pay more to improve confirmation times.
5. Miners (or mining pools) modify code to select transactions with the highest fees per kilobyte to fit into blocks (see the sketch after this list). They remove the 250 kilobyte soft limit. Some miners disallow free transactions entirely.
6. Transactions clear much more quickly now, but fees decrease.
7. Blocks increase in size until they are at or near the one megabyte hard limit.
8. Fees start increasing. Free transactions rarely confirm at all now.
9. Small transactions are eliminated since they are not economically feasible. SatoshiDice increases betting minimums along with fees. The volume of SatoshiDice transactions decreases.
10. Users at the margins of transaction profitability with respect to fees are pushed off the network.
11. Many people, most non-technical, clamor for the block size limit to be lifted.
12. Fees reach an equilibrium where they remain stable.
13. Spurred by the profitability of Bitcoin transactions, alternate chains appear to capture the users that Bitcoin lost.
14. Pleased with their profitability, miners refuse to accept any hard fork to block size.
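
A toy sketch of the selection rule in step 5, in Python: choose transactions by fee per kilobyte until the 1 MB hard limit is reached. The names are illustrative, and real miner code would also have to respect transaction dependencies, signature-operation limits, and so on:

Code:
MAX_BLOCK_SIZE = 1000000  # bytes; the hard limit, soft limit already removed

def select_transactions(mempool):
    """mempool: list of (fee_in_satoshis, size_in_bytes) pairs."""
    # Sort by fee density, highest first.
    by_density = sorted(mempool, key=lambda tx: tx[0] / tx[1], reverse=True)
    block, used = [], 0
    for fee, size in by_density:
        if used + size <= MAX_BLOCK_SIZE:
            block.append((fee, size))
            used += size
    return block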


^ I like this. What do you think, Gavin? Could this be what Satoshi had in mind when implementing the halving of block rewards?
legendary
Activity: 1064
Merit: 1001
If I add a rule to my client that it quietly drops blocks that are over 250Kb, what have I done to you?

The change you are suggesting is to set MAX_BLOCK_SIZE to 250kb. Isn't that a hard fork?
legendary
Activity: 1072
Merit: 1181
I've done nothing of the sort.  If I add a rule to my client that it quietly drops blocks that are over 250Kb, what have I done to you?

Nothing, but you've kicked yourself off the network (until a majority of mining power plus a significant number of full nodes on the network implement your rule too).
legendary
Activity: 1064
Merit: 1001
if we scale up to a point where only the Googles of the world are capable of keeping up, what does that buy us?

We should

1) Have a more rigorous argument / test data that shows how a block size above a certain threshold will push miners off the system

2) Figure out how to relate measurements of the network to an estimate of where that block size threshold lies
legendary
Activity: 1708
Merit: 1010
Include the soft limit in the verification rules of as many clients as possible, and miners who first comment out that rule for themselves will be punished by the network, at least until a majority of users upgrade their clients to match.  The rest of the miners that didn't comment out the rule would benefit from the harm the first mover takes upon himself.

Huh...wha...eh??? This makes no sense. The "soft limit" is not a verification rule, it is part of the algorithm that the mining example code uses to put together a candidate block. It stops when it reaches 250kb. This doesn't mean that miners will reject blocks that are over 250kb, it just means that they will not PRODUCE them (unless someone modifies the code).

I know what it means.

Quote
Making the 250kb limit a verification rule of clients is a fork (not sure if it's a hard fork or a soft fork). It makes no sense to do this. You can't assume that everyone is going to upgrade to this version, nor should you assume that once this rule is adopted by clients it will ever go away. You have effectively reduced the 1 megabyte hard limit down to a 250 kilobyte hard limit. Good job, LOL, the opposite of what people are arguing for here!  Cheesy  Cheesy  Grin

I've done nothing of the sort.  If I add a rule to my client that it quietly drops blocks that are over 250Kb, what have I done to you?  Unless a majority of users also do so, I've done nothing.  Perhaps I don't drop it from my own chain, I just don't forward it.  It's something that I can do right now, and it's only effective if a significant number of others also do so.  However, if it does exist, its presence becomes an enforcement mechanism for the soft limit that miners presently abide by, one that can easily be removed simply by a significant portion of the users agreeing that it should be, and upgrading to the next version of their client with a higher soft limit.

For all we know, there are already clients that quietly drop blocks based upon the block reward address not being in their whitelist, or any other such metric.  Or even randomly.  None of this would matter until half of users followed the same rule.
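
As a minimal sketch of the forwarding rule described above, in Python (hypothetical names, not code from any actual client):

Code:
SOFT_LIMIT = 250000  # bytes

def should_relay(block_bytes):
    """Relay policy only: the block may still be accepted into the local chain."""
    return len(block_bytes) <= SOFT_LIMIT

As noted, the rule has no effect on the network until a significant fraction of nodes adopt it.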
legendary
Activity: 1064
Merit: 1001
Include the soft limit in the verification rules of as many clients as possible, and miners who first comment out that rule for themselves will be punished by the network, at least until a majority of users upgrade their clients to match.  The rest of the miners that didn't comment out the rule would benefit from the harm the first mover takes upon himself.

Huh...wha...eh??? This makes no sense. The "soft limit" is not a verification rule, it is part of the algorithm that the mining example code uses to put together a candidate block. It stops when it reaches 250kb. This doesn't mean that miners will reject blocks that are over 250kb, it just means that they will not PRODUCE them (unless someone modifies the code). This is neither a hard fork, nor a soft fork. Think of it as a "canary in the coal mine." Right now, there is little economic incentive to modify this piece of code, for two reasons: 1) the transaction volume is not high enough, and 2) block subsidies are orders of magnitude larger than fees. When these conditions change, miners at the margin will have a financial incentive to change the code. Someone like Gavin can study the blocks in the block chain to see what fraction of blocks are larger than 250kb. This will provide insights into how miners react to the soft limit.
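
The measurement itself is simple; a sketch in Python, where block_sizes is assumed to be an iterable of block sizes in bytes, however obtained:

Code:
SOFT_LIMIT = 250000  # bytes

def fraction_over_soft_limit(block_sizes):
    """Fraction of blocks whose size exceeds the 250 kB soft limit."""
    sizes = list(block_sizes)
    over = sum(1 for s in sizes if s > SOFT_LIMIT)
    return over / len(sizes) if sizes else 0.0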

Making the 250kb limit a verification rule of clients is a fork (not sure if it's a hard fork or a soft fork). It makes no sense to do this. You can't assume that everyone is going to upgrade to this version, nor should you assume that once this rule is adopted by clients it will ever go away. You have effectively reduced the 1 megabyte hard limit down to a 250 kilobyte hard limit. Good job, LOL, the opposite of what people are arguing for here!  Cheesy  Cheesy  Grin

The fundamental market logic behind that idea seems solid enough that it actually doesn't matter too much how the relation is calculated.

But it does, and I gave an example. The transition from FPGA/GPU to ASIC will cause the network hashing rate to skyrocket in a way that is totally unconnected to the value of Bitcoin or the amount collected in fees. This alone should tell you that the connection between hash rate and transaction scarcity is tenuous at best (non-existent at worst, as is the case currently). If we had this system in place now, it would cause the block size to grow despite the absence of scarcity, resulting in less miner revenue not more.

I hope that we can put to rest the idea of tying the block size to the network hash rate.

I believe that any scheme for adjusting the maximum block size should:

1) React to scarcity
2) Prevent centralization by not forcing out marginal miners
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

The fundamental market logic behind that idea seems solid enough that it actually doesn't matter too much how the relation is calculated. There could be some limit on how much the block size can change per adjustment, which is what MoonShadow suggested, if it's feared that the change could be too drastic. Otherwise it's all irrelevant, since the market will balance the block size regardless: if it's too high, miners will stop due to lack of fees and incentive, leading to a smaller block size; if it's too low, it will lead to more fees, more miner incentive, more miners, a higher difficulty, and a larger block size.

Quote
What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

What you described isn't a bad idea either. I don't have more advanced opinions on it yet, but it looked decent.
legendary
Activity: 1708
Merit: 1010
The problem is the baseline, and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?

This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

If we...use soft limits and/or other block verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interest.  Reputation matters here, even more so than it does in the "real" business world.

False. At least one of the pools will rewrite the soft limit to their economic advantage. Once this happens, members of other pools will quickly abandon ship and join the pool that modified the soft limit, since they have higher payouts.

You're saying that miners will choose to make less money rather than more? Huh Huh

I can see it now: "p2pool: smaller payouts, better reputation!"

Yeah, I don't think so.


Not a certainty; so don't depend entirely upon limits that can be commented out by miners.  Use block verification rules as well, which could be commented out by the users, but why would they do this?  The propagation of the block is very much part of the system.  Include the soft limit in the verification rules of as many clients as possible, and miners who first comment out that rule for themselves will be punished by the network, at least until a majority of users upgrade their clients to match.  The rest of the miners that didn't comment out the rule would benefit from the harm the first mover takes upon himself.
legendary
Activity: 2940
Merit: 1090
Miners like lots of paying transactions, but payment processors like lots of transactions that pay the payment processor, and if they can get those transactions into the blockchain without paying miners heck that is a nice bonus for them, is it not?

So it seems to me that outfits that make money by spamming as many transactions as they can into the blockchain, making their money on what they can charge whoever their customers are for either that spamming or the effects of that spamming or even the ultimate long term effects/results of that spamming, are another group of actors who stand to gain by the largest blocks they can convince miners to create. I am not sure whether they would also ally with miners in the forcing out of competing miners, but maybe if miners demand such co-operation in return for preference in getting into that miner's mined blocks maybe they would happily go along with it?

Mike suggested somewhere that one (such a one as Google, to give you an idea of the kind of one he is accustomed to) can handle massive numbers of transactions, even using commodity hardware (which Google is wont to do) by processing them with many machines in parallel, so no matter how many transactions pour in one can simply add more machines. Obviously for one such as Google, with their databases designed and proven for handling huge chunks (aka blocks) of data, handling large blocks is also not a problem.

So if we scale up to a point where only the Googles of the world are capable of keeping up, what does that buy us? Nice high value for our hoards of bitcoins, hopefully, but what else? And at what cost in terms of freedom, accountability and the like?

Maybe acquire deepbit while one is at it, maybe a couple of other pools? How many would one need to acquire, if even any at all, to reach a point where one's specialised ability to verify transactions, possibly accompanied by enough miner co-operation or acquisition of enough mining pools and/or ASIC fleets, lets you squeeze/force everyone else other than maybe Microsoft (would they even care?), Yahoo (would they?), Amazon (hmm, they have a payment processor, would they maybe put up some resistance or be just another don't care?), Paypal/Ebay (would even they care, really? Isn't an unverifiable competitor better for them than a verifiable one, one whose work/transactions they are not a party to verifying better than one whose they are?) and so on and so on out of the business?

Why the heck did we ever want more than just the single most-equipped-to-do-it player on the planet to verify transactions, again?

Can we leave watching over them to backwards-looking approaches, maybe? Never be able to catch up to where the blockchain is actually at but with a fleet of accounting/audit firms on the job be able to determine within a few weeks or months of some slipup that a slipup has happened, leaving the last several weeks of the chain invalid with respect to all transactions downstream of some erroneous one the accountants eventually discovered?

What if that "error" turned out to be a failure of a large number of silk road's coins to arrive at silk road's wallet?

Etc... All this push toward eliminating more and more of the population's chance of being able to verify makes me wonder why the heck we ever cared about verifying anything in the first place? Can't we just let Big Brother take care of everything as he always did/does, maybe even trusting that if blockchain technology has any application in such fields Big Brother will make use of it on our behalves?

A skeptical / cynical little voice in me scoffs ha ha you'd be lucky to only be halved, more likely you'll be quartered or decimated or worse...

-MarkM-
legendary
Activity: 1064
Merit: 1001
The problem is the baseline, and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?

This is what I mean when I say that an oracle would be needed to determine the formula relating difficulty to block size. There is absolutely no way to know what this formula would look like. The penalty for getting it wrong (and it would be wrong most of the time) is vanishing fees, resulting in hysteresis (wild oscillations in network hash rate). ASICMiner's 24% of the network hash rate could become 51% after a difficulty adjustment.

What's wrong with the scheme I described, which responds directly to changes in transaction space scarcity?

If we...use soft limits and/or other block verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interest.  Reputation matters here, even more so than it does in the "real" business world.

False. At least one of the pools will rewrite the soft limit to their economic advantage. Once this happens, members of other pools will quickly abandon ship and join the pool that modified the soft limit, since they have higher payouts. More realistically, all the competitive pools would make this change as soon as a source code patch lifting the soft limit becomes available. The only reason you don't see it happening now is because the block reward subsidy is orders of magnitude greater than the fees. As the block rewards get cut in half, there will be increasing pressure on miners to optimize their selection of transactions, and that means abandoning the soft limit.

You're saying that miners will choose to make less money rather than more? Miners at the margin (those who would go bankrupt with the soft limit) obviously will choose to optimize the transaction selection code rather than going out of business. Your premise that reputation matters more than profit is wrong.
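A toy comparison of the economics here, in Python: block revenue under the 250 kB soft limit versus the 1 MB hard limit, for an assumed average fee density. All numbers are illustrative assumptions, not measurements:

Code:
FEE_PER_KB = 0.002  # BTC of fees per kB of transactions (assumed)

def block_revenue(subsidy_btc, block_kb):
    """Total miner revenue for one block: subsidy plus fees."""
    return subsidy_btc + FEE_PER_KB * block_kb

# With a 25 BTC subsidy the difference is negligible:
#   block_revenue(25, 250) = 25.5 BTC  vs  block_revenue(25, 1000) = 27.0 BTC
# With a 1.5625 BTC subsidy (four halvings later), it is not:
#   block_revenue(1.5625, 250) = 2.0625 BTC  vs  block_revenue(1.5625, 1000) = 3.5625 BTC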

legendary
Activity: 1708
Merit: 1010
and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?

Or it could be a hybrid of your two ideas.  The increase in difficulty triggers an increase in the blocksize, but not linearly.  For example, no matter how much the difficulty increases (beyond a minimum), the blocksize increases by 10%.  No matter how much the difficulty decreases (beyond a minimum), the blocksize decreases by 5%.  Or vice versa, depending upon which is more likely to result in a favorable scarcity.
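
A minimal sketch of that hybrid in Python; the step sizes and the threshold below which difficulty moves are ignored are illustrative assumptions:

Code:
UP_STEP = 1.10     # +10% when difficulty rose past some minimum threshold
DOWN_STEP = 0.95   # -5% when difficulty fell past some minimum threshold
MIN_CHANGE = 0.01  # ignore difficulty moves smaller than 1% (assumed)

def next_max_size(max_size, old_difficulty, new_difficulty):
    """Difficulty picks the direction; the step itself is a fixed percentage."""
    ratio = new_difficulty / old_difficulty
    if ratio > 1 + MIN_CHANGE:
        return int(max_size * UP_STEP)
    if ratio < 1 - MIN_CHANGE:
        return int(max_size * DOWN_STEP)
    return max_size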

Throw in my unlimited-if-all-free transactions rule.
legendary
Activity: 1708
Merit: 1010

Any adjustment to the maximum block size must preserve scarcity. The question is not how many transactions can be handled by a one gigabyte hard limit, but rather will a one gigabyte hard limit produce sufficient scarcity?


I don't agree that the hard limit is the only way to promote scarcity.  Bear in mind, no matter how we do this, the scarcity is still artificial.  If we don't do it right with a hard fork, we're stuck with it.  If we increase the hard limit to a high predicted future limit, and use soft limits and/or other block verification rules to impose scarcity on transaction processing, most of the miners & pools will abide by the soft rules even when commenting out those rules is provably in their own economic interest.  Reputation matters here, even more so than it does in the "real" business world.
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
This doesn't take scarcity into account and would require an oracle to provide the constants for the necessary formula linking size to difficulty. It's easy to see the case where difficulty outpaces transaction volume; we're about to see that now with ASICs coming online. Once the maximum block size is sufficiently large that all pending transactions can fit, we're back to the case where there's no limit and fees trend to zero. Hopefully this example kills off any ideas about tying block size to difficulty.

I don't think you thought it through. It does take scarcity into account. Whenever fees start trending towards zero, it will eventually lead to a decrease in mining power. Miners will stop mining. This will decrease difficulty, thus lowering the block size. Eventually there will be scarcity again, thus leading to increased fees, and thus more mining, and a larger block size. There would be an equilibrium and a market.

The problem is the baseline, and how much of a difficulty change equals how much of a block size limit change. The baseline could be the difficulty at the time of the change (for example), but the other part is tricky. Maybe it could be done as a percentage change: the limit would change by the same percentage as the difficulty?
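
A minimal sketch of that percentage idea in Python; the names and the proportional form are illustrative assumptions, not a worked-out proposal:

Code:
def adjust_max_size(max_size, old_difficulty, new_difficulty):
    """Scale the cap by the same ratio the difficulty just moved."""
    return int(max_size * (new_difficulty / old_difficulty))

# e.g. a 20% difficulty rise takes a 1 MB cap to 1.2 MB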