
Topic: Block size limit automatic adjustment - page 2. (Read 14572 times)

legendary
Activity: 1708
Merit: 1010
Visa handles around 8,000 transactions per second during holiday shopping and has burst capacity up to 10,000 tps.


What's the average size of a simple transaction?
legendary
Activity: 1708
Merit: 1010

Quote
How does this do anything but grow?

Not sure if I am answering your question, but y = mx + b is a high school algebra equation for a line on a graph.  Using this equation or some other polynomial equation to predict the size of the next block shouldn't be too hard.  Just plug in values for m, x, and b and solve for y.


That wasn't really what I was asking.  I'm not a math geek, I'm an econo-geek (and a radio geek, but that's not relevant).  I think that a simple equation to predict the trend in order to set a blocksize has the incentives wrong, and almost certainly trends toward infinity, because both those paying for transactions to be processed and miners have an incentive for every transaction to be included in every block.  Then we truly do have a 'tragedy of the commons' situation: the blocksize shoots to the moon, senders no longer have an incentive to pay anything over a token fee, and miners start dropping out because the fees can't cover the cost of bandwidth and electricity, resulting in a difficulty level that is too low to defend the network as the block reward is reduced.  There needs to be some mechanism that resists arbitrary growth of the blocksize, even if only a little.  Tying the max blocksize to the difficulty in some linear fashion is a smooth way to do this.  I'm not married to the details, but the implementation just seems clean to me.  I have no concept of how difficult that would be to implement in the code, because I'm not a coder, but I imagine it would still be easier than a rolling average or a predictive algorithm, because it's just linear math.
legendary
Activity: 1304
Merit: 1015
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good, but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine whether the growth is linear, exponential, or polynomial.  Then we solve the resulting linear or polynomial equation for the N+1 point.  Basically, this method attempts to predict the size of the next block.

We can start off simple and just use y = mx + b.

How does this do anything but grow?

Not sure if I am answering your question, but y = mx + b is a high school algebra equation for a line on a graph.  Using this equation or some other polynomial equation to predict the size of the next block shouldn't be too hard.  Just plug in values for m, x, and b and solve for y.

http://www.math.com/school/subject2/lessons/S2U4L2DP.html

I think Gavin is right in that we need some data, maybe plot it on a graph, and determine which method/equation best fits that graph.

Just my two millicoins.
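
For illustration, a least-squares fit of y = mx + b to the last N block sizes, extrapolated one block ahead, might look something like this (a rough sketch only; the function name and the handling of short histories are assumptions, not anything from the client):

Code:
#include <cstddef>
#include <vector>

// Fit y = m*x + b to the sizes of the last N blocks (x = 0 .. N-1),
// then extrapolate to x = N, the "next" block.
double PredictNextBlockSize(const std::vector<double>& sizes) {
    const std::size_t n = sizes.size();
    if (n < 2) return n ? sizes[0] : 0.0;   // not enough history to fit a line
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < n; ++i) {
        sx  += i;
        sy  += sizes[i];
        sxx += (double)i * i;
        sxy += (double)i * sizes[i];
    }
    double m = (n * sxy - sx * sy) / (n * sxx - sx * sx);  // slope
    double b = (sy - m * sx) / n;                           // intercept
    return m * n + b;  // plug x = n into y = m*x + b
}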
legendary
Activity: 1246
Merit: 1016
Strength in numbers
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?

I like it. Don't worry about Christmas, I'm pretty sure that's a bubble.
legendary
Activity: 1526
Merit: 1134
Visa handles around 8,000 transactions per second during holiday shopping and has burst capacity up to 10,000 tps.

Of course MasterCard also handles quite a bit. I don't have figures for them but I guess it'd be in the same ballpark.

I don't believe artificial scarcity is a good plan, nor necessary in the long run, so requiring end-user software to enforce these sorts of rules makes me nervous. I don't plan on adding max size checks to BitCoinJ at least; they aren't even enforceable, as future SPV clients probably won't request full blocks.
legendary
Activity: 1708
Merit: 1010
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good, but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine whether the growth is linear, exponential, or polynomial.  Then we solve the resulting linear or polynomial equation for the N+1 point.  Basically, this method attempts to predict the size of the next block.

We can start off simple and just use y = mx + b.

How does this do anything but grow?
gim
member
Activity: 90
Merit: 10
max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

With this formula, asymptotically, the block size cannot increase by more than 2 MB per 24 hours.
That is roughly 300,000 transactions a day.
(What about Visa spikes? Probably similar.)

This is a hard limit, so if bitcoins are still in use in a hundred years, maybe it would be better to scale exponentially. For example:
Quote
max block size = 1000000 + 1.01 * (average size of last N blocks in the best chain)
and the block size could scale up by (about) 2% per 24 hours.

Yes, that is one more random constant :p
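
A minimal sketch of that variant, assuming the average is taken over the last 144 blocks as Gavin suggested (names and the empty-history fallback are illustrative):

Code:
#include <cstdint>
#include <numeric>
#include <vector>

// gim's variant: a small multiplier lets the cap compound,
// roughly 2% per 24 hours while blocks stay full.
uint64_t MaxSizeExponential(const std::vector<uint64_t>& last144Sizes) {
    if (last144Sizes.empty()) return 1000000;
    double avg = std::accumulate(last144Sizes.begin(), last144Sizes.end(), 0.0)
                 / last144Sizes.size();
    return 1000000 + (uint64_t)(1.01 * avg);
}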
legendary
Activity: 1708
Merit: 1010
I was thinking about all this on my commute to work, and I have a proposal.

max block size = 1,000,000 + (difficulty * K) bytes

Wherein K = some factor high enough that the max block size is never really an issue, say K = 2.  But some analysis is due on that.

But here is another change to the default fee schedule, granted that individual miners are likely to have tighter requirements themselves than this...

First, the max block size calculated above becomes the basis metric for a set of soft max block sizes.  I suggest 16 tiers of equal size.

0 to 1 sixteenth of the max block size: no special requirements; miners can include whatever valid transactions they desire up to this point.

1 to 2 sixteenths: at least one transaction paying a fee equal to or greater than the minimum fee required for unusual transactions must be present.  That transaction can be one where the fee was required or not.  As long as at least one is present, miners can include whatever else they desire up to this limit.

2 to 3 sixteenths: at least one transaction paying a fee double that of the class above must be present.

3 to 4 sixteenths: at least one transaction paying at least double the fee of the rule above this one must be present.

And so on, so the fee paid by the highest fee-paying transaction sets the bar for the block, and then the miner can include whatever other transactions it sees fit.  This not only encourages the use of -sendtomany whenever possible, which is more efficient for the network anyway; most of the fee-paying transactions (and free transactions) are then competing for the fill-in space left by the one transaction that is paying for the bandwidth.  It also sets up a method of ongoing price discovery, as any client can look at the last block and its own transaction queue and predict how much it will have to pay in order to get into the next block (probably equal to or higher than the highest single fee in the last block if the queue is steady, slightly more if it is growing, slightly less if it is dropping).  It likewise establishes a bidding mechanism for the 'median' transaction to be included in a block in the near future: all transactions besides the high one are bidding for the remaining space, each sender looking at its own queue of transactions, guessing which will be the high one (and therefore the size of the space available), and looking at the second highest to outbid if it wishes to be included in the next block.

In this way, the well-heeled senders set the bar.  Imagine if Wal-Mart, which has half a million employees to pay each week, were to compile that entire payroll into a single -sendtomany transaction.  They would be able to definitively determine the minimum fee they would have to offer just to be considered, based solely on the actual size of the transaction, and then be able to guess how much more they should offer based upon how many large senders there were in the previous several blocks.  Say this transaction had a million outputs (probably 10 million inputs) and was 3.2 MB once done.  The difficulty was 2 million at the last adjustment, so Wal-Mart knows that the max block size is 5 MB.  In order to fit their 3.2 MB single transaction into the block, they have to offer a fee at least 16 times the minimum fee (5/8 = 0.625, 3.2/0.625 = 5.12, so 6th tier; the first tier is free, the second is equal to the minimum fee, so the 6th tier is 4 doublings of the minimum fee).  If the minimum is 0.01, then Wal-Mart pays at least 0.16 just to qualify.

EDIT: somewhere I switched my numbers in my head from 16 tiers to only eight.  So my numbers are wrong, but hopefully I conveyed the idea.
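
A rough sketch of the tiered idea as described (16 tiers, K = 2, and a 0.01 minimum fee are the values floated above; the function names and tier arithmetic are illustrative assumptions, and with the full 16 tiers the Wal-Mart numbers come out differently, as the EDIT admits):

Code:
#include <cstdint>

static const uint64_t HARD_MAX = 1000000;  // current 1 MB limit
static const double   K        = 2.0;      // "some factor high enough", per the post
static const double   MIN_FEE  = 0.01;     // minimum fee for unusual transactions (assumed)

// max block size = 1,000,000 + (difficulty * K) bytes
uint64_t MaxBlockSizeTiered(double difficulty) {
    return HARD_MAX + (uint64_t)(difficulty * K);
}

// Highest single fee that must be present in a block of the given size.
// The first sixteenth is free; each tier above that doubles the requirement.
double RequiredTopFee(uint64_t blockSize, double difficulty) {
    uint64_t tierWidth = MaxBlockSizeTiered(difficulty) / 16;
    uint64_t tier = blockSize / tierWidth;   // 0 .. 15
    if (tier > 15) tier = 15;
    if (tier == 0) return 0.0;               // free tier
    double fee = MIN_FEE;
    for (uint64_t i = 1; i < tier; ++i) fee *= 2.0;  // one doubling per tier above the first paid tier
    return fee;
}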
legendary
Activity: 1106
Merit: 1004
How about we try to predict the size of the next block?  We take the last N blocks and determine whether the growth is linear, exponential, or polynomial.  Then we solve the resulting linear or polynomial equation for the N+1 point.  Basically, this method attempts to predict the size of the next block.

That starts to get more complex than it needs to be, IMHO. As long as the readjustment delay is short (24 hours, for example, as Gavin suggested), any formula which slightly increases the last average size should be fine. Maybe just making the increase relative instead of absolute would help with commercial holidays.
legendary
Activity: 1304
Merit: 1015
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?


I think averaging the "last N blocks in the best chain" is good, but there may be a better way.  How about we try to predict the size of the next block?  We take the last N blocks and determine whether the growth is linear, exponential, or polynomial.  Then we solve the resulting linear or polynomial equation for the N+1 point.  Basically, this method attempts to predict the size of the next block.

We can start off simple and just use y = mx + b.
newbie
Activity: 1
Merit: 0
In the long run, miners are all going to have their own rules on fee schedules; the best we can do is set the default rules with the expectation that one day they will be ignored.

It will be in the big miners' interest to make the most profit (the sum of all fees).  That might mean a smaller number of transactions, each paying a large fee, or many, many small-fee transactions.


I propose that the fee schedule is:

A: (optional) The first 100 KB is open to any transactions.  This is not adjusted no matter the block size or fees; the miner can optionally not include any free transactions.

B: (recommended) The next 100 KB goes to the highest-fee transactions; a miner must include up to 100 KB of the highest-fee transactions.  (I don't know if you could enforce this.)

C: (enforced max) Based upon the average of section B over the last 100 blocks, a miner can accept additional transactions up to:

Max size of section C = (total fees in this block's B section) / (average fees of the last 100 B sections) * 100 KB

Total max: must not be over 100x the average size of the last 6 blocks.  (It can grow very large very quickly if those making the transactions are willing to pay for it.)


Why I propose the above schedule:

1.  It has a no-cost but limited-size area for any transactions of the miner's choice, e.g. the miner can choose to include transactions from his buddies with no transaction fee.  (Section A)

2.  Top-priority transactions have a dedicated place in every block to compete for.  (Section B)

3.  If there is strong demand for fee-paying transactions, then the blocks will scale quite large very quickly (e.g. Christmas shopping).

4.  For very large blocks, the total fees must always be significantly more than the average.


I have put quite a bit of thought into this fee schedule; I would love the forum's comments on it.

Overall, whatever we decide will not matter, as one day the big miners will decide for themselves... This is just my best guess about what will fit the natural economics of bitcoin.
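
A minimal sketch of the A/B/C schedule above, assuming "average fees of the last 100 B sections" means the average total fee collected in section B over the last 100 blocks (names, types, and the zero-average fallback are illustrative):

Code:
#include <algorithm>
#include <cstdint>

static const uint64_t KB100 = 100 * 1000;  // 100 KB, the size of sections A and B

// Section C grows with how much this block's B-section fees exceed the recent norm.
uint64_t MaxSectionC(double feesInB, double avgFeesInB_last100) {
    if (avgFeesInB_last100 <= 0.0) return 0;
    return (uint64_t)(feesInB / avgFeesInB_last100 * KB100);
}

// Whole block: A + B + C, hard-capped at 100x the average size of the last 6 blocks.
uint64_t MaxBlockSizeABC(double feesInB, double avgFeesInB_last100,
                         uint64_t avgSize_last6) {
    uint64_t size = KB100 /* A */ + KB100 /* B */
                    + MaxSectionC(feesInB, avgFeesInB_last100);
    return std::min(size, 100 * avgSize_last6);
}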
legendary
Activity: 1708
Merit: 1010
Automatic adjustments based on difficulty assume difficulty will scale with traffic.

Here's how it scales automatically:

If blocks are getting full, people pay higher fees to get their transactions in the block. Increased mining profitability causes increased mining which causes increased difficulty.

I agree with this perspective.  This simple rule maintains scarcity, prevents scalability issues, and is likely to find its own equilibrium via transaction price discovery.
donator
Activity: 826
Merit: 1060
Automatic adjustments based on difficulty assume difficulty will scale with traffic.

Here's how it scales automatically:

If blocks are getting full, people pay higher fees to get their transactions in the block. Increased mining profitability causes increased mining which causes increased difficulty.
legendary
Activity: 1652
Merit: 2311
Chief Scientist
I'd tweak the formula to be:  max block size = 1000000 + (int64)(difficulty)

... just to avoid "if block number is < X max block size = 1000000 else..." logic.  Adding in the current 1MB max limit means all the old blocks are valid under the new rule.

I like Mike's point that difficulty and transaction volume aren't necessarily related.  Maybe a better formula for miners would be something like:

max block size = 1000000 + (average size of last N blocks in the best chain)
... where N is maybe 144 (smooth over 24-hours of transactions)

Anybody have access to what Visa daily transaction volume looks like in the days around Christmas?  Are there huge, sudden spikes that the above formula wouldn't handle?
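
For concreteness, the two candidate rules side by side as a rough sketch (not actual client code; the names and the empty-history fallback are assumptions):

Code:
#include <cstdint>
#include <numeric>
#include <vector>

static const uint64_t HARD_MAX = 1000000;  // current 1 MB limit

// Rule 1: grow the cap with difficulty.
uint64_t MaxSizeFromDifficulty(double difficulty) {
    return HARD_MAX + (uint64_t)difficulty;
}

// Rule 2: grow the cap with the average size of the last N blocks (N = 144, ~24 hours).
uint64_t MaxSizeFromAverage(const std::vector<uint64_t>& lastSizes) {
    if (lastSizes.empty()) return HARD_MAX;
    uint64_t sum = std::accumulate(lastSizes.begin(), lastSizes.end(), (uint64_t)0);
    return HARD_MAX + sum / lastSizes.size();
}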
legendary
Activity: 1106
Merit: 1004
Automatic adjustments based on difficulty assume difficulty will scale with traffic. I'm not convinced that relationship will hold.

Neither am I. Using the size of the last X blocks seems more reasonable.
legendary
Activity: 1106
Merit: 1004
I think non-miners don't need to check the block size even if they are full nodes

I'm not really convinced of that...

There are some arbitrary rules regarding what a valid block is that are of interest to the entire bitcoin community, not only miners.  And I'm not talking about obvious rules like no double-spending or signature validation.  I mean rules like the difficulty factor or block rewards, for example.  These two concern inflation control, which is of interest to every bitcoin user.

Of course, miners that disagree with the current rules could always try to change them.  But if users reject their blocks, the result of their mining may be worth much less, as it would be a fork used by few.
So, when users validate blocks, they create a strong incentive for miners to obey the consensus of the entire user base.  If instead users accept all blocks that miners decide to build upon, then it's up to the miner consensus alone to decide these kinds of rules.  Even if the rules change to something which is not really in the interest of the entire user base, users will passively accept it.

I think that the maximum block size is a rule of this kind.  It's not only about spam; it's about creating an artificial scarcity too.
It's true that miners may come up with a good agreement, since this artificial scarcity is good for them, but still, it sounds dangerous to me for the entire user base to give a blank check to miners to decide on that entirely on their own... don't you think?
donator
Activity: 826
Merit: 1060
How about: The maximum block size equals the higher of: (a) the current hard-coded maximum block size, and (b) 'difficulty' bytes.

That's awesome...

EDIT:  Rule (b) might have to be some agreed upon multiple of difficulty, however.  If the blocksize does not naturally increase until difficulty is over one million, I'm afraid that we really would have some scalability issues.
What? Difficulty will be above one million real soon now. Two to three months probably.

Quote
And Rule (a) should be reduced by half at least.
Why risk compatibility with existing software, just for the sake of a minor tweak that will only be relevant for the next two or three months?
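
The rule under discussion, as a one-line sketch with the possible multiple of difficulty left as a parameter (illustrative only, not actual client code):

Code:
#include <algorithm>
#include <cstdint>

// max block size = the higher of the hard-coded 1 MB limit and 'difficulty' bytes,
// optionally scaled by some agreed-upon multiple (1.0 = the original proposal).
uint64_t MaxBlockSizeSimple(double difficulty, double multiple = 1.0) {
    return std::max((uint64_t)1000000, (uint64_t)(difficulty * multiple));
}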
sr. member
Activity: 440
Merit: 250
Here's an idea that might be dismissed as stupid-simple, but sometimes stupid-simple ideas work really well.

How about: The maximum block size equals the higher of: (a) the current hard-coded maximum block size, and (b) 'difficulty' bytes.

I was just thinking that block size should depend on some measurement of how much it taxes the network, just like difficulty measures how fast blocks are being found and corrects that rate.

I have a hard time seeing how to objectively measure block-size impact (which is global!) in a way similar to mining, though.
legendary
Activity: 1526
Merit: 1134
There are actually several block size limits. Practically speaking, the client holds incoming messages in RAM. Gigabyte-sized blocks would require a gigabyte of RAM to receive. Any block size limit would have to ensure the max message size is also adjusted to take it into account, at least until blocks are distributed as header + tx hash lists.

We can probably just set it to a gigabyte max for non-miners and forget about it for a while. That's approximately what it'd take to keep up with VISA - might as well aim high, right? :-) Miners can be more distinguishing as long as they're responsive.

Automatic adjustments based on difficulty assume difficulty will scale with traffic. I'm not convinced that relationship will hold. If there's going to be an automatic formula (median size of recent blocks * 1.1) seems like as good as any, and is also simple.
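
That last formula as a rough sketch (the window length isn't specified in the post; 144 blocks would be one plausible choice, and the names here are illustrative):

Code:
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Median size of recent blocks times 1.1.
uint64_t MaxSizeFromMedian(std::vector<uint64_t> recentSizes) {
    if (recentSizes.empty()) return 1000000;
    std::size_t mid = recentSizes.size() / 2;
    // Partially sort so the element at 'mid' is the median (upper median for even counts).
    std::nth_element(recentSizes.begin(), recentSizes.begin() + mid, recentSizes.end());
    return (uint64_t)(recentSizes[mid] * 1.1);
}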
legendary
Activity: 1708
Merit: 1010
Here's an idea that might be dismissed as stupid-simple, but sometimes stupid-simple ideas work really well.

How about: The maximum block size equals the higher of: (a) the current hard-coded maximum block size, and (b) 'difficulty' bytes.

That's awesome.  And I agree it's sledgehammer simple.  I like that rule better than mine.  The free section and the tiered fee schedule would have to become percentages of that number; also sledgehammer simple.  No fudge factors.  Elegant.

EDIT:  Rule (b) might have to be some agreed upon multiple of difficulty, however.  If the blocksize does not naturally increase until difficulty is over one million, I'm afraid that we really would have some scalability issues.  Can anyone think of a metric that can be used for that multiple, or must it be fixed?

EDIT #2:  And Rule (a) should be reduced by half at least.