
Topic: A Scalability Roadmap - page 2. (Read 14919 times)

legendary
Activity: 1246
Merit: 1011
October 15, 2014, 07:00:54 AM
#92
Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's has locations all over the world.
That's not the point. McDonald's is also not a cartel. Geography plays a part in where you can have beef or pork sandwiches as well. My point is that nobody has every competitive edge in everything. Anyone who can gain a competitive edge over certain resources will attempt to exploit it. This merely addresses profitability. If you want to realize an actual profit, you will need customers, and customers need incentives.

Certainly, McDonald's is not a monopoly; it is a single player in a competitive market.  I merely wished to highlight that geographical diversity of the factors of production alone is not sufficient to ensure decentralisation of mining.

The point that "nobody has every competitive edge in everything" is what I'm worried about.  To be clear, I'm not saying that free-market forces cause an industry to converge to monopoly, far from it.  I'm suggesting that comparing Bitcoin mining with a typical industry requires care because the miners have some influence over the very nature of Bitcoin itself.
legendary
Activity: 1400
Merit: 1013
October 15, 2014, 06:51:57 AM
#91
Sure.  By centralisation here I'm referring to the gradual reduction in the number of block-generating entities.  To be clear, I claim that, absent a block-size limit, this centralisation process would occur naturally, and that a good relay bandwidth market would accelerate it.

Always happy to be proven wrong; just want to give you something more concrete to work with.
The number of individuals who control hashing equipment has been increasing since 2008, a period during which the block size limit has been effectively non-existent (because tx volume is too low to be affected by the limit).

Why are you predicting that this trend would reverse instead of continue?
legendary
Activity: 1246
Merit: 1011
October 15, 2014, 06:43:16 AM
#90
Users want their transactions to be relayed to miners.
Miners want those transactions to reach them so they can earn the associated fees.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.

Can you define decentralization/centralization in this context?

Sure.  By centralisation here I'm referring to the gradual reduction in the number of block-generating entities.  To be clear, I claim that, absent a block-size limit, this centralisation process would occur naturally, and that a good relay bandwidth market would accelerate it.

Always happy to be proven wrong; just want to give you something more concrete to work with.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
October 15, 2014, 06:25:33 AM
#89
Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's has locations all over the world.
That's not the point. McDonald's is also not a cartel. Geography plays a part in where you can have beef or pork sandwiches as well. My point is that nobody has every competitive edge in everything. Anyone who can gain a competitive edge over certain resources will attempt to exploit it. This merely addresses profitability. If you want to realize an actual profit, you will need customers, and customers need incentives.
member
Activity: 129
Merit: 14
October 15, 2014, 05:47:43 AM
#88
I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y Bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.


With the current mining dynamics my proposal would not be suitable, for the reasons you suggest.  I merely suggest it as an eventual objective for when the block reward becomes low and mining hopefully becomes more decentralised, competitive and fee driven.  If mining doesn't develop this way then Bitcoin may not be sustainable in the long run anyway.  We could still keep another maximum-of-maximums block size limit based on bandwidth considerations, and this transaction-fee-targeting limit system could then operate within it.

Whatever happens to the hash rate, total mining revenue represents the "economic value" of network security.  For example, currently the security of the network can be considered as 25 bitcoin per block, regardless of the large hash rate increases, as in theory 25 bitcoin per 10 minutes is the cost of mining.  In the future the value of the total transaction fees will represent the network security, and therefore the dynamics which determine the fees will be vital.  Having "supply" potentially grow exponentially forever may not be appropriate.

The above proposal could be a good framework for a discussion on how the dynamics for the transaction fees could be determined in the future.  The system is kind of an aggregate transaction fee targeting scheme.  For example, a target of 1 bitcoin per block is around 50,000 bitcoin per annum, or 0.24% of the eventual total supply per annum.  Deciding if this is a suitable level would be difficult.  Is 0.24% high enough to secure the network, or should it be 1%?  What if the number is too high and we cause an arbitrarily high amount of environmental damage?
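As a quick sanity check, the fee-targeting arithmetic above can be reproduced in a few lines (the 1 bitcoin per block target is just the example figure; the 0.24% comes from rounding the annual total down to 50,000):

```python
# Sanity check of the fee-targeting arithmetic in the post above.
BLOCKS_PER_YEAR = 6 * 24 * 365        # one block per ~10 minutes
TOTAL_SUPPLY = 21_000_000             # eventual total supply in BTC

fee_target_per_block = 1.0            # BTC per block, the example target
annual_fees = fee_target_per_block * BLOCKS_PER_YEAR
share_of_supply = annual_fees / TOTAL_SUPPLY

print(annual_fees)                      # 52560.0, i.e. "around 50,000 bitcoin per annum"
print(round(100 * share_of_supply, 2))  # ~0.25% (using the rounded 50,000 gives ~0.24%)
```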
legendary
Activity: 1400
Merit: 1013
October 15, 2014, 05:31:39 AM
#87
The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

A bandwidth market can lead to an efficient use of bandwidth but may do nothing to address the potential tragedy of the commons concerning decentralisation.

Users want their transactions to be relayed to miners.
Miners want those transactions to reach them so they can earn the associated fees.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.
Can you define decentralization/centralization in this context?
legendary
Activity: 1246
Merit: 1011
October 15, 2014, 04:29:52 AM
#86
The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

A bandwidth market can lead to an efficient use of bandwidth but may do nothing to address the potential tragedy of the commons concerning decentralisation.

Users want their transactions to be relayed to miners.
Miners want those transactions to reach them so they can earn the associated fees.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

I submit that pursuit of just these policies would actually encourage centralisation.  A small number of large miners will consume fewer resources than a decentralised mass.  A single trusted data centre could be even more efficient.

Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.

A market entity is not restricted to a single geographical location.  McDonald's has locations all over the world.
legendary
Activity: 1050
Merit: 1002
October 14, 2014, 11:09:23 PM
#85
I sympathize with their plight, but Bitcoin is not made for these first.  Bitcoin is for everyone.  There are parts of the planet (some of which have the greatest need for Bitcoin) that have very limited bandwidth today and can be expected to not see rapid improvement.

You know I'm starting to think it doesn't matter. We win either way.

In the worst case, say we overshoot and Bitcoin becomes completely centralized by powerful miners which then emulate the current SWIFT system, blocking and regulating transactions. What would happen next? Would we curse and shout, "CRAP! We were this close. If only we'd ratcheted down our numbers a tiny bit. Well, everyone go home. Nothing more to see here."?

LOL of course not. We'd move to the next alt-coin not co-opted and continue on, having learned from our mistakes. In a post I wrote long ago, which seems to have come true, I talked about how alt-coins give our community a value Bitcoin never could, by providing the one thing it alone cannot: an alternative.

The people who still say there can be only one will always be wrong. Alt-coins are not going anywhere. Most will have low market caps or blow up and deservedly die horrible deaths, but Bitcoin won't ever be all by itself. Won't happen. And if the free market demands a coin with fixed or less-than-bitcoin block size limit then that's what it will get, and value and usage will flow there.

The converse is also true. Say we are unable to gain consensus for raising the size limit, causing a collapse in price as people perceive Bitcoin as unable to serve the base they thought it would; or we proceed with a messy hard fork, creating a rift in the community and a price crash as people become confused about the future of Bitcoin and what to do next. Cryptocurrency would still go on, eventually, because that cat is out of the bag and people will continue working on it. Of course, I'd rather see the first scenario (a need to adopt an alt-coin) than the second, as I'm less certain about recovering well from the second, since cryptocurrency ultimately has no backing other than overall confidence in its viability.

Either way I see Bitcoin as providing the world with education. It's teaching the world the possibilities of decentralization with currency and that's where the real value is, because Bitcoin isn't the only thing which can work in that model.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
October 14, 2014, 08:48:21 PM
#84
I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y Bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.




Elements that make mining competitive are cheap power, good connectivity, cheap heat management, and technology development. Cartels can form that take advantage of any of these elements. As long as all of these elements are not overly abundant in only a few geographical regions, mining can stay decentralized.
legendary
Activity: 1400
Merit: 1013
October 14, 2014, 08:42:28 PM
#83
The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.
Price discovery of bandwidth is the solution.

Users want their transactions to be relayed to miners.
Miners want those transactions to reach them so they can earn the associated fees.
Miners want other miners to receive their blocks to have their reward recognized by the network.
Users want to receive the blocks the miners produce so they can know the state of the network.

Relay nodes provide a service which everybody wants. Build a competitive open marketplace for relay nodes to offer connectivity to users on both sides of the network, and then price discovery can occur. The relay nodes will get compensated for the resources which they are providing, and the price signal will automatically make sure we have the right number of relay nodes.

Then we never need to have these unresolvable debates again. When block sizes and tx rates increase, there will automatically be a mechanism in place to make sure the relay nodes receive additional income which they can use to defray their rising expenses.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
October 14, 2014, 07:42:59 PM
#82
This blocksize increase effort is to support the interests of the merchant service companies, Circle et al.  I sympathize with their plight, but Bitcoin is not made for these first.  Bitcoin is for everyone.  There are parts of the planet (some of which have the greatest need for Bitcoin) that have very limited bandwidth today and can be expected not to see rapid improvement.

We do need a path forward.  We need a way to scale up.  What I can't abide is the notion of picking a number based on historical data, extrapolating, and applying it to the future.  Whatever we guess, we are guaranteed to be wrong.  It's wrong now, and (since we are not facing any imminent existential crisis) unless we can do better than still being wrong, we aren't ready to contemplate hard forks.

Isn't it worth it to the future generations of Bitcoiners to get this right?  At the moment we have the luxury of time, and other developments that will further mitigate this issue are coming to give us even more time.

So... Either let the large(ish) companies that are pushing for this (through TBF) make the best use of this time to give us a path forward that will be a lasting one... or wait until the decentralized brains come up with something more future proof than a guess based on historical data.

Essentially... good work Gavin, for raising the issue and making a proposal, but more research is needed.  I have faith that you'll be able to win me over on this (as well as the others opposing it in its current form).  It's just not there yet.  I don't know the answer, and I don't think anyone else does yet, but with all of us working toward it (again thanks to you for raising the issue), we may find it.

We need to be better than the Central Bankers who get together with their economic advisers and pick numbers arbitrarily.  We need automated, future-proof solutions written into open protocols that will still be working when we are long dead.  It is our responsibility, being alive and here now at the beginning, to see it done right.  To do less than our best is shameful.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
October 14, 2014, 05:12:07 PM
#81
I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y Bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized, have-huge-data-pipes miners would drive out smaller miners by forcing up the block size high enough so the smaller miners have to drop out.



legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
October 14, 2014, 11:38:08 AM
#80
Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Peter linked an idea from Gavin of determining an appropriate block size from the times taken by nodes to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).
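The median's robustness to block stuffing can be seen with a toy example; the block sizes and proportions below are invented purely for illustration:

```python
# Illustration of why a median-based estimator resists block stuffing:
# a minority of miners inflating their own block sizes drags the mean
# far upward while leaving the median untouched. Numbers are made up.
from statistics import mean, median

honest = [400] * 80        # 80 blocks of ~400 KB from honest miners
stuffed = [10_000] * 20    # 20 deliberately stuffed ~10 MB blocks
window = honest + stuffed  # the adjustment window

print(mean(window))    # 2320.0 -> mean is dragged up nearly 6x
print(median(window))  # 400.0  -> median is unmoved
```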


I'd better say: "Only raise block size limit if required by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off the chain. However, what "small" means is open to debate.
QFT
Let's use measurement and math over extrapolations where possible, and balance the risk to decentralization against easing transaction volume, in favor of decentralization.  It is difficult to recover from centralizing effects.
If block bloat by conspiring miners is a concern, then there can be growth caps on top of a dynamic scalability protocol too.

We have no crystal ball to tell us the future.  All we know is that we don't know.

And I'll just leave this here:
http://xkcd.com/605/


member
Activity: 129
Merit: 14
October 14, 2014, 03:23:09 AM
#79
Game theory suggests that under certain conditions, these types of agreements or associations are inherently unstable, as the behaviour of the members is an example of a prisoner's dilemma. Each member would be able to make more profit by breaking the agreement (producing larger blocks or including transactions at lower prices) than it could make by abiding by it.

There are several factors that will affect the miners' ability to monitor the association:

1.         Number of firms in the industry – High in a competitive mining market, with low barriers to entry and exit for potentially anonymous miners -> association difficult

2.         Characteristics of the products sold by the firms – Homogenous -> association is possible

3.         Production costs of each member – Differing and low costs -> association difficult

4.         Behaviour of demand – Transaction volume demand is highly volatile in different periods -> association difficult
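The instability claim above can be illustrated with a toy payoff matrix; the profit numbers are invented and only their ordering matters:

```python
# Toy payoff matrix for the cartel prisoner's dilemma described above.
# Payoffs (arbitrary profit units) are invented for illustration:
# defecting (publishing bigger/cheaper blocks) pays more whatever the
# other cartel members do, so the agreement is unstable.
payoff = {  # (my_move, others_move) -> my profit
    ("abide",  "abide"):  3,
    ("defect", "abide"):  5,   # undercut the cartel while others hold the line
    ("abide",  "defect"): 0,
    ("defect", "defect"): 1,
}

for others in ("abide", "defect"):
    best = max(("abide", "defect"), key=lambda me: payoff[(me, others)])
    print(others, "->", best)  # defecting is the best reply either way
```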
legendary
Activity: 1176
Merit: 1020
October 13, 2014, 10:23:33 PM
#78
I apologise that I was not being very clear; I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario that the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

I think I did understand what you were saying.  I was trying to point out that miners already have control of the size of the blocks they publish.  And therefore, collectively, miners have control over how fast the blockchain grows.  But that freedom is not absolute.  There are upper and lower limits.  Since a block with a size less than zero is an absurd concept, we can safely put just a few bytes as the smallest possible block.  The biggest possible block size is what we are discussing here.  It basically serves as a check that full nodes can use against the miners, meaning nodes can audit the service the miners are providing and otherwise connect and communicate about the state of the network.  Any proposal that gives the miners some automated way to influence the MaxBlockSize could be used to make the blocks so big as to promote centralization of the nodes.  Individuals would lose their ability to audit the network.

Miners currently do influence the MaxBlockSize variable, but the influence is based on human communication, persuasion, and lobbying within the ranks of the Bitcoin community.  If MaxBlockSize were algorithmically controlled, with the formula taking as input conditions the miners had some form of control over, then MaxBlockSize could be raised or lowered by the miners directly, without the consensus of full nodes.  It would no longer be a check.



Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say, however I postulate that the max size is also necessary for another reason:  

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network

That's why I said cartel, not collude.  Perhaps I should have used the word 'association' to describe miners working together in a constructive fashion.  Miners collaborating is not itself a problem.  In fact, they work together all the time, and the shared computational output is the blockchain.  If at some future point a majority of the miners start behaving badly, the community will respond.  If the MaxBlockSize were very large, and the dynamics of the bitcoin system were causing the hashrate to fall, I would expect miners to get together and solve the problem.  That could include a miner-only agreement to only publish blocks of a certain size, to drive up fee requirements.  This is not a proposal to raise MinBlockSize.
member
Activity: 129
Merit: 14
October 13, 2014, 05:54:15 PM
#77
It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes of which they may choose to make a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.  

I apologise that I was not being very clear; I was talking about miners manipulating the block size limit upwards or downwards in the hypothetical scenario that the block size limit is determined dynamically by an algorithm, for example the one I mention above linking the block size limit to aggregate transaction fees.  What do you think of this proposal?

Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

Why do you say that miners can create their own cartel to create artificial scarcity?  Perhaps they can do this, but a healthy Bitcoin network has a competitive and diverse mining industry where this may not be possible.  If miners collude together in this way then Bitcoin has more serious problems than this scalability issue.

I agree that a max block size is also helpful to keep the network decentralised and “available to the interested individual, both financially and practically speaking” as you say, however I postulate that the max size is also necessary for another reason:  

Artificial scarcity in block space -> higher aggregate transaction fees -> higher equilibrium mining difficulty -> more secure network

No scarcity in block space -> lower aggregate transaction fees (yes a higher volume, but no "artificial" profit) -> lower equilibrium mining difficulty -> less secure network
legendary
Activity: 1176
Merit: 1020
October 13, 2014, 05:27:44 PM
#76
It could be in miners' interests to keep the block size limit small, to make the resource they are "selling" more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessarily true; it depends on the bandwidth constraints versus the need for artificial scarcity at the time.  If Moore's law holds then eventually the artificial scarcity argument will become overwhelmingly more relevant than the bandwidth issues, and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting these to the network.  However, why would miners in this scenario want to manipulate the limit upwards?

It is already explicit in the bitcoin network structure that miners can 'manipulate the block size down'.  They could all issue empty blocks if they wanted.  And yes, miners can also 'manipulate' the block size up.  So the lower bound for the 'manipulation' is zero.  The upper bound is the block size limit, currently at 1MB.  We all agree miners can do whatever they want within those limits.  Gavin's proposal is just a concept for moving that upper bound, and thus giving miners a larger range of sizes of which they may choose to make a block.  An idea I support, and I think Gavin supports, is to have the block size be bounded by the technical considerations of decentralization.  Miners can create their own cartel if they want to create artificial scarcity, so they don't need a max block size to do it.  But cartel or not, max block size enshrines, essentially into 'bitcoin law', that bitcoin will remain auditable and available to the interested individual, both financially and practically speaking.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage". 

Totally agree. 

However, if you can find an appealing notion of necessity, smallness, or some alternative method of attempting to balance centralisation risk against utility which involves fewer magic numbers and uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.

I think MaxBlockSize will remain a magic number, and I think that is okay.  It is a critical variable that needs to be adjusted for environmental conditions, balancing, exactly as you put it teukon, [de]centralization against utility.  As computing power grows, it becomes easier to conceal computational activities and keep them "decentralized".

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend. Lets follow that trend, because it is the best predictor of what will happen that we have.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).

The more accurate the projection of computing / bandwidth growth is, the less often the magic number would need to be changed.  If we project very accurately, the magic number may never need to be adjusted again.  That being said, it is safer to err on the side of caution, as Gavin has done, to make sure any MaxBlockSize formula does not allow blocks to grow bigger than the hobbyist / interested individual's ability to keep up.
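As a sketch of what such a trend-following formula might look like, the following grows the limit exponentially with elapsed time; the base size and yearly growth rate are illustrative placeholders, not the numbers from Gavin's actual proposal:

```python
# A minimal sketch of a fixed-growth MaxBlockSize schedule of the kind
# discussed above. BASE_MB and GROWTH_PER_YEAR are hypothetical
# placeholders, not values from any actual proposal.
BASE_MB = 20.0          # hypothetical starting limit, in MB
GROWTH_PER_YEAR = 0.40  # hypothetical 40% yearly growth (a bandwidth-trend guess)

def max_block_size_mb(years_since_activation: float) -> float:
    """Limit grows exponentially with elapsed time, mirroring a
    projected bandwidth trend rather than measured demand."""
    return BASE_MB * (1 + GROWTH_PER_YEAR) ** years_since_activation

for y in (0, 5, 10):
    print(y, round(max_block_size_mb(y), 1))
```

Lowering such a schedule later would be a soft fork, as Gavin notes above, which is why erring low is the safer side of the guess.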
member
Activity: 129
Merit: 14
October 13, 2014, 03:56:30 PM
#75
I propose the following rule to determine the block size limit, once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y Bitcoin or more (or less).

For example:
If the average aggregate transaction fee in the last 100,000 blocks is 1 Bitcoin per block or more, then there could be a 20% increase in the block size limit.

Advantages of this methodology include:
  • This algorithm would be relatively simple
  • The limit is determined algorithmically from historic blockchain data and therefore there will be a high level of agreement over the block size limit
  • The system ensures sufficient fees are paid to secure the network in a direct way
  • It would be difficult and expensive to manipulate this data, especially if mining is competitive and decentralized
  • The limit would relate well to demand for Bitcoin usage and real demand based on transaction fees, not just volume
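One possible reading of this rule can be sketched in a few lines; N, X and Y below use the example values from this post, and the symmetric downward step is my assumption about how "or less" would work:

```python
# A minimal sketch of the fee-driven limit adjustment proposed above.
# N, X and Y follow the post's example (100,000 blocks, 20%, 1 BTC/block);
# the symmetric downward adjustment is one possible reading of the rule.
N_BLOCKS = 100_000       # averaging window (full window elided in the example call)
TARGET_FEE_BTC = 1.0     # Y: target average fee per block
ADJUST_PCT = 0.20        # X: step size per retarget

def next_limit(current_limit: int, recent_fees_btc: list[float]) -> int:
    """Raise the limit X% if average fees over the window meet the
    target, otherwise lower it X%."""
    avg_fee = sum(recent_fees_btc) / len(recent_fees_btc)
    if avg_fee >= TARGET_FEE_BTC:
        return int(current_limit * (1 + ADJUST_PCT))
    return int(current_limit * (1 - ADJUST_PCT))

# Example: fees averaging 1.2 BTC/block push a 1 MB limit up 20%.
print(next_limit(1_000_000, [1.2] * 100))  # 1200000
```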

I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.

It could be in miners interests to keep the block size limit small, to make the resource they are “selling” more scarce and improve profitability.  The assumption that miners would try to manipulate the block size limit upwards is not necessary true, it depends on the bandwidth issue versus the need for artificial scarcity issue dynamics at the time.  If Moore’s law holds then eventually the artificial scarcity argument will become overwhelmingly more relevant than the bandwidth issues and miners may want smaller blocks.  Miners could manipulate it both ways depending on the dynamics at the time.

I am aware that miners could also manipulate fees by including transactions with large fees and not broadcasting them to the network.  However, why would miners in this scenario want to manipulate the limit upwards?
legendary
Activity: 1246
Merit: 1011
October 13, 2014, 12:27:01 PM
#74
The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).

This is starting to sound hairy to me.  I can easily imagine that 60% of the largest miners would benefit sufficiently from the loss of the weakest 20% of miners that it's profitable for them to all include some number of plausible-looking transactions between their addresses (thereby causing an inflated median).  I feel that anything involving IP addresses is prone to abuse and much worse than the admittedly ugly fixed-growth proposal.
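Both points can be seen in a toy example with hypothetical per-block transaction counts: a single stuffed block barely moves the median, but a colluding majority of block producers moves it freely.

```python
# Toy illustration with hypothetical transaction counts: the median resists
# a single stuffed block, but a colluding majority can still shift it.
from statistics import mean, median

honest = [2000] * 100                    # 100 blocks of ~2000 txs each

# One miner stuffs a single block with a million transactions:
one_outlier = honest[:-1] + [1_000_000]
# mean(one_outlier) jumps to 11_980, but median(one_outlier) stays at 2000

# 60 of 100 block producers collude and pad their blocks:
majority = [2000] * 40 + [50_000] * 60
# median(majority) is now 50_000: the colluding majority sets it
```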

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off-the-chain. However what "small" means is open to debate.

My own feeling is that we should be looking at "as much block-space as possible given the decentralisation requirement" rather than "as little block-space as necessary given current usage".  However, if you can find an appealing notion of necessity or smallness, or some alternative method of balancing centralisation risk against utility which involves fewer magic numbers and less uncertainty than the fixed-growth proposal, then it's certainly worth its own thread in the development section.
legendary
Activity: 1153
Merit: 1012
October 13, 2014, 07:37:50 AM
#73
Certainly, a dynamic means of adjusting the block size which could not be gamed by miners would be great.  Peter linked an idea from Gavin: determining an appropriate block size from the times nodes take to verify blocks.

Unfortunately, I'm not aware of any proposal which really does this.  I don't know the precise details of your proposal and currently don't see how you intend to guard against large miners simply creating millions of transactions to trick the dynamic algorithm into thinking that very large blocks are needed.
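For concreteness, a verification-time rule in the spirit of the idea attributed to Gavin above might look like the following; every name and threshold here is invented for illustration.

```python
# Hypothetical sketch of a verification-time-driven limit, in the spirit of
# the idea attributed to Gavin above; every name and threshold here is
# invented for illustration.

TARGET_VERIFY_SECONDS = 1.0   # assumed per-block verification budget
STEP = 0.10                   # adjust the limit by 10% per retarget

def adjust_limit(current_limit: int, verify_times: list) -> int:
    """Shrink the limit if the median verification time exceeds the budget."""
    typical = sorted(verify_times)[len(verify_times) // 2]  # median sample
    if typical > TARGET_VERIFY_SECONDS:
        return round(current_limit * (1 - STEP))
    return round(current_limit * (1 + STEP))
```

The gaming problem raised above applies here too: large miners could build blocks that are deliberately slow to verify, pushing the measured times wherever suits them.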

My aim is a broad call for a (re)consideration of a dynamic "demand-driven" block size limit mechanism. The best adjustment estimators have yet to be determined. I think the concept should not be prematurely dismissed, because it could be highly beneficial in terms of resource preservation and hence decentralization.

The problem that selfish large miners create millions of transactions could be alleviated by using a median (instead of a mean) statistic in the adjustment estimation, which is much less susceptible to extreme values. Maybe one could also do statistical correction based on IP addresses (those that frequently submit only blocks with an excessively large number of transactions get less weight).


I'd better say: "Only raise block size limit if required by the minimum amount necessary."

What constitutes "necessary"?  What if there are so many "necessary" transactions that it's impossible for Bitcoin to continue as a decentralised system?  I'd like to see as much block space as possible but will happily work to keep the blocksize smaller than many deem "necessary" to avoid centralisation.

Of course "necessary" has to be defined. I think it is acceptable to make Bitcoin progressively more unviable (through higher fees) for microtransactions if decentralization is at risk. Very small transactions could also happen off-the-chain. However what "small" means is open to debate.