
Topic: Increasing the block size is a good idea; 50%/year is probably too aggressive - page 7. (Read 14303 times)

sr. member
Activity: 278
Merit: 254
KISS:

1. Since technology allows increase to 20 MB per block today, make an increase to this size as soon as consensus and logistics allow.

2. Continue to evaluate the situation based on computer-communications technology growth, transaction growth and observed network behavior.  There will be ample time to make a second hard fork should this become necessary.  (A one time jump  of 20x is equivalent to 40% annual growth for 9 years.)
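The equivalence claimed in that parenthetical is easy to check with a couple of lines (illustrative arithmetic only):

```python
# Sanity check: a one-time 20x jump vs. sustained 40%/year growth.
# 1.40 ** 9 ~= 20.7, so a single 20x increase buys roughly nine years
# of headroom at 40% annual growth.
years = 9
annual_growth = 1.40
cumulative = annual_growth ** years
print(round(cumulative, 1))  # 20.7
```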
legendary
Activity: 1050
Merit: 1002
Consider the existence of a central authority, employed by a member organization with the charter of interfacing with governments.  The Chiefs then take the role of arbitrarily deciding on the supply and adjusting it as the organization's economic advisers suggest; we have then progressed toward replicating the Federal Reserve Bank.

I completely disagree with this. Believe it or not it's actually not that easy for the Fed to adjust monetary policy. I mean all things considered, it's exceptionally easy, but they still have to get their board to go along and sell the public on what they're doing. That's a task made harder as they try more extraordinary things (like now) and the public becomes more astute to the way money works and its importance (like now), and that's a center driven design.

Bitcoin is designed from the ground up to be the opposite. It's extraordinarily hard to implement changes affecting the whole without consent from the whole. I sincerely believe after a certain point of adoption it will be impossible to make changes to Bitcoin, even ones not so controversial; if there isn't a do or die mandate behind the action (like a break in SHA256) I don't see consensus from millions and millions of independent thinkers coming easily. Somebody's going to think differently for some reason, even if it appears irrational. People call this ossifying of the protocol.

Think how hard this 1MB issue is. There was a time when Satoshi simply told everyone to start running a protocol change without question. He knew there was a severe bug allowing excess coins, but people simply upgraded and now the fork ignoring that block is locked in.

Bitcoin isn't the first to come up with decentralization. That was actually the idea behind America. Instead of power raining down from monarchs it would be vested within all the individuals. However, the founders even then recognized authority by committee wasn't always ideal. It would be a clear disadvantage if attacked since the battle might be lost before it was decided what to do. That's why the president has full authority to respond militarily in case of attack.

It sounds like you're objecting for reasons more ideological than practical. While that's admirable and understandable I hope you also recognize that's not automatically best given the circumstances.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
Did you read my "blocksize economics" blog post?
yes, I should take this as a request for comment in the thread more appropriate for that.

I don't understand why you think MAX_BLOCK_SIZE necessarily has anything to do with "supporting mining" (aka securing the network).
Simply put: It is the supply side of the mining resource which miners are selling.
This should be clear enough.  
I can go into more detail in its thread, tdryja made some decent comments here already.

What stops this from happening:

Big miners accept off-blockchain payments from big merchants and exchanges that want their transactions confirmed. They are included in very small blocks with zero fees.  The blocksize stays at 1MB forever.

Let's look at incentives:

Big miners: have cozy agreements with Big Merchants. Those agreements keep the little guys out.

Big Merchants: same thing. The need to get an agreement with a miner to get your transactions accepted is a barrier to entry for the Little Guys.

The fear of this theoretical arrangement was addressed in tdryja's initial post, and further explained in the latest.
I do agree that any feedback mechanism such as we are seeking with this line of discussion holds the potential for creating a perverse incentive.  


Admittedly there is also a philosophical basis for what may seem like a useless discussion to some since the Chief Scientist of The Bitcoin Foundation has already decided and is seeking to end discussion.  


Consider the existence of a central authority, employed by a member organization with the charter of interfacing with governments.  The Chiefs then take the role of arbitrarily deciding on the supply and adjusting as the organization's economic advisers suggest, we then have progressed towards replicating the Federal Reserve Bank.

It is nothing personal with Gavin, I like you and love what you do.  I think your proposal also could possibly work in the short term, except that it sets a most dangerous precedent.  One risk is certain, and that is that those who come after us will not be us, but it is our hope, and the effort for which we strive mightily, that Bitcoin will still be Bitcoin. It is this which I am hoping to protect by seeking for a way to put this authority on the block chain, and not on the decree of any person now, or in the future.

Both of these risks are theoretical (a perverse miner/merchant cartel, and a perverse central authority).  On balance, weighing the risk of creating perverse incentives by basing decisions affecting the monetary support of the network on evidence provided by the block chain, against decisions by unknown people of the future who may have their own perverse incentives that will be more difficult to observe, I would give the role of this governance to the Bitcoin block chain.  Simply because there it will be exposed and may be seen, and is also a much easier perversity to dislodge.
newbie
Activity: 6
Merit: 0
Did you read my "blocksize economics" blog post?

I don't understand why you think MAX_BLOCK_SIZE necessarily has anything to do with "supporting mining" (aka securing the network).

I can't speak for NewLiberty, but I have certainly read it, and agree with the majority of what you've written.  The part about "Block Subsidy, Fees, and Blockchain Security" is most relevant here.  I agree that as it stands, there is no guarantee that 1MB blocks would be full of high value transactions, and no guarantee that 1GB blocks would be full enough of low value transactions to secure the network. 

However, if the max block size is linked to the transaction fees, we can at least know that the 1GB block does have sufficient fees, because the size would contract if it didn't.  The other scenario -- a half empty 1MB block with minimal fees on a few large transactions -- implies that Bitcoin has either failed or been superseded, at which point the max block size is not relevant.


What stops this from happening:

Big miners accept off-blockchain payments from big merchants and exchanges that want their transactions confirmed. They are included in very small blocks with zero fees.  The blocksize stays at 1MB forever.

2 things: 1 which stops it from happening, and 1 which means it could happen anyway.

This scenario supposes that 1MB is sufficient to maintain the miner / merchant cartel's transactions, which may not be the case, but is plausible.  What is implausible is that every member of this cartel of miners continues to reject a vast mempool of outsider fee paying transactions.  Thousands of merchants saying "shut up and take my bitcoins! include my tx!" and the miners all say "No!", maintain their cartel, and deny themselves that money?  Or, if they try to on-board these merchants into their cartel, the 1MB block isn't big enough anymore.  Similarly for merchants, are they getting a better deal with the cartel?  If so, great, but why is the cartel being nice to the merchants; it's much more likely that the merchants would hate the cartel and try to get their transactions in a cheaper, independent block.

Why maintain membership in the cartel if you make less money?  One of those two groups (miners, merchants) must be making less money.

This type of cartel is also possible with an open-loop exponential expansion of max block size.  The majority of the miners can stick to 1MB blocks, and reject blocks with transactions not in their cartel.  >50% of miners need to participate in this cartel to effectively push down the median fees.  It doesn't make rational sense (unless megabytes are extremely expensive) in this case either, but if we worry about a malicious majority mining cartel, it is still doable.

I think an open-loop larger block size would probably be fine, but it involves a lot of extrapolation.  Maybe computers get way better really fast, and 1GB is laughably small.  Or maybe they stay the same, and 1GB is too large, meaning the networking and storage costs of mining exceed the sha256 costs, centralizing mining.  I think a closed-loop feedback system based on median aggregate transaction fees is able to reduce these risks.
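A closed-loop rule of the kind described might be sketched roughly as follows. This is purely illustrative: the target fee level, the use of per-block fee totals as the metric, and the 10% adjustment bound are assumed parameters, not part of any actual client.

```python
from statistics import median

TARGET_FEES_PER_BLOCK = 1.0   # BTC; assumed target for aggregate fees per block
MAX_STEP = 0.10               # cap each epoch's adjustment at +/-10% (assumed)

def next_max_block_size(prev_max_size, epoch_fee_totals):
    """Scale the size limit by median fees relative to the target, bounded."""
    med = median(epoch_fee_totals)
    ratio = med / TARGET_FEES_PER_BLOCK
    ratio = max(1 - MAX_STEP, min(1 + MAX_STEP, ratio))  # clamp to [0.9, 1.1]
    return prev_max_size * ratio

# Median fees above target -> the limit expands, by at most 10% per epoch.
print(next_max_block_size(1_000_000, [1.3, 1.2, 1.4]))  # 1100000.0
# Median fees below target -> the limit contracts, pushing fees back up.
print(next_max_block_size(1_000_000, [0.5, 0.6, 0.4]))  # 900000.0
```

The median (rather than the mean) is what makes a single miner's self-paid outlier fee ineffective, as discussed later in the thread.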
hero member
Activity: 709
Merit: 503
Is there a known functional limit above which MAX_BLOCK_SIZE breaks the code?  Have we ever cranked the MAX_BLOCK_SIZE up on testnet and then deliberately filled a block up with transactions and seen it fail?

Do any instabilities appear when the pool of unconfirmed transactions grows large enough?

Does every transaction eventually get put into a block for sure?  Is it possible for a transaction to hang out in the pool forever?
legendary
Activity: 1652
Merit: 2311
Chief Scientist
By including the coinbase fee (or maybe a square or other root of it) we would come closer to Gavin's increase in the early years and move steadily toward a fee supported mining within the next 20 years or so while increasing the MAX_BLOCK_SIZE.

Did you read my "blocksize economics" blog post?

I don't understand why you think MAX_BLOCK_SIZE necessarily has anything to do with "supporting mining" (aka securing the network).

What stops this from happening:

Big miners accept off-blockchain payments from big merchants and exchanges that want their transactions confirmed. They are included in very small blocks with zero fees.  The blocksize stays at 1MB forever.

Let's look at incentives:

Big miners: have cozy agreements with Big Merchants. Those agreements keep the little guys out.

Big Merchants: same thing. The need to get an agreement with a miner to get your transactions accepted is a barrier to entry for the Little Guys.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.
There may be a way of using the coinbase fee also in this calculation, but treated differently.  The coinbase fee primarily serves the emission and distribution functions, but also stimulates adoption in the early years.  It might be used as a way of amplifying the metric in the early years (when lack of adoption is a significant existential risk, and percentage growth is presumably higher) and then let this effect subside in later years by some form of multiplying by (Coinbase)^(-1/2).

Currently the cost per transaction, with the coinbase included, is often higher than the transacted amount.  Such transactions would not occur without the coinbase, so we need a way to accommodate what this proposal would mean, because we would be unlikely to see any meaningful MAX_BLOCK_SIZE increases so long as coinbase transactions are the funding source for the network.
If a miner did somehow push the fees and blocksize up, that miner could then publish large blocks in an attempt to spam / DoS the network.  That's the only real threat, and it could be very costly and slow for the miner to accomplish.  Unlike the difficulty adjustment, which is bounded at 0.25X to 4X, the max block size adjustment could have a much tighter bound, like 10%, so that it would take months to double the max block size.

Currently the TX fees are way below 1 BTC per block, and this will likely continue for quite a while.  It is less than 15 BTC per day in fees.
By including the coinbase fee (or maybe a square or other root of it) we would come closer to Gavin's increase in the early years and move steadily toward a fee supported mining within the next 20 years or so while increasing the MAX_BLOCK_SIZE.

I think this is simple and straightforward enough that miners, developers, and bitcoin users can read it, understand it, and be OK with it.  I also think that it's safe long-term, and doesn't require human intervention once set up, regardless of how computer technology evolves (or fails to) over the next few decades.

yes.

edit:
Another critique of the fee-basis method vs the block size basis might be that the "% of M0 to dedicate to mining" would gradually increase over time as bitcoins are lost/destroyed.  I don't see this as highly important, but it may be a source of future refinement if it were ever to become a concern.
newbie
Activity: 6
Merit: 0
trout:
empty blocks are possible now, and not a big deal.  They become very expensive longer term as fees take over the block reward; an empty block could have no or negligible reward.  If the median is used, this attack will have minimal effect on the network, while costing the attacker 1 BTC per empty block.  I don’t think we need to worry about an attack which is very expensive for the attacker, and has no appreciable effect on the network.

I agree that it may be easier to form a majority cartel if the only thing at stake is block size.  But a majority cartel of miners can pretty much do this anyway; they just tell everyone “Hey guys, the new max block size is 1GB.  We’re all moving our mining power there, you’d best update your clients.”

Basically I think worrying about a majority of miners doing something you don’t want them to is beyond the scope of the problem.  And if they all want to have huge blocks and put all my transactions in there for free, I for one welcome our new benevolent mining overlords Smiley

David Rabahy:
The idea of only allowing known transactions into a block has been discussed before, but has been found unworkable.  The purpose of the block is to achieve consensus on which transactions have happened.  Presupposing consensus on the set of transactions removes the need for the block.  In other words, if all the miners already agree on what’s going to be in the next block, why bother broadcasting it to each other?

There are different ways to try to make that work, and I’ve discussed it with several people, but I think it’s fundamentally incompatible with Bitcoin’s current consensus system.
hero member
Activity: 709
Merit: 503
.. more about this:
there's actually the opposite kind of manipulation (or rather attack) possible:
empty blocks. Right now they exist but don't hurt anyone; here they would push
the max block size down, hurting the network.
Would it be reasonable to reject blocks with too few transactions in them if the pool of transactions waiting is above some threshold?
hero member
Activity: 709
Merit: 503
A miner that fills a block with self-dealing transactions (for whatever reason; malicious or stupid) is a nuisance or perhaps worse.  Is there a way to reject blocks that contain transactions that haven't appeared on the network yet?  If transactions must appear on the network before they can appear in a block then some other miner might block them before the bad actor and obtain the associated fees, undermining the entity attempting to bloat blocks with self-dealing transactions.  I suppose such a bad actor could hold the self-dealing transactions until they have a block ready and then transmit the self-dealing transactions and block together as close as possible in time in an attempt to minimize the risk of another miner grabbing their fees.

Oh, I wonder; does a full node have to have enough bandwidth to keep up with both the blocks *and* the transactions waiting to be included in blocks?  If so then my earlier calculation based on just the blocks (and no orphans for that matter) is low.
sr. member
Activity: 333
Merit: 252
.. more about this:
there's actually the opposite kind of manipulation (or rather attack) possible:
empty blocks. Right now they exist but don't hurt anyone; here they would push
the max block size down, hurting the network.
sr. member
Activity: 333
Merit: 252
yep, median would work much better than the mean, and a group of <50% miners would only have limited power.

However, I don't quite agree with the reliance on no collusion above 50%.
I understand the  premise  that a group of >50% miners can do something much worse:
a doublespend.  But it is not at all the same type of collusion.
Assembling  a group to collude for a doublespend and destroying the credibility
and value of bitcoin in the eyes of the public is one thing, and assembling a group
to push the max block size to infinity, in order  to slowly push out low-bandwidth competitors
from mining, is a very different thing. It seems the latter is much easier.

This said, I find both this and NewLiberty's idea interesting.
newbie
Activity: 6
Merit: 0
David Rabahy:
I generally try not to think in dollar terms about the economic issues in Bitcoin.  If there is a feedback system such that the block reward from tx fees tends towards 1 BTC / block, the blocks could potentially be quite large; 100MB/block, or with your estimates 179,000 transactions, at a cost of 5.5 uBTC per tx.  More transactions trying to fit into a 100MB block will tend to push up the per tx fee, which would expand the max block size to say 110MB, which pushes the fees per tx back down such that the new 110MB blocks are just about full of txs at a 5.4 uBTC / tx fee, still earning 1 BTC per block.
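The per-transaction figure in that example can be checked directly (illustrative arithmetic, using the estimates quoted above):

```python
# 1 BTC of fees spread across ~179,000 transactions in a 100MB block.
block_fees_btc = 1.0
txs_per_block = 179_000
fee_per_tx_ubtc = block_fees_btc / txs_per_block * 1e6  # convert BTC to uBTC
print(round(fee_per_tx_ubtc, 1))  # 5.6 -- roughly the 5.5 uBTC figure above
```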

trout:
I address this in my initial post and go into detail below.

2112:
I've thought about the same set of changes, and have decided that it's probably too much of a change to practically push through into Bitcoin.  Something where the miner of block n gets 1/2 the tx fees, and the miner of block n+10 gets the other half would both incent inclusion of high-fee transactions, as well as eliminate the risk that miners would pay fees to themselves.  Such a fundamental change however is probably impractical, as it would be dismissed by many as "not Bitcoin".  Integrating something like p2pool is also quite complex and will be viewed as too risky.

NewLiberty:
I wasn't clear enough about this in my post, but I meant that the new epoch's block size to be a function of the previous one, just like the difficulty adjustments.  Difficulty adjustments don't actually care about hash rate, just the time taken for the 2016 block epoch, and a relative difficulty adjustment is made based on the divergence from the two week target.  Similarly, I agree that max block size should use transaction fees as a relative adjustment factor.  I mention bounds of this adjustment below.

-

Simply using median transaction fees per block over the past epoch is hopefully simple and straightforward enough to be accepted by people, and does not have significant incentive problems.

There are two ways this can be 'gamed' by miners.  The way that is most dangerous to the non-mining users of the network would be for miners to artificially limit block sizes to a small value, in the hopes that they would profit from high transaction fees.  Doing this requires malicious collusion among miners (in excess of that in a proof-of-idle system which I've written about) and in a situation where most of the miners are trying to harm bitcoin, we're already in much bigger trouble.  In practice miners will grab all the fees they can fit into a block, especially if they know the next miner will do the same.

The more problematic way a malicious miner can 'game' this is by paying fees to itself.  Using thresholds, or the median instead of mean, or some other mathematical way to cut out the outliers may be helpful.  I like using the median block reward -- it's really quite simple and would prevent anyone with <50% of the hash power from accomplishing much.  And the assumption in all of this is that there is no >50% miner. 
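A quick illustration of why the median resists a self-paying outlier where the mean does not (fee values are made up):

```python
from statistics import mean, median

# Five honest blocks with ~1 BTC in fees each, then one block where a
# miner pays himself a huge fee (which costs him nothing).
honest_fees = [0.9, 1.0, 1.1, 1.0, 0.95]
with_outlier = honest_fees + [1_000.0]

print(mean(with_outlier))    # ~167.5 -- the mean is thrown off the chart
print(median(with_outlier))  # 1.0    -- the median is essentially unmoved
```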

If a miner did somehow push the fees and blocksize up, that miner could then publish large blocks in an attempt to spam / DoS the network.  That's the only real threat, and it could be very costly and slow for the miner to accomplish.  Unlike the difficulty adjustment, which is bounded at 0.25X to 4X, the max block size adjustment could have a much tighter bound, like 10%, so that it would take months to double the max block size.
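The "months to double" claim follows directly from the 10% bound (illustrative arithmetic, assuming one adjustment per 2016-block epoch of roughly two weeks):

```python
import math

# With each epoch's adjustment capped at +10%, doubling the limit takes
# log(2)/log(1.1) ~= 7.3 epochs; at ~2 weeks per epoch, that is on the
# order of three and a half months.
epochs_to_double = math.log(2) / math.log(1.1)
weeks = epochs_to_double * 2
print(round(epochs_to_double, 1), round(weeks, 1))  # 7.3 14.5
```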

I think this is simple and straightforward enough that miners, developers, and bitcoin users can read it, understand it, and be OK with it.  I also think that it's safe long-term, and doesn't require human intervention once set up, regardless of how computer technology evolves (or fails to) over the next few decades.

Thanks to everyone who's read and commented on this; I actually thought of this a few years ago and mentioned it to people, but it never got any attention.  My personal opinion is that Gavin's idea of just increasing block size based on a guess at the continuance of Moore's law would probably work fine... but I like my idea a little better Smiley  Thanks for the comments.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
I'd welcome comments / criticism of why having such a feedback mechanism is a good or bad idea.

As with the proposal I offered, this proposal has the virtue of expanding MAX_BLOCK_SIZE when it is in demand, and contracting if fees are not sufficient to support the network (so that fees will rise).

Some issues for examination:
Previous block size:
In its simplest form, the tdryja proposal doesn't factor in the block size of previous epochs.  This makes MAX_BLOCK_SIZE subject to rapid switching, which as tdryja mentions could be cured by hysteresis, or also (new suggestion borrowed from my proposal) by making MAX_BLOCK_SIZE a product of the previous MAX_BLOCK_SIZE, modified by the tdryja proposed transaction fee metric (so a % increase/decrease).  The rapid switching may be problematic if some event stimulates a desire in many decentralized miners to radically reduce the block size limit in order to restrain commerce during an event.  (It doesn't take a conspiracy; a single factor influencing miners in aggregate can do this.)
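One way a hysteresis-plus-bounded-step variant could look, purely as a sketch; the dead band, step cap, and target value are all invented parameters for illustration:

```python
TARGET = 1.0        # assumed target fee metric, BTC per block
DEAD_BAND = 0.20    # ignore deviations within +/-20% of target (hysteresis)
MAX_STEP = 0.10     # bound each epoch's change to +/-10% of the previous limit

def adjust_with_hysteresis(prev_max_size, fee_metric):
    ratio = fee_metric / TARGET
    if abs(ratio - 1.0) <= DEAD_BAND:
        return prev_max_size               # inside the dead band: no change
    step = min(MAX_STEP, abs(ratio - 1.0) - DEAD_BAND)
    return prev_max_size * (1 + step if ratio > 1 else 1 - step)

print(adjust_with_hysteresis(1_000_000, 1.1))  # 1000000 (within dead band)
print(adjust_with_hysteresis(1_000_000, 1.5))  # 1100000.0 (capped at +10%)
```

The dead band suppresses the rapid switching, and the multiplicative step keeps each epoch's limit anchored to the previous one.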

Coinbase Fee
As mentioned, I like the tdryja proposal for its simplicity, so I'd look for ways to keep that virtue.  Still, if transaction fees are the primary metric, it would seem there may be some peril in ignoring the coinbase entirely, due to its impact on mining in the early years.  It is currently about 300x the transaction fees and so almost entirely supports the mining effort.

There may be a way of using the coinbase fee also in this calculation, but treated differently.  The coinbase fee primarily serves the emission and distribution functions, but also stimulates adoption in the early years.  It might be used as a way of amplifying the metric in the early years (when lack of adoption is a significant existential risk, and percentage growth is presumably higher) and then let this effect subside in later years by some form of multiplying by (Coinbase)^(-1/2).

Currently the cost per transaction, with the coinbase included, is often higher than the transacted amount.  Such transactions would not occur without the coinbase, so we need a way to accommodate what this proposal would mean, because we would be unlikely to see any meaningful MAX_BLOCK_SIZE increases so long as coinbase transactions are the funding source for the network.

It would be good to increase MAX_BLOCK_SIZE long before the coinbase reward is no longer the driving force of network growth.

Squeezing out arbitrariness
There isn't much in the tdryja proposal which is arbitrarily declared by decree (fiat) other than the allocation of "What should it cost to run the bitcoin network?"
We have some indication of this from the hash rate and the total fees.  Currently total fees (coinbase and transaction) are stimulating growth in difficulty even in declining markets.


sr. member
Activity: 333
Merit: 252
a miner can include into his block a transaction with an arbitrary large fee (which he gets back of course), throwing the mean off the chart.
What about the following modification:

a1) fold a modified p2pool protocol into the mainline protocol
a2) require that transactions mined into the mainline blockchain have to be seen in the majority of p2pool blocks
a3) p2pool then has an additional function of proof-of-propagation: at least 50% of miners have seen the tx
a4) we can then individually adjust the fees and incentives for:
a4.1) permanent storage of transactions (in the mainline blockchain)
a4.2) propagation of transactions (in the p2pool blockchain, which is ephemeral)

Right now the problem is that miners receive all fees due, both for permanent storage and for network propagation.

Another idea in the similar vein:

b1) make mining a moral equivalent of a second-price auction: the mining fees of block X accrue to the miner of block X+1
b2) possibly even replace 1 above with a higher, constant natural number N.
Late edit:
b3) reduce the coinbase maturity requirement by N
Later edit:
b4) since nowadays the fees are very low compared to subsidy, (b3) would imply a temporary gap of global mining income. Subsidy of block X accrues to the miner of X, fees of block X accrue to the miner of block X+N.
End of edits.

Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.
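Proposal (b1) can be illustrated with a toy payout loop (miner labels and fee values are invented for illustration):

```python
# Fees collected in block X accrue to the miner of block X+1.
blocks = [("A", 0.5), ("B", 0.7), ("C", 0.4)]  # (miner, fees in that block)

payouts = {}
for i, (miner, fees) in enumerate(blocks):
    if i + 1 < len(blocks):
        successor = blocks[i + 1][0]   # next block's miner gets these fees
        payouts[successor] = payouts.get(successor, 0) + fees

# A's fees go to B, B's fees go to C; C's fees await the next block.
print(payouts)  # {'B': 0.5, 'C': 0.7}
```

Note that miner A collects nothing from the fees in A's own block, which is the incentive problem raised in the critique below.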


a) is vulnerable to sybil attacks
b) smothers the incentive to include any transactions in blocks: why should I (as a miner) include a tx if the fee would go to someone else?

Also it seems both  are too disruptive  to be implemented in bitcoin.
Anything this much different would take an altcoin to be tried.

legendary
Activity: 1050
Merit: 1002
As I see it Bitcoin is like the U.S. government. It has made too many promises to keep. I agree with Gavin Bitcoin has been sold as being able to serve the world's population. At the same time it has been sold as being effectively decentralized. These two things can't happen at the same time with today's technology, because bandwidth numbers (primarily) don't  align with global transaction data numbers. They will work eventually, but they don't today.

The question is how to get from today to the future day when Bitcoin can handle the world's transaction needs while remaining decentralized down to technology available to average people.

We have effectively three choices.

- Do nothing and remain at 1MB blocks
- Gavin's proposal to grow transaction capacity exponentially, possibly fitting in line with Bitcoin adoption numbers
- Some algorithmic formula to determine block size which is probably more conservative than exponential growth, but less predictable

I think doing nothing is unrealistic.

I like Gavin's proposal because it can solve the issue while also being predictable. Predictability has value when it comes to money. I agree that some other algorithm using real world inputs is safer, but I wonder at what expense. In the worst case, using Gavin's proposal, there may be some risk of heavy-hitting players hogging market share from lesser miners, maybe even to the extent of becoming centralized cartels. I don't think there is a good chance of that happening, but agree it's in the realm of possibility. In that case, though, nobody would be forced to continue using Bitcoin, since it's a voluntary currency. It's easy to move to an alternative coin. Free market forces, in my mind, would solve the problem.

If we try being as cautious as possible, seeking inputs along the way, we can probably rest assured centralization won't happen with Bitcoin. At the same time, though, the market has to continually assess Bitcoin's transaction capacity, and therefore its value. I'm not sure how that would play out.

My question is can a majority of the community (say 70-80%) be convinced to choose one of the last two options?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
a miner can include into his block a transaction with an arbitrary large fee (which he gets back of course), throwing the mean off the chart.
What about the following modification:

a1) fold a modified p2pool protocol into the mainline protocol
a2) require that transactions mined into the mainline blockchain have to be seen in the majority of p2pool blocks
a3) p2pool then has an additional function of proof-of-propagation: at least 50% of miners have seen the tx
a4) we can then individually adjust the fees and incentives for:
a4.1) permanent storage of transactions (in the mainline blockchain)
a4.2) propagation of transactions (in the p2pool blockchain, which is ephemeral)

Right now the problem is that miners receive all fees due, both for permanent storage and for network propagation.

Another idea in the similar vein:

b1) make mining a moral equivalent of a second-price auction: the mining fees of block X accrue to the miner of block X+1
b2) possibly even replace 1 above with a higher, constant natural number N.
Late edit:
b3) reduce the coinbase maturity requirement by N
End of edit.

Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.

These are interesting propositions in their own right.
There is a virtue in simplicity in that it is less likely to create perverse incentives.  (Gavin alludes to this in his critique)
For example, adding a p2pool dependency may have complexity risks we don't see, so by that metric the (b) series may be better than the (a).

legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

The answer cannot be in the blockchain, because the problem being addressed (resource usage rising too quickly so only people willing to spend tens of thousands of dollars can participate as fully validating nodes) is outside the blockchain.

You will go down the same path as the proof-of-stake folks, coming up with ever more complicated on-blockchain solutions to a problem that fundamentally involves something that is happening outside the blockchain. In this case, real-world CPU and bandwidth growth. In the POS case, proof that some kind of real-world effort was performed.


Thank you for your contribution and criticism.

Since the difficulty adjustment already effectively assesses real-world CPU growth, I'm not ready to assume that real-world assessment is impossible with respect to bandwidth, as there is evidence of both in the block chain awaiting our use.
Analogies to PoS are also no proof of a negative.  

The answer may be in the block chain, and it seems the best place to look, as the block chain will be there in the future providing evidence of bandwidth usage if we can avoid breaking the Bitcoin protocol today.

I don't need anyone to be right or wrong here so long as in the end we get the best result for Bitcoin.  I am very happy to be wrong if that means an improvement can be made.

Gavin, I remain grateful for your raising the issue publicly, and for keeping engaged in the discussion.  I do not agree that discussion on the matter ought end, and think we can do better through continuing.

Wherever we can squeeze out arbitrary human decision through math and measurement, it is our duty to the future to do so.  The alternative is to commit our progeny to the whims and discretion of whomever is in authority in the decades to come.  As David Rabahy pointed out a few posts ago, we may not be pleased with that result.
legendary
Activity: 2128
Merit: 1073
a miner can include into his block a transaction with an arbitrary large fee (which he gets back of course), throwing the mean off the chart.
What about the following modification:

a1) fold a modified p2pool protocol into the mainline protocol
a2) require that transactions mined into the mainline blockchain have to be seen in the majority of p2pool blocks
a3) p2pool then has an additional function of proof-of-propagation: at least 50% of miners have seen the tx
a4) we can then individually adjust the fees and incentives for:
a4.1) permanent storage of transactions (in the mainline blockchain)
a4.2) propagation of transactions (in the p2pool blockchain, which is ephemeral)

Right now the problem is that miners receive all fees due, both for permanent storage and for network propagation.

Another idea in the similar vein:

b1) make mining a moral equivalent of a second-price auction: the mining fees of block X accrue to the miner of block X+1
b2) possibly even replace 1 above with a higher, constant natural number N.
Late edit:
b3) reduce the coinbase maturity requirement by N
Later edit:
b4) since nowadays the fees are very low compared to subsidy, (b3) would imply a temporary gap of global mining income. Subsidy of block X accrues to the miner of X, fees of block X accrue to the miner of block X+N.
End of edits.

Both proposals above aim to incentivize and enforce propagation of the transactions on the network and discourage self-mining of non-public transactions and self-dealing on the mining fee market.
legendary
Activity: 1652
Merit: 2311
Chief Scientist
I happen to think we can do better than Gavin's idea.  I like the idea of trying to come up with a solution that works with the blockchain and adapts over time instead of relying on Gavin, or NL or whomever.  The answer should be in the blockchain.

The answer cannot be in the blockchain, because the problem being addressed (resource usage rising too quickly so only people willing to spend tens of thousands of dollars can participate as fully validating nodes) is outside the blockchain.

You will go down the same path as the proof-of-stake folks, coming up with ever more complicated on-blockchain solutions to a problem that fundamentally involves something that is happening outside the blockchain. In this case, real-world CPU and bandwidth growth. In the POS case, proof that some kind of real-world effort was performed.