
Topic: Why Peter R's Fee Market Won't Work - page 3.

sr. member
Activity: 277
Merit: 257
January 25, 2016, 05:39:22 PM
#37
TL/DR: A transaction fee market exists without a block size limit assuming miners act rationally.
Transaction fees are bid to the minimum necessary to include them on the blockchain. This minimum does not magically include all miners, and, absent any artificial scarcity, it would reach equilibrium at roughly the cheapest rate possible to achieve that end. That collection of rates would be enough to cover a single datacenter in an ideal location.
The only artificial scarcity in the system is the block size limit. With it, fees can be bid above that equilibrium and thereby fund redundancy of both nodes and hashpower.
Without it, nodes would atrophy toward that equilibrium (~1 node). Bitcoin would have failed long before that equilibrium was reached, not only because the goal of a decentralized currency would have failed, but also because hashpower would fall below what is necessary to keep it secure.

If you're saying that even in that outrageously broken scenario it would still cost money to add transactions to the blockchain, then yes, you are right. Are you planning on actually addressing the real problems with removing the block size limit?

Yes, probably a better explanation than mine.


If a fee market breaks down because miners are not acting rationally,

The orphan-based fee market breaks down when miners act rationally. The incentive is always to reduce the orphan rate, and that can be done with two datacenters with a big pipe between them, utilising the methods Greg describes for sending blocks.
legendary
Activity: 1358
Merit: 1014
January 25, 2016, 05:20:41 PM
#36
You have two ways to deal with the situation:
Raise the block size to mega-centralization levels so that only datacenters can run nodes, which makes Bitcoin useless, but at least you can pay lower fees.
Or keep the block size small and decentralized, let a fee market develop, and run spammy transactions through LN.

Choose one.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
January 25, 2016, 04:17:27 PM
#35
If a fee market breaks down because miners are not acting rationally,
it seems that that could happen with or without a blocksize limit, and
generally to the same degree.

Example: Assume a pool/farm with 20% hashing power that solves one out of every
five blocks and will include anyone's transaction regardless of fees.
This would cause a problem for a fee market regardless of whether
or not there was a blocksize limit.

Although I suppose it would be worse with no limit, since everyone
would have room to get their transaction into the irrational miner's
block instead of waiting longer.
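A rough back-of-the-envelope for that scenario (a sketch; the 20% share and the ~10-minute average block interval are the only inputs, both assumed above):

Code:
# Expected wait for a zero-fee transaction when a miner with 20% of the
# hashpower includes all transactions regardless of fee: the wait in
# blocks is geometric with success probability p = 0.2.
p = 0.2
expected_blocks = 1 / p                   # = 5 blocks on average
expected_minutes = expected_blocks * 10   # ~10 minutes per block
print(expected_blocks, expected_minutes)  # -> 5.0 50.0

So a zero-fee transaction would wait about five blocks (~50 minutes) on average for the irrational miner's next block.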
staff
Activity: 4284
Merit: 8808
January 25, 2016, 03:52:16 PM
#34
See http://www.bitcoinunlimited.info/public/downloads/feemarket.pdf
p.13, from 2nd paragraph onwards. They're there.
Not so, according to Peter R himself:

Quote

Thank you for your response.  I think you bring up some interesting points that readers should be made aware of.  Transforming your concerns to the language of my paper, I think you're challenging the claim that "gamma" [the coding gain with which block solutions are transmitted] can not be infinite (cf. Section 7).  Indeed, if gamma is infinite then the analysis breaks down. 

I propose the following changes:

1.  I will make it more clear that the results of the paper hinge on the assumption that block solutions are propagated across channels, and that the quantity of pure information communicated per solution is proportional to the amount of information contained within the block. 

2.  I will add a note [unless you ask me not to] something to the effect of "Greg Maxwell challenges the claim that the coding gain cannot be infinite…" followed by a summary of the scenario you described.  I will reference "personal communication."  I will also run the note by you before I make the next revision public. 

3.  I will point out that if the coding gain can be made spectacularly high, that the propagation impedance in my model will become very small, and that although a fee market may strictly exist in the asymptotic sense, such a fee market may not be relevant (the phenomena in the paper would be negligible compared to the dynamics from some other effect). 

4. [UNRELATED] I also plan to address Dave Hudson's objections in my next revision (the "you don't orphan your own block" point). 

Lastly, thank you for the note about what might happen when fees > rewards.  I have indeed been thinking about this.  I believe it is outside the scope of the present paper, although I am doing some work on the topic. (Perhaps I'll add a bit more discussion on this topic to the present paper to get the reader thinking in this direction.)

Best regards,
Peter
sr. member
Activity: 433
Merit: 267
January 25, 2016, 03:40:52 PM
#33
Is Peter R the one who constantly posts "diagrams" rather than "words"?

If so, then for sure he is not a very genuine person, as diagrams are generally used to convince stupid people when you don't have the words to convince those who are not.

Fits the bill.

https://bitcointalksearch.org/topic/m.13491777
legendary
Activity: 1890
Merit: 1086
Ian Knowles - CIYAM Lead Developer
January 25, 2016, 12:51:33 PM
#32
Is Peter R the one who constantly posts "diagrams" rather than "words"?

If so, then for sure he is not a very genuine person, as diagrams are generally used to convince stupid people when you don't have the words to convince those who are not.
sr. member
Activity: 433
Merit: 267
January 25, 2016, 12:47:40 PM
#31
But if the throughput of transactions greatly exceeds the relay
speeds, so that there is some variation in mempools,
then the orphaning cost again plays a part.
No, it doesn't; that's the point. Centralized mining pools aren't at some disadvantage when it comes to throughput because of some perceived increased risk of orphaning.

If by centralization here you mean miner cartelization,
I don't think there is a disagreement. What I am trying to describe
is the opposite of cartelization, where there is a generous amount
of transactions coming in steadily, mediated through a large
number of full nodes.
I don't know why I wrote "mining pool"; it doesn't matter if it's a pool or a big centralized mining entity of any other kind.
legendary
Activity: 996
Merit: 1013
January 25, 2016, 11:36:23 AM
#30
But if the throughput of transactions greatly exceeds the relay
speeds, so that there is some variation in mempools,
then the orphaning cost again plays a part.
No, it doesn't; that's the point. Centralized mining pools aren't at some disadvantage when it comes to throughput because of some perceived increased risk of orphaning.

If by centralization here you mean miner cartelization,
I don't think there is a disagreement. What I am trying to describe
is the opposite of cartelization, where there is a generous amount
of transactions coming in steadily, mediated through a large
number of full nodes.
sr. member
Activity: 433
Merit: 267
January 25, 2016, 11:14:16 AM
#29
But if the throughput of transactions greatly exceeds the relay
speeds, so that there is some variation in mempools,
then the orphaning cost again plays a part.
No, it doesn't; that's the point. Centralized mining pools aren't at some disadvantage when it comes to throughput because of some perceived increased risk of orphaning.
legendary
Activity: 996
Merit: 1013
January 25, 2016, 10:44:03 AM
#28
Greg Maxwell already addressed this incorrect idea that decentralization is protected because somehow information is better transmitted by being decentralized.


In the portion that you quote, as I read it, he is addressing the scenario
where all transactions have already been communicated before,
so the cost (in terms of orphaning rate) of communicating
block information is no longer significant. Not sure what that
has to do with decentralization.

But if the throughput of transactions greatly exceeds the relay
speeds, so that there is some variation in mempools,
then the orphaning cost again plays a part. For that you need
decentralization by way of having a huge number of widely
distributed full nodes that act as sources of fresh transactions.

sr. member
Activity: 433
Merit: 267
January 25, 2016, 09:29:10 AM
#27
Greg Maxwell already addressed this incorrect idea that decentralization is protected because somehow information is better transmitted by being decentralized.

Quote from: GMaxwell
> What is your definition of a peer?  To me, a peer needs to get information
> from his other peers from which he makes his own independent decisions.
> This information takes time to communicate.  If there's more transactions,
> there's more information to communicate.  The only way this isn't true, if
> I'm not mistaken, is if there's no information to communicate related to the
> transactions in the block.  But like I said earlier, in this case then the
> miners aren't peers--they're just slaves--and the system is already
> centralized. [- Peter R]
 
I think you're stuck thinking in a particular model and I'm not sure
how to break you out of it.
 
For example, I'm about to communicate the whole 40 some gigabytes of
the Bitcoin blockchain to you:
 
00000000000000001454fdecfdb2b18cc07bf759f759ce4d8cac3301dc98f478
 
As you can see, I was able to do that by communicating only 32 bytes--
an amount that had no real dependence on the size of
the blockchain.
 
This was possible because our computers had already agreed on the
content of the blockchain before my communication to you happened.
 
Yes, communication did occur-- but not at the time of my transmission;
it happened arbitrarily before.  This is devastating for your analysis
because you are working exclusively with the increased delay in
transmission as a function of the blocksize as a mechanism for
increasing orphaning, resulting in an equilibrium profit-maximizing
size.  Data which they sent earlier may have some costs to them, but
it does not have a cost in terms of increasing their orphan rate.
 
In that case I used data I didn't choose (the blockchain), but this
works just as well for data I did choose: I could send you a gigabyte
of data right now, then later send you an extranonce to append that
makes the result have a low hash.
 
There are many ways to use this fundamental result, and the relay
network protocol currently uses it in the simplest possible way.
Ultimately, it means that it's possible to construct schemes where
miners retain choice but transmit a constant amount of information at
the time of block announcement.

Bolded for emphasis. I would recommend reading the whole exchange.

http://pastebin.com/jFgkk8M3
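A minimal sketch of the commitment trick Greg describes (illustrative only; the real relay network protocol is more involved):

Code:
import hashlib

# Both peers are assumed to already hold identical copies of some large
# dataset (e.g., the blockchain), transferred at some earlier time.
shared_data = b"stand-in for ~40 GB of chain data both sides already have"

# Sender: announce the whole dataset with a constant-size digest.
commitment = hashlib.sha256(shared_data).digest()
assert len(commitment) == 32  # independent of len(shared_data)

# Receiver: verify the 32-byte announcement against the local copy.
assert hashlib.sha256(shared_data).digest() == commitment

The bandwidth at announcement time is constant; the real communication happened "arbitrarily before", which is exactly what removes block size from the orphan-rate term.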
legendary
Activity: 996
Merit: 1013
January 25, 2016, 04:13:39 AM
#26

3.  Lastly, if there are lots of transactions occurring on the network, then there will be a corresponding increase in the number of new TXs that pay decent fees that miners will want to include immediately in the blocks they're working on (i.e., they won't bother waiting for pre-consensus on some of those new TXs).  If rational miners make the decision to immediately include those TXs (and they would, because it would earn them more money than never including brand-new transactions), then the block solution announcements will contain an amount of information that also depends roughly linearly on the transactional throughput of the network (i.e., on the avg block size)!

I admire the approach in the paper, and if anything it points us towards asking the right
questions. And my every cell cries out that artificial scarcity by central planning is just
plain wrong!

But at present, the part quoted above seems to me the only way that the fee market as
outlined in your paper would work,
because advances in block propagation would push the supply curve slope arbitrarily close
to information-theoretical limits, so that finally all the info transmitted on
block discovery would be the bits necessary to eliminate double spends.
AFAICS, the effect on fees would be negligible at this point.
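To put a rough number on that information-theoretical limit (a sketch; the mempool size, block size, and average tx size are assumptions, not measurements):

Code:
import math

n = 100_000    # txs in a mempool all miners are assumed to share
k = 4_000      # txs selected into a block (~1 MB at 250 bytes/tx)
tx_bytes = 250

# Bits needed merely to say WHICH k of the n known txs were included:
selection_bits = math.log2(math.comb(n, k))
print(round(selection_bits / 8 / 1024))  # ~3 KiB of selection info
print(k * tx_bytes // 1024)              # vs ~976 KiB for the full block

With perfectly shared mempools, announcing a ~1 MB block needs on the order of kilobytes, which is why the orphan-cost slope of the supply curve could become negligible.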

However, besides large throughput, other conditions would have to be present
in order to guarantee variation in the tx composition within mined blocks.
I imagine that it would require a wide distribution of mining nodes that were
dependent mostly on non-mining full nodes for receiving transactions.
That way the transactions would arrive at individual miners from "different directions",
and if the throughput was voluminous enough, there would always be txs that the other miners
had not heard about yet.

That situation is the polar opposite of miner cartelization, and for it to happen
the full nodes would have to be empowered and prove themselves indispensable
to miners. Full nodes forming a relay network would be a good start, but even more
importantly, the full nodes should be the points where transactions actually happen. This means
many, many Bitcoin businesses and points of sale all operating a full node.

To sum it up:
- a naturally occurring fee market needs
- a large throughput, which needs
- more decentralization by empowering full nodes, which is tantamount to
- a Cambrian explosion of the Bitcoin economy



sr. member
Activity: 433
Merit: 267
January 25, 2016, 12:44:15 AM
#25
TL/DR: A transaction fee market exists without a block size limit assuming miners act rationally.
Transaction fees are bid to the minimum necessary to include them on the blockchain. This minimum does not magically include all miners, and, absent any artificial scarcity, it would reach equilibrium at roughly the cheapest rate possible to achieve that end. That collection of rates would be enough to cover a single datacenter in an ideal location.
The only artificial scarcity in the system is the block size limit. With it, fees can be bid above that equilibrium and thereby fund redundancy of both nodes and hashpower.
Without it, nodes would atrophy toward that equilibrium (~1 node). Bitcoin would have failed long before that equilibrium was reached, not only because the goal of a decentralized currency would have failed, but also because hashpower would fall below what is necessary to keep it secure.

If you're saying that even in that outrageously broken scenario it would still cost money to add transactions to the blockchain, then yes, you are right. Are you planning on actually addressing the real problems with removing the block size limit?
sr. member
Activity: 277
Merit: 257
January 24, 2016, 11:23:02 PM
#24
I have read and thought about what you are saying and
first of all, thank you for reading and peer reviewing
Peter's paper.

Let's discuss.

I would like to believe that Peter is correct but
I'm open minded enough to consider that he may not be.

I want to understand more clearly what you are saying and why.

Quote
Peter says that a non-zero supply curve exists because increasing block size increases cost to miners due to higher orphan rate. I have also heard a variation of this argument with increasing block size being constrained by hardware resources, internet bandwidth etc.

Theoretically these arguments are correct; practically, they are not.

The effect of block size on these costs is almost negligible

I'm not sure why you say that. Can you support that point?

Why would they be negligible?

To me, it seems anything but negligible, and seems like common sense.
The bigger the block, the longer the time required to process
it, and the greater the risk of orphaning.

What am I missing here?




Sorry for the late reply; Gmax and Peter R already mostly answered, though. I guess what I did not make clear is that the fee market is supposed to be a long-term solution, and a stable equilibrium would need to exist long term.

The blocksize supply curve (in Peter R's terms) is only a function of the orphan rate. Of course the market would place pressure to constantly increase the supply. The methods Gmax described would make it possible to greatly increase the supply in a centralised environment, to the point where block size is not an issue.
legendary
Activity: 1162
Merit: 1007
January 24, 2016, 10:25:10 PM
#23
Right.  If miners all agreed to behave in this way (e.g., pre-consensus by mining a practice chain first), then the fee market based on orphaning risk would break down.  It would be equivalent to the propagation impedance in my fee-market paper falling to zero.  

Assuming negligible block subsidies, why would miners agree to behave this way if that would collapse the fee market?  How would security be funded?

First let's ask if they would behave this way with today's block subsidies (the other question is harder to answer).  

In order for the fee market to collapse (due to the propagation impedance falling to zero), no new information about the transactions included in the block may be transmitted during block-solution announcement.  But if no new information was transmitted, then the miners must have already come to consensus on the state of the ledger.  If that were the case, then what purpose did mining the blocks fulfil?

Haha but there's a better way to show that the fee market won't collapse even if you ignore the absurdity in the previous paragraph.  It goes like this:

1.  In order for the miners to know another miner's block contents before that lucky miner finds a proof-of-work, all miners must somehow agree on what the possible block contents are beforehand.  Since coming to this agreement cannot happen instantly, this means that the transactions included in the block must all be delayed by at least time T, where T is the consensus time.  The current block would never include a transaction that was announced only a second before the block was found, because the miners couldn't have come to pre-consensus on it.  In other words, if a miner included that new transaction it would mean that he'd have to transmit a non-zero amount of new information with his block solution announcement (he'd have to send the new TX as well), thereby violating the condition required for the fee market to collapse.

2.  However, maybe that new transaction had a big juicy fee attached to it!  If miners are free to build blocks according to their own volition--and if miners are rational profit-maximizing agents--then there should exist some fee that would entice the miner to include that brand-new TX in the block he's working on, even though doing so increases the amount of information he might have to send with his block solution announcement.

3.  Lastly, if there are lots of transactions occurring on the network, then there will be a corresponding increase in the number of new TXs that pay decent fees that miners will want to include immediately in the blocks they're working on (i.e., they won't bother waiting for pre-consensus on some of those new TXs).  If rational miners make the decision to immediately include those TXs (and they would, because it would earn them more money than never including brand-new transactions), then the block solution announcements will contain an amount of information that also depends roughly linearly on the transactional throughput of the network (i.e., on the avg block size)!

TL/DR: A transaction fee market exists without a block size limit assuming miners act rationally.
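A minimal sketch of the marginal decision in point 2 above, assuming Poisson block arrivals (the block value and delay figures are illustrative assumptions):

Code:
import math

T = 600.0  # mean block interval, seconds
R = 25.0   # block value the miner risks orphaning (assumed)

def break_even_fee(extra_delay_s):
    """Smallest fee that offsets the extra orphan risk from having to
    transmit a brand-new TX along with the block solution."""
    extra_orphan_prob = 1.0 - math.exp(-extra_delay_s / T)
    return R * extra_orphan_prob

for d in (0.1, 1.0, 10.0):
    print(f"{d}s extra delay -> fee > {break_even_fee(d):.5f}")

Any fee above that threshold makes immediate inclusion rational, which is the mechanism behind point 3: more throughput means more such TXs, so block announcements keep carrying information roughly proportional to block size.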
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
January 24, 2016, 09:51:39 PM
#22



Right.  If miners all agreed to behave in this way (e.g., pre-consensus by mining a practice chain first), then the fee market based on orphaning risk would break down.  It would be equivalent to the propagation impedance in my fee-market paper falling to zero.  

Assuming negligible block subsidies, why would miners agree to behave this way if that would collapse the fee market?  How would security be funded?
legendary
Activity: 996
Merit: 1013
January 24, 2016, 04:45:29 PM
#21

Amusingly, all of these were pointed out in peer review on Peter_R's paper, and he agreed to revise it to at least make the assumptions explicit... but then did not do so.

See http://www.bitcoinunlimited.info/public/downloads/feemarket.pdf
p.13, from 2nd paragraph onwards. They're there.
legendary
Activity: 1162
Merit: 1007
January 24, 2016, 02:27:19 PM
#20
To me, it seems anything but negligible, and seems like common sense.
The bigger the block, the longer the time required to process
it, and the greater the risk of orphaning.
What am I missing here?

All of the expensive operations can be done before a block is found; then they do not add any proportional time.

Correct.  

Miners could agree to only solve blocks that the other miners are already expecting.  This would mean that exactly zero information about the block contents would need to be transmitted at the time of block-solution announcement.  This would eliminate any block-size dependent propagation risk.  

1. By expensive operations, I assume you mean the validation of transactions.
The validation of transactions could be done by the miner solving the block, before a block is found, but the propagation of a large block across the network is another story.
Assuming a 1 MB/second speed, this seems like it would certainly become a factor when blocks get large, say 100 MB.  100 seconds
is surely significant (a factor of ~16.6% on a 600-second block interval).

2. Nodes validating the new block need to make sure they have all the transactions in the block and then compute the merkle tree (not sure how operationally expensive that is).
So are you sure "all" of the expensive operations can be done in advance?  Seems like this would contribute to delay and orphaning.

P.S. no hostility personally to gmax despite my opinions about the blocksize debate.
 

A lot of things are possible.  

Block-size dependent propagation delays can be completely eliminated if all of the miners know with some certainty what the next block could be before it arrives.  In particular, if miners know that the next block will be 1 of N possible blocks, and if N does not depend on the block size, then communicating which of the N expected blocks is the "real" block requires only log2(N) bits (plus of course communicating the PoW).
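For example (N assumed), with about a million pre-agreed candidate blocks:

Code:
import math
N = 1_000_000
print(math.ceil(math.log2(N)))  # 20 bits to name the winner, at any block size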

It is even possible to make N one.  Imagine that the miners mine a "practice blockchain" first and then the "real blockchain" next, such that the blocks in the real blockchain lag those in the practice blockchain by, for example, one hour.  All miners agree to make the real chain an exact copy of the practice chain.  Voila: the "real blocks" can be communicated by just transmitting the block headers.



Assuming those possibilities, what would be the implication for the fee market?
Wouldn't this support Daniel's point that a fee market would not develop?


Right.  If miners all agreed to behave in this way (e.g., pre-consensus by mining a practice chain first), then the fee market based on orphaning risk would break down.  It would be equivalent to the propagation impedance in my fee-market paper falling to zero.  
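A sketch of the orphan-risk term that such pre-consensus would eliminate, using the 100 MB / 1 MB/s / 600 s figures from the quoted objection and assuming Poisson block arrivals:

Code:
import math

T = 600.0          # mean block interval, seconds
tau = 100.0 / 1.0  # 100 MB at 1 MB/s -> 100 s propagation time
p_orphan = 1.0 - math.exp(-tau / T)
print(f"{p_orphan:.1%}")  # ~15.4% chance a competing block arrives first

# With pre-consensus, tau (and hence this term) goes to ~0 regardless of
# block size -- the propagation impedance falling to zero.

As tau shrinks, the fee floor built on this risk shrinks with it.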
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
January 24, 2016, 02:25:27 PM
#19
To me, it seems anything but negligible, and seems like common sense.
The bigger the block, the longer the time required to process
it, and the greater the risk of orphaning.
What am I missing here?

All of the expensive operations can be done before a block is found; then they do not add any proportional time.

Correct.  

Miners could agree to only solve blocks that the other miners are already expecting.  This would mean that exactly zero information about the block contents would need to be transmitted at the time of block-solution announcement.  This would eliminate any block-size dependent propagation risk.  

1. By expensive operations, I assume you mean the validation of transactions.
The validation of transactions could be done by the miner solving the block, before a block is found, but the propagation of a large block across the network is another story.
Assuming a 1 MB/second speed, this seems like it would certainly become a factor when blocks get large, say 100 MB.  100 seconds
is surely significant (a factor of ~16.6% on a 600-second block interval).

2. Nodes validating the new block need to make sure they have all the transactions in the block and then compute the merkle tree (not sure how operationally expensive that is).
So are you sure "all" of the expensive operations can be done in advance?  Seems like this would contribute to delay and orphaning.

P.S. no hostility personally to gmax despite my opinions about the blocksize debate.
 

A lot of things are possible.  

Block-size dependent propagation delays can be completely eliminated if all of the miners know with some certainty what the next block could be before it arrives.  In particular, if miners know that the next block will be 1 of N possible blocks, and if N does not depend on the block size, then communicating which of the N expected blocks is the "real" block requires only log2(N) bits (plus of course communicating the PoW).

It is even possible to make N one.  Imagine that the miners mine a "practice blockchain" first and then the "real blockchain" next, such that the blocks in the real blockchain lag those in the practice blockchain by, for example, one hour.  All miners agree to make the real chain an exact copy of the practice chain.  Voila: the "real blocks" can be communicated by just transmitting the block headers.



Assuming those possibilities, what would be the implication for the fee market?
Wouldn't this support Daniel's point that a fee market would not develop?


legendary
Activity: 1162
Merit: 1007
January 24, 2016, 02:20:35 PM
#18
To me, it seems anything but negligible, and seems like common sense.
The bigger the block, the longer the time required to process
it, and the greater the risk of orphaning.
What am I missing here?

All of the expensive operations can be done before a block is found; then they do not add any proportional time.

Correct.  

Miners could agree to only solve blocks that the other miners are already expecting.  This would mean that exactly zero information about the block contents would need to be transmitted at the time of block-solution announcement.  This would eliminate any block-size dependent propagation risk.  

1. By expensive operations, I assume you mean the validation of transactions.
The validation of transactions could be done by the miner solving the block, before a block is found, but the propagation of a large block across the network is another story.
Assuming a 1 MB/second speed, this seems like it would certainly become a factor when blocks get large, say 100 MB.  100 seconds
is surely significant (a factor of ~16.6% on a 600-second block interval).

2. Nodes validating the new block need to make sure they have all the transactions in the block and then compute the merkle tree (not sure how operationally expensive that is).
So are you sure "all" of the expensive operations can be done in advance?  Seems like this would contribute to delay and orphaning.

P.S. no hostility personally to gmax despite my opinions about the blocksize debate.
 

A lot of things are possible.  

Block-size dependent propagation delays can be completely eliminated if all of the miners know with some certainty what the next block could be before it arrives.  In particular, if miners know that the next block will be 1 of N possible blocks, and if N does not depend on the block size, then communicating which of the N expected blocks is the "real" block requires only log2(N) bits (plus of course communicating the PoW).

It is even possible to have N=1.  Imagine that the miners mine a "practice blockchain" first and then the "real blockchain" next such that the blocks in the real blockchain lag those in the practice blockchain by, for example, one hour.  All miners agree to make the real chain an exact copy of the practice chain.  Voila: the "real blocks" can be communicated by just transmitting the block headers.  The fee market thus appears to break down.  
