
Topic: What will keep transaction fees up? - page 3. (Read 15387 times)

legendary
Activity: 980
Merit: 1014
November 21, 2010, 01:03:47 PM
#46
Accepting every transaction is the selfish thing. Every batch from the oven yields an unlimited amount of bread. And there are a thousand people who offer to pay very little. Will the baker decline to take the money, throw away the bread, and let the competitors make the sales instead?


But don't you need to accept a price?
db
sr. member
Activity: 279
Merit: 261
November 21, 2010, 12:51:55 PM
#45
The fallacy this entire discussion has been trapped in is the assumption that the ‘selfish’ thing is to accept every transaction, no matter how small the fee; rather, that is the ‘altruistic’ thing.

Accepting every transaction is the selfish thing. Every batch from the oven yields an unlimited amount of bread. And there are a thousand people who offer to pay very little. Will the baker decline to take the money, throw away the bread, and let the competitors make the sales instead?
legendary
Activity: 1222
Merit: 1016
Live and Let Live
November 21, 2010, 11:55:05 AM
#44
@da2c7

Why would a generator refuse a transaction with any amount of fee if s/he can still add it to the block?

A bank doesn't need to handle transactions the way you suggest. Transactions between its clients don't need to be real bitcoin transactions; they can be just updates in the bank's database. The bank would only need to send transactions to the chain when transferring from/to an "outsider" address.


Some will; that is charity. But most aren't that charitable (and this is a good thing).  I think an analogy is in order.

A man runs a bakery; he bakes a certain amount of goods early each morning and sells them throughout the day. When he has leftovers, he may reduce the price to flog the bread off, or he may take it home for his chooks to eat.

The fallacy this entire discussion has been trapped in is the assumption that the ‘selfish’ thing is to accept every transaction, no matter how small the fee; rather, that is the ‘altruistic’ thing. The truly selfish thing is to accept into the block only those who pay.

You see, most who run bakeries know that most people will buy the bread anyway, cheap or not; they just wait until the end of the day when it is on sale. That is why it is (more) profitable to take the bread home for your chooks.

Both bakeries and generators work on the lovely formula:  Average Price = (Total Cost + Profit) / Average Sales.
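
A minimal sketch of that break-even arithmetic in Python; every figure below is an invented placeholder for illustration, not a claim about real costs or volumes:

```python
# Hypothetical break-even pricing for a generator, following
# Average Price = (Total Cost + Profit) / Average Sales.
# All figures are invented placeholders.

total_cost = 100.0      # daily operating cost (electricity, hardware), hypothetical
target_profit = 20.0    # desired daily margin, hypothetical
average_sales = 4000    # expected fee-paying transactions per day, hypothetical

average_fee = (total_cost + target_profit) / average_sales
print(f"Average fee needed per transaction: {average_fee:.4f}")  # 0.0300
```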

As for banks: banks will allow members free or cheap transactions for personal use, provided they report the transaction to the bank.
legendary
Activity: 1106
Merit: 1004
November 21, 2010, 11:11:02 AM
#43
@da2c7

Why would a generator refuse a transaction with any amount of fee if s/he can still add it to the block?

A bank doesn't need to handle transactions the way you suggest. Transactions between its clients don't need to be real bitcoin transactions; they can be just updates in the bank's database. The bank would only need to send transactions to the chain when transferring from/to an "outsider" address.
legendary
Activity: 1222
Merit: 1016
Live and Let Live
November 21, 2010, 10:57:15 AM
#42
From my understanding of the technology and the current system, this entire discussion about the block size somehow being tied to transaction fees is silly in the long run.

Quite plainly, who said that generators have to fill the entire block?  To me it is simple: generators will include x amount of low-fee transactions, y amount of medium-fee, and the remainder high-fee.  If the space isn’t used, it isn’t used.
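
A toy sketch of that tiered policy in Python; the fee cutoffs and per-tier quotas are invented for illustration, not values anyone has proposed:

```python
# Toy tiered fee policy: cap the block space given to low- and medium-fee
# transactions, fill the rest with high-fee ones, and leave unused space
# empty. All thresholds and quotas are invented placeholders.

LOW_CUTOFF, MED_CUTOFF = 0.0005, 0.005   # fee thresholds in BTC (hypothetical)
LOW_QUOTA, MED_QUOTA = 50_000, 200_000   # bytes reserved per tier (hypothetical)
MAX_BLOCK = 1_000_000                    # max block size in bytes

def fill_block(mempool):
    """mempool: list of (fee_btc, size_bytes) pairs awaiting inclusion."""
    block, used_low, used_med, used_total = [], 0, 0, 0
    # Consider the highest fees first, so high-fee transactions are never
    # crowded out by cheaper ones.
    for fee, size in sorted(mempool, reverse=True):
        if used_total + size > MAX_BLOCK:
            continue
        if fee < LOW_CUTOFF:
            if used_low + size > LOW_QUOTA:
                continue
            used_low += size
        elif fee < MED_CUTOFF:
            if used_med + size > MED_QUOTA:
                continue
            used_med += size
        block.append((fee, size))
        used_total += size
    return block
```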

It is in the generators’ interest to keep the network healthy, and this includes not generating at a loss.

Artificially holding the price high by having a smaller block size is both silly and rather pointless; it creates an artificial tax on bitcoin.  The only purpose of a block size limit is flood protection; otherwise the block should be as large as it needs to be.

You must remember that the generators are acting in their self-interest; therefore, they will charge for their service.  Of course, eventually the low-fee transactions will be accepted by some charitable person, but on the whole the large generators will be selfish and will not undercharge for their service; rather, they will decide on fees that are profitable.

Finally, whoever said that generators will blindly accept transactions from anyone?   I can quite possibly foresee a bank generator charging low fees for transactions involving any of its members’ addresses, and high fees for the competition.  This would give the bank a competitive advantage.
legendary
Activity: 1106
Merit: 1004
November 21, 2010, 09:54:23 AM
#41
db is right. We should not expect people to generate at a net loss. I don't doubt that there will be people willing to do so, but I think we should aim for more than that; otherwise the difficulty factor would be much lower than it could be.

And this is a problem that could be partially solved by an automatic adjustment of the block size limit, as I argue in the other thread.
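
For concreteness, one hypothetical shape such an automatic rule could take, loosely analogous to the difficulty retarget; the rule and its parameters are invented here, not taken from that other thread:

```python
# Hypothetical automatic block size limit adjustment: grow the limit when
# recent blocks run nearly full, shrink it when they run mostly empty.
# The target, step, and bounds are all invented placeholders.

def adjust_max_block_size(current_limit, recent_block_sizes,
                          target_fullness=0.75, step=0.1,
                          floor=100_000, ceiling=10_000_000):
    avg_fullness = sum(recent_block_sizes) / (len(recent_block_sizes) * current_limit)
    if avg_fullness > target_fullness:        # blocks nearly full: raise the limit
        current_limit = int(current_limit * (1 + step))
    elif avg_fullness < target_fullness / 2:  # blocks mostly empty: lower the limit
        current_limit = int(current_limit * (1 - step))
    return max(floor, min(ceiling, current_limit))
```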
db
sr. member
Activity: 279
Merit: 261
November 21, 2010, 06:11:38 AM
#40
I'm afraid you missed the analogy that I was trying to present, and yours is more than a bit flawed as well.

Let's drop the analogies then and go straight at the problem.

The maximum block size is big, having room for all or most transactions.
Therefore, transaction fees are (close to) zero.
Therefore, total block transaction fees are also (close to) zero.
Therefore, all for-profit block generation ceases.
Therefore, difficulty drops.

If for-profit block generation were the only generation going on, then difficulty would be very low and double spending very easy. The only thing that can save the system is people generating at a loss.

Now we have a lot of bitcoin holders who would lose greatly if payments became unreliable and confidence in the system dropped. Will they contribute to their common good? If they do, it will not be out of self-interest. If you contribute to the system's reliability, you lose your contribution and benefit very little, because the benefit is shared with everyone. If you use the system without contributing, you benefit from everyone else's contributions anyway.

The usual sad result is that everyone tries to live off of everyone else's contributions, very little is actually contributed, and everyone loses.

Classic tragedy of the commons.
hero member
Activity: 527
Merit: 500
November 20, 2010, 09:09:05 PM
#39
The maximum block size must be continuously adjusted to keep the transaction prices stable. The only way to change the maximum block size is through a lengthy political process of debate, decree, network fragmentation and majority agreement.

This is a bad idea. The generators will collude to keep the block size small and transactions scarce, gouging the market.

Some may try.  Keep in mind that generators have no sustainable monopoly on generation, not even as a group.  If the major generators collude to keep block sizes small amongst themselves, say by keeping their own max block sizes at 1 MB while the regular users' clients all have a max block size limit of 3 MB, then the rising backlog of lower-fee transactions will attract new players into generation.  Maybe that forces the colluding generators to change, maybe not, but a natural price balance will be maintained.  Perhaps the occasional blockchain split fight is necessary.

Okay, I'm starting to like this idea. It could be a bit messy with blockchain splits, but it seems like it will work.
legendary
Activity: 1708
Merit: 1007
November 20, 2010, 08:56:27 PM
#38
How about making the max block size a function of the difficulty? If there are too few generators, then the max block size will decrease, making transactions scarce. This will drive up the tx fee and create an incentive for new generators to enter the market, and vice versa.

Absolutely not.  The max block size would increase to the point that too much room was available for spamming, and that would be too much of a temptation for some.

Now that's an example of the Tragedy of the Commons issue: the amount of space available for free transactions in the block.
hero member
Activity: 527
Merit: 500
November 20, 2010, 08:56:16 PM
#37
I'm pretty sure the computation time is linear with respect to data size,


That's true of most calculations, but most certainly not so with data structures that self-reference, and a conditional is a self-reference.  I'm not sure whether it would be true of hashing algorithms or not, but I wouldn't assume that the algorithm is particularly linear in nature.


I meant:
"I'm pretty sure the computation time is linear with respect to data size, for hash functions"

Anyway, as theymos pointed out, you only have to hash the transactions once, not on every attempt. So this isn't a problem.
legendary
Activity: 1708
Merit: 1007
November 20, 2010, 08:54:01 PM
#36
Which is why major institutions will still be willing to contribute clock cycles at or just below a break-even point: there are more forms of economic motivation than just profit.  I'm really surprised that so many who seem so well educated on economic issues can't wrap their heads around this simple concept.  If you have something valuable to protect, have you ever paid the rental fee on a safety deposit box?  The cost of the box rental is tiny compared to the value of the object within, but that's not a tragedy of the commons!  People do it all the time!  It's a cost of security, not a resource access issue!  The tragedy of the commons parable is a limited-resource issue!
This is not like individual safety deposit boxes. This is like one big collective vault in which it is free for anyone to place their valuables and paying is optional.


I'm afraid you missed the analogy that I was trying to present, and yours is more than a bit flawed as well.
legendary
Activity: 1708
Merit: 1007
November 20, 2010, 08:52:20 PM
#35
I'm pretty sure the computation time is linear with respect to data size,


That's true of most calculations, but most certainly not so with data structures that self-reference, and a conditional is a self-reference.  I'm not sure whether it would be true of hashing algorithms or not, but I wouldn't assume that the algorithm is particularly linear in nature.
hero member
Activity: 527
Merit: 500
November 20, 2010, 08:52:06 PM
#34
How about making the max block size a function of the difficulty? If there are too few generators, then the max block size will decrease, making transactions scarce. This will drive up the tx fee and create an incentive for new generators to enter the market, and vice versa.
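
For concreteness, a purely illustrative sketch of the proposed mapping; the linear shape and the constant are my own invention, not part of the proposal:

```python
# Purely illustrative: tie the max block size to difficulty, so fewer
# generators (lower difficulty) means smaller blocks, scarcer transaction
# space, and higher fees. The scaling constant is invented.

def max_block_size(difficulty, bytes_per_unit=1_000):
    return int(difficulty * bytes_per_unit)
```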
db
sr. member
Activity: 279
Merit: 261
November 20, 2010, 08:37:03 PM
#33
Which is why major institutions will still be willing to contribute clock cycles at or just below a break-even point: there are more forms of economic motivation than just profit.  I'm really surprised that so many who seem so well educated on economic issues can't wrap their heads around this simple concept.  If you have something valuable to protect, have you ever paid the rental fee on a safety deposit box?  The cost of the box rental is tiny compared to the value of the object within, but that's not a tragedy of the commons!  People do it all the time!  It's a cost of security, not a resource access issue!  The tragedy of the commons parable is a limited-resource issue!
This is not like individual safety deposit boxes. This is like one big collective vault in which it is free for anyone to place their valuables and paying is optional.
hero member
Activity: 527
Merit: 500
November 20, 2010, 08:34:30 PM
#32
Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.

Block hashes are only hashes of the fixed-size 80-byte block header, which contains a hash of the transactions. Adding a transaction has only a small one-time CPU cost.

Ahhh, okay then. Thanks for the reassurance.
hero member
Activity: 527
Merit: 500
November 20, 2010, 08:30:45 PM
#31
I just thought of something. The time it takes to generate a hash is proportional to the number of transactions you're hashing, right? So it'll take twice as long (on average) to generate a block with 1000 transactions as one with 500. You're not going to waste precious hashing time on small-fee transactions; they'll just decrease your hash/s for negligible gain.

I wouldn't assume that it's as straightforward as that, and you are probably overthinking it anyway.  Feel free to try it, though.

Well, it's pretty simple really, unless I'm missing something or hashes don't actually work the way I think they do. The more data you have to hash, the longer it takes to compute the hash. Makes sense, right? I'm pretty sure the computation time is linear with respect to data size, so double the number of transactions and you double the time to compute the hash.

This is a huge problem now, because why would anyone hash more than one transaction to generate a block? They get 50 BTC either way, so you might as well just hash one transaction, giving you the optimal hash/s.

I hope I'm wrong about this; I'd really like more feedback from you guys. Perhaps we need a minimum block size or something.
administrator
Activity: 5166
Merit: 12850
November 20, 2010, 08:24:34 PM
#30
Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.

Block hashes are only hashes of the fixed-size 80-byte block header, which contains a hash of the transactions. Adding a transaction has only a small one-time CPU cost.
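
A small Python sketch of what theymos describes, assuming the standard Bitcoin header layout (version, previous block hash, merkle root, timestamp, bits, nonce); all field values here are dummies, and a flat hash of the transactions stands in for the real merkle root:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# The transactions are summarized by a single 32-byte hash, computed once,
# no matter how many of them there are.
tx_summary = double_sha256(b"all transactions in the block")

header = (
    (1).to_bytes(4, "little")      # version (dummy)
    + b"\x00" * 32                 # previous block hash (dummy)
    + tx_summary                   # 32-byte hash of the transactions
    + (0).to_bytes(4, "little")    # timestamp (dummy)
    + (0).to_bytes(4, "little")    # difficulty bits (dummy)
    + (0).to_bytes(4, "little")    # nonce, the field varied per attempt
)
assert len(header) == 80           # each hashing attempt covers exactly 80 bytes

print(double_sha256(header).hex()) # one proof-of-work attempt
```

So the per-attempt hashing cost is fixed at 80 bytes regardless of how many transactions the block carries, which is why including more of them does not reduce hash/s.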
legendary
Activity: 1708
Merit: 1007
November 20, 2010, 08:19:13 PM
#29
I just thought of something. The time it takes to generate a hash is proportional to the number of transactions you're hashing, right? So it'll take twice as long (on average) to generate a block with 1000 transactions as one with 500. You're not going to waste precious hashing time on small-fee transactions; they'll just decrease your hash/s for negligible gain.

I wouldn't assume that it's as straightforward as that, and you are probably overthinking it anyway.  Feel free to try it, though.
hero member
Activity: 527
Merit: 500
November 20, 2010, 08:14:55 PM
#28
The time it takes to generate a hash is proportional to the number of transactions you're hashing, right?

No, not really. It's related to the difficulty factor only.

Are you sure? Given any difficulty, you still have to crunch the numbers to solve a block. The more transactions, the more numbers to crunch, thus the longer it takes to compute a given hash.
legendary
Activity: 1106
Merit: 1004
November 20, 2010, 08:12:14 PM
#27
The time it takes to generate a hash is proportional to the number of transactions you're hashing, right?

No, not really. It's related to the difficulty factor only.