
Topic: Post-inflation security

hero member
Activity: 481
Merit: 529
May 22, 2011, 11:05:47 AM
#30
Right, it's not implemented in the software. It's tricky because the codebase assumes you aren't trying to double spend.

Regardless, I don't think it's necessary. "Guessing" the right fee level like this is going to be very slow. There are better ways.

Better ways for end users, sure, like your idea (if I remember) of servicers who, for a fee, would guarantee an acceptance time and assume the associated risk.  I see nothing wrong with that.

I worry, though, about worse ways being proposed, such as changing the block payout policy, which amounts to messing with monetary policy.

Double-spending as a strategy to speed payment already exists, in a sense.  It is implicit in the protocol.  I wouldn't be surprised if someone has already tried it (using wallet backups or custom software).  I would be surprised if it never becomes an at least occasional standard practice.
legendary
Activity: 1526
Merit: 1134
May 22, 2011, 09:05:07 AM
#29
Right, it's not implemented in the software. It's tricky because the codebase assumes you aren't trying to double spend.

Regardless, I don't think it's necessary. "Guessing" the right fee level like this is going to be very slow. There are better ways.
hero member
Activity: 481
Merit: 529
May 22, 2011, 07:11:28 AM
#28
Replacing transactions like that isn't possible today. Making it possible would be, to quote Satoshi, "easier said than implemented".

You mean it's not implemented in the software, right?  As far as the protocol is concerned, I believe it is possible.  Yes, it would present an accounting challenge for the current wallet model.
legendary
Activity: 1526
Merit: 1134
May 22, 2011, 03:19:20 AM
#27
Replacing transactions like that isn't possible today. Making it possible would be, to quote Satoshi, "easier said than implemented".
hero member
Activity: 481
Merit: 529
May 20, 2011, 12:33:29 PM
#26
Some of us seem worried about how payers will know what size fee to offer.

I don't understand the problem.  Payers can effectively expedite a slow transaction by submitting a similar one with a larger fee while the first tx languishes unprocessed.  To guard against double-receipt (the flip side of the double-spend problem) the wallet software must simply:

  • make the transactions mutually incompatible by having them share inputs
  • use inputs that are unlikely to receive BTC (such as hidden change addresses), (edit: not needed; inputs on the wire reference earlier transactions, not addresses) and
  • prepare to handle the possible outcomes: tx 1 accepted, tx 2 accepted, or both languish.
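To make the wallet mechanics concrete, here is a rough Python sketch of the fee-bumping idea. The Transaction type, helper function, and all values are invented stand-ins for illustration, not any real client's API:

Code:
from dataclasses import dataclass

@dataclass(frozen=True)
class Outpoint:
    txid: str    # hash of the earlier transaction being spent
    index: int   # which of its outputs

@dataclass
class Transaction:
    inputs: list   # Outpoints; sharing one makes two txs mutually incompatible
    outputs: dict  # address -> amount
    fee: float     # implicitly, sum of inputs minus sum of outputs

def make_payment(spend, input_value, pay_to, amount, change_addr, fee):
    """Build a payment; leftover value minus the fee goes to change."""
    change = input_value - amount - fee
    return Transaction(inputs=[spend],
                       outputs={pay_to: amount, change_addr: change},
                       fee=fee)

utxo = Outpoint(txid="aaaa", index=0)  # hypothetical unspent output worth 25 BTC

# First attempt with a small fee.
tx1 = make_payment(utxo, 25.0, "recipient", 10.0, "change1", fee=0.0005)

# If tx1 languishes, submit a conflicting replacement: same input, bigger fee.
# Only one of tx1/tx2 can ever be accepted, so there is no double-receipt.
tx2 = make_payment(utxo, 25.0, "recipient", 10.0, "change2", fee=0.005)

# The wallet then tracks the three outcomes listed above:
# tx1 accepted, tx2 accepted, or both languish (in which case, bump again).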

As Gavin eloquently explained, miners will have different and evolving cost structures and preferences.  For a given received tx, a miner will have an algorithm by which to:

  • drop it and treat it as spam or a DoS attack (block the sending node, sue the sender)
  • just drop it
  • keep it for possible later processing, or
  • include it in the current block

and either relay or not relay it.

The result will be a "fee curve" of cost over expected processing time for a given tx size.  People will invent ways to compute the curve (if nothing else, then by periodic experiment), and the curve will be generally published as a guide for payment software.
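One crude shape such a periodic-experiment estimator could take, sketched in Python with invented sample data:

Code:
# (fee in BTC per kB tried, mean blocks observed until confirmation)
observations = [(0.0001, 40), (0.0005, 12), (0.001, 6), (0.005, 2), (0.01, 1)]

def fee_for_target(target_blocks):
    """Cheapest observed fee whose mean confirmation time meets the target."""
    for fee, blocks in sorted(observations):
        if blocks <= target_blocks:
            return fee
    return max(fee for fee, _ in observations)  # pay the top rate and hope

print(fee_for_target(6))   # -> 0.001
print(fee_for_target(3))   # -> 0.005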

I hope this alleviates the worry.
legendary
Activity: 3920
Merit: 2349
Eadem mutata resurgo
May 19, 2011, 04:54:32 AM
#25
[mike]:

Quote
How much work is enough? I think that can only be found by lowering the security of the network until we start seeing transactions be reversed by botnets and such, at that point people (or insurance companies) will start paying more fees in order to reduce or eliminate the reversals.

There are other attacks that may require more security than double-spends. It is going to be really hard to answer the "How much is enough?" question until a systemic, existential threat presents itself and we can measure the size of it ... or?
full member
Activity: 234
Merit: 100
AKA: Justmoon
May 19, 2011, 03:02:22 AM
#24
Some points I'd like to add to the discussion:

Fees are always paid by the recipient (even though it might seem the other way around)

The only party directly interested in the success of the transfer is the recipient. The sender is only willing to pay a fee on behalf of the recipient. The sender gains nothing from coins disappearing from his wallet; the recipient, however, needs the transfer to be successful in order to take definitive custody of the coins transferred.

Even in cases where it seems like the sender pays the fee voluntarily he's still only acting on behalf of the recipient. So if I send money to my grandma and she says "don't include a fee," but I do, it's because I'm acting altruistically in the interest of my grandma.

Therefore it provides a clearer picture to look at transaction costs from the perspective of the recipient.

Cost function

Gavin's formula captures the normal situation perfectly. But [mike] correctly points out that there is another cost, currently hidden (because it is very, very near zero at the moment).

So extending Gavin's formula with the risks, we get:

Code:
cost = f(txn, B, P) + txamount * (1 - P) + txamount * PA

PA ... Probability of a successful attempt by somebody to overtake the block chain for the purpose of rejecting/delaying at least this transaction.

Or, in other words, the total cost of a transaction is
- the fee paid, plus
- the risk of it not being included because the fee is too low, plus
- the risk of it not being included because the block chain has been overtaken.

That means that if f() is continuous we can actually find an optimal P for any given txn and B. (In the formulas I use myself, I actually use t for time; however, B is Bitcoin's representation of time, so it's essentially just a different unit.)
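A quick numerical sketch of that optimization in Python. The shape of f() below is invented purely for illustration (the real one would have to be measured), but it shows the optimum landing close to 1, matching the note further down that an optimal P will likely be near 1:

Code:
def f(txn_kb, B, P):
    """Hypothetical fee needed to confirm a txn_kb-sized transaction
    within B blocks with probability P; grows steeply as P -> 1."""
    return 0.0001 * txn_kb / ((1.0 - P) * B)

def total_cost(txamount, txn_kb, B, P, PA=0.0):
    return (f(txn_kb, B, P)        # the fee paid
            + txamount * (1 - P)   # risk: not included, fee too low
            + txamount * PA)       # risk: block chain overtaken

txamount, txn_kb, B = 10.0, 1.0, 6
grid = [i / 10000 for i in range(1, 10000)]  # candidate P values in (0, 1)
best = min(grid, key=lambda P: total_cost(txamount, txn_kb, B, P))
print(best)  # -> about 0.9987 with these made-up numbers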

At the moment PA is very, very small. This is because seigniorage provides an added incentive to mine, leading to much, much higher hash rates.

As minting is phased out, the hashing rate will come down, and PA will rise. At some point it will start to be noticeable either as a risk by observant users or because an attack has actually taken place.

In terms of the P2P client... For now it seems reasonable to assume that PA is for all intents and purposes zero. And if we can indeed figure out a good approximation function for f(), we should be able to calculate the recommended fee based solely on txn and B. Note that an optimal P will likely be fairly close to 1, because we're counting the tx not getting confirmed at the requested time as a total loss (which is an exaggeration). But it allows us to ask the user "How fast do you want this to get confirmed?" and it will almost always get confirmed at least by that time.

In terms of future solutions... I expect insurance providers to crop up very soon. Even with PA = 0, they can still provide a useful service by offering t < 10min, something the P2P client can't offer. As soon as PA starts rising that will simply become another selling point for them.


Disclaimer: I actually suck at math, so feel free to point out any errors in my formulas/reasoning. :)
legendary
Activity: 1526
Merit: 1134
May 18, 2011, 10:56:00 AM
#23
Yes, that sets the minimum fee at the cost of processing and verifying the transaction. It doesn't set any particular level of mining though.

How much work is enough? I think that can only be found by lowering the security of the network until we start seeing transactions be reversed by botnets and such, at that point people (or insurance companies) will start paying more fees in order to reduce or eliminate the reversals.

Perhaps we need to be thinking in terms of a minimum fee for acceptance (ie, the cost of processing the transaction) and an additional security fee for mining. Gavin's proposal of looking at average fees makes a lot of sense for discovering what the current cost of processing a tx is, if nobody is paying big fees for work done. If the average is being distorted by transactions moving millions of BTC around, then you might end up paying more than the actual cost of processing transactions.
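The distortion is easy to picture; a median would resist it where a mean does not (just an illustration of the worry, not a proposal from the thread):

Code:
from statistics import mean, median

typical = [0.0005] * 95   # ordinary transactions
whales = [0.05] * 5       # a few huge transfers paying outsized fees
fees = typical + whales

print(mean(fees))    # ~0.003, six times the typical fee
print(median(fees))  # 0.0005, unmoved by the whales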
legendary
Activity: 1652
Merit: 2311
Chief Scientist
May 18, 2011, 10:00:13 AM
#22
Okay I totally don't understand. If the block size is not a limit then why would anybody drop a tx that they had already paid the ECDSA price for?

They haven't paid the ECDSA price.  The decision is "I know how big this transaction is, how many OP_CHECKSIG opcodes I'll have to compute to verify it, and how much it pays in transaction fees.  Should I do the work of verifying it or should I just ignore it?"


@ribuck:  yes, the UI would be much simpler, but internally the client needs a model of what the miners are accepting.  Maybe a really simple internal model will work if the UI is really simple...
WNS
newbie
Activity: 39
Merit: 0
May 18, 2011, 08:55:43 AM
#21
In the long run, block size will NOT be the bottleneck, so it will NOT determine the marginal cost of a transaction.

The bottleneck is, and I believe will continue to be, the number of ECDSA signature verifications a miner can perform per second.  Each miner (or mining pool operator) will have a transaction processing capacity of N transactions per second.

If there are more than N transactions per second going across the network, then the smart miners will select the most profitable N for inclusion in their blocks and drop the least profitable.

Okay I totally don't understand. If the block size is not a limit then why would anybody drop a tx that they had already paid the ECDSA price for?
donator
Activity: 826
Merit: 1060
May 18, 2011, 08:26:09 AM
#20
Users want to know what fee to pay, given the constraints "I want this transaction confirmed in B or fewer blocks with probability greater than P"...

Code:
fee = f(txn, B, P)

That might be true for current users. Future users won't want anything more complicated than "I want this transaction to be confirmed (a) in about an hour, or (b) in about a day."

Or even better, just a checkbox labelled "Priority transaction (fee applies)".
legendary
Activity: 1652
Merit: 2311
Chief Scientist
May 18, 2011, 07:57:24 AM
#19
In the long run, block size will NOT be the bottleneck, so it will NOT determine the marginal cost of a transaction.

The bottleneck is, and I believe will continue to be, the number of ECDSA signature verifications a miner can perform per second.  Each miner (or mining pool operator) will have a transaction processing capacity of N transactions per second.

If there are more than N transactions per second going across the network, then the smart miners will select the most profitable N for inclusion in their blocks and drop the least profitable.

And the smart miners will keep track of how much it would cost them to invest in more CPUs or specialized ECDSA-verification hardware so they can process N+M transactions per second.  And figure out how much they would make in fees or side-deals (or whatever) when they handle those extra M transactions per second.  If it is profitable, they will increase their transaction processing capacity.
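That selection rule is simple to state in code. A sketch, using fee per signature verification as the profitability measure (my reading of "most profitable" given a verification bottleneck) and made-up numbers:

Code:
def select_transactions(mempool, sig_budget):
    """mempool: list of (txid, fee, n_checksigs) tuples.
    Greedy by fee per signature verification, within the budget."""
    ranked = sorted(mempool, key=lambda t: t[1] / t[2], reverse=True)
    chosen, used = [], 0
    for txid, fee, sigs in ranked:
        if used + sigs > sig_budget:
            continue  # not worth the verification work; drop it
        chosen.append(txid)
        used += sigs
    return chosen

mempool = [("a", 0.01, 2), ("b", 0.0005, 1), ("c", 0.02, 10), ("d", 0.0, 1)]
print(select_transactions(mempool, sig_budget=5))  # -> ['a', 'b', 'd']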


I think what bitcoin is missing right now is code in the clients to figure out the "right" amount of fees.  We're currently relying on hard-coded rules that match in the client and in the miners (because it was All One Application to start).  We need to move to something more dynamic.  Some thoughts I jotted down last night:

Users want to know what fee to pay, given the constraints "I want this transaction confirmed in B or fewer blocks with probability greater than P".

If we think of that as an equation:
Code:
fee = f(txn, B, P)
... then the question is: can a client estimate f by looking at the block chain and/or observing transactions as they fly across the network and (eventually) get included in blocks?  Or can we come up with a protocol to communicate f between clients and miners?
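One naive shape such an estimator could take, watching transactions and bucketing them by fee; everything here (names, data, bucketing scheme) is illustrative, not a concrete client design:

Code:
from collections import defaultdict

# (fee per kB, blocks until confirmed) pairs gathered by watching the network
history = [(0.0001, 30), (0.0001, 55), (0.001, 5), (0.001, 9),
           (0.005, 1), (0.005, 3), (0.01, 1), (0.01, 1)]

def estimate_fee(B, P):
    """Lowest observed fee bucket that confirmed within B blocks
    with empirical frequency >= P."""
    buckets = defaultdict(list)
    for fee, blocks in history:
        buckets[fee].append(blocks)
    for fee in sorted(buckets):
        hits = sum(1 for blocks in buckets[fee] if blocks <= B)
        if hits / len(buckets[fee]) >= P:
            return fee
    return max(buckets)  # nothing qualified; pay the most we've ever seen

print(estimate_fee(B=6, P=0.9))  # -> 0.005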

WNS
newbie
Activity: 39
Merit: 0
May 18, 2011, 07:30:00 AM
#18
A mining cartel wouldn't last very long. Miners inside the cartel would have the temptation to secretly mine a little on the side with a lower minimum fee in order to get some extra profits. Miners outside the cartel will also be more profitable than miners inside the cartel, because they get the same amount of money from the expensive transactions plus additional money from lower fee transactions.

Apparently I have not explained well what I mean by cartel, my scenario:

-the block bounties are effectively gone
-the block size is abolished
-almost nobody pays fees, because there is no competition for block inclusion
-mining is slightly better than break even, everyone who can't pay their bill already dropped out
-some group(pool operator,company etc) has >20% of hashes
-that group decides not only not to accept free tx for their blocks, but also to refuse to validate any block that contains free tx

result:

-20%+ of blocks will be, by default, paid only, since they are generated by the cartel
-submitting blocks with no free transactions results in a 25%+ advantage in a validation race (when two blocks are submitted to the network close together)
-other miners, in a low-profit fee-as-donation system, don't want to take a 25%+ hit in races, so they don't submit free tx, gaining an advantage as well as creating a de facto barrier to entry for transactions.
-once they are on board with not accepting frees, it's obvious that some, probably more than half of the cash-strapped market, will join the revolution and consign all pending frees to the tx pool forever.
-mining is now more profitable
BUT
-the market is broken because the miners are incentivized to continue to raise the entry barrier until they face outside (credit card) transactional cost competition.
-the market is also broken due to an ever-growing pool of permanently abandoned tx that are blocked by arbitrarily imposed barriers.

A rule would be great if we had any basis for deciding one...

Which is exactly why I proposed modulated, enforceable protocol rules that set the block size to be larger than the paid market.

If free riders don't like the delay they can include fees, get their tx processed, and indirectly increase the size of the paid market by paying, thereby allowing more free riders to ride, and the miners to make more money.

In my proposal everybody's interests are aligned. Each group competes amongst themselves, but the users and the producers are not at odds, and they can all make well-informed decisions about the likely consequences of their movements in the market.

Quick sidenote: Note that if your rule creates fees that are too low for a long period of time, the whole mechanism of merchants buying insurance, which raises prices, will still kick in. So you wouldn't do very much damage, only introduce a bit of uncertainty about how/when the rule is going to be changed. It's only when you accidentally set it too high that you cause a massive amount of wasted resources, including extra CO2 emissions, wasted electricity and mining hardware.

If the equation for adjustment is part of the protocol, and the recalculation is at a set interval (like everything else in the protocol), then everybody will be able to trivially predict the date and outcome of the recalculation.

Nobody would be able to impose any rules on the system. The only person who wants to impose a rule is you. I'm arguing against instituting a rule.

There are two ways to get a rule applied in bitcoin:
-add it to the protocol
-game the validation system

If you refuse the first, and then create the conditions such that the second is the only way to profit as a miner, then you are setting a rule through negligence. I prefer that we consider these things proactively.

Insurance companies would have no influence on mining fees and neither would miners. The only factor influencing mining fees would be how much users (particularly merchants) are willing to pay to make sure their transactions get confirmed. If the network came under attack, they would be willing to pay more. If everything was running smoothly, they would be willing to pay less. That way, the hashing rate would auto-regulate depending on the actual real-life cost that attacks incur upon the Bitcoin economy.

If mining is unprofitable, and free transactions are unreliable, I suggest that a failure scenario is much more likely than any sort of market correction (since there would, of course, be no market, since there is no limit or marginal cost to the commodity in question).

The block size limit may not have been intended to create the market scarcity necessary for the maintenance of the network after block bounties drop off, but it does do that. I don't see why we should just throw it out, instead of setting up a few other trivial rules to deal with its single shortcoming.
WNS
newbie
Activity: 39
Merit: 0
May 18, 2011, 06:20:19 AM
#17
There is a very real possibility that if the block size restriction is eliminated entirely, the transactional funding of the pool will force out all but the hard-core fanboys and the most efficient pros. This gives the pros a strong incentive to collude to not accept any blocks which contain free/nearfree transactions.
Many people think this, when they first approach bitcoin.  But either way (with/out blk sz limit), free/near-free transactions give miners a userbase, which gives bitcoins their value.  Turning away users is a clear negative incentive.

I predict that the free TX area will increase in size, even as it suffers a Tragedy of the Commons, because we want to pull in more users.

Having a userbase is only important to miners if processing transactions is profitable; without blk sz limits there is no market for transactional fees. The only way for miners to force mining to be profitable is to set an entry bar by colluding to block all free/nearfree traffic.
legendary
Activity: 1526
Merit: 1134
May 18, 2011, 02:03:59 AM
#16
I think we should distinguish between the rules that are in place now, vs the rules that are likely to be in place in future.

The rules that exist now can and probably should be simplified. Sipa has been making some proposals for this recently that sound reasonable. Fees aren't really an incentive to mine today because the income from them is so low; it's all being driven by inflation.

The rules that are in place in the future are what this thread is about, and I tend to agree that the block size limit should basically be some kind of multiple of recent average block sizes (over the last two weeks * 10 for example).

A moving block size limit gives some basic troll-resistance without constraining the market. It would, assuming the current set of rules for block inclusion never changes, result in fees approaching zero very fast, but nobody says miners have to use the current rule set. It's just a convenient bunch of defaults. Even if some miners use the default rules and include every transaction that fits into the block, other miners may choose to only include transactions that pay them specifically (so they are guaranteed to claim those fees eventually), which can lead to the insurance model.
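For concreteness, the moving limit might be computed like this (the window and multiple are the ones floated above; everything else is illustrative):

Code:
BLOCKS_PER_TWO_WEEKS = 2016  # at one block per ten minutes
MULTIPLE = 10

def next_block_size_limit(recent_sizes):
    """Cap the next period's blocks at a multiple of the recent average."""
    window = recent_sizes[-BLOCKS_PER_TWO_WEEKS:]
    return MULTIPLE * sum(window) // len(window)

sizes = [200_000] * 2016              # a steady run of 200 kB blocks
print(next_block_size_limit(sizes))   # -> 2000000 bytes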
full member
Activity: 234
Merit: 100
AKA: Justmoon
May 17, 2011, 08:59:57 PM
#15
Under this scenario the pros, who have much capital invested, would make a small residual profit, unless they enforce, through cartel, rules which allow them to collude to inflate the price of transactions over time until it reaches parity with the next most efficient system of monetary transaction, where it will stabilize due to outside competition.

A mining cartel wouldn't last very long. Miners inside the cartel would have the temptation to secretly mine a little on the side with a lower minimum fee in order to get some extra profits. Miners outside the cartel will also be more profitable than miners inside the cartel, because they get the same amount of money from the expensive transactions plus additional money from lower fee transactions.


If, on the other hand, we can agree on a mechanism to modulate blocksize to the market, then we can have a profitable mining pool, by having a market for transactional priority, thereby encouraging competition between miners, instead of collusion against users.

A rule would be great if we had any basis for deciding one. You will have to predict:

- Fixed/variable costs of mining
- Economies of scale of mining
- Size of the Bitcoin economy
- Scale of ideological threats to Bitcoin
- Scale of criminal threats to Bitcoin
- Price elasticity on consumer side for fast confirmations

If you get one of these wrong by, say, a factor of ten, fees will be too high or too low by a factor of ten. Even if you get everything right for the year 2016, you might still be wrong by a factor of ten for the year 2021.

Sure you can say that developers could get together every year to update the rule. But then you've created a political body governing the prices for Bitcoin.

Quick sidenote: Note that if your rule creates fees that are too low for a long period of time, the whole mechanism of merchants buying insurance, which raises prices, will still kick in. So you wouldn't do very much damage, only introduce a bit of uncertainty about how/when the rule is going to be changed. It's only when you accidentally set it too high that you cause a massive amount of wasted resources, including extra CO2 emissions, wasted electricity and mining hardware.


The best thing about bitcoin is that it is based on rules which allow users and miners to make plans based on the future state of the market. If we change the rules in such a way that it encourages/enables some players to impose their own rules on the market, then we have broken the system; it goes from the defensive democracy of preserving the rules to the oppressive democracy of colluding to gouge the market.

Nobody would be able to impose any rules on the system. The only person who wants to impose a rule is you. I'm arguing against instituting a rule.

Insurance companies would have no influence on mining fees and neither would miners. The only factor influencing mining fees would be how much users (particularly merchants) are willing to pay to make sure their transactions get confirmed. If the network came under attack, they would be willing to pay more. If everything was running smoothly, they would be willing to pay less. That way, the hashing rate would auto-regulate depending on the actual real-life cost that attacks incur upon the Bitcoin economy.
member
Activity: 98
Merit: 13
May 17, 2011, 08:41:00 PM
#14
There is a very real possibility that if the block size restriction is eliminated entirely, the transactional funding of the pool will force out all but the hard-core fanboys and the most efficient pros. This gives the pros a strong incentive to collude to not accept any blocks which contain free/nearfree transactions.

Many people think this, when they first approach bitcoin.  But either way (with/out blk sz limit), free/near-free transactions give miners a userbase, which gives bitcoins their value.  Turning away users is a clear negative incentive.

I predict that the free TX area will increase in size, even as it suffers a Tragedy of the Commons, because we want to pull in more users.

WNS
newbie
Activity: 39
Merit: 0
May 17, 2011, 06:50:22 PM
#13
This thread is about pointing out that there is also a null solution. In other words, we as developers don't have to do anything. The market will negotiate the optimal level of security.

The null solution is leaving the current block limit in place, which is a pointless restriction on the size of the market. While that is "bad", the other choice you propose is not better.

I agree that the "market" - in this case the mining market, not the BTC market - will make decisions about how the rules end up determining the size of the mining pool, but that does not mean that those decisions are of necessity not detrimental to the currency itself.

There is a very real possibility that if the block size restriction is eliminated entirely, the transactional funding of the pool will force out all but the hard-core fanboys and the most efficient pros. This gives the pros a strong incentive to collude to not accept any blocks which contain free/nearfree transactions. The fanboys, even if they want to throw all the outstanding frees into their block, will leave them out so they have a better chance of their block being accepted.

Under this scenario the pros, who have much capital invested, would make a small residual profit, unless they enforce, through cartel, rules which allow them to collude to inflate the price of transactions over time until it reaches parity with the next most efficient system of monetary transaction, where it will stabilize due to outside competition.

If, on the other hand, we can agree on a mechanism to modulate blocksize to the market, then we can have a profitable mining pool, by having a market for transactional priority, thereby encouraging competition between miners, instead of collusion against users.

The best thing about bitcoin is that it is based on rules which allow users and miners to make plans based on the future state of the market. If we change the rules in such a way that it encourages/enables some players to impose their own rules on the market, then we have broken the system; it goes from the defensive democracy of preserving the rules to the oppressive democracy of colluding to gouge the market.

The need for mining means that scarcity will be enforced; whether it is enforced by the protocol or by the collusion of miners is a matter of protocol definition. If we don't define the protocol in such a way that miners can remain profitable, they will rip the market apart (artificially remove the low transaction cost) to get whatever they can out of their investment, which does not strike me as the best long-term plan.
full member
Activity: 234
Merit: 100
AKA: Justmoon
May 17, 2011, 08:18:03 AM
#12
All these concerns about future protocol failure seem to be predicated on the assumption that there is not a trivial solution

We know there is a trivial solution. It's to have developers decide on limits and rules to estimate the correct level of security.

This thread is about pointing out that there is also a null solution. In other words, we as developers don't have to do anything. The market will negotiate the optimal level of security.

We still have to worry about miners creating huge blocks, but this is less of a problem because there is no economic incentive to create unnecessarily large blocks (see [mike]'s proposal for sharing Bitcoin's work if you need to timestamp large amounts of data for your custom application). So you only need to prevent individual "troll" miners from blowing up the block chain to an unreasonable size by themselves. For that you could indeed use some kind of limit that grows with the average of the last 2016 blocks, like the difficulty does.

Again, if the block size limit only needs to protect against spam rather than regulate network security, what that means is that it can be much, much higher and much less deliberately chosen, and everything still works fine.

Perhaps the most important point is not to try to prevent double spends. Double spends are fine; they are simply a cost that comes from the imperfection of the real world. If we aim for a hash rate that prevents all double spends, it will be far, far more costly than paying for the double spends that occur at the margins. More importantly, if we aim for a higher hash rate, we get no feedback about changing conditions. So instead of the hash rate adapting to slightly increasing or decreasing rates of double-spend attacks, we would have absolutely no signals to base our decisions on.
WNS
newbie
Activity: 39
Merit: 0
May 16, 2011, 08:33:01 AM
#11
All these concerns about future protocol failure seem to be predicated on the assumption that there is not a trivial solution. I propose this off the top of my head:

1) the default protocol implements the rule that if the transaction pool is 110% of the block size at least 60 sec before a solution is found, no non-full (with .5k grace) block will be accepted as valid -- no permanent refusal of non-paid blocks -- accepting free transactions is part of getting paid

2) at reward reassessment points the blocksize is recalculated to be the smallest n^2 * 1M block that can hold the entire average daily PAID transaction pool size -- all paid transactions get done, but the pool is kept too small to guarantee that all free transactions are cleared quickly.
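Taking "smallest n^2 * 1M" literally, rule 2 could be computed like this; the per-block split (144 blocks a day) is standard arithmetic, but the reading of n^2 and the sample volume are my own assumptions:

Code:
from math import ceil, sqrt

MB = 1_000_000

def recalculated_block_size(avg_daily_paid_bytes, blocks_per_day=144):
    per_block = ceil(avg_daily_paid_bytes / blocks_per_day)
    n = max(1, ceil(sqrt(per_block / MB)))  # smallest n with n^2 MB >= demand
    return n * n * MB

# e.g. 300 MB of paid transactions per day -> ~2.08 MB per block -> 4 MB cap
print(recalculated_block_size(300 * MB))  # -> 4000000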

Even without a block size increase, an average fee of .005 BTC in a deflationary economy would make mining at least as profitable as it is right now.

If these two simple, backward-compatible rules were added to the default client, they would reach saturation long before the rules are needed, and we can stop all the hand-wringing in the meantime.