
Topic: SatoshiDice, lack of remedies, and poor ISP options are pushing me toward "Lite" (Read 7372 times)

hero member
Activity: 756
Merit: 522
MPEx may be a big fish, but  bitcoin as a whole is a very small pond presently.

No argument there. At issue is the blind conviction of most everyone that Bitcoin has to be a sort of universal payment processor directly (possibly because all the payments these people ever engage in are to the tune of ten bucks).

In fact consumer-level transactions, those five dollars for a moccachino, those nineteen dollars for a new pair of socks, those three hundred dollars for a new set of dildos and handcuffs constitute an area where Bitcoin's advantages are significantly dimmed (not reversible?! ouch), Bitcoin's disadvantages significantly magnified (wait up to six hours for a transaction to clear?! srsly!?), and currently existing infrastructure is well adapted (hey, Visa already does a billion transactions a week, why bother to put all the hard work into supplanting them? Why replace all the billion piece-of-shit point of sale units they already have in the field? Guess what, they're not pieces of shit because corporations are evil, they're pieces of shit because when you do retail and interact with consumers you have to REALLY keep costs down).

Bitcoin's advantages shine at the other end of the spectrum. If I have to pay my Chinese suppliers for eight containers of socks or dildos or handcuffs or whatever else, I currently have to wait for a month or more just to obtain some bank's permission. The system of handling large payments is so complex, so risky, so inefficient, and so downright insulting to the customer that I couldn't begin to tell you. That's where Bitcoin's advantages really matter, and as long as we gain market share there, it makes absolutely zero difference if fifty or fifty million coffee shops start taking or stop taking Bitcoin. It may be a fact that mostly nobody currently involved with Bitcoin has ever paid for anything outside of retail channels. That happenstance doesn't make retail the only thing that exists.

I know that most everyone would like Bitcoin to be a currency for the masses, because most everyone actually is the masses. This still has no bearing, much like the case of the duckling that sat on a dragon egg. You can't expect the dragon to be sitting around the pond playing with your duckling friends, now can you? It's a dragon, it has dragony shit to do!

Store of value? Great, sure, forever. Ultimate unit of account, gateway to real finance as opposed to the wrestling show put up by Wall Street? Sure. Payment processor? Sure, if you're buying a plane. If you're buying a cup of coffee enjoy it while it lasts, but don't expect it to last forever. It just makes no sense to buy your cup of coffee in Bitcoin (tho it may make sense to buy a Bitcoin's worth of store tokens once a month, and from there on the gate is wide open to Bitcoin-based payments, using bitcoin-backed private currencies, such as United States Dollars, Unified Store Dingdongs or Universal Spurious Dobaloos. It's just that they won't be put through the blockchain, because there's no need to put them through the blockchain and no benefit to doing so.)
legendary
Activity: 1400
Merit: 1013
MPEx may be a big fish, but  bitcoin as a whole is a very small pond presently.
hero member
Activity: 756
Merit: 522
I generally fall in and agree with gmaxwell that this is a core economic rule and should not be changed (just like any other of bitcoin's economic rules, such as the block reward).
The first successful 51% attack is going to be a company that wants to use Bitcoin to get some real work done, but discovers a vocal group of miners who dream of being the next Goldman Sachs blocking them with silly size restrictions on the block chain.

That company will dig through its couch cushions and find a billion dollars per year to experiment with and will successfully out-mine the rest of the network, placing more commerce-friendly rules in play and displacing everyone who came before.

Unless the network makes the changes ahead of time.

And you know this because why?

MPEx is the real company getting real work done. No need to 51% the network (yet).
legendary
Activity: 1400
Merit: 1013
I generally fall in and agree with gmaxwell that this is a core economic rule and should not be changed (just like any other of bitcoin's economic rules, such as the block reward).
The first successful 51% attack is going to be a company that wants to use Bitcoin to get some real work done, but discovers a vocal group of miners who dream of being the next Goldman Sachs blocking them with silly size restrictions on the block chain.

That company will dig through its couch cushions and find a billion dollars per year to experiment with and will successfully out-mine the rest of the network, placing more commerce-friendly rules in play and displacing everyone who came before.

Unless the network makes the changes ahead of time.
hero member
Activity: 756
Merit: 522
I also think that it is a non-issue, as one day solutions such as Open Transactions will provide secure off-chain transactions (and in the future we will see Bitcoin transactions used for settlement between OT servers and issuers).

For the record, MPEx's off-chain transaction system (the PUSH commands) is already moving more value daily than any of the altchains, and possibly more than all the altchains combined (of course, it's mostly LTC and NMC that have any value to speak of as it is).
legendary
Activity: 1222
Merit: 1016
Live and Let Live
I believe that insurance-based network security services would naturally keep the network secure (while using the minimal amount of work).

That said, I generally fall in and agree with gmaxwell that this is a core economic rule and should not be changed (just like any other of bitcoin's economic rules, such as the block reward).

I also think that it is a non-issue, as one day solutions such as Open Transactions will provide secure off-chain transactions (and in the future we will see Bitcoin transactions used for settlement between OT servers and issuers).
hero member
Activity: 555
Merit: 654

Second, there are other solutions to the tragedy of the commons "problem" with fees. One of them is fee confiscation. This is another excerpt from the paper:

10.6.1. Fee confiscation. In this scheme, part of the fees collected by a miner gets "confiscated". When a transaction with fee f is included in a block, the miner applies a predefined multiplier x to the fee f. The miner can only collect x*f and the rest is confiscated. The multiplier x is always less than or equal to 1. The longer the message, the lower the multiplier. The slower the cryptographic operations required by the transaction, the lower the multiplier.
As an example, if CPU usage were the only factor considered when calculating the cost of a transaction to the network, then a transaction which requires 100 times more time to evaluate than another would have a multiplier that is 100 times lower. Because miners always choose the transactions that give them the highest reward, users would be forced to compensate for the confiscation by increasing the fees by the same factor for those commands.
CPU usage is not the only factor to consider when calculating the cost of a transaction to the network as a whole. All the expensive resources already described must be considered in order to design a realistic function that takes into account average costs and tries to anticipate how those costs will evolve in the future.
As fees are reduced by the multipliers, it is necessary to return the remaining money (1-x)*f to the network to avoid destroying it. One possible solution is to accumulate all the remaining fees and set up a prize to be awarded to the miner of a following block. To prevent a miner from trying to delay broadcasting a block in order to mine the next one, we can set up the prize to be awarded to the miner of a block some blocks ahead (e.g. ten blocks).
This automatic prize generation may give a casual miner an incentive not to include so many transactions, since a fixed reward (higher than the transaction fees) may exist. But since including transactions in a block requires very few resources, there is no reason not to include all known transactions and collect all possible fees. For miners who have a high percentage of the network's computing power (like mining pools), obviously no such incentive exists, since including fewer transactions implies being awarded less money as prizes in subsequently mined blocks.
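
A minimal sketch of the confiscation scheme described above, assuming a simple cost-based multiplier and the ten-block prize delay; the cost model, names and numbers here are illustrative stand-ins, not the paper's exact formulas.

Code:
from collections import deque

BASELINE_COST = 1.0   # cost of an "ordinary" transaction (assumed unit)
DELAY = 10            # confiscated fees are awarded to the miner 10 blocks later

def multiplier(tx_cost):
    """x <= 1: the more costly the tx is for the network, the lower x."""
    return min(1.0, BASELINE_COST / tx_cost)

pending_prizes = deque([0.0] * DELAY)

def mine_block(txs):
    """txs: list of (fee, cost). Returns what this block's miner collects."""
    kept = sum(fee * multiplier(cost) for fee, cost in txs)
    confiscated = sum(fee * (1 - multiplier(cost)) for fee, cost in txs)
    prize = pending_prizes.popleft()    # prize accumulated DELAY blocks ago
    pending_prizes.append(confiscated)  # this block's confiscated fees, paid out later
    return kept + prize

# A tx that is 100x as costly keeps only 1/100 of its fee, so the user must
# raise the fee ~100x for the miner to earn the same amount as a normal tx.
print(mine_block([(0.001, 1.0), (0.1, 100.0)]))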

Would anyone like to write a book with the discussions of these threads? I imagine the book would be called something like
"Agreement without Trust: Inside the Bitcoin P2P community" (by P2PMaster, 2013), 0.99 BTC, paperback.

hero member
Activity: 555
Merit: 654
I will add my two cents to the discussion.
First, regarding miners' incentives, this is an excerpt from my paper "Mavepay".
(It adds little new matter to what was said, only the idea that miners could be required to do more work than the rest of the nodes.)

From the game-theoretic point of view, Bitcoin miners do not have a strong incentive to protect end users' resources. In the long term, miners may want to protect the end users in order to maintain the value of their savings in the virtual coin, and the fixed cost of the infrastructure acquired for mining. But miners can at any time sell their coins and start mining for other P2P currencies, so the incentive is not strong enough. In the short term, they compete to collect fees, even if the transactions included in a block impose a high workload on the end users. If miners were forced to store, transfer or compute much more data than end users, then they would choose transactions of shorter length and lower CPU usage. In Bitcoin, transactions are checked only once before the block mining process can start, and the quality and quantity of the transactions included in a block does not significantly alter a miner's winning probability. The block size may slightly affect the propagation time of a block across the network, and so may reduce the chances of a miner winning over a competing, smaller block. But currently this is not a limiting factor on miners, since blocks travel fast, and the diameter of the Bitcoin network is low. Also, if a greater time is required to check a transaction, then the time when the block mining can effectively begin is postponed by that same amount. But transactions are checked only once, and each block mined requires a hashing effort orders of magnitude higher than the time required for transaction verification. So the CPU resources used in transaction verification during mining have little effect on the block cost and almost no effect on miners' revenue.
legendary
Activity: 1246
Merit: 1077
If you use a properly designed client (electrum for instance)  then you sacrifice no security, only privacy (arguably).
Pedantically, an electrum server can put you on a not-longest fork, and from there could give you fake payments which look confirmed (electrum is gradually getting better too, so in six months this may no longer be true).

More generally, if most Bitcoin users follow your reasoning, bitcoin itself will be insecure— subject to theft, inflation, etc.  Because the marginal personal security/privacy cost of running a reduced node is low (and likely to be underestimated due to hyperbolic discounting and underestimation of small risks), we have a tragedy of the commons risk where almost no one honest runs full nodes (except attackers, as they have a clear incentive!) and the network dies.  To address this we should strive to minimize the cost of running a full node (relative to technology) and we should produce altruistic software which automatically runs a full node on hardware that can handle it without burdening the user.

Wasn't there a proposal before to pay full nodes? That idea could possibly be explored more, though it would require a hard fork.
staff
Activity: 4242
Merit: 8672
If you use a properly designed client (electrum for instance)  then you sacrifice no security, only privacy (arguably).
Pedantically, an electrum server can put you on a not-longest fork, and from there could give you fake payments which look confirmed (electrum is gradually getting better too, so in six months this may no longer be true).

More generally, if most Bitcoin users follow your reasoning, bitcoin itself will be insecure— subject to theft, inflation, etc.  Because the marginal personal security/privacy cost of running a reduced node is low (and likely to be underestimated due to hyperbolic discounting and underestimation of small risks), we have a tragedy of the commons risk where almost no one honest runs full nodes (except attackers, as they have a clear incentive!) and the network dies.  To address this we should strive to minimize the cost of running a full node (relative to technology) and we should produce altruistic software which automatically runs a full node on hardware that can handle it without burdening the user.

hero member
Activity: 560
Merit: 500
I am the one who knocks
TL;DR ^^^

If this has been stated before then please smack me.


It is important to remember the difference between privacy and security.  They are different, although often confused.

If you use a properly designed client (electrum for instance)  then you sacrifice no security, only privacy (arguably).

The worst thing a central server could do to you is lie; it could not permanently alter the blockchain.  Because you broadcast your TXs through the server, they could potentially track which addresses you own, although I think you can tunnel Electrum through Tor if that is a concern.
legendary
Activity: 1246
Merit: 1077
Not quite, the accepting side would still see the mining of the non-accepting side. It's not mutually exclusive.

Depends which blockchain is longer, right?  Isn't that the whole point of requiring more than 51% of the hashrate?  If the blockchain with the >1MB blocks is longest, the clients on the accepting side will not see the mining of the non-accepting side because they will be working on a different blockchain.  But if the blockchain with the <=1MB blocks is longest, both clients will see the same thing & be working on the same blockchain.  Hence why it'd make sense to do it now, because there aren't any blocks being generated anywhere near the 1MB limit.  So both clients could co-exist for the time being.  There wouldn't be any issue until a >1MB block is generated & accepting nodes own more than 50% of the hashrate.
The point is, at 50/50, the non-accepting side will eventually win out, because the accepting side still accepts their blocks. At a certain point, their chain will be longer. At 51% or above, the accepting side should eventually carve out their own blockchain, but it might take a while.
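
A toy simulation of that dynamic (my own sketch with made-up parameters, not anything from the thread): the accepting side builds on the longest chain it considers valid, the non-accepting side only ever extends the <=1MB chain, and the accepting side reorgs back whenever the old chain pulls ahead.

Code:
import random

def simulate(p_big, blocks=100_000, seed=7):
    """p_big: share of hashrate running clients that accept >1MB blocks."""
    random.seed(seed)
    small, big = 0, 0        # tip heights: old-rules chain vs. big-block fork
    forked = False           # does a separate >1MB fork currently exist?
    wiped_out = 0
    for _ in range(blocks):
        if random.random() < p_big:
            big = max(big, small) + 1   # build on the longest tip they consider valid
            forked = True               # ...and this block may exceed 1MB
        else:
            small += 1                  # old-rules miners never extend the big fork
        if forked and small > big:      # old chain overtakes: accepting nodes reorg
            big = small
            forked = False
            wiped_out += 1
    return wiped_out, big - small       # times the fork died, final lead of the fork

for p in (0.45, 0.50, 0.55):
    print(p, simulate(p))   # <=50%: the fork keeps getting wiped out; >50%: it slowly pulls away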
sr. member
Activity: 247
Merit: 250
Not quite, the accepting side would still see the mining of the non-accepting side. It's not mutually exclusive.

Depends which blockchain is longer, right?  Isn't that the whole point of requiring more than 51% of the hashrate?  If the blockchain with the >1MB blocks is longest, the clients on the accepting side will not see the mining of the non-accepting side because they will be working on a different blockchain.  But if the blockchain with the <=1MB blocks is longest, both clients will see the same thing & be working on the same blockchain.  Hence why it'd make sense to do it now, because there aren't any blocks being generated anywhere near the 1MB limit.  So both clients could co-exist for the time being.  There wouldn't be any issue until a >1MB block is generated & accepting nodes own more than 50% of the hashrate.
staff
Activity: 4242
Merit: 8672
Let's assume 50% of the miners/clients were running bitcoin instances that accepted blocks over 1MB.  It would look like the hashrate dropped in half to both sides.
Not quite, the accepting side would still see the mining of the non-accepting side. It's not mutually exclusive.

Quote
Basically 2 different versions of bitcoin would exist.  Mtgox, blockchain, slush, deepbit, etc would all have to decide what side to take.  Or they could even fight on both sides.  Technically both could exist indefinitely.
Of course, the value of bitcoin depends on it not bifurcating. That would be a maximally bad outcome: basically everyone with funds before the split would double their funds. So that's obviously highly unstable.

Quote
Same problem happens if advances in quantum computing make people want to use a different encryption method.
We don't use encryption in bitcoin. Perhaps you meant signatures? We can actually upgrade signature algorithms without a hard fork.
sr. member
Activity: 247
Merit: 250
I'm not sure if you can really call it a "hard" fork.  Some people could change the limit today without really affecting the network since we aren't really hitting the limit yet.  And it would make the most sense for us to change it now so by the time people start hitting the 1MB limit, it won't be an issue.

Regardless, it isn't really up to the core developers anymore.  The developers at slush or deepbit could change the limit today.  It'd be risky to try to propagate a block today over the limit, but if the larger miners got together and decided on a date to collectively change the limit, there'd be a higher success rate.

Wrong.

All bitcoin clients will reject a block above 1MB, regardless of what any miner produces.

Thus, it would take a hard fork to change the maximum block size.

This makes very little sense. Why not code the limit into miners only, and have the client simply prefer to relay smaller blocks over larger ones?

Miners & clients are the same thing.  Even if you aren't necessarily trying to find the next block, if you are running the client, you are still voting on what blocks are valid & which ones aren't and propagating them appropriately.  The problem isn't that blocks over 1MB would be too large for miners/clients to propagate.  The problem is that they would see the larger blocks as INVALID.

Let's assume 50% of the miners/clients were running bitcoin instances that accepted blocks over 1MB.  It would look like the hashrate dropped in half to both sides.  Basically 2 different versions of bitcoin would exist.  Mtgox, blockchain, slush, deepbit, etc would all have to decide what side to take.  Or they could even fight on both sides.  Technically both could exist indefinitely.  The prices would probably even be different between the two.  Nothing is stopping that from happening...even today.  Same problem happens if advances in quantum computing make people want to use a different encryption method.
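
A bare-bones illustration of the validity rule being discussed (hypothetical code, not the reference client): each node enforces its own limit when it receives a block, so hashpower alone cannot make an oversized block acceptable to old-rules nodes, and two node populations end up on two chains.

Code:
MAX_BLOCK_SIZE = 1_000_000   # bytes; the 1MB consensus limit in old-rules clients

def accept_block(block_bytes, old_rules=True):
    """Illustrative check only: real validation covers far more than size."""
    limit = MAX_BLOCK_SIZE if old_rules else 8 * MAX_BLOCK_SIZE  # assumed larger limit
    return len(block_bytes) <= limit

big_block = bytes(1_500_000)
print(accept_block(big_block, old_rules=True))   # False: rejected as INVALID
print(accept_block(big_block, old_rules=False))  # True: accepted, and the chain splits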
staff
Activity: 4242
Merit: 8672
I don't see how this applies to block size. Miners do not have a significant benefit in increasing the block size, which is the only extra power granted to them.
They would— it would allow them to accept more low-fee transactions, thus increasing their fee income in that single block, by making the space non-scarce— though in the long run it would make miners less, at the time of forming a block it would make them more. There would be no rational reason not to gobble up the fees as fast as you can unless there was cartel behavior to force lower sizes anyways. Maybe the cartel regulating miners would choose better limits, but it would mean that we'd have to trust the cartel to behave wisely rather than a set-in-stone rule which produces a market for fees by fixing the maximum size.

That is all in regard to why it should work the way it does— the reason it does work the way it does is just a result of the principle of the system. Miners are only depended on for ordering.
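
A rough sketch of that fee-market effect (illustrative numbers and code of my own, not from the post): with a hard cap the miner has to drop the cheapest transactions, so space is scarce and users must bid for it; without a cap every positive-fee transaction is worth stuffing in.

Code:
def select_txs(mempool, max_block_size=None):
    """mempool: list of (fee, size_bytes). Greedy selection by fee per byte."""
    txs = sorted(mempool, key=lambda t: t[0] / t[1], reverse=True)
    total_fee, used = 0.0, 0
    for fee, size in txs:
        if fee <= 0:
            continue
        if max_block_size is not None and used + size > max_block_size:
            continue            # scarce space: the low-feerate tail gets cut
        total_fee += fee
        used += size
    return total_fee, used

mempool = [(0.0005, 500), (0.0001, 400), (0.00001, 250)] * 2000
print(select_txs(mempool, max_block_size=1_000_000))  # capped: cheapest txs excluded
print(select_txs(mempool))                            # uncapped: gobble up every fee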
legendary
Activity: 1246
Merit: 1077
This makes very little sense. Why not code the limit into miners only, and have the client simply prefer to relay smaller blocks over larger ones?
Because the purpose of Bitcoin is to build a currency that substantially eliminates trust.  If Bitcoin users are forced by the technology to trust miners to do the correct thing— to not inflate the currency, to not destroy its decentralization, etc— then they can be easily disenfranchised. Not only would they be forced to trust (which is against the goals of the system), but without the enforcement by a great many users miners would have reduced incentives to not cheat in the various ways that the enforcement prohibits absolutely. E.g. why not increase the subsidy to 26 BTC? Because it would undermine confidence? pfft. They'd justify it by some argument about lost coins, expanding economy, importance of security and such, just like the inflation-producing governments and central banks do. Doing it would benefit all the miners, so why shouldn't they "vote" a raise for themselves? And a little bit of inflation at a time demonstrably doesn't undermine all confidence in a currency.

Of course, you do have the option to trust miners in exchange for a reduced validation cost to you by using a SPV node— but that option is only really viable because the miners are regulated by the many other participants who do verify.

Why have mining at all? Because we can't accomplish a decentralized currency without a way of providing for the ordering of transactions, and we don't know a way to provide ordering without some kind of vote or without centralization. So mining is just an attack resistant way of voting on the order of transactions. But we don't use mining for more than that because voting is not actually a good solution, it requires a kind of trust (though better than centralization)— and so for the non-ordering things that can be validated independently nodes do validate them independently. (Even SPV nodes should and do validate all that they can within the context of their permitted operating cost).
I don't see how this applies to block size. Miners do not have a significant benefit in increasing the block size, which is the only extra power granted to them. And in the case that there is a significant benefit, that probably means blocks are so tightly packed that increasing the block size would make sense anyways.
staff
Activity: 4242
Merit: 8672
This makes very little sense. Why not code the limit into miners only, and have the client simply prefer to relay smaller blocks over larger ones?
Because the purpose of Bitcoin is to build a currency that substantially eliminates trust.  If Bitcoin users are forced by the technology to trust miners to do the correct thing— to not inflate the currency, to not destroy its decentralization, etc— then they can be easily disenfranchised. Not only would they be forced to trust (which is against the goals of the system), but without the enforcement by a great many users miners would have reduced incentives to not cheat in the various ways that the enforcement prohibits absolutely. E.g. why not increase the subsidy to 26 BTC? Because it would undermine confidence? pfft. They'd justify it by some argument about lost coins, expanding economy, importance of security and such, just like the inflation-producing governments and central banks do. Doing it would benefit all the miners, so why shouldn't they "vote" a raise for themselves? And a little bit of inflation at a time demonstrably doesn't undermine all confidence in a currency.

Of course, you do have the option to trust miners in exchange for a reduced validation cost to you by using a SPV node— but that option is only really viable because the miners are regulated by the many other participants who do verify.

Why have mining at all? Because we can't accomplish a decentralized currency without a way of providing for the ordering of transactions, and we don't know a way to provide ordering without some kind of vote or without centralization. So mining is just an attack resistant way of voting on the order of transactions. But we don't use mining for more than that because voting is not actually a good solution, it requires a kind of trust (though better than centralization)— and so for the non-ordering things that can be validated independently nodes do validate them independently. (Even SPV nodes should and do validate all that they can within the context of their permitted operating cost).
legendary
Activity: 1246
Merit: 1077
I'm not sure if you can really call it a "hard" fork.  Some people could change the limit today without really affecting the network since we aren't really hitting the limit yet.  And it would make the most sense for us to change it now so by the time people start hitting the 1MB limit, it won't be an issue.

Regardless, it isn't really up to the core developers anymore.  The developers at slush or deepbit could change the limit today.  It'd be risky to try to propagate a block today over the limit, but if the larger miners got together and decided on a date to collectively change the limit, there'd be a higher success rate.

Wrong.

All bitcoin clients will reject a block above 1MB, regardless of what any miner produces.

Thus, it would take a hard fork to change the maximum block size.

This makes very little sense. Why not code the limit into miners only, and have the client simply prefer to relay smaller blocks over larger ones?
donator
Activity: 2058
Merit: 1054
Sponsoring the resources required for handling block size is a TotC problem in itself (the fee for including a tx is given to the miner that included it; the burden on the network is placed on all nodes). Which is why I earlier suggested some block size limit (or something equivalent) remains. But I'll be generous and assume the marginal resource cost finds a way to fund itself efficiently (e.g., "grassroots" rejection of large blocks).

This leaves the problem of sponsoring hashing. If the marginal cost of a transaction is C, including a transaction with fee C+epsilon is pure profit. There is no reason for a small miner not to include it.
But I pointed out this isn't true.  Including the transaction increases block size and therefore storage costs, increases time to propagate, and increases the chance of some other relay not liking it.

I'm disagreeing with the fundamental assumption you're making that it's "pure profit".
This is all supposed to be incorporated in C, the marginal cost per transaction. Some of what you mentioned is a bit abstract so there's room for debate. But it is still the case that the costs you mentioned relate indirectly to the number of transactions; letting it influence the incentives to hash, which is independent, is neither robust nor efficient.

Also, whatever the costs are to the network, the fact is they're being borne by the network.  If the network can't afford it, the network won't afford it.  I haven't seen a convincing (to me) refutation of these points.  If something isn't worthwhile people won't do it.  If it is, people will.

My gut feeling is that the block size limit should be removed, and let the free market reign.  It works elsewhere, why not for transaction processing?
It works when there's no TotC (meaning, the business pays for any externalities). When there is one (as we can clearly identify in this case), some method to resolve it is needed.
It's not clear to me, as I've tried to explain.  Just like producing widgets - if someone can't do it profitably, that isn't a reason to seek out a central authority (here the arbitrary block size limit) to "solve" the problem (by specifying a minimum price in the case of the widget, or maximum block size for a block).  They just go bust.
Sometimes it works, sometimes not. When incentives are aligned, the invisible hand works. But when there's an identifiable systemic reason why some business can't be profitable - and we want these businesses to exist - there is a need for some intervention.

This is even the case now - clearly block size is not a constraint at the moment, it might as well be infinite, as most blocks are way below the limit, and not even at the point where the reference client makes it "more expensive" to take up more space.  And yet the network isn't suffering, and also not every transaction is getting accepted in the immediately next block.  Because there are costs, lags, and delays, and that is their way of being expressed.  Some people moan, but they're the ones who will stop trying to do it (including qt client users who aren't adding much to the network).  If the "market" wants those users to keep doing it, the market will pay a higher fee to ensure they find it worthwhile.
The problem of sponsoring hashing hasn't even begun to manifest itself because of the coinbase, which is still more than sufficient for it. And txs are still few enough that the crude anti-spam rules (required fee for low-priority txs) are enough to keep the nodes from overloading.

Hashing is an artificially difficult problem and thus has an additional degree of freedom. Most things aren't artificially difficult; e.g., energy generation really is difficult, and if a more efficient method can be found to do it, people can enjoy cheaper energy. But if a more efficient way to hash is found, it favors attackers and the honest network equally.
By hashing, I was talking about the cost of rebuilding and rehashing the merkle tree, not the cost of hashing the block.  Sorry for any confusion; I feel that may have led you to misunderstand my argument.
I understood this is what you meant. AFAIK the cost of hashing into the Merkle tree is negligible in comparison to ECDSA verification; and even if not, it's part of what I call marginal cost, as opposed to the amortized cost of hashing block headers.

The degree of freedom is exactly as you described: If there's not enough profit, miners will quit. Which is exactly what I'm worried about; with fewer miners the network will be more vulnerable to hashrate attacks. I want to make sure there is enough revenue to fund the amount of hashing required to secure the network; and I argue that left to itself, the TotC problem will create a race to the bottom with little total revenue and low network hashrate.

There are several suggested solutions. I'm not saying none of them can work, I'm saying the problem shouldn't be swept under the carpet.
I'm not certain an unconstrained block size can work.  But I think it's highly likely it can, and I've not read anything to persuade me otherwise.
If the market wants something (here, more hashing power) it will pay for it, like anything else.

So why not give it a go?  If it's a disaster, there will be no problem getting the 50% consensus to put one back, right?
In most cases gradual changes are healthier. Switching instantly to no limit and then back to some limit can be disruptive. I think you're too charitable towards the free market; in those situations where it works, it works great, but there are situations where it doesn't - or at least, where the free market works only by deciding to constrain itself to be less free.

The study of game theory brings up some examples, especially wherever bargaining is involved.
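
To make the TotC point concrete, here is a toy calculation with invented numbers (none of them are from the thread): the including miner compares the fee only against its own marginal cost C, while the storage and verification burden of that transaction falls on every node.

Code:
C_miner = 0.00001    # marginal cost of one extra tx to the including miner (assumed, BTC)
c_node  = 0.00001    # marginal cost of that tx to each other full node (assumed, BTC)
n_nodes = 10_000     # full nodes bearing the externality (assumed)

fee = C_miner + 1e-7          # "C + epsilon": barely above the miner's own cost

miner_profit = fee - C_miner      # positive, so a small miner includes the tx
network_cost = n_nodes * c_node   # cost borne by everyone else, paid by no one
print(miner_profit, network_cost)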