
Topic: How a floating blocksize limit inevitably leads towards centralization - page 20. (Read 71590 times)

legendary
Activity: 1708
Merit: 1010

Yeah I know that is not realistic. But it seems to give some insight into the numbers, at least to me. Full nodes seemingly are not settling up even as often as quarterly right now; why is that? Is it that there are actually only about 10,000 individual users actively transacting at any one time / in any given hour / on any given day, or something like that?

-MarkM-


None of the above.  If there are 10K full nodes, then there are 10K copies of the blockchain.  That is all that this implies.  There is no data to tell us: 1) how many are single-user nodes versus how many are multi-user nodes or Stratum servers, 2) how many users exist on any of these light-client servers, nor 3) how often these users need to "settle up".  The actual number of full clients in the network is mostly a product of the "many-copies-keep-data-safe" security model.  But the cost of this redundancy is not zero (even though it's still cheaper than fiat currencies).  This cost will be spread more thinly as the market grows, but the resources that miners consume are not the only costs of running the network, as a great many of those resources are contributed in-kind by users with both the resources and the determination to run full nodes.  As the resources required to participate in the network as a full node increase (at a rate faster than those same resources grow, as available to the average power user), some of those back-up nodes with marginal/non-existent pay-back will stop participating.  Hopefully, the services running a parallel payment network, for example BitcoinSpinner's business model, will be able to turn a profit, since they will (out of necessity) also run a full node.
legendary
Activity: 2940
Merit: 1090
Okay then, let's for the sake of argument imagine you are correct that 10,000 nodes is sufficient decentralisation.

1,000,000 bytes divided by 10,000 nodes is 100 bytes per block per node.

Multiply by 144 blocks per day and you get 14,400 bytes per node per day.

If transactions were one byte each, that would be enough for each node to send one settlement to each of the other 9,999 nodes per day and still have 4,401 bytes to spare. Unfortunately transactions are what, some 400 or so bytes each on average? But, settling up only requires one of the nodes of each pair of nodes to send coins: the one that owes some to the other.

If all the nodes needed to settle every day, it seems from this very sketchy back-of-an-envelope approximation that they would need about 200 times the block size we currently have: 200+ megabytes per block.
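A quick Python rendering of this back-of-the-envelope arithmetic (all figures are the rough assumptions above, not measurements):

Code:
# Settlement arithmetic for 10,000 full nodes, 1 MB blocks, ~400-byte txs.
NODES = 10_000
BLOCK_BYTES = 1_000_000
BLOCKS_PER_DAY = 144
TX_BYTES = 400

per_node_per_day = BLOCK_BYTES / NODES * BLOCKS_PER_DAY
print(per_node_per_day)                      # 14,400 bytes per node per day

# One settlement per pair per day, paid by whichever node owes:
pairs = NODES * (NODES - 1) // 2             # ~50 million pairs
daily_bytes = pairs * TX_BYTES               # ~20 GB of transactions per day
needed_block = daily_bytes / BLOCKS_PER_DAY
print(needed_block / BLOCK_BYTES)            # ~139x the current 1 MB limit

Run strictly pairwise, the requirement comes out to roughly 140 MB per block, in the same ballpark as the 200 MB guess above.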

Maybe if we can get 10,000 full bitcoin nodes to also run Ripple we can start to see how often they actually do tend to need to settle up with each other. It is actually quite a surprise to realise that the core backbone of full nodes is obviously not settling up pairwise between all of them every day already. Are my numbers way, way off? From this guesstimation it seems that only about one in a hundred of the combinatorial pairs of full nodes is settling up on any given day, even if we assume no transactions other than such settlements hit the blockchain. Since we also hear that Satoshi Dice is apparently the biggest user of the blockchain, it must be quite a bit less often than once in a hundred days that a full node actually settles up with another full node at this time. So maybe we can figure typical full nodes settle up about once per quarter of a year?

The more traffic Ripple can keep from requiring immediate settlement on the blockchain, the longer 10,000 full nodes should be able to continue to settle up at the same rate they currently are. And if Satoshi Dice also moved off the blockchain, settling with each gambler's full node of choice quarterly too, we'd continue to have plenty of space on the blockchain unless/until far more full nodes sprang up, each wanting to settle with all other full nodes at the rate full nodes settle up currently.

Yeah I know that is not realistic. But it seems to give some insight into the numbers, at least to me. Full nodes seemingly are not settling up even as often as quarterly right now; why is that? Is it that there are actually only about 10,000 individual users actively transacting at any one time / in any given hour / on any given day, or something like that?

-MarkM-
legendary
Activity: 1708
Merit: 1010
I suspect we could double the block size without massive disruption, but once you start, where does it end?

Doubling the limit might well just prove it is going to give way at every push or shove, leading to it being pushed and shoved way up beyond the ability of a personal computer on a personal internet connection to actually operate as a node.

It will become a business-to-business or even backbone-to-backbone network, making the whole p2p premise just a bait-and-switch trick used to con people into doing the initial investment and work to get the elite's new billionaire-to-billionaire network up and running, and to suck people into the early adoption phase of it.


You seem to fear the inevitable.  If Bitcoin is ever to become truly successful, transaction throughput must grow well beyond what the average end user is capable of, or willing to commit the resources to, in order to maintain a full client.  There is no point in crying about this; it has already begun.  I don't run a full client anymore, myself.  While it's important that the blockchain be replicated in many places across the Internet, and into the deep web such as Tor, there comes a point of rapidly diminishing returns.  I think that we have around 10K full nodes that can be identified (which should exclude any in the deep web); who here thinks that this isn't more than enough redundancy for security purposes?  There must be some degree of centralization, as the bitcoin network as it presently exists is too costly relative to its current market size.  We don't want the network to get smaller, really, but nor do we need it to grow more; we simply need the market to outgrow the network until the relative costs of running the network are much lower than now, and we need this to happen well before the block reward cuts in half once again.


Light clients and their supporting overlay networks are the future of Bitcoin, if it is to ever have one.
Eventually, a live transaction on the main network should become an uncommon event relative to the number of off-network transactions that occur.  This future has been apparent to those of us versed in economic theory, for as the economy grows, its growth must outpace the ability of the main bitcoin network to scale.  As this happens, there will be incentives for off-network transactions to dominate; and a balance will develop between the costs of the main network with its high-security model and the costs of overlay and parallel networks with their lower transaction costs and associated less secure models.  If Bitcoin is ever to become as large an economy as, for example, System D (for which it is so well suited), the vast majority of small-value and daily transactions are going to be handled by overlay networks and the online wallet-like services that develop around them; while higher-value transactions (for example, your weekly paycheck) might still be handled by the main network.
donator
Activity: 994
Merit: 1000
An increase of the block size from 1MB to 100MB is manageable for ordinary users. This equates to a bandwidth of less than 1 MB/s which CAN be arranged if you want to run a validation node yourself (which is advisable if there is a hard fork going on and you need to do validation yourself...).
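A minimal sketch of that bandwidth figure (block download only; relay overhead and the initial sync are ignored):

Code:
# Sustained bandwidth needed to receive 100 MB blocks every ~10 minutes.
BLOCK_MB = 100
BLOCK_INTERVAL_S = 600

print(BLOCK_MB / BLOCK_INTERVAL_S)       # ~0.17 MB/s, well under 1 MB/s
# Monthly transfer volume at that rate (144 blocks/day):
print(BLOCK_MB * 144 * 30 / 1024)        # ~422 GB/month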

I could be swayed into increasing the threshold limit to allow for a higher transaction throughput. However, it must be maintainable by the AVERAGE miner. An increased orphan rate due to bandwidth limitations can be compensated for by increasing hashing power. So the average miner may be faced with a decision to either put investments into bandwidth or into increased hashing power.

I would advise against a floating blocksize limit, because it prevents average miners from planning for bandwidth and storage requirements. As pointed out by the OP, it also adds another attack vector which is poorly understood.

Let's keep the blockchain accessible (guaranteed)!
legendary
Activity: 2940
Merit: 1090
Ripple is out of closed beta too, maybe it can take a lot of the load off the blockchain. A lot of users use online wallets as well, and those have internal transfers too.

-MarkM-


You bring up a really good point. With Ripple we can do off-blockchain transactions without the negative effects of centralization.

We can also maybe refer to Ripple as a Business to Business network rather than as a Person to Person network, since it seems designed and intended from the outset to be B2B oriented rather than P2P oriented.

-MarkM-
legendary
Activity: 1722
Merit: 1217
Ripple is out of closed beta too, maybe it can take a lot of the load off the blockchain. A lot of users use online wallets as well, and those have internal transfers too.

-MarkM-


You bring up a really good point. With Ripple we can do off-blockchain transactions without the negative effects of centralization.
legendary
Activity: 2940
Merit: 1090
I suspect we could double the block size without massive disruption, but once you start, where does it end?

Doubling the limit might well just prove it is going to give way at every push or shove, leading to it being pushed and shoved way up beyond the ability of a personal computer on a personal internet connection to actually operate as a node.

It will become a business-to-business or even backbone-to-backbone network, making the whole p2p premise just a bait-and-switch trick used to con people into doing the initial investment and work to get the elite's new billionaire-to-billionaire network up and running, and to suck people into the early adoption phase of it.

-MarkM-
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
I find needing to rely more on centralized services to send small transactions a much less likeable scenario than letting the requirements for running a full Bitcoin node or a mining node increase and letting Bitcoin continue to support small transactions. There might be a fundamental vision conflict regarding this issue, but we'll see how it goes. This issue should be one of the priorities for the dev team for sure.

Thing is, with centralized services handling small transactions we most certainly face a much higher degree of centralization than if we raise the block size limit. Based on the calculations in this thread, running nodes would still be within reach for dedicated hobbyists - NOT supercomputer level, as some FUD-spreading people are trying to imply.

Not raising the limit could lead to a situation where Coinbase literally becomes the new PayPal, and nothing has changed. Currently everyone has the freedom to use the blockchain directly, which is censorship-free.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
My prediction for what will happen if we DON'T increase the block size limit:

At some point in the future (this year probably), the block size limit will be reached. People will start noticing that their transactions don't get confirmations, and will start increasing transaction fees.

It would be nice if this were true, but bitcoin will hit the limit like a train hitting the buffers, and there will be widespread chaos. Right now the price on mtgox has hit silver parity. This is big news and will accelerate wider adoption. Yes, SD could be throttled back to give breathing space, but only a few weeks' worth.

At the very least the 1MB limit should be decoupled from block-readers, and soon, so that only block-writers are constrained.

They don't? Users of mtgox can send each other bitcoins off blockchain, and so can users of most other exchanges; hell, Coinbase is even built with this being its primary goal...
I defer to your deeper knowledge on this...
cjp
full member
Activity: 210
Merit: 124
My prediction for what will happen if we DON'T increase the block size limit:

At some point in the future (this year probably), the block size limit will be reached. People will start noticing that their transactions don't get confirmations, and will start increasing transaction fees. It would be nice to have a feature in the Bitcoin software to retransmit a transaction with a higher fee (as a "double spend", so that it will never be performed twice).
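A conceptual sketch of that retransmit-with-a-higher-fee idea (purely illustrative: the data layout and the sign_fn helper are assumptions, not the actual client's wallet code):

Code:
# Fee-bumping by deliberately double-spending one's own transaction.
# Both versions spend the same inputs, so at most one can ever confirm.
def bump_fee(tx, fee_increment, sign_fn):
    """Rebuild tx with a higher fee by shrinking its change output.

    tx is a dict with 'inputs' and 'outputs' lists, the last output
    being change; sign_fn re-signs the modified transaction."""
    bumped = {"inputs": tx["inputs"],        # same inputs -> conflicts with tx
              "outputs": [dict(o) for o in tx["outputs"]]}
    change = bumped["outputs"][-1]
    if change["value"] <= fee_increment:
        raise ValueError("change output too small to fund the fee bump")
    change["value"] -= fee_increment         # smaller change = bigger fee
    return sign_fn(bumped)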

"Frivolous" use of Bitcoin decreases, as it is discouraged by the higher fees. SatoshiDice will be played with higher amounts, so that the fees don't matter that much: we will have fewer SatoshiDice transactions, and more "useful" transactions.

Bitcoin price can still rise, if its use shifts from "consumer-level" to "business-level", where large transactions happen more frequently.

People will start using centralized services for Bitcoin transactions, especially for smaller transactions: these will have smaller fees or no fees at all, since none of these "internal" transactions need to be written into the block chain. Effectively, these services are a sort of "bank". Note that, as mentioned by hazek, these services already exist. Note that this is also a form of centralization (though a more visible one), and that fractional reserve banking becomes possible (it already is, in fact, for these services).

These centralized services will connect to each other to offer interoperability to each other's customers without the high fees of block chain transactions. This will start with ad-hoc solutions, and then become more standardized.

In a couple of years, my Ripple-like concept could be a standard for these interconnections. One of the nice things is that it does not allow fractional banking, and it dramatically reduces the need for trust in the service providers.

Once we reach usage by a significant fraction of the world population, the block size limit won't be high enough to support my Ripple-like concept (fees will be unaffordable for most people), so people will be forced to trust their "banks" again. We'll then have more than enough experience with the effects of the size limit, and computer power will also have increased significantly. We'll then be able to make a well-informed jump to a higher limit value.

Problem in this scenario: "banks" have an interest in maintaining this "forced trust" (among other things, it allows them to do fractional banking), so once they exist, they will oppose the limit increase. This is an argument for increasing the limit before "banking" appears in the Bitcoin world.
legendary
Activity: 2940
Merit: 1090
Ripple is out of closed beta too, maybe it can take a lot of the load off the blockchain. A lot of users use online wallets as well, and those have internal transfers too.

-MarkM-
legendary
Activity: 1078
Merit: 1003
We seem to have oodles of spare space currently

The problem is that the rate of adoption likely isn't linear but exponential, meaning it only seems we have "oodles" of spare space. This could change quite fast.

Competing centralised services built atop of Bitcoin to cater for micro payments will not only be more efficient and cheaper ...

These services do not exist yet. That is the root of the problem, because the 1MB limit will cripple bitcoin (later this year) before such services can take over micro-transactions (several years away). The limit needs to be flexible enough to allow bitcoin to work as it does now, until/if synergistic competing services develop on their own merits.

They don't? Users of mtgox can send each other bitcoins off blockchain, and so can users of most other exchanges; hell, Coinbase is even built with this being its primary goal...
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Competing centralised services built atop of Bitcoin to cater for micro payments will not only be more efficient and cheaper ...

These services do not exist yet. That is the root of the problem, because the 1MB limit will cripple bitcoin (later this year) before such services can take over micro-transactions (several years away). The limit needs to be flexible enough to allow bitcoin to work as it does now, until/if synergistic competing services develop on their own merits.
legendary
Activity: 2940
Merit: 1090
Once ASIC production gets down pat, churning out chips whose research and setup costs have long been covered, with so many in use that they are only borderline profitable, and even then maybe only in places where electricity is dirt cheap and/or the heat produced is needed or recycled back into more electricity, what level of difficulty will we be looking at?

A megabyte per decimal-digit of difficulty would already be seven megabytes right now, since difficulty hit 1,000,000. I am not sure offhand how many leading zeroes that is in the binary version that is actually used by the code.
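To put a rough number on the binary question (a sketch, using the fact that the difficulty-1 target 0x00000000FFFF... already carries 32 leading zero bits in its 256-bit form):

Code:
import math

# Each doubling of difficulty adds about one more leading zero bit
# to the target that block hashes must fall below.
difficulty = 1_000_000
extra_bits = math.log2(difficulty)   # ~19.9
print(32 + extra_bits)               # ~52 leading zero bits at difficulty 1M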

But fundamentally we just cannot really know what the future needs will look like, which is why I still favour going through this whole should-we-shouldn't-we, is-everyone-on-board hard-fork process each time it starts to seem to some as if the limit does need to be raised.

How long is that process likely to take? We seem to have oodles of spare space currently; if we take our time about maybe adding a whole 'nother megabyte, we still might find we have more than enough space by the time it even comes into effect.

I have even read here and there claims it is mostly Satoshi Dice that has made block sizes climb enough to notice much, if that is so I'd say we should correct the failure to discourage such frivolous waste first before considering increasing the block size.

Since I last wrote, bitcoin's price has risen significantly again, so it is clearer by the hour that the block size limit is not discouraging use / driving away adoption enough to hit us in the pocketbooks, if at all. Maybe the people investing in the dream of $10,000 per bitcoin actually like the idea that transaction fees will be high enough to ensure each and every one of their coins is safe as houses.


-MarkM-
cjp
full member
Activity: 210
Merit: 124
Half-baked 1:
why the heck do we have 8 decimal places for the transaction amount?
That's just a design decision that had to be made in an early stage, when it wasn't clear yet what the potential of Bitcoin would be. I think it was a good decision at the time, since the number of bits for storing values is still small compared to e.g. scriptsigs. As other posters have mentioned, it can still be useful in the future, even when we will never have really small (satoshi-sized) transactions in the block chain.
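For concreteness, the 8 decimal places are cheap because the client stores amounts as 64-bit integers of the smallest unit; a quick check:

Code:
# The entire eventual money supply, in satoshi, fits easily in an int64.
MAX_SUPPLY_BTC = 21_000_000
SATOSHI_PER_BTC = 10**8

max_satoshi = MAX_SUPPLY_BTC * SATOSHI_PER_BTC   # 2.1e15
print(max_satoshi < 2**63)                       # True, ~4,000x headroom left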

Half-baked 2:
Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.
How is that arbitrary limit better than a "number of transactions" limit?
Also, you choose quite a high number: it would be about 0.01BTC/transaction. How is that going to allow "satoshi-sized" transactions?

About coupling the transaction limit to difficulty:
Interesting idea. For now I only see one problem: it is possible (and likely) that in the future processing power will increase at a different rate than network+storage, so you could still run into trouble. But since it's a "better approximation" of the optimal transaction limit than a constant limit, you'll probably run into trouble "less often".
Just make sure it's set low enough so that people can run full nodes at moderately small investments (maybe not for the average consumer, but at least for a non-profit hobbyist).
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
That's what I'm saying: you can't know what's enough or more than enough, so you can't simply set a mandatory amount per MB.
You seem to be contradicting yourself in the same post...

True, but I haven't formed any final opinions on the whole issue, so that sort of explains it. I'm still throwing around ideas and scenarios in my mind, like many others here I suppose.
legendary
Activity: 1106
Merit: 1004
As a protocol rule?
I find that worse than an automatic adjustment of the max block size. You can't set prices like that. You can't predict what 50BTC will represent in the future.

No, it does make some sense. With this type of system in place, the security of the network would be based on a BTC-based reward. This way we would always have a similar security level relative to the actual value of the monetary base.

You can't predict how much the monetary base will be worth nor how much security is enough. You can't set prices like that.

I do agree with Mike again that it's a bit questionable to just set it at 50 BTC. We really don't know how much mining is "high security"; we only know that what we have now is quite enough. It would be wasteful to pay more fees for more mining if we don't actually need it.

That's what I'm saying: you can't know what's enough or more than enough, so you can't simply set a mandatory amount per MB.
You seem to be contradicting yourself in the same post...
legendary
Activity: 3472
Merit: 4801
This isn't a hard problem to solve at a technical level. Have the nodes keep track of the version numbers they see on the network. When X% of the network has upgraded to a version which supports the new rules, and when Y% of the miners indicate their support via coinbase flags, switch to the new rules. Until then, use the old rules. Let the users decide when and if to switch.

So if my client only ever connects to a subset of clients that have chosen not to upgrade, then my client won't know to make the switch when the rest of the network switches?
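A minimal sketch of the quoted mechanism (the thresholds, window, and version constant are placeholder assumptions, not anything specified in the thread):

Code:
# Switch to new rules only once both miners and peers signal readiness.
def new_rules_active(recent_block_versions, peer_versions,
                     x_pct=80, y_pct=80, new_version=2):
    # recent_block_versions: versions of the last N blocks (miner signal).
    # peer_versions: versions observed from currently connected peers.
    miners_ready = sum(v >= new_version for v in recent_block_versions)
    peers_ready = sum(v >= new_version for v in peer_versions)
    return (100 * miners_ready >= y_pct * len(recent_block_versions)
            and 100 * peers_ready >= x_pct * len(peer_versions))

The objection above maps to peer_versions being a biased sample: a node connected only to non-upgraded peers would never see its threshold crossed.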
legendary
Activity: 2184
Merit: 1056
Affordable Physical Bitcoins - Denarium.com
I like jojkaart's idea, can you guys comment more on it?

"How about tying the maximum block size to mining difficulty?

This way, if the fees start to drop, this is counteracted with the shrinking block size. The only time this counteracting won't be effective is when usage is actually dwindling at the same time.
If the fees start to increase, this is also counteracted with increasing the block size as more mining power comes online.

The difficulty also goes up with increasing hardware capabilities; I'd expect that the difficulty increase due to this factor will track the increase in the technical capabilities of computers in general."
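One illustrative way to write down such a coupling (the scaling function and constants here are assumptions for the sake of discussion, not a worked-out proposal):

Code:
import math

BASE_SIZE = 1_000_000        # keep the current 1 MB as a floor
BASE_DIFFICULTY = 1_000_000  # roughly the difficulty at the time of writing

def max_block_size(difficulty):
    # Grow the limit with the log of difficulty, so hardware-driven
    # difficulty growth yields gradual, bounded block size growth.
    if difficulty <= BASE_DIFFICULTY:
        return BASE_SIZE
    return int(BASE_SIZE * (1 + math.log2(difficulty / BASE_DIFFICULTY)))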
full member
Activity: 150
Merit: 100
Second half-baked thought:

One reasonable concern is that if there is no "block size pressure", transaction fees will not be high enough to pay for sufficient mining.

Here's an idea: Reject blocks larger than 1 megabyte that do not include a total reward (subsidy+fees) of at least 50 BTC per megabyte.

"But miners can just include a never broadcast, fee-only transactions to jack up the fees in the block!"

Yes... but if their block gets orphaned then they'll lose those "fake fees" to another miner. I would guess that the incentive to try to push low-bandwidth/CPU miners out of the network would be overwhelmed by the disincentive of losing lots of BTC if you got orphaned.

The issue with setting an arbitrary number like this is that it may not make economic sense to pay 50BTC in fees for 3600 transactions (~0.0139BTC each), depending on how much Bitcoin is worth in the future. Should Bitcoin reach $100k, a transaction on average would cost in excess of $1000.
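Checking that arithmetic (the 3,600 transactions per MB figure implies roughly 278-byte transactions):

Code:
MIN_REWARD_PER_MB = 50      # BTC, from the proposal quoted above
TXS_PER_MB = 3_600

fee_per_tx = MIN_REWARD_PER_MB / TXS_PER_MB
print(round(fee_per_tx, 4))                 # ~0.0139 BTC per transaction
print(round(fee_per_tx * 100_000))          # ~$1,389 each at $100k/BTC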

I'm all for a floating block size limit, but unlike difficulty, a change in the current block size affects "all" parties involved (clients, miners, all users via tx fees).
The core difference is that when someone wants to increase their hash rate, they bear the cost of investing in additional hardware. At most, it will bring up difficulty and force other miners to mine more efficiently or drive the less efficient ones out of business while increasing the security of the network. Users benefit (for "free") from increased mining competition via a more secure network. Free market forces will drive down mining profitability/transaction costs and increase hash rates to an equilibrium, and everyone wins.

Now when there is a block size increase (whether it be floating or forked as a hard rule), things get messy.
Pros
Transactions are verified cheaper and processed as fast as they are today

Cons
Increased storage costs for everyone (full nodes, miners)
Increased bandwidth requirements (everyone, including SPVs)
Reduced network hash rate == reduced security of Bitcoin (due to lower transaction fees, miners invest less in hashing power)

So let's break this down.

Storage
I'm surprised nobody is talking about storage issues. Sometimes when I launch the reference client after a few days, I start thinking how absurd it is that each SD bet has become a permanent cost (time to download, disk space, network usage) for every user of the full client, now and in the future. Even at 1MB/block, we are looking at a block chain increase of about 55GB a year (including misc indexing files/UTXO DB). Increase this to 10MB and you start requiring a dedicated hard disk (500+GB) for every year's worth of blocks. At 10MB, it starts requiring a considerable investment to run a full node after a year. This means your average user WILL NEVER run a full node after a year of this change. After a few years, running a full node becomes the domain of medium-sized companies.
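The yearly growth figures above, spelled out (block data only, before indexes):

Code:
BLOCKS_PER_DAY = 144

for block_mb in (1, 10, 100):
    gb_per_year = block_mb * BLOCKS_PER_DAY * 365 / 1024
    print(block_mb, "MB blocks ->", round(gb_per_year), "GB/year")
# 1 MB   -> ~51 GB/year (about 55 GB with indexes, as noted above)
# 10 MB  -> ~513 GB/year, i.e. a dedicated disk per year
# 100 MB -> ~5,133 GB/year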

Solution:
Now let's assume that pruning is implemented and we start storing only the unspent outputs; at 10MB, a 2GB unspent-output DB starts to seem reasonable. A few archival nodes containing the full blocks could be run by donation-financed sites or the big Bitcoin businesses (MtGox, blockchain, etc.). Or the full client could be modified to include a DHT mode, instead of or in addition to the pruning mode, to allow the average user to store a subset of the block chain.

Network Bandwidth
As easy as it is for Mike to say that 100mbit connections are widely available and bandwidth is not an issue, the fact is that not everyone lives in Kansas or Korea. If you look at Asia (excluding Japan/Korea/Singapore/Taiwan/HK), there are not many countries where you can get a stable 512kbps connection. Speed aside, even developed countries like the US/Canada/Australia/New Zealand have many ISPs with puny bandwidth caps of 50GB - 100GB per month, charging above $70. Some parts of Europe have extremely expensive internet with poor connectivity as well. This may or may not change. Some countries have government-imposed monopolies allowing poor service and high prices, while some countries do not have the government investment/economies of scale to warrant an investment in internet infrastructure.

Still, having only miners worry about network bandwidth is fine in my opinion, as it is a competitive business.
A full node used for verification should not need to worry about a 1-2 minute block download time, as it is not in a race to find the next block. But it does mean that full nodes starting afresh may not be able to catch up with the current block if their download speeds are too slow. For a 1mbit connection, even a 10MB block size would be pushing it. To me it becomes a serious issue when half the people in the world are unable to run a full node because the blocks are too large for them to catch up.
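Roughly quantifying that (raw transfer time only, ignoring verification):

Code:
def download_seconds(block_mb, link_mbit_per_s):
    return block_mb * 8 / link_mbit_per_s

print(download_seconds(10, 1.0))   # 80 s per 10 MB block on a 1 Mbit/s line
print(download_seconds(10, 0.5))   # 160 s on a 512 kbit/s line

Keeping up is feasible at those speeds (one block per ~600 s), but syncing a year of 10 MB blocks (~513 GB) over a 1 Mbit/s line would itself take on the order of seven weeks.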

Security
This is something that we cannot account for, because we have no precedent for a breach of security. Still, a single incident could be fatal to Bitcoin's reputation as a secure form of money, something which it may not be able to recover from (in fact this may lead to a loss of confidence in the currency and cause a collapse of its value akin to hyperinflation scenarios). I think this point should not be taken lightly; we know there are many parties who would benefit from Bitcoin's demise and would not mind mounting an attack at a loss.

My Take
I'm with retep and max on this one, for a couple of reasons. Even if it is technically feasible, we should be extremely conservative in raising the block size limit (if at all), simply because of security.

On a more philosophical level, I dislike wasteful systems.
I find it absurd that with ever more powerful hardware, software is getting slower and more bloated. In the past couple of decades, there has been a culture of waste in computing as we abuse Moore's Law.
Let's run virtual machines because the hardware is getting faster.
But we will need 1GB of RAM and a 1GHz dual-core processor with a fast GPU to swipe 16 icons on screen smoothly.
Compare this with the Genesis/Megadrive, which was running the Sonic series on a 7MHz processor with 64KB of RAM and no GPU, and you start to realise just how inefficient and wasteful today's software has become.

Bitcoin as a decentralised p2p system is extremely inefficient compared to a centralised system like Visa, as has been pointed out by multiple posters. Now in Bitcoin's case, the inefficiency is a requirement of maintaining a decentralised system, a necessary evil if you will. Competing centralised services built atop of Bitcoin to cater for micro payments will not only be more efficient and cheaper but also instant, something which Bitcoin will not be able to compete with and should not. Advocating the use of Bitcoin for volumes it is not optimised for just seems extremely wasteful to me.

It is important to understand that these centralised services will be more akin to the tech industry than to today's banking industry. Anyone can start a tech company in his basement or garage if he wants; you can't start a bank or a fiat payment processor like Visa unless you have connections with people of power (regulators, governments, big banks), because of many artificial barriers to entry. Unlike fiat currency, Bitcoin is an open platform, and anybody is free to build services atop it for which Bitcoin itself is not well suited.
Likewise, anyone who can host a website can start a micro-payment processor. The big exchanges and hosted wallet services would (and already do) do it for free, to save on Bitcoin TX fees and allow instant confirmation. Moreover, this is largely going to be for micro-payments, where customers would maintain a small pre-paid deposit and processors would clear balances regularly (hourly to daily) with other processors. The losses in the event of a dishonest processor would be minimal.

In summary, leave the block size alone and let the free market work around it. Bitcoin's primary role was to liberalize money, and that goal should not be compromised to support a micro-payment network for which it is ill-suited.