
Topic: How a floating blocksize limit inevitably leads towards centralization - page 19.

legendary
Activity: 1204
Merit: 1015
So that particular calculation for automatically "adapting" is simply an example of why I am doubtful that automation is the way to go, since basically all it amounts to is carte blanche for all miners who want mining to be limited to a privileged elite to simply spout the biggest blocks permitted, constantly, thus automatically driving up the size permitted, letting them spout out even bigger blocks and so on as fast as possible until the few of them left are able to totally control the whole system in a nice little cartel or oligarchy or plutocracy or kakistocracy or whatever the term is for such arrangements. (Sounds like an invitation to kakistocracy actually, maybe?)

Basically automatic "adaptation" seems more like automatic acquiescence to whatever the cartel wants, possibly happening along the way to leave them with incentives to maintain appearances of controlling less of the network than they actually do, so that if/when they do achieve an actual monopoly it will appear to the public as pretty much any number of "actors" the monopoly chooses to represent itself as for public relations purposes. (To prevent panics caused by fears that someone controls 51%, for example.)

-MarkM-

If they manage to do that in such a way that keeps global orphan rates down and the difficulty at least stable (so this would have to be done slowly), all while losing boatloads of money by essentially requiring no transaction fee ever, good for them. Other 51% attacks would be more economical, especially since this attack would be trivial to detect in the non-global-state side of things. For example, people would notice a large number of previously-unheard transactions in these blocks, or extremely spammy transactions. Worst case, they could get away with including all legitimate transactions plus a small number of their own without being detected, raising the limit only to the point where a block can hold just over the typical number of transactions made per block period.

However, other considerations can be added. That suggestion is by no means final. Some extreme ideas (not outside the box, I know, but just to prove that this attack can be prevented with more constraints):
*To increase max block size, global orphan rate must be below 1%.
*To increase max block size, 95% of the blocks in the last difficulty period must be at/very near the current max size.
*Max block size can only increase by a factor of 2 over four years.

For more ideas, think about the process of raising the block size limit in manual terms. What would/should we consider before manually raising the block size limit? Let's see if we can codify that...
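As a starting point, here is a minimal sketch of what codifying those three constraints might look like (illustrative Python; every name and threshold is a stand-in, not anything from the actual client):

Code:
# Hypothetical sketch: decide at a retarget boundary whether the max block
# size may be raised, under the three constraints proposed above.
def may_increase_max_size(orphan_rate, recent_block_sizes,
                          current_max, size_four_years_ago):
    # 1. Global orphan rate must be below 1%.
    if orphan_rate >= 0.01:
        return False
    # 2. 95% of blocks in the last difficulty period must be at or
    #    very near (here: >= 95% of) the current max size.
    near_full = sum(1 for s in recent_block_sizes if s >= 0.95 * current_max)
    if near_full < 0.95 * len(recent_block_sizes):
        return False
    # 3. Max size may at most double over any four-year window.
    if current_max >= 2 * size_four_years_ago:
        return False
    return True

Note that constraint 2 makes the attack described above expensive to sustain (the cartel must keep nearly every block stuffed for a whole difficulty period), while constraints 1 and 3 cap how fast it could ever pay off.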

legendary
Activity: 2940
Merit: 1090
I strongly suspect that, even with Moore's Law thought by some to be running out, an increase in the block size could not be limited to only 50% at first and then another 50% each calendar year thereafter.

Some folk are already talking about 100 megabyte blocks as if that might be the smallest size any change should move to right away, and since those same folk seem to argue it is exponentially increasing need that drives them to that number, maybe we can expect demands for 100 times the size per year thereafter too.

It is ridiculously easy to spin off as many chains as there are fiat currencies, to provide a cryptocurrency landscape as rich and possibly as varied as the fiat landscape everyone is already used to, and with Ripple, transparent automatic exchange as part of transfers should be even easier than it will be with the fiat currencies.

But hey, let's give it a try: all in favour of a 50% increase in block size to start with, followed by further 50% increases yearly thereafter?

Edit: Oops, misremembered the actual Moore's Law numbers typically cited. Maybe all in favour of doubling the size, then doubling it again every year and a half, is more palatable?
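To make the two schedules concrete, a quick illustrative calculation (assumed 1MB starting size, figures rounded):

Code:
# Compare the two proposed growth schedules over a decade, from 1 MB.
for years in range(0, 11, 2):
    fifty_pct = 1.5 ** years           # +50% each year
    moore     = 2.0 ** (years / 1.5)   # x2 every 18 months
    print("year %2d: 50%%/yr -> %6.1f MB, 2x/18mo -> %6.1f MB"
          % (years, fifty_pct, moore))

The 18-month doubling schedule is ahead at every point (about 1.59x per year versus 1.5x per year), so it is the more aggressive of the two.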

-MarkM-
member
Activity: 104
Merit: 10
What was the talk about "O(n^2) scaling problems" about? I fail to see how this is relevant here. The total number of transactions is O(n), where n is the number of users. The number of transactions per user should be a finite constant and not depend on n. Or does anybody have a different assumption?

Note that this is meant to be the long term assumption, for the scenario in which each user conducts all his transactions with bitcoin. Before that happens, only a certain percentage of the people that a given user transacts with actually uses bitcoin. With growing n this percentage will also grow, giving you the illusion of a growth similar to O(n^2). But as the percentage is bounded (by 100%) this can only be a temporary phenomenon.
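In symbols, with c the assumed (constant) number of transactions per user per block period and p(n) the fraction of a user's counterparties already on bitcoin:

\[
T(n) \;=\; c \, n \, p(n) \;\le\; c \, n \;=\; O(n), \qquad 0 \le p(n) \le 1 .
\]

While p(n) is still climbing, measured growth can look superlinear, but once p(n) approaches 1 the total transaction volume T(n) grows linearly in the number of users.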

If the number of transactions can be assumed to be O(n) then it is reasonable to factor hardware improvements into the decision. You would not need Moore's Law to conclude that wristwatches will be able to do all the world's verifying in real-time.

That said, in reality "the constant does matter", so O(n) alone doesn't suffice as an argument for unlimited block size.

legendary
Activity: 2940
Merit: 1090
So that particular calculation for automatically "adapting" is simply an example of why I am doubtful that automation is the way to go, since basically all it amounts to is carte blanche for all miners who want mining to be limited to a privileged elite to simply spout the biggest blocks permitted, constantly, thus automatically driving up the size permitted, letting them spout out even bigger blocks and so on as fast as possible until the few of them left are able to totally control the whole system in a nice little cartel or oligarchy or plutocracy or kakistocracy or whatever the term is for such arrangements. (Sounds like an invitation to kakistocracy actually, maybe?)

Basically automatic "adaptation" seems more like automatic acquiescence to whatever the cartel wants, possibly happening along the way to leave them with incentives to maintain appearances of controlling less of the network than they actually do, so that if/when they do achieve an actual monopoly it will appear to the public as pretty much any number of "actors" the monopoly chooses to represent itself as for public relations purposes. (To prevent panics caused by fears that someone controls 51%, for example.)

-MarkM-
legendary
Activity: 1204
Merit: 1015
legendary
Activity: 2940
Merit: 1090
Yeah, maybe we should also do polls in deepest darkest Africa, both of groups of peasants who have not yet managed to afford a community phone and of the individuals who have managed to afford a phone and are thus able to go into the "letting someone use my phone briefly for a fee" business, and see whether they feel something requiring the vast expense of a 24/7 internet-connected 386- to 586-grade machine to get into seems a better or worse grassroots empower-the-people thing than something that even the latest and greatest throwaways being handed out at furniture banks cannot handle?

Sample bias much?

-MarkM-
staff
Activity: 4284
Merit: 8808
Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.
It's not like a San Jose poll would be terribly representative of Bitcoin users.  A lot of the tech startups have big booming enthusiasm far outpacing their basic technical and economic competence, and I expect them to be well represented there. It takes all kinds, sure, but if you ask people who have been presenting these crazy double-exponential graphs to VCs all week if they want there to be MOAR TRANSACTIONS, of course they're going to say "YES. IF WE NEED MILLIONS FOR ANOTHER VALIDATING NODE, I KNOW JUST THE VC TO CALL" :P  (And these folks and their opinions are, of course, quite important... but that doesn't mean that a poll is a great way to get a thoughtful response that reflects their real interest.)
staff
Activity: 4284
Merit: 8808
PS: and if the "1MB-is-doom" scenario is correct, the beauty of not tackling problems until they are a clear and present danger, is that if a hard fork must take place, then it will be much easier to reach 100% consensus
Yea, two prongs of my doomsday-risk argument are "dorking with the rules will (rightfully) undermine confidence" and "if the limit is lifted without enough txn volume we'll destroy the fees market"; both of those go away if size is only increased in response to a situation where the necessity is obvious.

Though part of that also means that if we're really to reason about it in the future, we get to have this debate every time it comes up, so that we don't create an expectation that the limit must be increased even absent a need.
full member
Activity: 150
Merit: 100
Everyone has equal hashing power and bandwidth except for one person who hashes at half the speed of everyone else.  The slowpoke's mining speed and profitability are decreased and eventually they stop mining because they are no longer turning a profit.

No. If slowpoke is using the most efficient form of mining (ASICs atm), his running/capital costs will be half of everyone else's. He will be as profitable as the bigger miners.

Once you start requiring a high barrier to entry for mining via a large block size which residential connections cannot keep up with, you will naturally see independent mining and P2Pool failing, with a migration towards centralised mining pools. If you look at the Bitcoin globe, is it any surprise that hardly any full nodes (usually independent miners) exist in countries with slow internet connections (most of Asia, Africa, South America)? Bandwidth caps will also make mining unfeasible in many developed countries (I believe many US/Canadian/NZ/Australian ISPs have caps).
member
Activity: 118
Merit: 10
Consider the two scenarios:

Everyone has equal hashing power and bandwidth except for one person who has a half-speed connection.  The slowpoke's mining speed and profitability are decreased and eventually they stop mining because they are no longer turning a profit.  (This is what retep described in his first post.)

Everyone has equal hashing power and bandwidth except for one person who hashes at half the speed of everyone else.  The slowpoke's mining speed and profitability are decreased and eventually they stop mining because they are no longer turning a profit.

Why are they being treated differently?  Nobody cares about catering to the second person's lack of profitability so why are people suddenly concerned about the first?  If you're saying "oh, it's because we want non-miners to be able to validate in time" - take a look at retep's post, this isn't about the convenience of non-miners.  A well-connected miner with a fast upload speed adds value to the network just like hashing does - fast dissemination of blocks lets everyone get faster information about the latest updates to the ledger.  Requiring the lowest common denominator of connections doesn't add much value and takes away a whole lot more.
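One way to see the symmetry is to put a number on the orphan risk that slow propagation creates. A rough sketch, assuming Poisson block arrivals and propagation time equal to block size over upload bandwidth (both simplifications; all figures purely illustrative):

Code:
import math

BLOCK_INTERVAL = 600.0  # expected seconds between blocks

def orphan_probability(block_size_mb, upload_mbps):
    # P(someone else finds a block before ours finishes propagating),
    # modelling block arrivals as Poisson with rate 1/BLOCK_INTERVAL.
    propagation_secs = block_size_mb * 8 / upload_mbps
    return 1 - math.exp(-propagation_secs / BLOCK_INTERVAL)

for mbps in (10, 5):  # full-speed vs. half-speed uplink
    print("%2d Mbps, 100 MB block: %.1f%% orphan risk"
          % (mbps, 100 * orphan_probability(100, mbps)))

Under these assumptions the half-speed miner eats roughly double the orphan rate (about 23% versus 12% in this example), which cuts expected income just as lower hashing power does.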
newbie
Activity: 42
Merit: 0
Startbitcoin.com is offering the blockchain delivered on DVD. As the blockchain continues to grow, this will make life a lot easier.

http://startbitcoin.com/blockchain-on-dvd/
legendary
Activity: 1064
Merit: 1001
Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.

Maybe the Bitcoin Foundation can start up a university outreach program that encourages research on Bitcoin's difficult problems, like MAX_BLOCK_SIZE.
hero member
Activity: 588
Merit: 500
Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  ...


Except that Satoshi had foreseen the limit being a temporary requirement and gave an example of raising it by block 115,000 (in the future). That is now the past.

I have 25 years' experience in commercial systems, much of it processing billions of US dollars per day, and am fully aware that software issues ignored invariably blow up badly. The human body is different, as it has the capability to fix itself (in fact that is 99% of modern medicine).

Satoshi isn't infallible.

And like you, I also have many years' experience developing commercial software products used by millions of people, and can attest to many fun anecdotes where trying to fix a bug or make an "obvious" improvement resulted in disasters far worse and more costly than the original problem.

Changing something as fundamental and delicate as this constant should require pretty hardcore quantitative data and at least 80% consensus.  Not gut feelings.

Ultimately, I'd suggest Gavin do an informal poll at the San Jose conference.  If he can't get at least 80% agreement there, then he'll know that there is a definite risk of a community split between those who prioritize Bitcoin's transactional attributes versus its potential for cast-iron independent value storage.

The world already has plenty of fast, high-capacity media of exchange.  But only one digital, decentralized, and private store of value.  To me, risking the strength of the latter attributes to enhance the former, seems shortsighted.
full member
Activity: 150
Merit: 100
Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  Otherwise demand wouldn't be growing exponentially.  Gavin: with each Bitcoin purchase, users are implicitly "voting" that they approve of its current embedded constants (or at least, they don't mind them, yet).  So why risk breaking that dynamic, unless a proven crisis arises?

As clearly evidenced in this thread, there is no consensus as to whether the block size limit will be a problem or not.  Both the "1MB-is-fine" as well as the "1MB-is-doom" scenarios are based around hypotheses, speculation, and behavioral modeling.  But: humans are notoriously crappy at predicting the future.  If anyone here could actually see 4 years out wrt. Bitcoin or anything else, why aren't they rich beyond belief, sitting on giant stashes of coins mined in 2009?

So here's my vote in favor of doing nothing until there is hard proof that any limit is, in fact, a deadly problem which cannot be worked around through overlay services and altchains.  Let's not be like politicians: always at the ready with preemptive and precautionary "solutions", 99% of the time resulting in disastrous unintended side-effects.

PS: and if the "1MB-is-doom" scenario is correct, the beauty of not tackling problems until they are a clear and present danger, is that if a hard fork must take place, then it will be much easier to reach 100% consensus when disaster is staring everyone in the face and modeled solutions are based on less fuzzy parameters.  As it stands right now, splitting the community over this issue seems far more dangerous to Bitcoin's future than any fall in transactional usage arising from increased fees.

To further add on to this,

we are unlikely to come to a generally accepted consensus on whether the 1MB size limit will be an issue, let alone how to work around it.
I suggest we continue observing how the market reacts once the artificial 250KB limit is constantly reached (probably a few months to go).
What % of TXs pay fees when free TXs have a delay in inclusion?
How much do fees rise by?
Is there a reduction in wasteful TXs (SD sending back 1 satoshi as a confirmation) by services?
Does SD start implementing a deposit feature to help cut down on TX fees?
Do the big wallets and exchanges introduce or promote internal transfers to help people save on TX fees?
How is TX volume affected?

Once we see how this plays out, we will be in a better position to gauge if there is indeed a problem or not.
Lastly, I'd like to remind everyone that the day will come when there is no significant block reward; fees will be the only motivation to mine and secure the network. Eventually, fees for most if not all transactions will become the norm (however small they may be). If we have to increase the block size in future, it should ideally be to a point where your average low-end PC with an average connection can still run a full node, to sustain current levels of decentralization.
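As a rough feel for that fee-only endgame, a toy calculation (every figure is an assumption, not a prediction):

Code:
# What average fee per TX sustains today's security spend once the
# subsidy is gone? All inputs are assumptions.
BLOCKS_PER_DAY   = 144
TXS_PER_BLOCK    = 2000            # ~1 MB blocks / ~500-byte transactions
DAILY_BUDGET_BTC = 25 * 144        # today's subsidy: 25 BTC x 144 blocks

fee_per_tx = DAILY_BUDGET_BTC / (TXS_PER_BLOCK * BLOCKS_PER_DAY)
print("%.4f BTC/tx to match a %d BTC/day budget"
      % (fee_per_tx, DAILY_BUDGET_BTC))

At 1MB blocks that works out to 0.0125 BTC per transaction to match the current subsidy, which is one way to frame the trade-off between block size and the fee level needed to secure the network.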
legendary
Activity: 2940
Merit: 1090
We won't even need a new hard fork, since everyone and their descendants will have centuries or millennia in which to move their coins to another chain if they come to feel that the primary chain no longer serves their needs.

Indeed they can pre-emptively start securing positions in additional chains before their own predicted doomsday-of-blocksize actually hits.

We might find that several merged-mined chains that think they have not been getting enough attention will be eager to double their maximum block size, and have so few current users, including full nodes and miners, that doing so will be easy to get consensus on.

As already said, the primary chain is doing something right. Actually it is maybe doing several things right, maybe even in the right combinations, as so far none of the experiments in providing alternate ways of doing things secured by the same hashing power as the primary chain have caught on enough to attract even half of the hashing power the primary chain has. No one is serious enough about needing larger blocks yet, it seems, to have triggered an increase of block size in response from the market, even from co-operating (merged-mined) chains. Heck, if it was truly massively needed / essential, surely one or more of the non-cooperating chains would have done it by now and touted it as one of their "clear advantages over the primary chain / bitcoin".

-MarkM-
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  ...


Except that Satoshi had foreseen the limit being a temporary requirement and gave an example of raising it by block 115,000 (in the future). That is now the past.

I have 25 years' experience in commercial systems, much of it processing billions of US dollars per day, and am fully aware that software issues ignored invariably blow up badly. The human body is different, as it has the capability to fix itself (in fact that is 99% of modern medicine).
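For reference, Satoshi's suggested phase-in (paraphrased from his October 2010 forum post; "largerlimit" was left unspecified there) was a one-line conditional along these lines:

Code:
# Paraphrase of Satoshi's sketch, not actual client code:
if blocknumber > 115000:
    maxblocksize = largerlimit

with the change shipped well in advance, so that by the time the new limit activated the older versions lacking it would already be obsolete.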
hero member
Activity: 588
Merit: 500
Doctors have the right idea: Primum non nocere.  Or if you prefer: if it ain't broke, don't fix it.

Clearly Bitcoin, as currently implemented, 1MB limitation included, is doing something right.  Otherwise demand wouldn't be growing exponentially.  Gavin: with each Bitcoin purchase, users are implicitly "voting" that they approve of its current embedded constants (or at least, they don't mind them, yet).  So why risk breaking that dynamic, unless a proven crisis arises?

As clearly evidenced in this thread, there is no consensus as to whether the block size limit will be a problem or not.  Both the "1MB-is-fine" as well as the "1MB-is-doom" scenarios are based around hypotheses, speculation, and behavioral modeling.  But: humans are notoriously crappy at predicting the future.  If anyone here could actually see 4 years out wrt. Bitcoin or anything else, why aren't they rich beyond belief, sitting on giant stashes of coins mined in 2009?

So here's my vote in favor of doing nothing until there is hard proof that any limit is, in fact, a deadly problem which cannot be worked around through overlay services and altchains.  Let's not be like politicians: always at the ready with preemptive and precautionary "solutions", 99% of the time resulting in disastrous unintended side-effects.

PS: and if the "1MB-is-doom" scenario is correct, the beauty of not tackling problems until they are a clear and present danger, is that if a hard fork must take place, then it will be much easier to reach 100% consensus when disaster is staring everyone in the face and modeled solutions are based on less fuzzy parameters.  As it stands right now, splitting the community over this issue seems far more dangerous to Bitcoin's future than any fall in transactional usage arising from increased fees.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
I think that due to bitcoin's limited-supply nature, the price will rise forever, so it will never be used at mass scale in the retail market; small transactions will increasingly happen internally at exchanges and web-based services.

It will take centuries to replace the currency system of today. And since there are lots of ways to improve the current currency system (for example, the FED can write off loans periodically to reduce the burden on debt-laden governments), that might never happen.

But to function as a store of value and a method of anonymous value transfer globally, that is the strength of bitcoin, and that does not require a lot of daily transactions.



staff
Activity: 4284
Merit: 8808
You seem to fear the inevitable.
Something which is currently prohibited by the rules embodied in every bitcoin node cannot easily be described as inevitable.

Quote
If Bitcoin is ever to become truly successful, transaction throughput must eventually grow well beyond the volume for which the average end user is able or willing to commit the resources to maintain a full client.
A fair amount of debunking of this has already happened in the thread. There are sound fundamental technical reasons why Bitcoin cannot be fully successful without external (decentralized, if you like) transaction handling. Once you have that, there isn't obviously a need for extreme capacity (beyond millions/day) in bitcoin itself.

Quote
There is no point in crying about this, it has already begun.  I don't run a full client anymore, myself.
Kind of an odd comment. A Raspberry Pi, significantly less powerful than most current smartphones, can keep up with the blockchain with current software. Modulo software bugs and inadequacies, the computing hardware and storage I already own would be adequate for a hundred years of the current maximum rate of Bitcoin (though I tend to be a bit overpowered).
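The hundred-years claim is easy to sanity-check with back-of-envelope arithmetic at the current 1MB cap:

Code:
# Worst-case chain growth at the current limit: 1 MB every ~10 minutes.
MB_PER_BLOCK    = 1
BLOCKS_PER_YEAR = 6 * 24 * 365          # ~52,560
gb_per_year = MB_PER_BLOCK * BLOCKS_PER_YEAR / 1000.0
print("~%.0f GB/year, ~%.1f TB per century"
      % (gb_per_year, gb_per_year * 100 / 1000))

Roughly 53 GB per year, or about 5.3 TB per century, which a small stack of ordinary consumer drives can already absorb.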

Quote
While it's important that the blockchain be replicated many places across the Internet, and into the deep web such as Tor, there comes a point of rapidly diminishing returns.  I think that we have around 10K full nodes that can be identified
There are about 20k IPv4 _listening_ nodes enumerated in the seeds database, but it's impossible to know how many full nodes there are, though it's very likely that it's substantially more than the number we can see listening.

Quote
There must be some degree of centralization, as the bitcoin network as it presently exists is too costly relative to its current market size. We don't want the network to get smaller, really, but nor do we need it to grow more; we simply need the market to outgrow the network until the relative costs of running the network are much lower than now
The cost of running a node relative to the value to the Bitcoin economy isn't actually a factor in decentralization. The problem is that no matter how valuable the bitcoin economy is, you can always save the cost of validating it by letting (hoping) someone else handle it.  What matters is the absolute cost relative to current technology, and keeping it low enough that you get diversity and decentralization through indifference and altruism.  If you depend on profit motives you end up with a tragedy of the commons, because it's easier to freeload, or easier to monetize dishonest behavior. The system doesn't have any real incentives for honest validation except the vague "if no one does a good job of it, the whole house of cards collapses".

Quote
Eventually, a live transaction on the main network should become an uncommon event, relative to the number of off-network transactions that occur.
Great. Agreed! But that is why some people here are saying that being conservative with Bitcoin itself, keeping the costs low, and making decentralization a top priority is the right path: we can have both scale and decentralization through the use of off-chain trading and keeping the chain small... but if we bloat up the chain so that only some ten thousand central banks validate it, we'll lose decentralization.
legendary
Activity: 1078
Merit: 1003
I find needing to rely more on centralized services to send small transactions a much more unlikeable scenario than letting the requirements for running a full Bitcoin node or a mining node increase so that Bitcoin can continue to support small transactions. There might be a fundamental vision conflict regarding this issue, but we'll see how it goes. This issue should be one of the priorities for the dev team for sure.

Thing is, with centralized services handling small transactions we most certainly face a much higher degree of centralization than if we raise the block size limit. Based on the calculations in this thread, running nodes would still be within reach for dedicated hobbyists - NOT supercomputer level like some FUD-spreading people are trying to imply.

Not raising the limit could lead to a situation where Coinbase literally becomes the new PayPal, and nothing has changed. Currently everyone has the freedom to use the blockchain directly, which is censorship-free.

It seems you have two choices:

a) A money system which allows you to validate the rules it functions under but at the same time eventually likely forces you to use a business for your day to day transactions, however again you yourself can easily audit this business at any time and make sure they aren't doing anything you didn't agree with when you signed up for their services

b) A mere payment system, similar to PayPal, that you will eventually have no power to validate is following the rules you agreed to when you started using it


Which will it be?