
Topic: Why the fuck did Satoshi implement the 1 MB blocksize limit? - page 5. (Read 2188 times)

legendary
Activity: 1372
Merit: 1252
dinofelis continues to underestimate the risk of having "10 datacenters running full validating nodes". It's easy to sit there and say everything would be fine, but that's in theory. Do you really want to risk finding out what would happen if only 10 places in the world were validating for you what is and isn't a correct bitcoin transaction? I'm not willing to find out. The higher the number of nodes from different parties, the more robust the network is against global attackers; this is rather obvious. If you aren't validating your own transactions, you are someone else's cuck.
newbie
Activity: 15
Merit: 1
I think it was simply his choice, or just the first version of blockchain technology; other blockchains have other sizes.
legendary
Activity: 1372
Merit: 1252
Last time I checked, there were a few 2 MB blocks that got mined.

Where the fuck is that 1 MB coming from?

Yep, there's one.  2070.925 KB.  I don't think anyone's claiming that we still have a 1MB cap (or at least I hope not, because I'll be questioning their reading comprehension).  This thread is asking why the old 1MB cap was implemented to begin with, but now that's been answered, we seem to be drifting off course slightly.  

So yeah, to sum up, it was Hal Finney's idea, but you can't ask him what the long-term plan was since he sadly passed away.  Satoshi was obviously part of the decision making process, but good luck figuring out who, exactly, to ask there.  Ray Dillinger is still around and you can read some of their thoughts about the early days and where things subsequently ended up here.  But we're now past all that and moving forward.  Not necessarily all in the same direction, so we'll see which chain gets it right in the long run, but moving forward.

Simple answer:

Satoshi never knew crypto would be this important.

Proof that the NSA or CIA never created Bitcoin. They probably created PayPal, or want PayPal-like digital currencies?


Satoshi wanted bigger blocks eventually, as big as needed actually, and he didn't want most people running nodes but SPV wallets:

The current system where every user is a network node is not the intended configuration for large scale.  That would be like every Usenet user runs their own NNTP server.  The design supports letting users just be users.  The more burden it is to run a node, the fewer nodes there will be.  Those few nodes will be big server farms.  The rest will be client nodes that only do transactions and don't generate.

So in a way, BCash would be closer to what he was trying to accomplish, which would make the conspiracy theory (that the goal was to get full nodes away from users and into the hands of governments/corporations, and that Satoshi was part of this program) more plausible.

But he also predicted how people would be against it:

Piling every proof-of-work quorum system in the world into one dataset doesn't scale.

Bitcoin and BitDNS can be used separately.  Users shouldn't have to download all of both to use one or the other.  BitDNS users may not want to download everything the next several unrelated networks decide to pile in either.

The networks need to have separate fates.  BitDNS users might be completely liberal about adding any large data features since relatively few domain registrars are needed, while Bitcoin users might get increasingly tyrannical about limiting the size of the chain so it's easy for lots of users and small devices.

Fears about securely buying domains with Bitcoins are a red herring.  It's easy to trade Bitcoins for other non-repudiable commodities.

If you're still worried about it, it's cryptographically possible to make a risk free trade.  The two parties would set up transactions on both sides such that when they both sign the transactions, the second signer's signature triggers the release of both.  The second signer can't release one without releasing the other.
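The risk-free trade described above is an ancestor of what is now called an atomic swap. Here is a toy model of the atomicity property, using a hash lock instead of the signature-triggered release Satoshi describes (all class and variable names here are hypothetical, for illustration only):

```python
import hashlib

def sha256(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

class HashLockedOutput:
    """Toy output spendable only by whoever reveals the preimage of `lock`."""
    def __init__(self, amount: float, lock: bytes):
        self.amount = amount
        self.lock = lock
        self.claimed = False

    def claim(self, preimage: bytes) -> bool:
        if not self.claimed and sha256(preimage) == self.lock:
            self.claimed = True
            return True
        return False

# Alice picks a secret; both sides of the trade lock funds to the SAME hash.
secret = b"alice's secret preimage"
lock = sha256(secret)

btc_side = HashLockedOutput(amount=1.0, lock=lock)   # Alice's bitcoins for Bob
domain_side = HashLockedOutput(amount=1, lock=lock)  # Bob's domain for Alice

# To claim the domain, Alice must reveal the secret publicly...
print(domain_side.claim(secret))   # True
# ...and that same revealed secret is exactly what lets Bob claim the coins:
print(btc_side.claim(secret))      # True
```

Neither party can take one side of the trade without simultaneously enabling the other side, which is the "can't release one without releasing the other" property in the quote.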

So... we will never understand what his true intentions were. As far as I know, what we do know is that having full nodes in the hands of only a few corporations would centralize the system at layer 0 (BCash). Whether LN centralizes or not is yet to be seen. I trust no one.

member
Activity: 210
Merit: 26
High fees = low BTC price
If you want to pay lower fees, use segwit. It has big blocks.

Segwit doesn't save money unless all outputs are Segwit.

Why should people pay $30 in extortionate fees to miners to convert to a new address, because the development team broke existing client code and few exchanges use Segwit?

You must be using it, so even if you know someone who can receive these payments, was the saving 50%, or more like the 20% I'm hearing about?

Since Segwit came out in August 2017, fees have continued to go up, and the mempool, by queuing theory, is always just out of reach, so why do you think that is?
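The queuing argument is at least easy to state precisely: whenever transaction demand exceeds block capacity, the backlog grows without bound, and only rising fees can push demand back under capacity. A back-of-the-envelope sketch (all rates below are assumptions for illustration, not measurements):

```python
# All numbers here are illustrative assumptions, not measurements.
txs_per_block = 2500                 # assumed average transactions per block
service_rate = txs_per_block / 600   # blocks arrive every ~600 s on average
arrival_rate = 5.0                   # assumed demand, in transactions/second

backlog_growth_per_hour = (arrival_rate - service_rate) * 3600
print(f"capacity ~ {service_rate:.2f} tx/s, demand = {arrival_rate} tx/s")
print(f"mempool grows by ~ {backlog_growth_per_hour:.0f} txs every hour")

# Whenever arrival_rate > service_rate, the queue grows without bound, so a
# fee level that cleared the mempool yesterday is "just out of reach" today.
```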

Ucy
sr. member
Activity: 2674
Merit: 403
Compare rates on different exchanges & swap.
Simple answer:

Satoshi never knew crypto would be this important.

Proof that the NSA or CIA never created Bitcoin. They probably created PayPal, or want PayPal-like digital currencies?

I've read the same answer several times in this topic. Why would Satoshi Nakamoto create something that is not meant to be used worldwide by the masses? It's not like Bitcoin has a big place in today's economy anyway. Please read Satoshi's original paper posted on page one.

I have some reasons why I think he didn't.

One of them is the use of ASICs for mining.  I bet he didn't anticipate that either.
legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
Last time I checked, there were a few 2 MB blocks that got mined.

Where the fuck is that 1 MB coming from?

Yep, there's one.  2070.925 KB.  I don't think anyone's claiming that we still have a 1MB cap (or at least I hope not, because I'll be questioning their reading comprehension).  This thread is asking why the old 1MB cap was implemented to begin with, but now that's been answered, we seem to be drifting off course slightly.  

So yeah, to sum up, it was Hal Finney's idea, but you can't ask him what the long-term plan was since he sadly passed away.  Satoshi was obviously part of the decision making process, but good luck figuring out who, exactly, to ask there.  Ray Dillinger is still around and you can read some of their thoughts about the early days and where things subsequently ended up here.  But we're now past all that and moving forward.  Not necessarily all in the same direction, so we'll see which chain gets it right in the long run, but moving forward.
full member
Activity: 196
Merit: 109
Simple answer:

Satoshi never knew crypto would be this important.

Proof that the NSA or CIA never created Bitcoin. They probably created PayPal, or want PayPal-like digital currencies?

I've read the same answer several times in this topic. Why would Satoshi Nakamoto create something that is not meant to be used worldwide by the masses? It's not like Bitcoin has a big place in today's economy anyway. Please read Satoshi's original paper posted on page one.
legendary
Activity: 3276
Merit: 2442
Last time I checked, there were a few 2 MB blocks that got mined.

Where the fuck is that 1 MB coming from?

If you want to pay lower fees, use segwit. It has big blocks.
Ucy
sr. member
Activity: 2674
Merit: 403
Compare rates on different exchanges & swap.
Simple answer:

Satoshi never knew crypto would be this important.

Proof that the NSA or CIA never created Bitcoin. They probably created PayPal, or want PayPal-like digital currencies?
member
Activity: 210
Merit: 26
High fees = low BTC price
As a protocol / consensus relevant param :  YESS!!!

Maybe we could try this

public static money MaxFees=1.50 // 20,000 miners is 19,000 too many

Far too many miners are doing CPU wars with each other; if we had 1,000 of them, then $1.50 would be enough to keep
Bitcoin running. This proposal has the "consensus" of the community using Bitcoin, but not the ears of the developers,
because the miners and bankers have them.

We could even get complicated as a short term fix and add

if (Mempool>10000 && Amount<$10) Return Error.AmountTooLow

Seven transactions a second and 20,000 full nodes! We are being taken for twats


 

hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
answer is simple. unnecessary MAX block size is unnecessary :)

As a protocol / consensus relevant param :  YESS!!!
member
Activity: 96
Merit: 10
answer is simple. unnecessary block size is unnecessary :)
hero member
Activity: 770
Merit: 629
The contrast here is the high degree of centralization of the code, and the relatively high degree of centralization of the proof of work (mining pools), which are the TRUE power elements in bitcoin's system; yet one insists on the "decentralization of holding a copy of the sole document" to the point that it becomes a technical burden.   This is a contradiction.  There's no more reason to have thousands of copies of the block chain but only a few repositories of the code; there's no reason to require thousands of independent verifications of the block chain by that code, but not thousands of people verifying the code itself.  All this is hugely contradictory in its requirements.
If you can trust a single, centralized code repository (github) and the few signatures of the contributors; if you know that the entire block chain is produced by just 10 mining pools, with such an amount of proof of work that nobody can do anything else, but then you introduce technical limitations because of the "need" to keep thousands of copies in all basements, that doesn't make much sense to me.

I agree it's a bizarre contradiction, but I'd make the opposite argument (even though it seems to be heresy around these parts), that there should be more than one dev team and repository precisely because decentralisation is more resilient to attack.  Many seem to take the stance that other dev teams are the attack, but I suppose that's what indoctrination does to the narrow minded.  Plus I'm still all for small adjustments to the blockweight.

You're perfectly right.  The "code" situation is far, far more centralized than the "mining" situation, and, as with the "mining" situation, that is NOT the fault of the central power in code.  The few people that have pushing rights on the Core repository are the oligarchy of bitcoin's code power structure, even though nothing stops others from forking the code, or from developing independent code that implements the protocol too.  You can say that bitcoin's code WAS forked a lot, within bitcoin's community, and in order to make alt chains.  In a certain way, this is probably where we have to look for real decentralisation: alt coins.  
Anyways, whatever the causes, the code situation is highly centralized, and that's what the community, or the game theory if you want, has brought us.  It is not the fault of the Core people, but it is the actual situation.  

Quote
Also, you appear to have glossed over the part about regulatory shutdown.  The fewer nodes Bitcoin has, the easier it would be for governments hostile to Bitcoin to either coordinate law-enforcement raids, or simply arrange shutting off the power, or otherwise target the remaining few mega-nodes to get rid of them.  If everything was done in 10 big datacenters, I'd bet there are a few governments out there who would happily justify the cost of trading 10 ballistic missiles for seeing the end of the greatest tool of monetary freedom available in the world today.  It's simply not worth risking.

I didn't miss this, but I think it is somewhat delusional.  Right at this moment, there ARE 10 "data centres" that are making the block chain: the 10 most important mining pools.  Now, when I talk about a data centre, it can be geographically distributed: a pool can have many "nodes" all over the world, but they are under the single control of a single mining pool.  So "hitting them with a strategic thermonuclear weapon" is going to be difficult.  However, bitcoin now has literally industrial proportions.  I don't know how accurate the estimations are, but bitcoin is said to consume about as much electricity as Denmark.  If there was a concerted effort to pull the plug, literally, on bitcoin's mining industry by all governments, you wouldn't be able to do anything with your full copy of the blockchain in any case.  You can easily spot bitcoin mining activity: look at a correlation of electricity consumption and infrared heat pictures!  You can't hide Denmark's electricity consumption!

I would think that if "war with world governments" rises to the point that the *DATA* is not safe on a few tens of data centers, then, first, the liberties of people in this world are very, very much eroded, to the point of not even being allowed to keep data in a data center; second, how do you think the market would react if governments started bombing bitcoin infrastructure?  Who is going to "store his wealth" in a thing governments bomb?  Before governments bomb blockchains, they will have shut down all forms of *commercial activity*, of course, and the market will crash to 2010 levels.  And before they bomb blockchains, they will of course first bomb the code repository with a cease-and-desist order to github, imprison all the core code writers, and pull the plug on all mining equipment.

So, again, even in this somewhat delusional scenario of an all-out war against bitcoin on this planet by a concerted effort of governments, there are easier points of failure to attack before one bombs the blockchain file repositories.  If putting the blockchain at people's disposal becomes illegal and open to being bombed everywhere in the world, we have a bigger problem than bitcoin, and in any case, bitcoin's commercial value would be dead.  A "store of value" in something all governments bomb is not going to be very attractive.

Moreover, bitcoin is now 1/3 of crypto.  Governments could bomb blockchain centres, but then they would only be promoting another crypto.

This is why I think that the true decentralisation comes from the alt coin crypto market, which is in fact, breaking just as well the code monopoly, as the mining monopoly, as the block chain monopoly, but not as we expected.  "life finds its way".

So, again, if, to defend against the IMO illusory case of a massive world-wide government crackdown on *blockchain repositories*, we cripple the technical capacities of the system, that's madness.
legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
The contrast here is the high degree of centralization of the code, and the relatively high degree of centralization of the proof of work (mining pools), which are the TRUE power elements in bitcoin's system; yet one insists on the "decentralization of holding a copy of the sole document" to the point that it becomes a technical burden.   This is a contradiction.  There's no more reason to have thousands of copies of the block chain but only a few repositories of the code; there's no reason to require thousands of independent verifications of the block chain by that code, but not thousands of people verifying the code itself.  All this is hugely contradictory in its requirements.
If you can trust a single, centralized code repository (github) and the few signatures of the contributors; if you know that the entire block chain is produced by just 10 mining pools, with such an amount of proof of work that nobody can do anything else, but then you introduce technical limitations because of the "need" to keep thousands of copies in all basements, that doesn't make much sense to me.

I agree it's a bizarre contradiction, but I'd make the opposite argument (even though it seems to be heresy around these parts), that there should be more than one dev team and repository precisely because decentralisation is more resilient to attack.  Many seem to take the stance that other dev teams are the attack, but I suppose that's what indoctrination does to the narrow minded.  Plus I'm still all for small adjustments to the blockweight.

Also, you appear to have glossed over the part about regulatory shutdown.  The fewer nodes Bitcoin has, the easier it would be for governments hostile to Bitcoin to either coordinate law-enforcement raids, or simply arrange shutting off the power, or otherwise target the remaining few mega-nodes to get rid of them.  If everything was done in 10 big datacenters, I'd bet there are a few governments out there who would happily justify the cost of trading 10 ballistic missiles for seeing the end of the greatest tool of monetary freedom available in the world today.  It's simply not worth risking.
hero member
Activity: 770
Merit: 629
Now, we have essentially 10 big mining pools that are the sole authors of the bitcoin block chain.  How many copies of that chain do we need to serve, all over the world, in order for me to be able to verify the authenticity (that a piece of it I download, is of the real block chain out there made by these 10 mining pools) ?  I would think that a few tens of copies that are publicly available are good enough.  That my neighbour cannot have a server in his basement, is, just like before, no problem.

I can easily check, from the moment that I have access to any public repository of the entire block chain, that the small piece I need, is authentic, that is, belongs to the sole and unique chain that is out there, made by these 10 mining pools, like I could verify that the document I downloaded was cryptographically signed by one of the 10 authors of these documents.

(...)

And nothing is gained by having thousands and thousands of identical copies ; if, in order to have those thousands and thousands of identical copies everywhere, we cripple the system, we're totally out of our minds.

I vaguely, sort of, kinda see your point, but the simple fact remains that thousands of copies is more resistant to regulatory shutdown than 10.  Thousands of copies is more resistant to bribery and corruption than 10.  Thousands of copies is more resistant to any other kind of collusion, manipulation or attack than 10.

The point is that you cannot lie.  You cannot present a FAKE block chain.  The only thing needed for that, is checking the chain of block headers.  There's no way you can invent a fake chain of block headers.  Of course, every participant (as Satoshi said) should download the full chain of block headers.  That chain is, well, a chain, and it contains the proof of work. It is very small.  You can also find the most up-to-date block header the mining pools are working on, so you can verify that this is the genuine chain they are working on.

That is not a lot of data, and independent of the block size.  

Once you have the header chain, and hence, the proof of work, you can verify the authenticity of every single transaction if one gives you:

1) the transaction
2) in what block it is
3) the path of the Merkle tree in that block to that transaction

That is what a light wallet does.  If you want to know whether someone paid you (which is the ONLY thing you want to know: do you possess a coin? and is your transaction to someone else part of the chain?), you only need the proof that the transaction to your address is incorporated in the chain.  You don't care about all the rest.  You want the cryptographic proof that the few transactions you are concerned with are in the sole chain out there.  That's all.
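That three-item check can be sketched in a few lines of Python (a simplified model: real Bitcoin serializes hashes little-endian, but the hash-up-the-tree logic is the same):

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def verify_merkle_path(tx_hash: bytes, path, merkle_root: bytes) -> bool:
    """path is a list of (sibling_hash, sibling_is_right) pairs, leaf to root."""
    h = tx_hash
    for sibling, sibling_is_right in path:
        h = sha256d(h + sibling) if sibling_is_right else sha256d(sibling + h)
    return h == merkle_root

# Tiny 4-leaf example: build the tree, then prove leaf 0 is in it.
leaves = [sha256d(bytes([i])) for i in range(4)]
l01 = sha256d(leaves[0] + leaves[1])
l23 = sha256d(leaves[2] + leaves[3])
root = sha256d(l01 + l23)

proof = [(leaves[1], True), (l23, True)]   # siblings of leaf 0, leaf to root
print(verify_merkle_path(leaves[0], proof, root))   # True
```

With only the 80-byte-per-block header chain plus one such path per transaction of interest, a light wallet verifies inclusion without storing any blocks at all.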

Quote
 Strength in numbers and such.  The more distributed it is, the stronger it is.  We'd be totally out of our minds if we trusted such a small number of people to remain honest when the incentives to be dishonest become inexorably more profitable over time.

That's another matter.  If you want to *find out* whether the mining pools are following the rules, you need indeed, to check everything.  But the only thing you can do is to find that out.  You can't do anything about it: they came to consensus, they decided what the block chain is.

I'm repeating what I wrote in an earlier post: https://bitcointalksearch.org/topic/m.28591086
somewhat higher up in this thread.

TL;DR: the mining pools are kept in check not because thousands of Joes verify their "signature" independently in their basements, but by the market, and by one another.  The miners invested a lot in their mining equipment, and are rewarded in coins from which they have to get economic value in the market to pay their bills.  They want to play by the rules, because the market would crash if they didn't.  Even if there were only one single miner out there, making the block chain all by himself, he would still apply the rules.  Otherwise his coins would be worth zilch.

Of course, some people have to check.  There needs to be a whistleblower to find out if the entire miner consensus decided to break the rules.  But one whistleblower is enough.

Not every Joe is reading the Core code in his bed, to verify independently whether the Core code implements what is said about Bitcoin's rules.  In the same way, not every Joe needs to check the validity of the mining consensus (.... with the Core code he didn't read, and got from a single repository !).

The contrast here is the high degree of centralization of the code, and the relatively high degree of centralization of the proof of work (mining pools), which are the TRUE power elements in bitcoin's system; yet one insists on the "decentralization of holding a copy of the sole document" to the point that it becomes a technical burden.   This is a contradiction.  There's no more reason to have thousands of copies of the block chain but only a few repositories of the code; there's no reason to require thousands of independent verifications of the block chain by that code, but not thousands of people verifying the code itself.  All this is hugely contradictory in its requirements.
If you can trust a single, centralized code repository (github) and the few signatures of the contributors; if you know that the entire block chain is produced by just 10 mining pools, with such an amount of proof of work that nobody can do anything else, but then you introduce technical limitations because of the "need" to keep thousands of copies in all basements, that doesn't make much sense to me.

legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
Now, we have essentially 10 big mining pools that are the sole authors of the bitcoin block chain.  How many copies of that chain do we need to serve, all over the world, in order for me to be able to verify the authenticity (that a piece of it I download, is of the real block chain out there made by these 10 mining pools) ?  I would think that a few tens of copies that are publicly available are good enough.  That my neighbour cannot have a server in his basement, is, just like before, no problem.

I can easily check, from the moment that I have access to any public repository of the entire block chain, that the small piece I need, is authentic, that is, belongs to the sole and unique chain that is out there, made by these 10 mining pools, like I could verify that the document I downloaded was cryptographically signed by one of the 10 authors of these documents.

(...)

And nothing is gained by having thousands and thousands of identical copies ; if, in order to have those thousands and thousands of identical copies everywhere, we cripple the system, we're totally out of our minds.

I vaguely, sort of, kinda see your point, but the simple fact remains that thousands of copies is more resistant to regulatory shutdown than 10.  Thousands of copies is more resistant to bribery and corruption than 10.  Thousands of copies is more resistant to any other kind of collusion, manipulation or attack than 10.  Strength in numbers and such.  The more distributed it is, the stronger it is.  We'd be totally out of our minds if we trusted such a small number of people to remain honest when the incentives to be dishonest become inexorably more profitable over time.
hero member
Activity: 770
Merit: 629
It doesn't matter what "Satoshi" intended. On-chain transactions are scarce because the Bitcoin blockchain is designed to be deterministic (this is self evident) and to be highly distributed (this is also self evident).

This is absolutely not self evident.  It is a narrative that is repeated over and over, but it is entirely wrong, as I explained already many, many times.  

The number of identical copies of a cryptographically signed document that are stored world-wide, from the moment there are several (say, a few tens or so), doesn't increase its cryptographic certainty.

Suppose that 10 people are each in possession of a cryptographic secret key of which we all know the public key, and suppose that these 10 people sign documents.  We can all verify the signatures of these documents.  How many independent copies do there have to be in the world on public repositories, in order for you to feel secure that you are able to access some of them when you want and are able to verify their authenticity ?   Do we need thousands of repositories ?  Is it useful that Joe and Jack put up a server in their basement with those documents ?  Or can we live with a few tens of independent servers (for instance, of these people themselves) ?

It is clear that the cryptographic verification of these documents is independent of the number of independent servers there are in the world.  If my neighbour Joe cannot afford to set up a home server of these documents, that's not going to stop me from being able to verify their authenticity if I download them from one or other big data center, right ?

Now, we have essentially 10 big mining pools that are the sole authors of the bitcoin block chain.  How many copies of that chain do we need to serve, all over the world, in order for me to be able to verify the authenticity (that a piece of it I download, is of the real block chain out there made by these 10 mining pools) ?  I would think that a few tens of copies that are publicly available are good enough.  That my neighbour cannot have a server in his basement, is, just like before, no problem.

I can easily check, from the moment that I have access to any public repository of the entire block chain, that the small piece I need, is authentic, that is, belongs to the sole and unique chain that is out there, made by these 10 mining pools, like I could verify that the document I downloaded was cryptographically signed by one of the 10 authors of these documents.

And when there need to be a few tens, or a few hundreds, of public repositories of the blockchain in the world, then there is no "data burden".  Your average datacentre handles WAY WAY more data than the bitcoin block chain, even in your wildest dreams.  Compared to an IPTV data center with VOD, the bitcoin block chain, even with VISA-like transaction rates, is extremely small.  That's what Satoshi already told us in November 2008.  He was obviously right.
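The arithmetic behind that claim is easy to redo (the daily volume and transaction size below are illustrative assumptions, not measurements):

```python
# Illustrative assumptions, not measurements:
tx_per_day = 150_000_000    # rough VISA-scale daily transaction volume
avg_tx_bytes = 250          # assumed average transaction size

daily_bytes = tx_per_day * avg_tx_bytes
yearly_bytes = daily_bytes * 365
print(f"{daily_bytes / 1e9:.1f} GB/day, {yearly_bytes / 1e12:.1f} TB/year")
# A few tens of GB per day is routine for a data centre, while the header
# chain every light client needs stays tiny regardless of block size.
```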

And nothing is gained by having thousands and thousands of identical copies ; if, in order to have those thousands and thousands of identical copies everywhere, we cripple the system, we're totally out of our minds.  That is similar to not wanting to broadcast TV shows over the internet at more than a 250x125 pixel resolution, because if every poor dude in the world has to keep all the TV shows on his basement server, we have to limit internet TV.  

I will ask something else if you want to put that in doubt: how many independent, publicly available repositories are there of the bitcoin core software?  There is essentially only one, as far as I know: the one on github.  I don't know of other public servers of the git repository.  At best there are the Torrent seeders, but that is not a genuine git repository.  How many are there?  So if only a few software repositories are available, why should there be thousands of block chain repositories?
The single github repository is enough, as long as it is working, because we have the cryptographic signatures of the few authors (think of my 10 people on the top of this post).  If github breaks down, I'm sure that each of these authors can set up another repository somewhere else, and by verifying their signatures, we know it is authentic.   Replace "github bitcoin repository" by "block chain repository", and replace "signing author of Core" by "mining pool", and "signature" by "proof of work", and you see the analogy.

hero member
Activity: 770
Merit: 629
Yes. And it happened at a time when there were absolutely zero fees and only a few txs.
The idea came from preventing a sheer brute-force, zero-cost spam attack bloating the blockchain for free and crashing / eliminating bitcoin before it could take off. Very reasonable for an open blockchain in the open hacker world of that time, with the no-brainer of lifting it 'as needed' handwaved away.

As I pointed out earlier, there is no reason why "the consensus of mining nodes" would build upon a crazily large block, which implies an implicit block size limit that every mining node, individually, decides for itself.  We are forgetting here the consensus role and the voting power that mining nodes have: they DECIDE on which block to mine.  They DECIDE what the consensus will be.  As such, if there is block 50 005, which is still normal, and someone propagates a 5 GB block on top of 50 005 as the candidate 50 006 block, it is up to the rest of the mining nodes to decide whether they will orphan this block and make their own 50 006 block, or whether they will mine on top of that 5 GB block.  If a consensus of miners decides to mine on top of that 5 GB block, then it means that bitcoin's consensus is that such a block is "good".  If they decide to orphan it, then it means that even if that 5 GB block is "legal", miners don't like it (essentially because it is spam).  Moreover, those mining on smaller blocks have a network burden advantage.  By the time you've downloaded that 5 GB block, you've lost a lot of precious hashing time - unless the network is so swift that 5 GB blocks are not an issue.

So you can think that every mining node has his own "limit of block size on top of which he will not mine", for reasons of principle, for reasons of network efficiency, and for reasons of cost.  Of course, if he sets that limit too low, he will never mine.  If he sets it higher than his peers, no problem.  There will hence be a kind of unspoken "market size" of blocks that miners in general accept.  That automatic self-control mechanism would in any case have prevented that the blockchain would be entirely stupidly be filled with GB of nonsense per day.
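A toy model of that self-control mechanism, with made-up per-miner size cutoffs and hashpower shares, shows how an "unspoken" limit emerges:

```python
# Made-up cutoffs (max block size each miner will build on) and hashpower shares:
miner_policy = {
    1_000_000: 0.20,     # 20% of hashpower refuses to extend blocks over 1 MB
    8_000_000: 0.50,     # 50% tolerates up to 8 MB
    32_000_000: 0.30,    # 30% tolerates up to 32 MB
}

def share_extending(block_size: int) -> float:
    """Fraction of hashpower willing to mine on top of a block this big."""
    return sum(share for cutoff, share in miner_policy.items()
               if block_size <= cutoff)

for size in (500_000, 5_000_000, 5_000_000_000):
    print(f"{size:>13,} B -> {share_extending(size):.0%} of hashpower extends it")
# The 5 GB spam block is orphaned by everyone; to stay in the chain a block
# must sit under the cutoff of a hashpower majority - the "unspoken" limit.
```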

There was strictly no reason to put in a hard limit.  Most probably, Satoshi thought that it would be a formalization of this "unspoken limit" on which miners would decide.  Maybe Satoshi only saw this as a "mining strategy" as he was also writing the sole mining software out there.

The real long term problem bitcoin was facing is that no block limit would lead to very low fees.  Very low fees would start to be a problem when bitcoin's issuance would start to dwindle, and wouldn't pay enough for the mining security. The finite amount of bitcoin issuance, which is its main publicity and selling point, is in fact a huge monetary design mistake which makes bitcoin unsuited as a currency ; but on top of that, it forces a delicate fee market that has the potential to kill the usability of the system as a whole.

But the finite block size never played a useful role, and its use as a "spam protection" is obviously wrong.  It has only negative effects.
member
Activity: 224
Merit: 11
I mean, relatively speaking, Satoshi probably didn't expect it to scale as fast and as large as Bitcoin has today.
newbie
Activity: 4
Merit: 2
It doesn't matter what "Satoshi" intended. On-chain transactions are scarce because the Bitcoin blockchain is designed to be deterministic (this is self evident) and to be highly distributed (this is also self evident). Miners mint coins and process transactions, non-miner "full nodes" verify the integrity of the network (they are the only 'clients' that don't have to trust that the nodes they are talking to are behaving properly, using the desired protocol).

Increasing block size increases network bandwidth requirements (higher cost & quality required), compute/memory requirements and storage cost & complexity (UTXOs). Larger blocks increase the cost of running nodes. It is not known exactly how increasing block size affects the composition of the network, but the general argument that increasing block size reduces the number of possible participants in the network is obvious. Given that on-chain transactions carry a high cost (relative to suitable off-chain solutions), it is obvious that not all transactions warrant on-chain settlement. Therefore, an off-chain (L2) architecture and implementation are needed in order to 'complete' the Bitcoin architecture so that it can reasonably support real-time, lowest-cost payments. Bitcoin is still in 'beta' with regard to low-value payments. Lightning is the spearhead for implementing a payments layer for bitcoin. You can check the status of Lightning on mainnet here: http://lnstat.ideoflux.com:3000/dashboard/db/lightning-network?refresh=5m&orgId=1

Enlarging blocks (pricing more participants out of the network) purely in the interest of lowering fees in the short term so we can lure more people onto a Beta network doesn't seem productive. It is better to start by (1) optimizing blocks so bits are used more effectively (SegWit), then (2) maturing a layer 2 implementation, then (3) reviewing block size increase requirements once the actual demand for on-chain transactions becomes clear (after all the low-value transactions have been moved off-chain and usage patterns with the L2 settle into patterns we can use for planning purposes).
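For reference, the SegWit optimization in step (1) works by weight rather than raw size: per BIP 141, weight = 3 × base size + total size, capped at 4,000,000 units, so witness bytes cost a quarter of what base bytes cost:

```python
def block_weight(base_size: int, total_size: int) -> int:
    """BIP 141: weight = 3 * base (non-witness) size + total serialized size."""
    return 3 * base_size + total_size

MAX_BLOCK_WEIGHT = 4_000_000   # consensus limit since SegWit activation

# A legacy-only block carries no witness data, so base == total:
print(block_weight(1_000_000, 1_000_000))   # 4000000: the old 1 MB cap, exactly
# A witness-heavy block: 0.6 MB of base data + 1.4 MB of witness data is 2 MB
# of raw bytes, yet it still fits under the very same weight limit:
print(block_weight(600_000, 2_000_000))     # 3800000
```

That discounting of witness bytes is how blocks larger than 1 MB of raw data (like the 2070 KB block mentioned earlier in the thread) appear on-chain without raising the weight cap.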
