
Topic: LN+segwit vs big blocks, levels of centralization. (Read 8897 times)

legendary
Activity: 2674
Merit: 2970
Terminated.
Look at any testnet explorer and you can find blocks that are 2-4 MB in size. Separating the witness is the method used to achieve this (the size calculations are done differently, so you are seemingly able to transact more without a hard-fork block size increase).
Look at any testnet and you will see blocks that are filled with certain patterns of transactions to get certain stats.
-snip-
The reality is that the 2 MB 'assumption' is based on a block that is 100% full of segwit-key users doing what is deemed average in/out spending habits.

But fighting against native-key users, you won't get the 2 MB 'assumption'.
So even with segwit, the 2009+ expectation of 7 tx/s is still not a guarantee/promise.
What are you on about? Nobody claimed that you'd get double the TX throughput right away. I was pointing out to the user that we would essentially see 'bigger blocks' post-SegWit and referred them to a place where they should look. Maybe you should start reading posts before you copy responses from your script.
legendary
Activity: 4424
Merit: 4794
Look at any testnet explorer and you can find blocks that are 2-4 MB in size. Separating the witness is the method used to achieve this (the size calculations are done differently, so you are seemingly able to transact more without a hard-fork block size increase).

Look at any testnet and you will see blocks that are filled with certain patterns of transactions to get certain stats.

Much like the testnets of 2009-2016, where certain blocks held the leanest transactions possible to show that 7 tx/s was 'possible'... yet in reality mainnet has never seen a 7 tx/s block.

Testnet gives biased results because the transactions lack real-life randomness. They are not filled with random bloat from merchants, which then has to fight for room against obvious native spammers and native quadratic-sighash creators.

The reality is that the 2 MB 'assumption' is based on a block that is 100% full of segwit-key users doing what is deemed average in/out spending habits.

But fighting against native-key users, you won't get the 2 MB 'assumption'.
So even with segwit, the 2009+ 'assumption' of 7 tx/s is still not a guarantee/promise.
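
To make the arithmetic concrete, here is a rough sketch of how the effective block size depends on the mix of segwit and native-key spenders under the 4,000,000 weight-unit limit. The per-transaction sizes are illustrative assumptions, not measured averages:

Code:
# Rough sketch: effective block size vs. segwit adoption.
# Consensus rule: block weight = 3*base_size + total_size <= 4,000,000 WU.
# The per-tx sizes below are illustrative assumptions, not measured averages.

WEIGHT_LIMIT = 4_000_000

LEGACY_TX_BYTES = 250                                    # all bytes are base bytes
LEGACY_TX_WEIGHT = 4 * LEGACY_TX_BYTES                   # 1000 WU

SEGWIT_TX_BASE = 100                                     # non-witness bytes (assumed)
SEGWIT_TX_WITNESS = 200                                  # witness bytes (assumed)
SEGWIT_TX_BYTES = SEGWIT_TX_BASE + SEGWIT_TX_WITNESS
SEGWIT_TX_WEIGHT = 3 * SEGWIT_TX_BASE + SEGWIT_TX_BYTES  # 600 WU

for segwit_share in (0.0, 0.25, 0.5, 0.75, 1.0):
    avg_weight = segwit_share * SEGWIT_TX_WEIGHT + (1 - segwit_share) * LEGACY_TX_WEIGHT
    avg_bytes = segwit_share * SEGWIT_TX_BYTES + (1 - segwit_share) * LEGACY_TX_BYTES
    txs = WEIGHT_LIMIT / avg_weight
    print(f"segwit share {segwit_share:4.0%}: ~{txs:5.0f} txs, "
          f"~{txs * avg_bytes / 1e6:.2f} MB per block")

# 0% segwit -> ~1.0 MB; 100% segwit (with these sizes) -> ~2.0 MB.
# The "2 MB" figure only holds when every transaction spends segwit outputs.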
legendary
Activity: 4424
Merit: 4794
The only way I could imagine is that most competing pools could mine on the merchants' chain and then orphan the "block with changed rules". Is this the way it would work? But then we would depend on the assumption that the other pools are not agreeing to the change introduced by the "rule-changing pool".

This is how bitcoin has always worked:
nodes set the rules, and pools follow the rules or find their block rejected in 3 seconds.
If a couple of pools colluded to keep the funky block alive by building on top of it (within the pools' hard drives, not the network),
it would still get rejected by the node network.

But what if 51% agree? I think that's what dinofelis already wrote, so you can answer his post directly.

If a majority of pools colluded, their blocks would be rejected by the network, but it COULD cause such a fiasco of making people wait for the few honest pools to make a good block that it MIGHT be enough to blackmail node users into downloading a new node that would accept the pools' new funky blocks.

Yes, the pools would be building a long chain of funky blocks, but the nodes won't sync to it. Instead they would just wait for an acceptable block or, at worst, get blackmailed into downloading a new node version.

All the pools can do is either give in and start making honest blocks that follow the rules so they can spend their block rewards, or, if a majority of pools are not following the rules, waste months holding the network hostage HOPING it's enough to pressure the nodes into downloading a new node version that accepts the blocks, so that the rewards are then recognised and spendable.

It's really time people actually learn consensus.
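
For what it's worth, the mechanism described above can be shown in a toy sketch: nodes don't follow the longest chain, they follow the most-work chain among blocks that pass their own rule checks, so a majority of rule-breaking pools just produces a chain nobody syncs to. (Simplified placeholders below, not Bitcoin Core's actual validation code.)

Code:
# Toy sketch of node-side consensus: a block that breaks local rules is
# rejected no matter how much hashpower built on top of it.

MAX_BLOCK_WEIGHT = 4_000_000

def tx_is_valid(tx) -> bool:
    return tx["in_value"] >= tx["out_value"]   # no inflation (toy rule)

def block_is_valid(block) -> bool:
    if block["weight"] > MAX_BLOCK_WEIGHT:     # consensus size rule
        return False
    if not block["pow_ok"]:                    # proof-of-work check
        return False
    return all(tx_is_valid(tx) for tx in block["txs"])

def best_chain(candidate_chains):
    """Pick the most-work chain whose EVERY block passes local rules."""
    valid = [chain for chain in candidate_chains
             if all(block_is_valid(block) for block in chain)]
    # A 'funky' majority chain containing one invalid block never makes it
    # here, however long it grows on the pools' hard drives.
    return max(valid, key=lambda chain: sum(b["work"] for b in chain), default=None)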
legendary
Activity: 2674
Merit: 2970
Terminated.
Are you familiar with calendars? What's impossible today might not be "tomorrow". The only way to not scale on-chain is to say we will never scale on-chain. If we don't try, we'll never know. If we reject it from the start, it's definitely not going to happen indeed.
I clearly see that you have nothing to do with software engineering or algorithmic complexity. Scaling on-chain for the mainstream is not possible without extreme levels of centralization. This is a fact. I don't plan on fantasizing about miracles in several areas of technology which may somehow enable *more scale* for *less* centralization.

No, SegWit isn't equal to big blocks. Big blocks are, well... bigger blocks. SegWit separates the witness, thus making more room in the same old 1 MB block we know. One approach raises the space we have and the other uses the space we have in a smarter way; both approaches scale. Please correct me if I'm wrong.
You don't really understand it. My statement regarding SegWit being essentially a block size increase without a HF is correct. Look at any testnet explorer and you can find blocks that are 2-4 MB in size. Separating the witness is the method used to achieve this (the size calculations are done differently, so you are seemingly able to transact more without a hard-fork block size increase).
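
For reference, the rule that makes such blocks valid without a hard fork is segwit's weight limit (BIP141): a block is valid if 3 x base size + total size <= 4,000,000, where base size excludes witness data. A quick sanity check (the block compositions are made-up examples):

Code:
# Segwit weight rule (BIP141): weight = 3*base_size + total_size <= 4,000,000.
# base_size = block serialized without witness data; total_size = with it.

def block_weight(base_size: int, total_size: int) -> int:
    return 3 * base_size + total_size

def is_within_limit(base_size: int, total_size: int) -> bool:
    return block_weight(base_size, total_size) <= 4_000_000

# Pre-segwit-style block: all 1,000,000 bytes are base bytes.
print(is_within_limit(1_000_000, 1_000_000))   # True  (weight exactly 4.0M)

# A 2 MB block with 0.5 MB of base data and 1.5 MB of witness data fits:
print(is_within_limit(500_000, 2_000_000))     # True  (weight 3.5M)

# A 2 MB block that is mostly base data does not:
print(is_within_limit(1_500_000, 2_000_000))   # False (weight 6.5M)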

Saying that hardware won't keep up with network needs implies that blocks would be 10 GB each overnight if we allowed them to be.
1) No, I did not imply that.
2) Nonsensical exaggeration (from 2 MB to 10 GB).
3) Any attacker could utilize the maximum block size overnight.

Price decreases are why I constantly delay migrating from HDDs to SSDs...
That statement wasn't properly written either.
legendary
Activity: 3514
Merit: 1280
English ⬄ Russian Translation Services
You are not a shill, you are just a fountain of (mostly) empty verbiage

Coming from you, I take that as a compliment :)

I'm curious whether you care that someone actually reads your posts in their entirety. Personally, I don't read them beyond a few lines. In most cases, this is more than enough to understand that they are not worth reading. You would fare a lot better if you kept them concise, coherent and simple

The point in that discussion was slightly different, but related to the point you raise here.  My argument there was that a centralized network (centralized, in the political sense, of central authority, but can be technically distributed, with nodes under central control) can ALWAYS perform technically as well as a decentralized network (which has no choice but to be also distributed), simply because the set of possible decentralized networks, concerning their technical implementation, is a SUBSET of the set of possible centralized networks

I told you already that all your arguments (even if they are technically correct) are invalidated by economics

But this thread is not about that question (which you seem to understand somehow), so if you want to continue to exercise in futility and hilarity, you can do that in that thread. This thread is about scalability (Bitcoin's scalability, more specifically), not centralized versus decentralized systems (read: trusted versus trustless networks), and I basically claim that it is feasible to make a payment system which would be effectively infinitely scalable, because its processing capacity would always match its expansion in a self-sustaining way (i.e. it will maintain itself without external effort). Just like the Internet works, at least as long as we don't take into account DNS issues
hero member
Activity: 770
Merit: 629
A truly scalable solution, as I recently explained to some dude here, means that processing capacity increases together with the network's expansion without any specific action aimed at increasing it. In this way, it remains effectively unlimited at any given moment

Which is, as of today, essentially impossible in a decentralized system with fungible units of account. And if you are talking about me as the dude: no, you didn't invalidate what I was saying there, and as you couldn't find any rational arguments, you started accusing me of being unfair in the discussion.

The point in that discussion was slightly different, but related to the point you raise here.  My argument there was that a centralized network (centralized, in the political sense, of central authority, but can be technically distributed, with nodes under central control) can ALWAYS perform technically as well as a decentralized network (which has no choice but to be also distributed), simply because the set of possible decentralized networks, concerning their technical implementation, is a SUBSET of the set of possible centralized networks.

By that, I simply mean that if you can think of a decentralized network that can handle a certain activity, then you can set up a centralized, distributed network with exactly the same topology and hardware-like investment; which, at that point, can of course handle the same load. But moreover, in a centralized system, one has much less of a hassle solving the trustlessness and consensus problems. So every decentralized network can technically also be a centralized network (the nodes are simply under one single command instead of having different untrusted owners). But a decentralized network cannot take on every possible centralized structure, as each owner needs to have his own node of some kind. This is why the technical solutions of a decentralized network are a subset of the set of centralized (but eventually distributed) networks.

But on top of that, the task to be run by a centralized network, where all of its nodes are of course trusted, is much simpler than the task of the decentralized network that needs to use protocols to protect from trustlessness, and come to a consensus, which is not an issue in a centralized network.

An argument was that the "processing power/network of the decentralized nodes is for free" while the centralized system has to explicitly handle this.  My point was that the hidden cost per decentralized node is higher than the per-customer cost in the centralized system (exactly for the same reasons as above that show that the network set is a subset).

However, the larger the network, the higher the per-user cost in any case in a decentralized system, which brings us to the impossibility of your statement above. The reason is that in a decentralized system with fungible units of account, every user needs, in some way or another, to be aware of the balances of every other user. This is not the case in a centralized network, where only the 'central authority' needs to know that. As such, it is unavoidable that the burden of communicating the balances of a growing number of users and a growing number of transactions to a given user increases as the number of users increases.
The burden for me to know what the 10 other people on a small network did is smaller than the burden for me to know what 1 billion people did. This is unavoidable. One can only hope that the value of the large network is worth more to me than the extra cost of having to know what all these other people did on the network.
A form of centralization can scale down the cost of having to be aware of all the other users' balances. In a totally centralized system, there's no such cost to a single user: I profit from a larger network at no extra cost. The more decentralized a network is, the more my cost increases with the number of users.

The two centralisation solutions to, say, bitcoin are:

- only a few central miners. Then my burden is not increasing: my light wallet acts as my banking app on my smartphone. The miners are the central authorities keeping the block chain and hence people's balances, and I only need to be aware of my own transactions (light wallet).

- only a few central LN hubs.  Similar, if I can engage LN with a light wallet.  The central hub keeps the information of all its customers in all its channels, and I only care about my channel.

The standard way of solving this is:
- normal banks.  

So, a fungible currency in a decentralized system will always have a cost per user rising with the size of the network (naively, it is linear, but one can try to find sub-linear tricks); while this cost will be constant in a centralized system and not rise with the size.
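
A back-of-the-envelope version of that claim, as a toy model rather than a measurement: in a decentralized system every verifying participant processes everyone's transactions, while a centralized user only processes his own.

Code:
# Toy cost model: per-user verification burden vs. network size.
# Assumption: each user makes 2 transactions/day and, in the decentralized
# case, every verifying node must process all of them.

def per_user_burden_decentralized(n_users: int, tx_per_user: int) -> int:
    return n_users * tx_per_user          # I validate everybody: O(n)

def per_user_burden_centralized(n_users: int, tx_per_user: int) -> int:
    return tx_per_user                    # I only see my own txs: O(1)

for n in (10, 1_000, 1_000_000_000):
    print(f"{n:>13,} users: decentralized node handles "
          f"{per_user_burden_decentralized(n, 2):>13,} txs/day, "
          f"a centralized user only {per_user_burden_centralized(n, 2)}")

# The decentralized per-user burden grows linearly with the network;
# sub-linear tricks can soften, but not remove, the growth.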

This is why decentralized networks seem to outperform centralized ones when they are small, and run into cost problems when they grow bigger, becoming totally non-competitive when they dream of going mainstream, unless they centralize, which economic forces in the system will make happen in any case.

So, ironically, the structure that allows a network expansion "that pays for itself" is a centralized one.  Each user simply has to pay for his own burden on the centralized structure (with a fee), and that burden doesn't increase if the network grows.  In a decentralized network, adding a user increases the burden for every node in one way or another.
hero member
Activity: 770
Merit: 629
You are not a shill, you are just a fountain of (mostly) empty verbiage

Coming from you, I take that as a compliment :)
legendary
Activity: 3514
Merit: 1280
English ⬄ Russian Translation Services
As I've already explained somewhere here, both approaches have centralization risks. But in both cases, if technology evolves rapidly, then they become more decentralized.

Big blocks can be decentralized if internet bandwidth, storage and CPU/RAM costs (needed to validate the tons of transactions in big blocks) go down significantly. At this very moment, a 100-MB-block Bitcoin could be enough to serve 1 billion users (the maximum potential I see today: first-world households and the upper-middle class of third-world countries), but that would definitely be too much for consumer hardware; mid-scale businesses in first-world countries could probably handle up to 8-10 MB

Big blocks above certain limits are meaningless

And they are not just meaningless, they are in fact detrimental, since once we start increasing them, we will see the same arms race we already saw in mining, which would be a lot harder to escape than to prevent. So, while we have some time ahead (maybe a year or so), we should develop really scalable solutions. And no, increasing the block size is nowhere near that. A truly scalable solution, as I recently explained to some dude here, means that processing capacity increases together with the network's expansion without any specific action aimed at increasing it. In this way, it remains effectively unlimited at any given moment
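
To put the quoted 100 MB figure in perspective, the raw resource arithmetic looks roughly like this (straight multiplication, deliberately ignoring pruning, compact-block relay and other mitigations):

Code:
# Back-of-the-envelope resource cost of 100 MB blocks (naive worst case:
# no pruning, no compact-block relay).

BLOCK_MB = 100
BLOCKS_PER_DAY = 24 * 6                  # one block every ~10 minutes

daily_mb = BLOCK_MB * BLOCKS_PER_DAY
yearly_tb = daily_mb * 365 / 1e6

print(f"~{daily_mb / 1e3:.1f} GB of new chain per day")   # ~14.4 GB/day
print(f"~{yearly_tb:.1f} TB of new chain per year")       # ~5.3 TB/year

# On top of storage: upload bandwidth to relay each block to several peers
# within seconds, plus CPU/RAM to validate every transaction in it. That is
# the gap between consumer hardware and the 8-10 MB mentioned above.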

I have to say I never understood the priorities in the worries expressed in this debate, and I think it is because people start from some pre-conceived dogma to which they think they have to cling, and adapt all the rest around it.  I know that what I'm saying is considered controversial, and that I'm called a shill for that sometimes (I'm not)

You are not a shill, you are just a fountain of (mostly) empty verbiage
hero member
Activity: 770
Merit: 629
I always ask this question:  why not let the free market decide?  Why not give users uninhibited access* to both on-chain and off-chain solutions and let the LN hubs and the miners compete for the fees?  

I think that if the choice is between on-chain and off-chain transactions, and there are no limitations, everybody is going to prefer on-chain transactions, because with an on-chain transaction you are free. Your funds are not locked up in a channel with a well-determined partner. It is an affair between payer and payee (and miner), and there are no "intermediaries". I think the reason those in favour of segwit don't want larger blocks is that they need an incentive to move off chain. If the chain can handle it, there's no reason to leave the chain. And if there is no hard limit feeding the fee market, there's no reason these fees would be much higher than the fees for those locking up funds in channels with doubtful partners.
hero member
Activity: 770
Merit: 629
Then why not follow that model? What are the advantages and disadvantages of an algorithm- and demand-based block size solution like that of Monero, in your opinion?

Well, my personal opinion on that, which has no value beyond being my own opinion, is that alt coins were essentially invented to improve upon the "Ford T" that bitcoin is: the first cryptocurrency technology. As such, instead of trying to "upgrade bitcoin", it is simpler to move to the alt coin and leave bitcoin for what it is. What's the use of transforming bitcoin into, say, monero, instead of just using monero?

Concerning monero, I'm personally quite a fan of that currency, even though I have nothing to do with them apart from owning a few self-mined coins for fun that wouldn't even buy me a new laptop, because monero solved several, though not all, of the no-go fundamental issues I had with bitcoin, the first and most important one being fungibility (which comes down to "anonymity"/"privacy").

Essentially, I consider the big defects of bitcoin to be, amongst others:
- hard currency limit
- simplistic hashing algorithm that can easily be put into ASICs
- hard block limit
- transparency of the block chain
- slow block confirmations
- the principle of PoW by itself not being sustainable

I consider that more than enough for bitcoin not to be acceptable as a long term crypto currency.  I perfectly well understand that the "first crypto prototype" couldn't solve all these issues, and that it was already a great feat to have a system up and running, but it has too many fundamental defects to be seriously considered as "definitive".

Monero (actually bytecoin, with CryptoNote) was invented mainly to solve the privacy issue of bitcoin, but it solved other issues too. It has:
- tail emission
- complicated hashing algorithm (CryptoNight) that resists, for the moment, ASICs
- flexible block size
- obfuscated transactions
- faster confirmation

However, monero still needs PoW; I consider that a problem with monero (it is, however, very difficult to do PoS with anonymous coins, as the stakers in principle don't want to show their stake). Even though tail emission is better than a "sound money hard limit", I think that the emission scheme of monero is still too much "speculative asset" and not enough "stable currency". I also consider the PoW reward system fundamentally flawed. So, even though monero solves the most urgent flaws in bitcoin, it still has problems. Nevertheless, I consider it vastly technologically superior to bitcoin.

Monero (like many other coins) also has a totally different, but important, difference from bitcoin: regular hard forks. This is of course an open recognition of the centralization of the protocol, and hence leaves the notion of a decentralized system on the side: monero, like ethereum, DASH, etc. is a totally centralized protocol system, totally in the hands of the dev team. But this is known and accepted from the start. The dev team can act on many things but, because it is an anonymous coin, cannot use its central decision power against specific users (which ethereum can, and did; see the DAO fork). Given that monero hard-forks every 6 months or so, and given that the dev team is a recognized central authority, there are no forking dramas. Nothing stops users, however, from forking off and keeping the protocol the same if they think their dev team is screwing them with the next hard fork, making a new, and this time truly immutable and decentralized, coin forked off from the "prototype producer". A bit like ETC did when it didn't agree with the central power the ETH team used against a single user (the DAO hacker).

If you like these properties, and if you think they are problems for bitcoin, then of course one could transform bitcoin into a monero clone. But one could also just leave bitcoin and use monero instead; that sounds more logical, no? Why try to transform one coin into another while the other one already exists?

My idea of crypto is that it is a kind of contract that is subscribed to when the genesis block is laid down, and that it should, apart from some simple technical tweaking, remain the same, or else have an openly centralized culture with no pretension of being anything but a dev team's toy, until someone forks off and keeps it immutable if they so desire.

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
Can the block size be decided algorithmically depending on the demand? Why do the miners have to control it? That would make it open to a lot of bad actors colluding.

Of course.  Monero has such an algorithm for instance.  It is maybe not the best possible, but at least it solves the issue in a specific way.

https://monero.stackexchange.com/questions/35/is-there-any-block-size-limitation

Then why not follow that model? What are the advantages and disadvantages of an algorithm- and demand-based block size solution like that of Monero, in your opinion?

I recently created a poll thread asking who should decide the maximum block size: miners, developers, or a code-based algo.
The latter won by a large margin. There have been solid proposals ("flexcap") put forward. I support this idea.
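
For reference, the demand-based mechanism behind the Monero link quoted above works roughly like this: blocks may grow past the median size of recent blocks, but the miner's reward is cut by a quadratic penalty, so including extra transactions only pays while their fees outweigh the penalty. (A simplified sketch; the real rules have more details, such as a minimum penalty-free size.)

Code:
# Sketch of Monero-style dynamic block size (simplified; real constants
# and edge cases differ from the actual protocol).

def block_reward(base_reward: float, block_size: int, median_size: int) -> float:
    """Reward after the quadratic penalty for oversized blocks."""
    if block_size <= median_size:
        return base_reward                  # no penalty up to the median
    if block_size > 2 * median_size:
        raise ValueError("invalid block: more than 2x the median size")
    excess = block_size / median_size - 1   # in (0, 1]
    return base_reward * (1 - excess ** 2)  # quadratic penalty

median = 300_000
for size in (250_000, 330_000, 450_000, 600_000):
    print(size, round(block_reward(10.0, size, median), 3))
# 250000 -> 10.0, 330000 -> 9.9, 450000 -> 7.5, 600000 -> 0.0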

legendary
Activity: 2898
Merit: 1823
Can the block size be decided algorithmically depending on the demand? Why do the miners have to control it? That would make it open to a lot of bad actors colluding.

Of course.  Monero has such an algorithm for instance.  It is maybe not the best possible, but at least it solves the issue in a specific way.

https://monero.stackexchange.com/questions/35/is-there-any-block-size-limitation

Then why not follow that model? What are the advantages and disadvantages of an algorithm- and demand-based block size solution like that of Monero, in your opinion?
legendary
Activity: 3906
Merit: 6249
Decentralization Maximalist
I just thought of that myself. I don't know of any real-world study about this stuff apart from the selfish mining paper, which explicitly requires miners to be in the P2P network *without* direct connections for it to work. It would be difficult to investigate without the help of the pool nodes themselves, and I don't think they would give this information, because I can imagine that it would show a reality that doesn't correspond to the "competitive image" they want to give.

Thanks for the answer. Well, I've asked in the mining subforum because I consider that topic crucial for evaluating the level of decentralization we have today.

I'm sympathizing not only with a Bitcoin change to PoS (that would be difficult, if not impossible) but also with a PoW change that would make pools more inefficient (and solo/non-ASIC mining more efficient, so this centralization risk could be mitigated a bit), but I would have to investigate further; I still haven't had much time for it, and the topic is fairly new in the scaling discussion.

Edit: I just saw your answer, @franky1:

Though there is a FIBRE network of supernodes that propagates data between pools super fast and acts as the first tier outward to the node network, the node network does actually play a crucial part.

Is this a confirmation that they effectively connect directly to one another, or did you mean to write "though there was a Fibre network"?

Quote
All those 20 pools could make their own rules and make blocks how they like just between each other, but the 800+ main merchant nodes, backed up by the other 6000 user nodes, can reject blocks and literally refuse to see them, for good reasons.

I don't understand how this mechanism could work - like dinofelis already wrote, I suppose the merchants you mention would have to split from the network, because if the pools are basically connected directly to one another, who would care about a merchant node not accepting a block? From my understanding, their (ultra-)minority chain would soon die.

The only way I could imagine is that most competing pools could mine on the merchants' chain and then orphan the "block with changed rules". Is this the way it would work? But then we would depend on the assumption that the other pools are not agreeing to the change introduced by the "rule-changing pool". But what if 51% agree? I think that's what dinofelis already wrote so you can answer his post directly.
hero member
Activity: 770
Merit: 629
Though there is a FIBRE network of supernodes that propagates data between pools super fast and acts as the first tier outward to the node network, the node network does actually play a crucial part.

All those 20 pools could make their own rules and make blocks how they like just between each other, but the 800+ main merchant nodes, backed up by the other 6000 user nodes, can reject blocks and literally refuse to see them, for good reasons.

And would they then stop? Seriously?
Of course it depends on the craziness of the changes; if it is too weird, indeed, exchanges might prefer to stop their nodes (which is what you are suggesting) rather than accept these totally crazy blocks, and put a notice on their website: "sorry, bitcoin block chain down, we stopped our nodes, so no more withdrawals or deposits for the time being, but you can continue trading with bitcoin IOUs on the site".

But suppose, for instance, that all the miners decide to increase the block size to 2 MB, and that's it. Do you think that exchanges are going to stop their nodes and not serve their customers any more? Do you think that you, as a bitcoin user, are not going to upgrade your software to be able to download that bigger-block block chain, and just sit there with a node that has come to a halt?

Because if all miners agree, there won't be any growth in the old chain. Remember that we are in the case where all pools agree on a change. If they all agree, the old chain simply stops growing, and the non-mining nodes that want to see the old chain simply stop, cannot do any transactions any more and cannot confirm anything.

I'm pretty sure that all economic users (which are not the non-mining nodes!) are going to adapt to whatever 'reasonable' chain the miners are going to mine, eventually connecting their light wallets directly to miner nodes if they don't find peers in the P2P network that are still running, in order to be able to transact. Simply because there is no other chain available.

And exchanges will do the same: they will update their full nodes to follow the miners' rules, because they don't want to kill their business by having to say that they cannot accept any deposits or withdrawals while their competitor upgraded.

Quote
Meaning the pools end up wasting time making blocks that are seen as unspendable well before a pool's reward matures...
making pools know they won't even get to spend it before they even get a chance.

That is perfectly true for SOME mining pools changing stuff. But we are in the scenario where ALL mining pools agreed upon a change, remember. If SOME mining pools change stuff, we have a hard fork. If ALL mining pools change something, then there's only one block chain out there. You use it, or you don't. If you don't, you simply have no access to your funds any more.
hero member
Activity: 770
Merit: 629
I've also thought about that scenario. But is it as easy as you describe? Mining pools, after all, are in theory competing entities, and no pool would like to give a "faster connection to another miner" for free - and if they do, then they are colluding and we are effectively 51%ed already.

What I doubt is whether it is really "mutually beneficial", because if there is no direct link between both miners, the benefit is the same as if both have a direct connection. It is only mutually beneficial if, e.g., two pools unite against the other pools.

Well, it is this kind of mutual benefit that they have. As a miner, I want to know, as fast as possible, about the block of my competitor that won, because if I continue mining on the old block, I'm wasting my hash rate *if that competitor's block got first to most other competitors*. So I want to *receive* "the next block" as quickly as possible. But I also want other miners to *know* MY solved blocks as quickly as possible, because it is only when they start mining on them that I am sure my block is not going to be orphaned.

So even though I'm competing with other mining pools, I want to communicate with them as well as possible (at least with the important ones), because any delays are bad for me not only on the receiving side but also on the emitting side. If I receive late, I waste hash rate, and if others receive my blocks late, my risk of orphaning is bigger.

In fact, this is the opposite of "selfish mining", because selfish mining was based upon honest miners' blocks being potentially slower on the network than my selfishly mined sequence of blocks; in other words, me profiting from the fact that the other miners are badly connected between them. So direct connections are also beneficial as protection against other miners applying selfish mining. In fact, clusters of pools that are well connected profit somewhat from "selfish mining" with respect to those that aren't in their backbone network.
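
That incentive is easy to quantify with a standard back-of-the-envelope model: if block finds are Poisson with a 600-second mean, the chance that someone finds a competing block while mine is still propagating is roughly 1 - e^(-t/600) for a propagation delay of t seconds.

Code:
# Orphan-risk sketch: probability that a competing block appears while my
# block is still propagating, assuming Poisson block arrivals (mean 600 s).
import math

def orphan_risk(propagation_delay_s: float, block_interval_s: float = 600.0) -> float:
    return 1 - math.exp(-propagation_delay_s / block_interval_s)

for delay in (0.5, 2, 10, 30):
    print(f"{delay:>4} s delay -> ~{orphan_risk(delay):.2%} orphan risk")
# 0.5 s -> ~0.08%, 2 s -> ~0.33%, 10 s -> ~1.65%, 30 s -> ~4.88%.
# Shaving propagation from tens of seconds to sub-second is worth real
# money to a pool: exactly the incentive for direct peering.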

Quote
That doesn't mean that your scenario isn't possible or even probable, but here I would like to see academic investigations about that topic. (If you have access to some, then I'll be grateful for a link).

I just thought of that myself. I don't know of any real-world study about this stuff apart from the selfish mining paper, which explicitly requires miners to be in the P2P network *without* direct connections for it to work. It would be difficult to investigate without the help of the pool nodes themselves, and I don't think they would give this information, because I can imagine that it would show a reality that doesn't correspond to the "competitive image" they want to give.

In fact, I don't even know if these mining pools have to "agree amongst themselves" (that is, whether they have to phone one another). If I knew the IP addresses of my competing pools, as a pool owner I would configure my node to connect *directly* to them as my network peers. And they would probably include me as a direct peer. And if all of you pay for premium internet access, you automatically get a kind of backbone mesh between pools, without "sitting in a room and agreeing".
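
Concretely, in Bitcoin Core this takes nothing more than a few addnode lines in bitcoin.conf (addnode is a real option that keeps a persistent connection to the given peer; the addresses below are documentation-reserved example IPs, not actual pool nodes):

Code:
# bitcoin.conf -- peer directly with known pool nodes (example addresses)
addnode=203.0.113.10:8333
addnode=198.51.100.7:8333
# addnode keeps persistent outbound connections to these peers in addition
# to the normal randomly-selected P2P peers.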
legendary
Activity: 924
Merit: 1000
LN is currently aiming at $60

so there is your micropayment upper limit ;)

I believe that as long as on-chain transactions over 0.05-0.1 BTC are affordable (a fee of no more than 0.1%), there is no need for larger blocks

Delusional.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
LN is currently aiming at $60

so there is your micropayment upper limit ;)

I believe that as long as on-chain transactions over 0.05-0.1 BTC are affordable (a fee of no more than 0.1%), there is no need for larger blocks

I always ask this question: why not let the free market decide? Why not give users uninhibited access* to both on-chain and off-chain solutions and let the LN hubs and the miners compete for the fees? I say we should. I say that if smaller transactions become problematic on-chain, they will naturally become slower to confirm and more expensive on their own... and there is no need to force miners or users to behave a certain way for the good of the network.

* meaning any block size limits must be well beyond market demand for space
sr. member
Activity: 1400
Merit: 269
The question is: can the off-chain network still handle the bitcoin transactions?
Isn't that why there is a scaling issue where users' payments get delayed? What's the point of a decentralized network if your transaction takes a week to get confirmed?
People say that segwit is owned by bankers, but is it really true, or is it all a bunch of conspiracy theories planned to throw people off so bitcoin will have a scaling problem forever?
legendary
Activity: 1372
Merit: 1014
LN is currently aiming at $60

so there is your micropayment upper limit ;)

I believe that as long as on-chain transactions over 0.05-0.1 BTC are affordable (a fee of no more than 0.1%), there is no need for larger blocks
legendary
Activity: 1512
Merit: 1012
There's no way to scale bitcoin to mainstream levels on-chain, there's just no possible way, because it would take insane levels of block size
Insane levels? Why?
That's a very weird question. Are you familiar with basic math, i.e. the kind done in primary school? 7 TPS in theory, 3 TPS in practice at 1 MB. That's ~300k TXs per day (current statistic). To serve 1 billion people, you'd need a capacity of at least 1 TX per person per day (which can be used as an average, as some will do a lot more while others won't transact every day). 1000 MB blocks bring us to 300,000k TXs per day, or 300 million per day (~3000 TPS). In comparison, Visa does 2000 TPS on average and is capable of doing at least 10x that. It's just nonsensical to even attempt this after a certain size if you want to retain decentralization.

Are you familiar with calendars? What's impossible today might not be "tomorrow". The only way to not scale on-chain is to say we will never scale on-chain. If we don't try, we'll never know. If we reject it from the start, it's definitely not going to happen indeed.
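
Incidentally, the arithmetic in the quote above is easy to check, and extending it linearly shows what a literal 1-tx-per-person-per-day target would take (pure throughput arithmetic, ignoring propagation and validation limits):

Code:
# Checking the quoted numbers, using the quoted ~3 TPS practical
# throughput per 1 MB of block space.
SECONDS_PER_DAY = 86_400
TPS_PER_MB = 3

txs_per_day_1mb = TPS_PER_MB * SECONDS_PER_DAY
print(f"1 MB blocks:    ~{txs_per_day_1mb:,} txs/day")      # ~259k (~300k quoted)

txs_per_day_1gb = 1000 * txs_per_day_1mb
print(f"1000 MB blocks: ~{txs_per_day_1gb:,} txs/day "
      f"(~{txs_per_day_1gb / SECONDS_PER_DAY:,.0f} TPS)")   # ~259M/day, 3000 TPS

target_txs_per_day = 1_000_000_000                          # 1 tx/person/day
print(f"1B txs/day needs ~{target_txs_per_day / txs_per_day_1mb:,.0f} MB blocks")
# -> blocks in the multi-GB range at this linear rate.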

Hasn't SegWit been developed? Won't it tidy up transactions in the block? Have you no confidence that SegWit or something similar can work in the future, or that it can be further developed or improved?
Do you even understand what SegWit does? It doesn't seem like you do. SegWit is essentially equal to big blocks (if used; the effective size ranges from 1 to 4 MB depending on said usage). SegWit scales down the complexity of validation time from quadratic to linear. That's the only significant improvement related to bandwidth, propagation, storage, RAM or processing, which are the constraints for big blocks.

No, SegWit isn't equal to big blocks. Big blocks are, well... bigger blocks. SegWit separates the witness, thus making more room in the same old 1 MB block we know. One approach raises the space we have and the other uses the space we have in a smarter way; both approaches scale. Please correct me if I'm wrong.
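
The quadratic-to-linear claim above refers to signature hashing: with legacy transactions, each input's signature hashes (almost) the whole transaction again, while segwit's BIP143 scheme reuses precomputed midstates, so the hashed data grows only linearly. A toy byte-count comparison (the per-input size is an illustrative assumption):

Code:
# Toy comparison of bytes hashed for signature checks.
# Legacy: each of n inputs re-hashes ~the whole tx    -> O(n^2) bytes.
# Segwit (BIP143): shared midstates, ~fixed per input -> O(n) bytes.

TX_BYTES_PER_INPUT = 150          # illustrative size contribution per input

def legacy_hashed_bytes(n_inputs: int) -> int:
    tx_size = n_inputs * TX_BYTES_PER_INPUT
    return n_inputs * tx_size     # whole tx hashed once per input

def segwit_hashed_bytes(n_inputs: int) -> int:
    # three reusable 32-byte midstates plus each input's own data
    return 3 * 32 + n_inputs * TX_BYTES_PER_INPUT

for n in (10, 100, 1000):
    print(f"{n:>5} inputs: legacy ~{legacy_hashed_bytes(n):>12,} B, "
          f"segwit ~{segwit_hashed_bytes(n):>9,} B")
# At 1000 inputs, legacy hashes ~150 MB of data while segwit stays well
# under 1 MB -- the 'quadratic sighash' problem big native txs create.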

Block sizes won't increase drastically overnight.
Nobody says that they will. You need to be prepared for the worst case though, even if it means someone filling up the maximum block size from day 1.

Saying that hardware won't keep up with network needs implies that blocks would be 10 GB each overnight if we allowed them to be. It's a recurrent theme in forum posts lately. I do agree we have to be ready for the worst-case scenario though (thus dynamic block approaches have been suggested).


New hardware will always appear and get cheaper, so consumers will eventually switch when their hardware becomes slow, unsupported or obsolete, so that's a plus for Big Blocks. Availability is pretty high (many people have slow, but stable connections), so that's a plus for LN.
Incorrect generalization. RAM and SSD prices are currently rising due to shortages. This is, of course, temporary, but it invalidates your statement.

SSD prices are decreasing in the long run and capacities are increasing. There's empirical evidence of this for those who regularly browse hardware retailers' websites: disks tend to get cheaper, and lower-capacity disks are being discontinued. One of the stores where I usually buy hardware just recently slashed prices and discontinued SSDs under 120 GB. Prices do increase slightly every now and then due to chip providers going bankrupt, other shortages or simple increases in demand. Some price increases aren't noticeable to the end user because stores already have their prices set with quite a nice profit margin.

I'm sorry if I haven't made myself clear: prices do decrease in the long run, unless something really out of the ordinary happens. A quick search shows us this; if you have more recent data (and data about RAM), please do post it (I'm actually interested in it for things beyond Bitcoin).

Price decreases are why I constantly delay migrating from HDDs to SSDs...