
Topic: LN+segwit vs big blocks, levels of centralization. - page 9.

hero member
Activity: 770
Merit: 629
it's far better to have 20 brands of implementations from different teams, all using consensus to stay on the same network, rather than thinking there needs to be a team "trusted to become the new core devs"

I fully agree with that.  If several software implementations, independent of one another and on an "equal footing", had arisen, one would have seen a clear distinction between the protocol on one hand and the software implementations on the other.  A bit like HTML on one hand, and Mosaic, Netscape, Firefox, Internet Explorer... on the other.

But Satoshi, the central decider of bitcoin, put up a Pope as his heir: Core.  As such, this centralized force continued to exist.  When other elements in bitcoin's power structure arose and put that hegemony into question, people were surprised.
hero member
Activity: 770
Merit: 629
The only way I could imagine is that most competing pools could mine on the merchants' chain and then orphan the "block with changed rules". Is this the way it would work? But then we would depend on the assumption that the other pools do not agree to the change introduced by the "rule-changing pool".

this is how bitcoin has always worked:
nodes set the rules, and pools follow the rules or find their block rejected in 3 seconds.
if a couple of pools colluded to keep the funky block alive by building on top of it (within the pools' hard drives, not the network),
it would still get rejected by the node network.

But what if 51% agree? I think that's what dinofelis already wrote so you can answer his post directly.

if a majority of pools colluded, their blocks would be rejected by the network, but it COULD cause such a fiasco of making people wait for the few honest pools to make a good block that it MIGHT be enough to blackmail node users into downloading a new node that would accept the pools' new funky blocks

yes, the pools would be building a long chain of funky blocks, but the nodes won't sync to them. instead they would just wait for an acceptable block or, at worst, get blackmailed into downloading a new node version.

It is not a matter of blackmailing.  Most users (that is, the people buying and selling bitcoins against goods, services and fiat) use light wallets or online wallets, only care about their transaction getting through, and couldn't care less about the politics and technical strategies of bitcoin.  Most exchanges want users to trade, exchange, deposit and withdraw, because they take fees on all of that.  So, apart from totally crazy modifications, they just want "a node that works", and will point their light wallets to whatever works.  The 5000 or so bitcoin full nodes that do not mine are insignificant with respect to the millions of bitcoin owners/users/traders.

So if the large majority of these 5000 nodes simply decides not to download the only block chain available out there, then it is simply as if they stopped working.  The users will point their wallets to the few nodes that still process and confirm transactions, that is, the miner pool nodes.  Exchanges will want users to be able to deposit and withdraw, so they had better upgrade their nodes so that they follow the "live" block chain.

I'm pretty sure that neither users with their light wallets nor exchanges are going to sit down, stop their nodes, and wait for an eventual block according to their rules that may never come, stopping their business and freezing their holdings for that indeterminate time while others happily continue.
legendary
Activity: 4424
Merit: 4794
they cannot be trusted to become the new core developers, not because of their motives but because they do not have enough skill.

a few things to clarify
1. no dev team should have control..
it's far better to have 20 brands of implementations from different teams, all using consensus to stay on the same network, rather than thinking there needs to be a team "trusted to become the new core devs"

2. even core devs should not be trusted. devs come and go; they age, get bored, get bought, retire or move on to different projects.
devs are temporary.

for reasons 1 and 2, no team should have their asses kissed and be held up as kings.

EG
real world
many people trust and adore apple and scream to the world that apple is the only phone for them, and start insulting anyone else who wants a samsung/nokia/htc etc.

my view is: let any manufacturer make a phone, have them all work on the same 4g network, and have them all work together to find a consensus on what standards a 5g network should have.

but many say "no, let apple be the one that chooses the 5g standards"
legendary
Activity: 2898
Merit: 1823
Can the block size be decided algorithmically depending on the demand? Why do the miners have to control it? That would make it open to a lot of bad actors colluding.

Of course.  Monero has such an algorithm for instance.  It is maybe not the best possible, but at least it solves the issue in a specific way.

https://monero.stackexchange.com/questions/35/is-there-any-block-size-limitation

Then why not follow that model? What are the advantages and disadvantages of an algorithm- and demand-based block size solution like Monero's, in your opinion?

I recently created a poll thread asking who should decide the maximum blocksize: miners, developers, or a code-based algo.
The latter won by a large margin.   There have been solid proposals ("flexcap") put forward.  I support this idea.
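For illustration, here is a minimal sketch of how a demand-based adaptive limit in the spirit of Monero's median rule or a flexcap-style scheme could work. This is not Monero's actual algorithm or any concrete flexcap proposal; the window length, floor and growth cap are assumptions chosen only for the example.

Code:
# Illustrative sketch of a demand-based adaptive block size limit.
# Assumed parameters (not from any real implementation): a 100-block
# window, a 1 MB floor, and a cap of 2x the median block size.
from statistics import median

def next_block_limit(recent_block_sizes, floor=1_000_000, max_growth=2.0):
    """Allow the next block to be up to max_growth times the median of
    the recent window, but never below a fixed floor."""
    m = median(recent_block_sizes[-100:])   # median size of the last 100 blocks
    return max(floor, int(m * max_growth))

# Sustained demand raises the limit; a single huge block does not,
# because the median resists outliers.
spike = [400_000] * 99 + [8_000_000]
full  = [950_000] * 100
print(next_block_limit(spike))  # 1000000: the one outlier changes nothing
print(next_block_limit(full))   # 1900000: sustained full blocks raise the cap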



Good to know you are not 100% behind the incompetent developers of Bitcoin Unlimited. You yourself should know by now that they cannot be trusted to become the new core developers, not because of their motives but because they do not have enough skill.
legendary
Activity: 4424
Merit: 4794
P.S. don't promise/promote 2mb or 4mb of data bloat
Which is something that I have not done.

check your own post history and the number of times you have been promoting 2.1mb.
there was even a time when many mentioned a 1.7mb max as a better expectation, but you went 'full wetard' repeating 2.1mb endlessly.

Here is a small example:
Quote
In practice, based on the average transaction size today and the types of transactions made, the block size limit is expected to have a maximum limit of ~1.7 MB post-SW.
They use outdated data to make it seem like Segwit brings only a little improvement for a 'lot of complexity'. The latest transaction pattern usage review shows that we can expect around 2.1 MB.

and for months i have been correcting your 2.1mb... with: IF 100% of users used segwit keys... which you keep failing to understand
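for reference, both the 1.7mb and 2.1mb figures fall out of the segwit weight rule (block weight = 3 x base size + total size, capped at 4,000,000) under different assumptions about the transaction mix. the witness fractions below are rough illustrations, not measured numbers.

Code:
# Effective block size under the segwit weight limit, as a function of
# the fraction of the block's bytes that are witness data.
#   weight = 3*base + total = 4*base + witness <= 4,000,000
MAX_WEIGHT = 4_000_000

def effective_block_bytes(witness_fraction):
    # with witness = w*total and base = (1 - w)*total:
    #   weight = (4 - 3*w) * total
    return MAX_WEIGHT / (4 - 3 * witness_fraction)

print(effective_block_bytes(0.00))  # ~1.00 MB: no segwit usage at all
print(effective_block_bytes(0.55))  # ~1.70 MB: e.g. simple spends, 100% segwit keys
print(effective_block_bytes(0.70))  # ~2.10 MB: e.g. multisig-heavy mix, 100% segwit keys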
legendary
Activity: 4424
Merit: 4794
Big block proponents usually ignore the quadratic hashing problem,

i'm laughing

i have FOR MONTHS been telling people segwit does not help reduce/avoid/prevent/stop/fix native-key quadratics; in fact it makes things worse

core v0.12: maxtxsigops 4,000 (10 seconds to process)
core v0.14: maxtxsigops 16,000 (8 minutes to process)

i have for months been saying keep it at 4k or below..
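as a back-of-the-envelope illustration of why native-key (legacy) sighashing is quadratic; the 150 bytes per input is just an assumed average, not a measured figure.

Code:
# Legacy (pre-segwit) SIGHASH_ALL: each input's signature check hashes
# its own near-full-size serialization of the transaction, so the total
# bytes hashed grow roughly with (inputs x tx size), i.e. quadratically.
def legacy_bytes_hashed(n_inputs, bytes_per_input=150):  # 150 is an assumption
    tx_size = n_inputs * bytes_per_input
    return n_inputs * tx_size

for n in (100, 1_000, 5_000):
    print(f"{n:>5} inputs -> ~{legacy_bytes_hashed(n) / 1e6:,.1f} MB hashed")
# prints ~1.5 MB, ~150.0 MB and ~3,750.0 MB: growth is quadratic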
legendary
Activity: 3514
Merit: 1280
English ⬄ Russian Translation Services
Big blocks above certain limits are meaningless

And they are not just meaningless, they are in fact detrimental since once we start increasing them, we will see the same arms race as we already saw in mining, which would be a lot harder to escape than prevent.

I fully agree, and that's why I consider Bitcoin Unlimited and other "unlimited" proposals dangerous. I'm however not opposed to a conservative block size increase based on technological evolution (e.g. the proposed 17.7% per year in BIP 103). In this scenario, centralization risk should be manageable.

I'm afraid that this is not a very good solution overall

In fact, it is not a solution at all, since what you (they) suggest ("a conservative block size increase") basically comes down to freezing the situation where it is now. I don't think it makes a lot of sense to mark time when we can move ahead and there is plenty of room for improvement. As the Red Queen told Alice (and I then repeated), we should run as fast as we can just to stay in place, and twice as fast if we want to get anywhere.
hero member
Activity: 770
Merit: 629
The way you downplay the importance of non-mining nodes is ridiculous imo. Non-mining nodes can keep the miners in check and stop them from acting in stupid ways. Non-mining nodes can threaten them with UASFs, and use any client they want to validate the transactions.

Well, that is the standard dogma, but I have challenged different people to find a logical hole in the gedanken experiment that proves the contrary, and the only thing they come up with is what I'd call rhetoric, that is to say, non-logical arguments (saying that I must be paid by someone, that I'm an idiot, that I'm wrong, and so on, but never a logical argument).

The closest are arguments that confuse non-mining nodes with users in the market.  But that's wrong, because users are the people doing transactions, and they mostly use light wallets.  There are millions of bitcoin users, and only a few thousand full nodes.  So full nodes are at most a per mille of the user community, and are representative neither of the users nor of their market weight.  As such, one clearly needs to distinguish the economic weight of users (who only care about the value of their token and their ability to transact) from the "large majority of non-mining nodes".  So non-mining nodes are not "the users" or "the market".

My gedanken experiment is as I outlined above: suppose that all 20 important mining pools (and their miners) decide to make ONLY a block chain of 2 MB blocks, that 99% of the non-mining nodes refuse that, but that the users only want to continue to transact their tokens from their light wallets or online wallets, no matter what (this is to separate the effect of non-mining nodes from that of users).  As such, they will use whatever means they have to continue making bitcoin transactions, and don't care about any of these politics.

Tell me how those non-mining nodes are going to keep the miners and the users in check and force them to remain on the 1 MB blocks, if the users connect their light wallets directly to the miner nodes, and the miner nodes are connected directly amongst themselves.  We can also assume that exchanges cannot allow themselves to have a stopped node, as they would lose all their customers to the competition, so the exchanges are also going to either use a light wallet directly connected to the miner nodes, or upgrade their full node to the 2 MB protocol.

So, now we have:
- all miners making a 2 MB chain, which they can continue to do because they send their blocks directly amongst themselves
- all users connect their light wallets directly to the miner nodes and send them their transactions
- exchanges are among the less than 1% of non-mining nodes that accept the 2 MB block chain and continue working
- all other (> 99%) non-mining nodes refuse these blocks, and want to "keep the miners in check, and menace them with a UASF"

Tell me how these non-mining nodes are going to succeed, which is your thesis, and which would be the proof that non-mining nodes keep the miners in check.

Once you finally realize (as Satoshi already told us in 2008) that "the only people needing to run full nodes are the people mining new coins", you might realize the futility of their existence in the power structure of bitcoin, and once you realize that, you might reconsider all arguments based on the need to keep them in place and the compromises that go with that.

It is very simple.  Demonstrate, in the above Gedanken experiment, the full power of the network of non-mining nodes, keeping miners and users in check.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political


Big block proponents usually ignore the quadratic hashing problem,

This isn't a long-term problem.  Flextrans has solved it in a way that makes linear scaling possible, regardless of block size.

https://bitcoinclassic.com/devel/Quadratic%20Hashing.html
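For contrast with legacy hashing, here is a minimal sketch of the general idea behind linear signature-hash schemes. This is roughly the shape of segwit's BIP143 hashing; I have not checked Flextrans's exact construction, so treat the constants as illustrative assumptions rather than its spec.

Code:
# Linear schemes hash the prevouts/sequences/outputs once for the whole
# transaction, then each input signs a small fixed-size message that
# reuses those precomputed digests.
def linear_bytes_hashed(n_inputs, bytes_per_input=150, per_input_msg=200):
    shared = n_inputs * bytes_per_input        # whole tx hashed once for the digests
    return shared + n_inputs * per_input_msg   # plus a constant-size message per input

for n in (100, 1_000, 5_000):
    print(f"{n:>5} inputs -> ~{linear_bytes_hashed(n) / 1e3:,.0f} KB hashed")
# prints ~35 KB, ~350 KB and ~1,750 KB: growth is linear in tx size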
legendary
Activity: 3906
Merit: 6249
Decentralization Maximalist
@dinofelis: I've got an answer: there is the FIBRE network; I think that's what franky mentioned in his post.

Big blocks above certain limits are meaningless

And they are not just meaningless, they are in fact detrimental since once we start increasing them, we will see the same arms race as we already saw in mining, which would be a lot harder to escape than prevent.

I fully agree, and that's why I consider Bitcoin Unlimited and other "unlimited" proposals dangerous. I'm however not opposed to a conservative block size increase based on technological evolution (e.g. the proposed 17.7% per year in BIP 103). In this scenario, centralization risk should be manageable.

Just for fun, I did a little kindergarten math: if we apply BIP 103, which is based on average internet bandwidth growth, then in 43 years we have 1 GB blocks (which according to Lauda would be necessary to have on-chain capacity for 1 billion people, if Bitcoin is used every day by its users). In 43 years a lot can happen, but I have slight doubts that the actual growth will be sustainable. So second-layer solutions are probably necessary, unless there's really a quantum leap in computing technology. LN is in my opinion good enough for micro- and minipayments up to 50-100 USD.
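For the record, the compounding behind that figure, assuming a 1 MB starting point and treating BIP 103's 17.7% per year as simple annual compounding (I believe the actual BIP steps the limit in smaller increments through the year, so this is only an approximation):

Code:
# 1 MB growing at 17.7% per year for 43 years
base_mb, growth, years = 1.0, 1.177, 43
print(base_mb * growth ** years)   # ~1100 MB, i.e. roughly 1 GB blocks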

if a majority of pools colluded, their blocks would be rejected by the network, but it COULD cause such a fiasco of making people wait for the few honest pools to make a good block that it MIGHT be enough to blackmail node users into downloading a new node that would accept the pools' new funky blocks

OK. Yes, I understand the basics, and you are absolutely right that this way a totally "funky" change that breaks established consensus rules (e.g. changing the block size to 1 GB) is prevented. But for this mechanism to work we don't need non-mining nodes, because the other pools would also be interested in invalidating overly "funky" changes, to prevent a price crash caused by the "instability" of the consensus rules.

So as far as I understand, the case where non-mining nodes become important is mostly when a small majority (let's say 51-70%) of the mining pools collude for a consensus change, but a majority of the economic nodes rejects that change. That would give the remaining pools the opportunity to continue mining an alternative chain which still has an acceptable block time and eventually becomes the longest chain if the dishonest "rule-changing" miners give up. If the majority of colluding pools is larger, however, the "honest" network would simply come to a halt.

My doubt is, as I discussed above with dinofelis, that in the case where miners are directly connected (as seems to be the case with the FIBRE network), there are other control mechanisms that would not work. For example, there was a discussion where it was proposed to delay block propagation from miners that do not agree to a soft fork (basically an earlier form of UASF, or a "UASF without an explicit UASF mechanism") to give them an incentive to agree. That, as far as I understand, would have no effect if pools are using the FIBRE network, because they would be able to relay their blocks to the other pools quickly anyway.
legendary
Activity: 1204
Merit: 1028
I always ask this question:  why not let the free market decide?  Why not give users uninhibited access* to both on-chain and off-chain solutions
and let the LN hubs and the miners compete for the fees?   I say we should.  I say if smaller transactions become problematic on-chain, they will become naturally slower
to confirm and more expensive on their own...and there is no need to force miners or users to behave a certain way for the good of the network.  

* meaning any blocksize limits must be well beyond market demand for space

The market has already decided: the majority of nodes support Core software (and segwit-ready Core software). Among people running services, 75%+ are already pro-segwit and 70% are rejecting BU. Futures on Bitfinex for BU dumped below $200... what else do you need?

If people really wanted what you want, they would use an altcoin that has those features.
You are answering a different question than the one I asked.  Anyone else?
Because leaving open at least 1 known attack vector is the right way to go (DoS attack via quadratic hashing anyone?). Let the passengers call the shots in a plane with at least 1 hole in it. What could possibly go wrong? Really smart, jonald.

Note: This is even ridiculous when one doesn't consider the immense incentive for centralization that this would create.

Big block proponents usually ignore the quadratic hashing problem, but to be fair, i've been told that the HK agreement included 1 year of segwit only, and then, after this year, an upgrade to a 2MB block size.

Then again, we are seeing some miners being extremely anti-segwit, to the point of stupidity, like ViaBTC, which basically admitted they didn't even read what segwit is about before refusing it.

So it seems they have no technical knowledge whatsoever; I doubt they know what the quadratic hashing problem is.
legendary
Activity: 2674
Merit: 2970
Terminated.
I always ask this question:  why not let the free market decide?  Why not give users uninhibited access* to both on-chain and off-chain solutions
and let the LN hubs and the miners compete for the fees?   I say we should.  I say if smaller transactions become problematic on-chain, they will become naturally slower
to confirm and more expensive on their own...and there is no need to force miners or users to behave a certain way for the good of the network.  

* meaning any blocksize limits must be well beyond market demand for space

The market has already decided: the majority of nodes support Core software (and segwit-ready Core software). Among people running services, 75%+ are already pro-segwit and 70% are rejecting BU. Futures on Bitfinex for BU dumped below $200... what else do you need?

If people really wanted what you want, they would use an altcoin that has those features.
You are answering a different question than the one I asked.  Anyone else?
Because leaving open at least 1 known attack vector is the right way to go (DoS attack via quadratic hashing anyone?). Let the passengers call the shots in a plane with at least 1 hole in it. What could possibly go wrong? Really smart, jonald.

Note: This is even ridiculous when one doesn't consider the immense incentive for centralization that this would create.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
LN is currently aiming at $60

so there is your micropayment upper limit  Wink

I believe that as long as on-chain transactions over 0.05-0.1 BTC are affordable (a fee of no more than 0.1%), there is no need for larger blocks

I always ask this question:  why not let the free market decide?  Why not give users uninhibited access* to both on-chain and off-chain solutions
and let the LN hubs and the miners compete for the fees?   I say we should.  I say if smaller transactions become problematic on-chain, they will become naturally slower
to confirm and more expensive on their own...and there is no need to force miners or users to behave a certain way for the good of the network.  

* meaning any blocksize limits must be well beyond market demand for space

The market has already decided: the majority of nodes support Core software (and segwit-ready Core software). Among people running services, 75%+ are already pro-segwit and 70% are rejecting BU. Futures on Bitfinex for BU dumped below $200... what else do you need?

If people really wanted what you want, they would use an altcoin that has those features.

You are answering a different question than the one I asked.  Anyone else?

legendary
Activity: 1204
Merit: 1028


The way you downplay the importance of non-mining nodes is ridiculous imo.
I've seen this talk from corporation sponsored agents before. Nothing surprising.


What's funny is, I've seen Roger Ver pushing this notion lately:

https://twitter.com/rogerkver/status/853250894162350080

Gavin "CIA" Andresen is of course also pushing the "yo, full nodes don't matter, just let corporations run them" agenda. You can easily tell who is on government paychecks by looking at who defends that.
legendary
Activity: 2674
Merit: 2970
Terminated.
yet again all you can do is your standard reply of 'insult and you're a shill'
That's because you are either delusional or a shill. There is no other way to explain your behavior.

P.S. don't promise/promote 2mb or 4mb of data bloat
Which is something that I have not done.

don't promise/promote 2x or 4x of tx capacity
Which is something that I have not done either.

real-world conditions won't result in it. try sticking to reality, which has many more factors beyond the empty half-promises you keep promoting
Again, delusional.

you really have wasted a year not learning these things. i even told you last year not to expect 2x-4x data bloat (today's statement) or 2x-4x tx capacity (your year's worth of promotional statements)
You are the one who doesn't understand anything here. I doubt that anyone will live to see the day that you stop using this forum.

The way you downplay the importance of non-mining nodes is ridiculous imo.
I've seen this talk from corporation sponsored agents before. Nothing surprising.
legendary
Activity: 1204
Merit: 1028
I have to say I never understood the priorities in the worries expressed in this debate, and I think it is because people start from some pre-conceived dogma to which they think they have to cling, and adapt all the rest around it.  I know that what I'm saying is considered controversial, and that I'm sometimes called a shill for it (I'm not), but I have posted it several times, I have argued it several times, and I have never heard any convincing argument to the contrary.  It is this, in my eyes quite obvious, statement:

"the amount of non-mining full nodes is absolutely no measure for the decentralization of a coin." (at least, with a PoW coin).



The way you downplay the importance of non-mining nodes is ridiculous imo. Non-mining nodes can keep the miners in check and stop them from acting in stupid ways. Non-mining nodes can threaten them with UASFs, and use any client they want to validate the transactions.

This is the case only if the blocksize is small enough for the average joe to be able to run a node on his computer, of course. Which is why evil actors with ties to the mining monopoly industry want to raise the blocksize as high as possible, so that average people cannot run nodes and be a present force within the game-theory dynamics.

legendary
Activity: 1610
Merit: 1183
LN is currently aiming at $60

so there is your micropayment upper limit  Wink

I believe that as long as on-chain transactions over 0.05-0.1 BTC are affordable (a fee of no more than 0.1%), there is no need for larger blocks

I always ask this question:  why not let the free market decide?  Why not give users uninhibited access* to both on-chain and off-chain solutions
and let the LN hubs and the miners compete for the fees?   I say we should.  I say if smaller transactions become problematic on-chain, they will become naturally slower
to confirm and more expensive on their own...and there is no need to force miners or users to behave a certain way for the good of the network.  

* meaning any blocksize limits must be well beyond market demand for space

The market has already decided: the majority of nodes support Core software (and segwit-ready Core software). Among people running services, 75%+ are already pro-segwit and 70% are rejecting BU. Futures on Bitfinex for BU dumped below $200... what else do you need?

If people really wanted what you want, they would use an altcoin that has those features.
legendary
Activity: 4424
Merit: 4794
Which still has nothing to do with either one of my statements. Are you really this delusional or are you just becoming very terrible at the shilling job?

lol, good luck with your life.
yet again all you can do is your standard reply of 'insult and you're a shill'

P.S. don't promise/promote 2mb or 4mb of data bloat
don't promise/promote 2x or 4x of tx capacity

real-world conditions won't result in it. try sticking to reality, which has many more factors beyond the empty half-promises you keep promoting

you really have wasted a year not learning these things. i even told you last year not to expect 2x-4x data bloat (today's statement) or 2x-4x tx capacity (your year's worth of promotional statements)
legendary
Activity: 2674
Merit: 2970
Terminated.
lol, nice way for you to avoid admitting that tx capacity does not increase to the amounts promoted over the last year of segwit promotion.
i see gmaxwell is teaching you well at twisting words.

so come on, just say it: segwit does not promise tx capacity growth to the amounts previously promoted for the last year
Which still has nothing to do with either one of my statements. Are you really this delusional or are you just becoming very terrible at the shilling job?
legendary
Activity: 4424
Merit: 4794
Look at any testnet explorer and you can find blocks that are 2-4 MB in size. Separating the witness is the method used to achieve this (the calculations are therefore done differently, so you are seemingly able to transact more without a hard-fork block size increase).
look at any testnet and you will see blocks that are filled with certain patterns of tx's to get certain stats.
-snip-
the reality is that the 2mb 'assumption' is based on a block that is 100% full of segwit-key users doing what's deemed average in-out spending habits.

but fighting against native-key users you won't get the 2mb 'assumption',
so even with segwit the 2009+ expectation of 7 tx/s is still not a guarantee/promise
What are you on about? Nobody claimed that you'd get double the TX throughput right away. I was pointing out to the user that we would essentially see 'bigger blocks' post-Segwit, and referred them to a place where they should look. Maybe you should start reading posts before you copy responses from your script.

lol, nice way for you to avoid admitting that tx capacity does not increase to the amounts promoted over the last year of segwit promotion.
i see gmaxwell is teaching you well at twisting words.

so come on, just say it: segwit does not promise tx capacity growth to the amounts previously promoted for the last year