
Topic: Why can Core not Scale at a 0.X% for blocksize in the S-Curve f(X) - page 4. (Read 1126 times)

legendary
Activity: 1456
Merit: 1175
Always remember the cause!
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is now clear that reaching said consensus is almost impossible, and we are most likely stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were; what we do know now is that no block size increase is going to happen anytime soon.
No matter what Satoshi did, said, or intended to do: an increase in block size is not recommended because it won't help decentralization. I'm not saying it would escalate centralization (as Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease, because of its desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.

Assuming the block weight limit is still 4,000,000 weight units, a block time decrease could also lead to centralization in a different way. Without proper research, block propagation could end up slower than the block time and increase the block orphan rate.
It is the old Core fallacy I denounced up-thread. Neither a block size increase nor a block time decrease would have a negative impact on centralization, as long as we are not talking about linear increases/decreases at large scales.
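To put rough numbers on the propagation concern quoted above: under the common simplification that block discovery is a Poisson process, the chance that a competing block appears while a freshly mined block is still propagating is about 1 - e^(-t/T). A minimal sketch (the 10-second propagation time is an assumed figure, not measured data):

```python
import math

def orphan_probability(propagation_s: float, block_time_s: float) -> float:
    """Chance a competing block appears while ours is still propagating,
    assuming block discovery is a Poisson process (a common simplification)."""
    return 1.0 - math.exp(-propagation_s / block_time_s)

# Illustrative only: assume ~10 s network-wide propagation.
for block_time in (600, 300, 60):
    p = orphan_probability(10, block_time)
    print(f"block time {block_time:4d} s -> stale-block rate ~{p:.2%}")
```

Under these assumed numbers the stale rate grows quickly as the block time shrinks, which is exactly the kind of tradeoff the "proper research" in the quote would have to quantify.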
The Core camp's argument was that scaling Bitcoin by increasing the block size would eventually reach insane orders of magnitude and gradually eliminate more and more full nodes.

They could simply have disagreed with such a linear, continuous increase; instead they preferred to reject any increase proposal. Why? Nobody knows!
They could have considered up to a 10x increase without affecting anything in the network in terms of centralization, but they made a taboo out of it: you are in favor of on-chain scaling? You are a Bitcoin Cash puppet!

Mining centralization is a real fact. I have extensively discussed and analysed it and shown that there is no 'small miner' in the network right now (other than a few hobbyists, I guess); it is all about very large mining farms and pools with millions of dollars worth of investment. I have mathematical proof of this; it is indisputable.

So, handling, say, 10 times more transactions in a given period, either by increasing the block size or by reducing the block time, is not a big deal for mining nodes; they are wealthy enough to run more powerful nodes. Actually, they already use fault-tolerant, server-grade systems for this.

And guess what? When it comes to my suggestion of a block time decrease, it has a direct, complementary, positive effect on decentralization: more distributed rewards, and more chance for smaller pools to compete and remain competitive.
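The variance claim can be sketched under a simple Poisson model: if the block time is divided by k and the per-block reward by k (keeping the emission rate fixed), a miner's expected income over a fixed period stays the same while its standard deviation shrinks by sqrt(k). The share, reward, and period below are made-up round numbers:

```python
import math

def income_stats(share, reward, days, speedup=1):
    """Mean and standard deviation of a miner's income over `days`,
    with block time divided by `speedup` and reward scaled to match."""
    expected_blocks = share * 144 * speedup * days  # 144 blocks/day baseline
    per_block = reward / speedup                    # keeps emission rate fixed
    mean = expected_blocks * per_block
    std = per_block * math.sqrt(expected_blocks)    # Poisson: variance = mean
    return mean, std

mean1, std1 = income_stats(0.001, 12.5, 30)             # 10-minute blocks
mean4, std4 = income_stats(0.001, 12.5, 30, speedup=4)  # 2.5-minute blocks
print(mean1, std1)
print(mean4, std4)  # same expected income, half the spread
```

So a 4x faster block time leaves expected earnings untouched but halves the spread, which is the "better variance" point in plain numbers.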

Quote
Besides, this could tamper with the total coin supply/production rate (even though this can be solved easily) and could mess with timelock scripts, which use block numbers for their duration/time.
Both the coin supply and timelocks could be readjusted, and they should be. The coin supply is trivial: just reduce the reward amount proportionally. For timelock scripts, it needs a little tweak to the verification of old scripts, plus retiring the current opcode, CLTV, and introducing a new opcode so that new scripts use the new scale for nlocktime.
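As a sketch of the supply readjustment (a float approximation; real Bitcoin uses integer satoshi arithmetic): halving the block time while halving the subsidy and doubling the halving interval, measured in blocks, leaves the roughly 21M cap and the emission-vs-time curve essentially unchanged:

```python
def total_supply(initial_reward, halving_interval_blocks):
    """Total coins issued under a geometric halving schedule (float sketch;
    real Bitcoin uses integer satoshi arithmetic)."""
    supply, reward = 0.0, float(initial_reward)
    while reward >= 1e-8:  # stop once the subsidy drops below one satoshi
        supply += reward * halving_interval_blocks
        reward /= 2
    return supply

# Current schedule: 50 BTC start, halving every 210,000 blocks.
base = total_supply(50, 210_000)
# Hypothetical 5-minute blocks: half the subsidy, double the interval.
rescaled = total_supply(25, 420_000)
print(base, rescaled)  # both come out just under 21,000,000
```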

Quote
But looking at the bright side, people will feel Bitcoin is faster and there will be less waiting time.
I'm just surprised!

Finally, in a Core-dominated atmosphere, somebody has realized that a non-Core proposal has a useful feature. See? We are progressing  Cool
legendary
Activity: 1372
Merit: 1252
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is now clear that reaching said consensus is almost impossible, and we are most likely stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were; what we do know now is that no block size increase is going to happen anytime soon.
No matter what Satoshi did, said, or intended to do: an increase in block size is not recommended because it won't help decentralization. I'm not saying it would escalate centralization (as Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease, because of its desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.

Block time modifications would screw up other things too. I don't think it would make a difference. Do you have any models with numbers showing it would improve things without tradeoffs?

And nonetheless, it would require a hard fork. At the end of the day, it's not even about the block size change but about the hard fork itself. There will always be some people disagreeing about it, which will lead to the creation of another altcoin.
legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is now clear that reaching said consensus is almost impossible, and we are most likely stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were; what we do know now is that no block size increase is going to happen anytime soon.
No matter what Satoshi did, said, or intended to do: an increase in block size is not recommended because it won't help decentralization. I'm not saying it would escalate centralization (as Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease, because of its desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.

Assuming the block weight limit is still 4,000,000 weight units, a block time decrease could also lead to centralization in a different way. Without proper research, block propagation could end up slower than the block time and increase the block orphan rate.
Besides, this could tamper with the total coin supply/production rate (even though this can be solved easily) and could mess with timelock scripts, which use block numbers for their duration/time.

But looking at the bright side, people will feel Bitcoin is faster and there will be less waiting time.
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is now clear that reaching said consensus is almost impossible, and we are most likely stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were; what we do know now is that no block size increase is going to happen anytime soon.
No matter what Satoshi did, said, or intended to do: an increase in block size is not recommended because it won't help decentralization. I'm not saying it would escalate centralization (as Core falsely claimed during the debate); it is just not helpful.

Instead, I support a block time decrease, because of its desirable impact on decentralization.

Anyway, it is very important to understand that scalability is not addressable by either approach, because of the nonlinearities involved; both should be considered complementary improvements.
legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
Also, about the BLS signatures, this is what the lead dev of the Schnorr signature BIP had to say about them:

I think they're awesome.

However, they're also slower to verify, and introduce additional security assumptions. That roughly means there are additional ways in which they could be broken, which don't apply to Schnorr or ECDSA. As a result, I expect it to be much easier to find widespread agreement to switch to Schnorr (which has no additional assumptions, and is slightly faster than ECDSA).

So I highly doubt that we will see these any time soon.

I also have yet to fully understand why these BLS signatures are so "awesome"; the only thing I've really heard about them is that they're generally slower to verify (as stated above). Does anyone know what their general advantages are in comparison to, for example, Schnorr?

I'm still trying to understand what the practical implications of the differences are, since it's all quite technical.  Apparently, BLS could combine every signature in a block into a single signature, which would save space and allow for more transactions.  A 2-of-3 multisig is more efficient in BLS, as key aggregation works without needing to generate a merkle tree of public keys.  Stuff like that.  But the impression I get is that if BLS sigs take longer to verify, it's likely too high a price to pay for such bonuses, which may prove less important than just having Schnorr's raw efficiency and speed.
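As a back-of-envelope illustration of the space argument, using commonly cited signature sizes (roughly 72-byte DER ECDSA, 64-byte Schnorr per BIP340, 48-byte BLS aggregate on BLS12-381); the input count is an assumption for illustration, and this ignores proposed cross-input aggregation for Schnorr:

```python
def block_sig_bytes(num_inputs: int, scheme: str) -> int:
    """Very rough signature bytes per block; sizes are illustrative,
    not consensus values."""
    per_input = {"ecdsa": 72, "schnorr": 64}  # one signature per input
    if scheme == "bls":
        return 48  # one aggregate signature for the whole block (BLS12-381 G1)
    return per_input[scheme] * num_inputs

n = 5_000  # assumed number of inputs in a full block
for scheme in ("ecdsa", "schnorr", "bls"):
    print(f"{scheme:>7}: {block_sig_bytes(n, scheme):>8,} bytes")
```

That is where the "combine every signature in a block into a single signature" appeal comes from: the aggregate size is constant regardless of the number of inputs, at the cost of slower verification and extra security assumptions.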

It's one of those either/or deals, so we can only pick one.
legendary
Activity: 1372
Merit: 1252
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Some say Satoshi expected consensus to eventually be reached in order to raise the block size, which seems pretty naive in retrospect, since it is now clear that reaching said consensus is almost impossible, and we are most likely stuck with 1MB.

Others say that this was completely expected by Satoshi, that he knew perfectly well the 1MB limit was set in stone, and that this is just part of Bitcoin's game theory, so it can be immutable while remaining open source and so on.

We will never know what his true intentions were; what we do know now is that no block size increase is going to happen anytime soon.
legendary
Activity: 1946
Merit: 1427
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...

Foresee what, exactly? I'm quite missing your point. It looks to me like you put as many buzzwords into a paragraph as possible and didn't really read the thread...

Also, about the BLS signatures, this is what the lead dev of the Schnorr signature BIP had to say about them:

I think they're awesome.

However, they're also slower to verify, and introduce additional security assumptions. That roughly means there are additional ways in which they could be broken, which don't apply to Schnorr or ECDSA. As a result, I expect it to be much easier to find widespread agreement to switch to Schnorr (which has no additional assumptions, and is slightly faster than ECDSA).

So I highly doubt that we will see these any time soon.

I also have yet to fully understand why these BLS signatures are so "awesome"; the only thing I've really heard about them is that they're generally slower to verify (as stated above). Does anyone know what their general advantages are in comparison to, for example, Schnorr?



jr. member
Activity: 82
Merit: 2
Yeah, I think Satoshi was smart enough to foresee this, and he could have implemented Core with a 2MB or 4MB block size himself, but I think we knew back then that this would cause more harm than good, so he stayed at 1MB. I think this is great, as we don't need a cryptocurrency that is handled like fiat money, where some big players control the wealth of many, many people. Bitcoin was created to give people their wealth back, because they work for it every day, 9-5...
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
...
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it is the best we've got for now.

For now? What does that mean? I believe it means that you have no choice but to be part of the social consensus that "Bitcoin is BTC". Hahaha.

You cannot escape. Cool

It simply means that I understand that improving Bitcoin is not a trivial job and shit-forking it is not a solution; but on the other hand, unfortunately, things have happened, Core is distracted from the cause, and there is little, if not zero, hope for them to heal.

Quote
Quote
I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing other than ultra-conservatism to offer. Average programmers learn and grow; arrogant dogmatists just sink.

Like the developers of Ethereum ICOs and "blockchain companies"? Hahaha.
Yes. And a lot more. Code is not the bottleneck; vision and dedication are.

Quote
Quote
Bitcoin is not a joke.
And yet you support an "urgent" hard fork to bigger blocks because "Bitcoin needs to scale now". The irony.
Who said anything about bigger blocks as an ultimate or urgent scaling solution? I support reducing the block time, and not even as a scaling solution, but rather as a complementary decentralization one.

Like Core, God forgive me, I believe neither an increased block size nor a reduced block time deserves to be classified as a scaling solution, because they can't scale up safely. But unlike Core, thank God, I do believe that such improvements can be employed to some extent; there is no need to be superstitious about a magical 1MB block size or a 10-minute block time. We still have Moore's law working for us.

Quote
Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise, there would be no idea at all.


Do you truly believe that your ideas are very good?
100% sure! My PoCw proposal, in its current alpha phase, is unique in the whole crypto literature as the only serious work on fixing pooling-pressure-related flaws such as mining variance and the proximity premium.

I did it when I started betraying Core; any serious work on Bitcoin begins with getting rid of Core's (lack of an) agenda, imo.



 
legendary
Activity: 2898
Merit: 1823
Oh, you guys are making me sick these days. What's happening here in bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it is the best we've got for now.

For now? What does that mean? I believe it means that you have no choice but to be part of the social consensus that "Bitcoin is BTC". Hahaha.

You cannot escape. Cool

Quote
I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing other than ultra-conservatism to offer. Average programmers learn and grow; arrogant dogmatists just sink.

Like the developers of Ethereum ICOs and "blockchain companies"? Hahaha.

Quote
Bitcoin is not a joke.

And yet you support an "urgent" hard fork to bigger blocks because "Bitcoin needs to scale now". The irony.

Quote
What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise, there would be no idea at all.


Do you truly believe that your ideas are very good?
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
Oh, you guys are making me sick these days. What's happening here in bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.
Bitcoin Core is what I've chosen too, for now, not forever! Naive forks are not my type. But I don't belong to any secret brotherhood. Core is bad, but it is the best we've got for now.

Bitcoin started experimentally and transitioned to an operational state gradually, as people observed that their experimental coins came with costs and gained a price.

We shouldn't be fooled yet; the centralization flaws and scaling problems are there, waiting for us to fix them and improve.

A critical problem is the Core dilemma: they are good for the ecosystem because of their expertise, and they are bad because of their lack of vision and their vulnerability to sectarianism.

If I ever have to choose, in favor of someone because of his expertise or against him because of his dogmatism, my choice would eventually be against him.

I think an average programmer with vision who is open to change is far better than a super-genius dogmatist who has nothing other than ultra-conservatism to offer. Average programmers learn and grow; arrogant dogmatists just sink.

Bitcoin is not a joke. It can't stop evolving and stick with the past; the community will pay the price whenever a vigorous, solid proposal shows up (not just blindly increasing the block size, switching to Equihash, ...).

What we have to do, our mission, is to discuss such improvements freely and without any hesitation, instead of worrying about Core's (lack of an) agenda. They should listen to us (unlikely) or they will be eliminated (more likely).

When I come up with an idea, the last thing I care about is how the Core guys may feel or react to it; otherwise, there would be no idea at all.


legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
Oh, you guys are making me sick these days. What's happening here in bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

It's not about loyalty to any developer, it's about recognising the fact that the users on this network have made their decision.  You and franky1 need to start some sort of special club for people who are wholly incapable of avoiding conflation between what developers do and what users and miners do.  Devs aren't in charge, they just make whatever they want to make.  It's the people who freely choose to run the code who you appear to have an issue with.  People had the choice.  This is what they chose.

If you want total emphasis towards on-chain scaling, there are blockchains out there where you will find like-minded users who share the same ideals.  But you need to understand this is not one of those blockchains.  Users and miners already made that call.  Whine about it all you like, this is how it is until consensus says otherwise.  Take it or leave it, those are the options.  We can always come back to the idea of blockweight adjustments in future if there's a necessity for it, but it's not happening now.


FYI, I'm not in favor of a block size increase personally; I'm thinking of reducing the block time, which has more pros, among them better variance and helping decentralization, which would be hard for someone like you to approach.

I eagerly await your "Bitcoin" with a different block time, a different algorithm, no pools and everyone magically agreeing with all your changes.   Roll Eyes

'Til then, cool story, bro.
legendary
Activity: 2898
Merit: 1823
OP, it's because there is a "blockchain trilemma", a term from Vitalik Buterin, that needs to be solved or in some way "balanced". I believe his thoughts on that matter are sound.

In the blockchain trilemma, it is believed that only two of the three "traits", decentralization, scalability, and security, can be achieved on a fundamental level. Adjusting one means affecting or giving up some of the traits of either of the other two.

I believe the Core developers stayed at a 1MB block size to maintain "security" and node "decentralization", while the Lightning developers are solving "scalability" through an off-chain layer, thereby addressing the "blockchain trilemma".

legendary
Activity: 1456
Merit: 1175
Always remember the cause!
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques.

1) You didn't refute anything.
I refuted your nonsense about:
Why can't Core scale as some f(x) = S-curve, so that you would get a % increase that increases supply and demand?

Why are they committed to only 1MB (or ~4MB with SegWit)?

I am assuming you are talking about the block size?

Well, the most critical point is probably that increasing the block size is just temporary scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until transaction volume doubles).
But the price of this 'scaling' is that you sacrifice the possibility of decentralisation. If the block size increased heavily, smaller nodes wouldn't be able to verify a block fast enough before the next one is mined.
by asserting that
We have two distinct classes of full nodes in the Bitcoin ecosystem: mining nodes and non-mining ones. Right?

For non-mining full nodes, there is no rush to validate blocks. They have LOTS of time to spend and no opportunity cost is at stake; double it, triple it, make it 10 times, 100 times bigger and nothing bad happens in terms of the opportunity costs involved.
...
See? Although our miner has almost 80 S9s installed already, plus an infrastructure for power supply, air conditioning, monitoring, ... a minimum investment of $200K, he has to forget about solo mining and stick with AntPool, btc.com, ... Right? He doesn't run a full node; he typically has no idea what a full node is!

Actually, our miner would need to install at least 40 times more hash power (40 PH/s), for at least a 0.001 share and one block every 3-4 days (yet with a rough 0.138 variance, and praying to God while sacrificing a goat or something daily, perhaps), before choosing solo mining, and consequently a full node.

How much is the investment now? 7-8 million dollars?

And Core is protecting him from what? Spending a few hundred more on a full node capable of validating blocks in a reasonable time?
Summary: non-mining nodes won't be hurt economically by a block size increase; mining nodes are millionaires and wouldn't mind spending a few hundred dollars to upgrade.

In your strange language, what is this called other than refuting? heh Huh

A small mining farm (not a home miner) with 1 petahash of installed power (a 0.000025 share of the network) has a chance to mine a block every 6 months, with a standard deviation of around 0.95 (look here for the calculation technique); hence it is definitively a requirement for such a miner to join a pool to keep his site running as a normal business, paying bills, ... unless he is essentially a gambler instead of a businessman.
Surprise! It was me, and I used a more sophisticated technique than 'percentage calculations'.

[..] what about you coming up with an actual PRO argument (for a blocksize blockweight increase)?

What about more transactions and lower fees? Does that need arguing? The onus is on you to prove such an increase has bad consequences, not on me to prove it is good; of course it is good if there is no serious cost!

FYI, I'm not in favor of a block size increase personally; I'm thinking of reducing the block time, which has more pros, among them better variance and helping decentralization, which would be hard for someone like you to approach.

Stay in your box and enjoy your popcorn watching the Core show.
legendary
Activity: 1624
Merit: 2481
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques.

1) You didn't refute anything.
2) It is not a 1MB limit, it is a 4,000,000 weight-unit limit.
3) The only 'mathematical techniques' you have used were simple percentage calculations.



Are you guys blind or something? I have proved [..]

You didn't prove anything. All you did was express your opinion.



I have to prove that solo miners with tens of millions of dollars of (minimum) investment can afford to set up a state-of-the-art full node?

Or do I have to prove that non-mining nodes can afford 100 dollars to buy multiple terabytes of storage, and that they can wait a few more seconds, or even minutes, to verify a block without losing anything?

What about the most obvious one:

[..] what about you coming up with an actual PRO argument (for a blocksize blockweight increase)?

Still not a single argument in favor of on-chain scaling.
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
The above post by @bob123 is almost Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but I will just answer his claims about the block size debate here.

A big wall of text from you, but almost zero serious content. I am not going to comment on your view of the situation.
Seriously? I specifically refuted your (Core's) reasoning about the infamous insistence on the crazy 1MB limit by showing its irrelevance to both non-mining and mining full nodes, in detail, using proven mathematical techniques. That is zero, and your garbage about fake and hypocritical concerns regarding centralization is 'serious content'?

Oh, you guys are making me sick these days. What's happening here in bitcointalk? Legendary figures are marshaling and showing their loyalty to Core?

Are you guys blind or something? I have proved your pretext void, now what?

I have to prove that solo miners with tens of millions of dollars of (minimum) investment can afford to set up a state-of-the-art full node?

Or do I have to prove that non-mining nodes can afford 100 dollars to buy multiple terabytes of storage, and that they can wait a few more seconds, or even minutes, to verify a block without losing anything?





legendary
Activity: 1624
Merit: 2481
The above post by @bob123 is almost Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but I will just answer his claims about the block size debate here.

A big wall of text from you, but almost zero serious content. I am not going to comment on your view of the situation.
But instead of just telling wild stories, what about you coming up with an actual PRO argument (for a blocksize blockweight increase)?
Or an actual argument AGAINST second-layer scaling solutions?

All you said was 'no, that's not a con argument, because it doesn't matter for most people'.

If you want a serious discussion, bring up some serious arguments in favor of on-chain scaling (which by the way do not exist).



Who was in charge of keeping bitcoin decentralized and what's the outcome?

Actually, every single user.



Additionally, there are also:
1. MAST, which compresses scripts/simple smart contracts while improving user privacy in some cases
2. BLS signatures, an alternative to Schnorr signatures. No idea whether Bitcoin will use BLS though

I have already read about BLS, but I am not sure whether it will really be adopted instead of Schnorr, especially regarding the eventual (not yet well-explored) security concerns.

I hadn't heard of MAST before. That's definitely an interesting feature.
Unbelievable how much potential this whole system has (and how much is still to be discovered).
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
Above post by @bob123 is almost the Core's scalability manifesto. I don't agree with even one paragraph of it, not a single idea, but I'll just answer his claims about the block size debate here.

Being concerned about centralization consequences of block size increase is rather a pretext for Core.
They have not shown up with any solution for the situation with pools and ASICs, and now they want us to be convinced of their commitment to the cause; I do not feel good about it.  Undecided  So, let's take a closer look at their excuse for prohibiting a block size increase: keeping the block verification process light.

We have two distinct classes of full nodes in the bitcoin ecosystem: mining nodes and non-mining ones. Right?

For non-mining full nodes, there is no rush to validate blocks. They have LOTS of time to spend and no opportunity cost at stake; double it, triple it, make it 10 or 100 times bigger and nothing bad happens in terms of opportunity costs.
The problem reduces to just 'scale' times more hard disk space. Not a big deal. I mean, are you kidding? Storage costs? 10 years after Satoshi and we are concerned about 1-2 fucking terabytes of storage?
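For rough numbers: with one block roughly every ten minutes, even much larger blocks translate into modest yearly storage growth. A quick back-of-envelope sketch (the block sizes here are illustrative assumptions, not claims about actual usage):

```python
BLOCKS_PER_YEAR = 144 * 365  # ~one block every 10 minutes

def chain_growth_gb_per_year(avg_block_mb):
    """Yearly blockchain growth in GB for a given average block size."""
    return BLOCKS_PER_YEAR * avg_block_mb / 1024

for mb in (1, 2, 8):
    print(f"{mb} MB blocks -> ~{chain_growth_gb_per_year(mb):.0f} GB/year")
```

Even at a hypothetical 8 MB average, growth stays in the hundreds of gigabytes per year, i.e. a fraction of one cheap hard disk.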

For mining full nodes it is different: there are opportunity costs involved, and the infamous proximity-premium flaw.

But it doesn't help justify this conservatism about block size, as we will see:

A small mining farm (not a home miner) with 1 petahash (a 0.000025 share of the network) of installed power has a chance to mine a block every 6 months, with a standard deviation of around 0.95 (look here for the calculation technique); hence it is definitively a requirement for such a miner to join a pool to keep his site running as a normal business, paying bills, ... unless he is essentially a gambler instead of a businessman.

See? Although our miner has almost 80 S9s installed already, plus infrastructure for power supply, air conditioning, monitoring, ... a minimum investment of $200K, he has to forget about solo mining and stick with AntPool, btc.com, ... Right? He doesn't run a full node; he typically has no idea what a full node is!

Actually, our miner would have to install at least 40 times more hash power (40 PH/s) to reach at least a 0.001 share and mine one block every 3-4 days (yet with a rough 0.138 variance, and praying to god while sacrificing a goat or something daily, perhaps) before choosing solo mining, and consequently a full node.
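The variance argument can be sketched numerically. Block finding is well modelled as a Poisson process, so for a miner holding a fraction of the network hashrate, the expected block count over a period and its standard deviation follow directly. (The shares and time spans below are the illustrative figures from this post, not measurements, and the exact numbers depend on total network hashrate.)

```python
import math

BLOCKS_PER_DAY = 144  # ~one block every 10 minutes

def solo_mining_stats(share, days):
    """Expected blocks found and standard deviation for a miner with a
    given fraction of total network hashrate (Poisson: variance == mean)."""
    mean = BLOCKS_PER_DAY * days * share
    return mean, math.sqrt(mean)

# ~1 PH/s miner at a 0.0025% network share, over ~6 months
mean, std = solo_mining_stats(0.000025, 182)
print(f"expected: {mean:.2f} blocks, std dev: {std:.2f}")

# ~40 PH/s (0.1% share) over one week
mean, std = solo_mining_stats(0.001, 7)
print(f"expected: {mean:.2f} blocks, std dev: {std:.2f}")
```

Note that in both cases the standard deviation is comparable to the mean itself, which is exactly why revenue is a lottery for small solo miners and pools exist.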

How big is the investment now? 7-8 million dollars?

And Core is protecting him from what? Spending a few hundred dollars more on a full node capable of validating blocks in a reasonable time?

Isn't it kinda hypocritical? Remaining silent about pools, ASICs, ... while pretending to be the #1 savior of decentralization?

Which decentralization? Where is decentralization? Show me one decentralized 'thing' in this ecosystem!

Who was in charge of keeping bitcoin decentralized and what's the outcome?

legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
As usual, bob123 has already explained most things.

why are they committed to only 1mb (or ~ 4 mb with segwit)

FYI, Bitcoin now uses a block weight limit of 4,000,000 weight units. So the maximum block size we can see ranges from 1 MB to 4 MB. But 4 MB only happens in very specific cases; usually the biggest block size is about 2 MB.
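The weight rule behind that 4,000,000 limit (from BIP 141) can be sketched as follows; the block sizes used are illustrative, not real blocks:

```python
MAX_BLOCK_WEIGHT = 4_000_000

def block_weight(base_size, total_size):
    """BIP 141: weight = 3 * base size (bytes without witness data)
    + total size (bytes including witness data)."""
    return 3 * base_size + total_size

# A block with no witness data: base == total, so the cap is hit at 1 MB.
print(block_weight(1_000_000, 1_000_000))  # 4_000_000, exactly at the limit

# A witness-heavy block can be much larger on disk and still fit:
# 0.5 MB of non-witness data plus 2 MB of witness data is also 4_000_000.
print(block_weight(500_000, 2_500_000))
```

This is why the practical maximum depends on the mix of transactions: only blocks dominated by witness data approach 4 MB, which matches the ~2 MB typically seen.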

Why can't core scale as some f(x) = S curve so that you would get a % increase that increased supply and demand?

why are they committed to only 1mb (or ~ 4 mb with segwit)

I am assuming you are talking about the block size?

Well, the most critical point probably is that increasing the block size is just temporary scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until the transaction volume has doubled).
But the price for this 'scaling' is that you are sacrificing the possibility of decentralisation. If the block size increased heavily, smaller nodes wouldn't be able to verify a block fast enough before another one has been mined.
This would lead to a (heavy form of) centralisation where only big data centres could afford to run full nodes. This would then require the average user to trust some public nodes, without the possibility of fast and efficient validation.
Additionally, it is hard to perform proper testing on bigger blocks (e.g. 8 MB or 16 MB).

I agree, but there's the possibility of increasing the maximum block weight while making sure the majority of nodes/people can still run full nodes without mid/high-end devices. But most likely that won't happen, because of the difficulty of deciding where the line between low-end and mid/high-end devices lies, and because of backward compatibility.

The next steps toward scaling in terms of transactions/block are
(1) the Lightning Network (which will allow an 'infinite' amount of transactions without paying the on-chain fee, as long as your channel is open and funded), and
(2) Schnorr signatures, which heavily reduce the size of transactions using multiple inputs (which most txs are). They allow you to combine multiple signatures into one (thereby saving space in the blockchain).

Additionally, there are also:
1. MAST, which compresses scripts/simple smart contracts while improving user privacy in some cases
2. BLS signatures, an alternative to Schnorr signatures. No idea whether Bitcoin will use BLS though

More info :
https://bitcointechtalk.com/what-is-a-bitcoin-merklized-abstract-syntax-tree-mast-33fdf2da5e2f
https://medium.com/@snigirev.stepan/bls-signatures-better-than-schnorr-5a7fe30ea716
legendary
Activity: 1624
Merit: 2481
Why can't core scale as some f(x) = S curve so that you would get a % increase that increased supply and demand?

why are they committed to only 1mb (or ~ 4 mb with segwit)

I am assuming you are talking about the block size?

Well, the most critical point probably is that increasing the block size is just temporary scaling.
Doubling the block size brings a small benefit (lower fees) for a short period of time (until the transaction volume has doubled).
But the price for this 'scaling' is that you are sacrificing the possibility of decentralisation. If the block size increased heavily, smaller nodes wouldn't be able to verify a block fast enough before another one has been mined.
This would lead to a (heavy form of) centralisation where only big data centres could afford to run full nodes. This would then require the average user to trust some public nodes, without the possibility of fast and efficient validation.
Additionally, it is hard to perform proper testing on bigger blocks (e.g. 8 MB or 16 MB).
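The "verify before the next block arrives" constraint can be made concrete with a toy budget check. The per-megabyte verification time is a per-machine figure you would have to benchmark; the value used below is purely an assumption for illustration:

```python
BLOCK_INTERVAL_S = 600  # average time between blocks, in seconds

def node_keeps_up(block_size_mb, verify_s_per_mb):
    """True if this machine can, on average, finish verifying a block
    before the next one arrives. verify_s_per_mb is assumed, not measured."""
    return block_size_mb * verify_s_per_mb < BLOCK_INTERVAL_S

# e.g. a weak machine needing a (hypothetical) 60 s per MB of block data:
print(node_keeps_up(1, 60))   # True:  60 s  < 600 s
print(node_keeps_up(16, 60))  # False: 960 s > 600 s
```

On this (simplified) model, raising the block size squeezes out exactly the machines with the highest per-megabyte verification cost, which is the centralisation pressure being described.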

I recall there was a shitcoin fork which increased the block size without even proper testing of how the hardware/software/network would react, and with blocks only filled to 0-10%.
No one knows whether that network will still be fully functional when blocks start getting full, or which new attack vectors are created by this.


Segwit has effectively already doubled (it's at 2.3x at the moment, I think) the number of transactions which fit into a block.

The next steps toward scaling in terms of transactions/block are
(1) the Lightning Network (which will allow an 'infinite' amount of transactions without paying the on-chain fee, as long as your channel is open and funded), and
(2) Schnorr signatures, which heavily reduce the size of transactions using multiple inputs (which most txs are). They allow you to combine multiple signatures into one (thereby saving space in the blockchain).
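The byte savings from combining signatures can be estimated with rough per-signature sizes. Note the cross-input-aggregation assumption here follows the post's description ("combine multiple signatures into one"); it is a sketch, not a statement of what Bitcoin actually deployed:

```python
ECDSA_SIG_BYTES = 72    # typical DER-encoded ECDSA signature
SCHNORR_SIG_BYTES = 64  # BIP 340 Schnorr signature

def per_input_signatures(n_inputs):
    """Today: one ECDSA signature per input."""
    return n_inputs * ECDSA_SIG_BYTES

def aggregated_signature(n_inputs):
    """Post's assumption: all input signatures combine into one Schnorr sig."""
    return SCHNORR_SIG_BYTES

for n in (2, 5, 10):
    saved = per_input_signatures(n) - aggregated_signature(n)
    print(f"{n} inputs: save ~{saved} signature bytes")
```

The savings grow linearly with the number of inputs, which is why multi-input transactions (the common case) benefit the most.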
