
Topic: Superspace: Scaling Bitcoin Beyond SegWit (Read 507 times)

legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
September 01, 2018, 02:47:25 PM
#24
I thoroughly read the paper, and I think your idea is similar to a sidechain that has the same rules as the Bitcoin network.

I like the idea of increasing the maximum block size without a hard fork/without actually increasing the block size (from the legacy nodes' perspective), but besides that I don't see much advantage in your proposal, since:
1. It is a complicated method, even though mostly only developers will feel it.
2. It introduces another address standard, which complicates UI/UX and will probably confuse some people. Edit: instead of introducing new address prefixes/formats, why don't you use the next witness version (bc1p)? (See the sketch after this list.)
3. Unlike SegWit, which also solved transaction malleability, your idea doesn't address it.
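
For what it's worth, here is a minimal sketch (adapted from the BIP173 reference algorithm; the function names and the example at the end are mine) showing that bech32 addresses already carry an explicit witness version, so a new output type needs no new address prefix, only the next version number:

Code:
# Minimal bech32 segwit-address encoding, adapted from the BIP173 reference
# algorithm. Note that the witness version is an explicit field in the data part.
CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"

def bech32_polymod(values):
    # BCH checksum over 5-bit groups (BIP173 generator constants)
    GEN = [0x3b6a57b2, 0x26508e6d, 0x1ea119fa, 0x3d4233dd, 0x2a1462b3]
    chk = 1
    for v in values:
        top = chk >> 25
        chk = (chk & 0x1ffffff) << 5 ^ v
        for i in range(5):
            chk ^= GEN[i] if ((top >> i) & 1) else 0
    return chk

def bech32_hrp_expand(hrp):
    return [ord(c) >> 5 for c in hrp] + [0] + [ord(c) & 31 for c in hrp]

def bech32_encode(hrp, data):
    polymod = bech32_polymod(bech32_hrp_expand(hrp) + data + [0] * 6) ^ 1
    checksum = [(polymod >> 5 * (5 - i)) & 31 for i in range(6)]
    return hrp + "1" + "".join(CHARSET[d] for d in data + checksum)

def convertbits(data, frombits, tobits):
    # regroup bytes into 5-bit integers, padding the tail
    acc, bits, ret = 0, 0, []
    for b in data:
        acc = (acc << frombits) | b
        bits += frombits
        while bits >= tobits:
            bits -= tobits
            ret.append((acc >> bits) & ((1 << tobits) - 1))
    if bits:
        ret.append((acc << (tobits - bits)) & ((1 << tobits) - 1))
    return ret

def segwit_address(witness_version, witness_program):
    # the version rides along in front of the 5-bit-regrouped program
    return bech32_encode("bc", [witness_version] + convertbits(witness_program, 8, 5))

# segwit_address(0, keyhash)  -> "bc1q...", today's P2WPKH/P2WSH
# segwit_address(1, program)  -> "bc1p...", the next witness version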

I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike, mixed with the big Twitter megaphone guys trying to step in with so-called "X place agreements" again.

I don't think we'll ever see another softfork similar to segwit. We somehow got segwit in and it seems to be working, but some still question whether it is safe and always will (and they have good arguments for thinking so).

The amount of controversy needed to get segwit in was insane. You would need a package of updates so good that it could be done again, and perhaps not without another round of transaction backlog, whether organic or spammed.

I think that no matter how good the ideas are, unless bitcoin is pushed to its limits and the idea is presented as an acceptable solution by many relevant parties, we will not see further updates: definitely not by way of hardfork, and very doubtfully via controversial softforks.

It naturally all depends on how contentious any new proposed fork is seen as.  The next softfork is likely to be Schnorr, unless I'm mistaken.  The only way I could see that one becoming controversial is if someone starts a bandwagon for BLS instead and a rift forms in the community again.  But then, knowing this community, that's a distinct possibility, heh.

Provided it doesn't become some polarised imbroglio like last time, it should hopefully be fairly straightforward.

A bit off-topic, but those who might start that bandwagon currently have their own ongoing consensus fight.

Besides, AFAIK Schnorr verification is slightly faster than current ECDSA, while BLS is slower than both, so people shouldn't have much of a decentralization concern.
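
For the curious, here is a toy sketch of why Schnorr verification is cheap: it is a single linear check, s*G == R + e*P, with no modular inversion as ECDSA requires at verification time, and it batches well. Illustrative only; the challenge-hash layout is my own assumption, and this is nothing like a BIP-exact or constant-time implementation:

Code:
import hashlib

# secp256k1 parameters
P = 2**256 - 2**32 - 977  # field prime
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # order
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(a, b):
    if a is None: return b
    if b is None: return a
    if a[0] == b[0] and (a[1] + b[1]) % P == 0:
        return None  # point at infinity
    if a == b:
        lam = 3 * a[0] * a[0] * pow(2 * a[1], P - 2, P) % P
    else:
        lam = (b[1] - a[1]) * pow(b[0] - a[0], P - 2, P) % P
    x = (lam * lam - a[0] - b[0]) % P
    return (x, (lam * (a[0] - x) - a[1]) % P)

def point_mul(k, pt):
    r = None  # double-and-add
    while k:
        if k & 1:
            r = point_add(r, pt)
        pt = point_add(pt, pt)
        k >>= 1
    return r

def schnorr_verify(msg, pub, R, s):
    # accept iff s*G == R + e*P, with e = H(R.x || pub.x || msg) mod N
    e = int.from_bytes(hashlib.sha256(R[0].to_bytes(32, "big") +
                                      pub[0].to_bytes(32, "big") +
                                      msg).digest(), "big") % N
    return point_mul(s, G) == point_add(R, point_mul(e, pub))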
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
September 01, 2018, 09:03:41 AM
#23
I think guys like @Doomad and @cellard have not completely understood this proposal. It is simply a block size increase without a hard fork.

These guys are interested only in its soft-fork style of implementation and apparently don't care about the contents. It is weird.

It's probably just your reading comprehension that's off.  This proposal clearly requires a softfork and cellard's comment was simply stating that those might be more problematic after all the drama that came with the last one.  This may well be a moderate proposal, but many in the community are highly conservative, so even a moderate impact on centralisation is deemed unacceptable by some.
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
September 01, 2018, 08:21:23 AM
#22
I think guys like @Doomad and @cellard have not completely understood this proposal. It is simply a block size increase without a hard fork.

These guys are interested only in its soft-fork style of implementation and apparently don't care about the contents. It is weird.

A block size increase (no matter whether the fork is soft or hard) has definite centralization consequences because of its direct negative impact on the progress and proximity premiums.

Personally, I don't think a moderate increase in block size would have a disastrous centralization impact, but the way the OP is suggesting his soft fork goes way beyond moderate.

legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
August 31, 2018, 06:04:30 PM
#21
I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike, mixed with the big Twitter megaphone guys trying to step in with so-called "X place agreements" again.

I don't think we'll ever see another softfork similar to segwit. We somehow got segwit in and it seems to be working, but some still question whether it is safe and always will (and they have good arguments for thinking so).

The amount of controversy needed to get segwit in was insane. You would need a package of updates so good that it could be done again, and perhaps not without another round of transaction backlog, whether organic or spammed.

I think that no matter how good the ideas are, unless bitcoin is pushed to its limits and the idea is presented as an acceptable solution by many relevant parties, we will not see further updates: definitely not by way of hardfork, and very doubtfully via controversial softforks.

It naturally all depends on how contentious any new proposed fork is seen as.  The next softfork is likely to be Schnorr, unless I'm mistaken [//EDIT:  and perhaps also including the SIGHASH_NOINPUT flag for Eltoo to save having two separate softforks].  The only way I could see that one becoming controversial is if someone starts a bandwagon for BLS instead and a rift forms in the community again.  But then, knowing this community, that's a distinct possibility, heh.

Provided it doesn't become some polarised imbroglio like last time, it should hopefully be fairly straightforward.
legendary
Activity: 3010
Merit: 3724
Join the world-leading crypto sportsbook NOW!
August 31, 2018, 08:44:37 AM
#20
I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike, mixed with the big Twitter megaphone guys trying to step in with so-called "X place agreements" again.

I don't think we'll ever see another softfork similar to segwit. We somehow got segwit in and it seems to be working, but some still question whether it is safe and always will (and they have good arguments for thinking so).

The amount of controversy needed to get segwit in was insane. You would need a package of updates so good that it could be done again, and perhaps not without another round of transaction backlog, whether organic or spammed.

I think that no matter how good the ideas are, unless bitcoin is pushed to its limits and the idea is presented as an acceptable solution by many relevant parties, we will not see further updates: definitely not by way of hardfork, and very doubtfully via controversial softforks.

Everything in hindsight seems big and mega, and I agree we may never get the perfect storm of 2017 again when it comes to something so politically charged. They've all seen how damaging it can be, both financially and to reputations, but you just never know. If the Bitcoin price again threatens to "explode", there may just be enough in it to motivate yet more agendas aligned to direction X, Y, or Z.

We just need to wait for another culmination of all these aspects: dev sentiment, market action, and the general "tired of twiddling our thumbs" crowd.
legendary
Activity: 1372
Merit: 1252
August 30, 2018, 12:26:11 PM
#19
I'm looking forward to another 12 to 24 months of drama coming from the so-called community, so-called devs, and miners alike, mixed with the big Twitter megaphone guys trying to step in with so-called "X place agreements" again.

I don't think we'll ever see another softfork similar to segwit. We somehow got segwit in and it seems to be working, but some still question whether it is safe and always will (and they have good arguments for thinking so).

The amount of controversy needed to get segwit in was insane. You would need a package of updates so good that it could be done again, and perhaps not without another round of transaction backlog, whether organic or spammed.

I think that no matter how good the ideas are, unless bitcoin is pushed to its limits and the idea is presented as an acceptable solution by many relevant parties, we will not see further updates: definitely not by way of hardfork, and very doubtfully via controversial softforks.
legendary
Activity: 3010
Merit: 3724
Join the world-leading crypto sportsbook NOW!
August 30, 2018, 11:38:21 AM
#18
Yet a percentage of legacy users and services are still stubbornly holding on.
I think they will quickly reconsider when the next bull run happens, the network gets congested again, and the fees soar. Many people will adopt new things only when they absolutely have to, at the last moment, when they feel the pain and see the solution that eases it.

The demonstration that necessity is the mother of invention certainly holds for Bitcoin development (which is one reason I always say "scaling" is a desirable problem). One might say that had we never reached those critical stages of congestion, we wouldn't be at today's rate of Segwit adoption, or even today's progress with LN.

But I don't think we'll ever get that perfect storm again, where congestion coincided with heavy interest and high prices, and pressure from big blockers. If that wasn't enough to convince people to move over, I'm not sure what will. The pain wasn't, after all, unbearable.
legendary
Activity: 3122
Merit: 2178
Playgram - The Telegram Casino
August 28, 2018, 01:08:00 PM
#17
Interesting whitepaper!

I wonder though, are you (a) pessimistic on how long LN will take until it reaches more widespread adoption or are you (b) optimistic regarding how quickly one could implement and deploy Superspace?

Seeing how LN already hit mainnet earlier this year, and how long it took for SegWit to reach maturity starting from its conceptual phase, I believe one or the other must be the case; at least considering that we are looking at "a short-term proposal, intended to provide a temporary ease from scalability issues". Or would you see Superblocks as part of a long-term scaling approach, with temporary ease being a mere side-effect?
copper member
Activity: 85
Merit: 122
August 28, 2018, 11:23:18 AM
#16
Honestly, I hate SegWit exactly because of its tricky approach; it looks to me like cobbling things together in a hacker way. I love hacks, but not when it comes to the core algorithm; as a rule of thumb, we should keep core components elegant and smart.
You do know that many innovations in Bitcoin were rolled out this way, right? P2SH is treated by non-P2SH-aware nodes just as a hash-preimage lock. OP_CHECKLOCKTIMEVERIFY and OP_CHECKSEQUENCEVERIFY are, to non-aware nodes, just no-ops (a minimal sketch of that pattern is below).
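
A minimal sketch of that pattern (a hypothetical interpreter, not Bitcoin Core's actual code): the opcode is a no-op to legacy nodes and an enforced rule to upgraded ones, so the upgraded rule set is strictly tighter and old nodes still accept every block new nodes accept:

Code:
OP_NOP2 = 0xB1  # renamed OP_CHECKLOCKTIMEVERIFY by BIP65

def eval_nop2(stack, tx_locktime, upgraded):
    if not upgraded:
        return True          # pre-BIP65 node: literally a no-op
    if not stack:
        return False         # new rule: the opcode now requires an argument
    required = stack[-1]     # CLTV peeks at the top item, it does not pop
    # (real BIP65 also checks locktime type and nSequence; elided here)
    return tx_locktime >= required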
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
August 28, 2018, 10:41:17 AM
#15
... and I don't believe in LN being an alternative technology.
So what's your alternative? 500 GB blocks every 10 minutes? If you read my paper, you'd see I'm for a reasonable increase, not an extreme one.
My alternative is something like a 10x decrease in block time in the short term, improving PoW to a pooling-pressure-free version, and providing a strong enough infrastructure for sharding in the long term. I would propose a lot more improvements in the meantime, including but not limited to moving all signature data to the witness space, using Schnorr signatures, ...
Quote
Quote
But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications, because of the progress and proximity consequences
I think I've highlighted this enough times in the paper, but I'd like to point out an important point again: you can limit superspace blocks by saying in the protocol that a block is invalid if its size exceeds, let's say, 10 MB. The increase from 1 MB to 2 MB in SegWit has not really affected decentralization so far; neither would 10 MB. 128 MB blocks definitely would.

Also, it doesn't mean that every block will be 10 MB from now on, only during the "rush hour".

Where did I get the 10 MB number? Nowhere; it's just an example. We can set it to whatever limit the community deems reasonable. The point is that we can impose it, and we ourselves will be able to decide what the block limit should be in today's circumstances, rather than deferring to Satoshi, who initially put in the 1 MB limit back in 2010, looking at the 2-3 KB blocks of that era.
Setting a 10 MB limit on superspace blocks is just the same as putting an 11 MB limit on the block size outright. You are still offering nothing more than a complicated algorithm tweak to reach a point that is simply achievable with a hard fork.

I don't want to undermine your work, but I think it is just about avoiding hard forks by mimicking the SegWit approach.

Honestly, I hate SegWit exactly because of its tricky approach; it looks to me like cobbling things together in a hacker way. I love hacks, but not when it comes to the core algorithm; as a rule of thumb, we should keep core components elegant and smart.
legendary
Activity: 3430
Merit: 3080
August 28, 2018, 08:55:57 AM
#14
If the main layer is too congested and you are in the situation where you
want to close a channel unilaterally, you are screwed.

eltoo solves this problem
copper member
Activity: 85
Merit: 122
August 28, 2018, 08:16:32 AM
#13
If the main layer is too congested and you are in the situation where you
want to close a channel unilaterally, you are screwed.
This conversation is more suitable for an LN-specific thread and for people who know more about LN's low-level security constraints than me, but I'll just point out that the deadline for closing the channel is not one block; it is whatever period of blocks you put in the funding transaction's timelock. If you see the main layer becoming more congested, you close earlier, at a safe distance from the deadline. You could, for example, have a rule of opening channels for two weeks, but always closing and reopening the channel 24 hours before it expires (a small sketch of such a policy is below). That said, I agree with you: in the process of maturing, LN has a lot of things to take care of, including this one.
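
A small sketch of such a policy (a hypothetical helper, not taken from any LN implementation):

Code:
BLOCKS_PER_DAY = 144  # at ~10-minute blocks

def should_close_now(current_height, deadline_height, margin=BLOCKS_PER_DAY):
    # broadcast the closing tx once we are within `margin` blocks of the
    # timelock deadline, leaving headroom for mainchain congestion
    return deadline_height - current_height <= margin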
sr. member
Activity: 658
Merit: 282
August 28, 2018, 08:04:52 AM
#12
...
or tell you to come get your money Monday through Friday from 2pm to 4pm. They don't have this power with LN; at no point are your funds at risk. You can close the channel back to the blockchain if you don't like how the counterparty behaves. I can't vouch for how effective it can ultimately be, we'll see, but it's a neat concept nevertheless.
...

Nonetheless, this system doesn't work in various situations. E.g., you can broadcast a closing transaction if you don't like how the counterparty behaves, but it is possible that the main layer is too congested for it to go through in time, or that you don't have the necessary liquid funds for a transaction fee that ensures a fast confirmation.

If the main layer is too congested and you are in the situation where you
want to close a channel unilaterally, you are screwed.
copper member
Activity: 85
Merit: 122
August 28, 2018, 07:46:25 AM
#11
Anyway, you are the one who has come up with an onchain scaling proposal, and at the same time you are promoting LN?! Why should anybody even care about onchain solutions if he is a believer in LN or any 2nd layer alternative?

You kinda answered this question yourself:

Quote
But generally, I think there is no decentralized, trustless, secure solution on the horizon other than blockchain

My personal estimate is that it will take several years for LN to mature, but we need bigger blocks right now. This solution would provide time for LN developers (or sidechains, or whatever) to do their thing, and we would enjoy 5-10 MB blocks until that time.

Quote
and I don't believe in LN being an alternative technology.
So what's your alternative? 500 GB blocks every 10 minutes? If you read my paper, you'd see I'm for a reasonable increase, not an extreme one.

Quote
Why should we use blockchain for LN, by the way? One could adapt LN to run on fiat and the traditional banking system.
Traditional banking systems are custodial systems, where they are kings and can do whatever they want: run off with your money, or shut down your account at any time for any reason, or tell you to come get your money Monday through Friday from 2pm to 4pm. They don't have this power with LN; at no point are your funds at risk. You can close the channel back to the blockchain if you don't like how the counterparty behaves. I can't vouch for how effective it can ultimately be, we'll see, but it's a neat concept nevertheless.

Quote
But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications, because of the progress and proximity consequences
I think I've highlighted this enough times in the paper, but I'd like to point out an important point again: you can limit superspace blocks by saying in the protocol that a block is invalid if its size exceeds, let's say, 10 MB. The increase from 1 MB to 2 MB in SegWit has not really affected decentralization so far; neither would 10 MB. 128 MB blocks definitely would.

Also, it doesn't mean that every block will be 10 MB from now on, only during the "rush hour".

Where did I get the 10 MB number? Nowhere; it's just an example. We can set it to whatever limit the community deems reasonable. The point is that we can impose it, and we ourselves will be able to decide what the block limit should be in today's circumstances, rather than deferring to Satoshi, who initially put in the 1 MB limit back in 2010, looking at the 2-3 KB blocks of that era. (A minimal sketch of such a rule is below.)
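
A minimal sketch of such a rule (hypothetical names; 10 MB is only the example value from above):

Code:
MAX_SUPERSPACE_BLOCK_SIZE = 10_000_000  # example cap; set by consensus

def check_superspace_block_size(serialized_block):
    # every validating node applies the same hard cap, so an oversized
    # superspace block is simply invalid, exactly like an oversized legacy block
    return len(serialized_block) <= MAX_SUPERSPACE_BLOCK_SIZE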
copper member
Activity: 85
Merit: 122
August 28, 2018, 07:35:35 AM
#10
Yet the percentage of legacy users and services still stubbornly holding on.
I think they will quickly reconsider when the next bull run happens, the network gets congested again, and the fees soar. Many people will adopt new things only when they absolutely have to, at the last moment, when they feel the pain and see the solution that eases it.

Quote
but the fact that they can't transact/spend to native SW unless using an upgraded client must already mean some users are shorn off
Alternatively, you can provide a SegWit P2SH address to those legacy clients; the net result will be just the same. With a note underneath the bech32 address: "If your software doesn't understand this, use this address instead: 3........ Oh, and by the way, consider upgrading your software to the latest version."
Or you can just provide the P2SH address alone. Even if you use P2SH addresses alone, you still use SegWit (a sketch of deriving such an address is below).
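
For illustration, a self-contained sketch of how such a "3..." fallback is derived: the P2WPKH witness program is wrapped in P2SH, so even a legacy wallet can pay the SegWit output (assumes hashlib's OpenSSL build provides ripemd160):

Code:
import hashlib

B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def hash160(data):
    return hashlib.new("ripemd160", hashlib.sha256(data).digest()).digest()

def base58check(payload):
    full = payload + hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    num, out = int.from_bytes(full, "big"), ""
    while num:
        num, rem = divmod(num, 58)
        out = B58[rem] + out
    pad = len(full) - len(full.lstrip(b"\x00"))  # leading zero bytes become '1'
    return "1" * pad + out

def p2sh_p2wpkh_address(compressed_pubkey):
    redeem_script = b"\x00\x14" + hash160(compressed_pubkey)  # OP_0 <20-byte keyhash>
    return base58check(b"\x05" + hash160(redeem_script))      # 0x05 = mainnet P2SH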
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
August 28, 2018, 06:33:27 AM
#9
2nd layer scaling solutions are inherently vulnerable to centralization
It is important to have the same definitions in mind when discussing things like centralization...
... What's left is perceived centralization, however, because such a network would tend to center around the hubs with the biggest liquidity and connections, so it would look like a hub-and-spoke topology.

It's a bit offtopic here though.
Indeed it is. But generally, I think there is no decentralized, trustless, secure solution on the horizon other than blockchain, and I don't believe in LN being an alternative technology. Why should we use blockchain for LN, by the way? One could adapt LN to run on fiat and the traditional banking system.

Anyway, you are the one who has come up with an onchain scaling proposal, and at the same time you are promoting LN?! Why should anybody even care about onchain solutions if he is a believer in LN or any 2nd layer alternative?

Quote
Quote
because having a parallel block just doesn't sound very different from suggesting an increased block size
Very different. One is a hard fork resulting in a network split; the other is a soft fork, not affecting legacy software in the slightest. That's the whole point.

So it is a block size increase solution without the need for a hard fork. But as far as I understand, the main controversial issue with block size increase proposals is not their need for a hard fork; rather, it is their centralization implications, because of the progress and proximity consequences.
legendary
Activity: 3010
Merit: 3724
Join the world-leading crypto sportsbook NOW!
August 28, 2018, 06:05:53 AM
#8
Thank you for this, OP. I'm severely limited in terms of technical knowledge here, but always keen to follow Bitcoin upgrades, and I have been happily using SW for a while now, enjoying the very obvious benefits. Just curious about the paper, which assumes (as I do) that SegWit has already reached the stage of significant adoption (at least through P2SH) and can be considered mature, yet a percentage of legacy users and services are still stubbornly holding on. Will implementing this and possible further expansions cause even further risk of alienating legacy users?

I know it doesn't affect them (no network partitioning, as your paper says), but the fact that they can't transact/spend to native SW unless using an upgraded client must already mean some users are shorn off... Or does it actually make no difference, or would it actually further push to encourage the SW upgrade?
copper member
Activity: 85
Merit: 122
August 28, 2018, 05:36:39 AM
#7
2nd layer scaling solutions are inherently vulnerable to centralization
It is important to have the same definitions in mind when discussing things like centralization. In Bitcoin, decentralization means censorship-resistance (due to the many parties involved that take random turns producing the blocks), the impossibility of any single action bringing down the whole network (in centralized systems, this would be chopping off the head), and custodial decentralization (no need to trust that someone won't steal your funds, because they can't). You would have the same in LN: censorship-resistance (if 10 routes don't want to do business with you, you route through an 11th), no way to bring down the whole network (you bring down the big hubs, and the little ones take their place), and custodial decentralization (you still don't trust anybody other than the math; hubs are not custodians of your funds). What's left is perceived centralization, however, because such a network would tend to center around the hubs with the biggest liquidity and connections, so it would look like a hub-and-spoke topology.

It's a bit offtopic here though.

Quote
because having a parallel block just doesn't sound very different from suggesting an increased block size
Very different. One is a hard fork resulting in a network split; the other is a soft fork, not affecting legacy software in the slightest. That's the whole point.
legendary
Activity: 1456
Merit: 1175
Always remember the cause!
August 28, 2018, 05:16:37 AM
#6
I don't think there is any consensus that we actually want to increase the block size any further for now - that is the main reason why extension blocks have not been deployed.  The hard problem is not how to do it technically, but whether we need to do it or whether we should stick to what we have now with Segwit and scale on layer 2 instead.

While the block-size increase was one of the effects of Segwit, it was not the only and perhaps not the most important one.  Segwit also solves transaction malleability in an elegant way, which makes the implementation of layer-2 techniques (including Lightning) easier.
Ethereum's idol, Vitalik Buterin, says it is impossible, that there is a law, like in thermodynamics, that forbids onchain scaling. It is a ridiculous claim, and I've refuted it on many occasions.

2nd layer scaling solutions are inherently vulnerable to centralization, no matter how many bitcoin devs are working on them, because once you are working on such a protocol you are an outlander to the bitcoin community.

The basic axiom of bitcoin and cryptocurrency (unlike what Buterin is trying to sell us) is the possibility of achieving all three of the characteristics (which he claims form a trilemma) in a blockchain: security, decentralization, and performance.

Axioms are not subject to debate. Anybody who thinks we can't have better performance without jeopardizing security or decentralization is not a bitcoiner or a member of the cryptocurrency movement. Such a person is just a revisionist, probably hired by feds or corps, or, like Buterin, owns a corp, or, like some bitcoiner version of Buterin, is planning for such a position. Fuck 2nd layer solutions; improve the actual blockchain.

As for this proposal:
I think the idea of having double-referenced blocks, called Superblocks here, won't help with the canonical drawback of bigger blocks: when the number of transactions grows, the propagation delay increases, because nodes have to query and validate the transactions, and that aggravates proximity-related problems (a rough model follows below).
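
A rough back-of-envelope model of that effect (the numbers are my own assumptions, not measurements): under naive store-validate-forward relay, per-hop time grows linearly with block size, so bigger blocks widen the gap between well-connected and poorly-connected miners. Compact blocks and FIBRE mitigate this, but don't erase it:

Code:
def naive_propagation_ms(block_mb, hops=6, bandwidth_mbps=50, validate_ms_per_mb=50):
    transfer_ms = block_mb * 8 / bandwidth_mbps * 1000  # wire time per hop
    per_hop_ms = transfer_ms + block_mb * validate_ms_per_mb
    return per_hop_ms * hops  # each hop stores, validates, then forwards

# naive_propagation_ms(1)  ->  1260.0 ms across 6 hops
# naive_propagation_ms(10) -> 12600.0 ms, a 10x wider proximity gap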

For now, I suppose it won't get support, not because the devs are so fond of LN (bitcoin is not Ethereum; nobody has conquered it, and nobody dares resist a brilliant onchain scaling idea because of his corporation's best interests in 2nd layer solutions), but rather because having a parallel block just doesn't sound very different from suggesting an increased block size. I have to read it more deeply to be sure, though.
copper member
Activity: 85
Merit: 122
August 28, 2018, 05:07:36 AM
#5
Congrats on reaching the "over 1000 posts" mark!  Smiley

I don't think there is any consensus that we actually want to increase the block size any further for now - that is the main reason why extension blocks have not been deployed.
You can hardly find 100% consensus about anything these days, but I think there is a need. Every day I see topics here with people complaining about the block size and freaking out at the thought that the full-blocks incident could happen again on the next bull run.

I also remember conversations happening pre-SegWit: some people were saying there was no need to increase the block size beyond 1 MB, and therefore no need for SegWit, yet it got rolled out anyway as a compromise solution. Likewise, there is no harm in rolling out superspace/extension blocks, and it would actually be a better compromise solution. All I'm saying is, if we're doing this anyway, why not go all the way?

Quote
The hard problem is not how to do it technically, but whether we need to do it or whether we should stick to what we have now with Segwit and scale on layer 2 instead.
How optimistic are you about when layer 2 solutions will become usable? I am working on one implementation of LN, and I see that some questions, like routing and liquidity, are still up in the air. This does not diminish the existing efforts of developers, of course; we've come a long way, but we're still somewhere in the middle.

Quote
While the block-size increase was one of the effects of Segwit, it was not the only and perhaps not the most important one.  Segwit also solves transaction malleability in an elegant way, which makes the implementation of layer-2 techniques (including Lightning) easier.
Of course, and I did mention that. However, SegWit was sold to the community mostly as the backward-compatible block increase solution, and that aspect of it caught the most attention.