
Topic: Segregated witness - The solution to Scalability (short term)? - page 16. (Read 23172 times)

legendary
Activity: 994
Merit: 1035
Pieter Wuille is highly respected because he is one of the devs who took the right, conservative approach during the 2013 fork. Still, his proposal cannot be accepted without careful review

We know that the large players in the bitcoin community never listen to anyone but themselves, so unless a proposal can be understood by them it will just be ignored. People ignore Gavin's solution because they don't understand the potential risk of his radical change to the block size limit. Similarly, if Pieter's solution is so complex (much more complex than Gavin's) that the majority of the large players cannot understand it, it will just be ignored. You can never convince the large mining pools with those slides

Am I wrong to assume that the large mining pool owners aren't well versed in the rudimentary basics of bitcoin? If I can understand it, I am sure they can.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
Pieter Wuille is highly respected because he is one of the devs who took the right, conservative approach during the 2013 fork. Still, his proposal cannot be accepted without careful review

We know that the large players in the bitcoin community never listen to anyone but themselves, so unless a proposal can be understood by them it will just be ignored. People ignore Gavin's solution because they don't understand the potential risk of his radical change to the block size limit. Similarly, if Pieter's solution is so complex (much more complex than Gavin's) that the majority of the large players cannot understand it, it will just be ignored. You can never convince the large mining pools with those slides
legendary
Activity: 994
Merit: 1035
ETA update-

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011875.html

"- Segwit BIP is being written, but has not yet been published.

  - Gregory linked to an implementation but as he mentions it is not completely
    finished yet. ETA for a Segwit testnet* is later this month, then you can test as well.

Wladimir"


I assume Wladimir is referring to rolling segwit into the main bitcoin testnet instead of it being tested merely on the Elements sidechain testnet.



legendary
Activity: 1162
Merit: 1004
We welcome a 1.8x increase. But I say it will not be the solution. That's not even enough to survive until the halving date.

If this is your impression of being welcoming, then I'd certainly avoid any event where you were greeting the guests.

And more doomsday deadline scaremongering, after last time? Come on now.

It is almost as if he believes that the only thing holding bitcoin back is capacity... and as soon as we build the 100k-seat stadium it will be filled.

More projections out of thin air. Crazy.
legendary
Activity: 1162
Merit: 1004
You're thinking of proposals that people dislike, not proposals that have received a warm welcome. Pieter Wuille is highly respected, mainly because of his amazing computer science work so far. This sounds like more of the same, and all you can do is lie and hate? I feel genuinely sorry for you Zara, it can't be much fun, for you or your friends, being such a bitter person.
Just ignore anyone who ignores this:


Ah, the ignoring user changed the title of his thread.
legendary
Activity: 994
Merit: 1035
We welcome a 1.8x increase. But I say it will not be the solution. That's not even enough to survive until the halving date.

If this is your impression of being welcoming, then I'd certainly avoid any event where you were greeting the guests.

And more doomsday deadline scaremongering, after last time? Come on now.

It is almost as if he believes that the only thing holding bitcoin back is capacity... and as soon as we build the 100k-seat stadium it will be filled. Even if these optimistic delusions are correct, it would be disastrous for the bitcoin ecosystem to have that rapid a pace of growth in such short order.

I am so glad there are enough rational and calm developers contributing who are aware of the nuances and trade-offs in decentralization and security.

Some say that a 'monster softfork' would be a dumber 'immediate' short-term last-minute solution than a simple increase of the block size, when everyone can see that capacity is at the limit right now and the halving event is just 7 months away.

Define simple. Bitpay's BIP 101 patch actually has more lines of code than the SW softfork.
legendary
Activity: 1162
Merit: 1004

If we make sloppy and dumb capacity upgrades in fear and haste we don't have the right incentives to make the right tradeoffs and develop optimal solutions which benefit all. There is no harm in having tested backup plans if the need arises and demand increases more than expected.

Some say that a 'monster softfork' would be a dumber 'immediate' short-term last-minute solution than a simple increase of the block size, when everyone can see that capacity is at the limit right now and the halving event is just 7 months away.
legendary
Activity: 3430
Merit: 3080
We welcome a 1.8x increase. But I say it will not be the solution. That's not even enough to survive until the halving date.

If this is your impression of being welcoming, then I'd certainly avoid any event where you were greeting the guests.

And more doomsday deadline scaremongering, after last time? Come on now.
legendary
Activity: 994
Merit: 1035
projections. Are you crazy? We welcome a 1.8x increase. But I say it will not be the solution. That's not even enough to survive until the halving date.

No developer is suggesting it is "the solution". The fact that you cannot see that after we keep clarifying and providing contrary evidence suggests you are acting irrationally.

"I think that right
now capacity is high enough and the needed capacity is low enough that
we don't immediately need these proposals, but they will be critically
important long term."

Is this a joke? What is long term? 1 month before the halving date or 1 month after?

Serious people do not pretend to know how quickly capacity will need to scale in the future.

This is why a very responsible approach is outlined here-


Quote from: nullc
In Bitcoin Core we should keep patches ready to implement them as the need and the will arises, to keep the basic software engineering from being the limiting factor.


If we make sloppy and dumb capacity upgrades in fear and haste we don't have the right incentives to make the right tradeoffs and develop optimal solutions which benefit all. There is no harm in having tested backup plans if the need arises and demand increases more than expected.

The PayPal and Visa networks have outages all the time and the ecosystem doesn't simply crumble into chaos. With bitcoin, the worst fear is some transactions having their confirmation delayed while payment processors rely more heavily on 0-conf verification and an agreed-upon, tested hardfork gets deployed. This is far less of an issue than the Visa payment network or PayPal being down.
legendary
Activity: 1162
Merit: 1004
All in all, seems to be even less than a x2 capacity increase:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011869.html

For how many months will it be 'the solution' in 2016, the year of the Great Halvening? April or even May?

It was never presented or intended to be "the solution"; it is merely a small piece of a puzzle in a comprehensive and holistic approach to addressing scalability and capacity.

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html


"I think that right
now capacity is high enough and the needed capacity is low enough that
we don't immediately need these proposals, but they will be critically
important long term."

Is this a joke? What is long term? 1 month before the halving date or 1 month after?
legendary
Activity: 1162
Merit: 1004
All in all, seems to be even less than a x2 capacity increase:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011869.html

Lol you clearly still can't read with any proficiency.

For how many months will it be 'the solution' in 2016, the year of the Great Halvening? April or even May?

You're thinking of proposals that people dislike, not proposals that have received a warm welcome. Pieter Wuille is highly respected, mainly because of his amazing computer science work so far. This sounds like more of the same, and all you can do is lie and hate? I feel genuinely sorry for you Zara, it can't be much fun, for you or your friends, being such a bitter person.

Your stupid projections. Are you crazy? We welcome a 1.8x increase. But I say it will not be the solution. That's not even enough to survive until the halving date.
legendary
Activity: 994
Merit: 1035
All in all, seems to be even less than a x2 capacity increase:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011869.html

For how many months will it be 'the solution' in 2016, the year of the Great Halvening? April or even May?

It was never presented or intended to be "the solution"; it is merely a small piece of a puzzle in a comprehensive and holistic approach to addressing scalability and capacity.

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html

Quote from: nullc
TL;DR: I propose we work immediately towards the segwit 4MB block soft-fork, which increases capacity and scalability, and recent speedups and incoming relay improvements make segwit a reasonable risk. BIP9 and segwit will also make further improvements easier and faster to deploy. We'll continue to set the stage for non-bandwidth-increase-based scaling, while building additional tools that would make bandwidth increases safer long term. Further work will prepare Bitcoin for further increases, which will become possible when justified, while also providing the groundwork to make them justifiable.

Quote from: nullc
Concurrently, there is a lot of activity ongoing related to "non-bandwidth" scaling mechanisms. Non-bandwidth scaling mechanisms are tools like transaction cut-through and bidirectional payment channels which increase Bitcoin's capacity and speed using clever smart contracts rather than increased bandwidth. Critically, these approaches strike right at the heart of the capacity vs autonomy trade-off, and may allow us to achieve very high capacity and very high decentralization.


Quote from: nullc
(http://diyhpl.us/wiki/transcripts/scalingbitcoin/hong-kong/overview-of-bips-necessary-for-lightning
is a relevant talk for some of the wanted network features for Lightning,
a bidirectional payment channel proposal which many parties are working
on right now; other non-bandwidth improvements discussed in the past
include transaction cut-through, which I consider a must-read for the
basic intuition about how transaction capacity can be greater than
blockchain capacity: https://bitcointalksearch.org/topic/transaction-cut-through-281848 ,
though there are many others.)

Quote from: nullc
Further out, there are several proposals related to flex caps or incentive-aligned dynamic block size controls based on allowing miners to produce larger blocks at some cost.

Quote from: nullc
Finally--at some point the capacity increases from the above may not be enough. Delivery on relay improvements, segwit fraud proofs, dynamic block size controls, and other advances in technology will reduce the risk and therefore controversy around moderate block size increase proposals (such as 2/4/8 rescaled to respect segwit's increase). Bitcoin will be able to move forward with these increases when improvements and understanding render their risks widely acceptable relative to the risks of not deploying them. In Bitcoin Core we should keep patches ready to implement them as the need and the will arises, to keep the basic software engineering from being the limiting factor.
legendary
Activity: 2674
Merit: 2965
Terminated.
You're thinking of proposals that people dislike, not proposals that have received a warm welcome. Pieter Wuille is highly respected, mainly because of his amazing computer science work so far. This sounds like more of the same, and all you can do is lie and hate? I feel genuinely sorry for you Zara, it can't be much fun, for you or your friends, being such a bitter person.
Just ignore anyone who ignores this:
I don't understand any of this, maybe there can be something written for the less technical among us?
There already was an airplane analogy. Re-read the whole thread.

I've updated the thread title a bit.
hero member
Activity: 506
Merit: 500
I don't understand any of this, maybe there can be something written for the less technical among us?
legendary
Activity: 3430
Merit: 3080
All in all, seems to be even less than a x2 capacity increase:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011869.html

Lol you clearly still can't read with any proficiency.

For how many months will it be 'the solution' in 2016, the year of the Great Halvening? April or even May?

You're thinking of proposals that people dislike, not proposals that have received a warm welcome. Pieter Wuille is highly respected, mainly because of his amazing computer science work so far. This sounds like more of the same, and all you can do is lie and hate? I feel genuinely sorry for you Zara, it can't be much fun, for you or your friends, being such a bitter person.
legendary
Activity: 1162
Merit: 1004
All in all, seems to be even less than a x2 capacity increase:

http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011869.html

For how many months will it be 'the solution' in 2016, the year of the Great Halvening? April or even May?
legendary
Activity: 994
Merit: 1035
"What are the advantages of this? It allows you to drop the signatures from relay whenever you are relaying to a node that is not actually doing full-validation at the time. "

So this solution still does not help full-validation nodes; it only improves the relay speed for SPV nodes. But SPV nodes could just connect to the latest full node to get faster access; they are not the bottleneck for the whole network

https://www.reddit.com/r/Bitcoin/comments/3vurqp/greg_maxwell_capacity_increases_for_the_bitcoin/

Quote from: nullc
Segregated witness does several things: fixing malleability, improving upgradability, improving scalability, and increasing capacity.

The improved scalability comes from the new security models it makes available: lite nodes with full-node security (under specific conditions), fractional ('sharded') verification, quick bootstrapping by not fetching history data you're not going to verify and are only going to prune, and reduced data sent to lite clients.

Increased capacity comes from the fact that it takes roughly 2/3rds of the transaction data from participating transactions out of the part of the block that the blocksize limit counts and moves it to the witness, where it is counted (by the implementation) as 1/4th the size. The result for typical transactions is a better than 2x increase in capacity (more if multisig is heavily used). In the worst case, a strategically behaving miner might produce a 4MB bloat-block, since that's the largest size you can get if you were to make an all-witness block. The typical increase in block size would be more like 2MB, but expressing it that way would underplay the worst-case behavior.
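
The accounting Maxwell describes can be sketched numerically. This is an illustrative calculation of the "witness bytes count as 1/4" rule, not code from any real implementation; the function name and witness-fraction figures are my own assumptions for illustration:

```python
from fractions import Fraction  # exact arithmetic, avoids float noise

LIMIT = 1_000_000  # bytes counted against the 1 MB block size limit

def max_block_bytes(witness_fraction: Fraction) -> int:
    """Largest raw block size whose discounted cost fits the limit.

    Per byte of block: non-witness data counts fully, witness data at 1/4,
    so cost = (1 - f) + f/4 where f is the witness fraction of the block.
    """
    cost_per_byte = (1 - witness_fraction) + witness_fraction / 4
    return int(LIMIT / cost_per_byte)

# Typical mix (quote above: ~2/3 of tx data is signatures) -> ~2x capacity
print(max_block_bytes(Fraction(2, 3)))  # 2000000 bytes, i.e. ~2 MB
# Adversarial all-witness bloat-block -> the 4 MB worst case
print(max_block_bytes(Fraction(1)))     # 4000000 bytes, i.e. 4 MB
```

This makes clear why the "2x typical, 4MB worst case" phrasing is not a contradiction: both numbers fall out of the same discount rule applied to different transaction mixes.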

Here is an exact breakdown of an example of the savings in overhead and why capacity can increase-

Quote from: nullc
Yeah, the exact impact depends on usage patterns.

If your case is counting one-input, one-output pay-to-hash transactions, the sizes work out to

4 (version) + 1 (vin count) + 32 (input id) + 4 (input index) + 4 (sequence no) + 1 (sig len) + 0 (sig) + 1 (output count) + 1 (output len) + 36 (32-byte witness program hash, push overhead, OP_SEGWIT) + 8 (value) + 4 (nlocktime) = 96 non-witness bytes

1 (witness program length) + 1 (witness program type) + 33 (pubkey) + 1 (checksig) + 1 (witness length) + 73 (signature) = 110.

96x + 0.25*110x = 1000000; x = 8097, or 13.5 TPS for 600-second blocks (this is without the code in front of me, so I may well have slightly miscounted an overhead, but it's roughly that)... which is around double if you were assuming 7 TPS as your baseline. Which is why I said double the capacity in my post... but YMMV.
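
The arithmetic in that last quote checks out. A short sketch (byte counts are copied verbatim from the quote, not taken from an actual implementation):

```python
# Capacity estimate for the 1-input, 1-output pay-to-witness-hash
# example quoted above. All byte counts come from the quote itself.

BASE_BYTES = 4 + 1 + 32 + 4 + 4 + 1 + 0 + 1 + 1 + 36 + 8 + 4  # 96 non-witness bytes
WITNESS_BYTES = 1 + 1 + 33 + 1 + 1 + 73                        # 110 witness bytes

BLOCK_LIMIT = 1_000_000  # bytes counted against the 1 MB limit
DISCOUNT = 0.25          # witness bytes count as 1/4 under segwit

# Each transaction "costs" base + discounted witness bytes against the limit.
cost_per_tx = BASE_BYTES + DISCOUNT * WITNESS_BYTES  # 96 + 27.5 = 123.5
txs_per_block = BLOCK_LIMIT / cost_per_tx
tps = txs_per_block / 600  # 600-second target block interval

print(round(txs_per_block))  # 8097 transactions per block
print(round(tps, 1))         # 13.5 TPS
```

So "x = 8097" and "13.5 TPS" are exactly what the quoted formula yields, roughly double a 7 TPS baseline.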

legendary
Activity: 2674
Merit: 2965
Terminated.
"What are the advantages of this? It allows you to drop the signatures from relay whenever you are relaying to a node that is not actually doing full-validation at the time. "

So this solution still does not help full-validation nodes; it only improves the relay speed for SPV nodes. But SPV nodes could just connect to the latest full node to get faster access; they are not the bottleneck for the whole network
What are you talking about? Advantages: up to a 4x capacity increase that is possible via a soft fork, a fix to malleability, simpler script upgrades (quite important as well), fraud proofs, less bandwidth for light nodes and historical sync. I think that it could help full nodes (not sure yet) by not syncing all of the historical data (i.e. signatures), because it isn't really needed. More discussion about this is definitely needed.

legendary
Activity: 1988
Merit: 1012
Beyond Imagination
"What are the advantages of this? It allows you to drop the signatures from relay whenever you are relaying to a node that is not actually doing full-validation at the time. "

So this solution still does not help full-validation nodes; it only improves the relay speed for SPV nodes. But SPV nodes could just connect to the latest full node to get faster access; they are not the bottleneck for the whole network
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
I have watched this video twice but still don't really get it

So far as I understand Pieter's proposal, a spending transaction is first validated by the mining nodes by checking its signature; then, after it is validated, the node will include the transaction in the next mined block, but without that signature

However, this creates a security question for the other nodes receiving such a block: without the signature, how do they know the transaction is valid? Is it possible for a rogue node to forge a block with lots of invalid transactions that appear to be all validated by that node?

Normally all the adjacent nodes have similar mempools containing similar transactions. If a transaction is already in their mempool, they will know whether it is valid or not. However, if a new block arrives from far away which contains many transactions not in their mempool, they need all the data of each transaction, especially the signature, to validate it

Otherwise you would be able to spend bitcoin from any address without the signature

I'm not very sure how it works today, e.g. what kind of checks other nodes do to make sure it is a valid block with all valid transactions; I'll have to look into it


A side thought: I suppose the original Satoshi client design tried its best to be both secure and efficient. If there is such large room for efficiency gains without impacting security, why has it not been implemented before?

I remember 2-3 years ago there was talk about separating the transaction data using two chains to get rid of the ever-growing block size pain; then, after some deeper discussion, it turned out you cannot secure the separated part from being tampered with, so it seems everything in a transaction is a whole and needs to be included and secured by hash power altogether, which is the most efficient way
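
On the tampering concern raised in this post: segwit's answer, as I understand the proposal, is that each transaction effectively gets two hashes. The txid is computed over the transaction without the witness (so signatures can no longer change it, fixing malleability), while a second, witness-inclusive hash is committed to in the block (via a Merkle commitment in the coinbase), so the separated signatures are still secured by hashpower and fully validating nodes still check every one. A toy sketch, using made-up serialization strings rather than real Bitcoin encoding:

```python
import hashlib

def dhash(b: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

# Toy transaction: the base part (inputs/outputs) and the witness (signatures).
# Real serialization is far more involved; this only illustrates the split.
base = b"version|inputs|outputs|locktime"
witness = b"signature-and-pubkey"

txid = dhash(base)             # excludes the witness: tweaking a signature
                               # cannot change the txid (malleability fix)
wtxid = dhash(base + witness)  # includes the witness: a Merkle root over
                               # these is committed in the block, so the
                               # witness data cannot be tampered with either

assert txid != wtxid  # the two hashes cover different data
```

Only nodes that are not doing full validation at the time may skip downloading witnesses; that is where the relay savings quoted earlier come from.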