
Topic: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network" - page 9. (Read 18415 times)

legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
A transaction will have one or more sigops (signature operations).  Let's denote these as S1, S2, ..., Sn.  Apparently verifying these involves considering pairs, e.g. (S1, S2), (S1, S3), ..., (S1, Sn), (S2, S3), (S2, S4), ..., (S2, Sn), ..., (Sn-1, Sn).  This leads to quadratic growth of the compute time.  To avoid falling behind the live stream of blocks, transactions are limited in size; they cannot span multiple blocks.  Furthermore, the block size is limited, currently to 1MB.

The least compute-intensive block is composed of a single transaction with only one sigop.  The most compute-intensive block is also composed of a single transaction, but with as many sigops as will fit.  A block with two or more transactions will be less computationally intense.  A block full of many transactions, each with just a single sigop, is minimal in terms of the compute power required to verify it.

Let's denote a transaction with k sigops as Tk.  A block is composed of a set of transactions (Tk1, Tk2, ..., Tkn).  The verification time is then approximated by k1² + k2² + ... + kn².

To scale we must do one or more of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?

this is a non-issue;
#1 and #2 have already been done.

it's not like these crazy computationally intensive TXs are legit; you only get into trouble if you try to allow crazy spam-like TXs with thousands of inputs.

miners are allowed to orphan a block for ANY reason. I think it's perfectly valid to not allow spam TXs designed to slow down the validation time of a block.
hero member
Activity: 709
Merit: 503
A transaction will have one or more sigops (signature operations).  Let's denote these as S1, S2, ..., Sn.  Apparently verifying these involves considering pairs, e.g. (S1, S2), (S1, S3), ..., (S1, Sn), (S2, S3), (S2, S4), ..., (S2, Sn), ..., (Sn-1, Sn).  This leads to quadratic growth of the compute time.  To avoid falling behind the live stream of blocks, transactions are limited in size; they cannot span multiple blocks.  Furthermore, the block size is limited, currently to 1MB.

The least compute-intensive block is composed of a single transaction with only one sigop.  The most compute-intensive block is also composed of a single transaction, but with as many sigops as will fit.  A block with two or more transactions will be less computationally intense.  A block full of many transactions, each with just a single sigop, is minimal in terms of the compute power required to verify it.

Let's denote a transaction with k sigops as Tk.  A block is composed of a set of transactions (Tk1, Tk2, ..., Tkn).  The verification time is then approximated by k1² + k2² + ... + kn².

To scale we must do one or more of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?
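As for the question: any transaction spending several inputs performs one signature check per input, and a multisig input requires several on its own. The cost model sketched in the post can be written down as a toy calculation (Python; the function names and the exact k² cost are my illustrative assumptions, not Bitcoin's actual validation code):

```python
# Toy model of the quadratic-validation concern described above.
# Assumption (mine, for illustration): verifying a transaction with k
# sigops costs k^2 units, so a block costs k1^2 + k2^2 + ... + kn^2.

def tx_cost(sigops: int) -> int:
    """Cost of verifying one transaction under the quadratic assumption."""
    return sigops ** 2

def block_cost(sigop_counts: list[int]) -> int:
    """Total verification cost of a block: the sum of per-tx costs."""
    return sum(tx_cost(k) for k in sigop_counts)

# Same total number of sigops, very different cost:
many_small = block_cost([1] * 1000)  # 1000 txs with 1 sigop each
one_big = block_cost([1000])         # 1 tx with 1000 sigops

print(many_small)  # 1000
print(one_big)     # 1000000
```

Note how a block full of single-sigop transactions is 1000x cheaper than one transaction carrying the same total number of sigops, which is exactly the asymmetry the post describes.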
legendary
Activity: 4424
Merit: 4794
Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/

because bitcoin's vision is about a decentralized currency not run by corporations who intend to screw users over.
the funny thing is that those denouncing satoshi's vision are doing so not because of any logic showing satoshi was wrong overall, but because they are on the corporate/capitalist bandwagon and want to profit from other people's misery

even this very topic, thinking bitcoin needs to be like Visa instead of just decentralized cash (cheques, to be more precise), shows that people want to move away from the decentralized zero-control premise
legendary
Activity: 2786
Merit: 1031
Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/

That's because we came to bitcoin on Satoshi's vision, and we still care for that vision; Bitcoin should fail or succeed by that vision.

The people currently in charge have a different vision for bitcoin; they should instead, like so many other people, build their own alternative system and let bitcoin be what it was supposed to be.
sr. member
Activity: 462
Merit: 250
Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/
legendary
Activity: 4424
Merit: 4794
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

have you read a single line of bitcoin code yet, or do you need to ask blockstream what language it is written in?


have you even got a full node running, or are you still unsure how long it takes to sync the blockchain?


would you run that future full node you intend to have on your own computer, or run it remotely on an amazon server?

i think Lauda has drawn too many people into his rhetoric, when in actual fact he knows very little and is just a mouthpiece for blockstream

here is a summary of the debunks of literally every doomsday scenario Lauda has attempted to inflict on the community to sway people away from blockstream's agenda (roadmap)

though classic is one implementation, there is a lot of background drama involved. so what could be done better is to get the programmers of bitcoinj, btcd, and the other main implementations to go for 2mb as well, and to find a way to get blockstream to come to their senses and adopt 2mb as well.

things like debunking the 12-month grace-period hard-fork contention argument, by pointing to luke jr's proposal for a different hard fork (a difficulty drop) that he feels can happily become active 3 months after code release. (if luke thinks 3 months is acceptable, then there is no reason to go for 12; if luke wants his code in april then 2mb can be in april too)

things like debunking validation issues by highlighting that libsecp256k1 offers 5x validation speeds, letting a total of 10,000 signatures validate in april 2016 in the same time it took 2,000 signatures to validate in january 2016, thus allowing for more than a small bit of growth

things like debunking the hard-drive storage bloat, with stats that a year of 100% filled blocks adds at most 52.5gb at 1mb, 105gb at 2mb, and 210gb at 4mb.
so a 2tb hard drive at $100 can store roughly 40 years of 1mb, 20 years of 2mb, or 10 years of 4mb (2mb+segwit)

things like debunking user upload speeds causing relay delays, by noting that millions of people can happily play an online game while in a voice-over-IP group chat and livestreaming the game to youtube or twitch, all of which are upload activities. 750kbps = ~93 kByte/s = ~56mb every 10 minutes
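The storage and bandwidth figures above check out as rough arithmetic (a sketch with idealized 10-minute blocks and 100% full blocks all year; the variable names are mine):

```python
# Checking the back-of-the-envelope numbers in the post above.
# Idealized: one block exactly every 10 minutes, 100% full all year.

blocks_per_year = 144 * 365                 # 6 blocks/hour * 24 * 365 = 52,560

# Storage growth per year, in GB, for several block sizes (1 MB = 0.001 GB).
yearly_gb_1mb = blocks_per_year * 0.001     # ~52.5 GB/year
yearly_gb_2mb = yearly_gb_1mb * 2           # ~105 GB/year
yearly_gb_4mb = yearly_gb_1mb * 4           # ~210 GB/year (GB, not MB)

# Years of history a 2 TB (~2000 GB) drive can hold.
years_1mb = 2000 / yearly_gb_1mb            # ~38 years (the post rounds to 40)
years_4mb = 2000 / yearly_gb_4mb            # ~9.5 years

# Upload budget: 750 kbit/s expressed as MB per 10-minute block interval.
kbyte_per_s = 750 / 8                       # ~93.75 kB/s
mb_per_10min = kbyte_per_s * 600 / 1000     # ~56 MB every 10 minutes
```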
legendary
Activity: 4424
Merit: 4794
What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

hard forks are not needed?
take the 2013 event due to the database bug: if we had not hard forked, we would be stuck at 500kb blocks.
and the earlier event where extra bitcoins were created: if we hadn't hard forked, the 21 million cap rule would have been broken.

if any bug appears in the future, should we just live with it? by your logic yes; by my logic, no.
use logic, not blockstream fanboyism

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it, it won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on the main net.
so some hard forks happen because people upgrade and vote for the change, and some hard forks happen because people don't want the change.

so are you advocating that people should upgrade and accept change, or not upgrade and stick with the old rules? in both cases you have proven that a hard fork would still happen.
again, use logic, not blockstream fanboyism

The question is not whether Bitcoin can scale to the Visa network, but whether Bitcoin can scale to keep up with demand, because there is no such high demand for Bitcoin to be used at Visa levels in the near future. And I'm sure the answer is yes.
You can't know that, for two reasons (as examples; there are more): 1) You don't know what the demand is represented as (e.g. TX volume? Not necessarily, as somebody could be creating a lot of TXs themselves), nor how much demand there is going to be; 2) You don't know how the technology is going to improve over the years. Anyhow, with Segwit around the corner, I don't understand any 'urgency' for a 2 MB block size limit.
and now you know why we need a BUFFER: to allow for natural growth when it happens, instead of endless begging for minimal growth every 2 years.

2mb+segwit offers 4 times the POTENTIAL, meaning instead of 2000tx of potential there can be 8000tx of potential. it does not mean blocks need to be filled with 8000 transactions as of summer 2017. it just means that blocks can grow slowly and naturally from 2000 to 8000 as and when needed, at their own natural pace, without having to cry to blockstream every few months asking for an upgrade from 2000 to 2200, or 2200 to 2400, and only getting those small movements every couple of years
legendary
Activity: 2674
Merit: 3000
Terminated.
Setting a block size limit of 1MB was, and continues to be a hacky workaround.
It is certainly not a hacky workaround. It is a limit that was needed (and, for the time being, still is).

Theory drives development, but in practice sometimes hacky workarounds are needed.
If it can be avoided, not really.

The block size limit was a hacky workaround to the expensive to validate issue. An issue that is now mitigated by other much better solutions, not least a well incentivised distributed mining economy. That is now smart enough to route around such an attack, making it prohibitively expensive to maintain.
So what exactly is the plan: replace one "hacky workaround" with another? Quite a lovely way forward. Segwit is being delivered, and it will ease the validation problem and increase transaction capacity. What exactly is the problem?

hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
legendary
Activity: 2576
Merit: 1087
No. You don't get to define what we allow in the system and what we don't, certainly not when it was possible all this time. What Gavin proposed is a hacky workaround, nothing more

Setting a block size limit of 1MB was, and continues to be a hacky workaround.

Theory drives development, but in practice sometimes hacky workarounds are needed.

I write code, I'd prefer it was all perfect. I run a business which means sometimes I have to consider bottom line. If a risk is identified and a quick fix is available it makes economic sense to apply the quick fix whilst working on a more robust long term solution.

That this has not been done inevitably leads people to question why. It's the answers that have been given to those questions that are causing the most difficulty: the fact that when those answers are challenged the story changes, and the fact that the answers are inconsistent with what seems logical to any reasonably minded impartial observer.

The most important thing is that until about a year ago there was near-unanimous agreement on what the purpose of the block size limit was and how it would be dealt with. Yet here we are today, with this action having not been taken, and a group of people actively trying to convince everyone that centralised enforcement of a block size limit is somehow the natural behaviour of the system, despite it having never been so in its entire history.

The block size limit was a hacky workaround to the expensive to validate issue. An issue that is now mitigated by other much better solutions, not least a well incentivised distributed mining economy. That is now smart enough to route around such an attack, making it prohibitively expensive to maintain.

Individual economic self interest is how Bitcoin is supposed to work.

It's time to remove the bandaid.

When the curtain is pulled back you will see how powerful the wizard really isn't.

legendary
Activity: 1162
Merit: 1004
Segwit is an improvement to scalability, a 2 MB block size limit isn't.

An improvement to scale offchain (altchain).
legendary
Activity: 1260
Merit: 1116
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale

Even more shocking, I heard Core supporters dismissing Moore's Law. So what's next, you've got a solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

http://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/

First slowing in the mid-’90s to two-year gaps, the rate at which tiny transistors compute isn’t accelerating. Soon, too, they’ll have to be so small they’re just a few molecules, perhaps not even effective. And they’re not getting cheaper. Moore saw the future 50 years ago, but we may soon need a different rubric for predicting progress.


.. just wait for the next quantum jump. It's coming ...

          ( Hope for Bitcoin as well ! )
legendary
Activity: 1260
Merit: 1116

Even more shocking, I heard Core supporters dismissing Moore's Law. So what's next, you've got a solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

http://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/

First slowing in the mid-’90s to two-year gaps, the rate at which tiny transistors compute isn’t accelerating. Soon, too, they’ll have to be so small they’re just a few molecules, perhaps not even effective. And they’re not getting cheaper. Moore saw the future 50 years ago, but we may soon need a different rubric for predicting progress.
legendary
Activity: 2674
Merit: 3000
Terminated.
Good luck mate. For real, I am sick of this. Bitcoin Classic will succeed, I am sure of that, and even more, the whole community will be happy to get rid of them.


Good luck, you will surely need it. Now since you don't use technical arguments or anything, I'd kindly ask you not to derail this thread further. There is at least one person in it that seems decent and worth talking to.
hero member
Activity: 812
Merit: 500

If you don't like that, please move to any shillcoins you wish and play with those.
Which is exactly what Classic is.


Your ignorance is amusing me. So that means Bitcoin Core = shillcoin also? Cause it isn't Bitcoin .... is Bitcoin "Core" by Blockstream. So basically ....is ? C'mon, say it. You trapped yourself into this one.  Cheesy

Good luck mate. For real, I am sick of this. Bitcoin Classic will succeed, I am sure of that, and even more, the whole community will be happy to get rid of them. 18% of nodes converting to Bitcoin Classic and Core is over, losing its majority. That simple and easy.

Kudos, and don't get drunk with plain water; it is not good.
legendary
Activity: 2674
Merit: 3000
Terminated.
Resistance is futile, my friend. We don't want to drag down Bitcoin like an old car, trying to make the engine work with a simple rag. That's what SegWit is now, instead of the solution that must be implemented. You guys are so delusional that you actually dissed Satoshi Nakamoto; it's clear that your vision is not about Bitcoin anymore. But this is Bitcoin.
This is among the worst attempts at a rebuttal that I've seen recently. Segwit is an improvement to scalability; a 2 MB block size limit isn't. You can't change the facts regardless of what nonsense you try to feed to the majority. The debate has become nonsense: 'big blockists' wanted more capacity -> Core provides (will provide) this capacity with Segwit -> 'big blockists' continue complaining. How does this make sense?

If you don't like that, please move to any shillcoins you wish and play with those.
Which is exactly what Classic is.

Even more shocking, I heard Core supporters dismissing Moore's Law. So what's next, you've got a solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.
hero member
Activity: 812
Merit: 500
What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it, it won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on the main net.


Resistance is futile, my friend. We don't want to drag down Bitcoin like an old car, trying to make the engine work with a simple rag. That's what SegWit is now, instead of the solution that must be implemented. You guys are so delusional that you actually dissed Satoshi Nakamoto; it's clear that your vision is not about Bitcoin anymore. If you don't like that, please move to any shillcoins you wish and play with those. But this is Bitcoin. Even more shocking, I heard Core supporters dismissing Moore's Law. So what's next, you've got a solution to E=mc² and Einstein was an idiot?
sr. member
Activity: 423
Merit: 250
of course Gavin is doing things differently in classic; his primary goal is to scale the blockchain as much as possible.

this is NOT the main goal of Core, which is fine; their second-layer solution is fine, but it's simply not what the majority want...


I agree the second layer is fine, but people should choose it because it is better than decentralized onchain transactions, not just because the blocksize is artificially limited so they have no other choice.



Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.

give it a few more days of debate, you'll see your optimism and praise will turn to anger and disgust. LOL  Grin


This. Unless one knows the big picture, it is easy to say Bitcoin is working well. Because it is now, but it is supposed to become just a settlement layer for offchain transactions, with onchain fees a hundred times more expensive because of the artificial blocksize limit. So say goodbye to affordable decentralized onchain transactions if this vision comes true, and be ready for just more centralized offchain solutions!

Here is the suggested plan as explained by Bitmain's Jihan Wu at 8btc, which he got from Core representatives at the recent Hong Kong meeting:

https://np.reddit.com/r/BitcoinMarkets/comments/48kf18/daily_discussion_wednesday_march_02_2016/d0krl0w

Quote
During the Hong Kong meeting, the answer provided by Core reps is that the future Lightning Network would increase capacity a thousandfold - that up to tens of thousands of transactions can be completed on the lightning network and settled with one on chain transaction. Assuming that current transaction fees are 0.3 RMB, and assuming that 1000 lightning transactions can be settled by one blockchain transaction, then we can raise fees for on chain transactions to 30 RMB (100x increase), while each transaction on the lightning network would only cost a tenth of current fees and increase miner revenue a hundredfold.
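Taken at face value, the numbers in the quote are internally consistent; a quick sketch (the variable names are mine, and the quoted figures are the post's assumptions, not verified data):

```python
# The arithmetic implied by the quote: 1000 lightning transactions
# settled by one on-chain transaction, with on-chain fees raised 100x.

onchain_fee_now = 0.3        # RMB per on-chain transaction today (per the quote)
batch_size = 1000            # lightning txs settled per on-chain transaction
onchain_fee_future = 30.0    # RMB: the quoted 100x increase

# If the settlement fee is split evenly across the batch, each lightning
# transaction costs a tenth of today's on-chain fee.
fee_per_ln_tx = onchain_fee_future / batch_size     # 0.03 RMB
ratio_vs_today = fee_per_ln_tx / onchain_fee_now    # 0.1 -> "a tenth"

# Miner revenue per on-chain transaction rises a hundredfold.
miner_revenue_multiplier = onchain_fee_future / onchain_fee_now
```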
legendary
Activity: 2674
Merit: 3000
Terminated.
What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it, it won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on the main net.

The question is not whether Bitcoin can scale to the Visa network, but whether Bitcoin can scale to keep up with demand, because there is no such high demand for Bitcoin to be used at Visa levels in the near future. And I'm sure the answer is yes.
You can't know that, for two reasons (as examples; there are more): 1) You don't know what the demand is represented as (e.g. TX volume? Not necessarily, as somebody could be creating a lot of TXs themselves), nor how much demand there is going to be; 2) You don't know how the technology is going to improve over the years. Anyhow, with Segwit around the corner, I don't understand any 'urgency' for a 2 MB block size limit.