Topic: btc (Read 5688 times)

hero member
Activity: 504
Merit: 504
btc
June 09, 2015, 07:28:26 AM
#84
I'm still not convinced. There are a lot of assumptions being made, and some of the information is incorrect or not possible and needs further explanation...

Also, bear in mind the following two quotes when I present the points that don't seem to add up, that conflict, or that simply don't make sense.

Quote
It should be noted that we are talking about the capability of an individual computer which is the ultimate bottleneck
Quote
BitShares 2.0 will be capable of handling over 100,000 (100k) transactions per second on commodity hardware with parallel architectural optimizations in mind.

So, let's proceed...

Quote
...we make the assumption that the network is capable of streaming all of the transaction data and that disks are capable of recording this stream...

That is a bad assumption to make if you intend BitShares to run on commodity hardware, as is stated in various texts relating to BitShares 2.0. If you wish to achieve that, then you should really be assuming that the network and disks are NOT capable of such a thing.

Quote
Today's high-end servers with 36 cores (72 with hyper-threading) could easily validate 100,000 transactions per second.

Erm... what happened to commodity hardware already? Who has a 36-core CPU lying around?

Quote
The average size of a transaction is just 100 bytes

I just don't see how this is possible while maintaining the minimum amount of data required to ensure a validatable transaction. A 256-bit EC signature is ~70 bytes; 30 bytes sure doesn't seem like enough to specify two addresses, a transaction value, and anything else that is required.
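
To make the arithmetic concrete, here is a quick sketch. The field sizes are my own assumptions based on common conventions (DER-encoded ECDSA signature, compressed pubkey, RIPEMD160 address), not figures from the BitShares docs:

Code:
# Back-of-the-envelope minimum size of a bare-bones signed transaction.
# Field sizes are assumptions based on common conventions, not figures
# taken from the BitShares documentation.
fields = {
    "signature (256-bit ECDSA, DER)": 70,   # ~70-72 bytes in practice
    "sender public key (compressed)": 33,
    "receiver address (RIPEMD160)":   20,
    "amount (64-bit integer)":         8,
}

total = sum(fields.values())
for name, size in fields.items():
    print(f"{name:34} {size:3} bytes")
print(f"{'total':34} {total:3} bytes")   # 131 bytes, already over 100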

It's worth noting that the quoted figures for average BTC transactions are also incorrect, as per this: http://bitshares.github.io/technology/high-performance-and-scalability/

Quote
The average size of a transaction on competing networks, such as Ripple and Bitcoin, is about 250 bytes.

I recall a few respected members here doing research into this, and the average BTC transaction was at least 2x that, usually 3x or greater.

Quote
This is well within the capacity of most data centers and therefore not considered part of the bottleneck.

Huh, so we have gone from commodity hardware to data centers? What about keeping things decentralized or on commodity hardware?

Quote
After creating the block chain we timed how long it took to “reindex” or “replay” without signature verification. On a two-year-old 3.4 GHz Intel i5 CPU this could be performed at over 180,000 operations per second.

That is a statistic I can swallow, but my question is: WHO is doing the signature validation? Only people with 32-core machines and 1 TB of memory? If so, how are the rest of the nodes in the network ensuring that this now-centralized task is done properly? How can I, with my lowly 8 cores, be sure that the transactions are indeed valid and signed correctly without also having to verify 100k transactions per second?

Quote
We set up a benchmark to test real-time performance and found that we could easily process over 2000 transactions per second with signature verification on a single machine.

Ahh, so 100k per second really is only available to people who own 32-core CPUs in a spare data center? If this single machine consisted of commodity hardware, and thus most users of the network will have something similar, it's not 100k per second, is it; it's 2k.
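
Taking the quoted figures at face value, a rough sketch of what they imply, assuming perfectly linear scaling (which real signature verification won't quite achieve):

Code:
# What the quoted benchmark numbers imply, taken at face value.
target_tps = 100_000    # the BitShares 2.0 headline claim
benchmark_tps = 2_000   # quoted single commodity machine, with sig checks
threads_on_server = 72  # quoted 36-core / 72-thread high-end server

print(target_tps / benchmark_tps)     # 50.0 -> fifty 2k/s commodity machines
print(target_tps / threads_on_server) # ~1389 verifications/s per thread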

Remember this from up top: "It should be noted that we are talking about the capability of an individual computer which is the ultimate bottleneck", which is in turn confirmed by this next statement.

Quote
On release, the transaction throughput will be artificially limited to just 1000 transactions-per-second.

If 100k tx/s really is achievable on commodity hardware, why limit it to 1000 transactions per second on release? Could it be that 100k/s on commodity hardware actually is not possible, and this 1k limit is actually there to accommodate machines slower than the test-bed machine that could achieve 2k/s?

If that is not the case, then I am totally confused. Is it limited by an individual computer to 2k tx/s, or is it not? Do you need the suggested 32 cores to be able to process 100k tx/s, and if so, what about my question regarding machines that are slower? If the majority of machines are indeed only able to process 2k/s, what purpose do they serve in the network? Are they redundant in ANY transaction processing?

To me, on the surface, it all seems like conflicting statements and contradictory numbers. If the system can process 100k/s without having centralized nodes packing 32 cores and 1 TB of RAM, then I'll take my hat off, but all this information is so confusing that I don't even know what to take away from it.



YOUR THOUGHTFUL RESPONSE IS MUCH APPRECIATED


Since you took the trouble to read some of our BitShares 2.0 documentation and have prepared a polite, professional response, I am pleased to join you in a serious exchange of ideas.  Smiley

I'll limit my first response to just one of your lines of questioning, lest too many trees hide the forest.

You can think of the transaction rate setting in BitShares 2.0 as a form of "dial a yield".  It can be dynamically adjusted by the stakeholders to provide all the throughput they want to pay for.  Since BitShares is designed to be a profitable business, it only makes sense to pay for just enough processing capacity to handle peak loads.    

The BitShares 2.0 block chain has a number of "knobs" that can be set by elected delegates. One of them is throughput. The initial setting of that knob is 1000 transactions per second because right now that is plenty and allows the maximum number of people to provide low-cost nodes. A second knob is witness node pay rate. If doubling the throughput requires slightly more expensive nodes, the stakeholders just dial up the pay rate until they get enough bidders to provide the number of witness nodes they decide they want (another dial). Pay rate scales with throughput, which scales with revenue, which scales with transaction volume.

Now, suppose that a few big applications were to decide to bring all their transactions to the neutral BitShares platform one summer.  If we needed to double the throughput, here's what would happen.

The elected delegates would discuss it in public and then publish their recommended new knob settings.  Perhaps they pick 2000 transactions per second and $100/month pay for each witness node provider.  Everyone who wants to compete for that job then has the funds to upgrade their servers to the next bigger off-the-shelf commodity processor.

As soon as they change those knob settings, the blockchain begins a two week countdown during which time the stakeholders are given a chance to vote out the delegates from their wallets if they don't like the change.  If they are voted out, the blockchain automatically aborts the adoption of the new settings.  If not, the settings are changed and the BitShares network shifts gears to run faster.

There is enough reserve capacity in the software to double our throughput about 8 times over, scaling by more than two orders of magnitude with a simple parameter adjustment.
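
For the curious, a toy sketch of that arithmetic (the variable names are invented for illustration, not actual BitShares 2.0 parameters):

Code:
# Toy illustration of the throughput "knob": start at the release setting
# and double it eight times, per the post. Names are invented for
# illustration; they are not actual BitShares 2.0 identifiers.
release_tps = 1_000

setting = release_tps
for step in range(1, 9):
    setting *= 2
    print(f"after doubling {step}x: {setting:,} tx/s")
# Ends at 256,000 tx/s: 2**8 == 256, i.e. a bit over two orders of
# magnitude above the release value.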

The current knob setting gives us plenty of reserve capacity at the lowest possible witness node price. It could absorb all of Bitcoin's transaction workload without needing to even touch those dials. But if we ever need to take on the workload of something like NASDAQ or VISA or MasterCard, we can dial up the bandwidth to whatever level the stakeholders vote to support.

So, the BitShares 2.0 platform has plenty of spare bandwidth to handle the ledger bookkeeping functions of just about every blockchain currently in existence. You are all welcome to join us and start using your mining expenses to pay your own developers and marketers instead of electric companies. Nothing else about your business models or token distributions would change. Simply outsource your block signing tasks and get on with your more interesting, earthshaking ideas. Think of the industry growth we would all experience if most of the funds spent on block signers were used to grow our common ecosystem instead. We could all share one common, neutral, global ledger platform, where cross-chain transactions, smart contracts, and other such innovations were all interoperable!

Or will we waste our industry's development capital on a never-ending mining arms race? Carpe diem!

Stan Larimer, President
Cryptonomex.com


legendary
Activity: 1050
Merit: 1016
June 08, 2015, 05:23:53 PM
#83
I'm still not convinced. There are a lot of assumptions being made, and some of the information is incorrect or not possible and needs further explanation...

Also, bear in mind the following two quotes when I present the points that don't seem to add up, that conflict, or that simply don't make sense.

Quote
It should be noted that we are talking about the capability of an individual computer which is the ultimate bottleneck
Quote
BitShares 2.0 will be capable of handling over 100,000 (100k) transactions per second on commodity hardware with parallel architectural optimizations in mind.

So, let's proceed...

Quote
...we make the assumption that the network is capable of streaming all of the transaction data and that disks are capable of recording this stream...

That is a bad assumption to make if you intend BitShares to run on commodity hardware, as is stated in various texts relating to BitShares 2.0. If you wish to achieve that, then you should really be assuming that the network and disks are NOT capable of such a thing.

Quote
Today's high-end servers with 36 cores (72 with hyper-threading) could easily validate 100,000 transactions per second.

Erm... what happened to commodity hardware already? Who has a 36-core CPU lying around?

Quote
The average size of a transaction is just 100 bytes

I just don't see how this is possible while maintaining the minimum amount of data required to ensure a validatable transaction. A 256-bit EC signature is ~70 bytes; 30 bytes sure doesn't seem like enough to specify two addresses, a transaction value, and anything else that is required.

It's worth noting that the quoted figures for average BTC transactions are also incorrect, as per this: http://bitshares.github.io/technology/high-performance-and-scalability/

Quote
The average size of a transaction on competing networks, such as Ripple and Bitcoin, is about 250 bytes.

I recall a few respected members here doing research into this, and the average BTC transaction was at least 2x that, usually 3x or greater.

Quote
This is well within the capacity of most data centers and therefore not considered part of the bottleneck.

Huh, so we have gone from commodity hardware to data centers? What about keeping things decentralized or on commodity hardware?

Quote
After creating the block chain we timed how long it took to “reindex” or “replay” without signature verification. On a two-year-old 3.4 GHz Intel i5 CPU this could be performed at over 180,000 operations per second.

That is a statistic I can swallow, but my question is: WHO is doing the signature validation? Only people with 32-core machines and 1 TB of memory? If so, how are the rest of the nodes in the network ensuring that this now-centralized task is done properly? How can I, with my lowly 8 cores, be sure that the transactions are indeed valid and signed correctly without also having to verify 100k transactions per second?

Quote
We set up a benchmark to test real-time performance and found that we could easily process over 2000 transactions per second with signature verification on a single machine.

Ahh, so 100k per second really is only available to people who own 32-core CPUs in a spare data center? If this single machine consisted of commodity hardware, and thus most users of the network will have something similar, it's not 100k per second, is it; it's 2k.

Remember this from up top: "It should be noted that we are talking about the capability of an individual computer which is the ultimate bottleneck", which is in turn confirmed by this next statement.

Quote
On release, the transaction throughput will be artificially limited to just 1000 transactions-per-second.

If 100k tx/s really is achievable on commodity hardware, why limit it to 1000 transactions per second on release? Could it be that 100k/s on commodity hardware actually is not possible, and this 1k limit is actually there to accommodate machines slower than the test-bed machine that could achieve 2k/s?

If that is not the case, then I am totally confused. Is it limited by an individual computer to 2k tx/s, or is it not? Do you need the suggested 32 cores to be able to process 100k tx/s, and if so, what about my question regarding machines that are slower? If the majority of machines are indeed only able to process 2k/s, what purpose do they serve in the network? Are they redundant in ANY transaction processing?

To me, on the surface, it all seems like conflicting statements and contradictory numbers. If the system can process 100k/s without having centralized nodes packing 32 cores and 1 TB of RAM, then I'll take my hat off, but all this information is so confusing that I don't even know what to take away from it.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
June 08, 2015, 03:54:38 PM
#82
You're barking up the wrong tree, trying to make Bitcoin like Ripple, NXT, or BitShares...
what kind of nonsense is this?

We already know the solution: short term, simply increase the block size, and longer term, let's see if we can get sidechains working.
legendary
Activity: 1764
Merit: 1018
June 08, 2015, 03:44:57 AM
#81
The idea that one can push an infinite number of transactions through the Monero network is utter nonsense. Monero uses adaptive limits that constrain the block size dynamically. This is explained in section 6.2 of the CryptoNote whitepaper https://cryptonote.org/whitepaper.pdf. This means there is no fixed maximum TPS; the limit adapts to market conditions. This is the critical difference from not just Bitcoin, but also Litecoin, Dogecoin, Dash, and many other alt-coins.

Thanks for sharing, but even if Monero has unlimited TPS: yesterday, when I sent Monero to poloniex.com, I waited 16 confirmations and got it in my balance after 30+ minutes, while withdrawing BitShares from poloniex.com took me 3 minutes until I got the BTS confirmed in my wallet. So theoretically Monero can be very fast; practically it's even slower than Litecoin.
Sure, I will be very happy to see a fast Monero network in the future.
sr. member
Activity: 770
Merit: 250
June 05, 2015, 08:48:23 PM
#80
The next big thing would be something that allows for all of the world's electronic transactions per second combined, in a decentralized manner. It seems that Bitcoin will not be able to do it, and neither can the rest of the shitcoins, so for now Bitcoin is still king.
My proposal is to move the less important, shitty small transactions off the blockchain, because who cares about 0.000001-type transactions. All of them should go off-blockchain in favour of 1+ USD ones. We should push in this direction to be able to fight against VISA, Mastercard, and all the banks combined. Bitcoin must be #1 in the world.

Huh, well, you'd have to find all the credit card companies and all the online payment platforms and add them all up together to determine "the world's electronic transactions per second". Besides that, it is already possible for decentralized entities to do that, such as Monero (which has no hard cap).
legendary
Activity: 1610
Merit: 1183
June 04, 2015, 12:24:20 PM
#79
The next big thing would be something that allows for all of the world's electronic transactions per second combined, in a decentralized manner. It seems that Bitcoin will not be able to do it, and neither can the rest of the shitcoins, so for now Bitcoin is still king.
My proposal is to move the less important, shitty small transactions off the blockchain, because who cares about 0.000001-type transactions. All of them should go off-blockchain in favour of 1+ USD ones. We should push in this direction to be able to fight against VISA, Mastercard, and all the banks combined. Bitcoin must be #1 in the world.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
June 04, 2015, 10:08:41 AM
#78
Historically, the bitcoin community loves to eat whatever Gavin is serving up,  [...]

Huh? Stopped reading after that.

It is kind of true though. Not because he's a cult leader. Anyone in the lead dev position would have the same support unless he's a loser that steers us in the wrong direction constantly. The surprising thing is that four members of his own team don't support him.

Which is why this may be the first time in human history that the bitcoin community stands up to tyranny and earns its freedom.

The bitcoin community has historically eaten every forkful that Gavin has fed them, and they lapped it up. But now the community has woken up to see that if Gavin successfully pulls off this centralization fork to make the big miners stronger (more expensive for the little guy to compete), then that means that Gavin basically has the big centralized mining cartels in his back pocket, meaning that each consecutive centralization fork will become easier and easier to implement until he is finally able to raise the total coin limit to compensate the mining cartels handsomely for following his forks year after year.

But before you cast the first stone, tell me honestly that you would not seize absolute power if it was within your grasp.  

In other words, would you turn down the opportunity to have the hottest chick on the planet (and she was madly in love with you and wanted your baby) for an opportunity to save bitcoin anonymously?

I did not think so.  Gavin has already won.

Tinfoil hat much?  I don't know what it is about crypto that seems to attract conspiracy theorists, but that's some definite crazy right there.  This isn't about individual developers "winning" or "losing".  Grow up.  The choice we're being given is whether bitcoin can scale to support the masses and remain affordable for all, which it simply can't do at the moment unless the fork goes ahead, or whether it becomes a niche, elitist network that most people won't be able to afford to use.  I support a network that's open and accessible to all, not just the privileged few like the anti-fork crowd want.
legendary
Activity: 1050
Merit: 1016
June 04, 2015, 08:34:57 AM
#77

I know that 100,000 TPS sounds impossible, but so does an "automatic blockchain that pays for its own development with its profits"


100,000 TPS isn't impossible; it's just impossible to do on a block chain in a true P2P decentralized manner without some form of centralization, or a "magic trick". It seems to me BitShares may be doing both, as per these 2 quotes in the link you provided.

"...assuming that all the witness nodes have internet connections capable..." so it is indeed confined/centralized to a set of nodes

and

"...with an average transaction size of about 100 bytes."

The latter concerns me. The absolute minimum core requirements of a secure, verifiable transaction are a sender, a receiver, a value, and a signature. Just the signature alone with a 256-bit key is on the order of 70 bytes; a sender pubkey is 32, a receiver RIPEMD160 address is 20, and a value is 8, which is a total of 130 bytes at the bare minimum. If the key space is reduced to 160 bits, then it will just fit in 100 bytes, but with a huge loss of security.

I'm assuming that to achieve this 100,000 TPS something similar to this happens:

Transactions are filtered through these "witness nodes": say I send 100 BTS to A. If within a certain time frame A moves it to B, and B to C, etc., that TX isn't recorded on the block chain until such a time that it lives at X for longer than a specified period. The 100 BTS movements between A -> B -> C ... X are not recorded in full on the block chain (if at all), only the transaction A -> X.
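
To make my speculation concrete, here's a minimal sketch of that kind of netting; purely my illustration of the scheme I'm guessing at, not anything from the BitShares code:

Code:
# Minimal sketch of the speculated "netting" idea: intermediate hops are
# collapsed and only net movements survive. Purely illustrative, a guess
# at a scheme, not actual BitShares behaviour.
from collections import defaultdict

def net_transfers(hops):
    """Collapse (sender, receiver, amount) hops into net balance changes,
    discarding the intermediate movements."""
    net = defaultdict(int)
    for sender, receiver, amount in hops:
        net[sender] -= amount
        net[receiver] += amount
    # Only non-zero net positions would be written to the chain.
    return {acct: amt for acct, amt in net.items() if amt != 0}

hops = [("me", "A", 100), ("A", "B", 100), ("B", "C", 100), ("C", "X", 100)]
print(net_transfers(hops))  # {'me': -100, 'X': 100}: only me -> X remains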

Taking that approach could indeed give you very high TX throughput, but if that is the method used (or something similar) it's a total hack in my opinion and may well lead to issues later.  

Of course I'm speculating, as I don't have the time to research this properly, so if you could provide some links/docs/something that details how this works, rather than me hunting, I'd appreciate it. I stand by my claim that it is not possible to do this on a block chain while recording all transactions to said chain and allowing any node to be a full node without special requirements.

Anyway, I'm getting off topic.  180,000 TPS might sound great to some, but Monero destroys BitShares in this arena with the best TPS ceiling of "infinity," so there's that option too of course.  In other words, the bitcoin community has infinitely more options than they are currently looking at, and I am just trying to show them that the state of crypto circa 2015 is "not your dad's crypto"

If Monero really has stated "infinity" as their TPS limit, then someone there really needs a reality check!  Regardless of what is possible on a block chain or not, the laws of thermodynamics will step in and dispatch a tough and thorough spanking waaaaay before "infinity" is even close Smiley

IMO the only way to achieve anything near sustainable VISA-level transaction throughput, stay in keeping with real decentralization (no special node sub-sets that are selected or voted in), not perform any "tricks" which may compromise security, AND have all transactions on a public ledger is to scale horizontally, NOT vertically!

Chain-based ledgers can't scale vertically past a certain point, no matter how big your bag of tricks or your processing setup; horizontal is the only way, and by that I mean a distributed and partitioned ledger of the ilk that we are building over here. No one has even attempted to do this, because it's assumed impossible or too difficult, and if it is, so be it; at least it was attempted. However, it's not impossible: we've run it in many betas now and it is very close to being ready for use.


I like what you've done with e-munie.  You should come work for the BitShares blockchain.  Just submit a proposal, and the community would certainly vote you into a paid position (that's how BitShares members fund development).

https://bitsharestalk.org/index.php/board,61.0.html?PHPSESSID=2170a8f0b09b8fa2bdc7d35908ab4517


Heh thanks but no thanks, I've ploughed my life and everything I have into eMunie and I'm not jumping ship, ever Smiley


EDIT
----

So I did some more digging and came across this:

"...the idea being that if transactions have their signatures validated as they propagate across the network, a witness can have any number of computers surrounding him that validates all of these signatures, and then he gets a list of transactions and puts them in a block, and he doesn’t have to check those signatures himself, because he has got all these other nodes surrounding him that are dividing up the task."

Can someone clarify this? Witness nodes, which build the blocks, DO NOT check transaction signatures themselves, but rely on 3rd parties (which may be dishonest) to inform them that the signatures for said transactions validate? How does a witness node know if a 3rd party is providing false information regarding a transaction, claiming that it contains a valid signature when it may not? And if this happens, how does the network resolve it? Someone, somewhere must be doing a full validation of those 100,000 TPS to ensure that all transactions really are legitimate.

The issue is that sequencing and finalizing block production is done single-threaded, because validity is dependent on sequencing. That makes that step the bottleneck, since it can't be scaled across more threads. The signature checking doesn't have to be done by third parties, just separate threads. Those separate threads could be spread across a cluster, or possibly handled by GPUs, or anything else; they just have to be running somewhere trusted by the Witness in order to feed the final Witness thread valid transactions to sequence and include in the block.
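
As a rough sketch of that division of labour (parallel verifier threads feeding one sequencer thread), assuming nothing about the actual BitShares internals; verify_signature() is a stand-in for a real ECDSA check:

Code:
# Sketch of the described division of labour: many threads verify
# signatures in parallel, one thread sequences the survivors.
import queue
import threading

def verify_signature(tx):
    return True  # stand-in for the expensive ECDSA verification

incoming = queue.Queue()   # raw transactions from the network
verified = queue.Queue()   # transactions that passed signature checks

def verifier():
    while True:
        tx = incoming.get()
        if verify_signature(tx):
            verified.put(tx)
        incoming.task_done()

def sequencer(block):
    # Single-threaded: ordering determines validity, so it can't be split.
    while not verified.empty():
        block.append(verified.get())

for _ in range(8):  # parallel verification workers
    threading.Thread(target=verifier, daemon=True).start()

for i in range(1000):
    incoming.put({"id": i})
incoming.join()  # wait until every transaction has been checked

block = []
sequencer(block)
print(len(block), "transactions sequenced into the block")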

It's always a trade-off between decentralization (and therefore redundant block validation) and transaction fees. Once you get to a globally useful scale, processing transactions has a real cost, and it has to be paid either by inflation of the currency, by transaction fees, or by subsidies from somewhere else. The BitShares approach is to let the shareholders decide how much redundancy and decentralization they're willing to pay for, and where to set the transaction fees based on that.

Yes, block production is single-threaded, and multi-threaded signature verification is common sense, but even with the latest CPUs, performing 100,000 signature validations per second is an extremely tall order; commodity CPUs won't get anywhere close to that. GPUs could be an option, but has anyone actually attempted to perform signature verification on a GPU? I'm not aware of anyone doing it, or looking into it at the moment.

The statement made by BitShares was that witness nodes can rely on 3rd parties to verify the signatures, and that doesn't sit comfortably with me, because you have to trust those nodes. Any of those nodes could become dishonest at any moment, and then you have to resolve that in the network. A proofing scheme could be created that proves the 3rd party is honest with regard to a transaction, but that would likely be just as expensive as simply doing the verification. Seems like a lot of risk for a TPS value that will likely never be required, or at least not for many, many years.
legendary
Activity: 1105
Merit: 1000
June 04, 2015, 12:15:01 AM
#76
Please show me how you are more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately, extracting all the energy from the universe to get close to infinity processing capability? Smiley

Now this is just getting retarded; you know very well what I meant by virtually infinite, and I have shown by the definition of both words that it's the correct term for Monero. Now go play with your emunie.

Indeed I do; I'm just pointing out that, by the definition you yourself presented, it is not an accurate representation of Monero's capabilities, and so it is wrong.

You are only convincing yourself here Cheesy

He's actually quite right (on the definition).

As for Monero scaling to infinity: block size may have no upper bound, but you can be sure current processing efficiency will cause issues before a "high" TPS is achieved (define it as whatever). Lots of work to do...
legendary
Activity: 2156
Merit: 1393
You lead and I'll watch you walk away.
June 03, 2015, 10:34:57 PM
#75
Historically, the bitcoin community loves to eat whatever Gavin is serving up,  [...]

Huh? Stopped reading after that.

It is kind of true though. Not because he's a cult leader. Anyone in the lead dev position would have the same support unless he's a loser that steers us in the wrong direction constantly. The surprising thing is that four members of his own team don't support him.
legendary
Activity: 1358
Merit: 1001
https://gliph.me/hUF
June 03, 2015, 10:20:16 PM
#74
Historically, the bitcoin community loves to eat whatever Gavin is serving up,  [...]

Huh? Stopped reading after that.
newbie
Activity: 31
Merit: 0
June 03, 2015, 09:40:08 PM
#73

I know that 100,000 TPS sounds impossible, but so does an "automatic blockchain that pays for its own development with its profits"


100,000 TPS isn't impossible; it's just impossible to do on a block chain in a true P2P decentralized manner without some form of centralization, or a "magic trick". It seems to me BitShares may be doing both, as per these 2 quotes in the link you provided.

"...assuming that all the witness nodes have internet connections capable..." so it is indeed confined/centralized to a set of nodes

and

"...with an average transaction size of about 100 bytes."

The latter concerns me. The absolute minimum core requirements of a secure, verifiable transaction are a sender, a receiver, a value, and a signature. Just the signature alone with a 256-bit key is on the order of 70 bytes; a sender pubkey is 32, a receiver RIPEMD160 address is 20, and a value is 8, which is a total of 130 bytes at the bare minimum. If the key space is reduced to 160 bits, then it will just fit in 100 bytes, but with a huge loss of security.

I'm assuming that to achieve this 100,000 TPS something similar to this happens:

Transactions are filtered through these "witness nodes": say I send 100 BTS to A. If within a certain time frame A moves it to B, and B to C, etc., that TX isn't recorded on the block chain until such a time that it lives at X for longer than a specified period. The 100 BTS movements between A -> B -> C ... X are not recorded in full on the block chain (if at all), only the transaction A -> X.

Taking that approach could indeed give you very high TX throughput, but if that is the method used (or something similar) it's a total hack in my opinion and may well lead to issues later.  

Of course I'm speculating, as I don't have the time to research this properly, so if you could provide some links/docs/something that details how this works, rather than me hunting, I'd appreciate it. I stand by my claim that it is not possible to do this on a block chain while recording all transactions to said chain and allowing any node to be a full node without special requirements.

Anyway, I'm getting off topic.  180,000 TPS might sound great to some, but Monero destroys BitShares in this arena with the best TPS ceiling of "infinity," so there's that option too of course.  In other words, the bitcoin community has infinitely more options than they are currently looking at, and I am just trying to show them that the state of crypto circa 2015 is "not your dad's crypto"

If Monero really has stated "infinity" as their TPS limit, then someone there really needs a reality check!  Regardless of what is possible on a block chain or not, the laws of thermodynamics will step in and dispatch a tough and thorough spanking waaaaay before "infinity" is even close Smiley

IMO the only way to achieve anything near sustainable VISA-level transaction throughput, stay in keeping with real decentralization (no special node sub-sets that are selected or voted in), not perform any "tricks" which may compromise security, AND have all transactions on a public ledger is to scale horizontally, NOT vertically!

Chain-based ledgers can't scale vertically past a certain point, no matter how big your bag of tricks or your processing setup; horizontal is the only way, and by that I mean a distributed and partitioned ledger of the ilk that we are building over here. No one has even attempted to do this, because it's assumed impossible or too difficult, and if it is, so be it; at least it was attempted. However, it's not impossible: we've run it in many betas now and it is very close to being ready for use.


I like what you've done with e-munie.  You should come work for the BitShares blockchain.  Just submit a proposal, and the community would certainly vote you into a paid position (that's how BitShares members fund development).

https://bitsharestalk.org/index.php/board,61.0.html?PHPSESSID=2170a8f0b09b8fa2bdc7d35908ab4517


Heh thanks but no thanks, I've ploughed my life and everything I have into eMunie and I'm not jumping ship, ever Smiley


EDIT
----

So I did some more digging and came across this:

"...the idea being that if transactions have their signatures validated as they propagate across the network, a witness can have any number of computers surrounding him that validates all of these signatures, and then he gets a list of transactions and puts them in a block, and he doesn’t have to check those signatures himself, because he has got all these other nodes surrounding him that are dividing up the task."

Can someone clarify this? Witness nodes, which build the blocks, DO NOT check transaction signatures themselves, but rely on 3rd parties (which may be dishonest) to inform them that the signatures for said transactions validate? How does a witness node know if a 3rd party is providing false information regarding a transaction, claiming that it contains a valid signature when it may not? And if this happens, how does the network resolve it? Someone, somewhere must be doing a full validation of those 100,000 TPS to ensure that all transactions really are legitimate.

The issue is that sequencing and finalizing block production is done single-threaded, because validity is dependent on sequencing. That makes that step the bottleneck, since it can't be scaled across more threads. The signature checking doesn't have to be done by third parties, just separate threads. Those separate threads could be spread across a cluster, or possibly handled by GPUs, or anything else; they just have to be running somewhere trusted by the Witness in order to feed the final Witness thread valid transactions to sequence and include in the block.

It's always a trade-off between decentralization (and therefore redundant block validation) and transaction fees. Once you get to a globally useful scale, processing transactions has a real cost, and it has to be paid either by inflation of the currency, by transaction fees, or by subsidies from somewhere else. The BitShares approach is to let the shareholders decide how much redundancy and decentralization they're willing to pay for, and where to set the transaction fees based on that.
sr. member
Activity: 350
Merit: 250
June 03, 2015, 06:43:52 PM
#72
The idea that one can push an infinite number of transactions through the Monero network is utter nonsense. Monero uses adaptive limits that constrain the block size dynamically. This is explained in section 6.2 of the CryptoNote whitepaper https://cryptonote.org/whitepaper.pdf. This means there is no fixed maximum TPS; the limit adapts to market conditions. This is the critical difference from not just Bitcoin, but also Litecoin, Dogecoin, Dash, and many other alt-coins.

Yup, that's why I'm trying to make the OP change it to at least "virtually infinite"; it's pretty clear that respecting the laws of physics is a major hole in the XMR ark Cheesy
sr. member
Activity: 770
Merit: 250
June 03, 2015, 06:42:57 PM
#71
The only coins in that poll that are worth anything are Monero and, of course, Bitcoin.
legendary
Activity: 1050
Merit: 1016
June 03, 2015, 06:41:25 PM
#70
The idea that one can push an infinite number of transactions through the Monero network is utter nonsense. Monero uses adaptive limits that constrain the block size dynamically. This is explained in section 6.2 of the CryptoNote whitepaper https://cryptonote.org/whitepaper.pdf. This means there is no fixed maximum TPS; the limit adapts to market conditions. This is the critical difference from not just Bitcoin, but also Litecoin, Dogecoin, Dash, and many other alt-coins.

That's what I was looking for, thank you!
legendary
Activity: 2282
Merit: 1050
Monero Core Team
June 03, 2015, 06:31:31 PM
#69
The idea that one can push an infinite number of transactions through the Monero network is utter nonsense. Monero uses adaptive limits that constrain the block size dynamically. This is explained in section 6.2 of the CryptoNote whitepaper https://cryptonote.org/whitepaper.pdf. This means there is no fixed maximum TPS; the limit adapts to market conditions. This is the critical difference from not just Bitcoin, but also Litecoin, Dogecoin, Dash, and many other alt-coins.
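
For those who don't want to dig through the whitepaper, here is a minimal sketch of the section 6.2 rule as I understand it; treat the exact form of the penalty as my paraphrase of the paper, not reference code:

Code:
# Sketch of the CryptoNote adaptive block size rule (whitepaper sec. 6.2),
# as I understand it: blocks may exceed the median size of recent blocks,
# up to 2x, but the block reward is penalized quadratically for doing so.
def block_reward(base_reward, block_size, median_size):
    if block_size <= median_size:
        return base_reward                  # no penalty
    if block_size > 2 * median_size:
        return None                         # block is invalid
    overshoot = block_size / median_size - 1.0    # in (0, 1]
    return base_reward * (1.0 - overshoot ** 2)   # quadratic penalty

median = 300_000  # bytes, median size of the last N blocks
for size in (250_000, 300_000, 450_000, 600_000, 700_000):
    print(size, block_reward(10.0, size, median))
# The cap therefore moves with demand: sustained bigger blocks raise the
# median, which raises the next blocks' penalty-free size. No fixed TPS
# ceiling, but also not "infinity".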
member
Activity: 63
Merit: 10
June 03, 2015, 05:06:04 PM
#68


haha ok...

just confirming you wanted to FUD Monero btw  Kiss

https://lab.getmonero.org/

Monero is one of the few coins besides Bitcoin with real mathematicians and cryptographers behind it.

emunie is just a joke like XEM  Undecided

I'm a little biased by nature since I am part of the emunie project, but I cannot see why asking critical questions regarding claims made by different devs/projects is spreading FUD. There are several problems highlighted in this thread surrounding these claims, which I have not seen you or anyone else answer to a satisfactory degree.

Then you have the audacity to go so low as to proclaim that emunie is a joke. It's the same thing every time here on BTT: you go in and try to have a civil debate, and when people have no counter-arguments it all turns into schoolyard bullying.

Either acknowledge that Monero's performance claims are not true, or don't. Let that be your opinion and let's close it there. Do not downplay other people's projects just because they found a hole in your Noah's ark.
legendary
Activity: 1050
Merit: 1016
June 03, 2015, 02:33:21 PM
#67
Please show me how you are more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately, extracting all the energy from the universe to get close to infinity processing capability? Smiley

Now this is just getting retarded; you know very well what I meant by virtually infinite, and I have shown by the definition of both words that it's the correct term for Monero. Now go play with your emunie.

Indeed I do; I'm just pointing out that, by the definition you yourself presented, it is not an accurate representation of Monero's capabilities, and so it is wrong.

You are only convincing yourself here Cheesy

Not really. This is one of the many reasons the mass market is confused: people involved with projects make statements such as that which are fundamentally wrong, cannot be achieved or proven, and only serve the purpose of "bragging rights" against other technologies.

Additionally, statements such as that result in good projects such as Monero not being taken seriously by academics and other educated individuals in the real world, because statements are presented as fact that cannot be fact.

haha ok...

just confirming you wanted to FUD Monero btw  Kiss

https://lab.getmonero.org/

Monero is one of the few coins besides Bitcoin with real mathematicians and cryptographers behind it.

emunie is just a joke like XEM  Undecided

I didn't want to FUD anything, just to point out these claims and question whether or not they can be true.

Yeah yeah, eMunie is a joke, XEM is a joke, XYZ is a joke. Awesome, let's descend to childish playground statements.
sr. member
Activity: 350
Merit: 250
June 03, 2015, 02:26:01 PM
#66
Please show me how you are more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately, extracting all the energy from the universe to get close to infinity processing capability? Smiley

Now this is just getting retarded; you know very well what I meant by virtually infinite, and I have shown by the definition of both words that it's the correct term for Monero. Now go play with your emunie.

Indeed I do; I'm just pointing out that, by the definition you yourself presented, it is not an accurate representation of Monero's capabilities, and so it is wrong.

You are only convincing yourself here Cheesy

Not really. This is one of the many reasons the mass market is confused: people involved with projects make statements such as that which are fundamentally wrong, cannot be achieved or proven, and only serve the purpose of "bragging rights" against other technologies.

Additionally, statements such as that result in good projects such as Monero not being taken seriously by academics and other educated individuals in the real world, because statements are presented as fact that cannot be fact.

haha ok...

just confirming you wanted to FUD Monero btw  Kiss

https://lab.getmonero.org/

Monero is one of the few coins besides Bitcoin with real mathematicians and cryptographers behind it.

emunie is just a joke like XEM  Undecided
legendary
Activity: 1050
Merit: 1016
June 03, 2015, 02:24:40 PM
#65
Please show me how you are more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately, extracting all the energy from the universe to get close to infinity processing capability? Smiley

Now this is just getting retarded; you know very well what I meant by virtually infinite, and I have shown by the definition of both words that it's the correct term for Monero. Now go play with your emunie.

Indeed I do; I'm just pointing out that, by the definition you yourself presented, it is not an accurate representation of Monero's capabilities, and so it is wrong.

You are only convincing yourself here Cheesy

Not really. This is one of the many reasons the mass market is confused: people involved with projects make statements such as that which are fundamentally wrong, cannot be achieved or proven, and only serve the purpose of "bragging rights" against other technologies.

Additionally, statements such as that result in good projects such as Monero not being taken seriously by academics and other educated individuals in the real world, because statements are presented as fact that cannot be fact.