
Topic: Gold collapsing. Bitcoin UP. - page 368.

legendary
Activity: 1764
Merit: 1002
May 10, 2015, 07:16:05 PM
Regarding transaction growth and the limits of bandwidth, has anyone thought of the possibility of parallel chains?

Split the Blockchain into several chunks and distribute.

Is there any discussion of this at all? Or is it just a stupid idea?

* Edit: never mind, I guess that's essentially sidechains, isn't it?

Yeah, the problem with those ideas is that you want to try to keep as many TXs on the mainchain as possible to pay miners' fees, imo, so as to minimize cannibalizing or even killing off Bitcoin. Any of those offchain alternatives is likely to create friction as well.

So... what if you fragment the chain? Like a distributed computing project, so each node only had to do a fraction of the work. But then maybe you'd have problems with forks... but if there was a protocol that was very small and very fast, it could help speed up consensus... Gavin talked about something like this once, invertible Bloom filters, I think?



he's never talked about that, afaik.  IBLT is a totally different thing.  it is a set reconciliation strategy that depends on the fact that most full nodes carry a set of unconfirmed txs in their RAM that is already pretty close to identical to the sets held by other nodes across the network.  the smaller the differences, the smaller the IBLT has to be to reconcile those differences.  the strategy is apparently only about 4 years old, invented after Bitcoin.

Gavin has proposed that miners who solve a block, instead of retransmitting the block with all its txs across the network (which involves considerable latency), transmit only the IBLT along with the header; other nodes then use the IBLT to reconcile their unconfirmed tx set against that of the proposed block, reconstruct it, and check whether the POW was in fact valid.

that was pretty tortured language so i hope you understand.
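
For the curious, here is a toy sketch of the IBLT idea (hypothetical tiny parameters and helper names; real proposals use far larger tables and carry full tx data, not just ids). Each side summarizes its mempool into a small table; subtracting two tables cancels everything the sets share, and the leftover difference is "peeled" out item by item:

Code:
# Toy invertible Bloom lookup table (IBLT) for set reconciliation.
import hashlib

K, M = 3, 64  # hash functions per item, number of cells (toy sizes)

def _cells(item: bytes):
    # cell indices for an item (deduped so insert and peel stay consistent)
    return {int.from_bytes(hashlib.sha256(bytes([i]) + item).digest()[:4],
                           "big") % M for i in range(K)}

def _chk(item: bytes) -> int:
    return int.from_bytes(hashlib.sha256(b"chk" + item).digest()[:4], "big")

class IBLT:
    def __init__(self):
        self.count = [0] * M
        self.key_sum = [0] * M   # XOR of 64-bit item keys per cell
        self.chk_sum = [0] * M   # XOR of per-item checksums per cell

    def insert(self, item: bytes):
        key = int.from_bytes(item, "big")
        for i in _cells(item):
            self.count[i] += 1
            self.key_sum[i] ^= key
            self.chk_sum[i] ^= _chk(item)

    def subtract(self, other: "IBLT") -> "IBLT":
        # shared items cancel; only the symmetric difference remains
        d = IBLT()
        d.count = [a - b for a, b in zip(self.count, other.count)]
        d.key_sum = [a ^ b for a, b in zip(self.key_sum, other.key_sum)]
        d.chk_sum = [a ^ b for a, b in zip(self.chk_sum, other.chk_sum)]
        return d

    def peel(self):
        # repeatedly extract items from "pure" cells (count of +1 or -1)
        ours, theirs = set(), set()
        progress = True
        while progress:
            progress = False
            for i in range(M):
                if self.count[i] not in (1, -1):
                    continue
                item = self.key_sum[i].to_bytes(8, "big")
                if self.chk_sum[i] != _chk(item):
                    continue             # cell not actually pure; skip it
                sign = self.count[i]
                (ours if sign == 1 else theirs).add(item)
                key = int.from_bytes(item, "big")
                for j in _cells(item):   # remove recovered item everywhere
                    self.count[j] -= sign
                    self.key_sum[j] ^= key
                    self.chk_sum[j] ^= _chk(item)
                progress = True
        return ours, theirs

# Two mempools that differ by only three 8-byte tx ids:
mempool_a = [n.to_bytes(8, "big") for n in range(100)]
mempool_b = mempool_a[2:] + [(1000).to_bytes(8, "big")]
a, b = IBLT(), IBLT()
for tx in mempool_a: a.insert(tx)
for tx in mempool_b: b.insert(tx)
only_a, only_b = a.subtract(b).peel()
print(len(only_a), len(only_b))  # expect: 2 1

Note the table size tracks the size of the difference, not the size of the mempools; that is the whole point of using it for block relay.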
legendary
Activity: 1512
Merit: 1000
@theshmadz
May 10, 2015, 06:46:08 PM
Regarding transaction growth and the limits of bandwidth, has anyone thought of the possibility of parallel chains?

Split the Blockchain into several chunks and distribute.

Is there any discussion of this at all? Or is it just a stupid idea?

* Edit: never mind, I guess that's essentially sidechains, isn't it?

Yeah, the problem with those ideas is that you want to try to keep as many TXs on the mainchain as possible to pay miners' fees, imo, so as to minimize cannibalizing or even killing off Bitcoin. Any of those offchain alternatives is likely to create friction as well.

So... what if you fragment the chain? Like a distributed computing project, so each node only had to do a fraction of the work. But then maybe you'd have problems with forks... but if there was a protocol that was very small and very fast, it could help speed up consensus... Gavin talked about something like this once, invertible Bloom filters, I think?

legendary
Activity: 1764
Merit: 1002
May 10, 2015, 06:18:57 PM
Regarding transaction growth and the limits of bandwidth, has anyone thought of the possibility of parallel chains?

Split the Blockchain into several chunks and distribute.

Is there any discussion of this at all? Or is it just a stupid idea?

* Edit: never mind, I guess that's essentially sidechains, isn't it?

Yeah, the problem with those ideas is that you want to try to keep as many TXs on the mainchain as possible to pay miners' fees, imo, so as to minimize cannibalizing or even killing off Bitcoin. Any of those offchain alternatives is likely to create friction as well.
legendary
Activity: 1512
Merit: 1000
@theshmadz
May 10, 2015, 05:34:30 PM
Regarding transaction growth and the limits of bandwidth, has anyone thought of the possibility of parallel chains?

Split the Blockchain into several chunks and distribute.

Is there any discussion of this at all? Or is it just a stupid idea?

* Edit: never mind, I guess that's essentially sidechains, isn't it?
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 04:56:48 PM
Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?



Thx for that.  It shows quite clearly that attempts to outgrow most of the scaling problems that vex Bitcoin by doing simplistic scaling are pretty futile, which is a point of view I have held since before I bought my first BTC (on e-bay, IIRC.)

I'm using the same computer now that I put together around the time Bitcoin was invented.  It's obsolete, but not horribly so.  i5 chipset (Nehalem), 4G, a few TB of mirrored encrypted storage, etc.  Sure, I could build a much better computer now (although not all that much better), but the ONLY reason I would have any need to do so would be to try to run a simple transfer node.  My network capacity has decreased by orders of magnitude since I moved out of Silicon Valley, so even at 7 TPS I probably would not try, and if I did, I would only activate it during hours when my data was not metered.

Upshot: I could still play a constructive role in infrastructure support if I had good reason to.  One of the main reasons I do not is that, in a stressed situation where my contribution would make much of a difference, my efforts could be nixed at the flip of a switch by simple injunctive measures (network censorship.)  Because Bitcoin dev has not focused on (or even acknowledged) this potential failure mode, I feel little incentive to waste my time and money trying to support the robustness of the solution.

The chart shows that in roughly the same span of time I've been involved (since mid 2011), we will be right back where we are now, even with 20MB (setting aside the little issue that is supposed to be forgotten: many people's 20MB has an exponential growth factor beyond that.)  There was a huge amount of low-hanging fruit codebase-wise to harvest in getting 1MB to work to the extent that it does.  That luxury will not be present moving to the 20MB limit, by the nature of how computer science is done.

I made several mis-calculations about Bitcoin at the time I put some actual funds into the blockchain:

1) That things would naturally centralize around such entities as blockchain.info, coinbase, etc, and thus alleviate the need to grow the transaction rate.  (A positive mis-calculation.)

2) That it would be so blatantly obvious that Bitcoin's only realistic trajectory would be as a backing store for similar solutions by the time we stressed the transaction rate limits that nobody could argue otherwise.  (A negative mis-calculation.)

edit: slight.  Also:

I would again note that the issue charted is the UTXO count, which is not particularly related to transaction rate.  An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM) and 2) verifying them would touch as many blocks as possible, making access to much of the actual blockchain itself required for many transactions.  I'm sure there is a name for such an attack.  If not, call it the 'tvbcof attack' I suppose.  I've not 'done the math', but it seems somewhat intuitive to me that such an 'attack' would happen organically at reasonable use rates (which we have yet to even approach in the real world, if Bitcoin remains a 'one-size-fits-all' exchange currency solution.)



This could be a high visibility chart.
We are extrapolating a line. I want to point out a risk.

The drawn line extrapolates by assuming linear growth from some point midway along the function into the future.
If we drew a straight line from the start of the dataset through the current time, we would hit 20MB at a different, earlier date.
If we fit a polynomial function (which the data is currently above and returning to), we would hit 20MB at a still earlier date.
If we fit a sigmoid function in which we are approaching a ceiling, it is possible the curve would never hit 20MB.

If it is within anyone's capacity, I think it would be worth throwing these data points into some statistical software and determining line(s) of best fit, with correlations and such.
It would turn something that is subjectively interpretable into something objective.
I think that's important in this debate, for many of the reasons Zangelbert mentioned above.

yeah, i agree.  the rate of tx growth could be highly variable.  almost as volatile as, or even dependent on, the price.
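
As a sketch of what that might look like (the data points below are invented for illustration, and scipy is assumed available), one could fit linear, exponential, and sigmoid models, compare residuals, and see how far apart the implied 20MB crossing dates land:

Code:
# Sketch: fit competing growth models to (day, avg block size) points and see
# when each would cross 20MB. The data below is invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 200, 400, 600, 800, 1000, 1200], dtype=float)  # days
y = np.array([0.05, 0.09, 0.14, 0.22, 0.30, 0.38, 0.45])        # MB per block

def linear(t, a, b):      return a * t + b
def expo(t, a, k):        return a * np.exp(k * t)
def sigmoid(t, L, k, t0): return L / (1 + np.exp(-k * (t - t0)))

models = [("linear", linear, (0.001, 0.0)),
          ("exponential", expo, (0.05, 0.002)),
          ("sigmoid", sigmoid, (1.0, 0.005, 600.0))]

future = np.arange(0.0, 200000.0)  # scan roughly 550 years forward
for name, f, p0 in models:
    p, _ = curve_fit(f, t, y, p0=p0, maxfev=10000)
    sse = float(np.sum((f(t, *p) - y) ** 2))
    with np.errstate(over="ignore"):   # exp() overflows far out; harmless here
        hits = future[f(future, *p) >= 20.0]
    when = f"day {hits[0]:.0f}" if hits.size else "never (ceiling below 20MB)"
    print(f"{name:12s} SSE={sse:.4f}  crosses 20MB: {when}")

The point is exactly the one made above: the same data can put the 20MB date decades apart (or nowhere) depending on which functional form you assume, so the fit quality should be reported alongside the extrapolation.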
hero member
Activity: 625
Merit: 501
May 10, 2015, 02:34:17 PM
Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?



Thx for that.  It shows quite clearly that attempts to outgrow most of the scaling problems that vex Bitcoin by doing simplistic scaling are pretty futile, which is a point of view I have held since before I bought my first BTC (on e-bay, IIRC.)

I'm using the same computer now that I put together around the time Bitcoin was invented.  It's obsolete, but not horribly so.  i5 chipset (Nehalem), 4G, a few TB of mirrored encrypted storage, etc.  Sure, I could build a much better computer now (although not all that much better), but the ONLY reason I would have any need to do so would be to try to run a simple transfer node.  My network capacity has decreased by orders of magnitude since I moved out of Silicon Valley, so even at 7 TPS I probably would not try, and if I did, I would only activate it during hours when my data was not metered.

Upshot: I could still play a constructive role in infrastructure support if I had good reason to.  One of the main reasons I do not is that, in a stressed situation where my contribution would make much of a difference, my efforts could be nixed at the flip of a switch by simple injunctive measures (network censorship.)  Because Bitcoin dev has not focused on (or even acknowledged) this potential failure mode, I feel little incentive to waste my time and money trying to support the robustness of the solution.

The chart shows that in roughly the same span of time I've been involved (since mid 2011), we will be right back where we are now, even with 20MB (setting aside the little issue that is supposed to be forgotten: many people's 20MB has an exponential growth factor beyond that.)  There was a huge amount of low-hanging fruit codebase-wise to harvest in getting 1MB to work to the extent that it does.  That luxury will not be present moving to the 20MB limit, by the nature of how computer science is done.

I made several mis-calculations about Bitcoin at the time I put some actual funds into the blockchain:

1) That things would naturally centralize around such entities as blockchain.info, coinbase, etc, and thus alleviate the need to grow the transaction rate.  (A positive mis-calculation.)

2) That it would be so blatantly obvious that Bitcoin's only realistic trajectory would be as a backing store for similar solutions by the time we stressed the transaction rate limits that nobody could argue otherwise.  (A negative mis-calculation.)

edit: slight.  Also:

I would again note that the issue charted is the UTXO count, which is not particularly related to transaction rate.  An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM) and 2) verifying them would touch as many blocks as possible, making access to much of the actual blockchain itself required for many transactions.  I'm sure there is a name for such an attack.  If not, call it the 'tvbcof attack' I suppose.  I've not 'done the math', but it seems somewhat intuitive to me that such an 'attack' would happen organically at reasonable use rates (which we have yet to even approach in the real world, if Bitcoin remains a 'one-size-fits-all' exchange currency solution.)



This could be a high visibility chart.
We are extrapolating a line. I want to point out a risk.

The drawn line extrapolates by assuming linear growth from some point midway along the function into the future.
If we drew a straight line from the start of the dataset through the current time, we would hit 20MB at a different, earlier date.
If we fit a polynomial function (which the data is currently above and returning to), we would hit 20MB at a still earlier date.
If we fit a sigmoid function in which we are approaching a ceiling, it is possible the curve would never hit 20MB.

If it is within anyone's capacity, I think it would be worth throwing these data points into some statistical software and determining line(s) of best fit, with correlations and such.
It would turn something that is subjectively interpretable into something objective.
I think that's important in this debate, for many of the reasons Zangelbert mentioned above.
legendary
Activity: 4760
Merit: 1283
May 10, 2015, 02:31:58 PM
An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM) and 2) verifying them would touch as many blocks as possible, making access to much of the actual blockchain itself required for many transactions.  I'm sure there is a name for such an attack.  If not, call it the 'tvbcof attack' I suppose.

Something like this was brought up on reddit. Why not have higher fees for these kinds of "tvbcof transactions"? (Higher fees in proportion to how much they scatter the UTXOs. And perhaps lower or zero fees for transactions that consolidate UTXOs.)

That would be one solution.  Another would be to periodically do what I would call a 're-base'.  Neither could be achieved without either

 - a major-ish re-design (and more and more changes qualify as 'major-ish' as Bitcoin ages), OR

 - some increased control over the solution, which could probably only come with processing centralization.  When one can mandate a 'tvbcof transaction tax' through this mechanism, we will be well beyond the point where it would have been possible to implement the 'Qaddafi block' which Hearn hypothesized back around the time I took an active(-ish) interest in Bitcoin.
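
As an illustration of the fee rule suggested above (a purely hypothetical policy sketch, not an actual Bitcoin rule), one could price the net change a tx makes to the UTXO-set size:

Code:
# Hypothetical fee policy: charge for txs that grow the UTXO set (scatter)
# and discount txs that shrink it (consolidate). Illustrative only.
def utxo_adjusted_fee(n_inputs: int, n_outputs: int,
                      base_fee: int, per_utxo: int = 500) -> int:
    """Fee in satoshis for a tx consuming n_inputs UTXOs, creating n_outputs."""
    delta = n_outputs - n_inputs        # net change in UTXO-set size
    return max(base_fee + delta * per_utxo, 0)

print(utxo_adjusted_fee(1, 20, 10000))  # scattering:    10000 + 19*500 = 19500
print(utxo_adjusted_fee(20, 1, 10000))  # consolidating: 10000 - 19*500 = 500
print(utxo_adjusted_fee(30, 1, 10000))  # heavy consolidation floors at 0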

legendary
Activity: 1036
Merit: 1000
May 10, 2015, 02:17:41 PM
An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM) and 2) verifying them would touch as many blocks as possible, making access to much of the actual blockchain itself required for many transactions.  I'm sure there is a name for such an attack.  If not, call it the 'tvbcof attack' I suppose.

Something like this was brought up on reddit. Why not have higher fees for these kinds of "tvbcof transactions"? (Higher fees in proportion to how much they scatter the UTXOs. And perhaps lower or zero fees for transactions that consolidate UTXOs.)
legendary
Activity: 4760
Merit: 1283
May 10, 2015, 01:53:24 PM
Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?



Thx for that.  It shows quite clearly that attempts to outgrow most of the scaling problems that vex Bitcoin by doing simplistic scaling are pretty futile, which is a point of view I have held since before I bought my first BTC (on e-bay, IIRC.)

I'm using the same computer now that I put together around the time Bitcoin was invented.  It's obsolete, but not horribly so.  i5 chipset (Nehalem), 4G, a few TB of mirrored encrypted storage, etc.  Sure, I could build a much better computer now (although not all that much better), but the ONLY reason I would have any need to do so would be to try to run a simple transfer node.  My network capacity has decreased by orders of magnitude since I moved out of Silicon Valley, so even at 7 TPS I probably would not try, and if I did, I would only activate it during hours when my data was not metered.

Upshot: I could still play a constructive role in infrastructure support if I had good reason to.  One of the main reasons I do not is that, in a stressed situation where my contribution would make much of a difference, my efforts could be nixed at the flip of a switch by simple injunctive measures (network censorship.)  Because Bitcoin dev has not focused on (or even acknowledged) this potential failure mode, I feel little incentive to waste my time and money trying to support the robustness of the solution.

The chart shows that in roughly the same span of time I've been involved (since mid 2011), we will be right back where we are now, even with 20MB (setting aside the little issue that is supposed to be forgotten: many people's 20MB has an exponential growth factor beyond that.)  There was a huge amount of low-hanging fruit codebase-wise to harvest in getting 1MB to work to the extent that it does.  That luxury will not be present moving to the 20MB limit, by the nature of how computer science is done.

I made several mis-calculations about Bitcoin at the time I put some actual funds into the blockchain:

1) That things would naturally centralize around such entities as blockchain.info, coinbase, etc, and thus alleviate the need to grow the transaction rate.  (A positive mis-calculation.)

2) That it would be so blatantly obvious that Bitcoin's only realistic trajectory would be as a backing store for similar solutions by the time we stressed the transaction rate limits that nobody could argue otherwise.  (A negative mis-calculation.)

edit: slight.  Also:

I would again note that the issue charted is the UTXO count, which is not particularly related to transaction rate.  An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM) and 2) verifying them would touch as many blocks as possible, making access to much of the actual blockchain itself required for many transactions.  I'm sure there is a name for such an attack.  If not, call it the 'tvbcof attack' I suppose.  I've not 'done the math', but it seems somewhat intuitive to me that such an 'attack' would happen organically at reasonable use rates (which we have yet to even approach in the real world, if Bitcoin remains a 'one-size-fits-all' exchange currency solution.)

legendary
Activity: 1162
Merit: 1007
May 10, 2015, 12:58:07 PM
Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?

legendary
Activity: 1036
Merit: 1000
May 10, 2015, 12:27:07 PM
Then, as you've said before, we need to get going!

I do detect a trace of inevitability in the air recently.

Even this blocksize debate, with all the hypothetical perils it highlights going forward, has left me with one remarkable sensation I haven't experienced in all my years of online discussion. Although it may look like there is a lot of squabbling, trolling, and facile argument, compared to the baseline Internet noise level I've never seen so many people come together in debate in such a serious way, with the mental discipline that comes from having so much truly on the line, both financially and personally (or ideologically), united in basic vision even while disagreeing on the details of how to get there.

It hints at the formidable power of the world's first purely economic community, and the all-permeating effect of Bitcoin's allure - its potency in aligning people with its agenda regardless of their organizational affiliations.
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 11:57:13 AM
By eyeballing it, about 6 years from now (tenfolding about every four years), but I kind of doubt the extrapolation can hold up very well, as the transactional-currency aspect doesn't really come to the fore until later, when the network effects reach critical mass. Perhaps the exponential growth is enough to account for that, but I suspect something faster unless a lot of the transaction volume moves off chain.

Assuming 20MB means about 100 TPS, ten years from now we'd be at 200MB blocks and 1000 TPS, then by around 2030 we'd be into the 2GB and 10000+ TPS range, which looks like pretty solid global adoption (roughly 4000x current TPS). I suppose that's not unreasonable.

By the way, if we consider the price at that level to be something in the very general ballpark of $1M per BTC, which is about 4000x the current price, it all fits together somewhat cleanly, though it means the price will only tenfold roughly every 4 years as well. Then again, who knows; the target price could be $10M or even $100M for all I know - in which case the price would tenfold about every 3 years or 2.5 years, respectively, on average for the next ~15 years. Of course, if we get an S-curve for the price growth, most of those gains will be front-loaded over the next several years.

/speculation Grin

Then, as you've said before, we need to get going!
legendary
Activity: 1036
Merit: 1000
May 10, 2015, 11:54:15 AM
By eyeballing it, about 6 years from now (tenfolding about every four years), but I kind of doubt the extrapolation can hold up very well, as the transactional-currency aspect doesn't really come to the fore until later, when the network effects reach critical mass. Perhaps the exponential growth is enough to account for that, but I suspect something faster unless a lot of the transaction volume moves off chain.

Assuming 20MB means about 100 TPS, ten years from now we'd be at 200MB blocks and 1000 TPS, then by around 2030 we'd be into the 2GB and 10000+ TPS range, which looks like pretty solid global adoption (roughly 4000x current TPS). I suppose that's not unreasonable.

I should note that some say the 1MB limit is already pushing down the increase, which would be another reason to doubt the extrapolation.

While I'm extrapolating, if we consider the price at that level to be something in the very general ballpark of $1M per BTC, which is about 4000x the current price, it all fits together somewhat cleanly, though it means the price will only tenfold roughly every 4 years as well. Then again, who knows; the target price could be $10M or even $100M for all I know - in which case the price would tenfold about every 3 years or 2.5 years, respectively, on average for the next ~15 years. Of course, if we get an S-curve for the price growth, most of those gains will be front-loaded over the next several years.

/speculation Grin
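
The cadence in that speculation checks out arithmetically; here is a quick sketch using only the numbers from the post above (20MB ≈ 100 TPS, a ~15-year horizon, current price in the ~$250 ballpark implied by the 4000x multiple):

Code:
# Back-of-envelope check of the tenfolding arithmetic above.
# Assumption from the post: 20MB blocks correspond to roughly 100 TPS.
import math

tps_per_mb = 100 / 20  # ~5 TPS per MB of block space
for years_out, size_mb in [(0, 20), (10, 200), (15, 2000)]:
    print(f"+{years_out:2d}y: {size_mb:5d}MB blocks ~ {size_mb * tps_per_mb:6.0f} TPS")

# Implied tenfolding cadence for a price multiple reached over ~15 years:
# solve 10**(15/n) = multiple for n, i.e. n = 15 / log10(multiple).
for multiple in (4000, 40000, 400000):  # ~$1M, $10M, $100M at ~$250/BTC today
    print(f"{multiple:6d}x in 15y -> tenfold every {15 / math.log10(multiple):.1f} years")

Running it reproduces the post's figures: a 4000x multiple implies a tenfold roughly every 4.2 years, 40000x roughly every 3.3, and 400000x roughly every 2.7.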
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 11:24:47 AM


^ Here's a chart that shows the historical blocksize growth along with the limits, the "block fullness," and some commentary on how the limits have changed in the past and may change in the future.

Credit to Solex for the 1MBCON Advisory System Status and DeathAndTaxes for digging up the GitHub commits that introduced the blocksize limits.

Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 11:04:19 AM
I agree. I am in support of Bitcoin. I think it is possibly its own Trojan horse. And I think the masses are going to adopt Bitcoin no matter what we do. I posit our only hope is to establish an alternative (often anonymous) economy with sufficient usage that it will remain viable and useful. I have no delusions about entirely disrupting fiat nor Bitcoin, because the masses don't want to end their socialism (they depend on Big Brother for food, housing, education, healthcare, anti-terrorism shadows, etc).

However, my point against altcoins (and note I have not intensively evaluated, for example, Nxt's ecosystem, so there might be exceptions I am unaware of) is that they have limited network effects, because afaics network effects are mostly driven by use as a medium-of-exchange. And it appears to me to be a chicken-or-egg dilemma: without network effects it can't be a viable investment, because medium-of-exchange use requires reasonable ubiquity (a currency must be convenient and increase the efficiency of trade). In short, there is no value in just buying digits that we trade as an investment. There has to be some use value to impart value to the digits.

Thus I posit there comes a point where we may no longer be able to get the economy-of-scale needed to offer a viable option to Bitcoin. The inertia could become too great to overcome, and we might already be there, or we may reach that point on the next runup in price, with Circle, Coinbase, and Paypal all ready to vest the masses in Bitcoin. I assert that the masses are complacent and simply won't switch, even if all our ideological reasons for supporting Bitcoin disappear. We could try to leave for our own coin, but we would find that not enough of us can leave en masse at once to use the coin for anything. Most of us will thus throw in the towel and realize it is futile and that we waited too long.

So I argue we can't just take it as a given and must be proactive and expedient on any altcoin experiments we want to do.


while digging thru all your other bullshit, i stumbled upon this.  i can't disagree much with anything you say here and am glad to see you say supportive things about Bitcoin.

Bitcoin does have the network effect, and that is why i say don't waste time on altcoins.  we should spend more time strengthening Bitcoin to make it stronger and more resilient to attack.  specifically to your Sybil attack theory, there are the parallel mesh networks that ppl are trying to bootstrap for Bitcoin.  that seems like a worthwhile project while at the same time continuing to build on Bitcoin.

it's my strong sense that the general opinion out there is that Bitcoin is it among all the cryptocurrencies.  you can see it in the charts.  all the altcoins, incl Monero, have been decimated even more by this latest bear market.  you can say it's just more volatile swings and to be expected, but my reading of the news suggests that most investors are turning to the safest of all coin options, that being Bitcoin.

how about this?  let's say you're right about TPTB trying to get the entire world onto Bitcoin.  well, that's going to take several years and significant price pumping, so why not ride the wave?  make lotsa money on the way up; profit.
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 10:37:18 AM
I understand the probability equations, but am trying to understand the logic in how they are being used and how an attacker with less than 50% could have an almost 100% chance of forcing a new longer chain. I would expect that, no matter what, the probability of success would be less than 50%.

The reason is that the attacker just keeps going with his attack until (with a tiny bit of luck) his chain is longer. At that point everyone else will join his chain and his need to "attack" is over; he just mines his chain along with everyone else.

Intuitively, realize that the success probability is 100% above 50%, because the attacker can always be assured of outrunning the other fork. It doesn't jump straight from near-zero to 100% as soon as you reach 50%; it rises gradually, with significant success probabilities at shares below 50%.





but for every "bit of luck" the 49% attacker gets (by that i'm assuming you mean a "spurt" of luck with several blocks in a row), the 51% honest chain has the same chance of that "bit of luck" of a block spurt.  not only does the 51% honest chain have the advantage of slowly pulling further ahead via percentages alone while the 49% attacker is withholding blocks, it has the advantage of the same lucky block spurts.  as the 51% chain pulls further and further ahead, both of these factors will eventually force the 49% attacker to abandon his attack and start over, while suffering losses from the blocks he could have claimed by publishing them instead of holding them back.  in effect, you can neutralize the block spurts in the analysis and just say that the 51% chain will always outrun the 49% chain on average.

The math says otherwise. The 51% chain doesn't need to do anything at all. The 49% chain just needs to get lucky to pull ahead briefly, and usually it eventually will. That 2% lead isn't much. Occasionally the 51% chain will pull too far ahead and you will need to abandon your attack; that's what accounts for the 4% (or whatever number) chance of failure. But usually this doesn't happen.

If Satoshi's brief explanation or Peter R's use of the math isn't clear enough for you, there is more of a step-by-step explanation here, along with some pictures (Figure 3 in particular): https://bitcoil.co.il/Doublespend.pdf



that's interesting.  not that it matters.

did you know Meni Rosenfeld was a huge proponent of POS for several years?  smart guy but he was way off on that one.
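
For reference, the calculation being discussed is the catch-up probability from Satoshi's whitepaper (the same math Rosenfeld's paper walks through step by step); a small sketch:

Code:
# Satoshi's whitepaper catch-up calculation: probability an attacker with
# hashrate share q ever overtakes the honest chain from z blocks behind.
import math

def attacker_success(q: float, z: int) -> float:
    p = 1.0 - q
    if q >= p:
        return 1.0                     # a majority attacker always catches up
    lam = z * q / p                    # expected attacker progress
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

# Success odds stay substantial well below a 50% share:
for q in (0.10, 0.30, 0.45, 0.49):
    row = "  ".join(f"z={z}: {attacker_success(q, z):.4f}" for z in (1, 2, 6))
    print(f"q={q:.2f}  {row}")

Running it shows both sides of the argument above: at q=0.10 the success odds collapse quickly with confirmations, while at q=0.49 they stay near 1 even after 6 blocks.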
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 10:33:28 AM
I think we should also re-summarize whether pool control can be defeated by the new pool RPC (was it "getblockdata"?) that allows the miner to add or remove transactions. I must admit it has been a while since I looked at that and I might not have completely digested it (in a rush or whatever).

Perhaps someone can chime in so I don't have to go google it.

https://bitcoin.org/en/developer-guide#getblocktemplate-rpc

it's getblocktemplate, and i've already pointed out that it gives miners the flexibility to construct their own blocks.  as a former miner, i can say moving to a new pool is one click away, and all of us were watching carefully for any pool operators acting suspiciously.
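
For reference, a minimal sketch of the getblocktemplate flow (URL and credentials are placeholders for a local node with RPC enabled):

Code:
# The node hands back a block template; the miner may reorder, drop, or add
# transactions before hashing - the "flexibility" referred to above.
import base64, json, urllib.request

def rpc(method, params=None):
    auth = base64.b64encode(b"rpcuser:rpcpassword").decode()  # placeholder creds
    req = urllib.request.Request(
        "http://127.0.0.1:8332/",
        data=json.dumps({"id": 0, "method": method,
                         "params": params or []}).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Basic " + auth})
    return json.loads(urllib.request.urlopen(req).read())["result"]

tmpl = rpc("getblocktemplate")
print(tmpl["height"], tmpl["previousblockhash"], len(tmpl["transactions"]))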
Quote

So that is one argument that can be made against the pools having control. Note it doesn't impact my other upthread (and very on-topic) point that larger blocks favor centralization, because higher orphan rates do.

larger bloat blocks have a higher chance of being orphaned
Quote

However, does it really give control to the miner? I don't think so. The users still need to forward transactions into the network, and eventually the volume of transactions will be too great for miners to listen to and compare with what the pool is sending them. They will at some point be forced to delegate transaction compilation to the pools.

you'll need to give a citation on this.  i'm not aware of any problems at all with loading the unconfirmed-tx data set into RAM on startup.  in fact, the set is fairly uniform across all nodes b/c of the speed of the network, which is what gives proposals like Gavin's IBLT a chance to be implemented.  i've never heard of any concerns going forward on this.
Quote

So I argue both of my orthogonal points remain valid.

The author of the OP can still maintain the pools are not concentrated because he can look at a piechart of pool names and see that not one of the names has a significant market share. I can maintain that I have many names too.  Wink

Add: and the key point of distinction is that in Bitcoin, in order for a transaction to have a confirmation, it must be put into a block. In my novel new design, transactions don't have to be put in blocks in order to be confirmed. That is a very strong head-scratching hint for you!

well, that'll be a trick, b/c the blockchain is Satoshi's fundamental contribution to tx security, the piece that was missing through all the preceding decades of digital money attempts.  each block cements the txs into the chain via POW.

you need to elaborate on your purported innovation to have any meaningful discussion.
sr. member
Activity: 260
Merit: 251
May 10, 2015, 10:17:10 AM
Surely the vitamin D3 is working, because of the intense volcano energy I feel right now; if only one of those detractors would join me in the boxing ring.

But I have numb legs from the knee down today and a mild gut pain. Other than that, I am strong enough. This is a significant improvement over March, when I couldn't even think or keep my body up out of bed most of the time.

Totally OT at this point, but upping your Vitamin D3 intake means you also need to keep your electrolytes in balance. If you start feeling sore muscles or inexplicable fatigue, that's your cue to load up on minerals. After I upped my D3 intake to 5000 IU per day, I also started supplementing 250mg magnesium oxide, 1000mg potassium chloride, and 2000mg sodium chloride.
legendary
Activity: 1764
Merit: 1002
May 10, 2015, 09:57:43 AM
Time we fucking wake up, sheeple!



Nom, nom, nom!
sr. member
Activity: 420
Merit: 262
May 10, 2015, 09:42:27 AM
Time we fucking wake up, sheeple!

http://armstrongeconomics.com/archives/30365

Quote
Hillary Clinton is already bought and paid for. She netted $400,000 for giving two speeches of a few minutes each at Goldman Sachs.

Even the top five contributors to Hillary’s bid for the Senate back in 1999 were:

Citigroup Inc ….. $782,327
Goldman Sachs ….. $711,490
DLA Piper ….. $628,030
JPMorgan Chase & Co ….. $620,919
EMILY’s List ….. $605,174

Gary Gensler (born October 18, 1957) worked at Goldman Sachs for 18 years and at 30 became the youngest partner. He then, like most people from that firm, strangely seemed to suddenly care about how government functions, crossing over into public life after filling his pockets at Goldman.

http://www.cnbc.com/id/102634242

Quote
Goldman, IDG invest $50 million in bitcoin company Circle

http://www.nzherald.co.nz/technology/news/article.cfm?c_id=5&objectid=10456534

Quote
The story starts once Facebook founder Mark Zuckerberg had launched the site, after the dorm room drama that led to the current court case.

Facebook's first round of venture capital funding ($US500,000) came from former Paypal CEO Peter Thiel. Author of anti-multicultural tome 'The Diversity Myth', he is also on the board of radical conservative group VanguardPAC.

The second round of funding into Facebook ($US12.7 million) came from venture capital firm Accel Partners. Its manager James Breyer was formerly chairman of the National Venture Capital Association, and served on the board with Gilman Louie, CEO of In-Q-Tel, a venture capital firm established by the Central Intelligence Agency in 1999. One of the company's key areas of expertise is "data mining technologies".

Breyer also served on the board of R&D firm BBN Technologies, which was one of those companies responsible for the rise of the internet.

Dr Anita Jones joined the firm, which included Gilman Louie. She had also served on the In-Q-Tel's board, and had been director of Defence Research and Engineering for the US Department of Defence.

She was also an adviser to the Secretary of Defence, overseeing the Defence Advanced Research Projects Agency (DARPA), which is responsible for high-tech, high-end development.

It was when a journalist lifted the lid on DARPA's Information Awareness Office that the public began to show concern at its information mining projects.

Wikipedia's IAO page says: "the IAO has the stated mission to gather as much information as possible about everyone, in a centralised location, for easy perusal by the United States government, including (though not limited to) internet activity, credit card purchase histories, airline ticket purchases, car rentals, medical records, educational transcripts, driver's licenses, utility bills, tax returns, and any other available data.".

Not surprisingly, the backlash from civil libertarians led to a Congressional investigation into DARPA's activity, and the Information Awareness Office lost its funding.

Now the internet conspiracy theorists are citing Facebook as the IAO's new mask.

Parts of the IAO's technology round-up included 'human network analysis and behaviour model building engines', which Facebook's massive volume of neatly-targeted data gathering allows for.

Facebook's own Terms of use state: "by posting Member Content to any part of the Web site, you automatically grant, and you represent and warrant that you have the right to grant, to facebook an irrevocable, perpetual, non-exclusive, transferable, fully paid, worldwide license to use, copy, perform, display, reformat, translate, excerpt and distribute such information and content and to prepare derivative works of, or incorporate into other works, such information and content, and to grant and authorise sublicenses of the foregoing."

And in its equally interesting privacy policy: "Facebook may also collect information about you from other sources, such as newspapers, blogs, instant messaging services, and other users of the Facebook service through the operation of the service (eg. photo tags) in order to provide you with more useful information and a more personalised experience. By using Facebook, you are consenting to have your personal data transferred to and processed in the United States."

http://www.coindesk.com/peter-thiel-founders-fund-lead-2m-funding-round-in-bitpay/

Quote
Peter Thiel & Founders Fund lead $2 Million funding round in BitPay

http://www.ft.com/cms/s/0/b6f63e4c-a0af-11e4-9aee-00144feab7de.html

Quote
Coinbase lands $75m investment from NYSE and BBVA

Others in the round include Vikram Pandit, former chief executive of Citigroup

Ripple Labs, whose technology can be used to transfer Bitcoin and other currencies, announced on Tuesday that former White House economic adviser Gene Sperling would be joining its board.