
Topic: Gold collapsing. Bitcoin UP. - page 377. (Read 2032266 times)

legendary
Activity: 3920
Merit: 2349
Eadem mutata resurgo
May 08, 2015, 06:27:42 PM


^ Here's a chart that shows the historical blocksize growth along with the limits, the "block fullness," and some commentary on how the limits have changed in the past and may change in the future.

Credit to Solex for the 1MBCON Advisory System Status and DeathAndTaxes for digging up the GitHub commits that introduced the blocksize limits.

Wow! Very informative graphic, Peter. This needs to be seen more widely.

20/20 hindsight is always easy, and we can see that three things went wrong.

Satoshi implemented the limit back-to-front, i.e. it should have required consensus to renew it, not remove it:
If block-height < 123,456
     max-block-size = 1MB

He disappeared before he could use his authority to modify the change.

He did not expect it to become a sacred cow:



C'mon, that's totally unfair to the majority of developers (again). It is not simply a 'sacred cow'; that framing over-simplifies the problem while casting aspersions on most of the developers.

These guys have worked themselves to the point of mental exhaustion in some cases to keep this thing scaling up to handle the transaction growth rate thus far, and they have made huge advances. They would all simply agree (ya know, a consensus) to raise the limit if it were such an obvious 'fix'. It is not, and there are no obvious fixes; that is just wishful thinking by the unthinking majority. Raising the limit is the last-ditch 'suck it and see' hail-mary pass for when everything else has been optimised as much as possible ...

If the limit gets raised substantially above the growth rate of technological improvements, I'll be pulling the vast majority of my investment out, because it will not be operating like we thought it would: i.e. it won't be a clearing and settlement digital gold network but a paypal2.0, a fun internet googlesque reversible, traceable payments network.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
May 08, 2015, 05:45:22 PM


^ Here's a chart that shows the historical blocksize growth along with the limits, the "block fullness," and some commentary on how the limits have changed in the past and may change in the future.

Credit to Solex for the 1MBCON Advisory System Status and DeathAndTaxes for digging up the GitHub commits that introduced the blocksize limits.

Wow! Very informative graphic, Peter. This needs to be seen more widely.

20/20 hindsight is always easy, and we can see that three things went wrong.

Satoshi implemented the limit back-to-front, i.e. it should have required consensus to renew it, not remove it:
If block-height < 123,456
     max-block-size = 1MB

He disappeared before he could use his authority to modify the change.

He did not expect it to become a sacred cow:



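The "renew, not remove" idea above could be sketched like this. This is a hypothetical Python sketch of the sunset-clause framing, not Satoshi's actual code; the height constant comes from the post, and `RENEWED_LIMIT` is an illustrative name:

```python
# Hypothetical "sunset" block-size limit: the cap expires at a preset
# height unless consensus explicitly renews it. Satoshi's actual code
# did the reverse: the cap is permanent until consensus removes it.

SUNSET_HEIGHT = 123_456   # illustrative height from the post above
ONE_MB = 1_000_000        # bytes
RENEWED_LIMIT = None      # would be set by a later consensus rule, if any

def max_block_size(block_height):
    """Return the effective max block size at a given height."""
    if block_height < SUNSET_HEIGHT:
        return ONE_MB             # limit applies by default early on
    if RENEWED_LIMIT is not None:
        return RENEWED_LIMIT      # consensus chose to renew/replace the cap
    return float("inf")           # cap lapses if nobody acts
```

Under this framing inaction lets the limit lapse, so the burden of reaching consensus falls on those who want to keep the cap rather than on those who want to lift it.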
legendary
Activity: 1764
Merit: 1002
May 08, 2015, 04:07:05 PM
dow up 275.. i guess dip is over

you're not safe until $DJT goes back up and sets a new high, which would invalidate the current non-confirmation. in the meantime, it's quite possible for the $DJI to go back up and set a new high, but that by itself won't invalidate the non-confirmation.  in fact, that by itself would exacerbate it:

legendary
Activity: 1764
Merit: 1002
May 08, 2015, 04:06:04 PM
GBTC volume spiking.  closed @$49.  that's very good.
legendary
Activity: 1764
Merit: 1002
May 08, 2015, 03:52:03 PM


^ Here's a chart that shows the historical blocksize growth along with the limits, the "block fullness," and some commentary on how the limits have changed in the past and may change in the future.

Credit to Solex for the 1MBCON Advisory System Status and DeathAndTaxes for digging up the GitHub commits that introduced the blocksize limits.

geez, looks to me like we'd have to cut Gavin's adoption time by half to avoid problems.
legendary
Activity: 1764
Merit: 1002
May 08, 2015, 03:49:41 PM
so why did the /u/blockchain_explorer account, that vigorous 1MB defender, get deleted over on Reddit?  he only popped up a coupla days ago, right after Gavin's proposal.
legendary
Activity: 1162
Merit: 1007
May 08, 2015, 03:45:22 PM


^ Here's a chart that shows the historical blocksize growth along with the limits, the "block fullness," and some commentary on how the limits have changed in the past and may change in the future.

Credit to Solex for the 1MBCON Advisory System Status and DeathAndTaxes for digging up the GitHub commits that introduced the blocksize limits.
legendary
Activity: 1153
Merit: 1000
May 08, 2015, 03:36:20 PM
from gmax's response this morning:

"Thanks Matt; I was actually really confused by this sudden push with
not a word here or on Github--so much so that I responded on Reddit to
people pointing to commits in Gavin's personal repository saying they
were reading too much into it.

So please forgive me for the more than typical disorganization in this
message; I've been caught a bit flatfooted on this and I'm trying to
catch up. I'm juggling a fair amount of sudden pressure in my mailbox,
and trying to navigate complex discussions in about eight different
forums concurrently."


anyone who is even half aware of his surroundings could see that Gavin's proposal was coming.  wasn't i on this thread talking about this immediately after this video last month?:  

https://www.youtube.com/watch?v=RIafZXRDH7w

but even way before then it was entirely clear that there was a schism developing btwn Gavin, who has been pushing for an even more radical progressive block size increase, and the rest of the Blockstream devs looking to continue capping at 1MB while at the same time pushing their version of SC's.  it was clear, to me at least, that he would have to pull rank at some point.  and that is not a bad thing; that is what leaders do.  he is the lead dev after all.

gmax is exhibiting typical behavior when someone suddenly finds their back up against the wall defending what clearly is a minority opinion.  "poor me, they suddenly sprung this upon me with absolutely no warning!"

We have an interesting situation here where a majority of developers seem to be against the blocksize increase, but a clear large majority of users seem to support it.

If this is the case, bitcoin will likely go with what the majority of users prefer to do. The devs do not control it.

This is the opposite, BTW, of our FED system. Think of the FED's board as the analogue of bitcoin's developers, except the FED's board is able to impose its views on users against their will.

This blocksize debate demonstrates very clearly how the motivations of those in "charge" are often opposite to the needs of those not in charge. The US has ceased to do what the majority of its users (the people) have wanted for a while. But Bitcoin is a true democracy: it will ignore its developers and follow a clear majority of users.
legendary
Activity: 1260
Merit: 1002
May 08, 2015, 03:15:04 PM
since no one mentioned it yet: https://twitter.com/MagicalTux/status/596622731711352832?s=09

Yes, an actually decent suggestion from Mark Frappacino.

always been quite fond of this guy. Grin
so why not auto adjust?
legendary
Activity: 1414
Merit: 1000
May 08, 2015, 03:03:59 PM
since no one mentioned it yet: https://twitter.com/MagicalTux/status/596622731711352832?s=09

Yes, an actually decent suggestion from Mark Frappacino.

"max size"  have gone missing
sr. member
Activity: 310
Merit: 250
May 08, 2015, 03:00:34 PM
since no one mentioned it yet: https://twitter.com/MagicalTux/status/596622731711352832?s=09

Yes, an actually decent suggestion from Mark Frappacino.

This was closer to Gavin's original suggestion too. The (an) algorithm isn't going to be perfect, but it's going to be better than repeated hard forks (in that it may or may not require repeated hard forks). I'd prefer something that adjusts, so we have a little more room as we scale up.
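One way the "auto adjust" idea could work, as a rough sketch. The median-based rule and the 2x multiplier here are assumptions for illustration only, not MagicalTux's or Gavin's actual proposal:

```python
# Hypothetical auto-adjusting block size cap: each retarget interval,
# set the new limit to twice the median size of recent blocks, clamped
# so it never falls below the existing 1 MB floor.

FLOOR = 1_000_000  # bytes; never adjust below the current 1 MB cap

def adjusted_limit(recent_block_sizes, multiplier=2):
    """Compute the next max block size from recent observed block sizes."""
    sizes = sorted(recent_block_sizes)
    median = sizes[len(sizes) // 2]
    return max(FLOOR, multiplier * median)
```

Using a median rather than a mean makes the rule harder for a single miner to game by stuffing a handful of deliberately oversized blocks into the sample window.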
legendary
Activity: 1288
Merit: 1000
Enabling the maximal migration
May 08, 2015, 02:51:35 PM
since noone mentioned it yet: https://twitter.com/MagicalTux/status/596622731711352832?s=09

Yes, an actually decent suggestion from Mark Frappacino.
legendary
Activity: 4760
Merit: 1283
May 08, 2015, 02:47:31 PM
...
but even way before then it was entirely clear that there was a schism developing btwn Gavin, who has been pushing for an even more radical progressive block size increase, and the rest of the Blockstream devs looking to continue capping at 1MB while at the same time pushing their version of SC's.  it was clear, to me at least, that he would have to pull rank at some point.  and that is not a bad thing; that is what leaders do.  he is the lead dev after all.
...

Couldn't happen too soon.  Hopefully Gavin (who I've suspected for some time as being not much more than Hearn's mouthpiece to the world, if not worse) will siphon off a huge count of the lumpenbitcoiners who constitute the main problem in the ecosystem, in my humble opinion.

I've sat tight on my stash for well over a year for economic reasons and would be happy to sit tight for another few years until it is widely understood what a smoking crater is at the end of Gavin's plans for Bitcoin.  In the interim I intend to capitalize handsomely.  Or try to.

In looking for Maxwell's stuff on reddit I ran across something supposedly by Gavin (though I have my doubts.)  For one thing, nobody who has any bitcoin in significant quantity should have only just become aware of UTXO issues, and it's beyond absurd to suggest that the 'principal scientist' would.  For another, UTXO has nearly zero to do with block size anyway, so the lesson he supposedly drew from it ("That is a very good reason to oppose increasing the maximum block size.") is either hopelessly ignorant or a weasel move to get his bloatcoin system in place when a 'solution' to the 'problem' pops out of the ether.

To expand on my point about UTXO issues, a grand total of one user worldwide could create a problem with UTXO size if they put a (fairly technically trivial) amount of effort into doing so.  Conversely, the UTXO problem could be managed with much or perhaps most of the world's population as users if it were addressed at the system design level.  Satoshi seems to have made some feeble stabs at doing this (which is why there is even such a thing as UTXO) but a proper solution to the problem would pretty much require a dynamic global circulation involving periodic re-issues.  That is, nobody socking their stash into deep storage, or if they do wish to do so they pay a tax of some sort upon retrieval.  That would make for a fine system for some use-cases actually.  I (and I presume Maxwell and company) would be happy to see and happy to use such a solution, but I would like to see it as a sidechain.

legendary
Activity: 1414
Merit: 1000
May 08, 2015, 02:45:59 PM

Limits the size of each value pushed while evaluating a script (up to 520 bytes)
Limits the number of expensive operations in a script (up to 201 operations). All but pushes are considered expensive.

i need some definitions of the bolded terms:

value means BTC?
expensive means what?
pushes means what in this context?

1. :-)  value is not BTC.  It is a value used by the script (a sequence of bytes), e.g. "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"  http://libbitcoin.dyne.org/doc/blockchain.html.

2. expensive for the processor (the script evaluator: it consumes processor time and resources, e.g. memory)

3. a "push" is an instruction (op_code) in the script for the processor.
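The two limits quoted above (520-byte pushed values, 201 expensive operations) can be illustrated with a toy checker. This is a simplified sketch of the rule as described in the post, not Bitcoin Core's actual validation code; the tuple representation of a script is an assumption for illustration:

```python
# Toy validator for the two script limits discussed above:
#   - no single pushed value may exceed 520 bytes
#   - at most 201 non-push ("expensive") operations per script
MAX_PUSH_BYTES = 520
MAX_EXPENSIVE_OPS = 201

def check_script(ops):
    """ops: list of ('push', data_bytes) or ('op', name) tuples."""
    expensive = 0
    for kind, payload in ops:
        if kind == 'push':
            if len(payload) > MAX_PUSH_BYTES:
                return False      # pushed value too large
        else:
            expensive += 1        # everything but a push counts as expensive
            if expensive > MAX_EXPENSIVE_OPS:
                return False
    return True
```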
legendary
Activity: 1372
Merit: 1000
May 08, 2015, 02:43:09 PM
from gmax's response this morning:

"Thanks Matt; I was actually really confused by this sudden push with
not a word here or on Github--so much so that I responded on Reddit to
people pointing to commits in Gavin's personal repository saying they
were reading too much into it.

So please forgive me for the more than typical disorganization in this
message; I've been caught a bit flatfooted on this and I'm trying to
catch up. I'm juggling a fair amount of sudden pressure in my mailbox,
and trying to navigate complex discussions in about eight different
forums concurrently."


anyone who is even half aware of his surroundings could see that Gavin's proposal was coming.  wasn't i on this thread talking about this immediately after this video last month?:  

https://www.youtube.com/watch?v=RIafZXRDH7w

but even way before then it was entirely clear that there was a schism developing btwn Gavin, who has been pushing for an even more radical progressive block size increase, and the rest of the Blockstream devs looking to continue capping at 1MB while at the same time pushing their version of SC's.  it was clear, to me at least, that he would have to pull rank at some point.  and that is not a bad thing; that is what leaders do.  he is the lead dev after all.

gmax is exhibiting typical behavior when someone suddenly finds their back up against the wall defending what clearly is a minority opinion.  "poor me, they suddenly sprung this upon me with absolutely no warning!"

It is a sad thing, but as things go, good developers stop doing good work and start being average PR managers.
Gmax accuses Gavin of not doing code development and so not taking a lead development role.

The irony is that gmax is falling into the same trap: more of his time and energy is going into PR than into his strong suit.

Good leaders don't manage people; people willingly follow their example.
legendary
Activity: 1764
Merit: 1002
May 08, 2015, 02:41:18 PM
I'm not too worried about the blocksize debate now that I know even Peter Todd supports an increase "eventually." To me this says the first sign of actual delays and price impact will see all these guys change their tune.

I don't think it will change much; a lot of the resistance is rooted in fear of what could happen, and based on subjective assertions.
There are legitimate concerns and efforts should be focused there.

Why we need to change now is to kill the FUD, so developers can work on the real issues.

i think the thing we all have to be worried about is the extent to which these anti-Gavin types are actually willing to let Bitcoin die.  i hate to consider that, but seriously: if you don't have much invested in the coin, or somehow missed the train, or just simply don't believe it is set up properly, would $21M invested in your startup incentivize you to let Bitcoin die in favor of a new blockchain (SC)?  after all, they said in the whitepaper that Bitcoiners may be forced to migrate to a superiorly performing SC.
legendary
Activity: 1764
Merit: 1002
May 08, 2015, 02:36:03 PM

Limits the size of each value pushed while evaluating a script (up to 520 bytes)
Limits the number of expensive operations in a script (up to 201 operations). All but pushes are considered expensive.

i need some definitions of the bolded terms:

value means BTC?
expensive means what?
pushes means what in this context?
legendary
Activity: 1372
Merit: 1000
May 08, 2015, 02:25:02 PM
I'm not too worried about the blocksize debate now that I know even Peter Todd supports an increase "eventually." To me this says the first sign of actual delays and price impact will see all these guys change their tune.

I don't think it will change much; a lot of the resistance is rooted in fear of what could happen, and based on subjective assertions.
There are legitimate concerns and efforts should be focused there.

Why we need to change now is to kill the FUD, so developers can work on the real issues.
legendary
Activity: 1414
Merit: 1000
May 08, 2015, 02:19:56 PM
can one of you tech guys comment on Peter Todd's claim that there was an entire book embedded in the BC a month ago?  how is it done?

https://www.reddit.com/r/Bitcoin/comments/34uu02/why_increasing_the_max_block_size_is_urgent_gavin/cqystoo?context=3

A single transaction can contain 10 kB of script  => 10 kB of data

Does maxing out that data limit impact the TX fee? Like including many inputs requires a fee.

Yes.
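To spell out that "Yes": under the simple fee-per-kilobyte relay policy nodes used at the time, padding a transaction with extra script bytes raises the minimum fee it needs. A rough sketch; the 10,000-satoshi-per-kB rate is an illustrative old default, not a current rule:

```python
# Sketch of a fee-per-kB relay policy: stuffing 10 kB of script data
# into a transaction raises the minimum fee it needs to be relayed.
import math

FEE_PER_KB = 10_000  # satoshis per 1000 bytes; illustrative old default

def min_fee_satoshis(tx_size_bytes):
    """Minimum fee for a transaction, charged per started kilobyte."""
    return math.ceil(tx_size_bytes / 1000) * FEE_PER_KB
```

So a small payment of a few hundred bytes pays one kB-unit of fee, while a transaction maxing out the 10 kB script limit pays roughly ten times as much, just as adding many inputs does.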
legendary
Activity: 1372
Merit: 1000
May 08, 2015, 02:11:27 PM
can one of you tech guys comment on Peter Todd's claim that there was an entire book embedded in the BC a month ago?  how is it done?

https://www.reddit.com/r/Bitcoin/comments/34uu02/why_increasing_the_max_block_size_is_urgent_gavin/cqystoo?context=3

A single transaction can contain 10 kB of script  => 10 kB of data

Does maxing out that data limit impact the TX fee? Like including many inputs requires a fee.