
Topic: Gold collapsing. Bitcoin UP. - page 196. (Read 2032248 times)

legendary
Activity: 1162
Merit: 1007
June 23, 2015, 06:40:33 PM
Perhaps the bitcoin economy has matured enough that a set max block size is not actually needed anymore?!

What if we had some type of patch that eliminated "silly sized blocks" and allowed miners to set any soft limit they want? Something like: allow any size as long as it's not more than 3x (or some other number) the previous month's average block size.

Market incentives will drive each mining pool's soft limit. Raising the limits will stimulate growth (and increase centralization if done too fast relative to technology, because of the resources needed), while smaller blocks will increase fees. Some may prefer smaller blocks because of the risk of orphans, while others will want large blocks to grow the user base of people using bitcoin. Allowing a 3x-sized block means that as long as only a minority are making small blocks, those who want larger ones can still help grow the "silly size" limit, and those with weak internet connections just have to download the blocks and can continue making small ones.

A spam attacker could only create blocks bigger than all the pools' soft limits by mining them himself, and the largest block he could create would be 3x the recent blocks on the system. And just as it's in their economic self-interest not to grow bigger than 51%, it would also be in the pools' own self-interest not to set a soft limit so large for current technology that it risks too much centralization of the system; this also lets the system adapt rapidly over time. The fact that most mining is done by pools means that the soft limit can be more fluid.

Ultimately, if we let pools set any limit they want, would they set limits that allow bitcoin to thrive? If we assume the majority of pools act in the interest of a growing, decentralized system, would a max block size be needed anymore, other than something to prevent a "silly size" block?


Welcome to the thread! 

I've seen ideas like "3x the average of the last N blocks" before.  I don't think it's necessarily a bad idea.  The problem I see, though, is that there's no hard limit at all in that case.  I predict that people will then want another limit on top of this 3x dynamic limit in order to feel safe.  Now we're back where we started, but with increased complexity (dynamic + hard limit). 

That being said, I do think the risk of miners publishing "ginormous" blocks if there were no cap at all is small.  Based on real data, TradeBlock today estimated that it would take 137 seconds for an 8 MB block to propagate to half the nodes.  By the time a 100 MB spam block propagated, miners would have found another two blocks and the spam block would be orphaned.  The probability of success of such a spam attack is thus inversely proportional to the damage it can cause. 
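A rough back-of-the-envelope check of that orphaning argument, assuming block arrivals are Poisson with a 10-minute mean and that propagation time scales roughly linearly with block size (the 137 s figure is TradeBlock's estimate quoted above; the function names are just for illustration):

```python
import math

BLOCK_INTERVAL = 600.0  # mean seconds between blocks (Poisson assumption)

def p_rival_blocks(prop_seconds, k=1):
    """Probability that at least k rival blocks are found while a
    large block is still propagating (Poisson arrivals)."""
    lam = prop_seconds / BLOCK_INTERVAL
    # P(N >= k) = 1 - sum_{i < k} e^-lam * lam^i / i!
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

# TradeBlock's quoted estimate: an 8 MB block takes ~137 s to reach half the nodes.
print(round(p_rival_blocks(137), 3))        # chance a rival beats the 8 MB block

# Scaling linearly, a 100 MB spam block would take ~1712 s to propagate.
print(round(p_rival_blocks(1712, k=2), 3))  # chance of >= 2 rivals in that time
```

So the bigger the spam block, the more likely it is orphaned before it does any harm, which is the inverse relationship described above.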
newbie
Activity: 55
Merit: 0
June 23, 2015, 06:27:09 PM
Perhaps the bitcoin economy has matured enough that a set max block size is not actually needed anymore?!

What if we had some type of patch that eliminated "silly sized blocks" and allowed miners to set any soft limit they want? Something like: allow any size as long as it's not more than 3x (or some other number) the previous month's average block size.

Market incentives will drive each mining pool's soft limit. Raising the limits will stimulate growth (and increase centralization if done too fast relative to technology, because of the resources needed), while smaller blocks will increase fees. Some may prefer smaller blocks because of the risk of orphans, while others will want large blocks to grow the user base of people using bitcoin. Allowing a 3x-sized block means that as long as only a minority are making small blocks, those who want larger ones can still help grow the "silly size" limit, and those with weak internet connections just have to download the blocks and can continue making small ones.

A spam attacker could only create blocks bigger than all the pools' soft limits by mining them himself, and the largest block he could create would be 3x the recent blocks on the system. And just as it's in their economic self-interest not to grow bigger than 51%, it would also be in the pools' own self-interest not to set a soft limit so large for current technology that it risks too much centralization of the system; this also lets the system adapt rapidly over time. The fact that most mining is done by pools means that the soft limit can be more fluid.

Ultimately, if we let pools set any limit they want, would they set limits that allow bitcoin to thrive? If we assume the majority of pools act in the interest of a growing, decentralized system, would a max block size be needed anymore, other than something to prevent a "silly size" block?
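The "3x the previous month's average" rule is simple enough to sketch; everything here (the names, the ~4,320-block month at one block per ten minutes) is illustrative, not a concrete proposal:

```python
BLOCKS_PER_MONTH = 30 * 24 * 6   # ~4320 blocks at one per 10 minutes
MULTIPLIER = 3                   # the "x3" factor proposed above

def max_accepted_size(recent_block_sizes):
    """Largest block a node following the proposed rule would accept:
    3x the average size of (roughly) the previous month's blocks."""
    window = recent_block_sizes[-BLOCKS_PER_MONTH:]
    return MULTIPLIER * sum(window) / len(window)

# If last month's blocks averaged 400 kB, anything up to 1.2 MB is accepted;
# the ceiling only grows as fast as actual usage grows.
print(max_accepted_size([400_000] * BLOCKS_PER_MONTH))  # 1200000.0
```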
sr. member
Activity: 420
Merit: 262
June 23, 2015, 06:26:30 PM

I like Satoshi's quote on this:

Quote from: satoshi
...
Coins have to get initially distributed somehow, and a constant rate seems like the best formula.
...
http://www.mail-archive.com/cryptography%40metzdowd.com/msg09979.html

The nonchalance, to me, indicates a nice trust in markets to do their job of optimizing allocation over time. But - if Bitcoin ever starts to become a serious global economic force, mainstream economists are going to flip out over the above quote, given the initial-distribution algo didn't go through some deep analysis, etc...  

I think it's interesting how the engineering decision of making a simple/transparent (easy to implement, thus more secure) distribution algorithm trumped any potential detailed economic complexity, presumably due to Satoshi's understanding that the market would eventually allocate the capital optimally anyways, given the transparency of the current and future supply.

One of the nice things about having a 'benevolent dictator' is that it will be much easier to make decisions about these things going forward.

We already know that in the noble interest of getting a 'critical mass' in order to 'outrun regulation', it is critical  to subsidize transaction costs.  We also know with some reasonable certainty that one of the early attractions of Bitcoin was that people could 'make free money' with only a token effort.  People like free shit.  Always will.

In order to spread the wealth there are two choices.

 - make more wealth and give it away (e.g., screw the obsolete 21 million cap thingy.)

 - appropriate existing or lost money and hand it out.

Idea!  We can be pretty sure that if/when XT takes over, coin tainting is not far behind.  Why don't we use the otherwise wasted value to pass around to the masses?  Maybe like a dividend to be distributed to all existing addresses.  To be more fair and reduce gaming, however, it makes sense that people would need to register their true identities in order to receive the dividend.

You've just restated Gresham's Law.

Hard money is a delusion.

What you really want is soft money that is decentralized and thus remains permission-less (censorship-free).

Kudos for this post because you've influenced some of my thinking about perpetual debasement in such an idealized system.

My prior comments on this issue:

Subsidies are great if you want to get applicants who qualify. It should be called a subsidy, not a reward, for good reason. The problem is there are developers who feel the economics are wrong and Bitcoin needs to be fixed. I may not be able to express why, but to my understanding the mechanism seems well balanced and considered. The onus is on the people who have a problem with how Bitcoin works to prove it's broken, and to build a better mousetrap rather than change this one.

My position is that it would be great if we could have started Bitcoin up without a block subsidy, but since the currency has to be issued via some method, and since the only way to produce a truly optimal initial distribution would require an entity that was both omnipotent and omniscient, issuing the currency via block subsidy spread out over time is the least terrible way to do it.

Other than the wealth effect, much of the real capital (at least initially until the NWO-directed ecosystem was bootstrapped) went to the utility, hardware, and usury industry to which miners are beholden. If that was the optimum way to bootstrap an ecosystem, then it was correct.

In my view, it was the only way to build a global ecosystem amongst conflicting interests because it is unassailable until it scales to the point where it becomes distributed but centralized, then at that point it can only be assailed by the TPTB.

Similarly, because 95% has been mined, I am not so sure that people will use bitcoin over an altcoin as a means of p2p transfer. Its supply in 2024 will be far too small to facilitate hundreds of millions of people buying in to actually use it. By 2024, either the legacy rails will have taken over and bitcoin will be far too expensive relative to its supply, OR that will not have happened and bitcoin will be worth nothing. Perhaps this is too simplistic an argument to make, but 10 years is not actually very long.

It didn't use to concern me, but it does now.

Unless they are given BTC loans. The masses have always used leveraged money (fractional reserves) and not real money, because the capitalists have all the real money.

The real game here is not changing whether the masses will use leveraged money. (nothing will ever change that)

It is the game of protecting the (knowledge age) capitalists from the State (industrial age capitalists+masses).

I have argued that a Knowledge Age is replacing the Industrial Age and the age of high fixed capital is being replaced by active knowledge. Knowledge capitalists don't want to be dictated to by a State because it is incompatible with knowledge production.
legendary
Activity: 4690
Merit: 1276
June 23, 2015, 06:19:16 PM

One of the nice things about having a 'benevolent dictator' is that it will be much easier to make decisions about these things going forward.
...

My god, cheezes christ, what a socialist pile of crap.

These socialists should be forced to spread their wealth. 'Maybe like a dividend to be distributed to all existing addresses'.

Zikes!  You guys make some pretty strong arguments and have convinced me of the error of my ways.  Let it be known that I hereby drop my support (however brief) for switching Bitcoin to be under a 'benevolent dictator' and under XT.

Thanks for setting me right guys.


You're saying Mike Hearn was performing an altruistic power grab? That's not so terrible is it? Who wouldn't want one person to control the way their monetary system develops, the sort of person you can rely on to ignore everything that anyone else has to say and override it with a unilateral hard fork, all because that person knows better? What's wrong with Mike, with his superior understanding of what I want for myself, making more of my decisions for me? Why not all of them? He's a smart guy, and there's no way he'd make decisions on my behalf that benefit him.

My take was that Mike was nominating Gavin for the role, but they seem to have quite a good 'alignment' on solutions to 'problems'.  Apparently the two had been somewhat covertly meeting with various financial services firms and other captains of industry, and there was a strong consensus that it would be better to have a 'point man' who they could rely on to make things happen in the Bitcoin ecosystem faster and more reliably.  One of the things I noted in Adam Back's characteristically polite and to-the-point note to Mike was him asking for more information on these meetings, since most of the rest of the developers were unaware of them and a bit surprised.

legendary
Activity: 1162
Merit: 1007
June 23, 2015, 06:15:39 PM
Ok, so if you accept that the blocksize is not the only solution to the problem, then would you also accept that your guess for the rate of block size increase can only be sensibly made when the overall context you are applying it to is known? i.e. design first how you intend to mitigate the problems that bigger blocks bring, then you come up with your goldilocks increase curve?

I think you changed the wording a bit.  I said no to this:

...you believe the whole thing scales up using block size increases only. Is this really what you're implying?

Which I took to mean that "I accept that off-chain solutions could play a role in the future to help with scaling."  I never meant to imply that I believed there could be a solution that doesn't involve blocksize scaling at all.  I think regardless of the role played by off-chain solutions, we must scale up the blocksize for all the good reasons DeathAndTaxes lists here.  Only later will we find out how much demand there is for off-chain TXs.    

legendary
Activity: 3430
Merit: 3080
June 23, 2015, 06:13:45 PM
But polls show that Gavin has a clear majority... it's just certain core devs who object, and not on technical grounds.  And most of them are working for a single organization.  How is that "hectoring a community"?


It isn't. Who said that?

There is a difference between "we need a hard fork to increase the block size" and "Gavin's plan is the way to do it, (which ever plan he settles upon)"

Most all the devs want a hard fork to increase block size.  Very few are on board with Gavin's plan.
Anyhow, it looks like the Bitcoin Core hard fork will be more likely to progress from gmaxwell's BIP, and the XT fork from Gavin's, perhaps.
They may remain compatible until there is a block that one would process and the other wouldn't.

You can include me in that contingent: we do need a hard fork to increase the block size (and only for what that will achieve, buying time). Very few people are debating that now, and I myself have been aware of the scalability issue for almost as long as I've been interested in bitcoin.

Re: which designs will be implemented on which fork; I thought Gavin had decided against the hostile fork?

the use of the term hostile is propaganda.

it's not hostile; it's only hostile if you are feeling threatened, like maybe the majority of Core developers working on another hard fork who want to leverage the block size increase with their proposed improvements.

Telling the entire core dev team, the commercial bitcoin players and the bitcoin user community that if they didn't like it, they were powerless to stop him? This is a friendly gesture in your eyes?

I'm not even being rhetorical using the word "hostile", Gavin and Mike's attitude was exactly that: hostile.
legendary
Activity: 3430
Merit: 3080
June 23, 2015, 06:10:55 PM


As far as I remember, the debate about the max block size cap started a "long" time ago,
and it risked stalling indefinitely.

In my opinion, Gavin's conduct has objectively had an accelerating effect on the decision
process. I dare say that such an "extreme" attitude has stimulated a lot of devs
to focus on finding a solution and coming to a compromise.

I'm not saying that his proposal is perfect or better than any others on the table, just
stating that his strategy has moved the situation toward a new equilibrium by a long shot.

I agree. Gavin's threat of a unilateral hardfork did successfully concentrate minds on coming up with solutions to the problem. How do you know that his intention was not to stage a dev-team coup? He behaved exactly as if he really intended to, nothing suggested otherwise (although I don't claim to be aware of every bit of public commentary Gavin has made, as previously mentioned, I am not a twitter user  Grin)

I don't know, any more than any other person who bases his opinion on facts. What I can observe for sure is the public community's reaction to Gavin's behaviour, though.

Well, the facts are that Gavin literally announced just that: an alternative, hardforked client with a 20 MB block limit, and anyone from the core dev team who didn't like it could watch while he lobbied miners, services and merchants to accept the new client. That is what he said; he has not retracted it, nor confessed the ulterior motive you are affording him. Those are the facts, are they not?
legendary
Activity: 3430
Merit: 3080
June 23, 2015, 06:05:41 PM
I would rather see voting by miners done in the future for changes to be made in the future, than see guesses made today applied far into the future.

For readers who have studied feedback control systems:

   Q. How do you get a good closed-loop response?

   A. Start with a good open-loop one.  

The point is that the Bitcoin system will need tweaking (feedback) at some point in time in the future.  It is best if these tweaks are as small as possible.  That means that the guesses we make now about the future (realize that there's no way not to guess) should be as realistic as possible, thereby giving us a good open-loop response that needs as little feedback as possible to correct.  

There's a faint suggestion here that there is only one possible way to make bitcoin scale up, and it involves guessing which block size corresponds to which required rate of transactions. Which implies you believe the whole thing scales up using block size increases only. Is this really what you're implying?

No, I'm just trying to make the point that there's no way not to guess.  A lot of people think the "static" 1 MB limit is somehow neutral while a growing cap is more of a guess.  They're both guesses.  One will turn out to be more accurate than the other and thus require less tweaking going forward...

Ok, so if you accept that the blocksize is not the only solution to the problem, then would you also accept that your guess for the rate of block size increase can only be sensibly made when the overall context you are applying it to is known? i.e. design first how you intend to mitigate the problems that bigger blocks bring, then you come up with your goldilocks increase curve?

On that note, starting with a good open-loop response is not the only way to get a good closed-loop one.  There's another way: throw a lot of energy at your system!  Or in Bitcoin's case, a lot of debate and arguments!!    

If only raw hot air could solve this problem, we'd have finished with it a long time ago  Cheesy
legendary
Activity: 1162
Merit: 1007
June 23, 2015, 06:02:04 PM
If he had done this, the blocksize limit would be set at 11.4 MB now and I bet everyone would be pretty happy.  It would just be taken for granted that "of course the limit grows with time because the system grows with time too!"

More likely the cap would have started at a much lower number, as 1 MB was both overkill for near-term usage and hard to support with the technology of the time. 1 MB was clearly chosen with the expectation that it wouldn't be changed very soon. So given an autoscaling limit, it might well have been something more like 50K + 50%/year, giving us more like 600K now.

FWIW, I think 600K would still be a reasonable limit for the present (if that were in effect we'd likely see better fee management tooling), but I also think something like 2-3 MB is reasonable, or maybe 1 MB plus 50% per year (or a tapering growth rate or something). As you say getting it exactly right is difficult and unnecessary.

Good point.  He likely would have started at a cap less than 1 MB (not sure about 50k though Wink).  Seems we both agree that getting it exactly right is unnecessary and difficult. 
legendary
Activity: 2968
Merit: 1198
June 23, 2015, 05:59:04 PM
If he had done this, the blocksize limit would be set at 11.4 MB now and I bet everyone would be pretty happy.  It would just be taken for granted that "of course the limit grows with time because the system grows with time too!"

More likely the cap would have started at a much lower number, as 1 MB was both overkill for near-term usage and hard to support with the technology of the time. 1 MB was clearly chosen with the expectation that it wouldn't be changed very soon. So given an autoscaling limit, it might well have been something more like 50K + 50%/year, giving us approximately 600K now.

FWIW, I think 600K would still be a reasonable limit for the present (if that were in effect we'd likely see better fee management tooling), but I also think something like 2-3 MB is reasonable, or maybe 1 MB plus 50% per year (or a tapering growth rate or something). As you say getting it exactly right is difficult and unnecessary.
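The "approximately 600K" figure is just compound growth; the six-year span (2009 genesis to 2015) is my rounding, so the exact number depends on where you start counting:

```python
def projected_cap(start_bytes, annual_growth, years):
    """Compound a starting cap by annual_growth per year."""
    return start_bytes * (1 + annual_growth) ** years

# 50 kB at genesis (2009), growing 50%/yr for ~6 years (to 2015):
print(round(projected_cap(50_000, 0.5, 6)))  # ~570 kB, i.e. roughly the 600K above
```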
legendary
Activity: 1372
Merit: 1000
June 23, 2015, 05:57:21 PM
But polls show that Gavin has a clear majority... it's just certain core devs who object, and not on technical grounds.  And most of them are working for a single organization.  How is that "hectoring a community"?


It isn't. Who said that?

There is a difference between "we need a hard fork to increase the block size" and "Gavin's plan is the way to do it, (which ever plan he settles upon)"

Most all the devs want a hard fork to increase block size.  Very few are on board with Gavin's plan.
Anyhow, it looks like the Bitcoin Core hard fork will be more likely to progress from gmaxwell's BIP, and the XT fork from Gavin's, perhaps.
They may remain compatible until there is a block that one would process and the other wouldn't.

You can include me in that contingent: we do need a hard fork to increase the block size (and only for what that will achieve, buying time). Very few people are debating that now, and I myself have been aware of the scalability issue for almost as long as I've been interested in bitcoin.

Re: which designs will be implemented on which fork; I thought Gavin had decided against the hostile fork?

the use of the term hostile is propaganda.

it's not hostile; it's only hostile if you are feeling threatened, like maybe the majority of Core developers working on another hard fork who want to leverage the block size increase with their proposed improvements.

if you're a user, choosing the software that protects your investment is a way of expressing decentralization: independence from a centralized control system run by the Core developers.  

legendary
Activity: 1414
Merit: 1000
June 23, 2015, 05:56:57 PM
that's quite encouraging.  the more i hear from this Rusty guy, the more i like him:

"Cheers,
Rusty.
PS. I work for Blockstream.  And I'm supposed to be working on
    Lightning, not this."


"this Rusty guy" is https://en.m.wikipedia.org/wiki/Rusty_Russell

to make a long story short, he wrote the Linux kernel's packet filtering system ipchains/iptables (aka the Linux firewall) and the lguest virtualization system (an ancestor of docker/lxc), and contributed to samba/cifs (a way to let Linux and MS Windows talk to each other), just to name a few.

edit: fixed a misattribution about the samba/cifs project

no, i know who he is.  i like his independent spirit.

Wow, what happened? Do you like "for profit" and "Blockstream" developers?

NO, I DON'T.

You have no clue how to develop software.
It seems you live in an illusion; you even believe it was you and your "bitcointalk speculation thread" that developed bitcoin.
legendary
Activity: 1260
Merit: 1008
June 23, 2015, 05:53:47 PM


As far as I remember, the debate about the max block size cap started a "long" time ago,
and it risked stalling indefinitely.

In my opinion, Gavin's conduct has objectively had an accelerating effect on the decision
process. I dare say that such an "extreme" attitude has stimulated a lot of devs
to focus on finding a solution and coming to a compromise.

I'm not saying that his proposal is perfect or better than any others on the table, just
stating that his strategy has moved the situation toward a new equilibrium by a long shot.

I agree. Gavin's threat of a unilateral hardfork did successfully concentrate minds on coming up with solutions to the problem. How do you know that his intention was not to stage a dev-team coup? He behaved exactly as if he really intended to, nothing suggested otherwise (although I don't claim to be aware of every bit of public commentary Gavin has made, as previously mentioned, I am not a twitter user  Grin)

I don't know, any more than any other person who bases his opinion on facts. What I can observe for sure is the public community's reaction to Gavin's behaviour, though.
legendary
Activity: 1162
Merit: 1007
June 23, 2015, 05:53:06 PM
Gavin's guesses about the future look pretty good to me in his current proposal.  what do you think?

I think it's fine.  20MB + 50%/yr was fine with me too.  There isn't one set of "magic numbers" that we need to find or everything comes collapsing down. 

Quote
i still think he is trying to automate himself out.

Me too.  The better job he does at guessing, the less energy he (and us) will have to throw at this debate in the future. 
sr. member
Activity: 350
Merit: 250
June 23, 2015, 05:51:15 PM
speaking of fallacies, what makes you think I'm a 1 MB block man?

To answer your question about who cares when 3rd world people can't get access to 1st world technology: they do. Who is this "who" that you're referring to, when you say "who cares?". What you really mean is: I'm in my little 1st world country group of people, and I don't consider these 3rd world peasants as fully fledged humans. Who cares about them? Everyone got everything solely by merit, ergo I am superior. Let me tell you, you're looking like less of a human to me.

And speaking of consensus polling, the unilateral (i.e. the perfect opposite of a consensus poll) forked client has been rejected in favor of going through the channels to consensus that already exist.

The reality is that the former XT supporters are the people who are in desperate need of a monetary despot; you can't paint unilateral forks any other way than exactly that.

Yes, it's not about the 1MB forever, which is insufficient for Bitcoin in the long run; it's all about consensus, and how it will get destroyed prior to a hardfork that removes the 21m cap. And it will happen, mark my words.

totally disagree. i've followed Bitcoin and the core dev interactions for a very, very long time in Bitcoin years.  Gavin's built-in increases are him trying to automate himself out of the process going forward as much as possible.  he wants to avoid, if possible, the contention that has arisen from this process.  and don't forget he's been lobbying for this increase for 3y but never got anywhere in discussions with the Blockstream folk.  so now he has introduced his first code on this matter in a BIP in Core, not XT.  

he will never propose an increase to the 21M coin cap.  these are two different things you're desperately trying to conflate to push ppl to Monero.

c'mon doc, as much as I would like to "push ppl to Monero", that's not what I mean. it's not about Gavin; he is not even the lead core developer, that person is Wladimir J. van der Laan. And if someone can hijack Bitcoin into forking without consensus to increase blocks, it will happen with the coin cap too; it's just a matter of time now, I'm afraid.

This is obviously written by an agent of the state.


Yep, busted: in this bizarro world the G-men are behind the most anonymous and resilient e-cash ever invented, and the starving (for freedom) masses are behind the most paradoxical financial panopticon ever shilled on the Internet.
legendary
Activity: 1162
Merit: 1007
June 23, 2015, 05:48:13 PM
I would rather see voting by miners done in the future for changes to be made in the future, than see guesses made today applied far into the future.

For readers who have studied feedback control systems:

   Q. How do you get a good closed-loop response?

   A. Start with a good open-loop one.  

The point is that the Bitcoin system will need tweaking (feedback) at some point in time in the future.  It is best if these tweaks are as small as possible.  That means that the guesses we make now about the future (realize that there's no way not to guess) should be as realistic as possible, thereby giving us a good open-loop response that needs as little feedback as possible to correct.  

There's a faint suggestion here that there is only one possible way to make bitcoin scale up, and it involves guessing which block size corresponds to which required rate of transactions. Which implies you believe the whole thing scales up using block size increases only. Is this really what you're implying?

No, I'm just trying to make the point that there's no way not to guess.  A lot of people think the "static" 1 MB limit is somehow neutral while a growing cap is more of a guess.  They're both guesses.  One will turn out to be more accurate than the other and thus require less tweaking going forward...

On that note, starting with a good open-loop response is not the only way to get a good closed-loop one.  There's another way: throw a lot of energy at your system!  Or in Bitcoin's case, a lot of debate and arguments!!    

Now imagine if Satoshi had taken another few moments when implementing the cap.  He writes this in the code:

- "Let's set the cap at 1 MB for now…"

- "…but since Nielsen's Law suggests 50% growth per year, let's make the cap increase by 50% each year too…"

- "…this may need some tweaking in the future, but it's better to make a guess that's slightly off than one that's ridiculously off (like a static limit)!"

If he had done this, the blocksize limit would be set at 11.4 MB now and I bet everyone would be pretty happy.  It would just be taken for granted that "of course the limit grows with time because the system grows with time too!"
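The 11.4 MB figure is straight compounding of 1 MB at Nielsen's 50% per year over six years (counting from 2009 is my assumption):

```python
cap_mb = 1.0           # Satoshi's 1 MB cap
for year in range(6):  # six years of 50%/yr (Nielsen's Law) growth
    cap_mb *= 1.5
print(round(cap_mb, 1))  # 11.4
```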
legendary
Activity: 3430
Merit: 3080
June 23, 2015, 05:39:26 PM
I would rather see voting by miners done in the future for changes to be made in the future, than see guesses made today applied far into the future.

That sounds sensible, but it also sounds like you're signed up to Greg Maxwell's prospectus on the issue. I like the idea of dynamic sizing, and of the flexibility to make decisions only once a decision is required, but arguments have been made that this creates a perverse incentive for the miners to collude on the vote. Not to say I agree with that, but I haven't fully decided yet. This debate needs a really long time to be adequately digested. Influential people in this community arguably don't understand the system we have now for what it is; the idea that this debate is somehow ready for everyone to make an informed decision is not sensible.

As far as I remember, the debate about the max block size cap started a "long" time ago,
and it risked stalling indefinitely.

In my opinion, Gavin's conduct has objectively had an accelerating effect on the decision
process. I dare say that such an "extreme" attitude has stimulated a lot of devs
to focus on finding a solution and coming to a compromise.

I'm not saying that his proposal is perfect or better than any others on the table, just
stating that his strategy has moved the situation toward a new equilibrium by a long shot.

I agree. Gavin's threat of a unilateral hardfork did successfully concentrate minds on coming up with solutions to the problem. How do you know that his intention was not to stage a dev-team coup? He behaved exactly as if he really intended to, nothing suggested otherwise (although I don't claim to be aware of every bit of public commentary Gavin has made, as previously mentioned, I am not a twitter user  Grin)
legendary
Activity: 1260
Merit: 1008
June 23, 2015, 05:26:35 PM
I would rather see voting by miners done in the future for changes to be made in the future, than see guesses made today applied far into the future.

That sounds sensible, but it also sounds like you're signed up to Greg Maxwell's prospectus on the issue. I like the idea of dynamic sizing, and of the flexibility to make decisions only once a decision is required, but arguments have been made that this creates a perverse incentive for the miners to collude on the vote. Not to say I agree with that, but I haven't fully decided yet. This debate needs a really long time to be adequately digested. Influential people in this community arguably don't understand the system we have now for what it is; the idea that this debate is somehow ready for everyone to make an informed decision is not sensible.

As far as I remember, the debate about the max block size cap started a "long" time ago,
and it risked stalling indefinitely.

In my opinion, Gavin's conduct has objectively had an accelerating effect on the decision
process. I dare say that such an "extreme" attitude has stimulated a lot of devs
to focus on finding a solution and coming to a compromise.

I'm not saying that his proposal is perfect or better than any others on the table, just
stating that his strategy has moved the situation toward a new equilibrium by a long shot.
legendary
Activity: 1764
Merit: 1002
June 23, 2015, 05:21:21 PM
legendary
Activity: 3430
Merit: 3080
June 23, 2015, 05:19:19 PM
I would rather see voting by miners done in the future for changes to be made in the future, than see guesses made today applied far into the future.

For readers who have studied feedback control systems:

   Q. How do you get a good closed-loop response?

   A. Start with a good open-loop one. 

The point is that the Bitcoin system will need tweaking (feedback) at some point in time in the future.  It is best if these tweaks are as small as possible.  That means that the guesses we make now about the future (realize that there's no way not to guess) should be as realistic as possible, thereby giving us a good open-loop response that needs as little feedback as possible to correct. 

There's a faint suggestion here that there is only one possible way to make bitcoin scale up, and it involves guessing which block size corresponds to which required rate of transactions. Which implies you believe the whole thing scales up using block size increases only. Is this really what you're implying?