
Topic: Bitcoin 20MB Fork - page 38.

sed
hero member
Activity: 532
Merit: 500
February 23, 2015, 07:19:34 PM
I'm learning a lot more from this conversation this week than I did in the previous weeks.  I think it's due to a change in tone by the posters here.  Thanks!
donator
Activity: 1218
Merit: 1079
Gerald Davis
February 23, 2015, 04:45:50 PM
With all due respect, contrast this limit (or any limit) with unlimited.

Nobody (well, nobody with the influence to make it happen) has proposed unlimited blocks.  20MB is just as finite as 1MB, and so is 1GB.

Quote
It does indeed prevent spam attacks.
No, it limits the severity of the damage that such an attack can cause.  It provides an upper limit, not a prevention.  Many corporate wire accounts have an upper limit on the value of wires that can be sent in one day.  That doesn't prevent a fraudulent wire transfer, but it does cap how much the company could lose to such fraud.  The bean counters at a company will impose a limit that balances the needs of the company against the loss the company could face.  If the upper bound of the loss is less than what would cripple the company, then no single compromise could bring down the company.  The block limit is the same thing.  It is asking "worst case scenario, how much damage are we talking about?".  How much is a good question to ask, but you have to be asking the right question.  The question of what limit prevents spam is the wrong one, because the limit doesn't prevent spam.

Quote
and the proposal is for 16x the current risk and 16,000x over 20 years.
In nominal terms, yes, but the cost per unit of bandwidth (and CPU time, memory, and storage as well) falls over time.  Even 1MB blocks would have been massive 20 years ago, when the most common form of connectivity was a 56K modem.

So the problem could be expressed as both a short-term and a longer-term problem.  What is the maximum block size that could be created today without exceeding the bandwidth resources of a well-connected node?  If it is x, and bandwidth availability per unit of cost increases by y per year, then in n years a block size of x*(1+y)^n presents no more of a burden than a block size of x does today.

For the record, I think Gavin's proposal is too aggressive.  It uses Moore's law, but bandwidth growth has trailed Moore's law.  A 20% YOY increase more closely resembles bandwidth availability over the last 20 years.  Also, 20MB as "x" is pretty aggressive.  So something like 11MB * 1.2^n gets us to the same place (~16GB) in 50 years instead of 20, and with a higher confidence that bandwidth requirements will grow slower than bandwidth availability.  Still, I got a little off track: no matter what limit is adopted, it doesn't prevent spam.  Economics and relaying rules prevent spam.
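
As a sketch of the arithmetic (the starting sizes and growth rates below are illustrative; the exact parameters of Gavin's proposal varied between drafts), here is the x*(1+y)^n projection comparing a cap that roughly doubles every two years with an 11MB cap growing 20% per year:

Code:
# Back-of-the-envelope projection of two block-size-cap schedules.
# The parameters are illustrative, not the exact numbers of any BIP draft.

def projected_cap_mb(start_mb, yearly_growth, years):
    """Cap after `years` of compound growth: x * (1 + y)^n."""
    return start_mb * (1.0 + yearly_growth) ** years

for years in (0, 10, 20, 30, 40):
    aggressive = projected_cap_mb(20, 0.41, years)  # ~doubling every two years
    modest     = projected_cap_mb(11, 0.20, years)  # 20% YOY, closer to historical bandwidth growth
    print(f"year {years:2d}: ~2x/2yr schedule {aggressive:12,.0f} MB | 20%/yr schedule {modest:10,.0f} MB")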
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 04:44:56 PM

Development says: "This problem...it's a biggie, give us 20 years"


With my developer hat on, if I were Gavin I might ask for this also.  A good engineer can figure out what it would take to solve a problem.  A great engineer with some experience usually multiplies this by 3x before going to management and asking for what is needed.

Gavin may be thinking there's at least 6-7 years of development in this thing and so far very little has been done.  Maybe if we start now we can lick it in 6 or 7 years, so let's ask for 20+.

Good management knows this, and knows that developers get sucked into everything unless there is some pressure, so they ask for the unreasonable.  Sometimes they get it, sometimes they just end up with frustrated developers.

But since we know that eventually we will be back to this same negotiation, it is better to have shorter intervals between check-ins than 20 years (which is essentially passing it on to your successor).
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 04:30:41 PM
Think of it yet another way.

Development and Management go into a negotiation.  Management assumes the risk of development failing.  Management wants a Bitcoin that helps people, does what is needed, and continues to exist.  So: secure it and make it useful.  Do so in a way that doesn't have to be fixed over and over forever.

Development says: "This problem...it's a biggie, give us 20 years"

What development is asking for is some breathing room so that they can kick the can down the road on the real work that needs to be done to get a self-managing model in place that can dispense with the need for futzing with the limit, ever.  It is a pretty big thing to ask.  Basically it is saying "trust us, we will work on this over the next 20 years even though we know we won't be held accountable until that time is up or unless something bad happens from us ignoring it for so long."

Management has the incentive to keep the developers' feet to the fire; development wants slack.
Development and management may be the same folks here, so it is ultimately an exercise in self discipline.

These arbitrary limits are only phase 1.  1MB, 16MB+, 20MB, exponential growth formulas... all of these are arbitrary.  We have arbitrary now at 1MB.  No progress has been made to get past arbitrary yet.  This is understandable; there are lots of bugs to fix.  But this will always be true.  It is a developer's maxim: "the last bug is fixed when the last user dies".  So removing all incentive to fix this issue for 20 years?  It is better to fix things that require hard forks sooner rather than later.

Getting to the ability to measure and adapt the max block size is only phase 2.  There is a phase 3 as well; it would be nice to get there.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 04:21:19 PM
In the short term: too high a limit and we get increased risk and spam-attack exposure; too low and we get transactions queued.  Both are very bad.  Whether you think one is worse than the other is a matter of perspective.

Well I think this is a common misunderstanding.  The block limit never did and never will prevent spam.  When the 1MB limit was put in place the average block size was about 35KB, so even after it was in place a miner could have made a block 3000% of the average block size.  The rules for standard transactions, transaction priority, the dust threshold, and minimum fees are what prevent spam.  The block limit only provides an upper bound on how much damage a malicious user could do.  By your definition the 1MB limit was "too high" as it never did anything to prevent spam.  In fact it failed utterly at that job (well, failed as much as something can fail to prevent an event it was not designed to prevent).  Prior to the dust threshold being created, users (one service in particular) horribly wasted the most critical resource of the network.  Contrary to popular opinion, that isn't block space; it is the UTXO set.  The UTXO set is critical because blocks are not used to validate new blocks or transactions.  Blocks are used to update the UTXO set, and the UTXO set is used to validate all new transactions.  Under normal use the UTXO set grows slower than the overall blockchain.  Satoshi Dice sent tiny (as low as 1 satoshi) outputs to notify users of losses.  Those outputs will probably never be spent, but the network can't "write them off".  They will remain, perpetually bloating the UTXO set.  Even if the block limit were ultra conservative (say 100KB instead of 1000KB) it wouldn't have prevented this poor use of critical resources.

So what did the block limit do?  It provided an upper limit on the damage that one or more malicious entities could do to the overall network.  If 100% of the network conspired, it still put an upper bound on blockchain growth of roughly 30x the average block size.  If only 10% of the network conspired, then it limited the bloat to no more than 3x the average block size.  A single malicious entity creating the occasional max-size block would have even less effect in the long run.  That was the purpose of the block limit.  The block limit is like fire insurance for your house.  It limits the damage in the event your house is destroyed, but it would not be accurate to say that fire insurance prevents fires, any more than the block limit prevents spam.

With all due respect, contrast this limit (or any limit) with unlimited.
It does indeed prevent spam attacks.  Do not confuse the scope of the threat with the existence of it.
We are looking at an arbitrary amount of risk to accept, and the proposal is for 16x the current risk and 16,000x over 20 years.

That may be a reasonable number, or it may not be.  We can't know from where we are today.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 04:16:19 PM

Finally, there is no such thing as a course of action to prepare for the future that is not based on some form of extrapolation or prediction.  But that doesn't mean failing to be prepared is a good idea.

And I still have no idea what miner would deliberately lose money by taking block size up to the very limit using bogus transactions, or why they would try to.  We're talking these days about people running actual businesses that they have invested major resources in and want to see some ROI.  This is not just a set of randoms that's going to contain trolls who came along just to wreck things for the lulz.  So I'm just not seeing the immediate threat model of a ridiculously enormous block chain that you claim to see.

There is a course of action to prepare for the future that is not based on some form of extrapolation.  Many proposals have taken this form.  All of them have the same failing in that they are not implementable without adding some code for metrics.
What will be there for us in 2, 5, 10, 20 years that will know how big bitcoin blocks are and need to be?  The block chain will.  

Simply stated, the threat model of a ridiculously enormous block chain is that the cost of running Bitcoin can be made unreasonably high, such that it becomes non-useful, and even non-economic.
Perhaps you do not see it because you are considering only those within the Bitcoin economy as important.  But consider the larger scope of players in the game theory, and consider that there might be some who would like our experiment in cryptocurrency called Bitcoin to fail.

The Bitcoin Network Cost = Data Size * Decentralization.  

It doesn't take a 50+% attack to grow the data size if the protocol permits it.  The attacker does not lose money doing this, except at the margins through orphaned blocks.  That is a small fraction of mining revenue, and it only matters to those working to be the most competitive.
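
As a rough sketch of that marginal cost (every figure below is an illustrative assumption, not measured network data): the attacker's extra cost per oversized block is roughly the added orphan risk from slower propagation, times the value of the block reward.

Code:
# Crude model of the marginal cost of deliberately mining a bloated block.
# Every number here is an illustrative assumption.

block_reward_btc    = 25.0    # block subsidy at the time of this discussion
btc_price_usd       = 250.0   # assumed exchange rate
extra_relay_seconds = 10.0    # assumed extra propagation time for a much larger block
block_interval_s    = 600.0   # average spacing between blocks

# Rough approximation: the added chance of being orphaned scales with
# (extra propagation delay) / (average block interval).
extra_orphan_probability = extra_relay_seconds / block_interval_s
expected_loss_usd = extra_orphan_probability * block_reward_btc * btc_price_usd

print(f"~{extra_orphan_probability:.1%} extra orphan risk, "
      f"~${expected_loss_usd:.0f} expected loss per bloated block")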

Consider an 'attacker' who is willing to absorb whatever losses might accrue from the very occasional orphaned block in order to grow the data size by as much as the new maximum allows with every block they solve.  This will have some bad effects.

The first bad effect is to increase the cost for all miners and node operators (these are not necessarily the same folks).  The node operators take the biggest hit because there is no revenue for them anyway.
The miners also take a hit in increased costs (bandwidth, maintenance, storage).  The smallest, and those with the most expensive bandwidth, may fall below profitability and leave the market.  This benefits the 'attacker' in several ways:
1) they get a larger share of the hash rate by knocking out competition
2) they increase the centralization of node operation and mining, making Bitcoin ever easier to attack.

The second bad effect of the exponential growth plan is its perniciousness.
An attack does the greatest damage when it is sudden, persistent, and overbearing.
We may have a great majority of miners in the Bitcoin economy who manage their block sizes responsibly, even though the max is 16MB-16GB or whatever the limit of the period is.  Average block size may continue to be less than 1MB for many years to come, or grow much slower than the extrapolation predicts.  We simply do not know what it will be.  An attacker can wait until a time when Bitcoin is particularly vulnerable and then start mining huge bloated blocks to make it more expensive to run.  The risk slowly increases until such time as it can be exploited.

The third bad effect: the limit could be too low.  Ridiculously high may not be high enough.
Bitcoin could become wildly successful much sooner than expected.

Do you seriously think that it might take 20 years to solve the block size measurement problem?
I like that the new proposal does have a sunset provision (only 10 doublings, so a 2^10 increase).  With each revision Gavin has improved the proposal, though it still seems very pessimistic.  If we are postulating exponential growth in Bitcoin transactions, why not also postulate exponential growth in Bitcoin developer expertise?
donator
Activity: 1218
Merit: 1079
Gerald Davis
February 23, 2015, 04:01:27 PM
In the short term: too high a limit and we get increased risk and spam-attack exposure; too low and we get transactions queued.  Both are very bad.  Whether you think one is worse than the other is a matter of perspective.

Well I think this is a common misunderstanding.  The block limit never did and never will prevent spam.  When the 1MB limit was put in place the average block size was about 35KB, so even after it was in place a miner could have made a block 3000% of the average block size.  The rules for standard transactions, transaction priority, the dust threshold, and minimum fees are what prevent spam.  The block limit only provides an upper bound on how much damage a malicious user could do.  By your definition the 1MB limit was "too high" as it never did anything to prevent spam.  In fact it failed utterly at that job (well, failed as much as something can fail to prevent an event it was not designed to prevent).  Prior to the dust threshold being created, users (one service in particular) horribly wasted the most critical resource of the network.  Contrary to popular opinion, that isn't block space; it is the UTXO set.  The UTXO set is critical because blocks are not used to validate new blocks or transactions.  Blocks are used to update the UTXO set, and the UTXO set is used to validate all new transactions.  Under normal use the UTXO set grows slower than the overall blockchain.  Satoshi Dice sent tiny (as low as 1 satoshi) outputs to notify users of losses.  Those outputs will probably never be spent, but the network can't "write them off".  They will remain, perpetually bloating the UTXO set.  Even if the block limit were ultra conservative (say 100KB instead of 1000KB) it wouldn't have prevented this poor use of critical resources.

So what did the block limit do?  It provided an upper limit on the damage that one or more malicious entities could do to the overall network.  If 100% of the network conspired, it still put an upper bound on blockchain growth of roughly 30x the average block size.  If only 10% of the network conspired, then it limited the bloat to no more than 3x the average block size.  A single malicious entity creating the occasional max-size block would have even less effect in the long run.  That was the purpose of the block limit.  The block limit is like fire insurance for your house.  It limits the damage in the event your house is destroyed, but it would not be accurate to say that fire insurance prevents fires, any more than the block limit prevents spam.
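
To make the UTXO point concrete, here is a deliberately tiny sketch (ignoring scripts, signatures, and fees): validation consults and updates the set of unspent outputs, not the historical blocks, and a dust output that is never spent simply never leaves that set.

Code:
# Toy model of the UTXO set: validation consults and updates this set,
# not the historical block data.  Scripts, signatures, and fees are ignored.

utxo = {}  # (txid, output_index) -> amount in satoshis

def apply_transaction(txid, inputs, outputs):
    """inputs: list of (txid, index) being spent; outputs: list of amounts."""
    for outpoint in inputs:
        if outpoint not in utxo:
            raise ValueError("spends a missing or already-spent output")
        del utxo[outpoint]                  # spent outputs leave the set
    for i, amount in enumerate(outputs):
        utxo[(txid, i)] = amount            # new outputs join the set

# A 1-satoshi notification output (as Satoshi Dice sent) stays in the set
# forever unless someone pays to spend it, which will likely never happen.
apply_transaction("coinbase0", [], [5_000_000_000])                 # 50 BTC coinbase
apply_transaction("notify1", [("coinbase0", 0)], [1, 4_999_999_998])  # dust + change
print(len(utxo), "unspent outputs remain")  # the dust output still counts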


legendary
Activity: 924
Merit: 1132
February 23, 2015, 03:28:41 PM

Bitcoin is as excellent as it is not so much because of what it can do, but because of what you cannot do with it.

Adopting an inaccurate and indefinite exponential growth proposal based on extrapolations is folly and hubris.  It introduces new failure modes.  It just isn't worth it.

Yup.  But that's not what anybody advocated, so why are you bothering to say it would be bad?  

The proposal is not indefinite exponential growth; it's exponential growth for 20 years.

Nor does it seem particularly inaccurate to me, as I pointed out already.

Finally, there is no such thing as a course of action to prepare for the future that is not based on some form of extrapolation or prediction.  But that doesn't mean failing to be prepared is a good idea.

And I still have no idea what miner would deliberately lose money by taking block size up to the very limit using bogus transactions, or why they would try to.  We're talking these days about people running actual businesses that they have invested major resources in and want to see some ROI.  This is not just a set of randoms that's going to contain trolls who came along just to wreck things for the lulz.  So I'm just not seeing the immediate threat model of a ridiculously enormous block chain that you claim to see.

sr. member
Activity: 346
Merit: 250
February 23, 2015, 03:23:56 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

If the population of Earth in 2035 is ~8 billion, that's 2 bytes per block, per person. 

Given 144 blocks a day, that's 288 bytes per person, per day.  If everybody uses the block chain twice a week or so, the block size proposed for 2035 should be adequate. 

Since most people don't do money orders, remittances, or savings-account transactions that often, honestly I think that ought to suffice for "ordinary" use by ordinary people, even with a very high rate of adoption.  I mean, some will be using it for every cuppa coffee, and some won't be using it at all, but it averages out at what I'd think of as a normal rate of usage for complete mainstream adoption.

Rather than extrapolate on what might happen, a better question to ask yourself would be:  "Would there be any negative effect if things are different from this result?"

Bitcoin is not improved by making it work.  It already does that.  Bitcoin is improved by making it not fail.  This must be the goal of the fork.

Consider what happens if there is growth or shrinkage in either Bitcoin use or population.  Even a miner with a smallish percentage of the hash rate could significantly raise the costs for everyone else.

Bitcoin is as excellent as it is not so much because of what it can do, but because of what you cannot do with it.

Adopting an inaccurate and indefinite exponential growth proposal based on extrapolations is folly and hubris.  It introduces new failure modes.  It just isn't worth it.

Those pro-fork are just used to toilet-paper money.
They want to transact like spoiled little kids.
They want everybody to agree with them, imposing their ignorance upon others, crying for democracy and freedom.
Well, hey, start by cleaning up your own backyard.

hero member
Activity: 658
Merit: 500
February 23, 2015, 02:47:17 PM
Go away.

If a standard size font doesn't work, maybe a big size font will get my message across.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 02:28:41 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

If the population of Earth in 2035 is ~8 billion, that's 2 bytes per block, per person.  

Given 144 blocks a day, that's 288 bytes per person, per day.  If everybody uses the block chain twice a week or so, the block size proposed for 2035 should be adequate.  

Since most people don't do money orders, remittances, or savings-account transactions that often, honestly I think that ought to suffice for "ordinary" use by ordinary people, even with a very high rate of adoption.  I mean, some will be using it for every cuppa coffee, and some won't be using it at all, but it averages out at what I'd think of as a normal rate of usage for complete mainstream adoption.

Rather than extrapolate on what might happen, a better question to ask yourself would be:  "Would there be any negative effect if things are different from this result?"

Bitcoin is not improved by making it work.  It already does that.  Bitcoin is improved by making it not fail.  This must be the goal of the fork.

Consider what happens if there is growth or shrinkage in either Bitcoin use or population.  Even a miner with a smallish percentage of the hash rate could significantly raise the costs for everyone else.

Bitcoin is as excellent as it is not so much because of what it can do, but because of what you cannot do with it.

Adopting an inaccurate and indefinite exponential growth proposal based on extrapolations is folly and hubris.  It introduces new failure modes.  It just isn't worth it.
full member
Activity: 212
Merit: 100
Daniel P. Barron
February 23, 2015, 02:19:27 PM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement and to make sure they actually work.  A 20MB blocksize is a far simpler solution that will work right now, and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for a 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by kicking the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

Blocks won't be instantly full when the limit is raised, so there's no point in going with “not so big blocks”. Unless you have another argument for doing it.

A lot of time would be saved if you muppets would actually read.

You. Have. Been. Soundly. Defeated.

XI. Raising the limit doesn't force the blocks to be filled. It just gives miners the option to make bigger blocks should market conditions make it to their advantage to do so.

This is not how economics works.  Quoting Buffett:

Quote
The domestic textile industry operates in a commodity business, competing in a world market in which substantial excess capacity exists. Much of the trouble we experienced was attributable, both directly and indirectly, to competition from foreign countries whose workers are paid a small fraction of the U.S. minimum wage. But that in no way means that our labor force deserves any blame for our closing. In fact, in comparison with employees of American industry generally, our workers were poorly paid, as has been the case throughout the textile business. In contract negotiations, union leaders and members were sensitive to our disadvantageous cost position and did not push for unrealistic wage increases or unproductive work practices. To the contrary, they tried just as hard as we did to keep us competitive. Even during our liquidation period they performed superbly. (Ironically, we would have been better off financially if our union had behaved unreasonably some years ago; we then would have recognized the impossible future that we faced, promptly closed down, and avoided significant future losses.)

Over the years, we had the option of making large capital expenditures in the textile operation that would have allowed us to somewhat reduce variable costs. Each proposal to do so looked like an immediate winner. Measured by standard return-on-investment tests, in fact, these proposals usually promised greater economic benefits than would have resulted from comparable expenditures in our highly-profitable candy and newspaper businesses.

But the promised benefits from these textile investments were illusory. Many of our competitors, both domestic and foreign, were stepping up to the same kind of expenditures and, once enough companies did so, their reduced costs became the baseline for reduced prices industrywide. Viewed individually, each company's capital investment decision appeared cost-effective and rational; viewed collectively, the decisions neutralized each other and were irrational, just as happens when each person watching a parade decides he can see a little better if he stands on tiptoes.

After each round of investment, all the players had more money in the game and returns remained anemic. Thus, we faced a miserable choice: huge capital investment would have helped to keep our textile business alive, but would have left us with terrible returns on ever-growing amounts of capital. After the investment, moreover, the foreign competition would still have retained a major, continuing advantage in labor costs. A refusal to invest, however, would make us increasingly non-competitive, even measured against domestic textile manufacturers. I always thought myself in the position described by Woody Allen in one of his movies: "More than any other time in history, mankind faces a crossroads. One path leads to despair and utter hopelessness, the other to total extinction. Let us pray we have the wisdom to choose correctly."

For an understanding of how the to-invest-or-not-to-invest dilemma plays out in a commodity business, it is instructive to look at Burlington Industries, by far the largest U.S. textile company both 21 years ago and now. In 1964 Burlington had sales of $1.2 billion against our $50 million. It had strengths in both distribution and production that we could never hope to match and also, of course, had an earnings record far superior to ours. Its stock sold at 60 at the end of 1964; ours was 13.

Burlington made a decision to stick to the textile business, and in 1985 had sales of about $2.8 billion. During the 1964-85 period, the company made capital expenditures of about $3 billion, far more than any other U.S. textile company and more than $200-per-share on that $60 stock. A very large part of the expenditures, I am sure, was devoted to cost improvement and expansion. Given Burlington's basic commitment to stay in textiles, I would also surmise that the company's capital decisions were quite rational.

Nevertheless, Burlington has lost sales volume in real dollars and has far lower returns on sales and equity now than 20 years ago.  Split 2-for-1 in 1965, the stock now sells at 34; on an adjusted basis, that is just a little over its $60 price in 1964.  Meanwhile, the CPI has more than tripled.  Therefore, each share commands about one-third the purchasing power it did at the end of 1964.  Regular dividends have been paid but they, too, have shrunk significantly in purchasing power.

This devastating outcome for the shareholders indicates what can happen when much brain power and energy are applied to a faulty premise. The situation is suggestive of Samuel Johnson's horse: "A horse that can count to ten is a remarkable horse, not a remarkable mathematician." Likewise, a textile company that allocates capital brilliantly within its industry is a remarkable textile company, but not a remarkable business.

My conclusion from my own experiences and from much observation of other businesses is that a good managerial record (measured by economic returns) is far more a function of what business boat you get into than it is of how effectively you row (though intelligence and effort help considerably, of course, in any business, good or bad). Some years ago I wrote: "When a management with a reputation for brilliance tackles a business with a reputation for poor fundamental economics, it is the reputation of the business that remains intact." Nothing has since changed my point of view on that matter. Should you find yourself in a chronically leaking boat, energy devoted to changing vessels is likely to be more productive than energy devoted to patching leaks.

So, no: infinite blocks do not give "the miners" any sort of option, because "the miners" as a collective noun do not exist.  There exist individual miners exclusively, and the incentives of individuals are, should the Gavin scam actually be implemented, firmly oriented towards destroying the commons that is Bitcoin.

There's no way out of this problem, and simple ignorance of economics or game theory is not a solution.

XII. The current 1MB limit is arbitrary. We want to change it. Please ignore the fact that the discussion is about whether to change or not to change, and please ignore that the onus is on whoever proposes change to justify it. Instead, buy into our pretense that the discussion is about "which arbitrary value". Because we're idiots, and so should you be!

Go away.

Go away.
legendary
Activity: 924
Merit: 1132
February 23, 2015, 01:18:06 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

If the population of Earth in 2035 is ~8 billion, that's 2 bytes per block, per person. 

Given 144 blocks a day, that's 288 bytes per person, per day.  If everybody uses the block chain twice a week or so, the block size proposed for 2035 should be adequate. 

Since most people don't do money orders, remittances, or savings-account transactions that often, honestly I think that ought to suffice for "ordinary" use by ordinary people, even with a very high rate of adoption.  I mean, some will be using it for every cuppa coffee, and some won't be using it at all, but it averages out at what I'd think of as a normal rate of usage for complete mainstream adoption.
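
A quick sanity check of that arithmetic, as a sketch; the cap, population, and per-transaction size below are rounded assumptions (the ~1,000-byte figure per transaction is deliberately roomy):

Code:
# Sanity-check the per-person capacity implied by a 16 GB block cap.
# All figures are rounded assumptions for illustration.

cap_bytes      = 16 * 10**9   # 16 GB maximum block size
population     = 8 * 10**9    # ~8 billion people
blocks_per_day = 144          # one block roughly every ten minutes
avg_tx_bytes   = 1000         # a deliberately roomy per-transaction size

per_person_per_block   = cap_bytes / population                    # ~2 bytes
per_person_per_day     = per_person_per_block * blocks_per_day     # ~288 bytes
tx_per_person_per_week = per_person_per_day * 7 / avg_tx_bytes     # ~2 transactions

print(per_person_per_block, per_person_per_day, round(tx_per_person_per_week, 1))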


legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 12:33:21 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?

You are guessing.  So is Gavin.

Wouldn't it be better to measure and not have to guess?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 12:27:16 PM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement and to make sure they actually work.  A 20MB blocksize is a far simpler solution that will work right now, and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for a 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by kicking the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

This is the worst anti-fork argument, since it means you'd have to have another hard fork each time you need to increase it.  If exponential adoption happens, either we find a way to cope with it, or another coin will.  Again, there are only two outcomes.  Increase the limit, or jump ship when Bitcoin can't cope and grinds to a halt.

Wow.  
Take some time and consider what you are suggesting.
If you only do what is easy, rather than what is right, you will always be wrong.

Secondly, all of your premises here are incorrect.  There is no grinding to a halt.  The worst-case scenario of not resolving this in a strategic way is that Bitcoin will continue to do what it already does better than anything else.  It does not "grind to a halt".  If your transaction is urgent, you pay a fee.

Important things are worth doing right.  Even if they are hard.  

Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

You want us to be afraid of some other crypto currency taking over, and that is why we need to fork?  Sell your fear elsewhere.

This isn't "selling fear", this is common sense.  This isn't about sending transactions without a fee, this is your transaction my not make it in even with a fee.  Full means full.  As in wait for the next one.  And if that one is full, wait for the one after that.  How long do you think people will wait before complaining that Bitcoin is slow and useless?  Bearing in mind this is the internet and people complain about the slightest little inconvenience like it's the end of the world.  I don't want to see the outcome of that fallout.

My argument is to keep the number of forks to an absolute minimum, which is hardly controversial.  You can clearly see in the quote above, Sardokan said "see, we can go for 2-5-8 MB limit", which sounded like he wanted to have this discussion a few more times in the future.  But now that he's clarified his position, you'll find I've already agreed with him that this fork is needed because we may not have a permanent solution ready before we hit the 1MB limit, and that the next fork should be the permanent fix to solve the problem of scalability once and for all.  Again, if someone comes up with a permanent fix before we start coming close to that limit, I'll happily listen to that argument.  Until then, we need to raise the limit.  Stop over-reacting to things.

Yes, I caught your later post afterward.  Thank you for clarifying that.
We will very likely have this discussion again in the future with the proposed fork also. 
As you recognize here, there is not yet any proposal that will prevent that.  We do not yet have a mechanism for right-sizing the max block size.

The 2-5-8 is not any less reasonable.  A road map of how to get to that right-sizing mechanism should be the first goal.  We should have that before any significant fork anyhow.  Without it, a small increase would be just fine with me.  Once that road map is articulated, then comes picking an appropriate size to get Bitcoin to that mechanism.  Without it, it doesn't make a lot of sense to guess how long it will take, or how many transactions per second there will be, by the time we get there.

Beyond that, replacing the max block size limit with free-market economic incentives is a far more distant goal.

In the short term: too high a limit and we get increased risk and spam-attack exposure; too low and we get transactions queued.  Both are very bad.  Whether you think one is worse than the other is a matter of perspective.
hero member
Activity: 687
Merit: 500
February 23, 2015, 12:09:16 PM
Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

The maximum maximum block size will become 16 GB on Jan 1 2035 with Gavin's proposal.

So I guess that means that another hard fork is not needed until 2035?
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
February 23, 2015, 12:08:48 PM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement and to make sure they actually work.  A 20MB blocksize is a far simpler solution that will work right now, and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for a 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by kicking the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

This is the worst anti-fork argument, since it means you'd have to have another hard fork each time you need to increase it.  If exponential adoption happens, either we find a way to cope with it, or another coin will.  Again, there are only two outcomes.  Increase the limit, or jump ship when Bitcoin can't cope and grinds to a halt.

Wow.  
Take some time and consider what you are suggesting.
If you only do what is easy, rather than what is right, you will always be wrong.

Secondly, all of your premises here are incorrect.  There is no grinding to a halt.  The worst-case scenario of not resolving this in a strategic way is that Bitcoin will continue to do what it already does better than anything else.  It does not "grind to a halt".  If your transaction is urgent, you pay a fee.

Important things are worth doing right.  Even if they are hard.  

Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

You want us to be afraid of some other crypto currency taking over, and that is why we need to fork?  Sell your fear elsewhere.

This isn't "selling fear", this is common sense.  This isn't about sending transactions without a fee, this is saying your transaction may not make it in even with a fee.  Get the premise right before you decide you're disagreeing with it.  Full means full.  As in wait for the next one.  And if that one is full, wait for the one after that.  How long do you think people will wait before complaining that Bitcoin is slow and useless?  Bearing in mind this is the internet and people complain about the slightest little inconvenience like it's the end of the world.  I don't want to see the outcome of that fallout.

My argument is to keep the number of forks to an absolute minimum, which is hardly controversial.  You can clearly see in the quote above, Sardokan said "see, we can go for 2-5-8 MB limit", which sounded like he wanted to have this discussion a few more times in the future.  But now that he's clarified his position, you'll find I've already agreed with him that this fork is needed to "buy us some time" because we may not have a permanent solution ready before we hit the 1MB limit, and that the next fork should be the permanent fix to solve the problem of scalability once and for all.  Again, if someone comes up with a permanent fix before we start coming close to that limit, I'll happily listen to that argument.  Until then, we need to raise the limit.  Stop over-reacting to things.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 23, 2015, 11:45:47 AM
People have discussed sidechains and pruning and other suggestions, but the fact is, they will take time to implement and to make sure they actually work.  A 20MB blocksize is a far simpler solution that will work right now, and until I hear something better, that's the one I'll be supporting.

Ok, but, see, we can go for a 2-5-8 MB limit (without exponential growth) before we hit the limit, and use the time gained by kicking the can to invent something better

It's not so urgent that our only choice is to implement exponential size growth

What you are saying now is, let's go exponential, maybe we can sustain it.

This is the worst anti-fork argument, since it means you'd have to have another hard fork each time you need to increase it.  If exponential adoption happens, either we find a way to cope with it, or another coin will.  Again, there are only two outcomes.  Increase the limit, or jump ship when Bitcoin can't cope and grinds to a halt.

Take some time, consider what you are suggesting.
If you only do what is easy, rather than what is right, you will always be wrong.

Secondly, all of your premises here are incorrect.  There is no grinding to a halt.  The worst-case scenario of not resolving this in a strategic way, or not forking for a larger max block size, is that Bitcoin will continue to do what it already does better than anything else.  It does not "grind to a halt".  If your transaction is urgent, you pay a fee.

Important things are worth doing right.  Even if they are hard.  

Thirdly... So what if there are more hard forks later?  The current proposal from Gavin also guarantees that there will be more hard forks.  The proposal is a guess at what might be needed; it does not measure the block chain in any way to determine how to set the right block size.  It is a good guess, but it is just a guess, same as the 1MB limit was.  We don't finish with hard forks by going with the current proposal either.  So none of your argument here makes any sense at all.

You want us to be afraid of some other crypto currency taking over, and that is why we need to fork?  Sell your fear elsewhere.

The max block size is only the upper limit of what miners can set for their blocks.  The miners set the limits in practice.  Most miners have their limits set well below 1MB. 
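
A minimal sketch of that distinction (the 750 kB figure is only a commonly cited soft-cap default from 2015-era Bitcoin Core, set via its blockmaxsize option; treat it as an assumption): the consensus rule caps what a block may be, while each miner's own policy caps what their blocks will be.

Code:
# The consensus rule caps what a block MAY be; a miner's policy caps what
# their blocks WILL be.  The soft-cap value is an illustrative assumption.

CONSENSUS_MAX_BLOCK_BYTES = 1_000_000   # the 1 MB protocol limit
miner_soft_cap_bytes      = 750_000     # example policy, well below the hard limit

def bytes_this_miner_will_fill(policy_cap=miner_soft_cap_bytes):
    # A miner may choose any policy up to, but never above, the consensus limit.
    return min(policy_cap, CONSENSUS_MAX_BLOCK_BYTES)

print(bytes_this_miner_will_fill(), "bytes per block under this miner's policy")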
hero member
Activity: 658
Merit: 500
February 23, 2015, 09:55:23 AM
Thanks for stripping away my human rights. I really appreciate it.

There are no rights; only individual preferences.

Tell it to big brother, cry-baby.

It's dangerous to deal with people who think that rights can just be suppressed.
full member
Activity: 212
Merit: 100
Daniel P. Barron
February 23, 2015, 09:27:09 AM
I didn't know being in WoT was necessary to be able to use Bitcoin. Does that mean all those bitcoins I've sent all these years are invalid?

It means that those bitcoin were sent from a non-person. Even automated scripts can send a transaction; don't feel too special.

Thanks for stripping away my human rights. I really appreciate it.

There are no rights; only individual preferences.

Tell it to big brother, cry-baby.