
Topic: blocksize solution: longest chain decides (Read 2173 times)

sr. member
Activity: 299
Merit: 250
September 04, 2015, 07:42:57 PM
#64
What's the connection to the spam limit Satoshi introduced as the 1 MB block size?

Why is the spam-limit block size not that important anymore?

I actually haven't seen an in-depth discussion of this and would love for someone more knowledgeable than me to weigh in.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 07:34:38 PM
#63
What's the connection to the spam limit Satoshi introduced as the 1 MB block size?

Why is the spam-limit block size not that important anymore?
sr. member
Activity: 299
Merit: 250
September 04, 2015, 05:33:25 PM
#62
I'm disappointed that no one is taking this opportunity to discuss solutions to spam attacks (dust transactions w/ maximum outputs). That's actually the issue that is forcing this debate -- not organic growth in transaction volume, which on average, is nowhere near the limit.

I'll admit that I'm not entirely familiar with the issue.  Can you give an example?

I thought there was a dust limit of 5500 satoshis. Is that set by the miners by consensus, or is it part of the protocol?

AFAIK, the minimum is slightly less than 550 satoshis, or < $0.0013. I believe the last Coinwallet stress test involved outputs of 0.00001 BTC -- approximately double the current definition of dust.
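For readers wondering where the just-under-550-satoshi figure comes from, here is a sketch of Bitcoin Core's dust rule as of this era (the constants are the client defaults for a typical pay-to-pubkey-hash output; they vary by version):

```python
# Sketch of Bitcoin Core's dust rule (circa 2015), for illustration only.
# An output is "dust" if spending it would cost more than 1/3 of its value
# at the minimum relay fee, i.e. the threshold is 3x the fee to spend it.

MIN_RELAY_FEE = 1000     # satoshis per 1000 bytes (default -minrelaytxfee)
P2PKH_OUTPUT_SIZE = 34   # bytes to create a pay-to-pubkey-hash output
P2PKH_INPUT_SIZE = 148   # bytes to later spend that output

def dust_threshold(output_size=P2PKH_OUTPUT_SIZE,
                   input_size=P2PKH_INPUT_SIZE,
                   relay_fee=MIN_RELAY_FEE):
    """Smallest output value (in satoshis) not considered dust."""
    total_size = output_size + input_size           # 182 bytes for P2PKH
    fee_to_spend = total_size * relay_fee // 1000   # 182 satoshis at default
    return 3 * fee_to_spend                         # 546 satoshis

print(dust_threshold())  # prints 546
```

Note that the threshold scales with the relay fee, which is why it is configurable rather than fixed in the protocol.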

In theory, nodes could observe the typical output value of stress-test transactions, then simply alter their conf files to not relay transactions with outputs of that size (or smaller). The only loss here is that people cannot quickly send $0.002 transactions. The gain is that, above the agreed-upon dust threshold, the standard fee should be adequate (unless I am approaching this incorrectly).

It's not a protocol issue. It affects which transactions an unmodified bitcoind/bitcoin-qt client will relay on the network. Miners can put as many non-standard transactions in their blocks as they want, but without further modification the reference client will not broadcast or relay those transactions to the miners. It's completely configurable. If the current definition of dust is not economical, miners/nodes can just change their conf files.
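As a concrete (and purely illustrative) example of that configurability: a node operator could raise the minimum relay fee in bitcoin.conf. In the reference client of this era, the dust definition was derived from this same setting, so the dust threshold scales with it. The value shown is arbitrary, not a recommendation:

```ini
# bitcoin.conf -- illustrative only; 0.00005 BTC/kB is an arbitrary value.
# Raising the minimum relay fee makes small-output spam more expensive to
# relay and (in the reference client) scales the dust threshold with it.
minrelaytxfee=0.00005
```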

One idea would be to increase the client-coded threshold that defines an output as dust, which would increase the aggregate amount of bitcoins required to push large spam transactions with dust outputs. Another idea would be to require an additional fee to push transactions on a per-dust-output basis.

The issue has other implications with regard to bloat.

Well, no: you already know that dust has nothing to do with blockchain size but EVERYTHING to do with UTXO bloat. Dust (or an output whose probability of never being spent approaches 1) won't be spent, thus it remains in the UTXO set. For normal economic transactions the UTXO set only grows linearly (related more to the number of discrete entities than to total transactions over time). This is good, because the UTXO set is a far more critical resource than the unpruned blockchain. It is highly likely that in the future most full nodes won't even maintain the full blockchain, but they will need the entire UTXO set.

Having tiny, worthless outputs which last forever in the UTXO set lowers the efficiency of every node, forever.

Dust outputs won't get spent. 100 BTC outputs will get spent and pruned.
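To put rough numbers on that asymmetry (all figures below are assumptions for illustration: the per-entry footprint varies by implementation and index format, and the output value matches the stress test mentioned earlier):

```python
# Back-of-the-envelope cost asymmetry of a dust flood (illustrative numbers).
BYTES_PER_UTXO = 80        # assumed footprint per UTXO-set entry
DUST_OUTPUTS = 1_000_000   # hypothetical number of spam outputs
DUST_VALUE_SAT = 1_000     # 0.00001 BTC each, as in the Coinwallet test

utxo_bloat_mb = DUST_OUTPUTS * BYTES_PER_UTXO / 1e6        # carried forever
attacker_outlay_btc = DUST_OUTPUTS * DUST_VALUE_SAT / 1e8  # merely locked up

print(f"{utxo_bloat_mb:.0f} MB of UTXO growth for {attacker_outlay_btc:.0f} BTC")
```

Under these assumptions, roughly 80 MB of permanent UTXO growth costs the attacker only 10 BTC, and that 10 BTC isn't even spent, just immobilized.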

Moore's Law isn't magic.  It reflects that people push back against wastes of CPU time, disk storage, and other limitations to push technology to its physical limits.  Processor speeds don't just magically increase.  Disk storage doesn't just magically increase.  The assumptions underlying Moore's Law (which isn't a real law) involve bean-counters identifying and eliminating wastes of resources that slow down systems and break them, just as much as they involve people inventing new technologies.

The threat from dust transactions may be on the un-glamorous, bean-counting side of technological improvement, but that doesn't mean it doesn't exist.

legendary
Activity: 1162
Merit: 1007
September 04, 2015, 04:41:23 PM
#61
The hassle of a hard fork is an important feature of the process, not a burden.
Ever wondered why evolutionists cannot wrap their heads around the fact that sexual reproduction consumes so much energy?
Well, here is your answer. Erecting the new 8 MB limit is definitely pointing you in the right direction, Peter R.

Haha I think you're right!
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 04:25:07 PM
#60
To begin, the demand for transactions and space on the blockchain can be considered infinite. From this reasoning we can infer that by specialized hardware & connections we really mean high-end datacenters.

Demand for any commodity can be considered infinite if the price is low enough.  This was what my paper was about!  I showed that even though "demand can be considered infinite" that the size of the blocks mined would always be finite.  That is, the supply curve always intersects the demand curve at a finite block size.  
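That argument can be illustrated with a toy model in the spirit of Peter R's paper: orphan risk rises with block size, so even an effectively unbounded demand curve intersects supply at a finite size. Every constant and the log-shaped fee curve below are invented purely for illustration:

```python
import math

BLOCK_REWARD = 25.0   # BTC subsidy (2015)
TAU = 5.0             # assumed propagation delay per MB, in seconds
INTERVAL = 600.0      # average block interval, in seconds

def fee_revenue(q_mb):
    # Demand side: total fees grow, but fee density falls as ever-cheaper
    # transactions are included (log curve chosen purely for illustration).
    return 0.5 * math.log1p(q_mb)

def expected_revenue(q_mb):
    # P(no competing block appears while ours propagates) ~ exp(-tau*q/600),
    # so bigger blocks risk orphaning both the subsidy and the fees.
    return (BLOCK_REWARD + fee_revenue(q_mb)) * math.exp(-TAU * q_mb / INTERVAL)

sizes = [q / 10 for q in range(1, 2001)]   # 0.1 MB .. 200 MB
best = max(sizes, key=expected_revenue)    # finite, despite unbounded demand
print(f"revenue-maximizing size: {best} MB under these assumptions")
```

Under these made-up numbers the maximizing size comes out at a few MB; the point is only that it is finite and interior, not the particular value.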

So, no, you can't infer that we'll need high-end datacenters (I do agree that hardware costs will likely increase, though).  
 
Quote
What you call a transaction quota I call a check on centralization. Remember that the free market is not inherent to Bitcoin; it works under specific rules that carefully assign the players' incentives. I suggest an irresponsible block size policy can change these incentives for the worse.

A check on centralization enforced by…a centralized group of developers?

The physicist James Clerk Maxwell hypothesized a daemon who could quickly open and close a valve to separate the air molecules with higher kinetic energy from the slower-moving molecules, thereby creating a perpetual motion machine.

The cryptographer Gregory Maxwell hypothesized a daemon who could control the block size limit to prevent mining centralization without turning into the very centralized force the daemon was created to fight.  
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 04:23:32 PM
#59
I'm disappointed that no one is taking this opportunity to discuss solutions to spam attacks (dust transactions w/ maximum outputs). That's actually the issue that is forcing this debate -- not organic growth in transaction volume, which on average, is nowhere near the limit.

I'll admit that I'm not entirely familiar with the issue.  Can you give an example?

I thought there was a dust limit of 5500 satoshis. Is that set by the miners by consensus, or is it part of the protocol?
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 04:18:25 PM
#58
I'm disappointed that no one is taking this opportunity to discuss solutions to spam attacks (dust transactions w/ maximum outputs). That's actually the issue that is forcing this debate -- not organic growth in transaction volume, which on average, is nowhere near the limit.

Indeed, actual p2p transactions between users are a marginal share of the total transaction volume.
sr. member
Activity: 299
Merit: 250
September 04, 2015, 04:17:03 PM
#57
I'm disappointed that no one is taking this opportunity to discuss solutions to spam attacks (dust transactions w/ maximum outputs). That's actually the issue that is forcing this debate -- not organic growth in transaction volume, which on average, is nowhere near the limit.
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 04:13:30 PM
#56
I agree that miners will form pools (like we see today) if doing so gives them an economy of scale.  This is similar to the production of most commodities.  For example, men panning solo for gold in a creek can no longer compete with the billion-dollar+ gold miners like Barrick.

I also agree that if Bitcoin becomes wildly successful and TX volume increases 1000 times, that people will require specialized hardware and high-end internet connections to run full nodes (I'll just use an SPV wallet on my phone).  

What I don't agree with is that this will lead to a negative result: quite the opposite in fact!  If we allow Bitcoin to grow, I expect to see vastly more users, more miners, probably more nodes, but definitely more security in numbers.  I can't prove this, but no one can prove the opposite either.  

My problem with the small-blockers is that they don't want to take the chance of success and instead want to commit us to failure.  You are proposing adding a quota to block space production:


My problem is these are a lot of assumptions.

To begin, the demand for transactions and space on the blockchain can be considered infinite. From this reasoning we can infer that by specialized hardware & connections we really mean high-end datacenters.

Don't even think about giving me that bs that I won't allow Bitcoin to grow. Bitcoin doesn't grow only in transactions, and transactions don't necessarily mean more users. Miners, until we are able to commoditize mining into consumer hardware, will inherently be led into giant economies of scale. "Probably more nodes" doesn't cut it either, seeing the existing decrease in node count compared to a couple of years ago, when Bitcoin had far fewer users. Finally, yes, you can't prove this.

What you call a transaction quota I call a check on centralization. Remember that the free market is not inherent to Bitcoin; it works under specific rules that carefully assign the players' incentives. I suggest an irresponsible block size policy can change these incentives for the worse.

legendary
Activity: 1162
Merit: 1007
September 04, 2015, 04:05:28 PM
#55
pragmatically, you are right. However, simply being a bip101 advocate misses the essence of the OP.

Understood.  I was mostly replying to the idea of creating another BIP with the cap removed.

What I no longer like about BIP 101 is that it makes people worried about committing to something over a 20-year period (even though that's nonsense, because we could just as easily fork down in two years as fork up, but let's pretend it's not).  

The only thing that matters over the next few years for BIP101 is: "Will nodes and miners support up to 8MB blocks or not?"  Maybe we should just work on the negotiation for 8 MB.  

BIP101's fans can continue to run "BIP101"--they'll just tell themselves:

   "Hehehehe by the time 16 MB rolls around, we'll have the small-blockers convinced that bigger blocks were right!"

The small-blockers can run "8-MB only" and tell themselves:

   "Mu-haa-haa-haa, we'll show those big-blockers the error of their ways!  We'll see fewer nodes and more mining centralization, and then we'll have evidence to lower the block size!  Mu-haa-haa-haa"

sr. member
Activity: 278
Merit: 254
September 04, 2015, 04:04:45 PM
#54
I can't wait to bitch about transaction malleability; the block limit is getting old.
Both problems continue to fester for lack of leadership and organizational process.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 04:03:47 PM
#53
The most recurring objection in the Reddit thread was: if most of the hash power doesn't set the exact same limit, then there exists a "sweet spot" block size that splits the network exactly in half.  I don't think this will happen, because everyone knows that it could happen unless most people come to consensus on a limit.  As long as nodes/miners can efficiently communicate block size limit negotiations, then I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid), but instead the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected.  
 

Innnnnteresting.

Like, for example, if most pools are on 8 MB, and some rogue pool decides they will be 3 MB, then most people will move away from that pool, just like people moved away from GHash when it got 50%.

Yes.  But probably that rogue pool caves in and switches to 8 MB to stop the emigration of hash power before they're out of business.

These are the sorts of things we'd need to explain to improve the presentation for such a proposal.  How exactly do we think these "spontaneous consensus" events occur?

Good question.  It's not gonna be by mental osmosis. You can have some kind of broadcasting of opinions, but then we're starting to look more like BIP 100.
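One hypothetical shape such opinion-broadcasting could take, in the spirit of BIP 100's coinbase voting (this is purely an illustration, not the thread's actual proposal):

```python
# Illustrative sketch of limit negotiation by coinbase voting, in the
# spirit of BIP 100 (hypothetical mechanism, not the thread's proposal).

def negotiated_limit(votes_mb, window=2016):
    """Each miner publishes a preferred limit in its coinbase; the
    effective limit is the median over the last `window` blocks, so no
    small faction can move it unilaterally."""
    recent = sorted(votes_mb[-window:])
    return recent[len(recent) // 2]

votes = [8] * 1500 + [3] * 516   # a rogue minority votes for 3 MB
print(negotiated_limit(votes))   # prints 8: the majority's limit prevails
```

A median (rather than a mean) is the natural choice here, since it can't be dragged by a minority voting absurdly high or low.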
member
Activity: 64
Merit: 10
September 04, 2015, 04:01:27 PM
#52
I hate to bring this up, but following the logic stated in this thread, why not just go with BIP 101 rather than removing the cap entirely?  I haven't heard anyone say that it moves too slowly in the cap increase.

Assuming we agree that it doesn't move too slowly, then BIP 101 is simply a safer method of removing the cap.  It's a cap removal with training wheels.  If there are those projecting that 101 moves too slowly in the cap increase, I retract my statement.  I just haven't seen them as of yet.

The main objection to BIP 101 is that it moves too fast (not too slowly) for home-based full nodes to keep up.
If we approximate BIP 101's curve with a series of four-year lock-steps of a static limit, each requiring a hard fork, then the home-based demographic (which I consider important) can be salvaged.
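The lock-step idea can be made concrete with a sketch. The step functions below are my own approximation: BIP 101's actual schedule starts at 8 MB in early 2016, doubles every two years for twenty years (reaching 8192 MB), and interpolates linearly between doublings.

```python
# BIP 101 doubles an initial 8 MB limit every two years for twenty years;
# the lock-step variant sketched here hard-forks once every four years,
# jumping to BIP 101's then-current value. Both are step approximations.

def bip101_limit_mb(year):
    doublings = min(max((year - 2016) // 2, 0), 10)
    return 8 * 2 ** doublings            # 8, 16, 32, ... up to 8192 MB

def lockstep_limit_mb(year):
    doublings = min(max((year - 2016) // 4 * 2, 0), 10)
    return 8 * 2 ** doublings            # 8, 32, 128, ... up to 8192 MB

for year in (2016, 2018, 2020, 2024):
    print(year, bip101_limit_mb(year), lockstep_limit_mb(year))
```

The two schedules coincide at each four-year fork, so the lock-step only trades away the intermediate doublings in exchange for an explicit renegotiation point.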

As long as nodes/miners can efficiently communicate block size limit negotiations, then I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid) but instead the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected.  

The hassle of a hard fork is an important feature of the process, not a burden.
Ever wondered why evolutionists cannot wrap their heads around the fact that sexual reproduction consumes so much energy?
Well, here is your answer. Erecting the new 8 MB limit is definitely pointing you in the right direction, Peter R.

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 03:59:53 PM
#51
Having a BIP just makes it more newsworthy and gives the idea more range in the thoughtsphere.
hero member
Activity: 493
Merit: 500
September 04, 2015, 03:56:56 PM
#50
pragmatically, you are right. However, simply being a bip101 advocate misses the essence of the OP.

Understood.  I was mostly replying to the idea of creating another BIP with the cap removed.
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 03:49:51 PM
#49
The most recurring objection in the Reddit thread was: if most of the hash power doesn't set the exact same limit, then there exists a "sweet spot" block size that splits the network exactly in half.  I don't think this will happen, because everyone knows that it could happen unless most people come to consensus on a limit.  As long as nodes/miners can efficiently communicate block size limit negotiations, then I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid), but instead the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected.  
 

Innnnnteresting.

Like, for example, if most pools are on 8 MB, and some rogue pool decides they will be 3 MB, then most people will move away from that pool, just like people moved away from GHash when it got 50%.

Yes.  But probably that rogue pool caves in and switches to 8 MB to stop the emigration of hash power before they're out of business.

These are the sorts of things we'd need to explain to improve the presentation for such a proposal.  How exactly do we think these "spontaneous consensus" events occur?
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 03:44:08 PM
#48
I hate to bring this up, but following the logic stated in this thread, why not just go with BIP 101 rather than removing the cap entirely?  I haven't heard anyone say that it moves too slowly in the cap increase.

Assuming we agree that it doesn't move too slowly, then BIP 101 is simply a safer method of removing the cap.  It's a cap removal with training wheels.  If there are those projecting that 101 moves too slowly in the cap increase, I retract my statement.  I just haven't seen them as of yet.

pragmatically, you are right. However, simply being a bip101 advocate misses the essence of the OP.
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 03:41:37 PM
#47
Peter, you know exactly what that means: centralization...
Before I respond to this, can you admit that both Greg and you were wrong (at least about the particular detail you called me out on above)?  

To be quite honest, I didn't bother considering the math from the post I quickly dug up. I'm sorry my example was poorly chosen, but to be clear, that was Greg's point. It sounds like you may have been right about this particular factor.

Thank you.

Quote
My point was to demonstrate that miners are increasingly tending toward schemes and centralization that can diminish their relaying costs and create a possible unfair advantage, which could be precipitated by an inconsiderate block size increase.

I agree that miners will form pools (like we see today) if doing so gives them an economy of scale.  This is similar to the production of most commodities.  For example, men panning solo for gold in a creek can no longer compete with the billion-dollar+ gold miners like Barrick.

I also agree that if Bitcoin becomes wildly successful and TX volume increases 1000 times, that people will require specialized hardware and high-end internet connections to run full nodes (I'll just use an SPV wallet on my phone).  

What I don't agree with is that this will lead to a negative result: quite the opposite in fact!  If we allow Bitcoin to grow, I expect to see vastly more users, more miners, probably more nodes, but definitely more security in numbers.  I can't prove this, but no one can prove the opposite either.  

My problem with the small-blockers is that they don't want to take the chance of success and instead want to commit us to failure.  You are proposing adding a quota to block space production:



And permit a group of programmers to enforce that production quota for some greater good (that no one can really define).  However, before we go down the road of production quotas, we should take a long look at other economies where they've been implemented.  The most famous is of course the USSR, but even in Canada we have quotas on the production of eggs.  These were originally implemented to ensure that farmers could earn a living wage and the public could enjoy a reliable supply of eggs; however, they now serve to keep small farmers out of the commercial egg market and result in all sorts of lobbying and palm greasing (and higher prices for consumers).  He who controls the quota wields the power to pick winners and losers...


hero member
Activity: 493
Merit: 500
September 04, 2015, 03:29:53 PM
#46
I hate to bring this up, but following the logic stated in this thread, why not just go with BIP 101 rather than removing the cap entirely?  I haven't heard anyone say that it moves too slowly in the cap increase.

Assuming we agree that it doesn't move too slowly, then BIP 101 is simply a safer method of removing the cap.  It's a cap removal with training wheels.  If there are those projecting that 101 moves too slowly in the cap increase, I retract my statement.  I just haven't seen them as of yet.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 03:25:58 PM
#45
The most recurring objection in the Reddit thread was: if most of the hash power doesn't set the exact same limit, then there exists a "sweet spot" block size that splits the network exactly in half.  I don't think this will happen, because everyone knows that it could happen unless most people come to consensus on a limit.  As long as nodes/miners can efficiently communicate block size limit negotiations, then I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid), but instead the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected. 
 

Innnnnteresting.

Like, for example, if most pools are on 8 MB, and some rogue pool decides they will be 3 MB, then most people will move away from that pool, just like people moved away from GHash when it got 50%.