
Topic: blocksize solution: longest chain decides - page 2. (Read 2173 times)

hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 03:22:37 PM
#44
In order to produce larger blocks in the future, we'll need miners to come up with more efficient communication schemes similar to the relay network.  This just allows them to produce block space more cheaply and for us to enjoy lower-cost transactions.  Win-win.

(Note also that what matters is the network average, and not a few fast connections [although those fast connections do help to lower the network average.])

Peter, you know exactly what that means: centralization...

Before I respond to this, can you admit that both Greg and you were wrong (at least about the particular detail you called me out on above)?  

To be quite honest, I didn't bother checking the math from the post I quickly dug up. I'm sorry my example was poorly chosen, but to be clear, that was Greg's point. It sounds like you may have been right about this particular factor. My point was to demonstrate that miners are increasingly tending towards schemes and centralization that diminish their relaying costs and create a possible unfair advantage, which could be precipitated by an ill-considered block size increase.
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 03:17:23 PM
#43
In order to produce larger blocks in the future, we'll need miners to come up with more efficient communication schemes similar to the relay network.  This just allows them to produce block space more cheaply and for us to enjoy lower-cost transactions.  Win-win.

(Note also that what matters is the network average, and not a few fast connections [although those fast connections do help to lower the network average.])

Peter, you know exactly what that means: centralization...

Before I respond to this, can you admit that both Greg and you were wrong (at least about the particular detail you called me out on above)? 
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 03:14:00 PM
#42
In order to produce larger blocks in the future, we'll need miners to come up with more efficient communication schemes similar to the relay network.  This just allows them to produce block space more cheaply and for us to enjoy lower-cost transactions.  Win-win.

(Note also that what matters is the network average, and not a few fast connections [although those fast connections do help to lower the network average.])

Peter, you know exactly what that means: centralization. They WILL get bigger and better at it, the code will improve, and the costs will slowly become negligible.
 
I'm not concerned with the miners' interest but with the nodes', because they are the ones to whom the costs are externalized.

More block space and cheaper transactions is NOT a win-win. It is in fact a classic tragedy of the commons: the miners, and those who wish to fill the block chain with all manner of things, have incentives that cost users access to the governance of the network.

I'm referring to a world of SPV wallets, devoid of privacy and of the ability to verify for yourself the decentralization of the network.

legendary
Activity: 1162
Merit: 1007
September 04, 2015, 03:00:19 PM
#41
He is describing absolutely real and existing mining configurations, which promise to worsen in the future given a removal of the blocksize cap.

Then why is the network propagation impedance so high?  Answer: because he's talking about something hypothetical (that is unlikely to ever happen and may not even be possible).  

Lies, lies and lies. You have no shame!

Evidence #1 :

Quote
As a more concrete example, the block relay network today communicates far less than one bit per bit of blocksize to relay a block (e.g. transmitting a 962597 byte block using 3804 bytes-- I wonder why instead you did not announce your discovery that the block relay network has beaten the Shannon limit! :) --- after all, by your metric it can transmit X bits of information over a channel which has _significantly_ less than X capacity).

Uh…this is an example of Greg Maxwell making an error.  Anyone who understands information theory can see that.  When I pointed out this error to Greg, he just tried to come up with another hypothetical argument for why the fee market won't exist, and dropped the line of argument you posted above.

It's easy to explain: the variable γ in my paper is the coding gain and describes the factor by which the information (the solved block) can be compressed.  Sending 962597 bytes with only 3804 bytes is a coding gain of:

    γ = 962597 / 3804 = 253
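For anyone who wants to check the arithmetic, here is a one-liner using only the two figures quoted above:

```python
# Coding gain of the relay network, per the quoted example:
# a 962,597-byte solved block relayed using only 3,804 bytes.
block_size_bytes = 962_597
bytes_on_wire = 3_804

gamma = block_size_bytes / bytes_on_wire  # the γ from the paper
print(f"coding gain γ ≈ {gamma:.0f}")     # prints "coding gain γ ≈ 253"
```

No Shannon limit is violated, because the receiver already holds the transaction data; only the selection needs to be communicated.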

You can see right here that my proof takes this into account:  



In order to produce larger blocks in the future, we'll need miners to come up with more efficient communication schemes similar to the relay network.  This just allows them to produce block space more cheaply and for us to enjoy lower-cost transactions.  Win-win.


(Note also that what matters is the network average, and not a few fast connections [although those fast connections do help to lower the network average.])

legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 04, 2015, 02:54:19 PM
#40
I can't wait to get back to bitching about transaction malleability; this block limit thing is getting old.
legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 04, 2015, 02:50:58 PM
#39
I think it would be more strategic to go the other way.  Present this now,
let everyone freak out about "no limit Bitcoin" and then BIP 101 will seem
more reasonable by comparison (to those that believe we need a limit
or the sky will fall).

I half agree.  But my worry is that without a more convincing presentation, it won't be taken seriously (i.e., it won't cause everyone to freak out about "no limit Bitcoin").

Yes it needs more meat.

A list of considerations and objections would need to be compiled and then addressed.

Agreed.  The most recurring objection in the Reddit thread was: if most of the hash power doesn't set the exact same limit, then there exists a "sweet spot" block size that splits the network exactly in half.  I don't think this will happen, precisely because everyone knows it could happen if most people fail to come to consensus on a limit.  As long as nodes/miners can efficiently communicate block size limit negotiations, I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid), but here the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected.

Quote
What got me started on this thread was the thought that "hey wait a sec,
everyone had consensus on what Bitcoin is and does, but now we can't
agree because of some stupid technical detail."

Exactly!  The same thought hit me recently too :)


bitcoin died in 2016 not because of a stupid technical detail, but because 15 different solutions were implemented for 1 stupid technical detail...

you could say bitcoin was trolled to death.
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 02:48:12 PM
#38
He is describing absolutely real and existing mining configurations, which promise to worsen in the future given a removal of the blocksize cap.

Then why is the network propagation impedance so high?  Answer: because he's talking about something hypothetical (that is unlikely to ever happen and may not even be possible).  

Lies, lies and lies. You have no shame!

You forced my hand here:

Quote
As a more concrete example, the block relay network today communicates far less than one bit per bit of blocksize to relay a block (e.g. transmitting a 962597 byte block using 3804 bytes-- I wonder why instead you did not announce your discovery that the block relay network has beaten the Shannon limit! :) --- after all, by your metric it can transmit X bits of information over a channel which has _significantly_ less than X capacity).

Quote
You assume miners do not have the ability to change their level centralization.

 -- In fact they do; not just in theory but in practice they have responded to orphaning this way in the past, and it is one of the major concerns in this space.

Quote
For example it does not reflect how hashers return work to pools _today_ (and since 2011), as they do so only by referencing the merkle root... the pool already knows the transaction set. In that particular case it knows it because it selected it to begin with, but the same behavior holds if the hasher selects the transaction set and sends it first.

It only _very_ weakly reflects how the relay protocol works (only the selection and permutation is communicated, not the transaction data itself, for already-known transactions). Even if you assume nothing more than that (in spite of the existing reality), you have not shown that the compressed data must be linear in the size of the block.

It does not reflect how P2Pool works (which also sends the transactions in advance).
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 02:45:31 PM
#37
I think it would be more strategic to go the other way.  Present this now,
let everyone freak out about "no limit Bitcoin" and then BIP 101 will seem
more reasonable by comparison (to those that believe we need a limit
or the sky will fall).

I half agree.  But my worry is that without a more convincing presentation, it won't be taken seriously (i.e., it won't cause everyone to freak out about "no limit Bitcoin").

Yes it needs more meat.

A list of considerations and objections would need to be compiled and then addressed.

Agreed.  The most recurring objection in the Reddit thread was: if most of the hash power doesn't set the exact same limit, then there exists a "sweet spot" block size that splits the network exactly in half.  I don't think this will happen, precisely because everyone knows it could happen if most people fail to come to consensus on a limit.  As long as nodes/miners can efficiently communicate block size limit negotiations, I think we'll witness what I call "spontaneous consensus" events…sort of like a phase change in matter (liquid -> solid), but here the 1 MB limit crumbles and a new precise limit at (e.g.) 8 MB is erected.
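The "spontaneous consensus" intuition can be caricatured in a few lines. This toy model assumes (purely for illustration) that every miner can see the current plurality choice and adopts it immediately to avoid landing on the losing side of a split:

```python
from collections import Counter

def converge(limits: list[int]) -> list[int]:
    """Every miner switches to the plurality block size limit (in MB)."""
    plurality, _count = Counter(limits).most_common(1)[0]
    return [plurality] * len(limits)

# Miners start with scattered preferences; one visible plurality suffices.
start = [1, 8, 8, 8, 2, 4, 8, 8]
print(converge(start))  # prints "[8, 8, 8, 8, 8, 8, 8, 8]"
```

Real miners only see peers' limits noisily and with delay, so the open question is whether the dynamics still snap to a single value, the "phase change" described above.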

Quote
What got me started on this thread was the thought that "hey wait a sec,
everyone had consensus on what Bitcoin is and does, but now we can't
agree because of some stupid technical detail."

Exactly!  The same thought hit me recently too :)

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 02:37:34 PM
#36
I think it would be more strategic to go the other way.  Present this now,
let everyone freak out about "no limit Bitcoin" and then BIP 101 will seem
more reasonable by comparison (to those that believe we need a limit
or the sky will fall).

I half agree.  But my worry is that without a more convincing presentation, it won't be taken seriously (i.e., it won't cause everyone to freak out about "no limit Bitcoin").

Yes it needs more meat.

A list of considerations and objections would need to be compiled and then addressed.

Also a few more paragraphs could be added about the nature of consensus and the
current difficulties.  What got me started on this thread was the thought that "hey wait a sec,
everyone had consensus on what Bitcoin is and does, but now we can't
agree because of some stupid technical detail."

member
Activity: 64
Merit: 10
September 04, 2015, 02:30:19 PM
#35
The limit determines the size of the consensus network.
If nodes cannot keep up, they will be pushed out.

On one hand, operators of full nodes have the right to decide what the limit should be; on the other hand, there must be a single static limit agreed upon via consensus. Otherwise more important full nodes (large miners, exchanges, merchants) might want to increase their limit via a user-definable option and begin pushing home-based full nodes out of the consensus.

8 MB is a reasonable middle point for the next 4 years.

Your post shows exceptional logic and clarity.... until the last sentence.

We want to stay as close as possible to actual network demand, and there are very reasonable arguments for why blocks should be full on average before we precipitate any change.

I appreciate the compliment.

Regarding the actual 8 MB figure:
The hassle of a hard fork (implied by the idea of a single static limit) should be enough to prevent individual nodes from meddling with the limit (via a user-definable option or by recompiling the client), but for the same reason (the hassle of a hard fork) we don't want to do one every now and then. So we must future-proof the limit with regard to the competition and the current state of technology. That's why evolution has stages; it's not a continuous, smooth process. You can look at it as a game of chess as well, if you will. Now it is Bitcoin's turn to make a move.

More on 8 MB justifications here.
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 02:28:44 PM
#34
I think it would be more strategic to go the other way.  Present this now,
let everyone freak out about "no limit Bitcoin" and then BIP 101 will seem
more reasonable by comparison (to those that believe we need a limit
or the sky will fall).

I half agree.  But my worry is that without a more convincing presentation, it won't be taken seriously (i.e., it won't cause everyone to freak out about "no limit Bitcoin").
sr. member
Activity: 278
Merit: 254
September 04, 2015, 02:26:09 PM
#33
He is describing absolutely real and existing mining configurations, which promise to worsen in the future given a removal of the blocksize cap.

I don't understand what "mining configuration" has to do with block size.  I have a "mining configuration" that runs off a slow DSL line.  My mining configuration runs a single stratum connection and transfers one block header every few seconds.  I have successfully mined a block this way using a well-connected solo-mining pool.

Mining centralization is not the same as node centralization.

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
September 04, 2015, 02:23:15 PM
#32
The hardcoding of a blocksize is part of consensus, just as they are running the same codebase overall.  I question the wisdom of making that part of the code at all.  However, it seems that having no limit could be problematic, and as Panther pointed out, miners do want some predictability, which is why I'm liking BIP 100 more. Do you know if Jeff Garzik ever considered making it a 51% vote?

Sickpig suggested that the block size limit was a transport layer constraint that crept into the consensus layer.  I agree.  I don't think we really need a protocol enforced limit because:

1. There is a significant economic cost to producing large spam blocks as an attack.  

2. There is a physical limitation to how large a block can be (due to bandwidth and other constraints).  

3. The miners can enforce an effective limit anyway.

BIP100 seems redundant (and dangerous if it's not based on majority votes).  Let's get rid of the limit and let the miners sort it out.  

Ok cool.  Let's.

Is there a BIP for this?

If not, can we create one?

/u/awemany started working on one.  Here's a Reddit post introducing the concept:

https://www.reddit.com/r/bitcoin_uncensored/comments/3hdeqs/a_block_size_limit_was_never_part_of_satoshis/

And here's the link to his draft:

https://github.com/awemany/bslconfig/releases/download/second-draft/bslconfig.pdf

The idea was actually just to change MAX_BLOCKSIZE from a constant to a user-adjustable variable max_blocksize (which the user could set via the GUI to 8 MB, to infinity, or to whatever).

I felt this idea needed a better presentation to gain traction, and I suggested we hold off for now.  I may pursue this idea in the fall if there's still deadlock.  


I think it would be more strategic to go the other way.  Present this now,
let everyone freak out about "no limit Bitcoin" and then BIP 101 will seem
more reasonable by comparison (to those that believe we need a limit
or the sky will fall).
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 02:13:27 PM
#31
He is describing absolutely real and existing mining configurations, which promise to worsen in the future given a removal of the blocksize cap.

Then why is the network propagation impedance so high?  Answer: because he's talking about something hypothetical (that is unlikely to ever happen and may not even be possible).  
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 02:12:32 PM
#30
The limit determines the size of the consensus network.
If nodes cannot keep up, they will be pushed out.

On one hand, operators of full nodes have the right to decide what the limit should be; on the other hand, there must be a single static limit agreed upon via consensus. Otherwise more important full nodes (large miners, exchanges, merchants) might want to increase their limit via a user-definable option and begin pushing home-based full nodes out of the consensus.

8 MB is a reasonable middle point for the next 4 years.

Your post shows exceptional logic and clarity.... until the last sentence.

We want to stay as close as possible to actual network demand, and there are very reasonable arguments for why blocks should be full on average before we precipitate any change.
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
September 04, 2015, 02:09:30 PM
#29
These costs, as has been explained to you repeatedly, are trivial given incremental centralization. In other words, miners are incentivized to centralize so as to limit their orphan risks and create larger blocks.

No, Gmax has been hand-waving about some hypothetical future network configuration (unlike anything Bitcoin is or has ever been) that many people with broader knowledge of economics disagree with (assuming it's even physically possible).

Right now--based on the average network propagation impedance--it would cost about 4 BTC to produce an 8 MB block, and VASTLY more to produce a 1 GB block (if the limit were removed).  Connectivity would already have to improve just to make 20 MB blocks economical...

You're lying, and a quick check of your email correspondence with him can confirm it.

He is describing absolutely real and existing mining configurations, which promise to worsen in the future given a removal of the blocksize cap.
member
Activity: 64
Merit: 10
September 04, 2015, 01:59:32 PM
#28
The limit determines the size of the consensus network.
If nodes cannot keep up, they will be pushed out.

On one hand, operators of full nodes have the right to decide what the limit should be; on the other hand, there must be a single static limit agreed upon via consensus. Otherwise more important full nodes (large miners, exchanges, merchants) might want to increase their limit via a user-definable option and begin pushing home-based full nodes out of the consensus.

8 MB is a reasonable middle point for the next 4 years.
legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
September 04, 2015, 01:56:28 PM
#27

/u/awemany started working on one.  Here's a Reddit post introducing the concept:

https://www.reddit.com/r/bitcoin_uncensored/comments/3hdeqs/a_block_size_limit_was_never_part_of_satoshis/

And here's the link to his draft:

https://github.com/awemany/bslconfig/releases/download/second-draft/bslconfig.pdf

The idea was actually just to change MAX_BLOCKSIZE from a constant to a user-adjustable variable max_blocksize (which the user could set via the GUI to 8 MB, to infinity, or to whatever).

I felt this idea needed a better presentation to gain traction, and I suggested we hold off for now.  I may pursue this idea in the fall if there's still deadlock.  


that's how it starts: there will be many ideas, and a lot of work put behind each one; forks start spewing out; hashing power gets divided among 2, no 3, no 4 "Bitcoin Blockchains" all at once, all competing to do pretty much the same thing, but the geeks agree they are profoundly different, no one understands why!?!?, outsiders are convinced bitcoin is and always will be GEEKY AS FUCK.

the end is near!

legendary
Activity: 1162
Merit: 1007
September 04, 2015, 01:52:38 PM
#26
These costs, as has been explained to you repeatedly, are trivial given incremental centralization. In other words, miners are incentivized to centralize so as to limit their orphan risks and create larger blocks.

No, Gmax has been hand-waving about some hypothetical future network configuration (unlike anything Bitcoin is or has ever been) that many people with broader knowledge of economics disagree with (assuming it's even physically possible).

Right now--based on the average network propagation impedance--it would cost about 4 BTC to produce an 8 MB block, and VASTLY more to produce a 1 GB block (if the limit were removed).  Connectivity would already have to improve just to make 20 MB blocks economical...
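The "about 4 BTC" figure follows from the kind of orphaning model in the fee-market paper under discussion: a slower-propagating block is more likely to lose the race against a competitor. A rough sketch, where the impedance value z is chosen purely for illustration (it is not the poster's actual number):

```python
import math

T = 600.0    # mean block interval, seconds
R = 25.0     # block reward in 2015, BTC
z = 13e-6    # illustrative propagation impedance, seconds per byte (assumed)

def expected_orphan_cost(block_bytes: float) -> float:
    """Expected revenue lost to orphaning for a block of the given size."""
    tau = z * block_bytes                # propagation time grows with size
    p_orphan = 1.0 - math.exp(-tau / T)  # chance a competing block wins
    return R * p_orphan

print(expected_orphan_cost(8e6))   # ≈ 4 BTC for an 8 MB block
print(expected_orphan_cost(1e9))   # ≈ 25 BTC: essentially the whole reward
```

Because the expected loss saturates at the full reward, a 1 GB block would be almost guaranteed to be orphaned at current connectivity, which is the sense in which it would cost "vastly more."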
legendary
Activity: 1162
Merit: 1007
September 04, 2015, 01:44:44 PM
#25
The hardcoding of a blocksize is part of consensus, just as they are running the same codebase overall.  I question the wisdom of making that part of the code at all.  However, it seems that having no limit could be problematic, and as Panther pointed out, miners do want some predictability, which is why I'm liking BIP 100 more. Do you know if Jeff Garzik ever considered making it a 51% vote?

Sickpig suggested that the block size limit was a transport layer constraint that crept into the consensus layer.  I agree.  I don't think we really need a protocol enforced limit because:

1. There is a significant economic cost to producing large spam blocks as an attack.  

2. There is a physical limitation to how large a block can be (due to bandwidth and other constraints).  

3. The miners can enforce an effective limit anyway.

BIP100 seems redundant (and dangerous if it's not based on majority votes).  Let's get rid of the limit and let the miners sort it out.  

Ok cool.  Let's.

Is there a BIP for this?

If not, can we create one?

/u/awemany started working on one.  Here's a Reddit post introducing the concept:

https://www.reddit.com/r/bitcoin_uncensored/comments/3hdeqs/a_block_size_limit_was_never_part_of_satoshis/

And here's the link to his draft:

https://github.com/awemany/bslconfig/releases/download/second-draft/bslconfig.pdf

The idea was actually just to change MAX_BLOCKSIZE from a constant to a user-adjustable variable max_blocksize (which the user could set via the GUI to 8 MB, to infinity, or to whatever).

I felt this idea needed a better presentation to gain traction, and I suggested we hold off for now.  I may pursue this idea in the fall if there's still deadlock.  
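The draft's core change (turning the hard consensus constant into a node-local setting) can be sketched as follows. This is a conceptual sketch in Python, not Bitcoin Core's actual C++ code; the names NodeConfig and is_block_acceptable are invented for illustration:

```python
DEFAULT_LIMIT = 1_000_000  # bytes: the historical 1 MB MAX_BLOCKSIZE

class NodeConfig:
    """Node-local policy, in place of a network-wide hard-coded constant."""
    def __init__(self, max_blocksize: float = DEFAULT_LIMIT):
        # float("inf") expresses "no limit", one of the options the draft allows
        self.max_blocksize = max_blocksize

def is_block_acceptable(block_bytes: int, cfg: NodeConfig) -> bool:
    # Each node judges blocks against its OWN limit; nodes with different
    # settings may temporarily diverge until the longest chain decides.
    return block_bytes <= cfg.max_blocksize

conservative = NodeConfig()                  # keeps the 1 MB default
permissive = NodeConfig(max_blocksize=8e6)   # user opted into 8 MB
print(is_block_acceptable(2_000_000, conservative))  # False
print(is_block_acceptable(2_000_000, permissive))    # True
```

The contested part is not this check but the dynamics it induces: whether nodes with mismatched limits reconverge on a single chain.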