
Topic: Bitcoin 20MB Fork - page 70.

legendary
Activity: 1904
Merit: 1007
February 12, 2015, 02:35:37 AM
yes, I know. The Gavincoin proposal is an attack on bitcoin. I just wanted to figure out which public channels to watch so I can be among the first to sell in case the community is forked and bitcoin hijacked.
I guess I'll just sell now and wait until the dust settles. Won't go anywhere before the halving anyway.
I think I might buy MPcoin later if it comes to the MCA.

Why do you think Gavincoin is an attack on bitcoin while MPcoin isn't?
legendary
Activity: 4690
Merit: 1276
February 12, 2015, 02:19:27 AM
...
But eventually after six months or so, let's say I get a 1,100-kilobyte block (block B1, built on block A0).  It goes out on the network, and all the people
...

Ha!  Try six minutes or so.  I'll bet there will be a pretty nice bounty for exclusive use of the first 'gavintaint' (as disgusting as that sounds) and it is fairly cheap to spam a block to capacity...since all the Libertarians here are dead-set on subsidizing Bitcoin for the huddled masses to use.  An amazing quality of Bitcoin is that it can turn Libertarians into Socialists.  Go figure.


So....  First, I have no idea where this "bounty" comes from.  Who would pay it, and why?  Second, it'll be an orphaned block unless the majority of hash power is already on the B version, so where could a 'taint' come from anyway?  Unless the majority of hashing power accepts that block as valid, the chain it's in will be orphaned long before its coinbase can be spent.  And if it's orphaned, none of the other clients are going to accept that block as evidence that a transaction has happened; they'll just ignore it, repeat its transactions in <1MB blocks, and the world goes on.

I think maybe you believe that someone has motivations they don't have.


Like I said earlier, I will wish to send myself transactions on your 'B' chain as soon as I can.  These I will be able to spend on the 'B' chain while they remain unspent on the 'A' chain, which is the chain I am betting on for long-duration survival and usefulness.

Making sure that transactions are tainted with coinbase from 'B' will ensure that those transactions can never be re-played on 'A'.  Thus, I'll want to get some coinbase from the first block which is over 1MB as soon as possible.

Since this splitting operation is just sending transactions from one hand to the other (I control both the spend and receive keys), these operations should be safe even if there is a lot of trouble with re-orgs and so on.  If a deep re-org is necessary due to some oversight or attack or whatever, that's fine.  I'll just try it again a bit later.
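To make the splitting trick concrete, here is a toy sketch. The UTXO class and chain tags below are invented purely for illustration (this is not wallet code; real taint comes from spending actual outputs with real signatures):

Code:
# Toy model of replay protection via fork-only coinbase 'taint'.
# Hypothetical classes for illustration only; not wallet code.

class UTXO:
    def __init__(self, txid, chains):
        self.txid = txid
        self.chains = chains   # set of chain names on which this output exists

def spend(inputs, new_txid):
    # A transaction can only confirm on chains where ALL of its inputs exist.
    valid_on = set.intersection(*(i.chains for i in inputs))
    if not valid_on:
        raise ValueError("inputs never coexist on any one chain")
    return UTXO(new_txid, valid_on)

# Pre-fork coins exist on both chains, so a transaction spending only
# these can be replayed on either side of the split.
pre_fork = UTXO("old_coin", {"A", "B"})

# The coinbase of the first >1MB block exists only on the 'B' chain.
b_taint = UTXO("b1_coinbase", {"B"})

split = spend([pre_fork, b_taint], "split_tx")
print(split.chains)   # {'B'} -- this spend can never be replayed on 'A'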

legendary
Activity: 924
Merit: 1132
February 12, 2015, 02:08:57 AM
...
But eventually after six months or so, let's say I get a 1,100-kilobyte block (block B1, built on block A0).  It goes out on the network, and all the people
...

Ha!  Try six minutes or so.  I'll bet there will be a pretty nice bounty for exclusive use of the first 'gavintaint' (as disgusting as that sounds) and it is fairly cheap to spam a block to capacity...since all the Libertarians here are dead-set on subsidizing Bitcoin for the huddled masses to use.  An amazing quality of Bitcoin is that it can turn Libertarians into Socialists.  Go figure.


So....  First, I have no idea where this "bounty" comes from.  Who would pay it, and why?  Second, it'll be an orphaned block unless the majority of hash power is already on the B version, so where could a 'taint' come from anyway?  Unless the majority of hashing power accepts that block as valid, the chain it's in will be orphaned long before its coinbase can be spent.  And if it's orphaned, none of the other clients are going to accept that block as evidence that a transaction has happened; they'll just ignore it, repeat its transactions in <1MB blocks, and the world goes on.

I think maybe you believe that someone has motivations they don't have.
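A toy simulation can rough out the orphaning argument. This is a sketch under simplifying assumptions (a constant hash-power split, no cross-rule reorgs), not real consensus code:

Code:
# Toy simulation: old ('A') nodes reject >1MB ('B') blocks outright,
# so each side can only extend the branch it considers valid.
import random

def simulate(p_big, blocks=1000, seed=1):
    """p_big is the fraction of hash power mining >1MB blocks."""
    random.seed(seed)
    height_a = height_b = 0
    for _ in range(blocks):
        if random.random() < p_big:
            height_b += 1   # a big-block miner wins this round
        else:
            height_a += 1   # a 1MB miner wins this round
    return height_a, height_b

for p in (0.2, 0.5, 0.8):
    a, b = simulate(p)
    outcome = "B outpaces A" if b > a else "B branch stalls; A carries on"
    print(f"{p:.0%} big-block hash power: A height {a}, B height {b} -> {outcome}")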

legendary
Activity: 1372
Merit: 1008
1davout
February 12, 2015, 02:07:05 AM
Normally I am very negative towards TBF, but we have to give them a little credit for this small gesture at trying to increase decentralization, with ~1k being awarded to increase node count this year --

https://getaddr.bitnodes.io/nodes/incentive/

https://bitcointalksearch.org/topic/bitnodes-incentive-program-952996

So basically, there's a current shortage of nodes, which leads TBF to actually pay people to run some.
And at the same time they're considering making blocks larger?
sr. member
Activity: 532
Merit: 251
February 11, 2015, 11:51:13 PM
I suspect you are not standing atop the great pyramid with me, which I encourage because the perspective is truly enlightening:


You understand a lot of things.
Do you understand that it may not be necessary to guess?
Don't you think it would be better to measure than to guess?

We have something that will be there in the future telling us how big blocks can safely be: the block chain itself can tell us.
We do this already with difficulty, which is adjusted according to how frequently acceptable hashes are found.

If we are going from a fixed limit to a flexible one, let us at least attempt to do it so that we are no longer guessing.
We are determining a certain definition of what is to become a very special measuring device, because it is one born of consensus. Because of this it is in fact necessary that there is both some guess and some form of communal error versus what would be ideal (because one cannot tell the future, and mankind to this point is perpetually in conflict).

THAT is the true problem the architect realizes. You can convince the elite of this, but the issue is rather that the masses wish bitcoin to be many things that, alas, it cannot be.  This is upsetting to the religious masses but not to the top "scientists", or rather the "priests" and "advisors".

So what we are seeing is rather that we must reach consensus on a stagnant, dead, fixed decision, and allow what grows around it to be that which is "adjustable".  Unless there can be no consensus in that direction, in which case it seems everyone loses. (It shouldn't be seen as irrelevant that the Greek/euro decision is to be discussed and announced soon and might eventually dissolve the euro.)
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 11, 2015, 11:31:55 PM
Miles were different too, and quite a number of other standards.

Our disagreement seems based on when and whether consensus is more important than quality, and on the difference between standards and protocols.  I'd aver that it is quality which creates consensus in protocols.

This debate is not over what makes a standard unit of measurement.  I'll suggest to you that we are not designing a standard; it is a protocol, which is an entirely different thing.  A megabyte will still be a megabyte, and a block will still be a block.

Where an authority may deem a standard, it ought not deem a protocol.
This is why the connections I have made are so significant.  Szabo's seamless comparison of software and the Wealth of Nations allows us to completely bridge the "data" from our economic history, and we are then clearly able to see the connection of things such as the arising of the pyramids as a product of a secure society. We ask "how can they be built so accurately?", when the reality is they could NOT have been so secure and wealthy without such accuracy (again trying not to suggest which is the cause and which the effect).

And so it is also important to ask "what is ideal money?", which John Nash has extensively explained and extrapolated for us (and I am seemingly by far the most-read in the world on this issue (and would love to meet someone who can claim otherwise)), and then there is THIS poster, who is the most knowledgeable on the concept of "ideal poker" (the evolution of "skilled games").

But you allude to a disconnection of words, and/or language; in other words, when it comes to protocol you refer to what is "ideal dialogue", and this too has been extensively explained, by Dr. David Bohm (both Bohm and Nash have interesting connections to Einstein): http://sprott.physics.wisc.edu/chaos-complexity/dialogue.pdf

And so rather than your point about the differences between protocol and measurement standing, I think our point about the similarity between these things is more "relevant", or deserves to be relevated, or needs relevation.  In other words, the pyramids being the bedrock of the civilization were the necessary measuring device or basepoint for all aspects, including language (the basis of protocol/communication).  Clearly we didn't forget this, but we didn't understand their relations in the first place (or for a very long time).  It cannot be irrelevant that we lost our understanding of all of this, as well as the actual Egyptian language itself, until 150 or so years ago.


Bribe me with a solution that will not have to be revisited again in the future.
  
One that is not based on a prediction of the future but that adjusts with the needs of the future automatically.
Give me blocks that are as big as needed but not bigger, and protect us from rogue miners that may want to knock lower-bandwidth miners out or increase the storage costs for everyone running a full node by loading up blocks with massive transaction bloat.

There have been such proposals made; they are not as simple as Gavin's.  If we have a solution that provides good confidence that we never have to have this discussion again, that is a bribe I would accept.  Why go from one wrong number to another?
Yes, exactly, and this we must understand is part of the problem (and so arises the solution): clearly, in the future such a consensus may in fact be impossible. But we must also understand that, in choosing the bedrock unit for society that will come out of the block size decision, we have only the very little empirical evidence that could be gathered since block-chain genesis.

And so Gavin's best guess is by far the most logical best guess available. There cannot be a better extrapolation of the future (at this time), and so the ideal here becomes the consensus that is closest to the architect's suggestion.

You understand a lot of things.
Do you understand that it may not be necessary to guess?
Don't you think it would be better to measure than to guess?

We have something that will be there in the future telling us how big blocks can safely be: the block chain itself can tell us.
We do this already with difficulty, which is adjusted according to how frequently acceptable hashes are found.

If we are going from a fixed limit to a flexible one, let us at least attempt to do it so that we are no longer guessing.
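One way "measuring instead of guessing" could look, sketched by analogy with difficulty retargeting. The window, headroom, growth bound, and floor below are invented numbers for illustration, not any concrete proposal:

Code:
# Sketch: retarget the block size cap from observed sizes, like difficulty.
import statistics

WINDOW = 2016          # blocks per adjustment period, like difficulty
HEADROOM = 2.0         # cap sits this far above typical observed demand
MAX_GROWTH = 2.0       # cap can at most double per period
FLOOR = 1_000_000      # never drop below the current 1MB

def next_limit(current_limit, recent_sizes):
    """Measure demand as the median size of the last WINDOW blocks; the
    median resists a single miner stuffing a few giant blocks."""
    assert len(recent_sizes) == WINDOW
    measured = statistics.median(recent_sizes)
    wanted = HEADROOM * measured
    bounded = min(wanted, MAX_GROWTH * current_limit)
    return max(int(bounded), FLOOR)

# Blocks running ~900kB would push the cap to 1.8MB next period.
print(next_limit(1_000_000, [900_000] * WINDOW))   # 1800000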
hero member
Activity: 742
Merit: 500
February 11, 2015, 11:30:55 PM
Can someone please provide me the following info:

-where will the outcome of the discussion be announced?

It will be propagated across the bitcoin network.


-if they would be pushing for the hardfork: where will it be announced first?

the block chain network

-how to get most up to date info on the decisions?

block chain

-who is making the final decisions?

the consensus of the network

thanks for briefly letting me know
My definitions are terrible, etc., but we need to understand the true nature of the problem we face.

yes, I know. The Gavincoin proposal is an attack on bitcoin. I just wanted to figure out which public channels to watch so I can be among the first to sell in case the community is forked and bitcoin hijacked.
I guess I'll just sell now and wait until the dust settles. Won't go anywhere before the halving anyway.
I think I might buy MPcoin later if it comes to the MCA.
sr. member
Activity: 532
Merit: 251
February 11, 2015, 11:17:49 PM
Miles were different too, and quite a number of other standards.

Our disagreement seems based on when and whether consensus is more important than quality, and on the difference between standards and protocols.  I'd aver that it is quality which creates consensus in protocols.

This debate is not over what makes a standard unit of measurement.  I'll suggest to you that we are not designing a standard; it is a protocol, which is an entirely different thing.  A megabyte will still be a megabyte, and a block will still be a block.

Where an authority may deem a standard, it ought not deem a protocol.
This is why the connections I have made are so significant.  Szabo's seamless comparison of software and the Wealth of Nations allows us to completely bridge the "data" from our economic history, and we are then clearly able to see the connection of things such as the arising of the pyramids as a product of a secure society. We ask "how can they be built so accurately?", when the reality is they could NOT have been so secure and wealthy without such accuracy (again trying not to suggest which is the cause and which the effect).

And so it is also important to ask "what is ideal money?", which John Nash has extensively explained and extrapolated for us (and I am seemingly by far the most-read in the world on this issue (and would love to meet someone who can claim otherwise)), and then there is THIS poster, who is the most knowledgeable on the concept of "ideal poker" (the evolution of "skilled games").

But you allude to a disconnection of words, and/or language; in other words, when it comes to protocol you refer to what is "ideal dialogue", and this too has been extensively explained, by Dr. David Bohm (both Bohm and Nash have interesting connections to Einstein): http://sprott.physics.wisc.edu/chaos-complexity/dialogue.pdf

And so rather than your point about the differences between protocol and measurement standing, I think our point about the similarity between these things is more "relevant", or deserves to be relevated, or needs relevation.  In other words, the pyramids being the bedrock of the civilization were the necessary measuring device or basepoint for all aspects, including language (the basis of protocol/communication).  Clearly we didn't forget this, but we didn't understand their relations in the first place (or for a very long time).  It cannot be irrelevant that we lost our understanding of all of this, as well as the actual Egyptian language itself, until 150 or so years ago.


Bribe me with a solution that will not have to be revisited again in the future.
  
One that is not based on a prediction of the future but that adjusts with the needs of the future automatically.
Give me blocks that are as big as needed but not bigger, and protect us from rogue miners that may want to knock lower-bandwidth miners out or increase the storage costs for everyone running a full node by loading up blocks with massive transaction bloat.

There have been such proposals made; they are not as simple as Gavin's.  If we have a solution that provides good confidence that we never have to have this discussion again, that is a bribe I would accept.  Why go from one wrong number to another?
Yes, exactly, and this we must understand is part of the problem (and so arises the solution): clearly, in the future such a consensus may in fact be impossible. But we must also understand that, in choosing the bedrock unit for society that will come out of the block size decision, we have only the very little empirical evidence that could be gathered since block-chain genesis.

And so Gavin's best guess is by far the most logical best guess available. There cannot be a better extrapolation of the future (at this time), and so the ideal here becomes the consensus that is closest to the architect's suggestion.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 11, 2015, 11:07:05 PM
If we are against the royal cubit (20MB), then I should like to understand, from the individual's perspective, how much you could be bribed to accept and adopt it...or what your counter-offer might be...

thanks

(are we standing on the pyramids together?
Quote
Launch the interactive, choose “khufu” and then “view from top”. Stand on the pyramids: http://www.pbs.org/wgbh/nova/ancient/explore-ancient-egypt.html)

Bribe me with a solution that will not have to be revisited again in the future.
 
One that is not based on a prediction of the future but that adjusts with the needs of the future automatically.
Give me blocks that are as big as needed but not bigger, and protect us from rogue miners that may want to knock lower-bandwidth miners out or increase the storage costs for everyone running a full node by loading up blocks with massive transaction bloat.

There have been such proposals made; they are not as simple as Gavin's.  If we have a solution that provides good confidence that we never have to have this discussion again, that is a bribe I would accept.  Why go from one wrong number to another?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 11, 2015, 11:00:37 PM
In the ideal case, we would discover this method of calculation, and that would let us avoid future forks over this same issue in case it ever must be re-addressed in order to "save Bitcoin".
Ah, I think you've missed my perspective!  And possibly the very nature of this problem.  First, in relation to money and what is considered "ideal", we should definitely consider 20 years of lectures on the subject: http://sites.stat.psu.edu/~babu/nash/money.pdf

And then of course we must consider what it is, in fact, that does create the wealth of the wealthiest "nations" http://www.gutenberg.org/files/3300/3300-h/3300-h.htm (AN INQUIRY INTO THE NATURE AND CAUSES OF THE WEALTH OF NATIONS. By Adam Smith)

It might be helpful as well, if you would be so kind as to stand atop the great pyramid whilst we discuss this important matter:

Quote from: thewealthofchips
Launch the interactive, choose “khufu” and then “view from top”. Stand on the pyramids: http://www.pbs.org/wgbh/nova/ancient/explore-ancient-egypt.html

And so truly we end up with the realization that we CANNOT in fact have the ideal case, in which by discovering a method or measurement we might then be able to choose what is ideal.  Because we are simultaneously asking the question of what that method should be.  Or in other words: because of a shit ton of backwards reverse-engineering-type maths, there has been a realization that there can be no consensus on the ideal.  Consensus will only come on something that is not ideal.

In other words there MUST be a single static consensus, and it cannot therefore be ideal...and so in light of such insanity the only logical solution would be to take the architect's advice, and I simply point out the relation of his specific advice to the royal cubit.  We will tell the masses it is related perfectly to the pyramids, and that will make sense to them:

Quote from: wiki
In Ancient Egypt, cubit rods were used for the measurement of length. A number of these have survived: two are known from the tomb of Maya, the treasurer of Tutankhamun, in Saqqara; another was found in the tomb of Kha (TT8) in Thebes. Fourteen such rods, including one double cubit rod, were described and compared by Lepsius in 1865.[6] These cubits range from 523 to 529 mm (20.6 to 20.8 in) in length, and are divided into seven palms; each palm is divided into four fingers and the fingers are further subdivided.[4][6][7]

Miles were different too, and quite a number of other standards.

Our disagreement seems based on when and whether consensus is more important than quality, and on the difference between standards and protocols.  I'd aver that it is quality which creates consensus in protocols.

This debate is not over what makes a standard unit of measurement.  I'll suggest to you that we are not designing a standard; it is a protocol, which is an entirely different thing.  A megabyte will still be a megabyte, and a block will still be a block.

Where an authority may deem a standard, it ought not deem a protocol.
hero member
Activity: 658
Merit: 501
February 11, 2015, 10:53:18 PM
Normally I am very negative towards TBF, but we have to give them a little credit for this small gesture at trying to increase decentralization, with ~1k being awarded to increase node count this year --

https://getaddr.bitnodes.io/nodes/incentive/

https://bitcointalksearch.org/topic/bitnodes-incentive-program-952996
sr. member
Activity: 532
Merit: 251
February 11, 2015, 10:48:20 PM
If we are against the royal cubit (20MB), then I should like to understand, from the individual's perspective, how much you could be bribed to accept and adopt it...or what your counter-offer might be...

thanks

(are we standing on the pyramids together?
Quote
Launch the interactive, choose “khufu” and then “view from top”. Stand on the pyramids: http://www.pbs.org/wgbh/nova/ancient/explore-ancient-egypt.html)
legendary
Activity: 4690
Merit: 1276
February 11, 2015, 10:22:13 PM
...
But eventually after six months or so, let's say I get a 1,100-kilobyte block (block B1, built on block A0).  It goes out on the network, and all the people
...

Ha!  Try six minutes or so.  I'll bet there will be a pretty nice bounty for exclusive use of the first 'gavintaint' (as disgusting as that sounds) and it is fairly cheap to spam a block to capacity...since all the Libertarians here are dead-set on subsidizing Bitcoin for the huddled masses to use.  An amazing quality of Bitcoin is that it can turn Libertarians into Socialists.  Go figure.

'gavintaint' is kind of like the bathtub ring left by The Cat In The Hat, so it won't be hard to come by, but it might be handy to be the first guy to get access to some.  I predict that as soon as gavintaint is available there will be a big rush by many hodlers to double up.  This should fill your 'B' blocks for some time.

legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
February 11, 2015, 10:21:53 PM
In the ideal case, we would discover this method of calculation, and that would let us avoid future forks over this same issue by the limit adjusting as needed, so that it never must be re-addressed in order to "save Bitcoin".
The ideal case is to recognize that whatever problem a protocol limit is believed to solve, it's being solved the wrong way, then solve the problem a better way, then remove the unnecessary limit.
That would be fine too... better, even.  However, other than a block size limit, what might solve the problem of too-large (even absurdly large) blocks?  Large blocks have a slightly higher chance of being orphaned, but they also have the counterbalancing effect of being a type of 'innocent' selfish mining.

This effect is stronger against smaller mining concerns or lower-bandwidth pools.
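The orphan-risk trade-off can be roughed out numerically. A back-of-envelope sketch, assuming Poisson block arrivals and an invented effective propagation bandwidth (both assumptions, not measurements):

Code:
# Orphan risk vs block size, using the common approximation
# P(orphan) = 1 - exp(-t_prop / 600) for Poisson block arrivals.
import math

AVG_INTERVAL = 600.0          # target seconds between blocks

def orphan_risk(block_bytes, bandwidth_bps=8_000_000):
    # Model propagation as pure transmission time across the network.
    t_prop = block_bytes * 8 / bandwidth_bps
    return 1 - math.exp(-t_prop / AVG_INTERVAL)

for mb in (1, 8, 20):
    print(f"{mb:>2}MB block: ~{orphan_risk(mb * 1_000_000):.2%} orphan risk")

Note the asymmetry: the miner of a big block never orphans itself while the block propagates, so the penalty falls disproportionately on smaller, slower-connected competitors, which is the 'innocent' selfish-mining effect described above.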
sr. member
Activity: 532
Merit: 251
February 11, 2015, 09:27:09 PM
Can someone please provide me the following info:

-where will the outcome of the discussion be announced?

It will be propagated across the bitcoin network.


-if they would be pushing for the hardfork: where will it be announced first?

the block chain network

-how to get most up to date info on the decisions?

block chain

-who is making the final decisions?

the consensus of the network

thanks for briefly letting me know
My definitions are terrible, etc., but we need to understand the true nature of the problem we face.
hero member
Activity: 742
Merit: 500
February 11, 2015, 09:23:33 PM
Can someone please provide me the following info:

-where will the outcome of the discussion be announced?
-if they would be pushing for the hardfork: where will it be announced first?
-how to get most up to date info on the decisions?
-who is making the final decisions?


thanks for briefly letting me know
sr. member
Activity: 532
Merit: 251
February 11, 2015, 09:21:21 PM
So we are necessarily left with:

Quote
Change or No Change.
Limit or No Limit.
Architect logic or Sub-Architect logic.
Each of which (aside from no change) requires consensus.

It seems then that the only reasonably favorable direction is to try to change the limit to some limit as close to the architect's logic as possible.  And of course it must then be mostly the community's doing because, for example, Gavin does not want it to be called Gavincoin (and no intelligent person would want it to be "their-name"-coin).

Looking from that certain perspective: we have Deal or No Deal versus those that might help secure the consensus (but are not so happy about doing so)...and our most favorable size is the royal cubit...

And an offer then to those that oppose it:

How much would you take (and in what form of payment) to sway you to, for example, 20MB...and/or what is your utility function in relation to different block sizes?  Will you take a specific different size...or will you take a lower bribe for a size somewhat closer to the royal cubit?

It seems important that we understand this problem in order to be able to relevate the solution.  And so it might then be helpful that I am the world's best poker player (provided Satoshi owns the banking system), weighing exactly on the exact outcome of this problem (i.e. if Satoshi owns the financial system, I own poker).

full member
Activity: 137
Merit: 100
February 11, 2015, 08:58:41 PM
In the ideal case, we would discover this method of calculation, and that would let us avoid future forks over this same issue by the limit adjusting as needed, so that it never must be re-addressed in order to "save Bitcoin".
The ideal case is to recognize that whatever problem a protocol limit is believed to solve, it's being solved the wrong way, then solve the problem a better way, then remove the unnecessary limit.
I don't think the limit should be 100% removed, as without it someone could create ridiculously large blocks so that other nodes would have difficulty remaining synchronized with the network. That in turn would cause more orphaned blocks.
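For a sense of scale (assuming, purely for illustration, a node on a 1 Mbit/s link): a 1MB block transfers in about 8 seconds, but an unbounded 1GB block would take roughly 8,000 seconds, more than thirteen 10-minute block intervals, so such a node could never stay synchronized.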
sr. member
Activity: 532
Merit: 251
February 11, 2015, 08:51:08 PM
What if the degree of fragmentation is being heavily exaggerated or overestimated by one side?
Such a question seems completely reasonable, and comparable to a (simple) game theory problem, no (i.e. bluffs and credibility, etc.)?

May I ask, Sir/Madame, are you standing on top of the "great pyramid" with me?  Might we see this together (from the same view/vantage)?

Quote
Launch the interactive, choose “khufu” and then “view from top”. Stand on the pyramids: http://www.pbs.org/wgbh/nova/ancient/explore-ancient-egypt.html
legendary
Activity: 1400
Merit: 1013
February 11, 2015, 08:49:14 PM
If an important decision fragments the community too much, then nothing really matters: you dissolve the entire movement.  We shouldn't need to show the maths or computations for that.  I'm not sure we realize this is the difficulty, and the only difficulty.  Those arguing clear points, from clearly drawn sides, do not seem to fully grasp the difficulty of the issue.  In short, the trade-offs do not permit "ideal-ness", and the logical argument is not necessarily going to align a large enough consensus.
What if the degree of fragmentation is being heavily exaggerated or overestimated by one side?