
Topic: Gold collapsing. Bitcoin UP. - page 10. (Read 2032140 times)

legendary
Activity: 1400
Merit: 1009
August 17, 2015, 04:11:28 PM
But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid.
I mentioned this, idk about a hundred pages back, in the form of talking about what mining is actually for.

The only reason we need a P2P network and mining at all is to resolve double spending. Most of the work of validating transactions is stateless, except for two proofs that only a mined blockchain can produce:

  • The inputs to the transaction exist
  • No transaction exists which spends the same inputs
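These two proofs can be sketched as a check against a UTXO set. The function and data below are hypothetical and purely illustrative; everything outside these two checks can be done statelessly:

```python
# Minimal sketch of the two chain-dependent checks: a transaction's
# inputs must exist in the UTXO set, and no prior transaction may have
# already spent them.

def validate_against_chain(tx_inputs, utxo_set, spent_set):
    """tx_inputs: iterable of (txid, vout) outpoints."""
    for outpoint in tx_inputs:
        if outpoint not in utxo_set:   # proof 1: the input exists
            return False
        if outpoint in spent_set:      # proof 2: it is not double-spent
            return False
    return True

utxos = {("aa", 0), ("bb", 1)}
spent = {("bb", 1)}
assert validate_against_chain([("aa", 0)], utxos, spent) is True
assert validate_against_chain([("bb", 1)], utxos, spent) is False  # double spend
assert validate_against_chain([("cc", 0)], utxos, spent) is False  # missing input
```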

There are really only two ways that mining can fail:

  • A miner can perform a double spend
  • A miner can execute a denial of service against valid transactions

The point of proof of work is to raise the cost of both of those attacks. Bitcoin has never made either attack impossible in a mathematical sense (and doing so is probably impossible) - all Bitcoin ever did was put a defined cost on those two attacks.
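That "defined cost" can be illustrated with the simplified catch-up probability from the white paper: an attacker controlling hashrate fraction q, starting z blocks behind, catches up with probability roughly (q/p)^z where p = 1 - q. This sketch ignores the Poisson term of the full formula, so treat it as a flavor of the argument, not the exact result:

```python
# Hedged sketch: deeper confirmations raise the cost of a double spend
# exponentially, which is the "defined cost" PoW imposes.

def catch_up_probability(q: float, z: int) -> float:
    """Probability a minority attacker (hashrate share q) erases a
    transaction buried under z confirmations, simplified (q/p)^z form."""
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker always catches up eventually
    return (q / p) ** z

# e.g. a 30% attacker against 6 confirmations: (0.3/0.7)^6 ~ 0.006
assert catch_up_probability(0.3, 6) < 0.01
assert catch_up_probability(0.1, 6) < 1e-5
assert catch_up_probability(0.6, 6) == 1.0
```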

If people would stop arguing about undefined terms like "decentralization", then maybe instead we could talk about attacks in terms of ways an attacker might reduce the proof of work cost for performing double spending or DoS attacks.

It should be possible to stop worrying about miners and all the ways they might behave sub-optimally, as long as they don't have any way to avoid paying the specified PoW cost for any attacks they might perform.

Viewed in these terms, having any protocol-mandated block size limit at all is a built-in denial-of-service attack, and so it should be removed as soon as possible.
legendary
Activity: 1764
Merit: 1002
August 17, 2015, 04:10:26 PM
what a laugh.

before Peter's paper, Cripplecoiners were arguing that IBLT wasn't proven, practical, needed, or likely to be accepted, since we had SCs and LN on the horizon.

now that the argument has changed, they hold up IBLT as "inevitable" and destined to eliminate the block propagation latency that Peter's theory relies upon.

such duplicity.

 Huh

Care to support this with quotes, or are you just pulling things out of your ass, as is the norm for you?

Who are the "cripplecoiners" in a story where the debate hadn't even formed yet?

solex, who has been one of this thread's most prominent experts on IBLT, should be able to back up my claim of Blockstream core dev resistance to implementing IBLT.  furthermore, all one needs to know is that the original IBLT proposal came from Gavin, and you know how Todd and BS react to anything Gavin does.
legendary
Activity: 1764
Merit: 1002
August 17, 2015, 04:06:55 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid.  

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.



I think this is confusing a protocol-enforced "limit" with market preferences.

There cannot be disagreement on the protocol-enforced limit, which is what your right-hand graph shows. If there were, then miners that issued larger blocks would get forked off by every miner with a lower limit.

I think the current situation is also more representative of the right-hand graph than the left. Today all miners have a fixed protocol limit of 1MB, but many prefer smaller blocks; for example, the stress tests showed just how many still had the 750KB soft limit in place. So in practice we have the right-hand graph today.

What is needed instead is to get rid of the protocol limit in practice (perhaps keeping a high-water anti-spam limit, which is what the 1MB was/is), while letting the market show its preferences. This would be like a combination of the two graphs: an anti-spam limit far off to the right of the graph, and below that, miners showing a range of preferences on block sizes they are willing to both issue and accept, which looks like your graph on the right.

This situation probably leads to a loose and dynamic form of market consensus on sizes. Miners that decide to accept only blocks well below most other miners' preference risk being orphaned at a higher rate, and so are forced to raise the size they accept to better match other miners. At the same time, miners that issue blocks larger than what most other miners are willing to build on also risk being orphaned at a higher rate. The result is that miners are forced by market pressures to move toward a consensus.

This is where we should be, and the hard protocol limit prevents the market from properly functioning.

i was thinking the same thing.  but in practice, i think the graph on the right will be much steeper approaching the limit, as drawn on the left graph.  IOW, all miners will have to stay relatively close together to prevent the dynamic you just outlined: being forked off by miners with more divergent block size limits.  they then have to tip-toe upwards.
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
August 17, 2015, 04:06:42 PM
what a laugh.

before Peter's paper, Cripplecoiners were arguing that IBLT wasn't proven, practical, needed, or likely to be accepted, since we had SCs and LN on the horizon.

now that the argument has changed, they hold up IBLT as "inevitable" and destined to eliminate the block propagation latency that Peter's theory relies upon.

such duplicity.

 Huh

Care to support this with quotes, or are you just pulling things out of your ass, as is the norm for you?

Who are the "cripplecoiners" in a story where the debate hadn't even formed yet?
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
August 17, 2015, 04:02:30 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid.  

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

I notice you did not reply or comment on my opinion of you guys' solutions... Maybe you missed it, maybe you don't care to answer? Either way, I'd be curious to better understand the logic behind your proposal.

...


I honestly don't know what you're asking me.  What you quoted seemed like substance-less hand-waving to me.

All we're proposing is to return to the design proposed in the Bitcoin white paper, where "nodes accept the block only if all transactions in it are valid and not already spent" (#5):



Furthermore, I do not take Bitcoin's success as inevitable.  It may fail.  Right now, I see the biggest risk as developer centralization.  The risk of miner centralization due to adjusting an anti-spam measure seems minor in comparison.  

Minor? Surely you've missed this post

Quote
Bitcoin is already centralized. Do you realize that a cabal of a half-dozen people (I'm not talking about the developers) have the power, even if they have yet to exercise it, to arbitrarily control bitcoin? That this power also rests with anyone who controls the networks used by this cabal, which is presently confined to a small number of datacenters? That if they were in the US all it would take is a couple of national security letters for full control over the bitcoin network? I assure you the Chinese government has much stronger strings to pull.
The story of bitcoin over the last two years has been struggling hard to keep bitcoin decentralized in step with wider usage. In this effort we are floundering -- bitcoin scales far better today than it did in early 2013 (at which time it couldn't have even supported today's usage), but centralization pressures have been growing faster still.
That is a story that most people who work on bitcoin scalability can relate to, but which doesn't seem to be commonly understood among the casual userbase.
https://www.reddit.com/r/Bitcoin/comments/3h7eei/greg_luke_adam_if_xt_takes_over_and_wins_the/cu53eq3

Substance-less? Your whole argument and proposal are based on substance-less arguments and assumptions. I explicitly showed why the idea of having more people decide on an arbitrary number "because free market" is utterly broken and totally ignores the dynamics at stake.
legendary
Activity: 1764
Merit: 1002
August 17, 2015, 04:00:29 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid. 

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the Blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chances that his block is orphaned with his desire to create a larger block.  If we imagine that the block size limit across the network forms some distribution as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

I notice you did not reply or comment on my opinion of you guys solutions... Maybe you missed it, maybe you don't care to answer? Either way I'd be curious to better understand the logic behind your proposal.

Because if that's all this hinges on:

cost of mining a larger block = self-regulating block sizes

We're in a world of problems.


Quote
VI. This is a clerical issue, because block propagation and other considerations incentivize miners to keep blocks small anyway. The 1MB is just a hard limit getting in the way of things, the marketplace of miners should be allowed to fix block size as it seems appropriate.

While this argument has been disingenuously advanced by Gavin himself, the fact is that the proposed invertible Bloom lookup table (IBLT) upgrade would allow all blocks to propagate in constant time, regardless of their size.

Some might still ask: why is that?

Quote
davout: gavinandresen: "oh, the IBLT stuff? yes, that’d make propagation O(1)" <<< so with that, there's no network bottleneck anymore, at least no real incentive for miners to keep blocks small, right?

gavinandresen:davout: Miners would only have the meta-incentive of “we can collectively maximize revenue if we make blocks THIS big”

Except miners are not a single person. They are multiple, geographically diverse interest groups, each bounded by different resources, costs, and infrastructure. I happen to think that this is what is broken with the "nodes and miners should be able to decide on whatever block size they like" proposition. I can also see clear as day through the attempt of many here at rationalizing this behavior as "the free market decides best; how dare you propose centrally designed SPAM CONTROL."

The assumption you seem to make is that miners & nodes (through the magic of the "invisible hand", I suppose) will arrive at an equilibrium of decentralization in some kind of benevolent act, "because incentives & game theory". If we consider the argument about the cost of creating large blocks moot, the rationale then becomes: miners will act altruistically to conserve trust in the network.

These points are not very clear to me. I cannot imagine a scenario where several resourceful corporations do not turn this into an arms race that few will be able to keep up with. We are only now beginning to see mining and network infrastructure enter a professional stage. If the incentive to mine Bitcoin increases, the seemingly amateur, small-scale setups will soon be erased from the network and replaced by massive datacenters that will outnumber these small players, making their "voice" in the balancing exercise of decentralization vs. block size worthless.

You might imagine that as "bitcoiners" realize this issue they will "protest", but I suggest that by that point (a) you will not actually be able to become aware of the problem, and (b) there will be nothing to be done about it, as the network will have become "captured" through "network ossification" and the general laziness of the herd, which prefers comfort and stability over change and doubt.

what a laugh.

before Peter's paper, Cripplecoiners were arguing that IBLT wasn't proven, practical, needed, or likely to be accepted, since we had SCs and LN on the horizon.

now that the argument has changed, they hold up IBLT as "inevitable" and destined to eliminate the block propagation latency that Peter's theory relies upon.

such duplicity.
newbie
Activity: 28
Merit: 0
August 17, 2015, 04:00:24 PM
[...]

I just gave you permission via a Reddit PM, subject to (1) putting me as second author, to reflect the fact that you've contributed more than I have at this point, and (2) making it clear that this is a WORKING DRAFT. This could be done with a subtitle on the first page, or a watermark on every page.

Yes, done, thank you. Here is the post on /r/Bitcoin:

https://www.reddit.com/r/Bitcoin/comments/3hcrmn/new_blocksize_bip_user_configurable_maximum_block/

and here on /r/Bitcoin_uncensored:

https://www.reddit.com/r/bitcoin_uncensored/comments/3hcs6o/new_blocksize_bip_user_configurable_maximum_block/
legendary
Activity: 1153
Merit: 1000
August 17, 2015, 04:00:06 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid.  

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.



I think this is confusing a protocol-enforced "limit" with market preferences.

There cannot be disagreement on the protocol-enforced limit, which is what your right-hand graph shows. If there were, then miners that issued larger blocks would get forked off by every miner with a lower limit.

I think the current situation is also more representative of the right-hand graph than the left. Today all miners have a fixed protocol limit of 1MB, but many prefer smaller blocks; for example, the stress tests showed just how many still had the 750KB soft limit in place. So in practice we have the right-hand graph today.

What is needed instead is to get rid of the protocol limit in practice (perhaps keeping a high-water anti-spam limit, which is what the 1MB was/is), while letting the market show its preferences. This would be like a combination of the two graphs: an anti-spam limit far off to the right of the graph, and below that, miners showing a range of preferences on block sizes they are willing to both issue and accept, which looks like your graph on the right.

This situation probably leads to a loose and dynamic form of market consensus on sizes. Miners that decide to accept only blocks well below most other miners' preference risk being orphaned at a higher rate, and so are forced to raise the size they accept to better match other miners. At the same time, miners that issue blocks larger than what most other miners are willing to build on also risk being orphaned at a higher rate. The result is that miners are forced by market pressures to move toward a consensus.

This is where we should be, and the hard protocol limit prevents the market from properly functioning.
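The convergence dynamic described above can be sketched with a toy simulation. The parameters are purely illustrative (made-up limits and adjustment rate, no claim about real miner behavior): miners whose acceptance limits sit far from the rest of the hashrate get orphaned more often and nudge their limits toward the pack, with no protocol limit involved:

```python
# Toy model: each miner's accepted block size drifts toward the median
# preference, standing in for the orphan-rate pressure described above.

import statistics

def step(limits, rate=0.3):
    """One round of adjustment: orphan pressure pulls every miner's
    limit a fraction of the way toward the median preference."""
    median = statistics.median(limits)
    return [l + rate * (median - l) for l in limits]

limits = [0.6, 0.9, 1.0, 1.4, 2.5]   # MB each miner will build on
for _ in range(20):
    limits = step(limits)

spread = max(limits) - min(limits)
assert spread < 0.01   # preferences converge to a loose market consensus
```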
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
August 17, 2015, 03:58:52 PM
[...]

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

[...]
  


Fully agreed. To add to this: this situation exists even today with 1MB blocks. A node might only support 900kB blocks on average, bandwidth- or CPU-wise (e.g. some old ARM board), and will drop off when the network actually moves to Mike's cliff by saturating at 1MB.


and this is precisely why big money outside of China interested in mining will support this proposal, b/c it allows mining to decentralize away from China as a result of the GFC.  specifically, the USA should want to see this.  the feedback effects on China will only further this dynamic, as their gov't will relax GFC rules if they want to stay in the game.  all good for Bitcoin in general.

Oh my god  Cheesy Cheesy

To paraphrase:

Fuck China, let them choke on blocks they can't handle. Meanwhile, let's centralize mining infrastructure in high-bandwidth locations until the communist party realizes its mistake and gives everyone more freedom, or until our own government fucks us over and decides to control bandwidth itself.

legendary
Activity: 1162
Merit: 1007
August 17, 2015, 03:58:03 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid.  

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

I notice you did not reply or comment on my opinion of you guys' solutions... Maybe you missed it, maybe you don't care to answer? Either way, I'd be curious to better understand the logic behind your proposal.

...


I honestly don't know what you're asking me.  What you quoted seemed like substance-less hand-waving to me.

All we're proposing is to return to the design proposed in the Bitcoin white paper, where "nodes accept the block only if all transactions in it are valid and not already spent" (#5):



Furthermore, I do not take Bitcoin's success as inevitable.  It may fail.  Right now, I see the biggest risk as developer centralization.  The risk of miner centralization due to adjusting an anti-spam measure seems minor in comparison.  
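The tip-toe reasoning quoted in this exchange can be put in a toy model. All numbers below are made up for illustration: a miner's expected revenue is the block reward times the fraction of hashrate willing to build on a block of that size (treated as a survival probability), so the optimum hugs the largest size the network clearly accepts:

```python
# Toy model of the fee-vs-orphan trade-off behind the tip-toe method.

def expected_revenue(size_mb, fee_per_mb, subsidy, accept_fraction):
    """Reward if the block survives, scaled by the hashrate share
    (treated as a survival probability) willing to build on it."""
    return (subsidy + fee_per_mb * size_mb) * accept_fraction(size_mb)

# Hypothetical network: everyone builds on blocks up to 1.0 MB;
# acceptance then falls off linearly, hitting zero at 2.0 MB.
def accept_fraction(size_mb):
    return max(0.0, min(1.0, 2.0 - size_mb))

sizes = [0.8, 1.0, 1.1, 1.3, 1.6, 1.9]
best = max(sizes, key=lambda s: expected_revenue(s, 0.5, 25.0, accept_fraction))
# Extra fees are tiny next to the subsidy, so the rational choice hugs
# the size the network clearly accepts: the miner tip-toes upward.
assert best == 1.0
```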
legendary
Activity: 1764
Merit: 1002
August 17, 2015, 03:55:53 PM
This is a special fork for those who do not agree with the scheduled blocksize increase as proposed by Gavin and Mike in their divisive altcoin fork, "Bitcoin XT".

This version can be used to protect the status quo until real technical consensus is formed about the blocksize.

This version is indistinguishable from Bitcoin XT 0.11A, except that it will not actually hard fork to BIP101: it appears on the p2p network as Bitcoin XT 0.11A, replete with features, yet at a consensus level it behaves just like Bitcoin Core 0.11. If it is used to mine, it will produce XT block versions without actually supporting >1MB blocks.

Running this version and/or mining with XT block versions will make it impossible for the Bitcoin XT network to detect the correct switchover and cause a premature fork of anyone foolish enough to support BIP101 without wide consensus from the technical community.

It prevents correct detection of Bitcoin XT adoption in the wild, since usage will be known to have been tampered with, and thus all statistical data gathered by getnodes can only be considered unreliable.

https://github.com/xtbit/notbitcoinxt#not-bitcoin-xt

Is there a way to distinguish this version from the actual XT?

first off, i'm sure someone will figure out how to detect this pseudo-XT.  second, for Cripplecoiners to stoop this low shows their desperation and ultimate failure, as it shows they can't be trusted.
legendary
Activity: 1764
Merit: 1002
August 17, 2015, 03:52:34 PM
[...]

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

[...]
  


Fully agreed. To add to this: this situation exists even today with 1MB blocks. A node might only support 900kB blocks on average, bandwidth- or CPU-wise (e.g. some old ARM board), and will drop off when the network actually moves to Mike's cliff by saturating at 1MB.


and this is precisely why big money outside of China interested in mining will support this proposal, b/c it allows mining to decentralize away from China as a result of the GFC.  specifically, the USA should want to see this.  the feedback effects on China will only further this dynamic, as their gov't will relax GFC rules if they want to stay in the game.  all good for Bitcoin in general.
legendary
Activity: 1162
Merit: 1007
August 17, 2015, 03:46:30 PM

That said:

Can I push my BIP proposal to reddit with you as an author on it? I don't want to cause any further confusion and want your explicit permission first Smiley

I just gave you permission via a Reddit PM, subject to (1) putting me as second author, to reflect the fact that you've contributed more than I have at this point, and (2) making it clear that this is a WORKING DRAFT. This could be done with a subtitle on the first page, or a watermark on every page.
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
August 17, 2015, 03:45:35 PM

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid. 

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution, as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

I notice you did not reply or comment on my opinion of you guys' solutions... Maybe you missed it, maybe you don't care to answer? Either way, I'd be curious to better understand the logic behind your proposal.

Because if that's all this hinges on:

cost of mining a larger block = self-regulating block sizes

We're in a world of problems.


Quote
VI. This is a clerical issue, because block propagation and other considerations incentivize miners to keep blocks small anyway. The 1MB is just a hard limit getting in the way of things, the marketplace of miners should be allowed to fix block size as it seems appropriate.

While this argument has been disingenuously advanced by Gavin himself, the fact is that the proposed invertible Bloom lookup table (IBLT) upgrade would allow all blocks to propagate in constant time, regardless of their size.

Some might still ask: why is that?

Quote
davout: gavinandresen: "oh, the IBLT stuff? yes, that’d make propagation O(1)" <<< so with that, there's no network bottleneck anymore, at least no real incentive for miners to keep blocks small, right?

gavinandresen:davout: Miners would only have the meta-incentive of “we can collectively maximize revenue if we make blocks THIS big”

Except miners are not a single person. They are multiple, geographically diverse interest groups, each bounded by different resources, costs, and infrastructure. I happen to think that this is what is broken with the "nodes and miners should be able to decide on whatever block size they like" proposition. I can also see clear as day through the attempt of many here at rationalizing this behavior as "the free market decides best; how dare you propose centrally designed SPAM CONTROL."

The assumption you seem to make is that miners & nodes (through the magic of the "invisible hand", I suppose) will arrive at an equilibrium of decentralization in some kind of benevolent act, "because incentives & game theory". If we consider the argument about the cost of creating large blocks moot, the rationale then becomes: miners will act altruistically to conserve trust in the network.

These points are not very clear to me. I cannot imagine a scenario where several resourceful corporations do not turn this into an arms race that few will be able to keep up with. We are only now beginning to see mining and network infrastructure enter a professional stage. If the incentive to mine Bitcoin increases, the seemingly amateur, small-scale setups will soon be erased from the network and replaced by massive datacenters that will outnumber these small players, making their "voice" in the balancing exercise of decentralization vs. block size worthless.

You might imagine that as "bitcoiners" realize this issue they will "protest", but I suggest that by that point (a) you will not actually be able to become aware of the problem, and (b) there will be nothing to be done about it, as the network will have become "captured" through "network ossification" and the general laziness of the herd, which prefers comfort and stability over change and doubt.
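For the IBLT point debated in this post, a back-of-the-envelope model (illustrative numbers only; real relay is more complicated) shows why constant-time propagation dissolves the size/orphan trade-off: with Poisson block arrival at a mean interval of T = 600 s, a block that takes t seconds to propagate is orphaned with probability roughly 1 - exp(-t/T), so if t stops depending on size, so does the orphan risk:

```python
# Sketch: orphan risk under size-proportional relay vs O(1)-style relay.
# The per-MB transfer time and the fixed 2 s figure are assumptions,
# not measurements.

import math

T = 600.0  # average block interval, seconds

def orphan_prob(t_seconds):
    """Chance a competing block arrives during propagation."""
    return 1.0 - math.exp(-t_seconds / T)

def naive_propagation(size_mb, secs_per_mb=15.0):
    return secs_per_mb * size_mb   # grows with block size

def iblt_propagation(size_mb):
    return 2.0                     # roughly constant, size-independent

# Under naive relay, an 8 MB block carries more orphan risk than a 1 MB
# block; under an O(1) scheme the risk no longer depends on size, which
# is exactly why the fee/orphan trade-off argument weakens.
assert orphan_prob(naive_propagation(8.0)) > orphan_prob(naive_propagation(1.0))
assert orphan_prob(iblt_propagation(8.0)) == orphan_prob(iblt_propagation(1.0))
```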
legendary
Activity: 2492
Merit: 1473
LEALANA Bitcoin Grim Reaper
August 17, 2015, 03:41:44 PM
This is a special fork for those who do not agree with the scheduled blocksize increase as proposed by Gavin and Mike in their divisive altcoin fork, "Bitcoin XT".

This version can be used to protect the status quo until real technical consensus is formed about the blocksize.

This version is indistinguishable from Bitcoin XT 0.11A, except that it will not actually hard fork to BIP101: it appears on the p2p network as Bitcoin XT 0.11A, replete with features, yet at a consensus level it behaves just like Bitcoin Core 0.11. If it is used to mine, it will produce XT block versions without actually supporting >1MB blocks.

Running this version and/or mining with XT block versions will make it impossible for the Bitcoin XT network to detect the correct switchover and cause a premature fork of anyone foolish enough to support BIP101 without wide consensus from the technical community.

It prevents correct detection of Bitcoin XT adoption in the wild, since usage will be known to have been tampered with, and thus all statistical data gathered by getnodes can only be considered unreliable.

https://github.com/xtbit/notbitcoinxt#not-bitcoin-xt

Is there a way to distinguish this version from the actual XT?
hero member
Activity: 644
Merit: 504
Bitcoin replaces central, not commercial, banks
August 17, 2015, 03:41:38 PM
The 1MB'ers are happy watching adoption stop growing. Stupidity at its best.


Stupidity is pretending that "adoption" and "growth" are all about users & transactions.

Probably one of your more stupid posts.

I'm glad you like it  Cheesy

Your butthurt is expected of the typical VC/startup brain damage that can't see beyond "MOAR USERS" and "MOAR ADOPTION"!
sr. member
Activity: 392
Merit: 251
August 17, 2015, 03:41:25 PM
So many newbies supporting XT. Imagine that.  Roll Eyes
newbie
Activity: 28
Merit: 0
August 17, 2015, 03:38:48 PM
[...]

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the Blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

[...]
[chart "NEW THINKING": distribution of block size limits across the network]


Fully agreed. To add to this: this situation exists even today with 1MB blocks. A node might only support 900kB blocks on average, bandwidth- or cpu-wise (e.g. some old ARM board), and will drop off when the network actually moves to Mike's cliff by saturating at 1MB.

Also: maybe we should even add this to the BIP draft, or at least reference it?

That said:

Can I push my BIP proposal to reddit with you as the author in it? I don't want to cause any larger confusion and want your explicit permission first Smiley
legendary
Activity: 2492
Merit: 1473
LEALANA Bitcoin Grim Reaper
August 17, 2015, 03:38:39 PM
We have been reminded, by the pretension nodes, that there is always a risk of a large block being orphaned, especially the first one, then the next one that is larger, and so on. Not only due to timing, verification of the block, and the technical stuff, but also the willingness of others to build on it. In business, risk translates directly into cost.

After a block of 2MB for example, the risk is reduced for blocks up to and including that exact size. We will therefore in the future see step increases in the blocksize, with retractions in between due to varying demand. The typical pattern: a leg up, stability, another leg up, all market based.

Yes, tip-toeing forward according to fundamentals, both technical and economic.

"Step increases" and "top toeing forward" won't help much here.  If the main concern is acceptance of the larger block (which I think it is and which you seem to imply with the reference to pretention nodes), then 1.001 MB and 8 MB blocks are, roughly speaking, "the same".  A node either accepts both (if it allows larger blocks) or it rejects both (if it does not).  So while I agree that miners may be reluctant to accept (and especially build on their own) large blocks, I don't think there is any particular reason to "tip toe forward" once they are on a chain that is invalid according to the old rules.  Unless, of course, verification and relaying timing is a concern.  But then the fork should actually have never been done in the first place and XT failed already during the planning stage.

I think Erdogan's insight into the game theory behind the block size limit was correct.

But to recognize his insight, I think we need to stop thinking in terms of "valid blocks" and start thinking in terms of "valid transactions."  All blocks that are composed exclusively of valid transactions are valid. 

Now instead of thinking that only Core and XT exist, imagine that there are dozens (and in the future possibly hundreds) of competing implementations of Bitcoin.  Each implementation has its own rules for what block size it will build upon.  From this viewpoint, the "effective limit" is the size of the largest block that's ever been included in the Blockchain.  If a miner wants to create a larger block (e.g., to collect more fees), then he has to weigh the chance that his block is orphaned against his desire to create a larger block.  If we imagine that the block size limits across the network form some distribution as shown in the chart labelled "NEW THINKING" below, then, since the miner can't be 100% sure what this distribution is, it is rational for him to use the tip-toe method to minimize risk.

[chart "NEW THINKING": distribution of block size limits across the network]


cost of mining a larger block = self regulating block sizes
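The orphan-risk trade-off above can be put in toy-model form. Everything here is an assumption for illustration (equal-weight miners, a made-up distribution of size limits, an arbitrary flat fee rate); the point is only that expected reward peaks somewhere below the largest size any node accepts, which is what makes tip-toeing rational.

```python
# Toy model: each miner/node enforces its own max block size, and a block
# of size s is only built upon by the fraction of hashpower whose limit
# is >= s. All numbers are illustrative, not measured from any network.

def acceptance_prob(size, limits):
    """Fraction of (equal-weight) hashpower that accepts a block of `size`."""
    return sum(1 for lim in limits if lim >= size) / len(limits)

def expected_reward(size, limits, subsidy=25.0, fee_per_byte=0.00001):
    """Expected reward in BTC: (subsidy + fees) times acceptance probability."""
    fees = size * fee_per_byte
    return (subsidy + fees) * acceptance_prob(size, limits)

# A hypothetical spread of enforced limits, 1.0 MB up to 10.9 MB:
limits = [1_000_000 + 100_000 * i for i in range(100)]

# Tip-toeing: a slightly larger block keeps almost all acceptance, while
# jumping straight to 8 MB is orphaned by 70% of hashpower in this model.
for s in (1_000_000, 1_100_000, 2_000_000, 8_000_000):
    print(s, round(expected_reward(s, limits), 2))
# Here the 2 MB block maximizes expected reward (40.5 vs 31.5 for 8 MB).
```

Nothing in the model needs a protocol-mandated cap; the distribution of limits plus orphan risk does the regulating on its own.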
