Topic: Bitcoin 20MB Fork - page 30. (Read 154787 times)

donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
March 07, 2015, 02:53:48 AM
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?
What he probably means is that pruning does nothing to increase transactions per second.
Are you sure it's just that...?
Increasing the block size and pruning would be a decent solution, for now.
I'm not sure how advantageous pruning would be. That is probably covered in another thread. I would prefer a much larger block and a slightly larger OP_RETURN.
legendary
Activity: 2674
Merit: 2965
Terminated.
March 07, 2015, 01:28:20 AM
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?
What he probably means is that pruning does nothing to increase transactions per second.
Are you sure it's just that...?
Increasing the block size and pruning would be a decent solution, for now.

He isn't proposing anything. He just likes to be special.
There's this guy who did some decent work back in 2013.

His success plateaued a couple years ago but the rest of Bitcoin continued to grow. His accomplishments aren't nearly as important or impressive today as they were in the past.

He also gets off on leading a cult of personality, but as Bitcoin keeps growing while he plateaus, his relative importance shrinks, and with it his ability to attract and maintain that cult.
This means that in order to satisfy his ego, he needs Bitcoin to stop growing.
So he's one of those people.
legendary
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
March 06, 2015, 10:24:13 PM
Is there any way to estimate how long it will take to reach a consensus on whether to implement the 20MB fork?

Consensus looks impossible right now: there are two competing visions for Bitcoin's future, and neither has attracted sufficient support.

The Monopolist Maximalist camp thinks <20MB is madness, because they want retail and smartchain services.

The Minimalist Maximalist camp thinks >~10MB is madness, because they prefer a small, diffuse, and defensible attack surface.

I'd be OK with closing the loop by doubling the block size when the block reward halves, and going to 2MB retroactively (rough sketch below).  But I'd rather stay at 1MB than go to 20MB+ and risk killing a golden goose that is working so well it's like magic.
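
A rough sketch of what that could look like, assuming Python and that doublings track the 210,000-block halving interval (the function name and numbers here are illustrative, not a spec):

    HALVING_INTERVAL = 210000     # blocks between reward halvings

    def max_block_size_mb(height):
        # One doubling per past halving, counted retroactively from the
        # original 1MB cap -- so in early 2015 (one halving behind us)
        # this already yields 2MB.
        halvings = height // HALVING_INTERVAL
        return 1 << halvings      # 1MB, 2MB, 4MB, ...

    print(max_block_size_mb(345000))   # -> 2 (MB), one halving so far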
sr. member
Activity: 341
Merit: 250
March 06, 2015, 09:38:57 PM
Is there any way to estimate how long it will take to reach a consensus on whether to implement the 20MB fork?
legendary
Activity: 2156
Merit: 1072
Crypto is the separation of Power and State.
March 06, 2015, 09:14:40 PM
His success plateaued a couple years ago but the rest of Bitcoin continued to grow. His accomplishments aren't nearly as important or impressive today as they were in the past.

He also gets off on leading a cult of personality, but as Bitcoin keeps growing while he plateaus, his relative importance shrinks, and with it his ability to attract and maintain that cult.

This means that in order to satisfy his ego, he needs Bitcoin to stop growing.

I like your proposal for market-based block sizes.  Have fun starting JustusCoin to implement it.

Davout's opinion on the 20MB fork is correct; your ad hominem attacks on him just make you look desperate.

The Monopolist Maximalist position is obviously untenable hubris, and it is increasingly being modified into the more humble/reasonable Minimalist Maximalist paradigm.

E.g.:

Andreas Antonopoulos, speaking at MIT, recently said:

"Bitcoin may end up being the gold-reserve currency… digital gold reserving value for other currencies, hundreds of other currencies.

As long as you have a two-way exchange between the value, you can very seamlessly move from one to the other… I expect we will see a world where bitcoin provides a core fungible currency value layer on top of which you’ll have protocol layers that deliver micro payments, fast transaction processing, anonymity and all kinds of other programmatic guarantees depending on the niche application. Which is why there will not be one currency to rule them all, because there is no one application for everyone." (Reference: http://youtu.be/J8y_GypCWf4?t=28m2s)

Are you now going to also accuse Andreas of needing to stop BTC from growing, to satisfy his ego?

Are Andreas' accomplishments and contributions "plateauing" as well?

Is Andreas likewise leading a "cult of personality?"  Or are Davout and Andreas both leaders in the same cult?  Please clarify!   Grin
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
March 06, 2015, 08:57:48 PM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?
What he probably means is that pruning does nothing to increase transactions per second.
legendary
Activity: 1400
Merit: 1013
March 06, 2015, 04:30:46 PM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?

He isn't proposing anything. He just likes to be special.
There's this guy who did some decent work back in 2013.

His success plateaued a couple years ago but the rest of Bitcoin continued to grow. His accomplishments aren't nearly as important or impressive today as they were in the past.

He also gets off on leading a cult of personality, but as Bitcoin keeps growing while he plateaus, his relative importance shrinks, and with it his ability to attract and maintain that cult.

This means that in order to satisfy his ego, he needs Bitcoin to stop growing.
legendary
Activity: 1904
Merit: 1007
March 06, 2015, 04:21:06 PM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?

He isn't proposing anything. He just likes to be special.
legendary
Activity: 2674
Merit: 2965
Terminated.
March 06, 2015, 03:21:21 PM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.
So increasing the blockchain (in size, per block) is not a solution and decreasing it isn't either?
I find this illogical. What are you proposing?
legendary
Activity: 1274
Merit: 1000
March 06, 2015, 02:21:12 PM
http://gavintech.blogspot.com/2015/01/twenty-megabytes-testing-results.html

Quote
Twenty Megabytes testing results
Executive summary: if we increased the maximum block size to 20 megabytes tomorrow, and every single miner decided to start creating 20MB blocks and there was a sudden increase in the number of transactions on the network to fill up those blocks....

... the 0.10.0 version of the reference implementation would run just fine.

You can check my work and get the detailed blow-by-blow in my for-testing-only megablocks branch (see megablocks_notes.txt).

CPU and memory usage scaled up nicely; there were no surprises. Both CPU and memory usage for the 20MB blockchain were within my criteria of "somebody running a decent personal computer on a pretty good home network connection should be able to run a full node."

I did have a surprise syncing a 20-MB chain to a VPS (virtual private server): bigger blocks were four times faster than the small, main-chain blocks. I don't know why; it is possible something else was happening on the VPS machine to affect the results, or maybe disk I/O is better with larger blocks.

So what's next?

Next we need a soft fork to deal with some longstanding technical debt related to the recent OpenSSL-was-willing-to-validate-too-much-stuff problem. Pieter Wuille and Gregory Maxwell have been working through that.

But then we need a concrete proposal for exactly how to increase the size. Here's what I will propose:

    Current rules if no consensus as measured by block.nVersion supermajority.
    Supermajority defined as: 800 of last 1000 blocks have block.nVersion == 4
    Once supermajority attained, block.nVersion < 4 blocks rejected.
    After consensus reached: replace MAX_BLOCK_SIZE with a size calculated based on starting at 2^24 bytes (~16.7MB) as of 1 Jan 2015 (block 336,861) and doubling every 6*24*365*2 blocks -- about 40% year-on-year growth. Stopping after 10 doublings.
    The perfect exponential function:
    size = 2^24 * 2^((blocknumber-336,861)/(6*24*365*2))
    ... is approximated using 64-bit-integer math as follows:

    double_epoch = 6*24*365*2 = 105120
    (doublings, remainder) = divmod(blocknumber-336861, double_epoch)
    if doublings >= 10 : (doublings, remainder) = (10, 0)
    interpolate = floor ((2^24 << doublings) * remainder / double_epoch)
    max_block_size = (2^24 << doublings) + interpolate
    This is a piecewise linear interpolation between doublings, with maximum allowed size increasing a little bit every block.
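
    In runnable form (Python), with the constant and function names and the final asserts added for illustration:

    FORK_HEIGHT = 336861               # block height as of 1 Jan 2015
    DOUBLE_EPOCH = 6 * 24 * 365 * 2    # 105120 blocks, roughly two years
    BASE_SIZE = 2 ** 24                # ~16.7MB starting size
    MAX_DOUBLINGS = 10                 # growth stops after ten doublings

    def supermajority_reached(last_1000_versions):
        # Proposal text says nVersion == 4; >= 4 is used here since lower
        # versions are rejected once the supermajority is reached.
        return sum(1 for v in last_1000_versions if v >= 4) >= 800

    def max_block_size(height):
        # Piecewise-linear interpolation between doublings, 64-bit integer math.
        doublings, remainder = divmod(height - FORK_HEIGHT, DOUBLE_EPOCH)
        if doublings >= MAX_DOUBLINGS:
            doublings, remainder = MAX_DOUBLINGS, 0
        size_at_epoch = BASE_SIZE << doublings
        interpolate = (size_at_epoch * remainder) // DOUBLE_EPOCH
        return size_at_epoch + interpolate

    assert max_block_size(FORK_HEIGHT) == 2 ** 24                  # ~16.7MB at the fork
    assert max_block_size(FORK_HEIGHT + DOUBLE_EPOCH) == 2 ** 25   # doubled two years on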


I created a spreadsheet and graph of how the maximum size would grow over time.

But... but... WRECK! RUIN! MADNESS!

I'm confident that there are no technical barriers to scaling up-- I've shown that our current code can handle much larger blocks, and assuming that progress in electronics and networking doesn't come to a sudden screeching stop, that our current code running on tomorrow's hardware would be able to handle the growth I'm proposing.

Of course, we won't be running current code on tomorrow's hardware; we'll be running better code. CPU usage should go down by a factor of about eight in the next release when we switch to Pieter's libsecp256k1 library for validating transactions. Network usage should get cut in half as soon as we stop doing the simplest thing and re-broadcasting transactions twice. And I'm sure all the smart engineers working on Bitcoin and Bitcoin-related projects will find all sorts of ways to optimize the software.

And yes, that includes making the initial block downloading process take minutes instead of days.

So that leaves economic arguments-- most of which I think I addressed in my Blocksize Economics post.

I'll try to restate a point from that post that it seems some people are missing: you can't maximize the total price paid for something by simply limiting the supply of that something, especially if there are substitute goods available to which people can switch.

People want to maximize the price paid to miners as fees when the block reward drops to zero-- or, at least, have some assurance that there is enough diverse mining to protect the chain against potential attackers.

And people believe the way to accomplish that is to artificially limit the number of transactions below the technical capabilities of the network.

But production quotas don't work. Limit the number of transactions that can happen on the Bitcoin blockchain, and instead of paying higher fees people will perform their transactions somewhere else. I have no idea whether that would be Western Union, an alt-coin, a sidechain, or good old fashioned SWIFT wire transfers, but I do know that nobody besides a central government can force people to use a product with higher costs, if there is a lower-cost option available.

So how will blockchain security get paid for in the future?

I honestly don't know. I think it is possible blocks containing tens of thousands of transactions, each paying a few millibits in fees (maybe because wallets round up change amounts to avoid creating dust and improve privacy) will be enough to secure the chain.
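
Back-of-envelope on that scenario, with purely illustrative numbers:

    txs_per_block = 20000              # "tens of thousands" of transactions
    fee_per_tx = 0.002                 # "a few millibits", i.e. 2 mBTC
    print(txs_per_block * fee_per_tx)  # -> 40.0 BTC in fees per block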

It is also possible big merchants and exchanges, who have a collective interest in a secure, well-functioning blockchain, will get together and establish assurance contracts to reward honest miners.

I'm confident that if the Bitcoin system is valuable, then the participants in that market will make sure it keeps functioning securely and reliably.

And I'm very confident that the best way to make Bitcoin more valuable is to make it work well for both large and small value transactions.

Posted 20th January by Gavin Andresen
legendary
Activity: 924
Merit: 1132
March 05, 2015, 05:20:53 PM

yeah... who knows what goes through the head of altcoiners...

What goes through the heads of altcoiners is scads of coins with 30-second blocks -- which, if you do the math, means they're already doing up to 20MBytes worth of transactions in Bitcoin's nominal block generation time. 

At least they could be, if anybody actually used them.
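
The math in question, assuming those chains keep Bitcoin's 1MB cap per block:

    BITCOIN_INTERVAL = 600   # seconds, Bitcoin's nominal block time
    ALT_INTERVAL = 30        # seconds, a 30-second-block altcoin
    BLOCK_MB = 1             # assuming the same 1MB cap per block
    print(BITCOIN_INTERVAL // ALT_INTERVAL * BLOCK_MB)   # -> 20 (MB per ten minutes)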
hero member
Activity: 658
Merit: 501
March 05, 2015, 02:58:38 PM
Wow lightning network seems very intriguing. Is there a separate discussion about this here on bitcointalk?

https://bitcointalksearch.org/topic/lightning-network-another-proposal-to-make-bitcoin-scale-970822
sr. member
Activity: 342
Merit: 250
March 05, 2015, 02:55:42 PM

Wow lightning network seems very intriguing. Is there a separate discussion about this here on bitcointalk?
legendary
Activity: 1372
Merit: 1008
1davout
March 05, 2015, 05:56:14 AM
and yet other folks are pushing something to 'handle' gigabloatchain?

yeah... who knows what goes through the head of altcoiners...
legendary
Activity: 1260
Merit: 1002
March 05, 2015, 05:47:32 AM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.

so, people disagree regarding the block size, and yet other folks are pushing something to 'handle' gigabloatchain?

wtf is this roadmap?

i need more popcorn.

PS: lmao http://thebitcoin.foundation/
legendary
Activity: 1372
Merit: 1008
1davout
March 05, 2015, 01:54:20 AM
Blockchain pruning is entirely do-able.

Cool story, but it doesn't really solve anything.
legendary
Activity: 924
Merit: 1132
March 04, 2015, 05:45:40 PM
Blockchain pruning is entirely do-able.

We've already had several altcoins launch (most recently Clams) with an enormous genesis block that contained a "snapshot" of current bitcoin txouts as of the time the genesis block was composed. 

Bitcoin could also switch to a new genesis block.  Here is how it would work.

1. New client is created and starts publishing a new block version, indicating a "hard fork" is coming.

2. New client waits for 95% majority on its block version, then starts publishing a new block format, which contains a hash of the current set of unspent txouts.  This TxOut set is "stripped" of contextual information including its block height and the transaction in which it originated, just as the "spin-off chains" among altcoins are made. Essentially it's just amount and key hash (sketched in code at the end of this post).

3. Everybody can check that the new blocks are descended from the original genesis block (continuity hash) *and* that the entire set of unspent txouts is one that agrees with their current image of the universe.

4. One thousand blocks later (so well after any possibility of an orphaned chain), a special "MegaBlock" goes into the block chain.  It contains the set of unspent TxOuts that corresponds to the hash that was created for the first new-format block.  Thereafter, TxOuts created before that snapshot may not be used in transactions; however, a wallet containing the key that would have spent such a TxOut can now use the same key to spend a corresponding unspent TxOut that's published in the MegaBlock. 

5. After the MegaBlock everybody can still use the existing continuity hash to ensure that the block is descended from the first of the new-format blocks, and use the MegaBlock itself to verify unspent txOuts.  And it's up to them whether they keep the block chain back to the original Genesis block or not.

6. With the new block format containing a hash of the current unspent TxOut set, everybody who's following along can check that their TxOut set matches the hash that's published in every block.  Additionally, a new MegaBlock can be published every few years, allowing the block chain previous to it to be dropped.

This could be done differently:  Instead of having a Megablock the new block format could just reserve some space in every block to publish replacements for the oldest unreplaced txOuts, enabling the blocks prior to the block containing the last replaced txOut to be dropped (a "rolling root").  That would be more efficient if the blocks containing the txOuts are uniformly more than 2 years old.  But, all at once and nothing first is a heck of a lot easier to check and be sure you've got it right.

Anyway, my point is that blockchain pruning is not a technological risk; it's something that there are known and implementable ways to do.  I've outlined one such way, and as protocol, it checks.  It's not terribly efficient, but heck, we've been broadcasting every transaction twice for years now so efficiency is probably not a showstopper.
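
A minimal sketch of the snapshot hash from step 2, assuming Python; the serialization (amount plus key hash, in sorted order) is an illustrative choice, not a specified format:

    import hashlib

    def utxo_snapshot_hash(utxos):
        # utxos: iterable of (amount_satoshis, keyhash_bytes) pairs, already
        # stripped of block height and originating transaction (step 2).
        h = hashlib.sha256()
        for amount, keyhash in sorted(utxos):    # canonical order so all nodes agree
            h.update(amount.to_bytes(8, 'little'))
            h.update(keyhash)
        return hashlib.sha256(h.digest()).digest()   # double-SHA256, as elsewhere in Bitcoin

    # Each node recomputes this over its own unspent-TxOut set and compares it
    # with the hash committed in the new-format blocks (step 3).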

legendary
Activity: 1904
Merit: 1007
March 04, 2015, 04:33:02 PM
On the other side, is anyone even seriously looking into compression or pruning methods? That would definitely help out, with or without the fork.

I am sure that it's not on the "critical" list because it's not an issue right now.
legendary
Activity: 2674
Merit: 2965
Terminated.
March 04, 2015, 02:29:58 PM
On the other side, is anyone even seriously looking into compression or pruning methods? That would definitely help out, with or without the fork.