Topic: The Blocksize Debate & Concerns

legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
June 26, 2016, 06:56:45 AM
#16
They are all the fallacies that support socialism, big government, bailouts, wealth redistribution, etc... 

As and when bitcoin becomes more mainstream, it too will take on a majority of individuals who do not understand that the whole thing was created precisely to prevent majorities or powerful individuals from violating the property rights of others. 

I'd argue the whole thing was created to be neutral and open to anyone and that the disagreements in the community are largely caused by people trying to project their own political outlooks on the protocol.  One of these days you'll all just have to accept that as the userbase increases, so will the number of different opinions and worldviews.  Not everyone is a hardcore libertarian and you're going to have to share the chain with them whether you like it or not.  Bitcoin is not some isolationist paradise where you can shut out all the people you personally disagree with. 
jr. member
Activity: 57
Merit: 27
June 26, 2016, 05:27:55 AM
#15

Already Ethereum users make the incorrect argument that bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and thus it's okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users.  You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.


The fallacies used by Ethereum users to justify tampering with the ledger are extreme, and they put that chain on a slippery slope toward the end of any reliable truth and security in its transactions.  They are all the fallacies that support socialism, big government, bailouts, wealth redistribution, etc... 

As and when bitcoin becomes more mainstream, it too will take on a majority of individuals who do not understand that the whole thing was created precisely to prevent majorities or powerful individuals from violating the property rights of others.  This is why making such interventions as hard as possible, if not downright impossible, must remain the top priority of the system. 
member
Activity: 81
Merit: 10
June 26, 2016, 04:02:44 AM
#14
3) Sidechains + Lightning Network
This is the path of scaling the Core team chose. And I think it's a very elegant, decentralized approach to scaling.

Both are already failures, if you're realistic about it. I don't see any connection between the bad ideas of Gregory Maxwell and a few others, and Bitcoin as a system. Sidechains are one of the dumbest ideas I've seen in Bitcoin ever. The idea was proposed 3 years ago, the whitepaper written 2 years ago, and there is no functioning system; I predict there never will be. Lightning is quite interesting, but it will be a failure too. I will bet on my beliefs by selling Bitcoin for a better alternative when it arrives. There are deep economic reasons why these ideas are bad, but you can't expect any understanding of that from core developers. They don't even see the incentive and scaling problems as they are. "Scaling Bitcoin" is a misnomer to begin with.

7) Corruptible devs
Also, the bitcoin developers are decentralized: even if the Core team has control over the github repository, anyone can host bitcoin's source code on their own server and work on the code.

No. Bitcoin dev is centralised around a small group of people, for good and bad reasons. Nobody can just add any patch to the system. Forking blockstream/core is basically starting a new chain and project.
member
Activity: 97
Merit: 10
June 21, 2016, 02:14:50 PM
#13
snip
Damn, thanks for that lengthy and comprehensive response--big confidence builder as it (more-or-less) confirms a lot of my assumptions regarding the concerns in the system. A decentralised backbone upon which other systems can be built seems like the right choice if decentralisation is something which we value. Second-layer solutions that make tradeoffs of decentralisation/whatnot to meet the demands of higher capacity can happen (and already do--just look at all the off-chain trades on exchanges) on top of a decentralised ledger, but second-layer solutions cannot return decentralisation to Bitcoin if it is lost.

Honestly, I'm going long on Bitcoin anyhow (most of my transactions involve combining small outputs from altcoin mining), so current usability/capacity concerns aren't especially pressing for me, especially with huge improvements in these areas already in development. It will be an experience to see how this plays out over the next few years--just what will Bitcoin look like by the 2020 halving?
hero member
Activity: 854
Merit: 1009
JAYCE DESIGNS - http://bit.ly/1tmgIwK
June 21, 2016, 11:53:56 AM
#12
i actually think the OP's post is quite unbiased and reasonable

here is my insight to address three points (sacred protocol)(corruptible devs)(corruptible pools)
though things should change to adapt to the changing reality of users' needs, we need to accept that even soft forks should need consensus just to be activated, and should need independent testing and a reading of every line of code.
this is because a softfork, due to its nature of NOT needing close to 100% adoption, carries the risk of bad code being added to change things 'on a whim'.
i'm glad mining pools are not going to run any new versions until they have done their own checks and tests... even if greenlit by core.

I don't support hardforks, but I can accept arguments from pro-hardforkers if they can come up with good ones. It can be debated.


Yes, soft forks need a lot of reviewing and testing as well, absolutely true. But this is already happening; every bitcoin client can potentially have some bug in it, which is why people should never upgrade all at once.

Therefore you cannot force a hardfork, because then everyone has to upgrade before the grace period runs out, which makes a hardfork 1000% more risky than a soft one.

Currently only 37% run the latest bitcoin version, so the other 63% act as a backbone if there is a 0-day bug in the current version.

The same cannot be said about a hardfork scenario....

I just added this to point 10), because it's really a very important point.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
June 21, 2016, 10:01:59 AM
#11


But on the balance, I think the risks can be handled and the capacity increase will be useful,

You speak as if capacity is a luxury.  The implication is that if capacity needs to be limited,
that's OK because Bitcoin can act as a settlement layer or something like that.

This is something I (and others) are very suspicious of.
legendary
Activity: 1806
Merit: 1024
June 21, 2016, 08:34:51 AM
#10
Not screwing up decentralization early in the system's life is a high priority for me: It is utterly integral to the system's value proposition in a way that few other properties are,
 
[...]

Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open and politically neutral money-- such as fungibility.

I fully agree! I find it hard to understand why quite a few people seem unable to grasp the primary importance of decentralization for Bitcoin being valuable. People who claim that transaction capacity and fast growth are more important than decentralization do not really understand why Bitcoin came into existence.

Today, you can already do fast, high-capacity transactions via centralized electronic payment providers. You don't need Bitcoin for that. A decentralized Bitcoin will never be able to compete with centralized payment solutions in terms of efficiency. And it should not have this goal, because Bitcoin's essential competitive advantage stems from the fact that, in addition to the value-transfer service it provides, it also represents a store of value. This store-of-value function, and hence the competitive advantage, is endangered by any action that diminishes Bitcoin's fungibility.

ya.ya.yo!
legendary
Activity: 4410
Merit: 4766
June 21, 2016, 07:25:26 AM
#9
to Gmaxwell, i can honestly say that your last post was a genuine, open-minded opinion taking everything into account. it was a good read.

there is just one sticking point
Quote
I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad-- block access is a priority queue based on feerate-- and at a feerate of ~0 there is effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such a system would have almost no nodes, and also potentially have no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs.  E.g. a constant cycle of short term panic about an artificial condition pushing the system away from long term survivability.

at this current early era of bitcoin, the fees are treated as a bonus (subsidy) and the reward is treated as the salary (income), whereby the deflationary nature of the reward is (speculatively) offset by the fiat price, keeping pools comfortable long term for a couple of decades (ignoring the short-term panic-price emotional drama once every 4 years).
and as such it is not necessary at this time to be forced to wait for blocks to be full before adding some bufferspace/wiggle room to allow natural growth right now, purely on the basis of fees
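
(Editor's note: a quick back-of-the-envelope check of the halving schedule behind the "couple of decades" point, assuming the well-known 210,000-block halving interval, ~10-minute blocks and a 2009 start; the years are approximate and purely illustrative.)

Code:
# toy check of the block-subsidy halving schedule (assumes the well-known
# 210,000-block halving interval, ~10-minute blocks and a 2009 start;
# illustrative only -- real halving dates drift a little).
BLOCKS_PER_HALVING = 210_000
MINUTES_PER_BLOCK = 10
START_YEAR = 2009

years_per_halving = BLOCKS_PER_HALVING * MINUTES_PER_BLOCK / (60 * 24 * 365.25)
subsidy = 50.0  # BTC per block at genesis

for halving in range(8):
    print(f"~{START_YEAR + halving * years_per_halving:.0f}: {subsidy} BTC per block")
    subsidy /= 2

On that schedule the subsidy first drops below 1 BTC at the sixth halving, i.e. the early 2030s -- the "reward under 1btc" window mentioned below.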

i say this because the paragraph i quoted is concerning, in that it seems that bottlenecking blocks appears to some devs to be a good thing, because it launches the fee war.
but fees will only become essential in a couple of decades.

so here is my question: imagine that segwit was in production in 2013, tested and independently vetted by 2014. would it have been included in v0.10, or would developers have dragged their feet until the fee war started this year and waited until 0.13 because of the worry about mining in 2032+ (reward under 1btc)?
isn't it the case that pushing a fee war early simply taxes bitcoin, which will hurt bitcoin's utility and desirability, which can literally tank the speculative price and force fees to become essential sooner to offset miners' income, thus raising the tax higher in an endless spiral?

however, within those couple of decades other things can be put in place to help achieve the ultimate goal of switching salary<-->bonus (income<-->subsidy)
whereby the 'full blocks are good, fee war is good' mindset should not be front of mind for devs; a slow progression should be the mindset, rather than a manic push to switch sooner

i hope to see an honest and respectful opinion
i truly hope your reply is that segwit would have been implemented when independently vetted and tested, rather than dragging feet in favour of an early fee war.
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
June 21, 2016, 06:32:58 AM
#8
Your last argument about the old timers makes complete sense to me, but hopefully these changes would not be done in a way that compromises the safety of coins stored years ago. < Software should always be backward compatible > I am also of the opinion that block sizes or any other parameter within the protocol should only be changed if leaving it as-is would be very detrimental to the protocol's survival, not just to add small cosmetic changes. ^smile^
staff
Activity: 4284
Merit: 8808
June 21, 2016, 04:50:47 AM
#7
Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit?
I'd be glad to, and have many times in the past (also on the more general subject of hardforks), though I don't know that my particular views matter that much. I hope they do not.

Quote
What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious if there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is, "infinite", capacity even something we should be targeting?
Since we live in a physical world, with computers made of matter and run on energy, there will be limits. We should make the system as _efficient_ as possible because, in a physical world, one of the concerns is that there is some inherent trade-off between blockchain load and decentralization. (Note that blockchain load != Bitcoin transactional load, because there are _many_ ways to transact that have reduced blockchain impact.) ... regardless of the capacity level we're at, more efficiency means a better point on that trade-off.

Not screwing up decentralization early in the system's life is a high priority for me: It is utterly integral to the system's value proposition in a way that few other properties are, and every loss of decentralization we've suffered over the system's life has been used to argue that the next loss isn't a big deal. Making progress backwards is hard: The system actually works _better_ in a conventional sense, under typical usage, as it becomes more centralized: Upgrades are faster and easier, total resource costs are lower, behavior is more regular and consistent, some kinds of attacks are less commonly attempted, properties which are socially "desirable" but not rational for participants don't get broken as much, etc. Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open and politically neutral money-- such as fungibility.

But even if we imagine that we got near infinite efficiency, would we want near infinite capacity?

Bitcoin's creator proposed that security would be paid for in the future by users bidding for transaction fees. If there were no limit to the supply of capacity, then even a single miner could always make more money by undercutting the market and clearing it*. Difficulty can adapt down, so low fee income can result in security melting away.  This could potentially be avoided by cartel behavior by miners, but having miners collude to censor transactions would be quite harmful... and in the physical world with non-infinite efficiency, keeping miners from driving the nodes off the network is already needed.  Does that have to work as a static limit?  No-- and I think some of the flexcap proposals have promise for the future for addressing this point.
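
(Editor's illustration, not part of the post: a toy model of the bidding argument above, with made-up numbers. The point is only that a hard capacity limit forces users to outbid each other, while effectively unlimited space lets everything clear near the minimum relay feerate, collapsing total fee revenue.)

Code:
# Toy fee-market illustration (made-up numbers, not a real economic model).
# Each pending transaction bids the feerate (sat/vB) it is willing to pay
# for a fixed 250 vB of block space.
bids_sat_per_vb = [100, 80, 60, 40, 20, 10, 5, 2, 1, 1, 1, 1]
TX_SIZE_VB = 250
MIN_RELAY_FEERATE = 1  # sat/vB: the floor everyone pays when space is abundant

def revenue_with_cap(bids, capacity_vb):
    """Scarce space: fill highest-bid first; included transactions pay their bid."""
    included = sorted(bids, reverse=True)[: capacity_vb // TX_SIZE_VB]
    return sum(bid * TX_SIZE_VB for bid in included)

def revenue_without_cap(bids):
    """Unlimited space: nobody needs to outbid anyone, so everything
    clears at the minimum relay feerate."""
    return len(bids) * MIN_RELAY_FEERATE * TX_SIZE_VB

print(revenue_with_cap(bids_sat_per_vb, capacity_vb=1_000))  # 70000 sat (top 4 bids)
print(revenue_without_cap(bids_sat_per_vb))                  # 3000 sat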

Some have proposed that instead of fees, security in the future would be provided by altruistic companies donating to miners. The specific mechanisms proposed didn't make much sense-- e.g. they'd pay attackers just as much as honest miners, but that could be fixed... but the whole model of using altruism to pay for a commons has significant limitations, especially in a global anonymous network. We haven't shown altruism of this type to be successful at funding development or helping to preserve the decentralization of mining (e.g. p2pool). I currently see little reason to believe that this could be a workable alternative to the idea initially laid out of using fees... of course, if you switch it around from altruism to a mandatory tax, you end up with the inflationary model of some altcoins-- and that probably does "work", but it's not the economic policy we desire (and, in particular, without a trusted external input to control the inflation rate, it's not clear that it would really work for anyone).

So in the long term part of my concern is avoiding the system drifting into a state where we're all forced to choose between inflation or failure (in which case, a bitcoin that works is better than one that doesn't...).

As far as when, I think we should exercise the most extreme caution with incompatible changes in general. If it's really safe and needed we can expect to see broad support... it becomes easier to get there when efforts are made to address the risks, e.g. segwit was a much easier sell because it improved scalability while at the same time increasing capacity. Likewise, I expect successful future increases to come with or after other risk mitigations.

(* we can ignore orphaning effects for four reasons: orphaning increases as a function of transaction load can be ~completely eliminated with relay technology improvements, and failing that, by miners centralizing; if all a miner's income is going to pay for orphaning losses there will be no excess paying for competition and thus security; and, finally, if transaction fees are mostly uniform, the only disincentive from orphaning comes from the loss of subsidy, which will quickly become inconsequential unless bitcoin is changed to be inflationary.)

Quote
It is my understanding that Core's implementation of segwit will also include an overall, "blocksize", increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall, "blocksize"?
I think the capacity increase is risky. The risks are compensated by improvements (both recent ones already done and immediately coming, e.g. libsecp256k1, compactblocks, tens of fold validation speed improvements, smarter relay, better pruning support) along with those in segwit.

I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad-- block access is a priority queue based on feerate-- and at a feerate of ~0 there is effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such a system would have almost no nodes, and also potentially have no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs.  E.g. a constant cycle of short term panic about an artificial condition pushing the system away from long term survivability.
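
(Editor's sketch of the "priority queue based on feerate" point above, assuming a simplified mempool with no parent/child dependencies; real block assembly also tracks ancestor packages, sigop limits, weight rules, etc.)

Code:
import heapq

def build_block_template(mempool, max_weight):
    """Greedy block-template construction: pop transactions from a max-heap
    keyed on feerate until the weight budget is exhausted. Toy version of
    the 'priority queue based on feerate' idea."""
    heap = [(-tx["fee"] / tx["weight"], tx["txid"], tx) for tx in mempool]
    heapq.heapify(heap)
    selected, used = [], 0
    while heap:
        _neg_feerate, _txid, tx = heapq.heappop(heap)
        if used + tx["weight"] <= max_weight:
            selected.append(tx["txid"])
            used += tx["weight"]
    return selected

mempool = [
    {"txid": "a", "fee": 50_000, "weight": 800},  # 62.5 sat/WU
    {"txid": "b", "fee": 10_000, "weight": 600},  # ~16.7 sat/WU
    {"txid": "c", "fee": 2_000,  "weight": 400},  # 5 sat/WU
    {"txid": "d", "fee": 100,    "weight": 600},  # ~0.17 sat/WU: priced out when full
]
print(build_block_template(mempool, max_weight=1_800))  # ['a', 'b', 'c']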

But on the balance, I think the risks can be handled and the capacity increase will be useful, and the rest of segwit is a fantastic improvement that will set the stage for more improvements to come. Taking some increase now will also allow us to experience the effects and observe the impacts which might help improve confidence (or direct remediation) needed for future increases.

Quote
To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short-term I'm not exactly worried about transaction capacity--certainly not hardfork worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really all I want short-term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long-term--the, "133MB infinite transactions", stage.
I think we don't know exactly how lightning usage patterns will play out, so the resource gains are hard to estimate with any real precision. Right now bidi channels are the only way we know of to get to really high total capacity without custodial entities (which I also think should have a place in the Bitcoin future).

Some of the early resource estimates from lightning have already been potentially made obsolete by new inventions. For example, the lightning paper originally needed the ability to have a high peak blocksize in order to get all the world's transactions into it (though such blocks could be 'costly' for miners to build, like flexcap systems) in order to handle the corner case where huge numbers of channels were uncooperatively closed all at once and all had to get their closures in before their timeouts expired. In response to this, I proposed the concept of a sequence lock that stops when the network is beyond capacity ("timestop"); it looks like this should greatly reduce the need for big blocks at a cost of potentially delaying closure when the network is unusually loaded; though further design and analysis is needed.  I think we've only started exploring the potential design space with channels.

Besides capacity, payment channels (and other tools) provide other features that will be important in bringing our currency to more places-- in particular, "instant payment".

As much as I personally dislike it, other services like credit are very common and highly desired by some markets-- and that is a service that can be offered by other layers as well.

I'm sorry to say that an easy-to-use fee-bump replacement feature just missed the merge window for Bitcoin Core 0.13. I'm pretty confident that it'll make it into 0.14. I believe Green Address has a feebump available already (but I haven't tried it). 0.13 will have ancestor feerate mining ("child pays for parent"), so that is another mechanism that should help unwedge low-fee transactions, though not as useful as replacement.
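
(Editor's sketch of the "ancestor feerate" arithmetic behind child-pays-for-parent: a miner evaluates a stuck low-fee parent together with the child that spends it as one package. Numbers are made up.)

Code:
# Child-pays-for-parent, toy numbers: the miner looks at the package feerate,
# not the stuck parent's feerate alone.
parent = {"fee": 200,   "vsize": 200}   # 1.0 sat/vB on its own -- stuck
child  = {"fee": 5_000, "vsize": 150}   # spends the parent's output, pays generously

parent_feerate  = parent["fee"] / parent["vsize"]
package_feerate = (parent["fee"] + child["fee"]) / (parent["vsize"] + child["vsize"])

print(f"parent alone : {parent_feerate:.1f} sat/vB")    # 1.0 sat/vB
print(f"as a package : {package_feerate:.1f} sat/vB")   # ~14.9 sat/vB -> now worth mining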

Quote
Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately Undecided
Ha, that's unavoidable.  I just do what I can, and try to remind myself that if no one at all is mad then what I'm doing probably doesn't matter.
member
Activity: 97
Merit: 10
June 21, 2016, 03:50:36 AM
#6
doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol
The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.
Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit? What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious if there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is, "infinite", capacity even something we should be targeting?

It is my understanding that Core's implementation of segwit will also include an overall, "blocksize", increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall, "blocksize"?

To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short-term I'm not exactly worried about transaction capacity--certainly not hardfork worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really all I want short-term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long-term--the, "133MB infinite transactions", stage.

Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately Undecided
legendary
Activity: 2506
Merit: 1030
Twitter @realmicroguy
June 20, 2016, 08:43:33 PM
#5
1) The sacred Bitcoin Protocol: I think the currently proposed HF from 1mb to 2mb won't make or break anything.

Correct. It doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol. And Satoshi certainly never intended for it to be static OR a consensus rule. It was a temporary measure only with the intention to "bump it up" if needed.

https://bitcoin.org/bitcoin.pdf

The primary reason it has remained unchanged is that increasing it would create direct competition with other off-chain "sponsored" solutions. The correct answer is to increase the blocksize to 2MB unless a better solution becomes available.

~~
staff
Activity: 4284
Merit: 8808
June 21, 2016, 02:05:29 AM
#5
Malleability is already fixed. Segwit does not fix it further.
Technical point: This is very much not the case: Malleability is blocked in the relay network for a narrow class of transactions but anything clever is exposed, multisig is exposed to other-signer malleation, and all transactions are exposed to malleation by miners. Segwit fixes it in a deep and profound way for all purely segwit transactions.
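
(Editor's sketch of the txid point behind this, deliberately simplified -- the byte strings below are placeholders, not real transaction serialization: a legacy txid commits to the signature encoding, so a third party who re-encodes a valid signature changes the txid, while a segwit txid excludes the witness, which is committed separately via the wtxid.)

Code:
import hashlib

def txid(data: bytes) -> str:
    """Double SHA-256, hex -- the hash used for transaction ids."""
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

# Placeholder fields, just to show what feeds the txid in each scheme.
inputs_outputs = b"inputs|outputs|locktime"
sig_encoding_a = b"signature-encoded-one-way"
sig_encoding_b = b"same-signature-encoded-another-way"  # still valid, different bytes

# Legacy: the scriptSig is inside the hashed data, so re-encoding the
# signature gives a different txid (third-party malleation).
print(txid(inputs_outputs + sig_encoding_a) == txid(inputs_outputs + sig_encoding_b))  # False

# Segwit: signatures live in the witness, excluded from the txid, so the
# txid is fixed once the inputs and outputs are fixed.
print(txid(inputs_outputs) == txid(inputs_outputs))  # True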

doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol
The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.
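
(Editor's sketch of the adjustment mechanism being referred to, assuming the well-known 2016-block / two-week retarget with its 4x clamp; simplified -- the real code works on the compact "bits" encoding and has a known off-by-one in the measured timespan.)

Code:
# Simplified difficulty retarget: every 2016 blocks the target is rescaled by
# how long the window actually took versus the intended two weeks, clamped to
# a factor of 4 in either direction.
EXPECTED_TIMESPAN = 14 * 24 * 60 * 60  # two weeks, in seconds

def retarget(old_target: int, actual_timespan: int) -> int:
    clamped = min(max(actual_timespan, EXPECTED_TIMESPAN // 4), EXPECTED_TIMESPAN * 4)
    return old_target * clamped // EXPECTED_TIMESPAN

# Blocks arrived twice as fast as intended -> target halves (difficulty doubles).
print(retarget(1 << 220, EXPECTED_TIMESPAN // 2) == (1 << 220) // 2)  # True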

Already Ethereum users make the incorrect argument that bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and thus it's okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users.  You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.

The rule-by-math element of Bitcoin is essential to the value proposition; great care should be taken not to erode it based on short-term interests.
legendary
Activity: 4410
Merit: 4766
June 20, 2016, 07:28:07 PM
#4
i actually think the OP's post is quite unbiased and reasonable

here is my insight to address three points (sacred protocol)(corruptible devs)(corruptible pools)
though things should change to adapt to the changing reality of users' needs, we need to accept that even soft forks should need consensus just to be activated, and should need independent testing and a reading of every line of code.
this is because a softfork, due to its nature of NOT needing close to 100% adoption, carries the risk of bad code being added to change things 'on a whim'.
i'm glad mining pools are not going to run any new versions until they have done their own checks and tests... even if greenlit by core.
hero member
Activity: 854
Merit: 1009
JAYCE DESIGNS - http://bit.ly/1tmgIwK
June 20, 2016, 06:20:21 PM
#3
1) The sacred Bitcoin Protocol: I think the currently proposed HF from 1mb to 2mb won't make or break anything. But it'll set a precedent that the bitcoin protocol can be changed according to the demand of a cheering crowd. In the future, there may be a cheering crowd to raise the 21m supply cap. The question is, what do we do then?

Yes, combine 1) with 8), and if the community gets hijacked by socialists, they will demand that a wealth-redistribution mechanism be implemented in the protocol.

It's sort of an extreme example, but it is plausible. That is why change has to be very hard to make, and well tested and thought out before it happens.

What will we do then? Nothing; by then the entire community will already be destroyed. We need to prevent this from happening.

2) I meant fungibility, I always mix the two up. I'll correct it.

5) Yes, as soon as such a thing existed, nations would start hacking each other and it would be chaos, but I don't think it can exist.

7) Yes, but delaying good info is not incentivized. They can earn money and fame by helping bitcoin, so it's against their incentives to do that.

8) Yes, but that is why I want the old guard to stay. So far they hold the most coins and they are the most reputable people in bitcoin, so if they don't sell their coins, we will always have good people controlling the coins and influencing the community with a pro-bitcoin spirit.

legendary
Activity: 1662
Merit: 1050
June 20, 2016, 06:17:15 PM
#2
1) The sacred Bitcoin Protocol: I think the currently proposed HF from 1mb to 2mb won't make or break anything. But it'll set a precedent that the bitcoin protocol can be changed according to the demand of a cheering crowd. In the future, there may be a cheering crowd to raise the 21m supply cap. The question is, what do we do then?

2) Segwit: Malleability is already fixed. Segwit does not fix it further. Segwit is aimed at creating space within the 1mb block size and creating scope for Sidechains & the Lightning Network. But Segwit is complicated. It might become a disaster like The DAO. Hence, as Segwit is a soft fork, there should always be non-Segwit nodes and applications doing non-Segwit transactions, so that in case of a disaster a part of the network survives, and hence bitcoin.

3) Sidechains + Lightning Network: Secondary layer. No problem.

4) Urgent Hardfork: Very unlikely for a robust network like bitcoin.

5) Quantum Computers: FIAT money sitting on the banking network is under much more threat than the Bitcoin network. We are ~10 billion USD, while they are in the trillions. We might as well sit back and enjoy the show.

7) Corruptible devs: Devs cannot enforce anything immediately as a change to the network. They can hold back the existing working system and resist a change. So dev-level corruption is no big threat.

6 & 8) Corruptible miners & Corruptible community: Corruption is a relative word. In a way, a part will always be corrupted. But in reality, only those who have deep skin in the system itself matter. In reality, the community is miners and holders. If there are 1000 corrupted holders holding 0.01 BTC each, they will have less effect on the network than 10 holders holding 10 BTC each, because if at any point holders need to decide the winning chain, the coin-holding majority will decide which chain survives by selling the other chain's coin. A numerical majority of holders is not important. Now, to serve their own interest, the majority holders won't go against the best health of the bitcoin network, corrupted or not.

9) Hardfork not fair for long term savers: Already discussed in the points above.
hero member
Activity: 854
Merit: 1009
JAYCE DESIGNS - http://bit.ly/1tmgIwK
June 20, 2016, 06:08:48 PM
#1
Hello, I've been in bitcoin for 3+ years now and I would like to share my opinion about big blocks for bitcoin (2mb, 8mb, etc.), the Bitcoin hardfork, and bitcoin in general, and what concerns we need to watch out for.

Now there have been many shills from both sides, so to clear that up: I want rational arguments pro/contra big blocks and hardforking bitcoin. I am personally anti-hardfork and therefore anti-big-blocks, and I will demonstrate here why that is the best choice in my opinion. So stop shilling, and let's start debating this like civilized adults. I'll present my arguments here and then you guys can respond to them. The thread will not be moderated, so that I can't be accused of being a shill, but I hope troll posts or low-effort posts (usually one-liners) will be removed by forum moderators. Also remember these are only my opinions based on the knowledge I have, so if I'm wrong, feel free to correct me; I'm open to criticism!




1) The sacred Bitcoin Protocol

The bitcoin protocol is like a constitution: once you start making changes to it, you will pretty soon find yourself with financial tyranny, abuse of power and corruption. Whenever things can be changed easily, they will be. If there is no resistance to change, changes will always happen. Therefore if the bitcoin protocol gets changed even by a small amount, some people will abuse that opportunity and gradually add more changes, until bitcoin drifts off its original mission and becomes centralized, whether that be through some sort of filter/blacklist/censoring system or some other oppressive patch installed into bitcoin. And then bitcoin will fail.

Of course, like any constitution, changes have to happen eventually. You can't have the barbarian constitutions of the middle ages rule modern society, even if in the middle ages they were the sacred norm. Change is inevitable, but it has to be natural, well thought out, and based on real evidence and a great deal of testing.

So in my opinion any change should take at least a generation. You don't want to change bitcoin too fast when it's not even adopted by many people yet. Industry and tech change too fast nowadays, but you have to slow down so that the clients can catch up. A hard fork is definitely not necessary right now (nor do I think it will be in the foreseeable future), because I fear the community is still too young and prone to manipulation; therefore using this 'ring of power' is not an option.


2) Segwit

Segwit is good and necessary to fix fungibility and malleability. As a side-effect, it grows the block capacity as well. It is only a temporary solution in terms of scaling block capacity, because that is not its goal. Its goal was only to fix malleability, which it probably will to some degree.


3) Sidechains + Lightning Network

This is the path of scaling the Core team chose. And I think it's a very elegant, decentralized approach to scaling. It is one way of doing it, and it's the easiest, most immediate and fairest way of doing it. Not to mention the most secure.

It's not a giant blob of code like ETH + smart contracts; it's actually isolated from the bitcoin protocol. Security is only good if one sector is isolated from another. Think of it as a submarine: if the compartments are not isolated, then when it's breached the water will sink the entire sub.

From a transaction perspective, it can scale bitcoin really big time, while adding little or no centralization to the approach. The other, 'classical' way would be to add an intermediary issuing IOU bitcoins that could be traded offchain. We know that is not an option for decentralization, so the LN is really cutting-edge innovation, with little or no centralization cost to the community.

Transactions will be instant, decentralized (or semi-decentralized if you don't host a node), and it would be the optimal choice given the current circumstances.
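
(Editor's sketch of the off-chain idea described above, purely illustrative: two parties fund a channel on-chain once, exchange signed balance updates privately, and only the final state settles on-chain. Real Lightning channels add HTLCs, revocation keys and timelocks, all omitted here.)

Code:
class ToyPaymentChannel:
    """Two-party channel: only opening and closing touch the chain; every
    payment in between is just a new mutually signed balance, held off-chain."""

    def __init__(self, alice_deposit, bob_deposit):
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.state_number = 0   # latest signed off-chain state
        self.onchain_txs = 1    # the funding transaction

    def pay(self, sender, receiver, amount):
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.state_number += 1  # new signed state replaces the old one, off-chain

    def close(self):
        self.onchain_txs += 1   # the settlement transaction
        return dict(self.balances)

channel = ToyPaymentChannel(alice_deposit=100_000, bob_deposit=0)
for _ in range(1_000):          # a thousand payments...
    channel.pay("alice", "bob", 10)
print(channel.close(), "on-chain transactions:", channel.onchain_txs)  # ...only 2 on-chain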

Therefore it is good to have this elegant, secure, and innovative approach to scaling. It also enables new features for bitcoin that can be used to expand business, the economy and other functions.

It is truly a piece of software art, crafted by some of the smartest people on the planet.


4) Urgent Hardfork

Well, if the hardfork is really a 100% emergency, like ECDSA getting broken, then it's an emergency and unfortunately it will have to be implemented. People will not like it, but it will be a true emergency so they will have no choice. However, the probability of this is very low.

More worrisome is this: once the devs have tasted the power of a hardfork, then what? What classifies as an emergency? Can they just roll out a hardfork every week when an 'emergency' comes along? We go back to point 1), and how, if things get too crazy, it can end bitcoin.

These questions have to be answered, and the community should always be skeptical.


5) Quantum Computers

I hear this argument many times, and it's mostly fearmongering: what if quantum computers become reality and can break bitcoin...?

Well, I personally don't believe in large-scale quantum computers; I think they won't ever exist, and there is plenty of evidence that supports this. Yes, some theoretical laboratory machines might exist, but there are hard thermodynamic limits that prevent the existence of large ones.

It's the same nonsense as the free-energy advocates who think there is some hidden energy source, when 7th-grade physics demonstrates otherwise.

However, classical computers might one day become powerful enough to find some exploit in the crypto algorithms (since brute forcing is obviously near-impossible).

And for that we should react as in point 4), but not sooner. Bitcoin doesn't need 'what if' patches. Bitcoin doesn't need insurance against an alien invasion; we can deal with it when the threat becomes sizeable.


6) Corruptible miners

What if the miners become untrustworthy? Low probability again, or it will be temporary. Game theory tells us that people working for incentives won't do things that undermine those incentives; in other words, people don't bite the hand that feeds them. If miners become shady, then either they will run out of money before they can do too much evil, or they will be replaced by honest ones who would earn more by mining honestly. Not to mention that if they really piss off the community, everyone will just sell their coins, and the mining business, as well as bitcoin, will be over. If they implement capital controls to slow down the dumping, then the price will crash to 0 because nobody can buy, just as nobody can sell. And so on with many more theoretical attacks, but they all end in disaster for the attacker.  So a miner betrayal would be temporary at best.


7) Corruptible devs

So far we have had very honest, reputable developers. What if that changes and they get replaced by corrupt ones?

Well, as the saying goes, you reap what you sow. If the bitcoin community is made up of programmers, IT experts and PC experts, then you will always find the best among them to work on the code. If the bitcoin community is made up of morons, then bitcoin will end long before that.

So it's more important to keep the community healthy, because the community will supply the developers and bitcoin experts.

Also, the bitcoin developers are decentralized: even if the Core team has control over the github repository, anyone can host bitcoin's source code on their own server and work on the code.

The enforcing mechanism in bitcoin is the miners, and I've explained in point 6) why miners are not prone to betrayal. So a dev's incentive is to show his talent, gain fame, get donations, and build a career. The miners enforce the rules, while the community decides whether they like it or not.

So if a dev becomes corrupt, he will achieve little but lose everything, especially his reputation.


8) Corruptible community

Devs might become corrupt, miners might too, but all of those would be temporary issues. What if the community becomes corrupt?

Yes, this is one of the most concerning factors: if the old guard leaves or gets bored of bitcoin and gets replaced by 80-IQ morons, illiterate people, or simply opportunists who don't care about bitcoin and only want quick profit, then what?

Well, this is one of the biggest and most realistic threats to bitcoin. If the community is dumb, it can be manipulated and divided in order to be controlled. Everyone will try to steer the community toward their own incentives, and whoever can fool the most people will have control over bitcoin. It is a scary possibility.

My preference would be moderate adoption, with many tutorials for newbies, teaching them about bitcoin and its values. People need to feel at home in bitcoin and appreciate its qualities.

Also, the old guard should never leave; if they do, they are pussies, and whatever happens to bitcoin will be partially their fault. So people should educate newbies and teach them why bitcoin is the way to go.

9) Hardfork not fair for long term savers

Most of the bitcoin old guard, as well as new investors, want to save bitcoin for the long term, usually in a secure wallet. If you roll out a hard fork every year, then they will always have to upgrade their wallets and work with sensitive information, which would expose them to unnecessary risk.

If they don't, they can lose their money. What if the changes are not announced? What if they don't keep up with bitcoin news? You can't just push changes that fast, because people will lose confidence in bitcoin's long-term future.

If a person wants to put coins away for his kid's 18th birthday and forget about that wallet for many years (while keeping the private keys backed up and secure), he doesn't want to have to stay up to date with every fix and patch.

Even other software usually supports a single version for many years, so why can't bitcoin be the same? It is, and that is how it should be!

So the SET & FORGET approach that most people feel comfortable with will be violated, and that will upset many bitcoin whales.

10) Hardfork risks the entire network

Currently only 37% of the network runs the latest software, which is good, because if a bug gets discovered the other 63% will act as a backbone.

If you want to force a hardfork, then everyone has to upgrade before the grace period runs out, so the whole network will run the same version. If a bug or malware gets discovered in the hardforked client, then the entire bitcoin network will be destroyed.

Therefore hardforks are ultra-vulnerable to 0-day (or undiscovered) bugs. Decentralization should mean that you cannot be 100% confident in the developers, because we are all human and we make mistakes; so even if the devs have checked the code many, many times, you still cannot risk the network, because 0-day bugs are very frequent, even in huge projects (*cough* Ethereum *cough*).