
Topic: Anti-fork guys: What is YOUR proposal? - page 4. (Read 5433 times)

hero member
Activity: 854
Merit: 1000
June 01, 2015, 08:57:59 PM
#35
I don't understand why people refuse the fork if it's something necessary.

Everybody refuses, but nobody proposes anything.
newbie
Activity: 11
Merit: 0
June 01, 2015, 08:52:35 PM
#34
Take action automatically: if blocks have been, on average since the last difficulty adjustment, more than 62% full, increase the allowed block size at the next difficulty adjustment. If they are less than 38% full, vice versa, decrease it. Use the same controller logic now in place for difficulty to keep block fullness at ~50%.

I would have to take much longer to ponder the consequences, but I really like the idea of maintaining block fullness at ~50%.
legendary
Activity: 1022
Merit: 1000
June 01, 2015, 07:40:55 PM
#33
I too really like the graph, thanks for posting it.  The scale is logarithmic, so it makes blocks look fuller than they currently are, but the point is still clearly made.

I think the only potential issue I have not seen discussed is what kind of transactions are in the blocks.  Are there still lots of dust transactions?  If so, then maybe there is a little more time as those transactions get squeezed out.  I remember there was a time when SatoshiDice was responsible for something like 75% of all blockchain transactions.  I presume those days are long gone.

Given that the block size started out a lot bigger, moving it back to where it originally was would seem to make sense.  I just assume the downside is that the blockchain gets even bigger.

Thanks to all the posters who tried to explain what was going on, this was one of the better threads I saw on this.
legendary
Activity: 1666
Merit: 1057
Marketing manager - GO MP
June 01, 2015, 02:32:33 PM
#32
Take action automatically: if blocks have been, on average since the last difficulty adjustment, more than 62% full, increase the allowed block size at the next difficulty adjustment. If they are less than 38% full, vice versa, decrease it. Use the same controller logic now in place for difficulty to keep block fullness at ~50%.

OK, that's not strictly anti-fork; just that if there has to be a fork, make it count and figure out a smart solution.

I think it would be even better to adjust the difficulty in relation to block fullness, so that confirmation times and the supply of new coins are related to network demand, but I know that would be seen as too radical, so I'm willing to settle for intelligent block-size adjustment instead.
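The proposal above could be sketched roughly like this. This is a hypothetical illustration, not code from any Bitcoin client; the thresholds come from the proposal, while the 10% step size and all names are my own assumptions:

```python
# Hypothetical sketch of the proposed block-size controller.
# It would run once per difficulty-retarget period (every 2016 blocks).
UPPER_THRESHOLD = 0.62   # grow the cap when average fullness exceeds this
LOWER_THRESHOLD = 0.38   # shrink the cap when average fullness is below this
STEP = 1.1               # assumed gentle 10% adjustment per period

def next_max_block_size(current_max: int, avg_fullness: float) -> int:
    """Nudge the block-size cap so average fullness drifts back toward ~50%."""
    if avg_fullness > UPPER_THRESHOLD:
        return int(current_max * STEP)
    if avg_fullness < LOWER_THRESHOLD:
        return int(current_max / STEP)
    return current_max  # inside the dead band: leave the cap alone

# e.g. 1MB blocks averaging 70% full would grow the cap by 10%
```

The dead band between 38% and 62% keeps the cap from oscillating on normal block-to-block variance, which would be the point of targeting a range rather than exactly 50%.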
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
June 01, 2015, 02:09:00 PM
#31
This is the situation and what we know:

[image not preserved: log-scale chart of average block size approaching the 1MB limit]

So we are heading towards a dead end. Now, Gavin has proposed a fork to 20MB. That's a proposal. I don't know how long 20MB would last. Ideally, it should be able to deal with the transactions of all the world's biggest electronic payment platforms combined. So, can 20MB handle the volume of, for example, Western Union, PayPal, Mastercard and so on, combined? Because that is where we should aim for the future. Bitcoin must be able to engulf all of that volume to become the #1 electronic transaction method on the planet. Everything else is bullshit.
Will 20MB be enough for this, or will we need ANOTHER fork in the future? If 20MB is enough, it's a reasonable proposal. Sure, forking sucks, but I'm not hearing any other realistic solutions. Of course, like I said before, I'm assuming 20MB is the definitive fork. We can't be forking every X years; that's just retarded.

What is not reasonable is making a ton of FUD threads saying how Gavincoin will kill BTC, while implying that staying with the 1MB block size limit is all fine and dandy. So every time you say Gavin's proposal is shit, explain why AND propose your own solution; otherwise you are just spreading cheap FUD.

Another question I have: what is Satoshi's view on this? I'm sure he expected this to happen, so I guess he had an opinion before leaving.

Wow, what a manipulative graph you posted. It makes it look like the blocks are already 95% full at the moment, and that 20MB is the last hope. I don't like such manipulations of the mind.

There is plenty of time to reach a consensus. No hasty decision is needed; at least that's something the graph shows clearly. Especially not when the chief coder claims he will go to a company that will somewhat privatize Bitcoin. It looks more to me like Bitcoin-XT has finalized its plans and now needs a great ICO, or whatever you would call it.
hero member
Activity: 700
Merit: 501
June 01, 2015, 10:49:08 AM
#30
How will the limits affect the value per bitcoin?

Probably some initial dump due to FUD and random mistrust, from some whales especially, but we will stabilize back to the current price after that.

One could expect sub-$200 this time, more easily than before, I guess...

But sub-$200 in what coin? Because as far as I know, we will have 2 coins now: the old coin (on the old blockchain) and the new coin, on the new blockchain. That means 2 prices, means a new coin needs to be added on exchanges, means companies will have to choose between the 2 blockchains, means a lot of things. This is a total mess.
full member
Activity: 134
Merit: 147
June 01, 2015, 09:36:45 AM
#29
In a recent interview, Mike Hearn and Gavin Andresen said themselves that they are not sure how close we are to hitting the limit, which they further explained could be years down the line or potentially 1 month from now.

All those claiming to know when we are going to hit it are speaking total bullshit, in all honesty. Anyone who is anti-fork, though, needs to consider what other solutions there are; using sidechains will not be any better than the proposal Gavin is offering. It looks to me like the core developers and Gavin and Hearn are taking this too personally, maybe gmaxwell more than anyone else.

Although I do agree with a lot gmaxwell has said, the fact of the matter is that something needs to be done, even setting aside the proposal gmaxwell is offering. Hopefully we can come up with a solution where we don't have to take Gavin's approach, although I believe consensus will be reached if Gavin does go his way.
hero member
Activity: 686
Merit: 500
June 01, 2015, 09:32:21 AM
#28
Proposal 1: Show some respect to Satoshi. Make it 21MB at least.

Proposal 2: Some work on faster confirmation times, or mass-adopt 0/1-confirmation acceptance.
newbie
Activity: 17
Merit: 0
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
June 01, 2015, 07:16:30 AM
#26
This is the situation and what we know:

[image not preserved: log-scale chart of average block size approaching the 1MB limit]

This image explains the dilemma perfectly and succinctly.  More people need to see it in the various threads where the issues are being discussed (although I'm still pretty sure most people don't understand the implications of how the network would behave if it were running with full blocks).  I think I'll even go back and edit some of my previous posts to include it, and I recommend others do the same.
legendary
Activity: 1876
Merit: 1000
June 01, 2015, 04:51:33 AM
#25
the obvious: Litecoin ;)
hero member
Activity: 605
Merit: 500
June 01, 2015, 04:25:48 AM
#24
This may have been missed by others in Bitcoin Discussion, so I would recommend listening to the Let's Talk Bitcoin #217 podcast; it helps a lot in looking at this objectively.
It's pretty informative at explaining both sides, whether 1MB or 20MB, and provides a good understanding of the issues.
https://letstalkbitcoin.com/blog/post/lets-talk-bitcoin-217-the-bitcoin-block-size-discussion

Stephanie, Andreas and Adam speak first with Bitcoin Foundation Chief Scientist Gavin Andresen about the Bitcoin Block Size debate, where it came from, why it matters and what he thinks we should do about it.

During the second half of the show, Bitcoin Developer and outspoken block size increase opponent Peter Todd joins the discussion to share his perspective on the issue, the problems he sees and why taking action now actually prevents the problem from finding a market solution.

---
Let it run to the limit; then, if we have no choice, fork, but try to resolve bandwidth issues to scale up, while increasing the minimum tx fee to reduce load.

Thank you for posting this, E... Now both sides of the argument are clearer and make much more sense.
hero member
Activity: 700
Merit: 500
June 01, 2015, 02:38:43 AM
#23
This may have been missed by others in Bitcoin Discussion, so I would recommend listening to the Let's Talk Bitcoin #217 podcast; it helps a lot in looking at this objectively.
It's pretty informative at explaining both sides, whether 1MB or 20MB, and provides a good understanding of the issues.
https://letstalkbitcoin.com/blog/post/lets-talk-bitcoin-217-the-bitcoin-block-size-discussion

Stephanie, Andreas and Adam speak first with Bitcoin Foundation Chief Scientist Gavin Andresen about the Bitcoin Block Size debate, where it came from, why it matters and what he thinks we should do about it.

During the second half of the show, Bitcoin Developer and outspoken block size increase opponent Peter Todd joins the discussion to share his perspective on the issue, the problems he sees and why taking action now actually prevents the problem from finding a market solution.

---
Let it run to the limit; then, if we have no choice, fork, but try to resolve bandwidth issues to scale up, while increasing the minimum tx fee to reduce load.
legendary
Activity: 3248
Merit: 1070
June 01, 2015, 02:31:52 AM
#22
How will the limits affect the value per bitcoin?

Probably some initial dump due to FUD and random mistrust, from some whales especially, but we will stabilize back to the current price after that.

One could expect sub-$200 this time, more easily than before, I guess...
sr. member
Activity: 266
Merit: 250
How will the limits affect the value per bitcoin?
sr. member
Activity: 770
Merit: 250
I'm not anti-fork, nor am I really bothered which "stick" everyone ends up running with, but the problem remains that BTC cannot scale to be a general world currency.

The only solution is a distributed ledger of some form, but on a single block chain that is not possible to do.  Side chains could be classified as a form of distributed ledger, and may work with some degree of success, but they are more a form of partitioning than of distribution, because you cannot be sure that all side chains are using compatible properties or protocols to interact with each other directly, and so there is additional overhead to enforce this.

To achieve the load capabilities that would allow any crypto-currency to truly become a mass-market, general world currency, and stay true to the original ideals of Bitcoin (100% decentralized), said currency must be developed from the beginning with a viable, working distributed-ledger solution in mind.

With BTC, there is no solution, period.  Increasing the block size will never solve this problem if BTC were to become truly mainstream, due to the block size required to cope with loads of 100s to 1000s of transactions per second.  If block sizes reach 100MB+ to deal with that kind of load, the end result is more centralization and fewer full nodes on the network, due to the costs and inconvenience of maintaining/running a full node.  The block chain size can be mitigated with some form of pruning, but you still have the bandwidth and processing requirements to ensure full nodes are able to deal with those high transaction loads.

I've been looking at both sides of the equation and I have to agree. The proposal to just raise the block size limit to 20MB isn't solving anything; it's just prolonging the issue (a careless quick temporary fix, if you will). Bitcoin still won't be able to scale.
sr. member
Activity: 462
Merit: 250
I'm anti-fork and my proposal is to just fire Gavin and be happy thereafter.
And he should take his fan club with him!
OK, and what will you do when the world demands more Bitcoin transactions? If we want to make Bitcoin bigger than the baby it already is, we need changes. Please do not be afraid of change; that will kill you...

More forks may be needed in the future, but I'm not sure fiat will disappear, leaving the economic world to Bitcoin or any other...
legendary
Activity: 2786
Merit: 1031
I don't think we'll see many replies here; people are too busy making FUD threads while not proposing any solution. Then again, what gentlemand said is a problem: if we'll need more and more forks, it doesn't sound good. Hopefully, if they make a fork, it's one that adapts itself to this problem so we never need to fork again.

The proposed alternatives are not good enough, hence the only solution is to go ahead and make this change. The other 4 devs are worried that in the future it might be necessary to fork again because 20MB would not be enough, but with 20MB we are in the same boat as Visa or Mastercard (if I'm not mistaken; I don't remember their number of transactions per second). Talking about transactions per second, worst case we should do 30MB right now.


That is the reason why we should make the increase automatic. Don't forget, we are buying time (years), and other solutions will be developed too!


How do we deal with people who "forget" to upgrade their clients, or can only do it at a high cost?

Serious financial software can't be upgraded every day.

That's why Gavin already began this discussion: it will take many months for the majority of the network to upgrade. That's the thing with distributed networks; one has to plan way ahead.
sr. member
Activity: 728
Merit: 256
Proposal 1: Show some respect to Satoshi. Make it 21MB at least.

Proposal 2: I need to understand the lightning.network before proposing this part.
legendary
Activity: 1050
Merit: 1016
I don't think we'll see many replies here; people are too busy making FUD threads while not proposing any solution. Then again, what gentlemand said is a problem: if we'll need more and more forks, it doesn't sound good. Hopefully, if they make a fork, it's one that adapts itself to this problem so we never need to fork again.

The proposed alternatives are not good enough, hence the only solution is to go ahead and make this change. The other 4 devs are worried that in the future it might be necessary to fork again because 20MB would not be enough, but with 20MB we are in the same boat as Visa or Mastercard (if I'm not mistaken; I don't remember their number of transactions per second). Talking about transactions per second, worst case we should do 30MB right now.

20MB gets Bitcoin nowhere even close to Visa/Mastercard processing capacity.

Visa on a regular day processes 2,000-3,000 tx/s, and has a peak capacity of about 40,000 tx/s.

20MB blocks give Bitcoin a theoretical capacity of about 140 tx/s; in practice it will be more like 70 tx/s.

For Bitcoin to match Visa alone on a normal day would require block sizes of around 500MB; to handle Visa's peak of 40,000 tx/s, blocks would need to be about 10GB.

Thanks for giving me the exact numbers. If those are correct, we only need another fork to 400-500MB, another x20 rise. I don't find this to be as big a problem as the other 4 devs are claiming and complaining about.

No problem.

It could become a big problem; read my post further up in this thread.  If Bitcoin were to become truly mainstream in the next 1-2 years, then the technology required to support that load, such as connection bandwidth, CPU power, and storage, could be priced out of the "guy on the street" budget range.  If normal users are not able, or willing, to run full nodes and ensure real decentralization, then you end up in a mess.
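The capacity figures quoted above are easy to sanity-check with back-of-envelope arithmetic. The average transaction sizes here (~250 bytes theoretical, ~500 bytes for larger real-world transactions) are my own assumptions, chosen because they roughly reproduce the quoted numbers:

```python
BLOCK_INTERVAL_S = 600  # one block every ~10 minutes on average

def tx_per_second(block_size_bytes: float, avg_tx_bytes: float) -> float:
    """Throughput implied by a given block-size cap."""
    return block_size_bytes / avg_tx_bytes / BLOCK_INTERVAL_S

def block_size_needed(target_tps: float, avg_tx_bytes: float) -> float:
    """Block size (bytes) needed to sustain a target throughput."""
    return target_tps * avg_tx_bytes * BLOCK_INTERVAL_S

print(tx_per_second(20e6, 250))             # ~133 tx/s theoretical for 20MB
print(tx_per_second(20e6, 500))             # ~67 tx/s with larger average txs
print(block_size_needed(3000, 250) / 1e6)   # ~450MB for a normal Visa day
print(block_size_needed(40000, 500) / 1e9)  # ~12GB for Visa's 40k tx/s peak
```

These land in the same ballpark as the ~140/70 tx/s, ~500MB, and ~10GB figures above; the exact values depend entirely on the assumed average transaction size.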


I'm not anti-fork, nor am I really bothered which "stick" everyone ends up running with, but the problem remains that BTC cannot scale to be a general world currency.

The only solution is a distributed ledger of some form, but on a single block chain that is not possible to do.  Side chains could be classified as a form of distributed ledger, and may work with some degree of success, but they are more a form of partitioning than of distribution, because you cannot be sure that all side chains are using compatible properties or protocols to interact with each other directly, and so there is additional overhead to enforce this.

To achieve the load capabilities that would allow any crypto-currency to truly become a mass-market, general world currency, and stay true to the original ideals of Bitcoin (100% decentralized), said currency must be developed from the beginning with a viable, working distributed-ledger solution in mind.

With BTC, there is no solution, period.  Increasing the block size will never solve this problem if BTC were to become truly mainstream, due to the block size required to cope with loads of 100s to 1000s of transactions per second.  If block sizes reach 100MB+ to deal with that kind of load, the end result is more centralization and fewer full nodes on the network, due to the costs and inconvenience of maintaining/running a full node.  The block chain size can be mitigated with some form of pruning, but you still have the bandwidth and processing requirements to ensure full nodes are able to deal with those high transaction loads.

But didn't Satoshi want Bitcoin to (ideally) become a world currency capable of dealing with that volume? I don't think he wanted Bitcoin to stay some small niche payment method.

I can't speak for Satoshi, nor his long-term intentions, but you need to remember that Bitcoin was/is just an experiment.

If I were Satoshi and had all these ideas, Bitcoin in its current incarnation would have been the first step, a proof of concept, and not the final end game.  I'm certain he was aware of its limitations; he was a smart guy, so he would also have known that achieving global-currency status would require later refinement, revisions and work.

I think what is more likely is that Bitcoin ran away from him before he had a chance to get to that point; it got more attention than he thought it would, and he got nervous about the implications.