
Topic: The Blocksize Debate & Concerns (Read 11213 times)

sr. member
Activity: 294
Merit: 250
July 17, 2016, 03:30:34 AM
We can't put all our eggs in the off-chain basket.  There will be consequences if miners don't see sufficient ROI.

The quotes you used are correct: any fees taken away from miners will have consequences in reduced Bitcoin security, especially in the future when fees become much more important and represent a higher % of miners' income than today.

Any fees paid in off-chain solutions are fees which miners won't get anymore. That's why it's a very dangerous precedent to try to limit the blocksize so that no more fee-paying transactions can be included, even though current technology easily supports slightly bigger blocks and more transaction fees to collect, and miners understand this.
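For context on how fast the subsidy half of miner income shrinks, here is a minimal sketch of the well-known halving schedule (50 BTC at genesis, halving every 210,000 blocks, roughly every 4 years). The function is illustrative, not Bitcoin Core's integer-satoshi implementation:

Code:
# Illustrative sketch of the block subsidy schedule; Bitcoin Core uses
# integer satoshis and a bit shift, this simplified version uses floats.
def block_subsidy(height):
    halvings = height // 210_000     # one halving roughly every 4 years
    return 50.0 / (2 ** halvings) if halvings < 64 else 0.0

print(block_subsidy(0))        # 50.0  (2009)
print(block_subsidy(420_000))  # 12.5  (the July 2016 halving)
print(block_subsidy(840_000))  # 3.125 (two halvings later)

Each halving, fees have to cover a larger share of the security budget, which is the crux of the argument above.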
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
July 17, 2016, 01:57:13 AM
It (lightning) will certainly help with scaling in the short term, but over longer periods, the real question becomes what percentage of transaction fees are skimmed off the top and don't end up going to the miners securing the main chain?  It could potentially hit miners hard later down the line if we don't strike the balance right.  Too much traffic on the main chain is bad, but too much off-chain could be equally precarious.

So yeah, you're right in that it (liquid) helps increase liquidity, but at the same time, it's allowing exchanges to move large volumes of funds around while minimising contact with the main chain.  Your vision of the future is that traditional financial institutions and big business will settle on the main chain and pay big fees to miners, but if they see exchanges doing it off-chain and still maintaining a high level of security whilst paying less in the process, why wouldn't other industries follow suit in a similar manner?

Would be somewhat ironic if most big businesses jumped on sidechains and you had to start begging people to put their cups of coffee on the blockchain just so the miners get a bit of income.  :P

Future growth will happen on layer 2 services in exchanges and third-party services who will charge fees. These service fees will bypass miners, and as the Bitcoin security subsidy halves every 4 years, we know miners will get less and less revenue.

Absolutely.  I've argued before that the smallblock militants suck at math and logic, and I've still yet to hear a counter-argument from them that makes a shred of sense.  Either we get more users paying fees to the miners, or the fees have to rise.  Diverting fees to third parties makes it even worse.

We can't put all our eggs in the off-chain basket.  There will be consequences if miners don't see sufficient ROI.


In a really open system, where there are no limits - except the 21 million cap - it would all be up to game theory, and on- and off-chain scaling projects and groups would deliver a very nice solution in fair competition.
What we have now is still BSLaudations....
Once the Chinese understand this, and they should soon, they could / should open up the system with their own code.
legendary
Activity: 3948
Merit: 3191
Leave no FUD unchallenged
July 16, 2016, 06:06:20 AM
It (lightning) will certainly help with scaling in the short term, but over longer periods, the real question becomes what percentage of transaction fees are skimmed off the top and don't end up going to the miners securing the main chain?  It could potentially hit miners hard later down the line if we don't strike the balance right.  Too much traffic on the main chain is bad, but too much off-chain could be equally precarious.

So yeah, you're right in that it (liquid) helps increase liquidity, but at the same time, it's allowing exchanges to move large volumes of funds around while minimising contact with the main chain.  Your vision of the future is that traditional financial institutions and big business will settle on the main chain and pay big fees to miners, but if they see exchanges doing it off-chain and still maintaining a high level of security whilst paying less in the process, why wouldn't other industries follow suit in a similar manner?

Would be somewhat ironic if most big businesses jumped on sidechains and you had to start begging people to put their cups of coffee on the blockchain just so the miners get a bit of income.  :P

Future growth will happen on layer 2 services in exchanges and third-party services who will charge fees. These service fees will bypass miners, and as the Bitcoin security subsidy halves every 4 years, we know miners will get less and less revenue.

Absolutely.  I've argued before that the smallblock militants suck at math and logic, and I've still yet to hear a counter-argument from them that makes a shred of sense.  Either we get more users paying fees to the miners, or the fees have to rise.  Diverting fees to third parties makes it even worse.

We can't put all our eggs in the off-chain basket.  There will be consequences if miners don't see sufficient ROI.
sr. member
Activity: 336
Merit: 250
July 11, 2016, 10:08:33 PM
legendary
Activity: 4410
Merit: 4766
July 11, 2016, 07:00:57 PM
You're the ones that want to postpone it now and are going to blame Core later for "delaying" a capacity increase. I'm sure of it.

oh lauda.. you're switching..
one day segwit is the ultimate and only capacity solution
next day segwit is not the capacity solution, it's designed to solve other problems that are made pointless to solve (malleability vs RBF)
next day it has capacity increases as a side effect
next day segwit is not intended to be a capacity increase solution.

seriously.. stop overselling segwit to fit one narrative and then underselling it to pretend you have not dedicated your last 6 months of life to pushing for segwit as an ultimate capacity solution.
legendary
Activity: 4410
Merit: 4766
July 11, 2016, 06:43:20 PM
the question is not about 2 directions constantly running.. it's about taking a certain route and everyone following.
because consensus is about making a choice and going for that single route.
we won't have a point where there are 2, 3, 4 different chains, all with the 7 years of bitcoin data and satoshi's genesis block, all working at the same time forever.

again, if you think that 2 forks will survive then you have been fed too much of the blockstream rhetoric.

the miners and merchants and users would eventually choose one, and the other would just disappear into orphans and eventually no one would build on that chain..

put it this way: if 95% of mining pools upgraded to segwit tomorrow, then within 2 weeks EVERYONE would/SHOULD upgrade too, otherwise they are no longer full nodes.. (yes, users not upgrading dilutes the node count of FULL VALIDATING NODES, which has its own risks)

so knowing everyone could/should upgrade to remain full validating nodes, the update might as well include the extra buffer too.
and then at a later date miners can decide when it's safe for them to include data above the old limit.

emphasis: releasing the code does not cause a fork. only miners making bigger blocks when the community is not ready risks a fork.. which won't happen, due to orphans costing the miners money.
so the simple solution is: release the code and there won't be a risk.

again, simply preventing the code from even being publicly available is the cause of controversy, core fanboys. not miners, not merchants.. not users.. just devs causing controversy by limiting choice.
if the code were available, people could move across.. then there won't be any orphans or issues, because miners would only push past the old limits when they don't see it as an orphan risk. and that would only happen if the devs got their heads out of the sand.
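For readers unfamiliar with the 95% figure: segwit was expected to activate via BIP9 version bits, which tallies miner signalling over 2016-block windows. A minimal sketch of that tallying, with illustrative names rather than Bitcoin Core's actual code:

Code:
# Hedged sketch of BIP9 version-bits tallying; names are illustrative.
WINDOW = 2016        # one difficulty retarget period
THRESHOLD = 1916     # 95% of 2016 blocks must signal

def signals(version, bit):
    # BIP9 blocks set the top version bits to 001 plus the deployment bit.
    return (version >> 29) == 0b001 and (version >> bit) & 1 == 1

def locked_in(window_versions, bit):
    # LOCKED_IN once >= 95% of a full window signals; activation follows a
    # window later, which is when non-mining nodes should have upgraded.
    return sum(signals(v, bit) for v in window_versions) >= THRESHOLD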
hero member
Activity: 854
Merit: 1009
JAYCE DESIGNS - http://bit.ly/1tmgIwK
July 11, 2016, 05:27:23 PM
It's both interesting and sad how nobody ('forkers') mentioned this before Segwit was actually delivered. You're the ones that want to postpone it now and are going to blame Core later for "delaying" a capacity increase. I'm sure of it.

To tell you the truth, I don't really like Segwit and the "hacks" that bitcoin BIPs represent.

However, I also don't want to turn bitcoin into a political scene, where you have each group representing their own interests and trying to impose their own hardforks on everyone else.

It is clear that bitcoin needs a separation of powers, and that is why hardforks should be an unthinkable option. The mere fact that people even mention them proves that humans have learned nothing from the politics of the past 5000 years.


You will have:

SocialistBTC
LiberalBTC
ConservativeBTC
MarxBTC

Folks, do we really need that shit?
legendary
Activity: 2674
Merit: 2965
Terminated.
July 02, 2016, 01:40:23 AM
Segwit on Litecoin first: 2016!
It's both interesting and sad how nobody ('forkers') mentioned this before Segwit was actually delivered. You're the ones that want to postpone it now and are going to blame Core later for "delaying" a capacity increase. I'm sure of it.

No, and yes, that is the storyline. The simple solution sits in the wheelhouse (or was it bikeshed?), while the complex one is rushed along.
While the change in the block size limit might seem simple, a hard fork is definitely anything but simple. Additionally, you can see that 2 MB is unsafe without those additional (new) artificial limits added by Gavin. I also don't feel like Segwit is being rushed at all. It's been worked on since 2015 and tested for several months.
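The "artificial limits" point is about quadratic signature hashing: under the legacy sighash scheme, each input re-hashes roughly the whole transaction, so validation work grows with the square of a worst-case transaction's size. A rough, illustrative estimate (the input counts are hypothetical, not measured):

Code:
# Rough sketch of legacy (pre-segwit) sighash cost; numbers illustrative.
def legacy_sighash_bytes(n_inputs, tx_size):
    # Each input's signature hashes ~the entire serialized transaction.
    return n_inputs * tx_size

# Doubling a worst-case transaction roughly quadruples the bytes hashed:
print(legacy_sighash_bytes(5_000, 1_000_000))    # ~5 GB at a 1 MB tx
print(legacy_sighash_bytes(10_000, 2_000_000))   # ~20 GB at a 2 MB tx

Hence the extra per-transaction limits in the 2 MB proposals, and segwit's new sighash scheme that avoids the quadratic blowup.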
member
Activity: 117
Merit: 10
July 02, 2016, 01:14:03 AM
Well, we can't really state that Bitcoin is very well designed. I'm certain that the developers who are working on it currently would do a lot of things differently if they could.
Created sick, and commanded to be sound.


The only 'good' thing about altcoins is that they can be used to test out some 'difficult' ideas.
Segwit on Litecoin first: 2016!


I don't think that we are going to see such a radical re-design (what are the limitations of a HF?)
https://en.wikipedia.org/wiki/Omphaloskepsis


so there's not much 'point' in discussing it, at least not in this thread.
Agreed


I do wonder if they can get RC1 released before the halving (even though Segwit still has no activation parameters). However, rushing is usually not the right thing to do.
No, and yes, that is the storyline. The simple solution sits in the wheelhouse (or was it bikeshed?), while the complex one is rushed along.
legendary
Activity: 2674
Merit: 2965
Terminated.
July 01, 2016, 02:06:35 PM
I see; the pesky block format is the issue, in my opinion, because it's standardized. If the transactions were each put arbitrarily in blocks, it would be enough to just verify their hashes to see that they are original, and then there would be no need for the block format.
It would be a very big deviation from the protocol, but it would be interesting to see an altcoin try this.
Well, we can't really state that Bitcoin is very well designed. I'm certain that the developers who are working on it currently would do a lot of things differently if they could. The only 'good' thing about altcoins is that they can be used to test out some 'difficult' ideas. I don't think that we are going to see such a radical re-design (what are the limitations of a HF?), so there's not much 'point' in discussing it, at least not in this thread. I do wonder if they can get RC1 released before the halving (even though Segwit still has no activation parameters). However, rushing is usually not the right thing to do.

AFAIK the only thing implemented so far is segwit... which only (temporarily) avoids a hardfork, and still increases bandwidth, which is the supposed boogieman of "government control".
Compact blocks aim to improve bandwidth requirements, i.e. lower them. Additionally, have we already forgotten all the excellent upgrades that they've delivered (e.g. libsecp256k1)?
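To make the compact-blocks point concrete, rough arithmetic under BIP152, which relays a block as a header plus 6-byte short transaction IDs (plus whichever transactions the peer is missing):

Code:
# Rough BIP152 arithmetic; assumes the peer already has every transaction
# in its mempool, so only the header and the short IDs travel the wire.
HEADER = 80      # serialized block header bytes
SHORT_ID = 6     # bytes per short transaction ID

def compact_block_bytes(n_txs):
    return HEADER + n_txs * SHORT_ID

print(compact_block_bytes(2_000))   # ~12 KB, vs ~1,000,000 for a full block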
hv_
legendary
Activity: 2534
Merit: 1055
Clean Code and Scale
July 01, 2016, 12:19:53 PM
summary of mindsets

blockstream: "we need to get to a capacity that competes against Visa, they do thousands of transactions a second but they settle in days, bitcoin needs to do the same but in 10 minutes"
community: "so you want to invent LN which settles every.. umm...week, month, never? hmmmm"

blockstream: "we need to fix malleability so people can trust zero confirms"
community: "so you invent RBF to make zero-confirm untrustworthy again"

blockstream: "hardforks are bad because everyone has to move over on day 0"
community: "some people already run implementations with higher limits. after all, it's a 0byte->Xmb rule change, not an exceed-1mb-at-all-costs rule. so people can run, test, and bugfix the higher-limit implementations even now, giving plenty of time. also, softforks require pools to upgrade too before they activate. so it's literally the same boat.."

blockstream: "we dont want to dilute the node count with bigger blocks"
community: "segwit also increases the data for nodes. in fact there is evidence of 2.8mb blocks.. not only that, but segwit introduces a pruned no-witness mode which will definitely dilute the node count"

blockstream: "a hard fork wants to be 8gb blocks next year, run everyone, the world will end if that is allowed"
community: "2mb is an acceptable amount of data, and it took a lot of 2015 for lots of people to drown out the doomsayers and find a number the majority were happy with. and it will grow NATURALLY as technology and ability grow (slowly). there is no end-of-days meteor incoming in the next year"

blockstream: "anything not blockstream is an altcoin"
community: "anything connecting to the network of the bitcoin genesis block and 7 years of bitcoin data is not an altcoin"

great post Franky.


Yes
Should be repeated every day now!
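On the "2.8mb blocks" figure in the quoted summary: under BIP141 a block is limited by weight (3 x base size + total size <= 4,000,000) rather than by raw bytes, so the on-disk size depends on how witness-heavy the transactions are. A back-of-the-envelope sketch; the base/witness split below is hypothetical, chosen to reproduce the ~2.8 MB figure:

Code:
# Back-of-the-envelope BIP141 weight arithmetic; example split is made up.
WEIGHT_CAP = 4_000_000

def weight(base, witness):
    total = base + witness
    return 3 * base + total             # equivalently 4*base + witness

def max_total_for_base(base):
    # Largest witness payload that still fits under the weight cap.
    witness = WEIGHT_CAP - 4 * base
    return base + witness if witness >= 0 else None

print(max_total_for_base(1_000_000))    # 1,000,000: an all-legacy block
print(max_total_for_base(400_000))      # 2,800,000: witness-heavy block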
legendary
Activity: 4410
Merit: 4766
July 01, 2016, 09:37:52 AM

What if the blocksize limit were chosen by each node individually?
 

I believe that is what bitcoin unlimited attempts to do.  

I don't know the particulars but generally I like the idea that the blocksize be decided dynamically by consensus, not by protocol rules.

bitcoin unlimited does. but the guy "RealBitcoin" is suggesting people LOWER their own block limit below any community-accepted consensus purely to alleviate that individual's personal internet speed issues, which would of course render that individual not part of the network (amongst other issues).

however, the internet speed issue can be resolved by not stupidly setting your node to connect to 100 other nodes, but instead to 1-6 nodes for instance, which even on a slow connection is adequate.
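For what it's worth, the connection count is configurable. A minimal bitcoin.conf sketch (maxconnections is a real Bitcoin Core option; the value is just the example from the post above):

Code:
# Hypothetical bitcoin.conf for a low-bandwidth node, per the post above.
# Caps total peer connections; Core makes ~8 outbound connections by default.
maxconnections=6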
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
July 01, 2016, 09:32:28 AM
I always enjoy reading your comments, jonald_fyookbool.  Please consider coming over to https://bitco.in/forum/ once in a while!

Why only once in a while? I have no objection to Jonald spending all of his time within the bigblock-altcoin development community.

Bitcoin Core developers made Hearn, Andresen, and Peter R. look like schoolboys by designing and implementing a truly innovative, secure, and decentralization-preserving way to scale Bitcoin. The FUD campaign led by XT/ClassicCoin shills has totally failed - it was proven wrong by reality. The fancy extrapolation graphics circulated as propaganda for the simpleminded were not effective in pressuring Bitcoin Core developers to implement a fast route to government control of transactions.

None of the guys involved in the XT/ClassicCoin schemes are being taken seriously anymore, with Gavin Andresen giving a prime example of his outright dangerous incompetence by lending credibility to alleged serial scammer Craig Wright.

ya.ya.yo!

AFAIK the only thing implemented so far is segwit... which only (temporarily) avoids a hardfork, and still increases bandwidth, which is the supposed boogieman of "government control".

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
July 01, 2016, 09:21:47 AM

What if the blocksize limit were chosen by each node individually?
 

I believe that is what bitcoin unlimited attempts to do.  

I don't know the particulars but generally I like the idea that the blocksize be decided dynamically by consensus, not by protocol rules.
legendary
Activity: 1806
Merit: 1024
July 01, 2016, 07:53:04 AM
I always enjoy reading your comments, jonald_fyookbool.  Please consider coming over to https://bitco.in/forum/ once in a while!

Why only once in a while? I have no objection to Jonald spending all of his time within the bigblock-altcoin development community.

Bitcoin Core developers made Hearn, Andresen, and Peter R. look like schoolboys by designing and implementing a truly innovative, secure, and decentralization-preserving way to scale Bitcoin. The FUD campaign led by XT/ClassicCoin shills has totally failed - it was proven wrong by reality. The fancy extrapolation graphics circulated as propaganda for the simpleminded were not effective in pressuring Bitcoin Core developers to implement a fast route to government control of transactions.

None of the guys involved in the XT/ClassicCoin schemes are being taken seriously anymore, with Gavin Andresen giving a prime example of his outright dangerous incompetence by lending credibility to alleged serial scammer Craig Wright.

ya.ya.yo!
hero member
Activity: 854
Merit: 1009
JAYCE DESIGNS - http://bit.ly/1tmgIwK
July 01, 2016, 07:33:13 AM
What if the blocksize limit were chosen by each node individually? Everyone would set the minimum = send and the maximum = receive.
No, that would not work at all. We'd have nodes with diverging chain heights, which would most likely cause a lot more problems. You can already limit the amount of bandwidth that you want to spend, though (e.g. 'blocksonly' mode).

So for example I would send minimum 2 mb blocks (if full), but accept maximum 6 mb blocks.
Additionally, after some time most blocks will be above the threshold that you set so it becomes ineffective. It just delays the inevitable.

Far too large a contingent would never accept Andresen, Garzik, and the rest of those cronies as the developers.
They used to be good, but now we see them 'contribute' rarely or wrongly (e.g. 'header first mining').

I see; the pesky block format is the issue, in my opinion, because it's standardized.

If the transactions were each put arbitrarily in blocks, it would be enough to just verify their hashes to see that they are original, and then there would be no need for the block format.

It would be a very big deviation from the protocol, but it would be interesting to see an altcoin try this.

So then the block sizes would vary, and each node would verify the transaction counts in them, and pass them along to each other.
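For contrast, a minimal sketch of what the existing block format buys: the header commits to every transaction and its position through a Merkle root, so checking transaction hashes in isolation can't replace it. Simplified double-SHA256 tree, ignoring Bitcoin's byte-order details:

Code:
import hashlib

def dsha256(data):
    # Bitcoin's double SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txids):
    # Pair hashes level by level, duplicating the last one on odd counts,
    # until a single root remains; that root goes in the 80-byte header.
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]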
legendary
Activity: 3430
Merit: 3080
July 01, 2016, 07:30:32 AM
I still don't understand what you're talking about, Franky
legendary
Activity: 4410
Merit: 4766
July 01, 2016, 07:26:32 AM
Far too large a contingent would never accept Andresen, Garzik, and the rest of those cronies as the developers.
They used to be good, but now we see them 'contribute' rarely or wrongly (e.g. 'header first mining').

Luke-Jr's independent code next month.. REKT campaign being prepared by you guys yet??
legendary
Activity: 2674
Merit: 2965
Terminated.
July 01, 2016, 07:25:44 AM
What if the blocksize limit were chosen by each node individually? Everyone would set the minimum = send and the maximum = receive.
No, that would not work at all. We'd have nodes with diverging chain heights, which would most likely cause a lot more problems. You can already limit the amount of bandwidth that you want to spend, though (e.g. 'blocksonly' mode).
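A minimal bitcoin.conf sketch of the 'blocksonly' mode mentioned above (a real option since Bitcoin Core 0.12; it stops relaying unconfirmed transactions, which cuts most of a node's bandwidth):

Code:
# Hypothetical bitcoin.conf; blocksonly is a real Core 0.12+ option.
# The node then fetches transactions only as parts of blocks.
blocksonly=1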

So for example I would send minimum 2 mb blocks (if full), but accept maximum 6 mb blocks.
Additionally, after some time most blocks will be above the threshold that you set so it becomes ineffective. It just delays the inevitable.

Far too large a contingent would never accept Andresen, Garzik, and the rest of those cronies as the developers.
They used to be good, but now we see them 'contribute' rarely or wrongly (e.g. 'header first mining').
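A toy illustration of the "diverging chain heights" objection, with made-up node limits: any block sized between two nodes' limits is accepted by one and rejected by the other, so they stop following the same chain:

Code:
# Toy model only; real nodes share one consensus limit for exactly this
# reason. Node names and per-node limits are made up.
node_limits = {"A": 2_000_000, "B": 6_000_000, "C": 6_000_000}

def accepts(limit, block_size):
    return block_size <= limit

block = 4_000_000   # a 4 MB block
verdict = {name: accepts(limit, block) for name, limit in node_limits.items()}
print(verdict)      # {'A': False, 'B': True, 'C': True} -> A splits off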