
Topic: Why is big blocks bad? (Read 2020 times)

brand new
Activity: 0
Merit: 0
July 07, 2017, 01:27:53 PM
#44
I run a full node at home. I don't think running a full node is for everyone, but it's not that bad. CPU usage is pretty low, disk usage is acceptable, and network traffic is not too bad. The initial syncing process takes forever, though.

We should be able to increase the base block size modestly at this time, in addition to SegWit (for various other benefits it offers), without risking much centralization of nodes. I know my (3-year-old) computer can easily handle it.

As has been pointed out in this thread, the more significant deciding power is with the miners. At this point, I'm way more concerned about the centralization of mining than I am about centralization of nodes. I would rather not have a vast majority of the hashpower located in a single country. What can be done about that?
sr. member
Activity: 490
Merit: 256
July 07, 2017, 12:23:07 PM
#43
I believe that having bigger blocks is beneficial for users. More transactions would be confirmed within a single block, and if more transactions are confirmed at a time, then lower fees can be charged. Also, since more transactions are processed per block, there is less waiting time for a transaction to confirm. That was my initial belief, but given the arguments laid out by others, I'm no longer sure my assumptions match what would really happen if a block size increase were implemented.
legendary
Activity: 2982
Merit: 4193
July 07, 2017, 09:02:14 AM
#42
What is Core's excuse to avoid increasing bitcoin block size to 2mb?
Just imagine miners mining a small rock and then suddenly having to mine a really big rock; they would be stuck there for a long time, and that would cause the chain to stop.
NO, NOT TRUE. Mining speed will not be affected by the block size, at least not significantly. The merkle root is indeed hashed by the miner to place it into the block header, but it is not hashed repeatedly; it is computed only once before the work is handed out to the miners in a pool. The tradeoff here is worth it.
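To put that in concrete terms, here is a minimal, illustrative sketch (not Bitcoin Core's actual code): the merkle root is computed once per candidate block, while the proof-of-work loop only re-hashes the fixed-size ~80-byte header, so the per-attempt mining cost does not depend on how big the block body is.

Code:
import hashlib

def dsha256(data: bytes) -> bytes:
    # Bitcoin's double SHA-256
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txids: list) -> bytes:
    # Computed ONCE per candidate block, however many transactions it holds.
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:                      # duplicate the last hash if the count is odd
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def mine(header_without_nonce: bytes, target: int, max_tries: int = 1_000_000):
    # The work loop only re-hashes the small fixed-size header, never the block body.
    for nonce in range(max_tries):
        digest = dsha256(header_without_nonce + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "little") < target:
            return nonce, digest
    return None, None

# Toy candidate block: header is version|prev|merkle_root|time|bits (76 bytes) + 4-byte nonce.
txids = [dsha256(bytes([i])) for i in range(8)]
header_prefix = b"\x00" * 4 + b"\x11" * 32 + merkle_root(txids) + b"\x00" * 8
print(mine(header_prefix, target=1 << 240))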

However, the propagation time for bigger blocks can potentially be longer, and this can cause blocks to get orphaned. It would also take nodes more time to process them. Miners themselves are generally not much affected*.

*Except insofar as they resort to SPV mining.
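To put the propagation point in rough numbers (a simplified model that ignores compact-block relay and assumes blocks arrive as a Poisson process with a 600-second mean), the extra seconds a bigger block spends in flight translate directly into orphan risk:

Code:
import math

# P(another block is found while ours is still propagating) ≈ 1 - exp(-t / 600)
for t in (2, 10, 30):                     # propagation delay in seconds
    risk = 1 - math.exp(-t / 600)
    print(f"{t:3d} s propagation -> ~{risk * 100:.1f}% orphan risk")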
sr. member
Activity: 288
Merit: 250
July 07, 2017, 08:56:38 AM
#41
What is Core's excuse to avoid increasing bitcoin block size to 2mb?
Just imagine miners mining a small rock and then suddenly having to mine a really big rock; they would be stuck there for a long time, and that would cause the chain to stop.
newbie
Activity: 22
Merit: 0
July 06, 2017, 07:27:18 AM
#40
Big blocks are not really bad in and of themselves. What's bad is the belief that "Big Blocks = Scaling". They are not. Doubling the block size and getting 2x the number of transactions while suffering a quadratic increase in worst-case computational cost can hardly be called scaling. It is just an incremental improvement that you can make to the protocol as computing and network resources become cheaper and more abundant.
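The "quadratic increase" presumably refers to legacy (pre-SegWit) signature hashing, where each input is signed over roughly the whole serialized transaction. A back-of-the-envelope sketch with illustrative numbers only (150 bytes per input is an assumption, not a protocol constant):

Code:
INPUT_SIZE = 150                               # assumed rough bytes per input (illustrative)

for n_inputs in (100, 200, 400):
    tx_size = n_inputs * INPUT_SIZE            # transaction grows linearly with inputs
    bytes_hashed = n_inputs * tx_size          # each input re-hashes ~the whole transaction
    print(f"{n_inputs:4d} inputs -> tx ~{tx_size / 1e3:.0f} kB, ~{bytes_hashed / 1e6:.1f} MB hashed")

# Doubling the number of inputs doubles the transaction size but roughly
# quadruples the total bytes hashed for signature checks.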
hv_
legendary
Activity: 2520
Merit: 1055
Clean Code and Scale
July 06, 2017, 07:09:41 AM
#39
Some people in Core think it's bad because they fear:

1) A hard fork (HF) would be needed (something Satoshi did discuss)

2) Smaller users could not afford to run the system (nodes)


So they would rather have SegWit, which, like making more room in a crowded courtroom, sends the witness / jury out to sit in the street, where they can be pruned at will (something Satoshi Nakamoto never recommended!!).


Now, with SegWit, they will allow big companies to run BS-licensed 2nd-layer scaling businesses (small users may go out of business anyway).

And finally, with UASF, they (at least Luke) are happy to risk a HF.


 Huh

Make sense ??

NOOOOOOOOOO !!!

newbie
Activity: 18
Merit: 0
July 06, 2017, 06:35:45 AM
#38
Yes, those fees are getting unbearable (especially for small payments). We have to find a solution, and quickly. I have no technical background at all, but I think increasing the block size is the only logical step forward.

It is logic like this that is very, very dangerous. You want a quick solution to an immediate problem you are having... and you think you know the solution, but you freely admit you don't understand it.


Increasing blocks will allow bitcoin to be controlled by those who are powerful enough (and we all know there are very powerful forces out there that will take control if an opportunity presents itself). It's a slippery slope that leads nowhere good.

Solutions to capacity will come in time.
sr. member
Activity: 438
Merit: 266
June 07, 2017, 07:04:20 AM
#37
No, making blocks bigger isn't bad. It has almost no cons and only pros:
- Availability of bitcoin to more people
- Cheaper transactions than we have these days
- And the big return of 10-minute transactions!
Bitcoin will die if it isn't forked. The Core devs especially will regret this sooner or later.
legendary
Activity: 2982
Merit: 4193
June 07, 2017, 05:41:08 AM
#36
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3 TB hard drives.
It takes a lot more resources than bandwidth and hard drives to run a node. For starters, you have to have a strong CPU that can verify blocks quickly. The increase in resources also means that it becomes close to impossible to run nodes on a VPS.
So that's totally null and void right there. Even doubling them to 2 MB, or having blocks that scale with how much bitcoin is being used, would make more sense. Why have a limit at all?
Since verification takes longer, more and more miners will turn to SPV mining and skip verifying blocks in order to save resources. If you don't have a limit, a block can be as large as a miner wants, and if a miner decides to mine a 100 MB block, it could take forever to verify. This could seriously harm Bitcoin.

Codswallop. I've been running a full node on my i7-4790K tower computer and CPU usage is rarely over 1%. It touches 2% when a new block is downloaded. Bandwidth is not an issue, as millions of people no longer use 56k phone lines.

No one is suggesting 100 MB blocks, so calm down and stop FUDing.
Would you check the price of your PC and the cost of running it 24/7? Not everyone is going to run a node on hardware like that; most would run it on a Raspberry Pi or a VPS. Perhaps you didn't really notice your CPU usage while synchronizing? It will only get more painful if blocks get too big.

No, I don't think we would hit 100 MB either. But would you read the question? I'm not saying that blocks would be 100 MB; I am saying that miners could potentially (though it's unlikely) mine a huge block if there weren't any block size limit, and nodes would then take a long time to verify it.

Note: I don't oppose increasing the block limit either. Calm down and read the post that I am replying to Smiley.
hero member
Activity: 770
Merit: 629
June 06, 2017, 02:09:49 AM
#35
Why not create Jihancoin with 2 mega blocks?

This will happen when bitcoin's brand name has eroded away. We're not quite there yet, but in a not-too-distant future, when bitcoin has become just one crypto amongst other cryptos, yes. For the moment, the bitcoin brand name is still what keeps unity in bitcoin, because whoever changes it could lose the brand name, and hence lose just about everything (bitcoin is essentially a brand name, not much more).

When there are 5 or 10 cryptos with a market cap and trade volume comparable to bitcoin's (even if individually still smaller), bitcoin will be able to split and solve its problems.
member
Activity: 108
Merit: 10
June 06, 2017, 12:40:15 AM
#34
Why not create Jihancoin with 2 mega blocks?
sr. member
Activity: 484
Merit: 250
June 06, 2017, 12:06:59 AM
#33
In simple words, blocks contain lots of unconfirmed transactions, and if each of them carried its full transfer history, imagine how much data would be transferred just because you made a transaction. So they are now trying to remove that extra data, put it in the segregated witness (SegWit), and link it to the blocks.
I do not think big blocks are bad; large blocks will make transactions faster and things easier for the user. However, there is a risk: large blocks cannot guarantee security and are more easily attacked, which is what no one wants. So if you want larger blocks, Bitcoin needs decentralization.
hero member
Activity: 490
Merit: 520
June 05, 2017, 11:44:47 PM
#32
What is Core's excuse to avoid increasing bitcoin block size to 2mb?
So it requires more storage space for users over time (essentially 2x the growth in blockchain data from this point compared with 1 MB blocks), along with making it more of a challenge to maintain nodes, since the barriers to entry are slightly higher. That's what I've heard so far, and it sounds fairly accurate.
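The "2x the growth" point is easy to check with rough arithmetic (assuming consistently full blocks and one block per ~10 minutes):

Code:
BLOCKS_PER_YEAR = 6 * 24 * 365                 # one block per ~10 minutes

for block_mb in (1, 2):
    growth_gb = BLOCKS_PER_YEAR * block_mb / 1024
    print(f"{block_mb} MB blocks -> ~{growth_gb:.0f} GB of new chain data per year")

# ~51 GB/year at 1 MB versus ~103 GB/year at 2 MB: twice the growth rate.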
I don't have a very strong opinion yet, but it does seem like larger block sizes are the better plan right now.
hero member
Activity: 770
Merit: 629
June 05, 2017, 11:38:35 PM
#31
Increasing the blocksize to increase the transaction capacity of the network will also increase the storage and bandwidth requirements for running a "full node" of bitcoin.

Core believes that this would be bad for bitcoin because it would mean fewer people are "verifying" all the transactions, causing a bit more centralization in terms of trusting fewer miners and full nodes.

The big blockers believe that the network will survive and remain decentralized, and giving up a bit in terms of storage and bandwidth costs is worth it for faster/more transactions with lower fees.

The sneaky thing in this argument is that it is tacitly assumed that a full node has something to do with "decentralization". It doesn't. One should distinguish "decentralization" (which is a notion of *decision power*) from "distribution" (which has to do with the spread-out architecture of a system). Facebook is entirely centralized (the power is with one entity, Mark Zuckerberg) but is very much distributed throughout the world.

To measure how much a full node contributes to "decentralization", one has to ask what *decision power* it has in bitcoin. What are the decisions in bitcoin? They are the block chain. All decisions in bitcoin are in the block chain: the block chain contains all the data on which bitcoin's existence is based. There is nothing of any significance in bitcoin that is not recorded in the block chain.

So, how much decision power does a full node actually have?

The block chain of bitcoin is supposed to be built according to a certain set of rules: the protocol. Full node software checks the entire block chain to see whether this protocol has been followed or not. The protocol rules are built into the software of the full node. So at first sight, it could seem that full nodes can signal that the block chain being handed to them was not built according to the protocol. That's true: a node can SIGNAL it, to its owner.

Essentially, suppose that the block chain, from its beginning, is built according to the protocol in the full node's software, and that at a certain point, say block 515,000, a block does NOT follow the rules. What does the full node do? It refuses block 515,000, and it waits until it finds a "correct" successor to block 514,999, the last "good" block.
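A toy sketch of exactly that behaviour (hypothetical helper names, nothing like the real client): the node replays the chain against its rules and simply halts at the first block it rejects.

Code:
def follows_protocol(block: dict) -> bool:
    # stand-in for the real consensus checks (size, proof of work, signatures, ...)
    return block.get("valid", True)

def sync(chain: list) -> list:
    accepted = []
    for height, block in enumerate(chain):
        if not follows_protocol(block):
            print(f"block {height} breaks my rules; waiting at height {height - 1} forever")
            break                               # the node can only refuse and stop...
        accepted.append(block)
    return accepted                             # ...it cannot force miners onto another chain

# Example: the node halts at the offending block while the miners' chain carries on without it.
chain = [{"valid": True}] * 5 + [{"valid": False}] + [{"valid": True}] * 3
print(len(sync(chain)), "blocks accepted")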

Now, who is MAKING the block chain ?  The only entities MAKING the block chain are the miners.  In order to make a block chain, one needs to prove HUGE amounts of work, so only miners can possibly propose blocks which contain this huge amount of work.  Nobody else can.  Miners validate previous blocks, by building their blocks on top of the block they validate.   That is the core of the consensus decision mechanism in bitcoin: a block is validated by other miners, because they build their blocks on top of it.
Given that only miners can validate blocks, and that, if they do, they build blocks on top of the valid blocks, it means that as long as they have consensus amongst them, there is only one block chain out there, and nobody can make another one.

So if your "validating node" somehow doesn't like block 515 000, the only unique block 515 000 out there on which miners built 515 001 and 515 002 etc..., then your validating node stops forever at 514 999, and that's it.  Now you know.  And you knowing is the only thing that happens.  Your node stopping at 514 999, or you switching off the computer on which it runs, has, from the outside, exactly the same effect.  You have no decision power that goes beyond "switching off your computer".

This is why full nodes have no decision power over the decisions in bitcoin, that is to say, over the construction of the block chain; and nodes that have no decision power do not contribute to decentralization.

Bitcoin was made that way: the decision power lies in the consensus decision, which is made with proof of work (miners), not with nodes (because nodes can be Sybil attacked). The decision is made by building on top of a block which the provider of proof of work deems valid, and that miner is in turn validated by the next block built on top of his, and so on.

Full nodes can essentially do two things:

1) copy the unique block chain that is out there and serve as a proxy server for it
2) disagree and stop, waiting forever for something that will not come

This is why, even in 2010, Satoshi already said that "the only people needing to run full nodes are those wanting to make new coins" (mining). The decentralization of bitcoin resides with those who make the block chain. There are about 20 of them. That's the real "decentralization" of bitcoin, not whether some guy in central Africa has a full node running in his basement.

legendary
Activity: 1806
Merit: 1090
Learning the troll avoidance button :)
June 05, 2017, 10:53:22 PM
#30
What is Core's excuse to avoid increasing bitcoin block size to 2mb?

There is still debate over whether 1 MB is superior, but really, at this point we already have the code ready for a fork in a few months.
https://bitcointalk.org/index.php?topic=1928093.0;topicseen
legendary
Activity: 2912
Merit: 6403
Blackjack.fun
June 05, 2017, 05:00:56 PM
#29

That can hardly have been said better; I consider myself at least upper middle class in my region, where the minimum wage is below USD 200. The largest-capacity HDD I have ever owned is 750 MB, and I never experienced true high-speed internet until recently. Full nodes are already concentrated among the wealthy. Let them not be put further out of reach.

Probably 750 GB, not MB.
I remember being in high school when we got the first 1 GB drives, and that was a long time ago.

Also, the internet connection is not really a factor when running a node, nor is the CPU.
It's the hard drive that really drives me nuts when the node syncs after being offline for a while.

legendary
Activity: 2856
Merit: 3548
Join the world-leading crypto sportsbook NOW!
June 04, 2017, 05:05:28 PM
#28
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3 TB hard drives. So that's totally null and void right there. Even doubling them to 2 MB, or having blocks that scale with how much bitcoin is being used, would make more sense. Why have a limit at all?

You are obviously living in a first-world country. I travel between countries and I experience both worlds. I can tell you from experience that only a small percentage of people in this world would agree with your statement. Some of these countries have no infrastructure at all, so they cannot host nodes as it is now. Decentralization is one of the most important aspects of Bitcoin, and it is in our best interest to prevent the centralization of Bitcoin. Every block size increase will decrease the ability of smaller nodes to survive, and in the end you will only have big centralized data centres that can run nodes. {not the ideal situation for the sustainability of the network}  Angry

That can hardly have been said better; I consider myself at least upper middle class in my region, where the minimum wage is below USD 200. The largest-capacity HDD I have ever owned is 750 MB, and I never experienced true high-speed internet until recently. Full nodes are already concentrated among the wealthy. Let them not be put further out of reach.
sr. member
Activity: 284
Merit: 250
June 04, 2017, 04:49:04 PM
#27
I'm an ordinary bitcoin user and it's hard for me to understand these technical nuances. But if increasing the block size will improve how coin transfers work, then I agree with increasing it.
legendary
Activity: 924
Merit: 1000
June 04, 2017, 04:34:59 PM
#26
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3 TB hard drives. So that's totally null and void right there. Even doubling them to 2 MB, or having blocks that scale with how much bitcoin is being used, would make more sense. Why have a limit at all?

All these stupid excuses are given by the Core developers so that they can carry on with 1 MB blocks. Continuing with the small block size means an increase in the number of unconfirmed transactions, delays in confirmation, and most importantly an increase in transaction fees.
legendary
Activity: 1806
Merit: 1090
Learning the troll avoidance button :)
June 04, 2017, 04:22:30 PM
#25
What is Core's excuse to avoid increasing bitcoin block size to 2mb?

A fork; but in reality there is no real excuse to avoid 2 MB other than the impact it might have on the hard-drive space needed to store a node's copy of the chain and on transaction download/upload speeds, which may affect distribution. It is needed, though, in order to transact faster and handle more transactions. This thread does get repetitive, but only a few more months to go until this circle of questions ends.