
Topic: Why is big blocks bad? - page 2. (Read 2063 times)

legendary
Activity: 2184
Merit: 1024
June 04, 2017, 03:20:50 PM
#24
Jesus this debate is so fuckin boring now. Do you guys even listen to yourselves?
sr. member
Activity: 756
Merit: 253
June 04, 2017, 03:10:57 PM
#23
In simple words, blocks contain lots of unconfirmed transactions, and if each of them carries its full transfer history, imagine how much data gets transferred just because you made a transaction. So they are now trying to strip that extra data out with SegWit and link it to the blocks separately.

Well, I hope SegWit is the solution, though there's been a lot of negative press about it and talk of the network getting attacked if SegWit is introduced.
legendary
Activity: 1274
Merit: 1004
June 04, 2017, 02:32:48 PM
#22
In simple words, blocks contain lots of unconfirmed transactions, and if each of them carries its full transfer history, imagine how much data gets transferred just because you made a transaction. So they are now trying to strip that extra data out with SegWit and link it to the blocks separately.
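For anyone wanting the mechanics behind "strip the signatures out": SegWit replaces the old 1MB size check with a "block weight" rule (BIP 141) that discounts witness (signature) data 4:1. Here is a rough sketch of the accounting, not actual consensus code; the byte figures are illustrative:

```python
# Rough sketch of SegWit's block "weight" rule (BIP 141), not actual
# consensus code. Witness (signature) data is discounted 4:1, which is
# how moving it out of the base block frees up capacity.

MAX_BLOCK_WEIGHT = 4_000_000  # SegWit consensus cap

def block_weight(base_bytes: int, total_bytes: int) -> int:
    # weight = 3 * base size + total size (base + witness)
    return 3 * base_bytes + total_bytes

# Legacy-style 1MB block (no witness data): exactly at the cap.
print(block_weight(1_000_000, 1_000_000))  # 4000000

# SegWit block: 700kB of base data plus 1MB of witness data fits
# under the cap, even though the total (1.7MB) exceeds the old 1MB limit.
print(block_weight(700_000, 1_700_000) <= MAX_BLOCK_WEIGHT)  # True
```

So the effective capacity rises without a hard-fork change to the 1MB base rule, which is exactly the "extra data is moved, blocks stay linked" idea above.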
sr. member
Activity: 336
Merit: 252
June 04, 2017, 02:07:28 PM
#21
Core/Blockstream/AXA want to constrict blocks to 1MB (or very low) forever to profit off of payment hubs. There is absolutely nothing wrong with large blocks, and that was Satoshi's intended way of scaling. Bitcoin would be fine if it weren't for a bought-out Core. Bitcoiners must oppose Blockstream/Core/SegWit/UASF and fork away to big blocks soon!
I totally agree. I came back to the Bitcoin world a few weeks ago after years away, and I seriously think these high fees are not sustainable for the use most of us make of Bitcoin.
Yes, those fees are getting unbearable (especially for small payments); we have to find a solution, and quickly. I have no technical background at all, but I think increasing the block size is the only logical step forward.
legendary
Activity: 2912
Merit: 6403
June 04, 2017, 02:05:59 PM
#20
Jumping the gun. Massive assumptions that we get 300m users.

I do have software that monitors the CPU, HDD, etc., and I see no problems. I've done my research; have you? I do video editing, browsing and Bitcoin at the same time. Video editing, as anyone knows, is CPU-intensive and uses the HDD... no problems. Bitcoin isn't affected at all. Buy a proper computer, not some ten-year-old out-of-date crap or a laptop.

If you do video editing, you probably don't use red that much.
And stop doing it here as well.

Not only is it rude to post in red, it also reads like a kid wanting attention.

If you really want to prove something, download a fresh copy of the client and record a video of it synchronizing, with bitcoinwisdom in the background.
Let's see how long the sync takes.

Just increasing the blocks will only solve the problem temporarily... and that window will get smaller each time.
We need another approach.



full member
Activity: 210
Merit: 100
June 04, 2017, 01:55:50 PM
#19
Core/Blockstream/AXA want to constrict blocks to 1MB (or very low) forever to profit off of payment hubs. There is absolutely nothing wrong with large blocks, and that was Satoshi's intended way of scaling. Bitcoin would be fine if it weren't for a bought-out Core. Bitcoiners must oppose Blockstream/Core/SegWit/UASF and fork away to big blocks soon!
I totally agree. I came back to the Bitcoin world a few weeks ago after years away, and I seriously think these high fees are not sustainable for the use most of us make of Bitcoin.
legendary
Activity: 2744
Merit: 1174
June 04, 2017, 01:51:04 PM
#18
Increasing the blocksize to increase the transaction capacity of the network will also increase the storage and bandwidth requirements for running a "full node" of bitcoin.

Core believes that this would be bad for bitcoin because it would mean fewer people are "verifying" all the transactions, causing a bit more centralization in terms of trusting fewer miners and full nodes.

The big blockers believe that the network will survive and remain decentralized, and giving up a bit in terms of storage and bandwidth costs is worth it for faster/more transactions with lower fees.
This is already a problem at the current size. I'm not running a full node because it takes too much space and too much time to keep updated. Maybe if I were being paid to do it I'd consider it, but giving up a significant portion of my disk just doesn't seem to be worth the trouble.
Also, for most people it doesn't really matter whether they have to download 60 or 120GB of data, because that 60 will already discourage most of them.
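The storage side of this trade-off is easy to quantify with a back-of-envelope sketch (assuming every block is full and exactly one block every ten minutes, both simplifications):

```python
# Back-of-envelope blockchain growth per year at a given block size.
# Assumes every block is full and exactly one block per 10 minutes.

BLOCKS_PER_YEAR = 365 * 24 * 6  # 52,560 blocks

def growth_gb_per_year(block_size_mb: float) -> float:
    return block_size_mb * BLOCKS_PER_YEAR / 1024  # MB -> GB

for size_mb in (1, 2, 8):
    print(f"{size_mb} MB blocks -> ~{growth_gb_per_year(size_mb):.0f} GB/year")
```

Roughly 50GB a year of growth at 1MB blocks, and doubling the block size doubles that, which is why the bandwidth/storage argument scales directly with any cap increase.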
legendary
Activity: 1162
Merit: 1000
June 04, 2017, 01:44:39 PM
#17
As already said, big blocks would require more bandwidth and computing resources, and the blockchain would grow twice as fast (more time to sync and more storage needed).

Another problem is that changing the block size would make communication between nodes impossible until everyone updates, and it wouldn't take long, as Bitcoin gets more popular, until the blocks are full again. Is the trouble worth it for a temporary fix?
sr. member
Activity: 913
Merit: 252
June 04, 2017, 01:27:52 PM
#16
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives, so that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?

All these stupid excuses are given by the miners so that they can continue with 1MB blocks. Continuing with the small block size means an increase in the number of unconfirmed transactions, delays in confirmation, and, most importantly, an increase in transaction fees.
legendary
Activity: 924
Merit: 1000
June 04, 2017, 01:21:49 PM
#15
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives.
It takes a lot more resources than bandwidth and hard drives to run a node. For starters, you have to have a strong CPU that can verify blocks quickly. The increased resource usage also means that it is close to impossible to run nodes on a VPS.
So that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?
Since it takes longer to verify, more and more miners will move to SPV mining and skip verifying blocks to save resources. If you don't have a limit, the block size can be as large as you want, and if a miner decides to mine a 100MB block, it could take forever to verify. That could seriously harm Bitcoin.

Codswallop. I've been running a full node on my i7-4790K tower and CPU usage is rarely over 1%. It touches 2% when a new block is downloaded. Bandwidth is not an issue, as millions no longer use 56k phone lines.

No one is suggesting 100MB blocks, so calm down and stop FUDing.

Yeah, increase the blocks to 2MB and we could have 600k transactions a day... in 2 months we will go above that.
Then we will go for 4MB.. then for 16..

Let's assume we have 300 million Bitcoin users that each make one transaction a day.
Enjoy 1GB blocks.


Also, check your HDD/SSD: that's what gets hammered while downloading and verifying blocks, not the CPU.

Jumping the gun. Massive assumptions that we get 300m users.

I do have software that monitors the CPU, HDD, etc., and I see no problems. I've done my research; have you? I do video editing, browsing and Bitcoin at the same time. Video editing, as anyone knows, is CPU-intensive and uses the HDD... no problems. Bitcoin isn't affected at all. Buy a proper computer, not some ten-year-old out-of-date crap or a laptop.
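For what it's worth, the "300 million users → 1GB blocks" figure in this exchange is roughly the right order of magnitude. A quick sanity check, assuming one transaction per user per day and an illustrative ~350 bytes per transaction (both assumptions, not measured values):

```python
# Order-of-magnitude check of the "300M users -> ~1GB blocks" claim.
# One tx per user per day and ~350 bytes/tx are illustrative assumptions.

USERS = 300_000_000
BLOCKS_PER_DAY = 24 * 6           # one block per ~10 minutes
AVG_TX_BYTES = 350

tx_per_block = USERS / BLOCKS_PER_DAY          # ~2.08 million tx per block
block_mb = tx_per_block * AVG_TX_BYTES / 1e6   # ~730 MB per block
print(f"~{block_mb:.0f} MB per block")
```

At ~500 bytes per transaction the same arithmetic lands right at ~1GB, so the claim stands or falls on the user-count assumption, not the maths.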
hero member
Activity: 574
Merit: 500
June 04, 2017, 12:43:03 PM
#14
Core/Blockstream/AXA want to constrict blocks to 1MB (or very low) forever to profit off of payment hubs. There is absolutely nothing wrong with large blocks, and that was Satoshi's intended way of scaling. Bitcoin would be fine if it weren't for a bought-out Core. Bitcoiners must oppose Blockstream/Core/SegWit/UASF and fork away to big blocks soon!
sr. member
Activity: 276
Merit: 254
June 04, 2017, 12:24:58 PM
#13
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives, so that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?

You are obviously living in a first-world country. I travel between countries and I experience both worlds. I can tell you from experience that only a small percentage of people in this world would agree with your statement. Some of these countries have no infrastructure at all, so they cannot host nodes as it is now. Decentralization is one of the most important aspects of Bitcoin, and it is in our best interest to prevent the centralization of Bitcoin. Every block size increase will decrease the ability of smaller nodes to survive, and in the end you will only have big centralized data centres that can run nodes (not the ideal situation for the sustainability of the network).


Decentralization is about both the cost of running a node AND transaction fees. If you can't afford transaction fees, there is no point in running a node either. So there should definitely be a balance between node cost and transaction fees to keep decentralization, and a small block size increase to 2MB might satisfy that balance.
legendary
Activity: 4410
Merit: 4766
June 04, 2017, 12:02:07 PM
#12

sorry but SPV is not network security... it's only personal validation of what you personally see..

the network security is much like torrents, whereby you holding the blockchain makes you a crucial 'seed' of the network.
with everyone else holding the same seed data, the DNA of a particular seed that is replicated becomes the most widely accepted strand there is.

..
SPV is not about the network. SPV doesn't make other peers follow you or change what they hold based on what you have validated.. SPV is just about what you think is valid

..
out of, say, 5 million people.. yeah, we could have 4.9 million that don't need or want to know how Bitcoin works, and they can be fine using SPV clients

but we do need a nice healthy list of decentralised and diversely located/coded full nodes that do hold the full data.
..
now here is the thing.

we are not in 2009, where the min requirements were a Raspberry Pi 1 and 0.5Mb ADSL/3G internet...
we are now in 2017, where the min requirements are a Raspberry Pi 3 and 5Mb ADSL/4G internet... with the average moving to fibre/5G soon

so we can.. yes we CAN cope with a lot more than before

some nutters scream about "gigabytes by midnight".. (they deserve a wet fish slap across the face)
the reality is that 8MB is safe today, and 4MB is extra safe... even Core admit as much.

yet Core want to strangle native (legacy) utility to 1MB, or a fake gestured 2MB... which is their delaying and posturing tactic for trying to assume control of Bitcoin
it's also just kicking the can down the road: not letting the block size grow naturally (within node capabilities) and instead requiring the devs to spoon-feed out the limits
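The "min requirements" point above can be put in numbers with a rough initial-sync sketch (download time only; it ignores block verification, which in practice often dominates, and the chain size and line speeds are illustrative):

```python
# Rough initial-sync download time: chain size vs. line speed.
# Ignores block verification time, which often dominates in practice.

def sync_days(chain_gb: float, mbit_per_s: float) -> float:
    megabits = chain_gb * 8 * 1000          # GB -> megabits (decimal)
    return megabits / mbit_per_s / 86_400   # seconds -> days

# An illustrative ~120GB chain (2017-ish) over the links mentioned above:
print(f"0.5 Mbit/s ADSL (2009-era): {sync_days(120, 0.5):.1f} days")
print(f"5 Mbit/s ADSL/4G:          {sync_days(120, 5):.1f} days")
print(f"50 Mbit/s fibre-ish:       {sync_days(120, 50) * 24:.1f} hours")
```

The point holds either way: the same chain that takes weeks on a 2009-era line takes hours on fibre, so "what nodes can cope with" is a moving target.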
Ucy
sr. member
Activity: 2674
Merit: 403
June 04, 2017, 11:45:44 AM
#11
Now I get it.. so the delays and arguments were due to fear of centralization.
The centralization point is indeed a very valid argument... If increasing the block size to 2MB will TRULY lead to centralization, please ruthlessly avoid it. Cryptocurrency wasn't meant to be centralized; otherwise it would not be regarded as crypto. I would rather see Bitcoin split or dead than see that core value of cryptocurrency altered. Sticking with the best values possible should be maintained... they must not be corrupted in any way. These values are what stakeholders will always fall back on in times of crisis. They are what protect cryptocurrency from harm, chaos and destruction.

By the way, aren't there alternatives?? Great alternatives that do not alter the core values should be embraced by all.
legendary
Activity: 2912
Merit: 6403
June 04, 2017, 10:16:31 AM
#10
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives.
It takes a lot more resources than bandwidth and hard drives to run a node. For starters, you have to have a strong CPU that can verify blocks quickly. The increased resource usage also means that it is close to impossible to run nodes on a VPS.
So that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?
Since it takes longer to verify, more and more miners will move to SPV mining and skip verifying blocks to save resources. If you don't have a limit, the block size can be as large as you want, and if a miner decides to mine a 100MB block, it could take forever to verify. That could seriously harm Bitcoin.

Codswallop. I've been running a full node on my i7-4790K tower and CPU usage is rarely over 1%. It touches 2% when a new block is downloaded. Bandwidth is not an issue, as millions no longer use 56k phone lines.

No one is suggesting 100MB blocks, so calm down and stop FUDing.

Yeah, increase the blocks to 2MB and we could have 600k transactions a day... in 2 months we will go above that.
Then we will go for 4MB.. then for 16..

Let's assume we have 300 million Bitcoin users that each make one transaction a day.
Enjoy 1GB blocks.

Also, check your HDD/SSD: that's what gets hammered while downloading and verifying blocks, not the CPU.
legendary
Activity: 3542
Merit: 1352
June 04, 2017, 10:09:39 AM
#9
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives, so that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?

Ah no, not every country in this world has affordable internet plans with unlimited bandwidth, let alone the luxury of that kind of storage and a powerful CPU capable of verifying transactions quickly; so, "omg", your reasoning is invalid. We also need to take resources into account and not just jump into the waters of a bottomless lake without knowing whether we have the proper tools and equipment to dive in it.
legendary
Activity: 924
Merit: 1000
June 04, 2017, 10:02:19 AM
#8
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives.
It takes a lot more resources than bandwidth and hard drives to run a node. For starters, you have to have a strong CPU that can verify blocks quickly. The increased resource usage also means that it is close to impossible to run nodes on a VPS.
So that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?
Since it takes longer to verify, more and more miners will move to SPV mining and skip verifying blocks to save resources. If you don't have a limit, the block size can be as large as you want, and if a miner decides to mine a 100MB block, it could take forever to verify. That could seriously harm Bitcoin.

Codswallop. I've been running a full node on my i7-4790K tower and CPU usage is rarely over 1%. It touches 2% when a new block is downloaded. Bandwidth is not an issue, as millions no longer use 56k phone lines.

No one is suggesting 100MB blocks, so calm down and stop FUDing.
legendary
Activity: 924
Merit: 1000
June 04, 2017, 09:57:26 AM
#7
What is Core's excuse for avoiding an increase of the Bitcoin block size to 2MB?

To force the fees to go up.
To force users to use LN.
To siphon fees away from miners and line their own pockets.
Massive egos and pride among the developers mean they won't admit they are wrong.

legendary
Activity: 1904
Merit: 1074
June 04, 2017, 08:45:08 AM
#6
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives, so that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?

You are obviously living in a first-world country. I travel between countries and I experience both worlds. I can tell you from experience that only a small percentage of people in this world would agree with your statement. Some of these countries have no infrastructure at all, so they cannot host nodes as it is now. Decentralization is one of the most important aspects of Bitcoin, and it is in our best interest to prevent the centralization of Bitcoin. Every block size increase will decrease the ability of smaller nodes to survive, and in the end you will only have big centralized data centres that can run nodes (not the ideal situation for the sustainability of the network).
legendary
Activity: 3038
Merit: 4418
June 04, 2017, 08:12:15 AM
#5
Omg, that's such a stupid reason. So many people have unlimited bandwidth and 1-3TB hard drives.
It takes a lot more resources than bandwidth and hard drives to run a node. For starters, you have to have a strong CPU that can verify blocks quickly. The increased resource usage also means that it is close to impossible to run nodes on a VPS.
So that's totally null and void right there. Even doubling them to 2MB, or having blocks that scale with how much Bitcoin is being used, would make more sense. Why have a limit at all?
Since it takes longer to verify, more and more miners will move to SPV mining and skip verifying blocks to save resources. If you don't have a limit, the block size can be as large as you want, and if a miner decides to mine a 100MB block, it could take forever to verify. That could seriously harm Bitcoin.