Topic: Bitcoin Scalability? - page 2

newbie
Activity: 16
Merit: 0
November 04, 2017, 03:17:49 PM
#9
The one thing you know is that full blocks bring more profit for miners than partially filled blocks. On the other hand, the Bitcoin developer team wants to keep the current block size rather than move to bigger blocks, while part of the Bitcoin community wants bigger blocks. In the next fork, the altcoin B2X brings bigger blocks while the SegWit functionality remains.
Yes, absolutely correct, but why does the dev team not want to increase the block size to 2 MB, and why are they opposing it?
I can understand that it may not be a permanent solution, but it would solve the current situation.
And after some time we could move to a 3 MB size, and so on.
member
Activity: 98
Merit: 26
November 04, 2017, 03:07:09 PM
#8
Quote
Surely if the miners can choose how many transactions to put into their block, there could be some kind of rule that says you must put at least 10,000 transactions in each block.

This is not needed because Tx fees guarantee that it is in the interests of miners to put as many transactions as possible into a block.
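
To make that incentive concrete, here is a minimal sketch of fee-driven transaction selection (all names and numbers are made up for illustration; real mining software uses ancestor-feerate package selection rather than this plain greedy pass):

Code:
# Toy sketch of fee-driven block filling (simplified; real miners use
# ancestor-feerate package selection rather than this plain greedy pass).
MAX_BLOCK_WEIGHT = 4_000_000

def select_transactions(mempool, max_weight=MAX_BLOCK_WEIGHT):
    """Pick the highest fee-per-weight transactions until the block is full."""
    chosen, used = [], 0
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["weight"], reverse=True):
        if used + tx["weight"] <= max_weight:
            chosen.append(tx)
            used += tx["weight"]
    return chosen

mempool = [{"fee": 5000, "weight": 800},
           {"fee": 1200, "weight": 600},
           {"fee": 300, "weight": 400}]
block = select_transactions(mempool)
print(sum(tx["fee"] for tx in block), "satoshis in fees")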

Quote
Is there any reason this wouldn't work?

What you're really asking is why there is a limit on block size. There are two reasons.

First, the Bitcoin network needs to be able to guarantee that consensus is reached within roughly 10 minutes. This is harder than it sounds, because Bitcoin is a peer-to-peer relay network, which is very slow compared to the centralized, YouTube-style networking we are used to, where huge caches of data sit on a physically nearby server ready to be served to anyone in the surrounding region at very low latency and virtually unlimited bandwidth. Relay networks, by comparison, have unpredictable latency, and network-wide throughput is not guaranteed. In principle, the Bitcoin network could probably run a dozen times faster at 90+% reliability. But while 90+% reliability is good enough for serving video, where the worst case is that a few frames get dropped, it does not work for Bitcoin, because a "dropped block" really means a network-wide fork. So the 10-minute rule creates a great deal of padding to ensure that the blockchain reaches consensus on every single block.

There are still occasional "orphan blocks", where miners generate two different, valid blocks almost simultaneously. When this happens, one block or the other will generally win out about 10 minutes later when the next block is generated. The network can tolerate multiple orphan blocks, and the probability of an orphaned branch surviving goes asymptotically to zero with each additional block. The reason we can be sure this asymptotic behavior holds is that the network leaves plenty of time to settle on consensus with each block added to the chain. If blocks were large enough, nodes could not process them in time for that guarantee to hold. If it took a typical full node 11 minutes to process each block as it is mined (including network bandwidth and latency), the network would "fall behind" the miners and could no longer reach consensus. And even if the typical full node can process a block in less than 10 minutes, orphan blocks still risk creating a situation where the network can no longer agree on which proof-of-work chain is the true chain.
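
To put a rough number on the propagation-versus-orphan trade-off, here is a back-of-the-envelope sketch. It assumes blocks arrive as a Poisson process with a 600-second mean interval; the model and timings are illustrative, not taken from the post above.

Code:
import math

AVG_BLOCK_INTERVAL = 600.0  # mean seconds between blocks (~10 minutes)

def orphan_risk(propagation_seconds):
    """Approximate chance that a competing block is found somewhere on the
    network while this block is still propagating (Poisson arrivals)."""
    return 1.0 - math.exp(-propagation_seconds / AVG_BLOCK_INTERVAL)

# Bigger blocks take longer to relay and verify, which raises the risk.
for t in (10, 60, 300):
    print(f"{t:>4} s to propagate -> ~{orphan_risk(t):.1%} orphan risk")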

Second, because the Bitcoin network is a peer-to-peer network, nodes are free to leave and re-join at will. The time to re-join the network is a linear function of the size of the blockchain (since the genesis block). Right now that is about 140 GB, and I think it takes more than 24 hours to sync a full node on a typical desktop PC. The blockchain will continue to grow at a rate of about 50-100 GB per year under the current block-size limits; the sync time will grow in direct proportion to this, but at least we know how much it will grow. If the block size were unrestricted, the sync time could cross an "event horizon" where it takes longer than 24 hours to process 144 blocks (24 hours' worth of blocks), meaning no new node could ever catch up and join the network!
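
The "event horizon" point can be checked with simple arithmetic. A minimal sketch, using assumed, illustrative numbers rather than measurements:

Code:
# Can a new node ever finish initial sync? Only if it verifies
# chain data faster than the chain grows.
chain_size_gb = 140.0       # current chain size cited above
growth_gb_per_day = 0.2     # ~75 GB/year, within the 50-100 GB/year range
verify_gb_per_day = 120.0   # assumed verification speed of a desktop PC

if verify_gb_per_day <= growth_gb_per_day:
    print("Never syncs: the chain grows faster than the node can verify it.")
else:
    days = chain_size_gb / (verify_gb_per_day - growth_gb_per_day)
    print(f"Initial sync finishes in about {days:.1f} days.")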
newbie
Activity: 28
Merit: 0
November 01, 2017, 12:19:52 PM
#7
...
Myth: Bitcoin does not have the drawback of fiat money - the total number of BTC is 21 million.
The developers could change the code tomorrow to allow 1,000 million BTC in 2140
I assume that by "developers" you mean "Bitcoin Core developers", because it's important to note that a significant number of people run clients other than the reference client.

However, this is entirely false. If the Bitcoin Core client were altered in this way, people would have to choose to upgrade their client. If people still run clients that enforce a maximum of 21 million coins and they continue the chain with that maximum, there will still be a maximum of 21 million coins.
But now, we have these "open source" central bankers deciding on the money supply.
False. Central bankers decide on the money supply, whereas what Bitcoin Core developers would do (not that they have any intention of doing this) is ask people to support an increased money supply. People could still choose to support the previous money supply, and it would still exist.
https://www.reddit.com/r/Bitcoin/comments/36lzft/can_the_maximum_number_of_bitcoins_be_changed/

"Theoretically yes, if all the miners and payment providers adopted a new system which allowed for more BTC to be issued to miners in future it can but would make a hard fork.

Due to a hard fork needed in practice it is so incredibly unlikely to happen I'd say not, at least not in our lifetime. The issue is if more were to be added in would shake confidence in it being a finite resource so anyone involved in the ecosystem would lose out.

Look at how difficult it is just to increase the block size from the temporary 1Mb to 20Mb being proposed... if bitcoin is still around in 10 years it's going to be even harder to make a significant change like that.

Who knows though, it's hard enough to see 5 years into the future never mind 50."
hero member
Activity: 1792
Merit: 534
Leading Crypto Sports Betting & Casino Platform
November 01, 2017, 12:00:47 PM
#6
1) allowing powerful hardware to have a mining advantage is simply against distributed consensus.
Even if BTC's algorithm were ASIC-resistant, the vast majority of mining would still be done by people who could afford huge investments in mining equipment, particularly expensive GPUs in areas with cheap electricity.
Myth: Bitcoin does not have the drawback of fiat money - the total number of BTC is 21 million.
The developers could change the code tomorrow to allow 1,000 million BTC in 2140
I assume that by "developers" you mean "Bitcoin Core developers", because it's important to note that a significant number of people run clients other than the reference client.

However, this is entirely false. If the Bitcoin Core client were altered in this way, people would have to choose to upgrade their client. If people still run clients that enforce a maximum of 21 million coins and they continue the chain with that maximum, there will still be a maximum of 21 million coins.
But now, we have these "open source" central bankers deciding on the money supply.
False. Central bankers decide on the money supply, whereas what Bitcoin Core developers would do (not that they have any intention of doing this) is ask people to support an increased money supply. People could still choose to support the previous money supply, and it would still exist.
Surely if the miners can choose how many transactions to put into their block, there could be some kind of rule that says you must put at least 10,000 transactions in each block.
There is a maximum block size to prevent transaction volume from becoming a burden to full-node users. Miners are incentivised to reach the maximum block size with transaction fees.
newbie
Activity: 28
Merit: 0
November 01, 2017, 11:42:38 AM
#5
I think the way the scalability problem is "left" unresolved does support the conspiracy theory that AXA, etc. has bought over Bitcoin. There are many questionable things in the current protocol:
1) allowing powerful hardware to have a mining advantage is simply against distributed consensus.
There is no need to have transfer fees, incentives, etc., to motivate these mining conglomerates. If, the next day, their rigs were all destroyed, it would be better for Bitcoin; there are millions of desktops in India, Indonesia, and China that would willingly do the mining, provided the protocol is designed for one node, one vote. As of now, you cannot stop others from thinking the "open source" developers have not been bought.

Myth: Bitcoin does not have the drawback of fiat money - the total number of BTC is 21 million.
The developers could change the code tomorrow to allow 1,000 million BTC in 2140! They could then sweet-talk about how it would benefit Bitcoin, etc. It is still all politics. This myth is very good at bringing in those who have been told about the evils of fiat money. But now, we have these "open source" central bankers deciding on the money supply.

staff
Activity: 3458
Merit: 6793
Just writing some code
November 01, 2017, 10:19:50 AM
#4
What is the advantage of having smaller blocks?
Smaller blocks can propagate with less latency and bandwidth and also require fewer resources to validate. Smaller blocks allow more people to run full nodes, as the network bandwidth requirements are lower. Larger blocks will likely lead to fewer full nodes and an increased orphan rate.

All of the data gets converted to a Merkle root at a later date anyway, so there shouldn't really be a size problem, should there?
No, that does not happen. The section about that in the whitepaper only refers to on-disk storage, for which we have a much more efficient method: pruning. The network and computational requirements still exist, as full blocks still need to be uploaded and downloaded and then verified.
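
As a rough illustration of what pruning does and does not save, here is a toy sketch (invented, simplified structures; not real Bitcoin Core code). The node still validates every block in full; it only discards old raw block data afterwards:

Code:
KEEP_LAST_N_BLOCKS = 288  # roughly two days of blocks; illustrative only

class PruningNode:
    def __init__(self):
        self.utxo_set = {}       # output id -> amount; kept (needed to validate)
        self.stored_blocks = {}  # height -> raw block; pruned as the chain grows

    def accept_block(self, height, raw_block, spends, creates):
        # Full validation work happens regardless of pruning.
        for out in spends:
            if out not in self.utxo_set:
                raise ValueError("spend of unknown output; block is invalid")
            del self.utxo_set[out]
        self.utxo_set.update(creates)

        # Pruning only saves the *storage* of old blocks, nothing else.
        self.stored_blocks[height] = raw_block
        for old in list(self.stored_blocks):
            if old <= height - KEEP_LAST_N_BLOCKS:
                del self.stored_blocks[old]

node = PruningNode()
node.accept_block(1, b"...", spends=[], creates={"coinbase:0": 50})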

Besides, I thought that the block size was undetermined anyway - it's up to the miners to choose how many transactions they want to put in each block.
There is a maximum block size which full nodes enforce.

Is there something in the current protocol that is stopping miners from including, let's say, 100,000 transactions in each block?
Yes: the maximum block size, which has been redefined by SegWit in terms of block weight.
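
For reference, a short sketch of the block weight rule introduced by SegWit (BIP 141): non-witness bytes effectively count four times, which is equivalent to base size x 3 plus total size, capped at 4,000,000 weight units.

Code:
MAX_BLOCK_WEIGHT = 4_000_000  # consensus limit in weight units (BIP 141)

def block_weight(base_size_bytes, total_size_bytes):
    """Weight = base size * 3 + total size (non-witness data weighs 4x)."""
    return base_size_bytes * 3 + total_size_bytes

# A 1,000,000-byte block with no witness data sits exactly at the limit.
print(block_weight(1_000_000, 1_000_000) <= MAX_BLOCK_WEIGHT)  # True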
member
Activity: 74
Merit: 11
November 01, 2017, 08:03:36 AM
#3
The one thing you know is that full blocks bring more profit for miners than partially filled blocks. On the other hand, the Bitcoin developer team wants to keep the current block size rather than move to bigger blocks, while part of the Bitcoin community wants bigger blocks. In the next fork, the altcoin B2X brings bigger blocks while the SegWit functionality remains.

What is the advantage of having smaller blocks? All of the data gets converted to a Merkle root at a later date anyway, so there shouldn't really be a size problem, should there? Besides, I thought that the block size was undetermined anyway - it's up to the miners to choose how many transactions they want to put in each block.

Is there something in the current protocol that is stopping miners from including, let's say, 100,000 transactions in each block?
legendary
Activity: 1059
Merit: 1020
November 01, 2017, 07:08:17 AM
#2
The one thing you know is that full blocks bring more profit for miners than partially filled blocks. On the other hand, the Bitcoin developer team wants to keep the current block size rather than move to bigger blocks, while part of the Bitcoin community wants bigger blocks. In the next fork, the altcoin B2X brings bigger blocks while the SegWit functionality remains.
member
Activity: 74
Merit: 11
November 01, 2017, 05:39:51 AM
#1
I just read the white paper and was wondering why there are so many issues with scalability.

Surely if the miners can choose how many transactions to put into their block, there could be some kind of rule that says you must put at least 10,000 transactions in each block. Then there could be way more transactions per second.
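
For a rough sense of scale, here is a quick throughput sketch (the ~2,000-transactions-per-block figure for today's blocks is an assumed round number, used only for comparison):

Code:
AVG_BLOCK_INTERVAL_S = 600  # one block roughly every 10 minutes

def tx_per_second(tx_per_block):
    """Average throughput if every block carried tx_per_block transactions."""
    return tx_per_block / AVG_BLOCK_INTERVAL_S

print(f"{tx_per_second(2_000):.1f} tx/s")   # ~3.3 tx/s at roughly full blocks today
print(f"{tx_per_second(10_000):.1f} tx/s")  # ~16.7 tx/s with a 10,000-tx minimum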

Is there any reason this wouldn't work?