Those who were terrified by the prospects of "gigablocks by midnight" should look away now.
What Is the Gigablock Testnet Initiative?

A lot of people are willing to explore the concept of larger Bitcoin blocks. Now that Bitcoin Cash has somewhat proven this to be a viable strategy, there is no reason not to take things to a new level. BUIP 065, also known as the Gigablock Testnet Initiative, is quite remarkable. The goal is to create a global test network for bottleneck analysis under high levels of stress. Even with 8 MB blocks in place, technical issues could certainly surface in the future.
The Gigablock Testnet Initiative Is Pretty Intriguing
With a name such as this one, it is evident the people running this initiative should not be underestimated. The Bitcoin network is still somewhat plagued by record-high fees and unreliable confirmation times. Even with Segregated Witness activating on the Bitcoin network, it remains to be seen how well congestion can be handled in the coming months and years. The same applies to Bitcoin Cash, though, as a few recent difficulty adjustments caused blocks to be delayed by hours.
Raising the network's block size limit has been shown to reduce fees effectively. Bitcoin Cash has shown that lower fees are perfectly achievable, though reliable confirmation times are not necessarily always there. As it turns out, irregular block times can skew confirmation times considerably. In most cases, they range anywhere from 10 minutes to 4 hours, depending on the current network situation.
Establishing a global test network that can be properly stress tested is sorely needed. This is why the Gigablock Testnet Initiative is of such importance to all Bitcoin users. It will allow bottlenecks to be identified and fixes to be worked out to alleviate these concerns moving forward. None of the current Bitcoin or BCH testnets can be stress tested in such a way due to built-in restrictions and a lack of dedicated hardware.
This particular project has a few main objectives. The first is setting up the global test network, which should be able to support blocks up to 1 GB in size. It is hardly a given that the Bitcoin network would ever require 1 GB blocks to accommodate transactions, but if it is to rival credit cards it may need to. The Gigablock Testnet Initiative aims to mimic sustained Visa-level throughput of up to 3,000 transactions per second.
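To give a rough sense of how Visa-level throughput translates into block size, here is a back-of-envelope sketch. The 250-byte average transaction size and the 600-second block interval are common rule-of-thumb assumptions, not figures from the article:

```python
# Back-of-envelope estimate of block size at sustained 3,000 tx/s.
# Assumptions (not from the article): ~250 bytes per average
# transaction, 600-second target block interval.
TX_SIZE_BYTES = 250      # rough average Bitcoin transaction size
BLOCK_INTERVAL_S = 600   # target time between blocks
TPS = 3_000              # sustained Visa-level throughput cited

txs_per_block = TPS * BLOCK_INTERVAL_S
block_size_mb = txs_per_block * TX_SIZE_BYTES / 1_000_000

print(f"{txs_per_block:,} transactions per block")   # 1,800,000
print(f"~{block_size_mb:.0f} MB per block")          # ~450 MB
```

Under those assumptions, sustained 3,000 tx/s means blocks approaching half a gigabyte, which is presumably why the test network's ceiling is set at 1 GB.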
Depending on how these experiments pan out, the team can then identify potential bottlenecks and hopefully come up with proper solutions to fix them. On-chain scaling experiments will occur on a regular basis by the looks of things. All of the findings will be presented to the rest of the Bitcoin community accordingly. For now, the plan is to run this initiative for at least five years, although it could be wound down at any point if needed. Even then, available funding would keep the project running for another three months.
The Gigablock Testnet Initiative is a collaboration between Bitcoin developers, nChain, and researchers at the University of British Columbia. Although it was originally designed as a Bitcoin Unlimited project, it seems evident the team will partner with Bitcoin Cash supporters moving forward. Many people assume Bitcoin Unlimited has been “absorbed” by Bitcoin Cash, although it is certainly possible the former project is still ongoing.
https://themerkle.com/what-is-the-gigablock-testnet-initiative/
Anyone think it'll last the full five years? Also, the article neglects to mention whether anyone will be trying to run a full node throughout all this experimenting. That was pretty much the key argument from those adamant about keeping the blocksize smaller, so I hope their views are taken into consideration. Otherwise there doesn't seem to be much point. We already know that a fully centralised system could probably handle gigablocks. The real question is, can a decentralised one, with many users keeping a full copy of the ledger and needing adequate bandwidth and hardware to cope with that task? How would that hold up?
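The bandwidth side of that question can at least be ballparked. A hedged sketch, assuming a 1 GB block every 10 minutes and counting only block downloads (relay overhead, uploads to peers, and initial sync are ignored, so real requirements would be higher):

```python
# Minimum sustained bandwidth for a full node to merely download
# 1 GB blocks at a 10-minute interval. Relay overhead, serving
# blocks to peers, and initial sync are all ignored here.
BLOCK_SIZE_BYTES = 1_000_000_000  # 1 GB block
BLOCK_INTERVAL_S = 600            # 10-minute target interval

mbps = BLOCK_SIZE_BYTES * 8 / BLOCK_INTERVAL_S / 1_000_000
gb_per_day = BLOCK_SIZE_BYTES * (24 * 3600 / BLOCK_INTERVAL_S) / 1e9

print(f"~{mbps:.1f} Mbit/s sustained download")
print(f"~{gb_per_day:.0f} GB of block data per day")
```

Even this lower bound (roughly 13 Mbit/s sustained and well over a hundred gigabytes of new data per day) is beyond what many home connections and disks can comfortably absorb, which is exactly why the decentralisation question matters.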