Topic: Thoughts on Bitcoin's sustainability in the long term - page 2.

legendary
Activity: 3248
Merit: 1402
Join the world-leading crypto sportsbook NOW!
I agree that scalability isn't a solved issue, especially if we think of adoption of Bitcoin as money (I wouldn't say it's a big deal for hodlers, traders, or even occasional big purchases, since hodlers don't need many transactions, traders often use centralized platforms, and big buyers can afford to lose some money on a fee). It's rather just something that's been put on hold, not currently obvious as a big problem. Perhaps we'll see very user-friendly off-chain solutions OR get used to accepting zero-confirmation transactions up to a certain amount of money for a purchase (then the high fees can be avoided). Or maybe the future of Bitcoin is still more the future of an asset, not money.
As for a fork, I wouldn't say the risk is past us, but if Bitcoin survived the SegWit situation, I don't think it'll suffer much from future disagreements. A new coin can appear, but it won't undermine Bitcoin, wherever it's going.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
60 million people on coinbase.com and 25 million people on Binance use them as custodians because of the tx restrictions and cost of using the p2p BITCOIN NETWORK
But these services impose much larger transaction fees than the Bitcoin network itself. Binance, specifically, charges 50,000 sats for withdrawing your bitcoin to a legacy address.

consensus is code. code makes rules that define what is deemed the purpose of a tx
What about my other argument?
It should be based on the initial cost (e.g. hardware) and operational cost (e.g. internet connection and electricity) of a node, while also considering future initial/operational costs. In the past, only BIP 103 considered this perspective seriously.
Isn't it weird that the contributor of this BIP is Luke, when he was the one who proposed a 300 kB block size?

And let's remember that automatically increasing block size won't mean we will have 4 times as many transactions instantly.
Don't forget that raising the block size by a factor of 4 would make spamming 4 times cheaper. Therefore, we'd definitely have more non-currency transactions. I'm guessing it'd do more good for spammers than for actual users. See the recent Ordinals thing. You should take it for granted that the network would be clogged up 4 times more than now.
sr. member
Activity: 1572
Merit: 267
What do the developers say about that?
They have to answer :P


Don't get a shadow ban on your tail.
sr. member
Activity: 1572
Merit: 267
Admin! Are you trying to ruin my day?

"An Error Has Occurred!
You are searching too quickly. Wait 4 seconds. (There is a Google-based search engine on the search page that does not have this limit.)"

Attention hut.
legendary
Activity: 4410
Merit: 4788
hardening consensus is not a fork-creating event

if the current rules are "we don't care"
then any new rule that says "only 80 bytes on opcodeXX" is within that "we don't care" range of up to 3.99mb, and thus won't cause any disruption

all it means is that, after activation, anyone who then tries to make a bloaty tx will get that tx rejected
thus punishing the spammer alone, not the network

refining and defining opcode byte limits won't cause any block issues, because any new byte limits will sit under the current lax rules and thus won't conflict with them

the only time there would be a fork-risking event is the opposite:
if there was a rule of 80 bytes in an opcode and someone wanted a new rule of XXXXkb
legendary
Activity: 2912
Merit: 6403
Blackjack.fun
Big blocks, I agree, throw a significant part of "small players" out of the window, because the cost of infrastructure to run a node rises significantly.

significantly...
Instead of a $50 500 GB SSD, a $150 2 TB SSD that would be full in 8 years, and 16 GB of RAM you can buy for $50.
Come on, the increase would have been significant in 2009, not now!
And let's remember that automatically increasing block size won't mean we will have 4 times as many transactions instantly.
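To put rough numbers behind that 2 TB figure, here's a back-of-the-envelope sketch; it assumes every block is full at the stated size and one arrives every ~10 minutes, and the disk sizes are just illustrative:

Code:
# Rough storage-growth math (assumption: every block is full at the stated
# size and one block arrives every 10 minutes on average).
BLOCKS_PER_YEAR = 6 * 24 * 365  # ~52,560

def years_to_fill(disk_tb: float, block_mb: float, already_used_gb: float = 0) -> float:
    """Years until a disk of disk_tb terabytes fills up with full blocks."""
    growth_gb_per_year = block_mb * BLOCKS_PER_YEAR / 1000
    return (disk_tb * 1000 - already_used_gb) / growth_gb_per_year

print(years_to_fill(2.0, 4.0))        # ~9.5 years of full 4 MB blocks
print(years_to_fill(2.0, 4.0, 500))   # ~7 years if ~500 GB of chain already exists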

Now, the problem of security.
Bitcoin's security is simple: if you reward miners enough, you will have enough hashrate protection, valued at how much you pay for it. 100 exahash or 100 petahash are meaningless unless you look at how easy it would be for an attacker to get hold of that much; back in 2013 a single S19 could have attacked the network, but there wasn't one available at that time for $2,000 ;)

So at any point you need to look at how much it would cost someone to attack it, and at every single point this will be related to the reward. So the thing is that, in order to keep the same level of protection, the users would have to come up with the fees themselves. Right now we have $21,621,853.65 in income in the last 24 hours and 821,913 active addresses (I choose active addresses because it's better than looking at the number of transactions), so that's about $26 per active address to keep the same level of protection without the reward.
Would it work if the reward halves two times and the price is still at $20k? I would say no!
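A quick script to sanity-check those figures; the income and address counts are the ones quoted above, while the halving/price scenario is only an illustrative assumption:

Code:
# Back-of-the-envelope security-budget check. Income and address numbers
# come from the post above; the price and halving scenario are assumptions.
daily_income_usd = 21_621_853.65     # miner income over the last 24 hours
active_addresses = 821_913

print(daily_income_usd / active_addresses)   # ~$26 per active address today

# Two more halvings (6.25 -> 1.5625 BTC/block) with the price stuck at $20k:
btc_price_usd = 20_000
blocks_per_day = 144
subsidy_usd = (6.25 / 4) * btc_price_usd * blocks_per_day   # ~$4.5M per day
fees_needed_usd = daily_income_usd - subsidy_usd            # gap that fees must cover
print(fees_needed_usd / active_addresses)    # ~$21 of fees per active address per day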




legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
Should we raise the block size limit at some point in the future?

Yes.

If yes, how much?

It should be based on the initial cost (e.g. hardware) and operational cost (e.g. internet connection and electricity) of a node, while also considering future initial/operational costs. In the past, only BIP 103 considered this perspective seriously.
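For context, a cost-driven schedule roughly in the spirit of BIP 103, which ties block-size growth to assumed technology improvements of about 17.7% per year, could look like the sketch below. The base size, start year, and rate here are illustrative assumptions rather than the BIP's exact parameters:

Code:
# Sketch of a cost-driven block-size schedule in the spirit of BIP 103,
# which assumes node costs (bandwidth/storage) improve ~17.7% per year.
# Base size, start year and growth rate are illustrative assumptions.
def max_block_size_mb(year: int, base_mb: float = 1.0,
                      start_year: int = 2017, annual_growth: float = 0.177) -> float:
    """Block-size cap if it compounds at the assumed cost-improvement rate."""
    return base_mb * (1 + annual_growth) ** max(0, year - start_year)

for y in (2017, 2025, 2035):
    print(y, round(max_block_size_mb(y), 2), "MB")   # 1.0, ~3.7, ~18.8 MB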

I think it's fine if most users run SPV wallets, if casual users use custodial wallets, and only enthusiasts run full nodes. Yes, they won't get all the benefits of being your own bank and "don't trust, verify", but most people don't need them. Just owning and using Bitcoin in a sufficiently secure manner is enough.
Do SPV wallets have to upgrade their wallet software when the Bitcoin protocol has upgrades?

Usually not, since upgrades are usually released as soft forks.
legendary
Activity: 3472
Merit: 10611
What do the developers say about that?
They have to answer :P

But generally speaking a hard fork comes with a lot of work and risk. Literally everyone has to upgrade to the new protocol: a regular user with a full node can no longer run their old version, businesses that were accepting payments using their own setup have to upgrade, and exchanges, payment processors, mining pools, etc. all have to upgrade. Anybody who doesn't will automatically split the chain. Depending on the changes, SPV users may also be forced to upgrade.

The bigger the community gets, the harder it gets to pull off a successful hard fork with the least amount of network disruption, but I wouldn't say it is impossible. With enough time (like a year) we could make it happen cleanly enough.
legendary
Activity: 4410
Merit: 4788
But it's inevitable at some point. For example, when the cryptographic algorithm breaks, we must make it non-standard at the very least and introduce a new, resilient algorithm. Same as with a block size increase: if a need emerges that benefits every single user of the network, the network should shift to it.
there is a massive difference between softening consensus to let lame stuff in quickly vs "when a need emerges that benefits every user of the network, the network should shift"

the second part is true hard consensus: the network upgrades their nodes to be ready to verify the new need that benefits everyone, vs the soft consensus where any whim core wants is thrown in even without nodes being ready for it
the importance of network security is having mass nodes ready to verify a ruleset, not abstaining and just letting things pass without checking if they comply with a ruleset

But if blocks are big, they can get the benefits of big blocks without giving up custody, whereas if they are not technically competent, and we take the lightning path, they have to give up custody. And it already happens. See BlueWallet.
60 million people on coinbase.com and 25 million people on Binance use them as custodians because of the tx restrictions and cost of using the p2p BITCOIN NETWORK

if binance alone can cause a week of congestion and fee excess on the bitcoin network, then bitcoin has been held back at limits that should have moved years ago
14 years and we are still at <7tx/s (it's about 3.5tx/s)
this is not about leaping to 100mb blocks like someone who inspires you thinks is the only opposite option to the current situation. it's about removing the kludge of the current situation to allow lean utility of up to ~6,000 tx per block at the current average tx size, then leaning transactions up further to get to about 10,000 tx a block (rough numbers sketched after this list)
AND
making fee penalties for those that spam their value more than X times a day, rather than penalising everyone for those spammers' abuse
for now
and then moving the block size up progressively (not leaping) from 4mb to the next level
AND (as a sub-service) new (fully working/unflawed) sub/side networks for niche utility, where there is a monetary policy and security to protect users (not blaming users for flaws/loopholes)
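A rough throughput sketch for those figures; the average transaction sizes and usable byte budgets below are assumptions just to show the mechanics, so the outputs only land in the same ballpark as the numbers above:

Code:
# Rough tx-throughput math. Average tx sizes and usable byte budgets are
# illustrative assumptions; blocks arrive every ~10 minutes on average.
def tx_per_block(usable_vbytes: int, avg_tx_vbytes: int) -> int:
    return usable_vbytes // avg_tx_vbytes

def tx_per_second(usable_vbytes: int, avg_tx_vbytes: int, interval_s: int = 600) -> float:
    return tx_per_block(usable_vbytes, avg_tx_vbytes) / interval_s

# Today: roughly 1 MB of plain payment data per block at ~560 vbytes per tx
print(round(tx_per_second(1_000_000, 560), 1))   # ~3 tx/s
# If the whole 4 MB weight budget carried payments of today's size:
print(tx_per_block(4_000_000, 560))              # ~7,100 tx per block
# With leaner ~400 vbyte transactions:
print(tx_per_block(4_000_000, 400))              # 10,000 tx per block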

bitcoin is not a "permissionless system of freedom to junk it up and expense it"
I'm only saying that you don't know what's "junk", and obviously the "junk" people send isn't junk to them. If they pay the cost, they can clog the network all the same; there's nothing we can do about it. And as I've said numerous times, any rule enforced solely for this purpose would only do harm, because there are nearly infinite ways to clog the network like this; NFT users will just find another way.

you do know what's junk. bitcoin is a payment network, not a meme network.
consensus is code. code makes rules that define what is deemed the purpose of a tx
i'm not saying no to taproot, for instance. but taproot promised its purpose was lean signatures of one signature length, so the opcode for taproot can have a limit of one signature length to ensure it meets its promised purpose, which removes the edge cases of abusing taproot for junk purposes
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
Regarding the issue of backwards compatibility, it is an important factor to consider when making changes to the network. Any changes that break backwards compatibility can result in a split in the network, potentially leading to multiple versions of the cryptocurrency.
But it's inevitable at some point. For example, when the cryptographic algorithm breaks, we must make it non-standard at the very least and introduce a new, resilient algorithm. Same as with a block size increase: if a need emerges that benefits every single user of the network, the network should shift to it.

And if they can't run a full node because they are bad with technology, then they won't do it, regardless of whether the block size is big or small.
But if blocks are big, they can get the benefits of big blocks without giving up custody, whereas if they are not technically competent, and we take the lightning path, they have to give up custody. And it already happens. See BlueWallet.

In my opinion we need a hard fork sooner rather than later.
What do the developers say about that?

bitcoin is not a "permissionless system of freedom to junk it up and expense it"
I'm only saying that you don't know what's "junk", and obviously the "junk" people send isn't junk to them. If they pay the cost, they can clog the network all the same; there's nothing we can do about it. And as I've said numerous times, any rule enforced solely for this purpose would only do harm, because there are nearly infinite ways to clog the network like this; NFT users will just find another way.
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
My thoughts on Bitcoin's sustainability are a bit different, because I do not want another fork war. I think we should focus more on creating something that will generate more transactions, not something that will remove transactions from the blockchain or reduce them.

Take an off-chain solution like the Lightning Network as an example.... it is reducing on-chain transactions ...and we know the block reward is supposed to be replaced by miner fees, so if we have fewer transactions in the future and almost no block reward... who will be mining?

That for me is what sustainability is about.... finding solutions that will run on-chain and that will generate miner fees in the long run. ;)
legendary
Activity: 4410
Merit: 4788
you ask about bitcoin sustainability, so you must be seeking an understanding of what bitcoin is, to understand what needs to be sustained

the answer: it's a payment network that needs to continue to be a payment network to be sustainable. otherwise it's just an expensive, congested junk network of cat memes
...you thinking bitcoin fails if it's not allowed to let in cat memes is where you're hung up, missing the whole point

it's become soft in recent years, which has allowed cat memes.. that is the honest answer. not that it fails if we prevent cat memes

cat memes were not on the network before certain softness was allowed years ago, and even more so recently

there are many ways to stop it

but first you need to understand bitcoin's true purpose and how to sustain it

imagine this:
you go to a store and buy a cordless drill with a visa card. you tap to pay and look for the approval message, thinking it should appear in 2 seconds.. but instead you read
"we are not just processing your payment, we are instead transferring the latest marvel movie in HD from your card to your bank account, please wait for completion before we care about processing your payment for the cordless drill".. "please pay an extra fee next time to avoid this delay"
and then you get elevator music playing out of the card reader

is that sustainable?
visa suddenly saying it's now a movie streaming service and not caring about being a payment service.. nope, not sustainable. no one would want to use visa if it played that game, and visa should not change into something it was not first designed for
..

bitcoin is not a "permissionless system of freedom to junk it up and expense it"
we know where you might have got that idea from. or more specifically, who.

bitcoin is a consensus payment system
something that the person who inspires you does not understand or want bitcoin to remain as/return to being

consensus
consent of the masses

consent is different from permission
as no single person is giving permission (signing off on your payment en route).
but the masses are (were strongly, now weakly) consenting to rules which strengthen what is deemed an acceptable purpose, to ensure best practices
so that people can do things lean and purposeful for bitcoin's real purpose without needing someone else to sign off on a payment you wish to send somewhere***

. as for subnetworks
there will be new subnetworks with their own purpose/niche utility, but the ones on offer lack many things needed to be the solution for a payment system with no middlemen, because they cannot cope with the liquidity of people's needs and they rely on middlemen
***(liquid uses 'federated reserve' middlemen for the pegging, and the other L network needs middlemen signatures along the routes to give permission for funds to move through them for routing, and requires your destination to sign off on permission to receive payments)

notice the difference between consensus vs permissioned networks now, i hope.

so there will be new and different subnetworks without the flaws, which will offer better niche sub-services

i will apologise for being the bearer of bad news. but it does need to be said

note there is not one single insult in this post
legendary
Activity: 3472
Merit: 10611
When it comes to scaling, I've always believed that we need a middle ground and shouldn't rely on one solution alone. For example, you can't just increase the block size and call that a solution, just like you can't not increase the block size and say the second layer alone is the solution.

We basically need 3 things at the same time: increasing the capacity, compressing the blocks/transactions, and working on second layer solutions.

This is also why I liked the idea of SegWit2x, although it suffered from a flaw: it separated the hard fork and the soft fork, whereas it should have done it all in one hard fork.
In my opinion we need a hard fork sooner rather than later. A hard fork not only to increase the capacity (to something like a 4-8 MB block size) but also to fix a lot of bugs, let's say "weirdness", in the protocol, which can help with compression too. Weirdness like the one extra wasted byte that all OP_CHECKMULTISIG(VERIFY) scripts must contain, the couple of extra bytes wasted by all transactions with a witness, the 4-byte transaction version that is almost completely useless, the flaws in the way [sig]OPs are counted, the existence of P2PKH and P2WPKH at the same time while they serve the exact same purpose (same as P2SH and P2WSH), and a lot more.
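As an aside on the OP_CHECKMULTISIG quirk mentioned above: the opcode pops one more stack element than it actually uses, so every spending script has to push a wasted dummy element. Below is a purely illustrative sketch of the stack layout for a 2-of-3 spend, not real Bitcoin Core code:

Code:
# Illustration of the OP_CHECKMULTISIG off-by-one quirk: the opcode consumes
# one extra stack item, so spenders push a dummy (OP_0) that is never used.
# Purely illustrative data structures, not an actual script interpreter.
redeem_script = ["OP_2", "<pubkey1>", "<pubkey2>", "<pubkey3>", "OP_3",
                 "OP_CHECKMULTISIG"]           # 2-of-3 multisig locking script

script_sig = ["OP_0",                          # <- the wasted dummy element
              "<sig1>", "<sig2>",              # the two required signatures
              "<redeem_script>"]               # P2SH-style redeem script push

print(script_sig[0])   # the extra byte every such spend carries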
full member
Activity: 496
Merit: 142
Hire Bitcointalk Camp. Manager @ r7promotions.com
I think it's fine if most users run SPV wallets, if casual users use custodial wallets, and only enthusiasts run full nodes. Yes, they won't get all the benefits of being your own bank and "don't trust, verify", but most people don't need them. Just owning and using Bitcoin in a sufficiently secure manner is enough.
Do SPV wallets have to upgrade their wallet software when the Bitcoin protocol has upgrades?

Users of SPV wallets simply download and install those wallets to use for storage and transaction broadcasting. If they can take extra technical steps to verify the wallet software before using it, even better.

I don't think users should be expected to have deep technical knowledge of code or of every step. They simply interact with good wallet software through the wallet interface. To help Bitcoin adoption, I don't think developers should force users to interact with wallet software through code.

Increasing the block size is great when the time comes. If Bitcoin had the SegWit upgrade in 2017, it can have a SegWit 2 in the future. The community will support a block size increase and upgrade if it brings benefits in transaction fees and average waiting time for confirmations.
legendary
Activity: 3024
Merit: 2148
I don't believe that changing the block size limit is the ticket to a globally scaled network. But neither do I believe that leaving it as is is the answer. I'm all in for transaction compression (which is what scaling is, in the end), such as an off-chain solution like Lightning, but the problem with this is that it restricts access for technically incompetent users. It fundamentally does, because no ordinary person who wants to gain the benefits of peer-to-peer cash has the time to configure a lightning node and a Bitcoin node, and run both.

Why are you saying that alternatives to a block size increase are increasing Bitcoin's complexity? If a user can run a full node, they can run an LN channel. And if they can't run a full node because they are bad with technology, then they won't do it, regardless of whether the block size is big or small.

I think it's fine if most users run SPV wallets, if casual users use custodial wallets, and only enthusiasts run full nodes. Yes, they won't get all the benefits of being your own bank and "don't trust, verify", but most people don't need them. Just owning and using Bitcoin in a sufficiently secure manner is enough.
hero member
Activity: 504
Merit: 625
Pizza Maker 2023 | Bitcoinbeer.events
Thank you for starting this discussion on the sustainability of the Bitcoin network. These are important questions to consider as the network continues to grow and evolve.

Regarding the block size limit, it is a trade-off between centralization and decentralization. On one hand, larger blocks allow for more transactions to be processed, increasing scalability. On the other hand, larger blocks also require more computing power and storage space, making it more difficult for smaller players to participate in the network as full nodes. This can lead to centralization, where only a few large entities are able to participate as full nodes, while the rest of the users have to rely on these entities to validate their transactions.

In terms of block size limit, there is no one-size-fits-all answer, as the optimal block size will depend on the specific use case and the goals of the network. Some proponents argue that the block size limit should be increased in the future to accommodate more transactions, while others believe that alternative scaling solutions, such as the Lightning Network, should be pursued instead.

Regarding the issue of backwards compatibility, it is an important factor to consider when making changes to the network. Any changes that break backwards compatibility can result in a split in the network, potentially leading to multiple versions of the cryptocurrency. This can cause confusion and fragmentation among users, making it more difficult for the network to achieve widespread adoption.

In conclusion, the future of the Bitcoin network's scalability and sustainability is uncertain, and there are many factors to consider when making decisions about the network's future. Discussions like this can help to gather diverse perspectives and find the best way forward.
legendary
Activity: 2030
Merit: 1569
CLEAN non GPL infringing code made in Rust lang
Yes, I agree that there are flaws in the system (or flaws were introduced). And Ordinals is a clear demonstration of a spam attack against Bitcoin that needs to be addressed for any hope of sustainability. It was decent until this happened. Even your idea has merit; anything to make it more expensive for spammers or deter them in any way, so they naturally go away to wherever it's cheaper to do so.

I believe one of the reasons for the existence of PoW was designing a system to address the issue of spam in email. It is ironic that we now need to address the issue of spam in the blockchain, as it has clearly gotten out of hand.

So if we go back to the previous, pre-Ordinals stage, was it sustainable? I think so, yes. Even if you didn't have universal instant transactions, it was still good enough for most transactions and the occasional instant transaction; a system which could still be improved, sure. Lightning has its own demerits, but at least it was optional and you could just ignore it.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
changing the opcodes so each opcode has a defined byte limit does not censor transactions but defines a transaction's utility and purpose.
But I don't understand how this is associated with sustainability. If you think stuff like Ordinals is an obstacle to sustainability, then the system is fundamentally flawed to begin with. Limiting opcodes will not prevent anyone from spamming the blockchain. Simple example: split the ordinal into x transactions. The only disincentive is the money they have to pay. Anything beyond that will probably do more harm than good.

defining the opcode byte limit makes transactions lean, supporting less wasted space and allowing more transactions per block
So, what you're suggesting is to go on without backwards-compatibility. I agree that doing this might make the system work more efficiently, but I'm not quite sure about this: can there be progression without backwards-compatibility?
legendary
Activity: 4410
Merit: 4788
i'll be polite

changing the opcodes so each opcode has a defined byte limit does not censor transactions but defines a transaction's utility and purpose.
EG taproot's promise of lean signature space that only requires <80 bytes per transaction, instead of the current allowance of <3.99mb
multisig with a 15-address limit = 15*80 bytes
and so on
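A hypothetical sketch of what such per-opcode byte limits could look like as a policy check; the limit values and the function are illustrative assumptions, not actual Bitcoin consensus rules or any existing proposal's spec:

Code:
# Hypothetical per-purpose byte limits, as described above. Illustrative only:
# not actual Bitcoin consensus rules or Bitcoin Core code.
MAX_BYTES = {
    "taproot_signature": 80,        # the "lean signature" promise
    "multisig_15_of_15": 15 * 80,   # 15 signatures * 80 bytes each
    "op_return_payload": 80,
}

def within_limits(pushes: dict) -> bool:
    """Reject a tx whose data pushes exceed the defined per-purpose caps."""
    return all(size <= MAX_BYTES.get(kind, 0) for kind, size in pushes.items())

print(within_limits({"taproot_signature": 64}))       # True: a lean spend
print(within_limits({"taproot_signature": 390_000}))  # False: bloaty inscription-style data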
...
hardening the softness of these opcodes does not censor development, it prevents buggy new upgrades from sliding in so easily, because it requires nodes to upgrade to be prepared to verify a new ruleset (proposal) before it can activate

this means developers can still code things, but they will be coding things the majority would enjoy and want, instead of what a small group has sponsored them to slide in

i understand hard consensus sounds like censorship of development by preventing a corporation from lobbying the developers to do a certain thing. but it can also be an opportunity for said corporations to think beyond their personal desires, think about what the whole bitcoin community desires, and sponsor features that help and give the whole community something nice

....
hardening consensus to require node readiness pre-activation supports network security

defining the opcode byte limit makes transactions lean, supporting less wasted space and allowing more transactions per block

removing the kludge code and fake math of byte counting.. which restricts tx data to 1mb while allowing that excess uncounted (discounted) wasteful weight space.. would then allow more transactions per block by letting them into the 3mb restricted area (restricted to certain witness formats) without changing the 4mb limit for now
where all transactions are seen as having the same level of "freedom" to use part of the 4mb without being treated like a limited class
..
including a proper and better fee formula (one that's not just discounting new features to push people into using them), instead making spammers that respend multiple times a day pay more than frugal spenders (once a day), which would penalise the spammers without making the rest of the community "pay more"
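A hypothetical sketch of that kind of respend-penalty fee formula; the doubling rule and base fee are illustrative assumptions only, not an existing rule or any concrete proposal:

Code:
# Hypothetical respend-penalty fee sketch. Illustrative assumptions only:
# the first spend of the day pays the base rate, each respend doubles it.
def fee_multiplier(spends_today: int) -> int:
    return 2 ** max(0, spends_today - 1)

base_fee_sats = 1_000
for n in (1, 2, 5):
    print(n, "spends today ->", base_fee_sats * fee_multiplier(n), "sats for the next tx")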

..
all of this can be done without a re-org or FUD about needing to split the network, nor any opposition wars.*

* well, unless one brand of nodes continues to pretend the network is their kingdom and they are the only monarchs who should reign supreme and rule over its peasants
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
Reserved for future summaries.