
Topic: Dynamic Scaling? - page 2. (Read 561 times)

member
Activity: 392
Merit: 41
December 13, 2017, 01:37:37 AM
#3
I kinda liked your proposal at first glance, but then I realized it is already implemented.

You see, block size IS already "dynamic". The code only specifies a maximum block size, so as a miner you can generate blocks of any length up to that limit. Theoretically you could allow limitless block sizes and let a single miner empty the mempool in the next block, but that would end in blockchain spam and insane chain bloat (up to the point that Bitcoin becomes absolutely unusable). Therefore you need to set a reasonable limit for each block, and that's what has been done. Some people argue that this limit should be higher, and by all means it can be higher, but that isn't the solution, since bigger blocks mean a bigger blockchain, more bandwidth usage, and more bloat.

Lightning (perhaps not in its current state) is completely different: it allows barrages of valid transactions to be sent without actually posting most of them to the blockchain. The blockchain retains its function as a "bank", keeping your money safe, while Lightning lets you use your money effectively in small chunks.
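To illustrate the point: here is a minimal sketch of how that consensus rule works, assuming a pre-SegWit-style fixed byte cap (the constant and function names are illustrative, not Bitcoin Core's actual code). The rule is a ceiling only; there is no minimum, so block size already floats with demand.

```python
# Simplified sketch (not actual Bitcoin Core code): consensus only caps
# the maximum serialized size; miners decide how full each block is.
MAX_BLOCK_SIZE = 1_000_000  # bytes; the historical 1 MB limit

def is_block_size_valid(serialized_block: bytes) -> bool:
    """A block of ANY size up to the cap is valid; there is no minimum."""
    return len(serialized_block) <= MAX_BLOCK_SIZE

# A miner may produce a nearly empty block...
assert is_block_size_valid(b"\x00" * 100)
# ...or one right at the limit, but never beyond it.
assert is_block_size_valid(b"\x00" * MAX_BLOCK_SIZE)
assert not is_block_size_valid(b"\x00" * (MAX_BLOCK_SIZE + 1))
```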

member
Activity: 67
Merit: 13
December 13, 2017, 01:24:28 AM
#2
I'd like to add that I don't believe the Lightning Network is the best solution to scalability. While it can certainly address the currently high fees, I have some long-term concerns about relying on it. The thing is, the Bitcoin network was designed so that fewer coins are minted over time, allowing miners to gradually transition to being paid in fees. Eventually the only financial incentive miners will have is transaction fees, which means more on-chain transactions would mean better mining incentives.
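For reference, the issuance schedule works like this: the block subsidy starts at 50 BTC and halves every 210,000 blocks (roughly every four years), so the subsidy trends to zero and fees have to take over. A short sketch of that schedule:

```python
# Illustrative sketch of Bitcoin's issuance schedule: the block subsidy
# starts at 50 BTC and halves every 210,000 blocks, so transaction fees
# must eventually replace the subsidy as the miners' income.
HALVING_INTERVAL = 210_000
INITIAL_SUBSIDY_SATS = 50 * 100_000_000  # 50 BTC, in satoshis

def block_subsidy_sats(height: int) -> int:
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # subsidy is exactly zero from this point on
        return 0
    return INITIAL_SUBSIDY_SATS >> halvings  # halve once per interval

print(block_subsidy_sats(0))        # 5_000_000_000 sats (50 BTC)
print(block_subsidy_sats(420_000))  # 1_250_000_000 sats (12.5 BTC)
```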

Using the Lightning Network to open channels through which many transactions can flow without on-chain fees circumvents that, and can actually lead to higher average fees for everyone else. At present it would work great, because it would remove a large number of transactions that would otherwise congest the Bitcoin network and would improve usability for smaller transactions, but that doesn't sidestep the need to address the network's scalability issues. If anything, the Lightning Network seems better suited as a tool for decentralized exchange between cryptocurrencies; it shouldn't replace the ability of any particular coin to act as a medium of exchange.

If node operators were rewarded by the network according to their bandwidth, the way miners currently are rewarded for hashing, they would have a financial incentive to scale up their capabilities, and the network as a whole would scale up with them if dynamic scaling were implemented. That would help transactions everywhere confirm cheaply, and the Lightning Network could still be useful for quick in-person transactions at stores, where you don't want to wait more than a few moments for a transaction to confirm.

If we go the route of relying on secondary networks to exchange value, how exactly are miners going to be paid when there are no coins left to mint and no on-chain transactions to confirm?
member
Activity: 67
Merit: 13
December 10, 2017, 12:48:02 PM
#1
Is there a reason why Bitcoin doesn't have an implementation of dynamic scaling the way it has dynamic difficulty adjustment? For example, instead of having to argue over how big a block should be every few years, why not let the network itself decide? From there, maybe offset node operators' increased bandwidth costs with some kind of reward for running the node itself?

It just seems to me that this approach would let the block size grow or shrink according to the actual needs of the network: larger during times of heavy spending, smaller during times of relative inactivity, leading to lower average bandwidth usage than simply raising the limit. Even if there is a reason to keep it from going above a certain point, wouldn't it make more sense to use an arbitrary ceiling, so blocks can shrink below that size but not rise above it, at least initially? Then some community-agreed protocol could set concrete rules for whether and under what circumstances the ceiling is raised, and raise it as a secondary dynamic system after a period of initial testing.

So, for example, one dynamic rule would increase or decrease the block size according to the actual needs of the network, while a second dynamic rule would set the floor and ceiling values that the first rule has to work within, even if doing so causes a bottleneck. This second rule would then adjust the ceiling according to the average capabilities of the people running nodes, so if enough people want the ceiling to go up, they just have to upgrade their equipment to handle it. And by rewarding people who run nodes (maybe based on network activity), it gives them an incentive to help scale up the network.
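To make the two-rule idea concrete, here is a hypothetical sketch of it (this is NOT a deployed protocol; the constants, the 2x-median target, and the function name are all assumptions made up for illustration). Rule 1 retargets the limit from recent demand, like a difficulty retarget; rule 2 clamps rule 1 between a floor and a node-capability ceiling.

```python
# Hypothetical two-rule dynamic block-size limit, for illustration only.
import statistics

FLOOR = 250_000       # bytes; assumed lower bound for the limit
CEILING = 8_000_000   # bytes; assumed upper bound, itself adjustable later

def next_block_size_limit(recent_block_sizes: list[int]) -> int:
    """Rule 1: target 2x the median of recent blocks, like a retarget."""
    target = 2 * statistics.median(recent_block_sizes)
    # Rule 2: the dynamic limit may never leave the [FLOOR, CEILING] band.
    return int(min(max(target, FLOOR), CEILING))

# Quiet period: the limit shrinks back toward the floor.
print(next_block_size_limit([90_000, 110_000, 100_000]))         # 250_000
# Heavy demand: the limit grows, but only up to the ceiling.
print(next_block_size_limit([6_000_000, 7_000_000, 7_500_000]))  # 8_000_000
```

The clamp is what keeps a burst of full blocks (or a spam attack) from dragging the limit anywhere the node operators haven't agreed to support.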

I've had this thought for a long while, actually. At first glance higher fees would seem to benefit miners, but I'd argue that miners are actually hurt: with higher fees people are less willing to make transactions, and Bitcoin becomes a store of value rather than the means of transferring value that I feel it should be. With an improved network, transaction fees go down, more people make more transactions, and so miners are able to survive on transaction fees alone when the last coin is mined.

While secondary coins could also be used to make purchases, there would be exchange fees involved, which would also influence the prices of each coin, and the other coins have the same scaling problem. By my calculations, for Bitcoin (or any cryptocurrency with the same number of coins) to gain the status of a world reserve currency, it would need to be worth around a million dollars a coin, which makes a single satoshi worth around a penny. Beyond that point, the maximum number of decimal places would need to be increased for the value to rise much further and still be usable in the same way. I'd argue there should also be an agreed-upon ruleset that adds a digit automatically, well in advance of the need for it, based on some predicted value threshold within the network.
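Checking that arithmetic: one bitcoin is divisible into 100,000,000 satoshis, so at a (hypothetical) price of $1,000,000 per coin, one satoshi is worth exactly one cent.

```python
# Sanity-checking the reserve-currency arithmetic above.
SATOSHIS_PER_BTC = 100_000_000
price_per_btc = 1_000_000  # USD; the hypothetical reserve-currency price

satoshi_value = price_per_btc / SATOSHIS_PER_BTC
print(satoshi_value)  # 0.01 USD, i.e. one penny per satoshi
```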

Is there any real reason why something like this can't work, or is somehow less preferred than a scenario of people arguing over small details every so often in perpetuity?