I was looking through the Burstcoin source code, and one thing that worries me about future scalability is the hard-coded MAX_NUMBER_OF_TRANSACTIONS constant, which limits each block to 255 transactions. This is similar to the 1 MB block size limit that is currently affecting Bitcoin: bound by that limit, the Bitcoin network can only process a maximum of about 7 transactions per second. Burstcoin has a similar cap right now of roughly 63 transactions per minute (about 1 tx/sec). Because transaction volume is still low, this gives Burstcoin more time to solve the problem - but since solving it requires a hard fork, I think it would be better to solve it now while the coin is still young.
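For concreteness, here is the back-of-the-envelope arithmetic behind those numbers (a minimal sketch; the 240-second figure assumes Burst's 4-minute average block time):

```java
public class ThroughputEstimate {
    // Hard-coded cap discussed above
    static final int MAX_NUMBER_OF_TRANSACTIONS = 255;
    // Burst targets one block roughly every 4 minutes
    static final int TARGET_BLOCK_TIME_SECONDS = 240;

    public static void main(String[] args) {
        double txPerSecond = (double) MAX_NUMBER_OF_TRANSACTIONS / TARGET_BLOCK_TIME_SECONDS;
        double txPerMinute = txPerSecond * 60;
        // Prints roughly 1.06 tx/sec (63.8 tx/min)
        System.out.printf("Max throughput: %.2f tx/sec (%.1f tx/min)%n", txPerSecond, txPerMinute);
    }
}
```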
I think the best solution is something along the lines of Gavin Andresen's proposal to increase the maximum block size by 50% per year. In Burstcoin's case, the equivalent would be to increase the maximum number of transactions per block by 3.45% per month (which compounds to roughly a 50% increase per year).
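To make the proposal concrete, here is a rough sketch of what a formulaic limit could look like. All names here are hypothetical, not from the Burstcoin source; it assumes 4-minute blocks, so about 10,800 blocks approximate one month:

```java
public class TransactionLimitSchedule {
    static final int BASE_MAX_TRANSACTIONS = 255;   // current hard-coded limit
    static final int BLOCKS_PER_MONTH = 10_800;     // ~30 days of 4-minute blocks
    static final double MONTHLY_GROWTH = 1.0345;    // +3.45%/month, ~+50%/year compounded

    /** Maximum transactions allowed in a block at the given height,
     *  measured from a (hypothetical) hard-fork height. */
    static int maxTransactionsAtHeight(int blockHeight, int forkHeight) {
        if (blockHeight < forkHeight) {
            return BASE_MAX_TRANSACTIONS;
        }
        int monthsSinceFork = (blockHeight - forkHeight) / BLOCKS_PER_MONTH;
        double limit = BASE_MAX_TRANSACTIONS * Math.pow(MONTHLY_GROWTH, monthsSinceFork);
        return (int) Math.floor(limit);
    }

    public static void main(String[] args) {
        int forkHeight = 100_000; // placeholder fork height for illustration
        // Twelve "months" of blocks after the fork, the limit is ~382 (about +50%).
        System.out.println(maxTransactionsAtHeight(forkHeight + 12 * BLOCKS_PER_MONTH, forkHeight));
    }
}
```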
This is an important change for the future scalability of Burstcoin, and it's important to make it now: the more the coin matures, the more difficult it will be to introduce a successful hard fork.
If you're interested, @burstcoin, I'd be willing to take a crack at coding up this change in behavior.
Bumping this since no one commented on it. Does anyone else think it's a good idea to raise the maximum block size (transactions per block) formulaically over time?
While there certainly is a scalability issue, it won't get fixed just by raising that number. A 4x increase wouldn't be a bad starting point, since NXT has the same per-block limit at a 1-minute block rate instead of 4 minutes. But regardless of what it is set to, raising the limit only postpones the problem; the real issue is that full network consensus is required for every transaction. Spamming every transaction to everyone won't allow high enough transaction rates for heavy usage over the long term.
I agree that simply changing MAX_NUMBER_OF_TRANSACTIONS isn't enough to solve the scalability issues. However, it is a necessary prerequisite. And since changing this figure requires a hard fork, I think it is best to make the change sooner rather than later. The longer we wait, the riskier and more difficult it becomes to execute the hard fork. It makes sense to get all the needed hard forks out of the way now, while the complex politics of a large-market-cap coin haven't emerged yet.
Perhaps a pragmatic solution would be to introduce a second constant that represents a soft maximum, similar to how Bitcoin has a hard limit of 1 MB on block size but is also configured by default not to create blocks larger than 250 kB. That second value is user-configurable, though, and some miners do indeed create larger blocks. This would let us solve all the other scalability issues as they come up without worrying about the complexity of a hard fork. Those other issues will also partially solve themselves over time as technology gets better.
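A rough sketch of how that split could look (all names and the property key are hypothetical, loosely mirroring Bitcoin's hard 1 MB consensus limit versus its default 250 kB block-creation target):

```java
import java.util.Properties;

public class BlockTransactionLimits {
    // Consensus rule: blocks with more transactions than this are rejected by everyone.
    // Changing it requires a hard fork, so it is set generously up front.
    static final int HARD_MAX_TRANSACTIONS = 100_000;

    // Default policy: what a forger will put into blocks it creates itself.
    // Raising it later is just a config/default change, not a fork.
    static final int DEFAULT_SOFT_MAX_TRANSACTIONS = 255;

    /** Reads the user-configurable soft maximum, clamped to the consensus hard limit. */
    static int softMaxTransactions(Properties properties) {
        int configured = Integer.parseInt(
                properties.getProperty("burst.maxBlockTransactions",
                        String.valueOf(DEFAULT_SOFT_MAX_TRANSACTIONS)));
        return Math.min(configured, HARD_MAX_TRANSACTIONS);
    }

    /** Consensus-side validation: only the hard limit matters here. */
    static boolean isBlockValid(int transactionCount) {
        return transactionCount <= HARD_MAX_TRANSACTIONS;
    }
}
```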
In 15 years, a standard desktop machine will likely come with terabytes of RAM, petabytes of hard drive storage, and hundreds of CPU cores, and it will likely have at least a gigabit Internet connection. A machine like this should easily handle 100k transactions per 4-minute block.
I really urge you to look into all the issues and politics surrounding the Bitcoin max block size. We can avoid those issues in Burstcoin if we simply plan for the future now rather than putting off the changes until it is too late.