
Topic: The MOST Important Change to Bitcoin (Read 17180 times)

hero member
Activity: 1078
Merit: 520
December 11, 2016, 11:42:24 AM
#48
I would say that rewarding non-mining nodes a small share of the block reward would have been a great idea, but it probably wasn't a need that could have been foreseen. I think I am right in saying that Satoshi never foresaw ASIC chips and massive mining farms in China. I think he assumed that every node would be mining and thus would eventually be rewarded.
newbie
Activity: 42
Merit: 0
December 11, 2016, 10:47:45 AM
#47
The most important change to Bitcoin? Hmm, I think there is nothing to change at all. In my opinion its development is already going well. Rather, Bitcoin changes the world for the better: the internet brought us free distribution of information, and Bitcoin brings us free distribution of commerce.
hero member
Activity: 952
Merit: 515
December 09, 2016, 01:07:54 AM
#46
I don't think there is any need for change; I trust the one who made it, and I know that he studied it deeply. For me, it is great. That said, there is still room for improvement, like faster transaction confirmation and more promotion, but overall it is good.
member
Activity: 67
Merit: 10
December 08, 2016, 09:24:59 PM
#45
I'm really not sure, to be honest.

We really need to keep it decentralised, though.
legendary
Activity: 3066
Merit: 1383
Join the world-leading crypto sportsbook NOW!
December 08, 2016, 04:04:30 PM
#44
That was an interesting topic, so I decided to bring it back to the first page. What can be changed? I think the total quantity of BTC. It is true that 21 million is too small if we are talking about Bitcoin becoming very popular and global. The price will increase because of the limited supply, and then 20,000 satoshi per transaction may become too much. There just won't be enough satoshi to go around, and there is nothing smaller than a satoshi now.
full member
Activity: 150
Merit: 100
July 29, 2010, 09:34:31 AM
#43

I don't understand what you're saying about speed; protocol buffers were designed by Google to satisfy three requirements:

My complaint here is using XML as the yardstick of efficiency.  XML is hardly the most efficient data format, but it offers other advantages, and the typical application that uses XML doesn't necessarily need hyper-efficient data formatting either.  I'm not disputing what Google has done here, either: protocol buffers give a sort of XML-like data framework with increased efficiency, including CPU speed.  It just doesn't solve every problem for everybody, and it shouldn't be viewed as the ultimate solution for programming either.

I guess I'm sort of comparing an efficient data framework to programming in a high-level language like C or Python vs. assembly language.  Assembly programming is far and away more efficient, but it takes some skill to program in that manner.  It is also something taught very poorly (if at all) by most computer programming courses of study.

I wouldn't personally have used XML as an indicator of performance; however, those are the only numbers presented on the protobuf website. I might throw together a test using the C# implementation of protocol buffers to get some numbers for protocol buffers vs. hardcoded binary packets. However, I suspect speed isn't *really* a problem here; the serialisation time is on the scale of hundreds of nanoseconds - not a problem!

As for size, I suspect protocol buffers are going to be smaller than a handwritten packet layout by Satoshi for a couple of reasons:
1) Bitcoin includes reserved fields for forwards compatibility; protocol buffers don't need them.
2) Protocol buffers include complex things like variable-length encoding, which would be a silly micro-optimisation for Satoshi to implement by hand but comes for free with protocol buffers (and can significantly decrease the size of a packet).
3) Losing a couple of bytes on the size of a packet (if packets do get bigger at all; I suspect they won't) while gaining cross-language compatibility, standardisation of the packet layout, significant ease of use in code AND forwards compatibility is a *very* good tradeoff.
full member
Activity: 224
Merit: 141
July 28, 2010, 12:32:15 PM
#42
Generally a compact custom format works better in terms of bandwidth and disk usage, but I do see some advantages for something like this.  I am curious about how this works for forward compatibility where a new field or some aspect is added that wasn't accounted for in an earlier specification.  It does become a big deal, and I'll admit that XML and similar kinds of data protocols tend not to break as easily compared to rigid custom protocols.
By the same logic, C is faster than Python. However, I've run across a few Python programs that were orders of magnitude faster than similar C programs. How? The C programs were sloppy and Python's standard library functions are increasingly well-optimized.

It is a myth that C is the most optimized programming language and the one that produces the most efficient software.  If you want to get into language-bashing wars, I'm all for it, and C would be one of my first targets.  I myself prefer Object Pascal, but some of that is habit and intimate knowledge of the fine points of the compilers for that language.  I have challenged any developer to compare binary files for similar implementations, and also to note compilation speeds on larger (> 100k lines of code) projects.  Most C compilers lose on nearly every metric.  Enough of that diversion.

A custom specification that doesn't rely upon a protocol framework is almost always going to be faster, but it is also much more fragile in terms of future changes and debugging.  It doesn't have to be fragile in terms of forward compatibility, but you have to be very careful about how the protocol is implemented to achieve that.  A formal protocol framework helps you avoid those kinds of problems with an existing structure, but it does add overhead to the implementation.

I don't understand what you're saying about speed; protocol buffers were designed by Google to satisfy three requirements:

My complaint here is using XML as the yardstick of efficiency.  XML is hardly the most efficient data format, but it offers other advantages, and the typical application that uses XML doesn't necessarily need hyper-efficient data formatting either.  I'm not disputing what Google has done here, either: protocol buffers give a sort of XML-like data framework with increased efficiency, including CPU speed.  It just doesn't solve every problem for everybody, and it shouldn't be viewed as the ultimate solution for programming either.

I guess I'm sort of comparing an efficient data framework to programming in a high-level language like C or Python vs. assembly language.  Assembly programming is far and away more efficient, but it takes some skill to program in that manner.  It is also something taught very poorly (if at all) by most computer programming courses of study.
full member
Activity: 210
Merit: 104
July 28, 2010, 02:15:56 AM
#41
Generally a compact custom format works better in terms of bandwidth and disk usage, but I do see some advantages for something like this.  I am curious about how this works for forward compatibility where a new field or some aspect is added that wasn't accounted for in an earlier specification.  It does become a big deal, and I'll admit that XML and similar kinds of data protocols tend not to break as easily compared to rigid custom protocols.
By the same logic, C is faster than Python. However, I've run across a few Python programs that were orders of magnitude faster than similar C programs. How? The C programs were sloppy and Python's standard library functions are increasingly well-optimized.

As an example, the CAddress serialization in Bitcoin, passed around in at least the addr message and version message, is 26 bytes. That's 8 bytes for the nServices field (uint64, and always 1 on my system), 12 bytes marked "reserved", and the standard 6 bytes: 4 bytes for IP, 2 bytes for port number. While I agree that including the ability to support IPv6 and other services (or whatever nServices is for) in the future is a great idea, I also think it's a bit wasteful to use 26 bytes for what can currently be encoded as 6 bytes. With protocol buffers, this would be smaller now but retain the ability to extend in the future.
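To make the size comparison concrete, here is a minimal sketch (illustrative Python, not the protobuf library or Bitcoin code) of the varint scheme protocol buffers use for integers: a value costs only as many bytes as it needs, so a uint64 field that is always 1, as nServices apparently is, would cost one byte instead of a fixed eight.

```python
def encode_varint(n):
    """Protobuf-style varint: 7 payload bits per byte, with the high
    bit set on every byte except the last."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)   # more bytes follow
        else:
            out.append(byte)          # final byte
            return bytes(out)

assert encode_varint(1) == b"\x01"           # 1 byte, vs 8 for a fixed uint64
assert encode_varint(150) == b"\x96\x01"     # 2 bytes
assert len(encode_varint(2**32 - 1)) == 5    # large values grow gently
```

The tradeoff is the one described above: a very large value can cost an extra byte over a fixed-width field, but small, common values shrink dramatically.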

I agree that encryption and compression are a bit harder to take into account, but at the very least they could be layered on top of the protocol buffers. Build the byte string with something extensible, then encrypt/compress it and wrap it in a tiny header that says that you did that. You lose 2 or 4 bytes for the trouble, but you gain the ability to change the format down the road. Either that, or encrypt only certain protocol buffer fields and put the serialization on top of the encryption.
sr. member
Activity: 252
Merit: 250
July 26, 2010, 10:30:55 AM
#40
ASN.1 :o
full member
Activity: 150
Merit: 100
July 26, 2010, 08:46:20 AM
#39
I don't understand what you're saying about speed; protocol buffers were designed by Google to satisfy three requirements:

1) Forwards compatibility (Google changes their protocols all the time, protocol buffers allow them to do this with ease)
2) Speed (every millisecond matters, and protocol buffers are around the fastest serialisation method out there; the documentation says they "are 20 to 100 times faster [than XML]")
3) Size (Protocol buffers are tiny, the documentation says "are 3 to 10 times smaller [than XML]")

Protocol buffers were designed almost for the exact problem bitcoin is facing.
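The forwards-compatibility point comes from the wire format itself: every field is tagged with a number and a wire type, so a reader can skip field numbers it has never heard of. A toy Python reader handling just the two wire types it needs (a sketch of the idea, not the real protobuf library):

```python
def read_varint(buf, i):
    """Decode one varint starting at index i; return (value, next index)."""
    shift = value = 0
    while True:
        b = buf[i]
        i += 1
        value |= (b & 0x7F) << shift
        if not b & 0x80:
            return value, i
        shift += 7

def parse_known(buf, known_fields):
    """Walk a protobuf-style wire stream, keeping fields we know and
    silently skipping the rest. Only wire types 0 (varint) and 2
    (length-delimited) are handled in this sketch."""
    out, i = {}, 0
    while i < len(buf):
        tag, i = read_varint(buf, i)
        field, wire = tag >> 3, tag & 0x7
        if wire == 0:
            value, i = read_varint(buf, i)
        elif wire == 2:
            length, i = read_varint(buf, i)
            value, i = buf[i:i + length], i + length
        else:
            raise ValueError("wire type not handled in this sketch")
        if field in known_fields:
            out[field] = value
    return out

# Field 1 = 150; field 2 = b"hi" is "new" and unknown to an old reader.
msg = bytes([0x08, 0x96, 0x01, 0x12, 0x02]) + b"hi"
assert parse_known(msg, {1}) == {1: 150}               # old reader: no crash
assert parse_known(msg, {1, 2}) == {1: 150, 2: b"hi"}  # new reader sees both
```

An old client simply ignores fields it doesn't understand, which is exactly how new message fields can be introduced without breaking existing nodes.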

Addendum: If you want to know how they work, have fun
full member
Activity: 224
Merit: 141
July 26, 2010, 08:42:10 AM
#38
It is also possible to make protocols that are forward compatible with changes that may be made in the future.

Making forward-compatible protocols is something I've been trying to tackle for my computer science final-year project; the solution I came up with was protocol buffers. It just struck me that they're perfect for Bitcoin for a whole load of reasons:

Generally a compact custom format works better in terms of bandwidth and disk usage, but I do see some advantages for something like this.  I am curious about how this works for forward compatibility where a new field or some aspect is added that wasn't accounted for in an earlier specification.  It does become a big deal, and I'll admit that XML and similar kinds of data protocols tend not to break as easily compared to rigid custom protocols.

Another problem with something of this nature is that you have to stay within the framework of the specification language or however else this data structure is organized, and it doesn't anticipate things like encryption and compression.

While there certainly are many applications that could use a tool of this nature, I'm not entirely sure that Bitcoins is necessarily "a perfect application" of this particular tool.  Considering the nature of this project and anticipated scales of operation, overhead and abstraction can be quite costly for bandwidth even if it is just a couple of bytes and a few extra machine cycles.  That is something which matters.
full member
Activity: 150
Merit: 100
July 26, 2010, 06:59:41 AM
#37
Nope, the standard implementation of protocol buffers is under the new BSD license. Furthermore, there are loads of versions of protocol buffers in many different languages, published under a variety of licenses. Some people have even written entire protobuf parsers in 100 lines, so if we really wanted, Bitcoin could have its own implementation of a protobuf parser, and that code would be entirely ours. However, the standard implementation is just fine ;)
hero member
Activity: 938
Merit: 500
CryptoTalk.Org - Get Paid for every Post!
July 26, 2010, 05:06:48 AM
#36
Google maintains the rights to code they have designed.  Incorporating it would require data reporting to Google.  Generally not a good thing.
full member
Activity: 150
Merit: 100
July 25, 2010, 08:59:58 PM
#35
I don't know much about the scripts, but if you're right and this is what scripts were designed to address, then I'd push for protocol buffers; they're designed and built by Google, so (no offence to Satoshi) they're probably better ;)

Also, the fact that protocol buffers are supported by lots of other languages would make building clients (without generation) in other languages (with protobuf support) absolutely trivial.

Edit: Changing the packet structure to use protocol buffers would be difficult to do, although I would still highly recommend it. However, changing the structure of the local files to use protocol buffers isn't a breaking change, which means it would be an excellent idea in my opinion (smaller, faster, neater code, easier to parse in other languages, etc.)
full member
Activity: 210
Merit: 104
July 25, 2010, 08:39:18 PM
#34
It is also possible to make protocols that are forward compatible with changes that may be made in the future.

Making forward-compatible protocols is something I've been trying to tackle for my computer science final-year project; the solution I came up with was protocol buffers. It just struck me that they're perfect for Bitcoin for a whole load of reasons:

-> Small (suitable for networking and hard disk storage)
-> Very fast
-> Implementations in loads of languages (so writing new clients becomes a lot simpler)
-> Forwards compatible (indeed, this is most of the point of protocol buffers)
-> Dead simple to use in code
-> Support for custom fields in packets (so, for example, a new client could start embedding messages in packets, and all the other clients would silently ignore this field)

So I guess the most important change to bitcoin for me is to start using protocol buffers for networking and saving the wallet file
Wow, that's awesome. That seems a lot like what the SCRIPT part of transactions was designed to address, but this seems like a better way...
full member
Activity: 150
Merit: 100
July 25, 2010, 08:25:22 PM
#33
It is also possible to make protocols that are forward compatible with changes that may be made in the future.

Making forward-compatible protocols is something I've been trying to tackle for my computer science final-year project; the solution I came up with was protocol buffers. It just struck me that they're perfect for Bitcoin for a whole load of reasons:

-> Small (suitable for networking and hard disk storage)
-> Very fast
-> Implementations in loads of languages (so writing new clients becomes a lot simpler)
-> Forwards compatible (indeed, this is most of the point of protocol buffers)
-> Dead simple to use in code
-> Support for custom fields in packets (so, for example, a new client could start embedding messages in packets, and all the other clients would silently ignore this field)

So I guess the most important change to bitcoin for me is to start using protocol buffers for networking and saving the wallet file
Red
full member
Activity: 210
Merit: 111
July 25, 2010, 08:12:39 PM
#32
I would make the rate of coin generation constant ...

You might try reading the economics forum. Lots of competing suggestions in there.

Beware, it's a hornet's nest!
newbie
Activity: 2
Merit: 0
July 25, 2010, 07:15:13 PM
#31
I would make the rate of coin generation constant rather than decreasing, so that there would not be constant deflation as people lose coins.  If one assumes that coins are lost at a constant proportional rate (kC, where C = the number of coins in circulation), and coins are generated at a constant (unchanging) rate G, then there is some circulation for which G - kC = 0 (in other words, the circulation stays constant).  That level is C = G/k.  So if we want the circulation to stabilize at C_f, then G should be set to kC_f.  For example, if 5% of coins are lost each year, then k is equal to -ln(0.95) per year = 0.0513 per year (a weird unit).  If we want C_f to be a trillion bitcoins, then roughly 51.3 billion bitcoins per year (a bit more than 5% of 1 trillion) should be created.
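The arithmetic above can be sanity-checked with a short simulation (illustrative Python; a yearly time step is an assumption for simplicity). Note that for a trillion-coin target, G = k·C_f works out to about 51.3 billion coins per year.

```python
import math

k = -math.log(0.95)      # ~0.0513/yr when 5% of coins vanish each year
C_f = 1e12               # desired stable circulation: one trillion coins
G = k * C_f              # constant issuance: ~5.13e10 coins per year

C = 0.0
for year in range(2000):
    C += G - k * C       # mint G, lose the fraction k, once per year

assert abs(G - 5.13e10) / 5.13e10 < 0.01   # ~51.3 billion coins/yr
assert abs(C - C_f) / C_f < 1e-9           # circulation settles at G/k
```

Starting from zero coins, the circulation converges to the equilibrium G/k regardless of the initial state, which is the self-correcting property the post describes.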
sr. member
Activity: 252
Merit: 268
July 25, 2010, 06:35:22 PM
#30
A breaking change could be made without breaking the network by setting the change to not take effect until a certain block six months or more in the future. That would give almost everyone plenty of time to upgrade their client. The few who didn't notice would still find out sometime after the change and would still be able to upgrade and retain bitcoins obtained before it.
full member
Activity: 152
Merit: 100
July 25, 2010, 06:10:12 PM
#29
Quote
We might not care that the minting rate is cut by 3/4 for months on end, but taking 4x longer to complete a transaction is a fairly big deal.
You mean it will be slow for months because the difficulty was ramped up and then someone bailed? In order to fall to 25%, wouldn't 75% of the computing power have to be pulled? If someone can do that, they control the project anyway.

Yes, the complete attack would require adding 3x the current CPU capacity for 3-4 days, long enough for the difficulty to increase by 4x, and then removing it. It's not necessary that all that CPU capacity be under the control of one person, however. A sudden, temporary surge of interest from many unrelated individuals could do the same thing (think "slashdotting"). So the attack could happen even without anyone taking control of the project.
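A simplified model of the retarget rule makes the attack's timeline concrete (hypothetical Python sketch; the real rule recalculates every 2016 blocks from elapsed time and clamps the change to a factor of 4 in either direction, which this reduces to a hashrate ratio):

```python
def retarget(difficulty, actual_hashrate, baseline_hashrate):
    """One simplified difficulty adjustment: difficulty scales with how
    fast the last period's blocks were found, clamped to 4x either way
    as in Bitcoin's actual rule."""
    ratio = actual_hashrate / baseline_hashrate
    return difficulty * max(0.25, min(4.0, ratio))

d = retarget(1.0, 4.0, 1.0)   # attacker adds 3x capacity: 4x total hashrate
assert d == 4.0
# The attacker leaves. Honest miners at 1x must grind out a full
# 2016-block period at 4x difficulty: ~40-minute blocks.
minutes_per_block = 10 * d
assert minutes_per_block == 40.0
assert 2016 * minutes_per_block / (60 * 24) == 56.0   # ~8 weeks to recover
```

So one brief surge of capacity, coordinated or not, leaves the network with slow confirmations for roughly two months until the next adjustment, which is the vulnerability described above.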
legendary
Activity: 1246
Merit: 1014
Strength in numbers
July 25, 2010, 05:46:33 PM
#28
Quote
  We might not care that the minting rate is cut by 3/4 for months on end, but taking 4x longer to complete a transaction is a fairly big deal.


You mean it will be slow for months because the difficulty was ramped up and then someone bailed? In order to fall to 25%, wouldn't 75% of the computing power have to be pulled? If someone can do that, they control the project anyway.
full member
Activity: 152
Merit: 100
July 25, 2010, 05:18:12 PM
#27
- The client is distributed with the developers' public keys.
This goes fundamentally against the decentralized nature of the project. Absolutely no one is special except as people choose them to be. Anyone can distribute any version, breaking or not, that they wish.

The point was not to prevent alternate versions (it's still open source, anyone can change the accepted keys). The point was simply to ensure that only those who released the user's client could trigger an update, as opposed to any node in the network.

Please don't make the mistake of presuming that the client version has anything to do with the protocol version.  We are talking about two completely different issues here that can (and I think should) be separated when talking about bitcoins.

I was referring exclusively to the protocol version supported by a given client. The version/release of the client software in use is, as you say, irrelevant in this context.

A client could be programmed with dual behavior. It keeps using the old behavior unless it sees that more than a certain percentage of clients on the network have signalled that they can support some new behavior and are willing to switch to it. ... The devil is in the details, of course, because different clients have different peers and will see different percentages of those peers being able to support the new behavior.

That would be why I suggested keying the changeover to the transactions in the block list, which are common to all clients, rather than the versions reported by one's peers.

Personally, I think it's going to be years, if ever, before a breaking change is needed.

I wouldn't be so sure about that. Most of the changes I've seen proposed are breaking changes--for example, anything which affects the difficulty calculation must be common to all the clients. As I pointed out, the period of the current calculation leaves Bitcoin vulnerable to certain attacks against the transaction confirmation rate. We might not care that the minting rate is cut by 3/4 for months on end, but taking 4x longer to complete a transaction is a fairly big deal.
full member
Activity: 224
Merit: 141
July 25, 2010, 11:45:13 AM
#26
Versioning sounds like version numbering to me, but anyway, if you were to go back and improve the ability of Bitcoin to be more backwards compatible with then-future changes, then yeah, that sure would make it more flexible. Good choice!
I realize there are already version numbers. However, as things stand older clients will just reject anything with a new version number, which results in splitting the block list. There is no protocol in place to allow new clients and old clients to coexist (which would be rather impressive--the changeable parts would have to be moved to some Turing-complete language that could be downloaded at runtime) or, perhaps more practically, to allow older clients to detect when a new change is being introduced and insist on an upgrade rather than proceeding under the old rules.

Please don't make the mistake of presuming that the client version has anything to do with the protocol version.  We are talking about two completely different issues here that can (and I think should) be separated when talking about bitcoins.

It is precisely for this reason that I have called for a second and perhaps fuller re-implementation of the protocol, as it would be healthier for Bitcoin for a whole bunch of reasons, and for the chance to concentrate on the protocol independently of any sort of central control over the CVS repository of the reference implementation of Bitcoin.

It is also possible to make protocols that are forward compatible with changes that may be made in the future.  It does take anticipating what those changes might be, or at least providing "hooks" to allow for some future changes, such as the extended-precision numerical values on transactions discussed earlier.  It is also important that such future changes can't destroy the fundamental principles being sought for the protocol... so this is a particularly fine line to walk if any sort of protocol change happens.

Even though real money is now being used for Bitcoins, I do think it would be useful for at least the present users to know this is very much experimental and there may be some unstoppable flaw in the current protocol that may require a "reboot" with a more secure protocol.  I've seen that happen with other peer-to-peer networks where a major upgrade forced a major segmentation between the "old" clients and the "new" clients.  Hopefully the change would only be done in such a way that could preserve the Bitcoins that have already been generated in some fashion, but that may not always be a guarantee either.
legendary
Activity: 1246
Merit: 1014
Strength in numbers
July 25, 2010, 07:38:31 AM
#25
Anyone can distribute any version, breaking or not, that they wish.
Absolutely true. If people who propose changes can phrase the changes in terms of "a client" rather than "the client", that will help to filter out impossible proposed changes.

Having said that, there are ways that breaking changes could be addressed. A client could be programmed with dual behavior. It keeps using the old behavior unless it sees that more than a certain percentage of clients on the network have signalled that they can support some new behavior and are willing to switch to it.

At that point, clients which support the scheme will switch to the new behavior, and everyone using other clients will need to change their software if they want to keep playing with the majority.

The devil is in the details, of course, because different clients have different peers and will see different percentages of those peers being able to support the new behavior. You could have two thresholds. A client could switch to the new behaviour if it saw (say) 85% of nodes capable of supporting the new behaviour, AND was receiving new-behaviour transactions. If it saw (say) 95% of nodes capable of supporting the new behaviour, it would switch unilaterally even if it had not seen a new-behaviour transaction yet (someone's gotta get the ball rolling).
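That dual-threshold rule is simple enough to sketch (illustrative Python; the 85%/95% figures are the examples from the paragraph above, and `peer_caps` is a hypothetical view of one node's peers):

```python
def should_switch(peer_caps, seen_new_tx, soft=0.85, hard=0.95):
    """Dual-threshold switchover: adopt the new behaviour if the soft
    threshold of peers supports it AND a new-style transaction has been
    seen, or unilaterally once the hard threshold is reached."""
    frac = sum(peer_caps) / len(peer_caps)   # peer_caps: True = upgraded peer
    return frac >= hard or (frac >= soft and seen_new_tx)

peers = [True] * 90 + [False] * 10                       # 90% upgraded
assert should_switch(peers, seen_new_tx=True)            # 85% rule fires
assert not should_switch(peers, seen_new_tx=False)       # still waiting
assert should_switch([True] * 96 + [False] * 4, False)   # 95% rule fires
```

The hard threshold is what "gets the ball rolling": some node must send the first new-style transaction even though no one has seen one yet.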

Personally, I think it's going to be years, if ever, before a breaking change is needed. To address the topic at hand, the MOST important change to Bitcoin isn't any technical change, it's just to gain more users who actively trade, so that Bitcoin doesn't fade away.

Nice.

Is it possible to put any kind of message in a block or does it have to be 'clean' or 'perfect' in a sense? I ask because that's more important than the number of nodes you talk with being willing to change. If you see that 80 of the 100 most recent blocks were created by clients ready to change then you could be confident and switch.
legendary
Activity: 1246
Merit: 1014
Strength in numbers
July 25, 2010, 07:14:15 AM
#24
Versioning sounds like version numbering to me, but anyway, if you were to go back and improve the ability of Bitcoin to be more backwards compatible with then-future changes, then yeah, that sure would make it more flexible. Good choice!
I realize there are already version numbers. However, as things stand older clients will just reject anything with a new version number, which results in splitting the block list. There is no protocol in place to allow new clients and old clients to coexist (which would be rather impressive--the changeable parts would have to be moved to some Turing-complete language that could be downloaded at runtime) or, perhaps more practically, to allow older clients to detect when a new change is being introduced and insist on an upgrade rather than proceeding under the old rules.

What I had in mind was something like this:
- The client is distributed with the developers' public keys.
- When a breaking release is introduced, the developers first add a special transaction to the block list warning about the upgrade.
- The client validates the signature of this warning and, if valid, passes it on to the user.
- The new client is released with support for both the old rules and the new ones. (This is already necessary, as the old blocks must be validated.)
- The client's highest supported protocol version goes into each transaction.
- After a fixed number of blocks are produced after the warning with no client protocol versions less than the announced version, the upgrade is committed and the new rules take effect. (Blocks with older protocols are accepted, but do not count toward the goal.)
- At this point new transactions with obsolete protocol versions will be rejected, and older clients will cease to operate rather than attempting to start their own independent block chain. Obviously this can be worked around by modifying the code, but there is little benefit in doing so. Anyone who wants to continue using Bitcoin will be driven to upgrade instead.

This way, rather than starting and abandoning a bunch of mini-block-chains as different users upgrade at different times, there would be a window during which upgrades could take place while maintaining compatibility, after which all the upgraded clients would switch to the new rules at the same time. The critical changes are (a) the upgrade notice transaction and associated key distribution & validation; (b) a protocol version field in the transaction structure (already present?); and (c) kill-switch code in the current client to detect a committed upgrade based on the upgrade notice and subsequent blocks.

This goes fundamentally against the decentralized nature of the project. Absolutely no one is special except as people choose them to be. Anyone can distribute any version, breaking or not, that they wish.
full member
Activity: 152
Merit: 100
July 25, 2010, 06:48:21 AM
#23
Versioning sounds like version numbering to me, but anyway, if you were to go back and improve the ability of Bitcoin to be more backwards compatible with then-future changes, then yeah, that sure would make it more flexible. Good choice!
I realize there are already version numbers. However, as things stand older clients will just reject anything with a new version number, which results in splitting the block list. There is no protocol in place to allow new clients and old clients to coexist (which would be rather impressive--the changeable parts would have to be moved to some Turing-complete language that could be downloaded at runtime) or, perhaps more practically, to allow older clients to detect when a new change is being introduced and insist on an upgrade rather than proceeding under the old rules.

What I had in mind was something like this:
- The client is distributed with the developers' public keys.
- When a breaking release is introduced, the developers first add a special transaction to the block list warning about the upgrade.
- The client validates the signature of this warning and, if valid, passes it on to the user.
- The new client is released with support for both the old rules and the new ones. (This is already necessary, as the old blocks must be validated.)
- The client's highest supported protocol version goes into each transaction.
- After a fixed number of blocks are produced after the warning with no client protocol versions less than the announced version, the upgrade is committed and the new rules take effect. (Blocks with older protocols are accepted, but do not count toward the goal.)
- At this point new transactions with obsolete protocol versions will be rejected, and older clients will cease to operate rather than attempting to start their own independent block chain. Obviously this can be worked around by modifying the code, but there is little benefit in doing so. Anyone who wants to continue using Bitcoin will be driven to upgrade instead.

This way, rather than starting and abandoning a bunch of mini-block-chains as different users upgrade at different times, there would be a window during which upgrades could take place while maintaining compatibility, after which all the upgraded clients would switch to the new rules at the same time. The critical changes are (a) the upgrade notice transaction and associated key distribution & validation; (b) a protocol version field in the transaction structure (already present?); and (c) kill-switch code in the current client to detect a committed upgrade based on the upgrade notice and subsequent blocks.
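The commit rule in those bullets could be sketched roughly like this (hypothetical Python; `window`, the names, and the exact counting rule are illustrative assumptions, not an actual Bitcoin mechanism):

```python
def upgrade_committed(block_versions, notice_height, new_version, window):
    """Commit rule sketched above: after the signed upgrade notice,
    count blocks carrying at least the announced protocol version.
    Older-version blocks are still accepted but simply don't count
    toward the goal; once `window` qualifying blocks exist, the new
    rules take effect and obsolete-version transactions are rejected."""
    qualifying = sum(
        1 for v in block_versions[notice_height + 1:] if v >= new_version
    )
    return qualifying >= window

chain = [1] * 500 + [2] * 1200   # notice in block 499, then upgrades roll out
assert upgrade_committed(chain, 499, new_version=2, window=1000)
assert not upgrade_committed(chain, 499, new_version=2, window=1300)
```

The kill-switch in older clients would evaluate the same predicate: once it returns true, a client that only speaks the old protocol stops rather than forking off its own chain.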
sr. member
Activity: 252
Merit: 268
July 25, 2010, 05:31:27 AM
#22
...
What's the one thing you would change?

Short answer: versioning (point two).

Long answer: The first point is my greatest long-term concern, but since fixing it depends on being able to change the protocol without breaking the network I suppose I'd have to start with the versioning issue. After that, or perhaps for the same release, I'd probably work on the interval between difficulty adjustments, which is a fairly trivial change. With proper protocol versioning in place there would be no need to worry about the transaction issue for quite some time.

On the other hand, if you're serious about only allowing one change (ever), then versioning would be rather pointless and I'd probably fix the long-term blocks vs. transactions issue to keep the transactions going.
No, one change for all time is not assumed. Versioning sounds like version numbering to me, but anyway, if you were to go back and improve the ability of Bitcoin to be more backwards compatible with then-future changes, then yeah, that sure would make it more flexible. Good choice!
full member
Activity: 152
Merit: 100
July 25, 2010, 05:22:20 AM
#21
...
What's the one thing you would change?

Short answer: versioning (point two).

Long answer: The first point is my greatest long-term concern, but since fixing it depends on being able to change the protocol without breaking the network I suppose I'd have to start with the versioning issue. After that, or perhaps for the same release, I'd probably work on the interval between difficulty adjustments, which is a fairly trivial change. With proper protocol versioning in place there would be no need to worry about the transaction issue for quite some time.

On the other hand, if you're serious about only allowing one change (ever), then versioning would be rather pointless and I'd probably fix the long-term blocks vs. transactions issue to keep the transactions going.
sr. member
Activity: 252
Merit: 268
July 25, 2010, 04:31:22 AM
#20
...
What's the one thing you would change?
full member
Activity: 152
Merit: 100
July 25, 2010, 03:55:20 AM
#19
Quote
We should be encouraging the creation of wealth that people can spend their bitcoins on, not focusing on minting coins.
So far as the long-term economics of Bitcoin are concerned, that is exactly right. However, consider this: the value of any currency has two components, its value as a commodity and its marketability. Right now the marketability of BTC is practically nil--there is very little which you can actually buy with BTC, or sell for BTC, without first converting to some other currency. As a commodity, bitcoins are rather weak, having no direct use and no legal mandate (taxes, legal tender). That pretty much leaves just their novelty value, most of which comes from the (mostly inaccurate) perception of getting something from nothing via minting.

If everyone were to lose interest in minting new bitcoins right now, can you honestly say that Bitcoin would remain a viable digital currency? I think it rather more likely that people would lose interest. Perhaps we will eventually reach a point where the market for bitcoins is self-sustaining, but we're not there yet. In the interim, we have to keep potential BTC users interested, which means small, frequent payouts which pander to the average person's attention span.

Regarding the main topic of this thread, my main concerns/annoyances with Bitcoin are as follows:
  • Transactions are dependent on generating new blocks, and the design of Bitcoin has the generation of new blocks eventually becoming non-cost-effective. What happens when the (fixed) minimum work required to generate a new block becomes more than the block itself is worth? No more transactions?
  • Versioning will probably become an issue at some point. Very few changes to the Bitcoin design can be made backward-compatible with existing clients. Any breaking change will split the block chain unless everyone manages to upgrade at the same time. Some kind of auto-update system might be in order, and/or a mechanism to mark the formal transition to a new ruleset in the block chain. It is important that Bitcoin transactions continue to work between all users, without any inadvertent barriers resulting from bifurcated block chains.
  • It would be nice if the difficulty-adjustment algorithm ran more often. The limits could be something like 80-110% difficulty per estimated day (144 blocks) rather than 25-400% per fortnight (2016 blocks), allowing a gradual adjustment over the same period. Under the current system a 4x increase in difficulty immediately followed by a return to previous capacity wouldn't be corrected for up to two months, during which transactions would take 4x as long to confirm. With the shorter period and 80-110% limits the same increase in difficulty would begin to be corrected within four days, with complete correction within seven adjustment periods (about a week). Note that there is no inherent limit on how fast the difficulty can increase in real time; that depends on how much capacity was temporarily added. If we're hitting the limits (either variant) then the minimum rate of increase is about 4x per 3-4 days.
  • Related to the previous point: The limits should not be symmetric, as they are now, because upward difficulty adjustments naturally happen more frequently than downward adjustments. I'm not convinced there should be any lower boundary to the adjustment at all, apart from the absolute minimum work required per block. Limiting the downward adjustment rate creates an opportunity for someone to inhibit timely transaction confirmation for an extended period.
  • Could we please avoid creating dependencies on unstable development releases of third-party libraries (e.g. wxWidgets 2.9)? The statically-linked binaries help, but only if you don't intend to make any changes.
  • Has anyone else noticed that the log.* files in the database subdirectory appear to grow without bound so long as the program is running? Mine were over 160 MB recently, at which point I restarted Bitcoin and the files disappeared.
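The retargeting proposal above (144-block periods, 80-110% limits) can be sketched in a few lines (a toy model, not actual Bitcoin code; the function name is illustrative, and real retargeting operates on compact target bits rather than a floating-point difficulty):

```python
# Sketch of the proposed retarget rule: adjust every 144 blocks,
# clamped to 80%-110% of the previous difficulty, instead of
# every 2016 blocks clamped to 25%-400%.

TARGET_SPACING = 600  # seconds per block (10 minutes)

def retarget(prev_difficulty: float, actual_timespan: float,
             interval: int = 144,
             lower: float = 0.80, upper: float = 1.10) -> float:
    """Return the new difficulty after one adjustment period."""
    expected = interval * TARGET_SPACING
    # Difficulty scales inversely with how long the period actually took.
    raw = prev_difficulty * expected / actual_timespan
    return min(max(raw, prev_difficulty * lower), prev_difficulty * upper)
```

With these clamps, a sudden 4x difficulty overshoot decays by at worst a factor of 0.8 per day-long period, so it is fully unwound within roughly a week, versus up to two months under the fortnightly 25-400% rule.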
member
Activity: 70
Merit: 11
July 25, 2010, 02:25:13 AM
#18
I have no problems with Bitcoin's economic model (remember: money is not like other commodities) nor the decreasing rate of minting, but it would have been cool if done in less of a 'lottery' fashion and distributed equally among the users. It would be great if my laptop could generate even 0.1 BTC a day, for example.

I came up with at least a partial solution to get this to work earlier, and I think it could work too, but it wouldn't be trivial to implement:

https://bitcointalksearch.org/topic/m.2686

The problem right now is the network latency, and the fact that if the bitcoin block generation rate increased substantially that there would start to be some significant collisions between generated blocks and degrade the network.  My suggestion here was to have multiple chains off of chains that could help improve the scalability of the network.

Regardless, if the number of users significantly ramps up to be 100x or 1000x the current user base, even these "reduced difficulty" blocks would still be pretty dang difficult to generate.

The one change which I prefer would be to have the rate of bitcoins being minted stay constant forever.

I'd have to agree that generating more coin blocks but having those blocks worth less would be useful, and the notion of keeping the rate of increase in bitcoin blocks constant (instead of being halved every 4 years) isn't necessarily going to destroy the currency either.  What is needed is predictability here instead of chaos.

Considering that the once every 4 years change still has yet to happen, going to a constant increase in the coin supply over time isn't necessarily going to change at least current behavior toward the coins.  It also resolves the "lost coin" issue so far as any lost coins would eventually be "recovered" so far as simply being found money through new generation and IMHO would offer some stability against deflation.

I still think that the money supply itself should remain fixed and that credit markets should satisfy the demand for money; however, drawing the coin increase to a slowdown rather than a complete cutoff may be a solution for the slight monetary deflation that will occur from the lost coins.
full member
Activity: 224
Merit: 141
July 24, 2010, 03:17:48 PM
#17
I have no problems with Bitcoin's economic model (remember: money is not like other commodities) nor the decreasing rate of minting, but it would have been cool if done in less of a 'lottery' fashion and distributed equally among the users. It would be great if my laptop could generate even 0.1 BTC a day, for example.

I came up with at least a partial solution to get this to work earlier, and I think it could work too, but it wouldn't be trivial to implement:

https://bitcointalksearch.org/topic/m.2686

The problem right now is the network latency, and the fact that if the bitcoin block generation rate increased substantially that there would start to be some significant collisions between generated blocks and degrade the network.  My suggestion here was to have multiple chains off of chains that could help improve the scalability of the network.

Regardless, if the number of users significantly ramps up to be 100x or 1000x the current user base, even these "reduced difficulty" blocks would still be pretty dang difficult to generate.

The one change which I prefer would be to have the rate of bitcoins being minted stay constant forever.

I'd have to agree that generating more coin blocks but having those blocks worth less would be useful, and the notion of keeping the rate of increase in bitcoin blocks constant (instead of being halved every 4 years) isn't necessarily going to destroy the currency either.  What is needed is predictability here instead of chaos.

Considering that the once every 4 years change still has yet to happen, going to a constant increase in the coin supply over time isn't necessarily going to change at least current behavior toward the coins.  It also resolves the "lost coin" issue so far as any lost coins would eventually be "recovered" so far as simply being found money through new generation and IMHO would offer some stability against deflation.
legendary
Activity: 980
Merit: 1014
July 24, 2010, 02:06:06 PM
#16
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.

We should be encouraging the creation of wealth that people can spend their bitcoins on, not focusing on minting coins.

Why not switch off minting altogether then, with that logic?

I am saying that minting is not overly important, rather than saying that minting should be turned off.
full member
Activity: 168
Merit: 100
July 24, 2010, 02:01:18 PM
#15
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.

I have no problems with Bitcoin's economic model (remember: money is not like other commodities) nor the decreasing rate of minting, but it would have been cool if done in less of a 'lottery' fashion and distributed equally among the users. It would be great if my laptop could generate even 0.1 BTC a day, for example.

In terms of coin value: a growing Bitcoin economy (number of users making transactions) with no monetary inflation == a static Bitcoin economy (number of users making transactions) with monetary deflation.

Minting is a great way of distributing the coins as the user base grows, but it could be done less arbitrarily. I don't mind the lottery system, but the number of coins in a mined block or the difficulty could be more flexible to demand.

Anyway, I don't want to debate this at length on this thread, as it will take it OT (and we have several threads about this already!); I just wanted to point out my agreement with the OP about this.
full member
Activity: 168
Merit: 100
July 24, 2010, 01:38:54 PM
#14
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.

We should be encouraging the creation of wealth that people can spend their bitcoins on, not focusing on minting coins.

Why not switch off minting altogether then, with that logic?
member
Activity: 70
Merit: 11
July 24, 2010, 12:09:19 PM
#13
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.

I have no problems with Bitcoin's economic model (remember: money is not like other commodities) nor the decreasing rate of minting, but it would have been cool if done in less of a 'lottery' fashion and distributed equally among the users. It would be great if my laptop could generate even 0.1 BTC a day, for example.
legendary
Activity: 980
Merit: 1014
July 24, 2010, 11:49:06 AM
#12
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.

We should be encouraging the creation of wealth that people can spend their bitcoins on, not focusing on minting coins.
member
Activity: 70
Merit: 11
July 24, 2010, 11:47:48 AM
#11
- No dependencies.
- A better way of handling "dust spam" than just forbidding those transactions for now. I anticipate problems with this in the future.
- The difficulty should be automatically reduced to the previous level if no blocks are produced for x hours.
- I'd like "complete" documentation: all non-trivial parts of the source code written in English.

I think the existing economic model is perfect.

^^^

+ complete server/client separation, and an independent way of storing BTC for the purposes of importing and exporting. I've also read that if someone has 50 BTC, exports (backs up) all of them, and then spends 1 BTC, ALL 50 backed-up BTC are no longer valid. The backup tool should therefore ideally have a good way of visualizing this.
full member
Activity: 168
Merit: 100
July 24, 2010, 08:08:10 AM
#10
The big problem for me is that the minting rate doesn't adjust with the size of the Bitcoin economy (ie. the swarm size). I have a feeling that if this isn't changed, the project may run out of steam.

While it may annoy old users a bit that they can't create as many coins, new users are waiting weeks to mint any coins. It is unlikely that Bitcoins will become established if new users aren't given a decent shot at their own minting (especially when the user base should be growing rapidly). Users who discovered Bitcoins early will still have amassed more coins than new entrants, even without the difficulty changing.

The software is very impressive, but the minting rules really need to be looked at again. Self minting is such a good idea for distributing new coins, but IMO it needs some basic economic theory applying to it, rather than a relatively arbitrary setting.
full member
Activity: 150
Merit: 100
July 24, 2010, 07:17:24 AM
#9
Separating the front end from the back end, to allow different clients to be written without rewriting the backend, is the single most important change for me.
administrator
Activity: 5166
Merit: 12850
July 24, 2010, 03:20:15 AM
#8
- No dependencies.
- A better way of handling "dust spam" than just forbidding those transactions for now. I anticipate problems with this in the future.
- The difficulty should be automatically reduced to the previous level if no blocks are produced for x hours.
- I'd like "complete" documentation: all non-trivial parts of the source code written in English.

I think the existing economic model is perfect.
sr. member
Activity: 308
Merit: 256
July 24, 2010, 02:49:23 AM
#7
I would still make it 50 BTC though, just split it up between the 2nd, 3rd, 4th place runners-up, etc. So the first place PC gets 25, the 2nd place PC gets 12, the 3rd place PC gets 7, the 4th place PC gets 3, etc.
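One way to read that split is repeated halving of the remainder, with the last place taking whatever is left so the shares always total the full 50 BTC (a toy sketch; the figures above look like rounded versions of 25, 12.5, 6.25, ...):

```python
def runner_up_shares(total: float = 50.0, places: int = 4) -> list[float]:
    """Split a block reward by repeated halving: 1st place gets half,
    2nd gets a quarter, and so on; the final place takes the remainder
    so the shares always sum to the full reward."""
    shares = []
    remaining = total
    for _ in range(places - 1):
        share = remaining / 2
        shares.append(share)
        remaining -= share
    shares.append(remaining)
    return shares

# runner_up_shares() -> [25.0, 12.5, 6.25, 6.25]
```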
sr. member
Activity: 252
Merit: 268
July 24, 2010, 02:26:32 AM
#6
So perhaps as the difficulty increases, instead of giving all of the reward to the client that got the winning block, additional runner-up prizes could be given out to the nodes with the second, third, or fourth lowest hashes. This would work particularly well with the constant block verification idea that somebody mentioned in another thread, since the lowness of all hashes would already be being compared.
Yes, that's what I meant.  The reward would be split among N nodes with the lowest hashes, with N varying with the difficulty, such that most nodes would receive some reward every day or two.  The split could be weighted, with the winner getting most of the reward.
That's a great idea, thanks for mentioning it! Perhaps it could work something like grading on a curve, so that all the sufficiently low (below-average) hashes receive a certain reward proportionate to their performance. The idea is surprisingly simple, but it would take a clever programmer to make it scale smoothly as the number of nodes increases and as processors become more powerful. I'm sure somebody can come up with an algorithm to make it work.
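As a toy sketch of such a curve (all names here are mine, and weighting each hash by its distance below the target is only one of many possible curves), the reward could be split proportionally among the N lowest hashes:

```python
def split_reward(total: float, low_hashes: list[int], target: int) -> list[float]:
    """Split the block reward among the N lowest hashes, weighting each
    share by how far below the target the hash fell, so the outright
    winner still collects the largest portion."""
    weights = [target - h for h in low_hashes]  # lower hash -> bigger weight
    weight_sum = sum(weights)
    return [total * w / weight_sum for w in weights]

# e.g. split_reward(50.0, [10, 30, 60], 100) -> [22.5, 17.5, 10.0]
```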
member
Activity: 182
Merit: 10
July 24, 2010, 01:48:07 AM
#5
So perhaps as the difficulty increases, instead of giving all of the reward to the client that got the winning block, additional runner-up prizes could be given out to the nodes with the second, third, or fourth lowest hashes. This would work particularly well with the constant block verification idea that somebody mentioned in another thread, since the lowness of all hashes would already be being compared.
Yes, that's what I meant.  The reward would be split among N nodes with the lowest hashes, with N varying with the difficulty, such that most nodes would receive some reward every day or two.  The split could be weighted, with the winner getting most of the reward.
sr. member
Activity: 252
Merit: 268
July 24, 2010, 01:35:35 AM
#4
Bitcoin ought to regularly reward nodes for their contribution to the system, rather than offering increasingly-valuable prizes (blocks) that are increasingly-unlikely.

When I started using Bitcoin a couple weeks ago, ~2,000 Khash/s yielded ~0.8 blocks per day = ~40 bitcoins = ~$0.30 (as I recall).  Now, I get zero blocks per day.  If I got one, it'd be worth $2.70 or so.  Arguably, if I'm patient, I'll get a block every week or two, on the average.

The problem is that most new users probably won't wait a week or two.

This isn't equivalent to keeping difficulty constant.  That would accelerate the production of blocks (and bitcoins) -- which would have other (arguably bad) effects.
So perhaps as the difficulty increases, instead of giving all of the reward to the client that got the winning block, additional runner-up prizes could be given out to the nodes with the second, third, or fourth lowest hashes. This would work particularly well with the constant block verification idea that somebody mentioned in another thread, since the lowness of all hashes would already be being compared.
member
Activity: 182
Merit: 10
July 24, 2010, 01:14:21 AM
#3
Bitcoin ought to regularly reward nodes for their contribution to the system, rather than offering increasingly-valuable prizes (blocks) that are increasingly-unlikely.

When I started using Bitcoin a couple weeks ago, ~2,000 Khash/s yielded ~0.8 blocks per day = ~40 bitcoins = ~$0.30 (as I recall).  Now, I get zero blocks per day.  If I got one, it'd be worth $2.70 or so.  Arguably, if I'm patient, I'll get a block every week or two, on the average.

The problem is that most new users probably won't wait a week or two.

This isn't equivalent to keeping difficulty constant.  That would accelerate the production of blocks (and bitcoins) -- which would have other (arguably bad) effects.
sr. member
Activity: 308
Merit: 256
July 24, 2010, 01:13:20 AM
#2
Leave everything exactly as it is, except have the difficulty adjusted every 1,000 blocks instead of every 2,016 blocks, just to make it roughly twice as responsive to server farms joining the network and, later, to only light CPUs being left running it.

And a splash screen! For the Windows client, anyway; the Linux client always seems to pop up right away.
sr. member
Activity: 252
Merit: 268
July 24, 2010, 12:42:22 AM
#1
If you could go back and make any one change to Bitcoin while it was first being developed, what change would you choose? Feel free to list other changes, but share which you would prefer over all others. You're welcome to change your mind, so don't stress yourself out over it.

The one change which I prefer would be to have the rate of bitcoins being minted stay constant forever.

Another idea I've heard is to keep the block difficulty constant so that the amount of bitcoins being dispensed increases as the swarm increases. Recently somebody mentioned changing the block history to a balance sheet to reduce the block chain size. Another recent idea was to have the back end and front end of Bitcoin fully separate. Deciding that the client is the same except it makes you rich and other such silly changes don't count.

Now it's your turn. What would be the MOST important change to Bitcoin?