
Topic: Is it time to think about decimal precision ? (Read 787 times)

hero member
Activity: 718
Merit: 545
December 06, 2017, 04:30:15 AM
#21
Even if BTC only went to 100k.. haha.. it would still require sub-satoshi payments if you were counting every light switch on/off, or every single byte of data sent from a certain server.

Nobody's going to be counting that kind of IoT stuff on the blockchain, there just isn't enough space (bytes are too valuable).

... Lightning network.. will allow near-infinite off chain txns. Your light switch, along with the rest of your house, will have a payment channel open. ping ping ping. Of course they won't be on-chain txns.

It's a really, really bad idea!
Only integer numbers.

Not to worry, Bitcoin is never going to use floating-point arithmetic, because rounding errors would corrupt the total number of bitcoins over time as errors accumulate across operations. Integer math guarantees that the number of satoshis in circulation will always balance exactly. Smiley

I'm not saying use floating point.. I was just using that as an example of doubling precision from float to double.

I'm saying increase the number of ZEROs allowed after the decimal point, whatever the number format, so 0.0000000000000001 would be a valid amount. There are still only 21 million BTC.
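A quick illustrative sketch (Python, not Bitcoin source code) of what that means: extra decimal places are just a smaller integer base unit, and the 21 million BTC cap is untouched.

```python
# Sketch: adding decimal places means choosing a smaller integer base unit,
# not switching number formats. Total supply stays 21M BTC regardless.
TOTAL_BTC = 21_000_000

def base_units(decimals: int) -> int:
    """Total supply expressed in integer units of 10**-decimals BTC."""
    return TOTAL_BTC * 10 ** decimals

print(base_units(8))                 # satoshis: 2100000000000000
print(base_units(16))                # hypothetical 16-decimal base units
print(base_units(16).bit_length())   # 78 bits -- still fits a 96-bit integer
```

The supply number grows, but only because the unit shrinks; the arithmetic stays exact integer math throughout.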
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract the 1,000,000+ coins in Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins
Subtract coins that are hoarded/in cold storage
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


Adding precision is a completely different thing than adding precision AND stealing people’s bitcoin.  I can see most people supporting the former, but the only ones supporting the latter are doing so for nefarious reasons.



I think we are missing each other with this post. By subtract, I mean that you should subtract those coins from this equation NOT steal people's coins. ^smile^

The coins will still be there, but they are out of circulation. So you cannot count them as available coins. This will leave us with a lot less Satoshi to play with.

Or, am I missing your point? ^smile^
member
Activity: 322
Merit: 54
Consensus is Constitution
It's almost too late already.  Bitcoin is the global reserve crypto because every altcoin starts out requiring bitcoin to buy it on an exchange.  This is where the majority of sustainable bitcoin demand comes from.  48% of the crypto market is altcoins.

The problem is that the smallest unit of bitcoin is now worth more than the largest unit of many altcoins.  This makes trading altcoins with bitcoin very inconvenient.  If this continues, a more convenient coin will be used to trade altcoins instead: Ethereum, which is divisible to 18 places.

This process is already underway, and unless something is done right away it will continue.  I know you all love your satoshi, but it is already holding bitcoin back as the other big altcoins take over.
member
Activity: 98
Merit: 26
It's a really, really bad idea!
Only integer numbers.

Not to worry, Bitcoin is never going to use floating-point arithmetic, because rounding errors would corrupt the total number of bitcoins over time as errors accumulate across operations. Integer math guarantees that the number of satoshis in circulation will always balance exactly. Smiley
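A one-liner illustration of why value arithmetic avoids floats (Python, purely illustrative): summing 0.1 BTC ten times in binary floating point misses the exact total, while summing the equivalent satoshis as integers is exact.

```python
# Binary floating point cannot represent 0.1 exactly, so repeated sums drift:
float_total = sum(0.1 for _ in range(10))
int_total = sum(10_000_000 for _ in range(10))   # same amounts in satoshis

print(float_total)          # 0.9999999999999999 -- not exactly 1.0
print(float_total == 1.0)   # False
print(int_total)            # 100000000 satoshis == exactly 1 BTC
```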
member
Activity: 98
Merit: 26
Even if BTC only went to 100k.. haha.. it would still require sub-satoshi payments if you were counting every light switch on/off, or every single byte of data sent from a certain server.

Nobody's going to be counting that kind of IoT stuff on the blockchain, there just isn't enough space (bytes are too valuable).

Quote
Since I DO see that happening in the next 10 years, and it could take 5 years to pull off a fork (IF we can even pull another one off successfully, since I have a feeling it's only going to  get harder the larger Bitcoin gets ), then I think the next fork is probably the one to aim for.

*shrug - I'm not saying we'll never need it but I think it won't be difficult.

Quote
I know it takes more power to compute and store.. but I would still add 8 bytes. Orders of magnitude more than 4. Then that would be it. Honest miners could limit the minimum spend. But I'd also be up for a variable precision solution, if someone had a good one.

That kind of sub-division can be done off-chain and makes sense to do off-chain, anyway (i.e. micro-payment channels). Obviously, you can't enforce a sub-satoshi payment on-chain but then a satoshi is very small, so you might be able to set up your micro-payments where you "pre-pay" the next satoshi, and then count down your usage with sub-satoshi resolution (e.g. audio-streaming by the byte or something).
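As a purely hypothetical sketch of that prepay idea (made-up names and milli-satoshi bookkeeping, not any real Lightning API): the channel commits whole satoshis, while sub-satoshi usage is only counted locally.

```python
# Hypothetical "pre-pay a satoshi, count down usage" meter. All names and
# units here are illustrative assumptions, not any real payment-channel API.
MSAT_PER_SAT = 1000  # track off-chain usage in milli-satoshis

class PrepaidMeter:
    def __init__(self):
        self.prepaid_msat = 0   # credit already committed in the channel
        self.used_msat = 0      # sub-satoshi usage counted locally

    def consume(self, msat: int):
        self.used_msat += msat
        # Once usage eats through the prepaid credit, commit another whole
        # satoshi (the smallest enforceable on-chain unit).
        while self.used_msat > self.prepaid_msat:
            self.prepaid_msat += MSAT_PER_SAT

meter = PrepaidMeter()
for _ in range(2500):
    meter.consume(1)                         # e.g. 1 msat per byte streamed
print(meter.used_msat)                       # 2500 msat consumed
print(meter.prepaid_msat // MSAT_PER_SAT)    # 3 whole satoshis prepaid
```

The settlement granularity stays one satoshi; only the local accounting is finer.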

Quote
Can it be done as a soft-fork ?

Depends on how you define soft-fork. It can be done without any change to the block itself, so it only requires a client upgrade to process a new tx format. This has happened already many times.
member
Activity: 112
Merit: 10
C++/Golang Dev
It's a really, really bad idea!
Only integer numbers.
legendary
Activity: 3122
Merit: 2178
Playgram - The Telegram Casino
I like your bullishness but I think we're still far off from BTC hitting 1 million, i.e. I think there are more pressing matters that should be solved first.

6 months ago I would have agreed absolutely, but it's pretty crazy out there. It may take 5 good years to get another fork in place.

As I mention in the latter part of my comment, I'm rather positive about a precision update being less debatable than a block size increase. On the other hand you're not wrong. It's indeed something that could be worth looking at rather sooner than later. But in my opinion 5 years for getting a hardfork in place is still way too pessimistic. But who knows, there may be some nuances to the problem that are not obvious at a first glance.


Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Lightning is around the corner. Next year it will start being used properly. I think we're very close now. And there is nothing blocking its implementation. It's definitely coming.

I also think that lightning is very close now. But the timeframe from "very close" to production ready to the actual deployment and real life usage may still be 1-2 years away. I will be gladly proven wrong though.


Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counterarguments against an increase of decimal precision might arise. It would be interesting to see how much the effective impact on block size would be, for example.

From memory there was a very specific data related reason why bitcoin was capped at 21 million.

[...]

The technical reason is the size of the integer datatype that Bitcoin is currently using; OP is suggesting to use a larger integer datatype instead Smiley


Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising...

Decimal Point Rising....

A movie title?

nice catch Grin


[...]

Can it be done as a soft-fork ?

I too would love an answer on that question.
hero member
Activity: 718
Merit: 545
Even if BTC only went to 100k.. haha.. it would still require sub-satoshi payments if you were counting every light switch on/off, or every single byte of data sent from a certain server.

Since I DO see that happening in the next 10 years, and it could take 5 years to pull off a fork (IF we can even pull another one off successfully, since I have a feeling it's only going to  get harder the larger Bitcoin gets ), then I think the next fork is probably the one to aim for.

Also - I would definitely think over-compensation is the order of the day. IPv6 with its 4 byte addition is way too small. They should have gone 8 and been done with it. They're just going to have to do it all again. Literally DECADES..

I know it takes more power to compute and store.. but I would still add 8 bytes. Orders of magnitude more than 4. Then that would be it. Honest miners could limit the minimum spend. But I'd also be up for a variable precision solution, if someone had a good one.

Quote
I don't see it requiring any major debate, it's a +4 byte delta on tx sizes..

I'll get the popcorn.
 
..

Can it be done as a soft-fork ?



legendary
Activity: 4256
Merit: 1313
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract the 1,000,000+ coins in Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins
Subtract coins that are hoarded/in cold storage
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


Adding precision is a completely different thing than adding precision AND stealing people’s bitcoin.  I can see most people supporting the former, but the only ones supporting the latter are doing so for nefarious reasons.

member
Activity: 276
Merit: 48
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract the 1,000,000+ coins in Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins
Subtract coins that are hoarded/in cold storage
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


When you write software, especially something as precise as Bitcoin, you would never be so negligent as to write code that doesn't consider every corner case. So no, removing those not in circulation would never be an option.
member
Activity: 98
Merit: 26
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.

Oops, fat-fingered the calculator... anyway, it's just a few orders of magnitude, lol. But yeah, this clears it up.

At some point, it might make sense to start talking about moving up to 96-bit integers to allow division down to nano-satoshis. I don't see it requiring any major debate, it's a +4 byte delta on tx sizes and people will be wanting finer divisions when price gets high enough.
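The headroom claim is easy to check (a quick Python sketch, not Bitcoin source): nano-satoshi precision across the entire supply still fits comfortably in a 96-bit integer.

```python
# Nano-satoshi = 10**-9 satoshi. How many bits would the whole supply need?
MAX_SATOSHIS = 21_000_000 * 100_000_000
MAX_NANOSATOSHIS = MAX_SATOSHIS * 10 ** 9

print(MAX_NANOSATOSHIS)                # 2.1e24 base units
print(MAX_NANOSATOSHIS.bit_length())   # 81 bits -- 15 bits spare in 96
```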
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract the 1,000,000+ coins in Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins
Subtract coins that are hoarded/in cold storage
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^
legendary
Activity: 3472
Merit: 4801
December 04, 2017, 11:57:02 PM
#9
But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but let's use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 13 bits of the 64-bit integer unused.
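For what it's worth, the figures in this exchange are easy to double-check with Python's arbitrary-precision integers (a quick sketch, purely illustrative):

```python
# Double-check the figures from this exchange.
MAX_SATOSHIS = 21_000_000 * 100_000_000   # total supply in satoshis

print(MAX_SATOSHIS)                       # 2100000000000000 (~2.1 quadrillion)
print(MAX_SATOSHIS.bit_length())          # 51 bits needed

# The hex value quoted earlier is 1000x too small:
print(0x1E8F1C10800)                      # 2100000000000 (2.1 trillion)
print((0x1E8F1C10800).bit_length())       # 41 bits, hence the confusion
```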
newbie
Activity: 56
Merit: 0
December 04, 2017, 11:53:53 PM
#8
Nowadays the question of bitcoin's decimal precision is arising.
member
Activity: 73
Merit: 13
December 04, 2017, 10:30:39 PM
#7
Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counter arguments against an increase of decimal precision arise. It would be interesting to see how much the effective impact on block size would be, for example.

From memory there was a very specific data related reason why bitcoin was capped at 21 million.

Something along the lines of (bad example): there are 8 bits in a byte, and therefore we have 0 digit figures - increasing the digits requires adding a full extra byte, which increases data requirements / processing requirements / bandwidth / etc.

Pretty sure there may be legitimate opposition to this, but then again, it may be inconsequential at this point in time. Perhaps it only mattered 8 years ago.
member
Activity: 98
Merit: 26
December 04, 2017, 10:07:36 PM
#6
I think there will be disagreement about whether to make it a one-off 'doubling' of precision (float -> double) or a more permanent variable-precision representation.

The bitcoin blockchain does not currently use floating point numbers to represent transaction values.  It uses integers.

I'm actually a little confused about this. According to the dev-guide, TxOut.value is an int64_t. 21M Bitcoins * 100M satoshis = 2.1 quadrillion satoshis. But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800). This means that the upper 23 bits must be zero, no?
legendary
Activity: 2926
Merit: 1386
December 04, 2017, 08:06:06 PM
#5
I like your bullishness but I think we're still far off from BTC hitting 1 million, i.e. I think there are more pressing matters that should be solved first.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising...

Decimal Point Rising....

A movie title?

legendary
Activity: 3472
Merit: 4801
December 04, 2017, 01:28:46 PM
#4
I think there will be disagreement about whether to make it a one-off 'doubling' of precision (float -> double) or a more permanent variable-precision representation.

The bitcoin blockchain does not currently use floating point numbers to represent transaction values.  It uses integers.
hero member
Activity: 718
Merit: 545
December 04, 2017, 11:57:58 AM
#3
I like your bullishness but I think we're still far off from BTC hitting 1 million, i.e. I think there are more pressing matters that should be solved first.

6 months ago I would have agreed absolutely, but it's pretty crazy out there. It may take 5 good years to get another fork in place.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Lightning is around the corner. Next year it will start being used properly. I think we're very close now. And there is nothing blocking its implementation. It's definitely coming.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counterarguments against an increase of decimal precision might arise. It would be interesting to see how much the effective impact on block size would be, for example.

I think there will be disagreement about whether to make it a one-off 'doubling' of precision (float -> double) or a more permanent variable-precision representation.

I can see arguments for both. Although - lol - a 'double' should do it. They may find other things to do with the numbers.
legendary
Activity: 3122
Merit: 2178
Playgram - The Telegram Casino
December 04, 2017, 11:10:32 AM
#2
I like your bullishness but I think we're still far off from BTC hitting 1 million, i.e. I think there are more pressing matters that should be solved first.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counterarguments against an increase of decimal precision might arise. It would be interesting to see how much the effective impact on block size would be, for example.