
Topic: Date for 25 BTC per Block - page 2. (Read 34970 times)

sr. member
Activity: 477
Merit: 500
November 27, 2012, 06:56:22 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning.
I see problems. After random long blocks you would have to set a minimum difficulty. What is the problem with the current design? Just set your watch according to Bitcoin blocks if you must, not the other way round.

Only after a block lasting 2 weeks.

Edit: no, not even then. Halving the difficulty would be enough.
 
legendary
Activity: 1708
Merit: 1019
November 27, 2012, 06:36:47 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning.
I see problems. After random long blocks you would have to set a minimum difficulty. What is the problem with the current design? Just set your watch according to Bitcoin blocks if you must, not the other way round.
donator
Activity: 2772
Merit: 1019
November 27, 2012, 05:36:40 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

Intriguing. Seems to me it would work. Of course it would've had to have been introduced from the beginning.
sr. member
Activity: 477
Merit: 500
November 27, 2012, 04:14:58 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

Well, yes. That would be even better. Perfect.

Actually, in my suggestion, it would cause *some* problems if the hashrate dropped to half.. ;-)

But it is also true that this is just speculation. It cannot be changed any more.
legendary
Activity: 1190
Merit: 1004
November 27, 2012, 04:00:22 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.
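For what it's worth, the idea can be sketched roughly in Python. This is only an illustration of the proposal, not Bitcoin's actual retargeting code, and all names here are made up:

```python
TARGET_SPACING = 600  # seconds; the 10-minute block target

def anchored_retarget(old_difficulty, height, genesis_time, now):
    """Sketch of an 'anchored' retarget: compare the total time elapsed
    since block 0 against the ideal schedule of 10 minutes per block,
    instead of looking only at the last 2016-block window."""
    actual_elapsed = now - genesis_time      # seconds since block 0
    ideal_elapsed = height * TARGET_SPACING  # what the schedule expects
    # Blocks arriving faster than schedule => ratio > 1 => raise difficulty.
    return old_difficulty * ideal_elapsed / actual_elapsed
```

A real implementation would presumably still want something like Bitcoin's 4x clamp per adjustment, and, as noted elsewhere in this thread, a minimum difficulty, since one very long gap would otherwise demand a large compensating drop.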
legendary
Activity: 2940
Merit: 1330
November 27, 2012, 02:57:19 PM
That's largely true, i.e. this is more a question of opinion. The functionality of Bitcoin would hardly change in practice, and the current algorithm is very good. Being as pragmatic as I am, I would just have done it differently.

We can argue as to whether Nite69's way would have been better or not, but now that the network is already up and running it would be dangerous to make such a change.  Old clients would reject blocks which meet the new difficulty requirements but not the old ones resulting in a hard split of the blockchain.

Any marginal improvement the change might bring is certainly offset by the difficulty of getting all the old clients updated.
sr. member
Activity: 477
Merit: 500
November 27, 2012, 10:42:27 AM


True, but nite's suggestion would amplify changes in difficulty. For example, let's say the network hashrate doubles and blocks are solved every 5 minutes. We would have a huge difficulty adjustment, sure, but under nite's system, it would assume the network hashrate was going to double *again*.



It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa.
However, it would make things better when the hashrate increases or decreases steadily, or stays constant. And that is the case 99% of the time.

I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again over the same period.

Technically the addition would be very simple: just add a '+ delta hashrate' term to the formula.
We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network has spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block.

Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything, and the 2.5% difference will make no measurable difference to how the network operates at all.

I agree. 10 minutes is arbitrarily chosen anyway. Accuracy is not a goal here.

That's largely true, i.e. this is more a question of opinion. The functionality of Bitcoin would hardly change in practice, and the current algorithm is very good. Being as pragmatic as I am, I would just have done it differently.
donator
Activity: 2772
Merit: 1019
November 27, 2012, 10:24:25 AM


True, but nite's suggestion would amplify changes in difficulty. For example, let's say the network hashrate doubles and blocks are solved every 5 minutes. We would have a huge difficulty adjustment, sure, but under nite's system, it would assume the network hashrate was going to double *again*.



It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa.
However, it would make things better when the hashrate increases or decreases steadily, or stays constant. And that is the case 99% of the time.

I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again over the same period.

Technically the addition would be very simple: just add a '+ delta hashrate' term to the formula.
We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network has spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block.

Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything, and the 2.5% difference will make no measurable difference to how the network operates at all.

I agree. 10 minutes is arbitrarily chosen anyway. Accuracy is not a goal here.
legendary
Activity: 980
Merit: 1008
November 27, 2012, 10:21:33 AM


True, but nite's suggestion would amplify changes in difficulty. For example, let's say the network hashrate doubles and blocks are solved every 5 minutes. We would have a huge difficulty adjustment, sure, but under nite's system, it would assume the network hashrate was going to double *again*.



It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa.
However, it would make things better when the hashrate increases or decreases steadily, or stays constant. And that is the case 99% of the time.

I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again over the same period.

Technically the addition would be very simple: just add a '+ delta hashrate' term to the formula.
We now have 170 blocks left. If we assume these last blocks are solved at a rate of 6 per hour, the network has spent about 36 days less than four years to solve 210,000 blocks. That's about 2.5% faster than 10 minutes per block, i.e. about 9:45 per block.

Even if your algorithm is more accurate than the one in place now, is it even worth the added code complexity to get closer to 10 minutes than 9:45 is? Why would it matter at all that blocks take, on average, 15 seconds less to solve? I'd say it's completely unnecessary to adjust anything, and the 2.5% difference will make no measurable difference to how the network operates at all.
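The arithmetic in that estimate is easy to check. A quick sketch using the approximate figures from the post (four years minus ~36 days for 210,000 blocks):

```python
# Check: ~36 days less than four years for 210,000 blocks implies blocks
# averaged roughly 2.5% faster than the 10-minute target, i.e. about 9:46.
four_years_days = 4 * 365.25          # ~1461 days
actual_days = four_years_days - 36    # ~1425 days
avg_block_min = actual_days * 24 * 60 / 210_000
speedup = 1 - actual_days / four_years_days

print(f"{avg_block_min:.2f} min/block")      # 9.77 min/block
print(f"{speedup:.1%} faster than target")   # 2.5% faster than target
```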
donator
Activity: 2772
Merit: 1019
November 27, 2012, 08:25:03 AM
200 blocks left... It can happen tomorrow... :)

Given that "tomorrow" can mean a lot of things depending on timezone, this is very likely to be true.
full member
Activity: 146
Merit: 100
November 27, 2012, 07:07:38 AM
200 blocks left... It can happen tomorrow... :)
sr. member
Activity: 477
Merit: 500
November 27, 2012, 02:12:38 AM


True, but nite's suggestion would amplify changes in difficulty. For example, let's say the network hashrate doubles and blocks are solved every 5 minutes. We would have a huge difficulty adjustment, sure, but under nite's system, it would assume the network hashrate was going to double *again*.



It would 'make things worse' in cases where the hashrate is first increasing and then suddenly starts decreasing, or vice versa.
However, it would make things better when the hashrate increases or decreases steadily, or stays constant. And that is the case 99% of the time.

I.e., if the hashrate has doubled, we have a *good reason* to assume it will most likely double again over the same period.

Technically the addition would be very simple: just add a '+ delta hashrate' term to the formula.
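That extrapolation can be illustrated with a sketch (hypothetical code, not Bitcoin's actual rule). Where the standard rule scales difficulty to match the hashrate observed in the last window, the extrapolated rule applies the observed relative change a second time, assuming it continues:

```python
TARGET_SPACING = 600   # seconds per block
WINDOW = 2016          # blocks per retarget window

def standard_retarget(old_diff, window_seconds):
    # Bitcoin's rule (ignoring the 4x clamp): scale difficulty so the
    # observed rate would have produced 10-minute blocks.
    return old_diff * (WINDOW * TARGET_SPACING) / window_seconds

def extrapolated_retarget(old_diff, window_seconds):
    # The extrapolated idea: assume the hashrate keeps changing at the
    # same relative rate over the next window, so apply the ratio twice.
    ratio = (WINDOW * TARGET_SPACING) / window_seconds
    return old_diff * ratio * ratio
```

With 5-minute blocks (a doubled hashrate), the standard rule doubles difficulty while the extrapolated rule quadruples it, which is exactly the amplification described in the quote above.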
hero member
Activity: 812
Merit: 1000
November 26, 2012, 10:04:04 PM
I think there is a minor flaw in the design of Bitcoin: in the difficulty calculation, the rate of change should be accounted for. So instead of "changing the difficulty to a value which would have given a 10-minute interval over the last 2 weeks", the rule should be "changing the difficulty to a value which gives a 10-minute interval, if the hashing power changes as it did over the last 2 weeks".

I'm not convinced.  I think it's possible that your scheme could result in a less stable difficulty, with the difficulty adjustment continuously overshooting, overcorrecting.  Like watching a novice trying to steer a boat.

From a macro perspective that's precisely what happens anyway.

Just like anything in nature, or markets; there is never an equilibrium.  Any appearance of such is simply a frozen snapshot in time, that doesn't reflect reality seconds before or after the snapshot was taken.

Equilibrium is forever sought, never attained.



True, but nite's suggestion would amplify changes in difficulty. For example, let's say the network hashrate doubles and blocks are solved every 5 minutes. We would have a huge difficulty adjustment, sure, but under nite's system, it would assume the network hashrate was going to double *again*.

legendary
Activity: 1596
Merit: 1091
November 26, 2012, 04:49:39 PM
I think there is a minor flaw in the design of Bitcoin: in the difficulty calculation, the rate of change should be accounted for. So instead of "changing the difficulty to a value which would have given a 10-minute interval over the last 2 weeks", the rule should be "changing the difficulty to a value which gives a 10-minute interval, if the hashing power changes as it did over the last 2 weeks".

I'm not convinced.  I think it's possible that your scheme could result in a less stable difficulty, with the difficulty adjustment continuously overshooting, overcorrecting.  Like watching a novice trying to steer a boat.

From a macro perspective that's precisely what happens anyway.

Just like anything in nature, or markets; there is never an equilibrium.  Any appearance of such is simply a frozen snapshot in time, that doesn't reflect reality seconds before or after the snapshot was taken.

Equilibrium is forever sought, never attained.

legendary
Activity: 2940
Merit: 1330
November 26, 2012, 02:11:33 PM
I think there is a minor flaw in the design of Bitcoin: in the difficulty calculation, the rate of change should be accounted for. So instead of "changing the difficulty to a value which would have given a 10-minute interval over the last 2 weeks", the rule should be "changing the difficulty to a value which gives a 10-minute interval, if the hashing power changes as it did over the last 2 weeks".

I'm not convinced.  I think it's possible that your scheme could result in a less stable difficulty, with the difficulty adjustment continuously overshooting, overcorrecting.  Like watching a novice trying to steer a boat.
sr. member
Activity: 477
Merit: 500
November 26, 2012, 07:40:20 AM
OK... so approx. 3 days left...

Still don't know if we should celebrate this or not be happy at all :)

I think those who already have bitcoins should celebrate. Those who don't have any yet need not.
The price is already rising.. ;-)
sr. member
Activity: 477
Merit: 500
November 26, 2012, 07:38:18 AM

It's an estimate as it can't know either future hashing capacity or variance.

In 2011 there were adjustment periods with so much new capacity coming online that it took just 8 or 9 days to reach the next difficulty adjustment. (The fastest was in 2010, in under 4 days.) Later, in October 2011, as a few miners were powering down, it took 17 days to reach the next adjustment period.

I think there is a minor flaw in the design of Bitcoin: in the difficulty calculation, the rate of change should be accounted for. So instead of "changing the difficulty to a value which would have given a 10-minute interval over the last 2 weeks", the rule should be "changing the difficulty to a value which gives a 10-minute interval, if the hashing power changes as it did over the last 2 weeks". In short, it should have been extrapolated. If so, the 25 BTC day could have been predicted more accurately already 4 years ago.

But it's a minor flaw; the current rule just adds some variance, and is just fine.
full member
Activity: 146
Merit: 100
November 25, 2012, 12:47:39 PM
OK... so approx. 3 days left...

Still don't know if we should celebrate this or not be happy at all :)
legendary
Activity: 2506
Merit: 1010
November 24, 2012, 01:04:24 PM
Current estimation with 570 seconds: Wed Nov 28 17:22:04 UTC 2012

Except we have one last difficulty adjustment coming before block 210,000.

Interestingly, however, the estimate for the next difficulty is at about the same level as the current difficulty.   

So barring any changes in mining capacity (or extreme swing as the result of variance) it is looking like the halving will occur Wednesday afternoon/evening (U.S. timezones)  near the midnight hour for much of Europe and early morning Thursday in Asia.
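These estimates come from simple linear extrapolation over the remaining blocks; a sketch (the 570-second average comes up later in the thread, and the block count here is illustrative):

```python
from datetime import datetime, timedelta, timezone

def halving_eta(blocks_left, avg_block_seconds, now):
    """Estimate the arrival of block 210,000 by linear extrapolation."""
    return now + timedelta(seconds=blocks_left * avg_block_seconds)

# e.g. with an illustrative 650 blocks remaining at 570 s per block:
now = datetime(2012, 11, 24, 13, 4, tzinfo=timezone.utc)
print(halving_eta(650, 570, now))  # 2012-11-28 19:59:00+00:00
```

Of course, as noted above, any change in mining capacity (or plain variance) shifts the estimate, which is why the ETA keeps moving until the last few hours.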
legendary
Activity: 910
Merit: 1001
Revolutionizing Brokerage of Personal Data
November 24, 2012, 09:49:40 AM

It seems to be using 600 seconds per block, which makes it not very accurate until a few hours in advance. With the average of the last 100 blocks being about 570 seconds, the estimated time is currently ~5 hours off.

Current estimation with 570 seconds: Wed Nov 28 17:22:04 UTC 2012