Topic: Date for 25 BTC per Block

sr. member
Activity: 477
Merit: 500
November 30, 2012, 02:55:07 AM
hmm.. did miners already start to turn off their machines? 51 minutes since the last block, and the two previous blocks came at 20-minute intervals (block 210261).

I think I'll keep mine running for a while.. but hey! if the block rate halves, it will also halve the reward :-(


Edit: normal variance, I guess. For example, Slush's hashrate does not seem to have dropped at all: https://mining.bitcoin.cz/stats/
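Quick back-of-the-envelope on the "normal variance" guess, treating block finds as a Poisson process with a 10-minute mean (just a sanity check, nothing precise):

Code:
from math import exp
p = exp(-51 / 10.0)   # chance that any given block gap exceeds 51 minutes
print(p)              # ~0.006, i.e. about 0.6%
print(p * 144)        # with ~144 blocks a day, roughly one such gap per day

So a 51-minute gap on its own says nothing about miners switching off.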
legendary
Activity: 1190
Merit: 1004
November 29, 2012, 07:53:44 AM
Do I understand it right now?

Yes, I was thinking along those lines.

the difficulty should be calculated after every block

yep, based on a rolling 2-week / 2016 block period.


No reason not to do this; it should keep the difficulty more stable this way, since a lot can happen in two weeks.
full member
Activity: 146
Merit: 100
November 29, 2012, 04:33:12 AM
So, it ended up being November 28th in most every populated timezone.

Specifically, 2012-11-28 15:24:38 UTC
 - http://blockchain.info/block-index/322335/000000000000048b95347e83192f69cf0366076336c639f9b7228e9ba171342e

I was really expecting there to be some larger exchange rate volatility over the last few days.  I was expecting GPU miners to already have been selling in quantity on eBay.  

But it has been pretty quiet.


I think people expected ASICs to be shipped in November at the latest. That not being the case changed some things... many GPU miners will keep trying for a few more days... :)
legendary
Activity: 905
Merit: 1012
November 29, 2012, 04:23:02 AM
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.

Let's wait until December has passed before judging this. I doubt it's been priced in correctly.


I agree with molecular; production has just halved, so if demand stays the same and no extra reserves are being used, the price should double.

Of course, when the price rises, some reserves are brought into the markets, which slows down the rise.

But this is just speculation. We'll see.
Yeah, but guess what: you're not the only one who came to that conclusion. And lots of people bought in anticipating the rise in price... which itself caused the price to rise. That's what I mean by "already factored into the price." Look at the price six months ago vs. today. Hey, it doubled! Coincidence? Only partly.
sr. member
Activity: 477
Merit: 500
November 29, 2012, 02:59:25 AM
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.

Let's wait until December has passed before judging this. I doubt it's been priced in correctly.


I agree with molecular; production has just halved, so if demand stays the same and no extra reserves are being used, the price should double.

Of course, when the price rises, some reserves are brought into the markets, which slows down the rise.

But this is just speculation. We'll see.
hero member
Activity: 812
Merit: 1000
November 29, 2012, 02:54:27 AM
the difficulty should be calculated after every block

yep, based on a rolling 2-week / 2016 block period.
sr. member
Activity: 477
Merit: 500
November 29, 2012, 02:48:27 AM
If you could figure out the expected block height (for all time), the actual block height and the block rate over the last 2016 blocks then...

1. Add 2016 to the expected block height and take away the actual block height to get the number of blocks desired.
2. Determine the expected number of blocks to be created at the current rate.
3. Adjust difficulty to make expected rate hit the desired number.
4. Limit difficulty change to prevent large changes.

So if we have a difficulty change at block 2016000 and 2001 weeks have passed, then we expect to be at block 2017008. Let's say the last 2016 blocks were created at 8 blocks an hour. We need to reach block 2017008 + 2016 in two weeks' time, which is block 2019024. This is 3024 blocks more than 2016000, so we want 3024 blocks. This means we want a block creation rate of 3024/(14*24) per hour, which is 9 blocks an hour. To try to get this rate we can adjust the difficulty by a factor of 8/9 = 0.889.

I probably explained that terribly, but hopefully I got the general idea across. I didn't think about it too much, and it is indeed academic; it doesn't matter.

One more academic thought: the difficulty should be calculated after every block, with the input only counted over a longer period (2 weeks?).
I.e. for every block, compute a difficulty that would result in the expected block height in the next 2 weeks (if the hash rate stays the same).
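Very roughly, I mean something like this (an illustrative Python sketch only; the names are made up and this is not how any real client computes difficulty):

Code:
WINDOW = 2016          # look-back window, roughly two weeks of blocks
TARGET = 10.0          # target minutes per block

def per_block_factor(height, now_min, window_start_min):
    # now_min / window_start_min: timestamps (minutes since genesis) of the new
    # block and of the block 2016 back.  Multiply the difficulty by the result
    # after every block, not just every 2016th.
    expected_height = now_min / TARGET                        # the all-time schedule
    blocks_wanted   = expected_height + WINDOW - height       # blocks needed over the next 2 weeks
    rate            = WINDOW / (now_min - window_start_min)   # blocks per minute, last 2016 blocks
    blocks_at_rate  = rate * WINDOW * TARGET                  # what 2 weeks at that rate would give
    return blocks_at_rate / blocks_wanted

It would probably still want a limit on how far the difficulty can move per block.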
legendary
Activity: 2940
Merit: 1333
November 28, 2012, 09:10:53 PM
2. Determine the expected number of blocks to be created at the current rate.

What is the "current rate"? How do you calculate/measure it?

I think his scheme, now that I understand it, is to work out how long it is supposed to be until the next difficulty change, calculate the current hash rate from the time taken to find the last 2016 blocks, and set the difficulty such that if the hashrate stays the same over the next period as it was for the previous period then the next difficulty change happens when it 'should', based on 10-minute blocks from the start.

So in my hypothetical situation in which everyone starts mining 10 times faster due to ASIC or alien technology, and assuming that the previous difficulty adjustment happened when it should have (since we've been using Matthew's system for a while), the next adjustment will happen 14-1.4=12.6 days too soon, and so the next adjustment period needs to take 14+12.6=26.6 days.  We know that at the current rate it'll only take another 1.4 days, so we need to increase the difficulty by a factor of 26.6/1.4 = 19 to get back on target.

If the hash rate stays the same, and we do end up with the next adjustment happening on the 'right' day, the next adjustment will reduce the difficulty by a factor of 26.6/14 = 1.9x.

Overall we've increased by 19x and then reduced by 1.9x, giving a net adjustment over the 2 periods of 10x, which matches the increase in hashing power.  So we end up back on schedule, but at the expense of a very slow 26.6 days during which blocks took an average of 19 minutes each to find.

Do I understand it right now?
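If I've got it right, the numbers check out (quick Python arithmetic, nothing more):

Code:
speedup      = 10.0
fast_days    = 14 / speedup                    # 1.4 days to find the 2016 blocks
early_days   = 14 - fast_days                  # 12.6 days ahead of schedule
target_days  = 14 + early_days                 # 26.6 days wanted for the next period
factor_up    = target_days / fast_days         # ~19x harder
mins_per_blk = target_days * 24 * 60 / 2016    # ~19 minutes per block meanwhile
factor_down  = target_days / 14                # ~1.9x easier at the following retarget
print(factor_up, mins_per_blk, factor_down, factor_up / factor_down)   # ~19, ~19, ~1.9, ~10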
donator
Activity: 2772
Merit: 1019
November 28, 2012, 06:45:44 PM
2. Determine the expected number of blocks to be created at the current rate.

What is the "current rate"? How do you calculate/measure it?
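I suppose the obvious candidate is the block rate over the last full window, something like this (made-up names, just to be concrete):

Code:
def current_rate(chain, height):
    # Blocks per minute over the last 2016 blocks, from the header timestamps
    span_min = chain.header(height).time_min - chain.header(height - 2016).time_min
    return 2016.0 / span_min

But maybe there's a better choice of window?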
legendary
Activity: 1190
Merit: 1004
November 28, 2012, 02:47:57 PM
If you could figure out the expected block height (for all time), the actual block height and the block rate over the last 2016 blocks then...

1. Add 2016 to the expected block height and take away the actual block height to get the number of blocks desired.
2. Determine the expected number of blocks to be created at the current rate.
3. Adjust difficulty to make expected rate hit the desired number.
4. Limit difficulty change to prevent large changes.

So if we have a difficulty change at block 2016000 and 2001 weeks have passed, then we expect to be at block 2017008. Let's say the last 2016 blocks were created at 8 blocks an hour. We need to reach block 2017008 + 2016 in two weeks' time, which is block 2019024. This is 3024 blocks more than 2016000, so we want 3024 blocks. This means we want a block creation rate of 3024/(14*24) per hour, which is 9 blocks an hour. To try to get this rate we can adjust the difficulty by a factor of 8/9 = 0.889.

I probably explained that terribly, but hopefully I got the general idea across. I didn't think about it too much, and it is indeed academic; it doesn't matter.
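For anyone who wants to check the arithmetic in that example, here it is spelled out (just the numbers above, nothing official):

Code:
expected_height = 2001 * 7 * 24 * 6                   # 10-minute blocks for 2001 weeks = 2,017,008
blocks_wanted   = expected_height + 2016 - 2016000    # = 3,024
rate_wanted     = blocks_wanted / (14.0 * 24)         # = 9 blocks an hour
current_rate    = 8                                   # blocks an hour over the last 2016 blocks
print(rate_wanted, current_rate / rate_wanted)        # 9.0 and 8/9 = 0.889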
donator
Activity: 2772
Merit: 1019
November 28, 2012, 01:49:22 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

What happens when ASICs become cheap and available, and the network hash rate goes up by a factor of 10, say?

Suppose the difficulty only just changed, so we were expecting it to take 2 weeks until the next difficulty change.  We're running 10 times normal speed, so the difficulty changes in just 1.4 days instead of in 2 weeks.

Under the current scheme, the difficulty is adjusted by the maximum factor of 4, bringing block generation time up to a more reasonable 4 minutes, and on the next adjustment it's adjusted by another 2.5x taking us back to 1 block per 10 minutes.

With your proposed scheme, after the 1.4 days there would be very little adjustment.  The average time-per-block over the 4 year history of bitcoin won't have been affected much by the fact that the last 2 weeks' worth of blocks was found in just 1.4 days, because 12.6 days is pretty insignificant compared to the 4 year history.  So the adjustment will be minor, and we'll continue seeing blocks every minute.  It will take quite a while until the 10 minutes per block status quo is achieved again.

I'm having problems thinking about this. At first glance I thought like dooglus (bolded part). Then I thought about it a little and had some sort of unclear eureka moment (if such a thing exists) and thought it would work really well. Now I'm back to dooglus's way of looking at it.

This is really moot to discuss, since everyone agrees, but how exactly would the new difficulty be calculated?
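My guess would be something like this, purely as a sketch of one possible reading (made-up names, not anything from the actual client):

Code:
def all_time_factor(height, elapsed_minutes_since_genesis):
    # Compare the actual time taken for all blocks so far against height * 10
    # minutes, and scale the difficulty by the ratio.
    expected_minutes = height * 10
    return float(expected_minutes) / elapsed_minutes_since_genesis

# new_difficulty = old_difficulty * all_time_factor(height, elapsed)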
legendary
Activity: 2940
Merit: 1333
November 28, 2012, 01:43:32 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

What happens when ASICs become cheap and available, and the network hash rate goes up by a factor of 10, say?

Suppose the difficulty only just changed, so we were expecting it to take 2 weeks until the next difficulty change.  We're running 10 times normal speed, so the difficulty changes in just 1.4 days instead of in 2 weeks.

Under the current scheme, the difficulty is adjusted by the maximum factor of 4, bringing block generation time up to a more reasonable 4 minutes, and on the next adjustment it's adjusted by another 2.5x taking us back to 1 block per 10 minutes.

With your proposed scheme, after the 1.4 days there would be very little adjustment.  The average time-per-block over the 4 year history of bitcoin won't have been affected much by the fact that the last 2 weeks' worth of blocks was found in just 1.4 days, because 12.6 days is pretty insignificant compared to the 4 year history.  So the adjustment will be minor, and we'll continue seeing blocks every minute.  It will take quite a while until the 10 minutes per block status quo is achieved again.
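Putting rough numbers on that, under the simplest reading of the proposal (difficulty scaled by total expected time over total elapsed time) and assuming we were exactly on schedule at block 210,000:

Code:
on_schedule = 210000 * 10          # minutes since genesis when the fast period starts
fast_period = 2016                 # the next 2016 blocks found in 1.4 days = 2,016 minutes
elapsed     = on_schedule + fast_period
expected    = (210000 + 2016) * 10
print(float(expected) / elapsed)   # ~1.009

So at 10x the hash rate the difficulty would go up by less than 1%, and blocks would keep arriving roughly every minute.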
donator
Activity: 2772
Merit: 1019
November 28, 2012, 01:34:06 PM
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.

Let's wait until December has passed before judging this. I doubt it's been priced in correctly.
legendary
Activity: 905
Merit: 1012
November 28, 2012, 01:23:04 PM
I said it before and I'll say it again: reward halving has been factored into the price of btc/usd for months now.
legendary
Activity: 2506
Merit: 1010
November 28, 2012, 01:19:57 PM
So, it ended up being November 28th in most every populated timezone.

Specifically, 2012-11-28 15:24:38 UTC
 - http://blockchain.info/block-index/322335/000000000000048b95347e83192f69cf0366076336c639f9b7228e9ba171342e

I was really expecting there to be some larger exchange rate volatility over the last few days.  I was expecting GPU miners to already have been selling in quantity on eBay.  

But it has been pretty quiet.
legendary
Activity: 2212
Merit: 1008
November 28, 2012, 10:00:13 AM
woohoo! today is the big day! :) :) :)
full member
Activity: 146
Merit: 100
November 28, 2012, 09:40:57 AM
3 blocks left :)
hero member
Activity: 812
Merit: 1000
November 27, 2012, 11:04:17 PM
Just so people know, this block could be found within 2 seconds of the prior block, or it could take 2 hours. That's unlikely, but I did once see a block take over 60 minutes. Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.

Yeah, I still remember this 90-minute one... https://bitcointalksearch.org/topic/90-minutes-for-1-block-118930

What do you mean by "stay tuned to blockchain.info"? For what, the block count? That part isn't an approximation :)
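For what it's worth, treating block finds as a Poisson process with a 10-minute mean gives a feel for how rare those long gaps are (rough figures only):

Code:
from math import exp
print(exp(-60 / 10.0))   # P(gap > 60 min) ~ 0.25%: with ~144 blocks a day, every few days
print(exp(-90 / 10.0))   # P(gap > 90 min) ~ 0.012%: roughly every couple of months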

legendary
Activity: 1190
Merit: 1004
November 27, 2012, 06:06:26 PM
If I designed the difficulty algorithm I'd look at the time taken from block 0 to the current block and compare it to the number of blocks times 10 minutes and adjust difficulty that way. So instead of the difficulty adjusting to the rate, the difficulty adjusts to the expected number of blocks for all time.

Intriguing. Seems to me it would work. Of course, it would have had to be introduced from the beginning.

Yes. It's just another way it could have been done, so I was being academic.
hero member
Activity: 588
Merit: 500
firstbits.com/1kznfw
November 27, 2012, 06:00:22 PM
Just so people know, this block could be found within 2 seconds of the prior block, or it could take 2 hours. That's unlikely, but I did once see a block take over 60 minutes. Because of this, bitcoinclock can give an approximation, but you really need to stay tuned to blockchain.info.