Topic: Proof-of-work difficulty increasing (Read 37247 times)

newbie
Activity: 210
Merit: 0
January 25, 2024, 08:45:19 AM
#76
How interesting it is to plunge into the past and imagine how it all began.
sr. member
Activity: 406
Merit: 250
March 25, 2015, 12:56:10 PM
#75
I thought about that but there wasn't a practical way to do smaller increments.  The frequency of block generation is balanced between confirming transactions as fast as possible and the latency of the network.

The algorithm aims for an average of 6 blocks per hour.  If it was 5 bc and 60 per hour, there would be 10 times as many blocks and the initial block download would take 10 times as long.  It wouldn't work anyway because that would be only 1 minute average between blocks, too close to the broadcast latency when the network gets larger.

Wouldn't this only be true if the total transactions contained in the 1 minute blocks had the same size as the total transactions contained in the 10 minute blocks? And as long as the global transactions per second (tps) doesn't change, there shouldn't be a huge difference between 1 minute blocks and 10 minute blocks in terms of blockchain size, no?

As for whether or not a 1 minute block time would work, I think that really depends. You would certainly see more orphan blocks and more confirmations would be required to achieve the same level of security but I don't think it's completely infeasible. Over 95% of nodes have a network latency of <40 seconds and by 1 minute, I'd expect this number to be over 97%.

Just out of curiosity, what do you think is an appropriate trade-off between latency and block generation time? And how do you think the number of nodes affects average network latency? Correct me if I'm wrong, but shouldn't best- and average-case latency be reduced (or at least scale with O(log n)) as the network grows, since all nodes in the network would attempt to increase their share of connections instead of forcing transactions to require more hops? (Assuming this is how you implemented it. I could be wrong on this.)
sr. member
Activity: 406
Merit: 250
AltoCenter.com
November 04, 2014, 10:50:09 PM
#74
It was designed to become tougher and tougher as more and more people got involved, wasn't it? At least that's what the basics indicate.
newbie
Activity: 14
Merit: 0
November 04, 2014, 10:35:13 PM
#73
We are poised to reach a difficulty of ~44.6 billion and could potentially hit 100 billion by the end of the year. Over the past year the network has seen huge increases in security.
legendary
Activity: 2618
Merit: 1105
February 09, 2014, 10:55:23 AM
#72
This topic brings back memories...

I generated 5 blocks today on my Pentium processor. Two of them were within 3 minutes of each other.

Heh. Smiley

Damn, hey theymos: Wanna send me a few? ahhahah  Cheesy Cheesy Why hadn't I heard of btc back then? Sad
hero member
Activity: 868
Merit: 1000
February 07, 2014, 08:34:34 AM
#71
Ugh, from 23 to 244 in 3 weeks. No wonder my 2600 khash/s system is over 10 days on the current block!

The good old days.  Cheesy
full member
Activity: 238
Merit: 100
Stand on the shoulders of giants
February 06, 2014, 05:11:17 PM
#70
That should go in the history textbooks Grin
administrator
Activity: 5222
Merit: 13032
March 21, 2013, 08:54:32 PM
#69
This topic brings back memories...

I generated 5 blocks today on my Pentium processor. Two of them were within 3 minutes of each other.

Heh. Smiley
legendary
Activity: 1227
Merit: 1000
March 21, 2013, 08:35:35 PM
#68
Next increase in difficulty in est. 14 days...

 Wink
member
Activity: 102
Merit: 10
July 27, 2010, 02:33:55 PM
#67
I wonder if that's due to a bigger swarm, heavier iron, or both...?
legendary
Activity: 3878
Merit: 1193
July 27, 2010, 01:41:23 PM
#66
Ugh, from 23 to 244 in 3 weeks. No wonder my 2600 khash/s system is over 10 days on the current block!
hero member
Activity: 574
Merit: 513
July 27, 2010, 06:02:15 AM
#65
I designed a page to show history of difficulty values and published the code at https://bitcointalksearch.org/topic/difficulty-587
sr. member
Activity: 308
Merit: 258
July 26, 2010, 10:09:19 PM
#64
It's a good thing all those phantom super-clusters went offline a few weeks ago, or who knows how high the difficulty would have jumped?  Grin
founder
Activity: 364
Merit: 7248
July 26, 2010, 10:04:58 PM
#63
New difficulty factor 244.213223092
+35%

I updated the first post.

date, difficulty factor, % change
2009          1.00
30/12/2009    1.18   +18%
11/01/2010    1.31   +11%
25/01/2010    1.34    +2%
04/02/2010    1.82   +36%
14/02/2010    2.53   +39%
24/02/2010    3.78   +49%
08/03/2010    4.53   +20%
21/03/2010    4.57    +9%
01/04/2010    6.09   +33%
12/04/2010    7.82   +28%
21/04/2010   11.46   +47%
04/05/2010   12.85   +12%
19/05/2010   11.85    -8%
29/05/2010   16.62   +40%
11/06/2010   17.38    +5%
24/06/2010   19.41   +12%
06/07/2010   23.50   +21%
13/07/2010   45.38   +93%
16/07/2010  181.54  +300%
27/07/2010  244.21   +35%
sr. member
Activity: 294
Merit: 252
Firstbits: 1duzy
July 17, 2010, 08:45:19 AM
#62
I value Bitcoin as an anonymous digital currency.  Although I'm not expecting to get rich, I'd like the ability to continuously generate enough Bitcoin to purchase desired services.

Is there any expectation that economic value per khash/sec (or client) per day will be at least somewhat stable?  Difficulty just increased 300%, and USD/Bitcoin just increased about 500% (although that may turn out to be a spike).  I do get that there's no necessary relationship.  However, perhaps there's an economic basis for one (however approximate it might be).

Try the Bitcoin Economics Forum
You should be able to find many discussions about this topic there.

This thread is about the proof of work difficulty number.
member
Activity: 182
Merit: 10
July 17, 2010, 08:41:26 AM
#61
I value Bitcoin as an anonymous digital currency.  Although I'm not expecting to get rich, I'd like the ability to continuously generate enough Bitcoin to purchase desired services.

Is there any expectation that economic value per khash/sec (or client) per day will be at least somewhat stable?  Difficulty just increased 300%, and USD/Bitcoin just increased about 500% (although that may turn out to be a spike).  I do get that there's no necessary relationship.  However, perhaps there's an economic basis for one (however approximate it might be).
sr. member
Activity: 416
Merit: 277
July 16, 2010, 10:35:32 PM
#60
In the Economy subforum, I have just written a post titled "Get rid of 'difficulty' and maintain a constant rate" which outlines a scheme which a new version of the BitCoin software could use to keep the rate of block generation absolutely constant at the cost of a slight increase in network traffic.

I would be very grateful for your comments.

ByteCoin
full member
Activity: 224
Merit: 141
July 16, 2010, 08:31:32 PM
#59
Now, correct me if I'm wrong, but now that block generation is taking a lot longer, doesn't that mean it will take a lot longer for the network to verify that the lucky person who got the block was the winner before he/she could ever spend it?

No, this isn't correct.  Finding the hash requires a whole bunch of effort, but the process of verification that the "winner" has found a matching block is by comparison a trivial exercise and doesn't take all that long.  Once they have found the block, they can spend those coins right away.

What this does mean is that the coin allocation system has now become a lottery, where new winners are receiving a block of coins worth a whole lot more (due to exchange rates and scarcity) than earlier blocks which were in comparison relatively trivial in value.  My question is, how valuable will this "lottery" become in the long run (in terms of Euros or Dollars per block generated)?

Essentially what is happening is that the computer is picking a random sequence of numbers (like a lottery), and if it happens to pick the correct sequence of bits, you "win".  The change in difficulty is akin to playing a lottery where you pick only six numbers vs. one where you pick fourteen or a hundred numbers to win.  At the moment, the odds for a bitcoin block are actually worse than those of even the worst normal lottery ever conceived.
sr. member
Activity: 308
Merit: 258
July 16, 2010, 02:05:10 PM
#58
Ah ok, cool. I continue to be astounded by how much thought was put into this system to keep it balanced, nice job!
founder
Activity: 364
Merit: 7248
July 16, 2010, 01:43:51 PM
#57
Right, the difficulty adjustment is trying to keep it so the network as a whole generates an average of 6 blocks per hour.  The time for your block to mature will always be around 20 hours.

The recent adjustment put us back to close to 6 blocks per hour again.

There's a site where you can see the time between blocks, and since block 68545, it's been more like 10 minutes per block:
http://nullvoid.org/bitcoin/statistix.php
member
Activity: 70
Merit: 11
July 16, 2010, 01:09:04 PM
#56
Yes, about 20 hours.  (120 conf / 6 blocks per hour = 20 hours)  That's the normal length of time before you can spend it.  You'll know long before that that you won one.
So if the difficulty was increased so high that it took a day to find a winning block, that means the lucky winner would have to wait 120 days, or about 4 months, before they could spend it if everyone else was averaging about the same speed? Seems like at the high end of the difficulty, there is an issue with coin generation vs. being able to put it into circulation by spending. Wouldn't the long delay cause a lot of generated coin to be lost, since anything could happen to the winning PC in that time if the winner really had to wait that long? They might uninstall the program, or the computer might get eaten by a virus or power surge well before then.

I think the overall network generates the same number of blocks regardless of the difficulty; the difficulty is adjusted so that the network generates a block in a relatively constant amount of time. Therefore, this confirmation time should always be around the same.

Satoshi or anyone else can correct me if I'm wrong Smiley
sr. member
Activity: 308
Merit: 258
July 16, 2010, 12:33:57 PM
#55
Yes, about 20 hours.  (120 conf / 6 blocks per hour = 20 hours)  That's the normal length of time before you can spend it.  You'll know long before that that you won one.
So if the difficulty was increased so high that it took a day to find a winning block, that means the lucky winner would have to wait 120 days, or about 4 months, before they could spend it if everyone else was averaging about the same speed? Seems like at the high end of the difficulty, there is an issue with coin generation vs. being able to put it into circulation by spending. Wouldn't the long delay cause a lot of generated coin to be lost, since anything could happen to the winning PC in that time if the winner really had to wait that long? They might uninstall the program, or the computer might get eaten by a virus or power surge well before then.
founder
Activity: 364
Merit: 7248
July 16, 2010, 12:29:28 PM
#54
Yes, about 20 hours.  (120 conf / 6 blocks per hour = 20 hours)  That's the normal length of time before you can spend it.  You know long before that that you won one.
sr. member
Activity: 308
Merit: 258
July 16, 2010, 11:59:12 AM
#53
It adjusted to 181.54 a few minutes ago.  Typical time to get a block is about a week now.

The difficulty can adjust down as well as up.

The network should be generating close to 6 blocks per hour now.
Yeah, I've noticed the "10 second blocks" are gone, replaced with 419- and 741-second block generation, and none in the last 20 minutes. That should keep those server farms on hold for a while  Wink

Now, correct me if I'm wrong, but now that block generation is taking a lot longer, doesn't that mean it will take a lot longer for the network to verify that the lucky person who got the block was the winner before he/she could ever spend it?
newbie
Activity: 11
Merit: 0
July 16, 2010, 11:59:04 AM
#52
I'd be interested in seeing something like "expected bitcoins generated/day" next to (or in place of) the khash/s number. I'd rarely need to see the khash/s number since that won't change unless I make changes to the software or hardware.

I think the web calc does a good job of showing likelihoods based on khash speed:
http://www.alloscomp.com/bitcoin/calculator.php

That way you can see there is no guaranteed time horizon.
founder
Activity: 364
Merit: 7248
July 16, 2010, 11:56:54 AM
#51
It adjusted to 181.54 a few minutes ago.  Typical time to get a block is about a week now.

The difficulty can adjust down as well as up.

The network should be generating close to 6 blocks per hour now.
sr. member
Activity: 308
Merit: 258
July 16, 2010, 09:53:38 AM
#50
The proof-of-work difficulty is currently 45.38.  (see http://www.alloscomp.com/bitcoin/calculator.php)

It's about to increase again in a few hours.  It's only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max.  That would put it at 181.54.

The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.

Holy....

Satoshi, what happens if the rush dries up for a bit; some of the slashdotters or whoever get tired? Does the difficulty ever go back down?
If I'm reading the source code correctly, it should go up and down based on how much CPU is being thrown at it. So if someone rented a super computer to drive up the difficulty for a week, then it vanished, the difficulty should float back down.
member
Activity: 70
Merit: 11
July 16, 2010, 09:48:54 AM
#49
The proof-of-work difficulty is currently 45.38.  (see http://www.alloscomp.com/bitcoin/calculator.php)

It's about to increase again in a few hours.  It's only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max.  That would put it at 181.54.

The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.

Holy....

Satoshi, what happens if the rush dries up for a bit; some of the slashdotters or whoever get tired? Does the difficulty ever go back down?
founder
Activity: 364
Merit: 7248
July 16, 2010, 09:46:12 AM
#48
The proof-of-work difficulty is currently 45.38.  (see http://www.alloscomp.com/bitcoin/calculator.php)

It's about to increase again in a few hours.  It's only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max.  That would put it at 181.54.

The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.
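
As a rough check of that prediction, here is a minimal Python sketch; the 4x cap, the 45.38 starting difficulty, and the 3.5-day interval are the figures above:

Code:
# The adjustment factor is 14 days divided by the actual days taken,
# capped at the maximum of 4 times.
old_difficulty = 45.38
days_taken = 3.5                      # from the 14/3.5 figure above
factor = min(4.0, 14.0 / days_taken)
print(old_difficulty * factor)        # ~181.5, close to the 181.54 above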
sr. member
Activity: 294
Merit: 252
Firstbits: 1duzy
July 13, 2010, 07:01:28 PM
#47
13/07/2010 0000000005a3f437d4a7f529fd4a7f529fd4a7f529fd4a7f529fd4a7f529fd4a
sr. member
Activity: 294
Merit: 252
Firstbits: 1duzy
July 13, 2010, 04:49:33 PM
#46
I'd be interested in seeing something like "expected bitcoins generated/day" next to (or in place of) the khash/s number. I'd rarely need to see the khash/s number since that won't change unless I make changes to the software or hardware.

You can use the calculator at: http://www.alloscomp.com/bitcoin/calculator.php

If this is a feature request post in the Development & Technical Discussion Forum: http://bitcointalk.org/index.php?board=6.0
newbie
Activity: 6
Merit: 0
July 13, 2010, 04:30:59 PM
#45
I'd be interested in seeing something like "expected bitcoins generated/day" next to (or in place of) the khash/s number. I'd rarely need to see the khash/s number since that won't change unless I make changes to the software or hardware.
administrator
Activity: 5222
Merit: 13032
July 13, 2010, 09:41:59 AM
#44
The probability of winning per hash went from 9.90701E-12 to 5.12995E-12. So about double the difficulty.
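
Those per-hash probabilities can be checked directly against the targets listed in the first post; a quick Python sketch (the hex values are the 06/07/2010 and 13/07/2010 targets):

Code:
# Probability of winning per hash = target / max, where max is the largest
# 256-bit number.
max_hash = 2**256 - 1
old_target = 0x000000000ae49300000000000000000000000000000000000000000000000000
new_target = 0x0000000005a3f400000000000000000000000000000000000000000000000000

print(old_target / max_hash)    # ~9.907e-12
print(new_target / max_hash)    # ~5.13e-12
print(old_target / new_target)  # ~1.93, i.e. roughly double the difficulty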
full member
Activity: 199
Merit: 2385
July 13, 2010, 07:52:22 AM
#43
"difficulty" : 45.38582234101263

It jumped from 23 in a couple days.  I think this pretty much puts an end to generating a block a day with a personal computer.. but you can still get lucky.  Now you'll need to build a cluster or hijack a college computer lab for it to be worth doing Smiley  I expect the trading value will increase significantly over the next few weeks as the supply slows down; should be interesting.

founder
Activity: 364
Merit: 7248
June 22, 2010, 11:51:14 AM
#42
Agree.  Certainly too trivial to clutter the user's attention with.

I changed it to every 30 minutes.

If I increased it to every 10 minutes, it would still be a small enough presence in the log file.  Question is whether that would be more output than the user wants when they grep.
legendary
Activity: 1652
Merit: 2301
Chief Scientist
June 22, 2010, 11:04:46 AM
#41
How about in the options menu you can turn it off or on, and specify an interval in minutes for how often it should display?
I say keep it simple; more choices isn't always better, it just makes it overwhelming and confusing for most users.
founder
Activity: 364
Merit: 7248
June 21, 2010, 01:09:17 PM
#40
I integrated the hashmeter idea into the SVN version.  It displays khash/s in the left section of the status bar.

Two new log messages:
21/06/2010 01:23 hashmeter   2 CPUs    799 khash/s
21/06/2010 01:23 generated 50.00

grep your debug.log for "generated" to see what you've generated, and grep for "hashmeter" to see the performance.  On windows, use:
 findstr "hashmeter generated" "%appdata%\bitcoin\debug.log"

I have the hashmeter messages once an hour.  How often do you think it should be?
administrator
Activity: 5222
Merit: 13032
June 09, 2010, 12:41:43 PM
#39
Thanks for clearing that up, fergalish!
sr. member
Activity: 440
Merit: 250
June 09, 2010, 04:22:04 AM
#38
I thought about that. I don't know if 0.5 is valid or not. I'll continue to take observations. I wonder if it writes to the debug log when it has success.
Use a value of 1, not 0.5.  Suppose max=100 and target=10, then 10 out of every 100 hashes will be at or below the target, so your success rate will be 10% NOT 5%.

At the moment target/max ~= 1.5x10^-11 (target ~= 0x000000000f, which is 36 leading zero bits, so you basically need to throw a die with 2^36 = 69 billion sides and wait until you get a 1), and you're doing 1 million x 86400 = 86.4 billion hashes per day, so you can expect slightly more than one success per day.

It's VERY important to realise that this is the AVERAGE bitcoin creation time, and will only be valid over periods longer than about a week or so.  Because a success event is completely random (I hope, otherwise the hash function is probably not secure and someone will eventually crack it, and therefore bitcoin!), the interval between one success and the next will follow a Poisson distribution with n=0, i.e. an exponential (see wikipedia).  Therefore, with an average rate of, say, 1 success per day, you can expect that roughly 10% of the time, you'll have to wait 2¼ days or more, 1% of the time 4½ days, 0.1% 7 days and so on.
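
A small Python sketch of those waiting-time figures, assuming successes follow an exponential distribution with a mean of one block per day:

Code:
import math

# P(wait > t) for an exponential distribution with a mean of 1 block per day.
mean_days = 1.0
for t_days in (2.25, 4.5, 7.0):
    print(t_days, math.exp(-t_days / mean_days))  # ~10%, ~1%, ~0.1%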
administrator
Activity: 5222
Merit: 13032
June 05, 2010, 11:21:18 AM
#37
Quote
I wonder if it writes to the debug log when it has success.

BitCoin does say when it has solved a block. Search for "proof-of-work found".

Quote
In Bitcoin, aren't multiple nodes working on the same block?

No. Each node's block is unique because it contains their unique public key. Functionally, every hash gives you a totally random number between 0 and the maximum value of a 256-bit number. If the random number is equal to or below the target, you win. So the probability of winning with one hash is target in max.

Starting on another block because somebody else won requires almost no work.

See the BitCoin Wiki "block" article and the BitCoin paper for more info.
full member
Activity: 210
Merit: 104
June 05, 2010, 10:28:29 AM
#36
The main part of the formula that I'm uneasy about is the "target probability" of 0.5. 0.5 is used for calculations involving brute-forcing passwords, but maybe this is different. If your blocks consistently take double the amount of time that the formula predicts, use 1 instead.
I thought about that. I don't know if 0.5 is valid or not. I'll continue to take observations. I wonder if it writes to the debug log when it has success.

Actually, that formula assumes that you're working on one block until you figure it out. In Bitcoin, aren't multiple nodes working on the same block? When one finishes, the others abandon work on it and choose another block? That was the impression that I had, but it might be wrong.
administrator
Activity: 5222
Merit: 13032
June 05, 2010, 10:02:57 AM
#35
The main part of the formula that I'm uneasy about is the "target probability" of 0.5. 0.5 is used for calculations involving brute-forcing passwords, but maybe this is different. If your blocks consistently take double the amount of time that the formula predicts, use 1 instead.
full member
Activity: 210
Merit: 104
June 05, 2010, 01:15:52 AM
#34
About one block per 9.9 hours. Does this match your observations? I'm not totally sure about the math.
My system has jumped up to about 1200 khps. Also, I got Bitcoin running on a VPS which is giving about 2350 khps. The formula which you posted (and I arrived at myself later) predicts a block every 8.2 hours for my laptop and one every 4.2 hours for the VPS. Since I started yesterday afternoon, I've only generated the one block.

Sound strange? Or am I just having bad luck?
full member
Activity: 199
Merit: 2385
June 04, 2010, 10:18:52 PM
#33
Cool, I was just playing around with trying to make it adjust the rate without taking much of a performance hit.. that's why it tries to adjust that variable, to keep it to a few per second or so.. maybe you can figure out a better formula, I just kind of did that by experimenting around with it.
full member
Activity: 210
Merit: 104
June 04, 2010, 09:38:14 PM
#32
Laszio, I've remade your patch to spam the debug log less (1/10th as often). I also extended the JSON-RPC library to add "gethps" (which returns the same string displayed in the UI). It also adds listgenerated, which returns a list of the strings from the UI representing generated blocks. In the third and final patch, I modified net.h to compile with -O2 on my machine (which it wouldn't before).

Here are the patches:
http://www.alloscomp.com/bitcoin/bitcoin-svn-80-perfcounter-less-debug-spam-2010-06-05.patch <- Apply Laszio's patch first, then this one.
Sorry, that patch ^^ is completely broken. The changes are trivial (the same line in 3 places), but I failed at diffing. I'll upload a better patch when I have time.
http://www.alloscomp.com/bitcoin/bitcoin-svn-80-rpcextended-2010-06-05.patch
http://www.alloscomp.com/bitcoin/bitcoin-svn-80-netpatch-2010-06-05.patch

Thanks to Laszio for the original performance patch.
full member
Activity: 210
Merit: 104
June 04, 2010, 12:51:40 AM
#31
Sounds completely reasonable. I just started generating today, so I'll let you know once I get more blocks. Thanks!
administrator
Activity: 5222
Merit: 13032
June 04, 2010, 12:31:31 AM
#30
chancePerHash=target/max
numberOfHashesToWin=0.5/chancePerHash
numberOfSecondsToWin=numberOfHashesToWin/hashesPerSecond

(0.5/(target/max))/hashesPerSecond

Code:
(0.5/(0x000000000f675c00000000000000000000000000000000000000000000000000/0xffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff))/1000000

=

35689 seconds

About one block per 9.9 hours. Does this match your observations? I'm not totally sure about the math.
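
The same calculation as a short Python sketch (the target and the ~1,000 khash/s rate are the example figures from this thread):

Code:
# Expected time to win a block, following the formula above.
target = 0x000000000f675c00000000000000000000000000000000000000000000000000
max_hash = 2**256 - 1           # maximum value of a 256-bit number
hashes_per_second = 1_000_000   # ~1,000 khash/s

chance_per_hash = target / max_hash
hashes_to_win = 0.5 / chance_per_hash              # the 0.5 "target probability"
seconds_to_win = hashes_to_win / hashes_per_second
print(seconds_to_win, seconds_to_win / 3600)       # ~35689 seconds, ~9.9 hours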
full member
Activity: 210
Merit: 104
June 03, 2010, 10:58:32 PM
#29
Can someone share the math required to compute the estimated amount of time to generate a pack of bitcoins?

Basically, a formula that takes the difficulty (either as a hex number (currently 000000000f675c00000000000000000000000000000000000000000000000000) or as the difficulty factor (currently 16.62)) and the number of hashes checked per second (currently about 1M for me) and returns the average time between blocks found?
full member
Activity: 199
Merit: 2385
June 02, 2010, 06:21:14 PM
#28
I created a little performance counter for myself to use locally, you guys are welcome to try it.

Satoshi, maybe you could integrate this or something similar and put an option in there to turn it on/off?  It spams up your debug.log and shows the performance of each thread.. it also shows on the UI in the status bar where it used to say 'Generating'.

Patch: http://heliacal.net/~solar/bitcoin/bitcoin-svn-79-perfcounter-2010-06-02.patch

Screenshot:

founder
Activity: 364
Merit: 7248
June 02, 2010, 01:45:38 PM
#27
That's a good idea.  I'm not sure where exactly to fit that in, but it could certainly calculate the expected average time between blocks generated, and then people would know what to expect.

Every node and each processor has a different public key in its block, so they're guaranteed to be scanning different territory.

Whenever the 32-bit nonce starts over at 1, bnExtraNonce gets incremented, which is an arbitrary precision integer.
full member
Activity: 185
Merit: 114
June 02, 2010, 09:27:45 AM
#26
A nice addition to the GUI would be an estimate of how many hashes/sec it's computing. Either present this as a raw number or a "you can expect to generate X packs of bitcoins per week."

This might partially solve the frustration of new users not getting any Bitcoins right away.
member
Activity: 60
Merit: 10
May 11, 2010, 04:50:51 PM
#25
The way I understand it, since the data that's being hashed is pretty much random and because the hashing algorithm exhibits the 'avalanche effect' it probably doesn't matter if you keep starting with 1 and incrementing it or if you use pseudo random values instead, but I was wondering if anyone could support this or disprove it.

Yep, your understanding here is correct. It does not matter what exactly gets hashed, and no, you can't cheat without first breaking SHA-256, which is considered difficult.

The salient property of cryptographic hash functions is that they are as random as is possible while still being deterministic. That's what their strength depends on -- after all, if they weren't random, if there were obvious patterns, they could be broken that way. So the ideal hash function behaves just like a random number generator. It does not matter what you feed in, timestamp or not; whatever's put in there, the hash should still behave randomly (i.e. every possible outcome has the same a priori probability of occurring). Incrementing by one works just as well as completely changing everything every step (this follows from the avalanche property). However, the initial value, before you start incrementing, must be (pseudo-)randomly chosen, or every computer will start at the same point, and the fastest one always wins, which is not what is wanted here.
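
A tiny illustration of that avalanche property, using Python's hashlib (plain SHA-256 here, purely for demonstration; the input strings are made up):

Code:
import hashlib

# Hash the same data with two consecutive nonce values and count how many
# of the 256 output bits differ -- typically around half (~128).
def sha256_int(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

h1 = sha256_int(b"example block data, nonce=1")
h2 = sha256_int(b"example block data, nonce=2")
print(bin(h1 ^ h2).count("1"))  # ~128 differing bits out of 256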
full member
Activity: 199
Merit: 2385
May 11, 2010, 08:13:07 AM
#24
Maybe someone with a little background in this statistics/math stuff can shed some light on this..

The way this thing works is it takes a (basically random) block of data and alters a 32 bit field inside it by starting at 1 and incrementing.  The block of data also contains a timestamp, and that's incremented occasionally just to keep mixing it up (but the incrementing field isn't restarted when the timestamp is updated).  If you get a new block from the network you sort of end up having to start over with the incrementing field at 1 again.. however all the other data changed too so it's not the same thing you're hashing anyway.

The way I understand it, since the data that's being hashed is pretty much random and because the hashing algorithm exhibits the 'avalanche effect' it probably doesn't matter if you keep starting with 1 and incrementing it or if you use pseudo random values instead, but I was wondering if anyone could support this or disprove it.

Can you increase your likelihood of finding a low numerical value hash by doing something other than just sequentially incrementing that piece of data in the input?  Or is this equivalent to trying to increase your chances of rolling a 6 (with dice) by using your other hand?
sr. member
Activity: 440
Merit: 250
May 11, 2010, 07:12:08 AM
#23
Interestingly, using laszlo's mac os version of bitcoin, one can see how many hashes per second the computer is performing.  I'm currently getting about 1 million hashes per second.  Given the current difficulty 0000000013ec53, I'll have to perform about 2^35~3x10^10 hashes before I have a decent chance of getting one below the target, and at 10^6/s, that should take about 30000 sec, or about two per day.  The actual interval varies a lot - it's a random process, but that seems to be more-or-less the correct amount.

Satoshi, could you update the first post in this thread, with the complete history of difficulty-of-work increases please?  I'd try, but for some reason, I've lost my logfiles.  Fortunately the wallet is safe.
administrator
Activity: 5222
Merit: 13032
May 02, 2010, 04:03:51 PM
#22
Your CPU is creating SHA-256 hashes. It's not possible to cheat: if the hashes you create are invalid, no one else in the network will accept them. If you inject a 50,000-block chain of "easy blocks" into the network, everyone will immediately see that the hash for the first block in the chain is above the current target and ignore it and every block derived from it.
member
Activity: 69
Merit: 10
May 02, 2010, 12:46:13 PM
#21
I don't know what you're talking about accepting easier difficulties.

We were essentially discussing Sabunir's question about what prevents someone from messing with the program's source code to make block-generating difficulty very easy, building a network on his own, creating a, say, 50,000-block proof-of-work within seconds, and finally propagating it across the real network to steal "votes" towards his new fake blocks, since technically his proof would be "the longest". So is there a way to verify how much work was actually put into a given PoW (e.g. how many zeros are at the beginning of each hash, or something)?

I am also wondering about Suggester's question.  It seems like modifying the code to give a node an advantage in generating coins might be possible.

I am confused as to what each node on the network is actually doing when set to generate coins.  What problem are they solving that takes 100% CPU?
newbie
Activity: 30
Merit: 0
February 26, 2010, 05:09:19 PM
#20
I wonder what I could generate with all eight threads...
sr. member
Activity: 252
Merit: 268
February 26, 2010, 04:19:24 AM
#19
I think that generating no bitcoins in 8 hours from within a VM using two modern cores is probably not unusual. Keep it running for a few days and I expect you'll generate more than a few packs of bitcoins.
newbie
Activity: 30
Merit: 0
February 26, 2010, 03:57:41 AM
#18
This overclocked i7 still hasn't generated any keys after 8 hours...
It may take longer than 8 hours to generate a block.

Have you previously generated bitcoins? Is the number of blocks listed at the bottom of Bitcoin greater than 42650? Those need to download before it can start generating coins. How many connections are listed at the bottom of Bitcoin? Did you click Options > Generate Coins? How much CPU does your process viewer show that Bitcoin is using? Is your Internet connection steady? I had problems when I tried sharing Internet from my smartphone to my computer.
No, but..42663 blocks..8 connections..and yes, generating.  Bitcoin uses 50-80% CPU..but it only has access to two cores until I bump the VM it is in to 4 cores..Operating over Tor, by the way.
sr. member
Activity: 252
Merit: 268
February 26, 2010, 02:03:09 AM
#17
This overclocked i7 still hasn't generated any keys after 8 hours...
It may take longer than 8 hours to generate a block.

Have you previously generated bitcoins? Is the number of blocks listed at the bottom of Bitcoin greater than 42650? Those need to download before it can start generating coins. How many connections are listed at the bottom of Bitcoin? Did you click Options > Generate Coins? How much CPU does your process viewer show that Bitcoin is using? Is your Internet connection steady? I had problems when I tried sharing Internet from my smartphone to my computer.
newbie
Activity: 30
Merit: 0
February 26, 2010, 01:44:40 AM
#16
This overclocked i7 still hasn't generated any keys after 8 hours...
member
Activity: 97
Merit: 11
February 25, 2010, 08:35:08 PM
#15
I don't know what you're talking about accepting easier difficulties.

We were essentially discussing Sabunir's question about what prevents someone from messing with the program's source code to make block-generating difficulty very easy, building a network on his own, creating a, say, 50,000-block proof-of-work within seconds, and finally propagating it across the real network to steal "votes" towards his new fake blocks, since technically his proof would be "the longest". So is there a way to verify how much work was actually put into a given PoW (e.g. how many zeros are at the beginning of each hash, or something)?

It wouldn't work anyway because that would be only 1 minute average between blocks, too close to the broadcast latency when the network gets larger.
Since we're at it, what's the approximate time for proof-of-work propagation across a network of about 100,000 machines? Is there a way to optimize connections so that broadcasting is done in pyramid form to minimize the time needed? For example, the block creator sends it to 10 nodes, then those 10 send it to 100 (provided that none of those 100 were among the original 11), then those 100 tell 1,000 (provided that none of those 1,000 were among the original 111), etc., to save time.
founder
Activity: 364
Merit: 7248
February 25, 2010, 06:06:29 PM
#14
The formula is based on the time it takes to generate 2016 blocks.  The difficulty is multiplied by 14/(actual days taken).  For instance, this time it took 9.4 days, so the calculation was 14/9.4 = 1.49.  Previous difficulty 2.53 * 1.49 = 3.78, a 49% increase.
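
That formula, as a minimal Python sketch using the numbers in this post:

Code:
# Difficulty is multiplied by 14 / (actual days taken to generate 2016 blocks).
previous_difficulty = 2.53
actual_days = 9.4
print(previous_difficulty * (14.0 / actual_days))  # ~3.77 (the 3.78 above, up to rounding)
print(100 * (14.0 / actual_days - 1))              # ~49% increase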

I don't know what you're talking about accepting easier difficulties.
member
Activity: 97
Merit: 11
February 24, 2010, 11:34:59 PM
#13
How do you adjust this difficulty, anyway? (Administrating a decentralized system?) And what would prevent an attacker from setting the difficulty very low or very high to interfere with the system?
My understanding is that every Bitcoin client has the same algorithm (formula) built into it to automatically adjust the difficulty every so many blocks.
Then how is it dependent on how many CPUs are connected to the whole network?

Not only that, but I think that Bitcoin will not accept blocks generated at a different difficulty, so if a modified Bitcoin client tried to send out more easily generated blocks, all the authentic clients would reject the fake blocks.
We need Satoshi to confirm that because clients accept blocks generated at easier difficulties all the time whenever the PoW's difficulty increases.
founder
Activity: 364
Merit: 7248
February 24, 2010, 05:42:24 PM
#12
The automatic adjustment happened earlier today.

24/02/2010 0000000043b3e500000000000000000000000000000000000000000000000000

24/02/2010  3.78  +49%

I updated the first post.
sr. member
Activity: 252
Merit: 268
February 21, 2010, 01:52:43 PM
#11
How do you adjust this difficulty, anyway? (Administrating a decentralized system?) And what would prevent an attacker from setting the difficulty very low or very high to interfere with the system?
My understanding is that every Bitcoin client has the same algorithm (formula) built into it to automatically adjust the difficulty every so many blocks. Not only that, but I think that Bitcoin will not accept blocks generated at a different difficulty, so if a modified Bitcoin client tried to send out more easily generated blocks, all the authentic clients would reject the fake blocks.
jr. member
Activity: 41
Merit: 13
February 21, 2010, 11:58:44 AM
#10
How do you adjust this difficulty, anyway? (Administrating a decentralized system?) And what would prevent an attacker from setting the difficulty very low or very high to interfere with the system?
founder
Activity: 364
Merit: 7248
February 17, 2010, 12:58:03 PM
#9
Perhaps it has to do with my connection's very high latency (2000ms or more on average)
2 seconds of latency in both directions should reduce your generation success by less than 1%.

and/or my high packet loss (sometimes up to 10% loss)?
Probably OK, but I'm not sure.  The protocol is designed to resync to the next message, and messages get re-requested from all the other nodes you're connected to until received.  If you miss a block, it'll also keep requesting it every time another blocks comes in and it sees there's a gap.  Before the original release I did a test dropping 1 out of 4 random messages under heavy load until I could run it overnight without any nodes getting stuck.
member
Activity: 97
Merit: 11
February 16, 2010, 08:28:27 PM
#8
If I cannot stay online for about fourteen consecutive hours (very hard to do on a satellite connection!), I actually get nothing at all.
Can Satoshi confirm whether the computations your machine has made carry over if the session is interrupted, or do you need to start all over if you disconnect before generating at least one block? If they carry over, maybe a little meter indicating the % left until your block completes would be a nice addition so people would have some hope (actually, it would be a nice addition anyway, whether the computations are carried over after disconnection or not!)

I generated 5 blocks today on my Pentium processor. Two of them were within 3 minutes of each other.

Ok, I just realized that I didn't understand how Bitcoin worked to begin with. The blocks get generated anyway whether you're generating coins or not. The average rate of creation confirms what I observed before (120/20 hrs, or 6/hr). This has got absolutely nothing to do with your CPU power; it's constant for all practical purposes. The CPU power determines the "transactions" that get created and "matures in xx blocks". My head just got a bit bigger now Smiley

This also means, theymos, that your 3-minute interval observation was probably a coincidence or an error!
founder
Activity: 364
Merit: 7248
February 16, 2010, 12:36:40 PM
#7
Satoshi, I figured it will take my modern core 2 duo about 20 hours of nonstop work to create ฿50.00! With older PCs it will take forever. People like to feel that they "own" something as soon as possible; is there a way to make the generation more divisible? So, say, instead of making ฿50 every 20 hours, make ฿5 every 2 hours?
I thought about that but there wasn't a practical way to do smaller increments.  The frequency of block generation is balanced between confirming transactions as fast as possible and the latency of the network.

The algorithm aims for an average of 6 blocks per hour.  If it was 5 bc and 60 per hour, there would be 10 times as many blocks and the initial block download would take 10 times as long.  It wouldn't work anyway because that would be only 1 minute average between blocks, too close to the broadcast latency when the network gets larger.
jr. member
Activity: 41
Merit: 13
February 16, 2010, 03:51:51 AM
#6
My port is open, both in my software and hardware firewall. My router is handling it appropriately. Perhaps it has to do with my connection's very high latency (2000ms or more on average) and/or my high packet loss (sometimes up to 10% loss)?
administrator
Activity: 5222
Merit: 13032
February 16, 2010, 01:01:51 AM
#5
I generated 5 blocks today on my Pentium processor. Two of them were within 3 minutes of each other.

I have noticed some slowdown since the adjustment, but I still generate a lot of coins. My computer is off while I'm sleeping, and BitCoin bootstraps quickly when I turn it back on. Do you guys-who-are-having-trouble have the BitCoin port open?
jr. member
Activity: 41
Merit: 13
February 16, 2010, 12:18:30 AM
#4
I would like to comment that as of late, it seems almost as if I am generating nearly no Bitcoins. Indeed, my rate of acquisition seems to be more than ten times slower. If I cannot stay online for about fourteen consecutive hours (very hard to do on a satellite connection!), I actually get nothing at all.

How this exactly relates to the difficulty adjustments is beyond my knowledge; I offer this feedback as a kind of "field report".
member
Activity: 97
Merit: 11
February 15, 2010, 09:15:49 PM
#3
[Edit: I later found that I was generating quite a bit more than that, just didn't realize it because of the "matures in xx more blocks" concept. I still think it will be a major headache when the difficulty significantly increases though. I apologize for my silliness Smiley]

Satoshi, I figured it will take my modern core 2 duo about 20 hours of nonstop work to create ฿50.00! With older PCs it will take forever. People like to feel that they "own" something as soon as possible; is there a way to make the generation more divisible? So, say, instead of making ฿50 every 20 hours, make ฿5 every 2 hours?

I don't know if that means reducing the block size or reducing the 120-block threshold to, say, 12 blocks only, or what, but because the difficulty is increasing I can imagine that a year from now the situation will be even worse (3+ weeks until you see the first spendable coins!), and we'd better find a solution for this ASAP.
founder
Activity: 364
Merit: 7248
February 15, 2010, 01:28:38 AM
#2
14/02/2010 0000000065465700000000000000000000000000000000000000000000000000

2009        1.00
30/12/2009  1.18   +18%
11/01/2010  1.31   +11%
25/01/2010  1.34    +2%
04/02/2010  1.82   +36%
14/02/2010  2.53   +39%

Another big jump in difficulty yesterday from 1.82 times to 2.53 times, a 39% increase since 10 days ago.  It was 10 days apart not 14 because more nodes joined and generated the 2016 blocks in less time.
founder
Activity: 364
Merit: 7248
February 05, 2010, 02:19:12 PM
#1
We had our first automatic adjustment of the proof-of-work difficulty on 30 Dec 2009.  

The minimum difficulty is 32 zero bits, so even if only one person was running a node, the difficulty doesn't get any easier than that.  For most of last year, we were hovering below the minimum.  On 30 Dec we broke above it and the algorithm adjusted to more difficulty.  It's been getting more difficult at each adjustment since then.

The adjustment on 04 Feb took it up from 1.34 times last year's difficulty to 1.82 times more difficult than last year.  That means you generate only 55% as many coins for the same amount of work.

The difficulty adjusts proportionally to the total effort across the network.  If the number of nodes doubles, the difficulty will also double, returning the total generated to the target rate.

For those technically inclined, the proof-of-work difficulty can be seen by searching on "target:" in debug.log.  It's a 256-bit unsigned hex number, which the SHA-256 value has to be less than to successfully generate a block.  It gets adjusted every 2016 blocks, typically two weeks.  That's when it prints "GetNextWorkRequired RETARGET" in debug.log.

minimum    00000000ffff0000000000000000000000000000000000000000000000000000
30/12/2009 00000000d86a0000000000000000000000000000000000000000000000000000
11/01/2010 00000000c4280000000000000000000000000000000000000000000000000000
25/01/2010 00000000be710000000000000000000000000000000000000000000000000000
04/02/2010 000000008cc30000000000000000000000000000000000000000000000000000
14/02/2010 0000000065465700000000000000000000000000000000000000000000000000
24/02/2010 0000000043b3e500000000000000000000000000000000000000000000000000
08/03/2010 00000000387f6f00000000000000000000000000000000000000000000000000
21/03/2010 0000000038137500000000000000000000000000000000000000000000000000
01/04/2010 000000002a111500000000000000000000000000000000000000000000000000
12/04/2010 0000000020bca700000000000000000000000000000000000000000000000000
21/04/2010 0000000016546f00000000000000000000000000000000000000000000000000
04/05/2010 0000000013ec5300000000000000000000000000000000000000000000000000
19/05/2010 00000000159c2400000000000000000000000000000000000000000000000000
29/05/2010 000000000f675c00000000000000000000000000000000000000000000000000
11/06/2010 000000000eba6400000000000000000000000000000000000000000000000000
24/06/2010 000000000d314200000000000000000000000000000000000000000000000000
06/07/2010 000000000ae49300000000000000000000000000000000000000000000000000
13/07/2010 0000000005a3f400000000000000000000000000000000000000000000000000
16/07/2010 000000000168fd00000000000000000000000000000000000000000000000000
27/07/2010 00000000010c5a00000000000000000000000000000000000000000000000000
05/08/2010 0000000000ba1800000000000000000000000000000000000000000000000000
15/08/2010 0000000000800e00000000000000000000000000000000000000000000000000
26/08/2010 0000000000692000000000000000000000000000000000000000000000000000

date, difficulty factor, % change
2009           1.00
30/12/2009     1.18   +18%
11/01/2010     1.31   +11%
25/01/2010     1.34    +2%
04/02/2010     1.82   +36%
14/02/2010     2.53   +39%
24/02/2010     3.78   +49%
08/03/2010     4.53   +20%
21/03/2010     4.57    +9%
01/04/2010     6.09   +33%
12/04/2010     7.82   +28%
21/04/2010    11.46   +47%
04/05/2010    12.85   +12%
19/05/2010    11.85    -8%
29/05/2010    16.62   +40%
11/06/2010    17.38    +5%
24/06/2010    19.41   +12%
06/07/2010    23.50   +21%
13/07/2010    45.38   +93%
16/07/2010   181.54  +300%
27/07/2010   244.21   +35%
05/08/2010   352.17   +44%
15/08/2010   511.77   +45%
26/08/2010   623.39   +22%
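
The two tables line up: each difficulty factor is simply the minimum-difficulty target divided by the target for that date. A quick Python check, using the 14/02/2010 entry as an example:

Code:
minimum = 0x00000000ffff0000000000000000000000000000000000000000000000000000
target_14_02_2010 = 0x0000000065465700000000000000000000000000000000000000000000000000
print(minimum / target_14_02_2010)  # ~2.53, matching the table above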