Isn't it interesting that the hextarget isn't the same as what I calculated? Maybe it's not so simple as deepceleron declares.
Starting from difficulty 10,076,293 at
http://bitcoindifficulty.com/ I get .00000000000001AA3EA9EBE..., so there is a .0015% discrepancy. Clearly the target is the correct value, as it's the value actually used. Looking around, it seems that difficulty is actually the multiple of hardness relative to the minimum difficulty, which is not 32 leading 0 bits (expected 2^32 tries) but rather 0.FFFFh/2^32 (i.e. x < .00000000FFFF0000h), so the expected tries are 2^32/0.FFFFh = 4295032833 (100010001h instead of 100000000h).
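As a sanity check on those numbers, here is a small Python sketch (Python 3 assumed; the variable name is mine):

```python
# Sanity check of the numbers above. The minimum-difficulty target keeps
# 32 leading zero bits but with the next 16 bits set to FFFFh, so the
# expected tries at difficulty 1 are 2^32 / 0.FFFFh = 2^48 / FFFFh.
tries = 2**48 / 0xFFFF
print(hex(round(tries)))        # '0x100010001'
print(tries / 2**32 - 1)        # ~1.5e-5, i.e. the .0015% discrepancy
```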
So converting from target to difficulty and from difficulty to bits is even messier:
scale=80
define pow(x,p) { return e(p*l(x)); }
define log(b,x) { return l(x)/l(b); }
define log2(x) { return log(2,x); }
# http://blockexplorer.com/q/hextarget
ibase=16
# note: while ibase=16 the exponents are hex too: 2^100 means 2^256, 2^10 means 2^16
target=1AA3D0000000000000000000000000000000000000000000000/2^100
mindiff=FFFF/2^10 # 0.FFFFh, the source of the .0015% discrepancy
ibase=A
tries=2^32/mindiff # expected hashes at difficulty 1
diff=1/target/tries # difficulty
bits=log2(diff*tries) # log2 of expected hashes
cbits=-log2(target) # the same, computed directly from the target
gdiff=diff*4/mindiff # expected work in gigahashes (2^30)
nhash=70.48*1024 # network hash rate in GH/s (70.48 TH/s)
time=gdiff/nhash # expected seconds per block
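For readers without bc to hand, the same calculation might look like this in Python (a sketch; the variable names mirror the bc script, and the target is the one quoted above):

```python
import math

# Sketch of the bc calculation above in Python (names mirror the bc variables).
target = 0x1AA3D0000000000000000000000000000000000000000000000 / 2**256
mindiff = 0xFFFF / 2**16             # 0.FFFFh, source of the .0015% discrepancy
tries = 2**32 / mindiff              # expected hashes at difficulty 1
diff = 1 / target / tries            # difficulty, ~10,076,293
bits = math.log2(diff * tries)       # log2 of expected hashes, ~55.26
cbits = -math.log2(target)           # the same, from the target directly
gdiff = diff * 4 / mindiff           # expected work in gigahashes (2^30)
nhash = 70.48 * 1024                 # network hash rate in GH/s
time = gdiff / nhash                 # expected seconds per block, ~558
```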
I think my unnecessary-complexity issue with this page
https://en.bitcoin.it/wiki/Difficulty (and with the measure chosen for difficulty) is not so much that it is log2 scale or not. I can handle that. It's that it is not even the expected number of hashes (or gigahashes etc). To an approximation it is number of hashes / 2^32. Now 2^32 is not a nice number in both bases at once (log2 scale and log10 scale); 2^30 is a nice number. That would be a nicer way to report difficulty IMO, as that's a GH, and you'll notice ALL of the miners report power in GH or MH, and the network hash rate is in TH (not difficulty chunks, which are the former divided by 2^32).

But on top of that, for proper accuracy it is not even hashes/2^32 but difficulty = hashes * 0.FFFFh/2^32. And that is harder to test at discrete (whole-number) difficulties, which is why pool shares are not an exact multiple of difficulty but rather use a trailing-FFF difficulty to counteract this issue.
You know, I once knew a crypto math/hacker guy who used to think human Huffman encoding was fun. Satoshi? Hmmm.
>>> math.log(2**256/int('00000000000001AA3D0000000000000000000000000000000000000000000000',16),2)
55.26448364017038
What "bit" difficulty would be 10% harder?
Well, that wasn't exactly my point (my point was that you can get a ballpark, order-of-magnitude answer with your eyes and mental arithmetic in bits). But to answer your question: log2(1.1) = .1375 (call it .14, remember that), so 10% harder is 55.26 + .14 = 55.40.
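Checking that in Python (the point being that multiplying the difficulty by 1.1 adds log2(1.1) bits, rather than adding 10% of the bit count itself):

```python
import math

# Multiplying difficulty by 1.1 adds log2(1.1) to its bit measure.
print(math.log2(1.1))                # ~0.1375
print(55.26 + math.log2(1.1))        # ~55.40
```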
Use a base difficulty, where 1 = 1 block find per ~4295032833.0 hashes on average, and higher difficulties are multipliers of that.
I don't find 2^32/0.FFFFh a particularly meaningful number. I know the discrepancy is small, but why even bother... just simplify and use a trailing-FFF difficulty.
Sorry, but simplicity does matter.
Anyway, untangling and ignoring the .0015% discrepancy, you can convert difficulty into approximate gigahashes by multiplying by 4: difficulty * 4 = 40305172 GH. And the network hash rate is 70.48 TH, so expected time = 40305172/(70.48*1024) = 558s. Close enough; the network hash rate has grown since that difficulty was calculated. (Or in log2 scale: difficulty is 55.26 and network hash rate is 46.14, so expected time is 2^(55.26-46.14) = 2^9.12 > 2^9, i.e. > 500 seconds.)
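The back-of-envelope arithmetic in that paragraph can be reproduced in Python (a sketch using the figures above):

```python
import math

# Rough conversion, ignoring the .0015% discrepancy (figures from the text).
difficulty = 10076293
gh = difficulty * 4                  # approximate expected work in GH
nhash = 70.48 * 1024                 # network hash rate in GH/s (70.48 TH/s)
print(gh)                            # 40305172
print(gh / nhash)                    # ~558 seconds expected per block

# log2-scale version: 55.26 - 46.14 = 9.12, and 2^9.12 > 2^9 = 512 > 500
print(math.log2(gh * 2**30))         # ~55.26 (bits of expected work)
print(math.log2(nhash * 2**30))      # ~46.14 (bits of hash rate per second)
```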
Adam