
Topic: I am going to build a true random number generator ... - page 2. (Read 7864 times)

legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
How about an RNG based on the blockchain? :)
donator
Activity: 1218
Merit: 1079
Gerald Davis
Isn't casting dice random enough? Maybe the only problem is that the process of collecting those numbers and generating a key is lengthy :P

As long as the dice are fair it certainly is; so would flipping a lot of coins. The collection of data, however, is manual and slow.
legendary
Activity: 1988
Merit: 1012
Beyond Imagination
Isn't casting dice random enough? Maybe the only problem is that the process of collecting those numbers and generating a key is lengthy :P
legendary
Activity: 1260
Merit: 1008
Well, to be clear, this isn't "my" approach, just the one I am planning to use. :)  I don't want people to incorrectly give credit where no credit is due. Fourmilab in Switzerland has been providing true random numbers over the internet, produced by observing radioactive decay, for the better part of a decade. The interesting thing is that microcontrollers have gotten fast and cheap enough, and there is enough open-source hardware information out there, that it has become economical for a hobbyist to build their own "HotBits" device at home.

I didn't mean to credit you with full paternity of the idea (i.e. providing true random numbers using quantum entropy).

I just wanted to underline that your idea has potential, since it could be produced at large scale with a low cost per unit.

Kudos in advance :)
full member
Activity: 187
Merit: 109
Converting information into power since 1867
But you suggested estimating the median from a known constant, not calculating it from the actual sample.
That introduces the possibility of bias.

Well, yeah... the half-life of 241Am is well known and measured; I don't think there's a lot of wiggle room there. But I suppose you don't know the purity of your sample, plus the sensitivity of the counter might introduce some bias...


Maybe it really is preferable to calculate from the actual sample. It's also very easy: you just let the counter run for a very long time and calculate the average CPM (by "very long time" I mean very long relative to the average interval in the OP's setup, which the OP expects to be some fraction of a millisecond, so a few hours are probably more than enough for the law of large numbers to really kick in).

Then you just take ln 2 divided by the average CPM, and that's your median interval (in minutes) right there.

In fact, if you have the thing running all the time anyway, you could use the massive amounts of data points you collect to continuously fine-tune your measurement of the average CPM. Say, once a month you calculate the average CPM of the last month, and recalculate the median. That way you're continuously adjusting for the decay of your sample, degradation of the counter tube, and such.
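
For concreteness, here is a minimal Python sketch of that calibrate-from-the-sample procedure (the function names are mine, and it assumes you have logged a long run of event timestamps in seconds):

Code:
import math

def median_interval(timestamps):
    # Estimate the event rate from a long run: (n - 1) intervals
    # over the total elapsed time, in events per second.
    n = len(timestamps)
    rate = (n - 1) / (timestamps[-1] - timestamps[0])
    # Median of an exponential distribution with that rate:
    return math.log(2) / rate

def bits_from_intervals(timestamps, median):
    # One bit per event: 1 if the gap to the next event is shorter
    # than the median, 0 if longer; exact ties are discarded.
    bits = []
    for a, b in zip(timestamps, timestamps[1:]):
        dt = b - a
        if dt != median:
            bits.append(1 if dt < median else 0)
    return bits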
member
Activity: 73
Merit: 10
You could use phosphorescence or fluorescence instead of radiation. They are quantum processes, so they're random.
sr. member
Activity: 476
Merit: 250
Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).

This will create independent bits but there will be a bias towards 1 or 0, depending on the details of your particular setup.  You need to compare two intervals created by the same process within the same system, instead of replacing one of them with an external constant.

I agree the latter is a better solution, but using a Von Neumann filter the bias of independent bits can be removed. For example, in the setup proposed, say the system was biased toward producing 0s over 1s. Since a 00 sequence (or 11 sequence) is discarded, and a 01 and a 10 sequence are equally likely, the bias can be easily removed (01 = 1 and 10 = 0). Still, you end up using at least 2 counts per bit after filtering. The actual number of counts required will depend on the amount of bias: the more biased the source, the more counts it will take to produce the "rare" 1 needed to complete the sequence. For example, if a system was biased 70%/30% in favor of zeroes, then on average it will take 2.38 pairs (about 4.76 counts) for each bit that passes out of the filter.

Uhm, I don't think there will be a bias. That's exactly why I suggested the median. It's not an external constant, it's a parameter of the distribution of the measured variable. By definition, 50% of the intervals will be longer than the median and 50% will be shorter (a few may be exactly equal to the median, but you will discard them). Therefore, I don't think you'd need a von Neumann filter, nor do you need more than one count per bit.

But you suggested estimating the median from a known constant, not calculating it from the actual sample.
That introduces the possibility of bias.
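
Since the Von Neumann filter keeps coming up in this exchange, here is a minimal Python sketch of it (assumes a list of raw 0/1 bits as input):

Code:
def von_neumann(bits):
    # Take raw bits in non-overlapping pairs; emit 1 for 01, 0 for 10,
    # and discard 00 and 11.  For independent but biased bits, 01 and
    # 10 are equally likely, so the output is unbiased.
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(b)  # 01 = 1, 10 = 0, per the convention above
    return out

With the 70%/30% example above, each pair survives with probability 2 * 0.7 * 0.3 = 0.42, which is where the figure of roughly 2.38 pairs per output bit comes from.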
legendary
Activity: 2156
Merit: 1131

Is this real life?
legendary
Activity: 2394
Merit: 1216
The revolution will be digital
How random is PHP's mt_rand(0, n)?
full member
Activity: 187
Merit: 109
Converting information into power since 1867
Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).

This will create independent bits but there will be a bias towards 1 or 0, depending on the details of your particular setup.  You need to compare two intervals created by the same process within the same system, instead of replacing one of them with an external constant.

I agree the latter is a better solution, but using a Von Neumann filter the bias of independent bits can be removed. For example, in the setup proposed, say the system was biased toward producing 0s over 1s. Since a 00 sequence (or 11 sequence) is discarded, and a 01 and a 10 sequence are equally likely, the bias can be easily removed (01 = 1 and 10 = 0). Still, you end up using at least 2 counts per bit after filtering. The actual number of counts required will depend on the amount of bias: the more biased the source, the more counts it will take to produce the "rare" 1 needed to complete the sequence. For example, if a system was biased 70%/30% in favor of zeroes, then on average it will take 2.38 pairs (about 4.76 counts) for each bit that passes out of the filter.

Uhm, I don't think there will be a bias. That's exactly why I suggested the median. It's not an external constant, it's a parameter of the distribution of the measured variable. By definition, 50% of the intervals will be longer than the median and 50% will be shorter (a few may be exactly equal to the median, but you will discard them). Therefore, I don't think you'd need a von Neumann filter, nor do you need more than one count per bit.
sr. member
Activity: 322
Merit: 250
Decentralize All The Things!
I just need to wait for a missing component to arrive.

(Stupid broken image proxy - direct link http://i.minus.com/ibzPEHrUJ3pByt.jpg )

While I don't really know anything about your setup or Geiger counters in general, it does seem like an expensive component. Would the cheap Geiger counters on eBay not be good enough for the task?
hero member
Activity: 742
Merit: 502
Circa 2010
What's wrong with the randomness that is used on dice sites like Just-dice?

That one is pretty random too, to be honest.

Because they are not truly random per se, but pseudo-random: generated from a seed and combined with various other factors. So while they 'approximate' random numbers, they are not truly random numbers themselves, yet still good enough for the purposes of running a dice game. It's really all about how picky you want to be.
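
For illustration, dice sites typically do something along these lines; this is a generic "provably fair" Python sketch, not Just-Dice's actual code (the seed handling and HMAC construction are assumptions):

Code:
import hmac, hashlib

def roll(server_seed, client_seed, nonce):
    # Deterministic given the seeds: the site can reveal server_seed
    # later to prove it didn't cheat, but the output is pseudo-random,
    # not truly random.
    msg = "{}:{}".format(client_seed, nonce).encode()
    digest = hmac.new(server_seed.encode(), msg, hashlib.sha512).hexdigest()
    return int(digest[:8], 16) % 10000 / 100.0  # roll in [0.00, 99.99]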
hero member
Activity: 588
Merit: 500
Will Bitcoin Rise Again to $60,000?
What's wrong with the randomness that is used on dice sites like Just-dice?

That one is pretty random too, to be honest.
hero member
Activity: 854
Merit: 1000
Have you thought about a LavaRand generator? ;)


http://www.random.org/randomness/
newbie
Activity: 14
Merit: 0
May I ask what you are planning on using the RNG for? Because if it's for applications like generating passwords, it might not be that useful. If there are already quantum computers powerful enough to predict the movement of E. coli... they will surely be powerful enough to just brute-force the passwords.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Well, to be clear, this isn't "my" approach, just the one I am planning to use. :)  I don't want people to incorrectly give credit where no credit is due. Fourmilab in Switzerland has been providing true random numbers over the internet, produced by observing radioactive decay, for the better part of a decade. The interesting thing is that microcontrollers have gotten fast and cheap enough, and there is enough open-source hardware information out there, that it has become economical for a hobbyist to build their own "HotBits" device at home.
legendary
Activity: 1260
Merit: 1008
I'm a bit late to the discussion, but I'm a bit surprised that no one posted this resource for quantum generated true random numbers using optics at the Australian National University: http://photonics.anu.edu.au/qoptics/Research/qrng.php

They have a live true random number server as well:
http://150.203.48.55/index.php

The API info can be found here:
http://qrng.anu.edu.au/FAQ.php#api

No idea if it's fast enough for the intended task though...

Thanks for the links.  Very useful.

And more to the point, it seems to validate D&T's approach; they are just using a different source of quantum entropy. D&T's model seems easier to develop at a large scale, though.
newbie
Activity: 3
Merit: 0
I'm a bit late to the discussion, but I'm a bit surprised that no one posted this resource for quantum generated true random numbers using optics at the Australian National University: http://photonics.anu.edu.au/qoptics/Research/qrng.php

They have a live true random number server as well:
http://150.203.48.55/index.php

The API info can be found here:
http://qrng.anu.edu.au/FAQ.php#api

No idea if it's fast enough for the intended task though...
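
If anyone wants to play with it, here is a minimal Python sketch against the JSON API (the endpoint and parameters are taken from the FAQ linked above and may have changed since):

Code:
import json
from urllib.request import urlopen

url = "https://qrng.anu.edu.au/API/jsonI.php?length=16&type=uint8"
reply = json.load(urlopen(url))
if reply.get("success"):
    print(bytes(reply["data"]).hex())  # 16 quantum-random bytes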
donator
Activity: 1617
Merit: 1012
Does this provide more entropy than something more common and practical, like the camera on your phone? I would imagine that if you hashed a 24-bit, 10-megapixel random image you'd get a random number with pretty good entropy. After all, each pixel can be considered an independent photon counter.
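
Something like this, presumably (a Python sketch; "photo.jpg" stands in for a freshly captured frame):

Code:
import hashlib

# Sensor noise in the low-order bits supplies the entropy;
# SHA-256 condenses it into a 256-bit value.
with open("photo.jpg", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())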
legendary
Activity: 1400
Merit: 1005
Sure... All I was saying is that making a PROVABLY unflawed RNG isn't going to substantially help customer acquisition, for reasons I won't bore you with...

What does acquiring customers have to do with this thread?

Thought you were developing it for some business purpose initially.

Btw, why do we need hardware? Isn't there enough entropy on the internet that we can access?
It's public entropy, so if anyone knows what you are using, they can generate the same "random" numbers.  A true RNG would mean no one could reproduce the results.

It could be combined with the entropy of the exact time a random number request was made, along with an additional pseudo-random number from the server; hash the result, grab some random parameters from that, go get some random feed from online that is also changing in real time, hash that, and you have a pretty doggone random number that no one could arrive at even if they had your source code.
I disagree. If someone had your source code, they could track all those sources you talk about, and the only thing they'd need to speculate on is the exact time a random number request was made. If you're going to claim that is random enough, then just use the exact time of the request by itself; everything else adds no additional randomness to someone who has your source code.
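
For concreteness, the combine-and-hash idea quoted above amounts to something like this (a Python sketch with placeholder inputs; as the reply notes, for anyone holding this code the only real unknown is the timestamp):

Code:
import hashlib, time

def mixed_random(server_prng_bytes, online_feed_bytes):
    # Hash the request time together with server PRNG output and a
    # changing public feed.  Public inputs add no secrecy, so the
    # timestamp carries all the unpredictability here.
    h = hashlib.sha256()
    h.update(str(time.time_ns()).encode())
    h.update(server_prng_bytes)
    h.update(online_feed_bytes)
    return h.digest()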