
Topic: I am going to build a true random number generator ... - page 4. (Read 7864 times)

kjj
legendary
Activity: 1302
Merit: 1026
Looks like you are doing things right so far.  Personally, I use a different tube and a chunk of thoriated welding rod, but that's because it is what I had sitting around.  After your A/B filter, you need to feed it through von Neumann's filter (1,0 -> 1; 0,1 -> 0; 0,0 -> discard; 1,1 -> discard).
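For reference, a minimal sketch of that von Neumann debiasing step (Python; the input list here is just a stand-in for the A/B filter output):

Code:
def von_neumann(bits):
    # Consume bits in non-overlapping pairs: (1,0) -> 1, (0,1) -> 0.
    # Equal pairs are discarded; unequal pairs are equally likely even if the source is biased.
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

print(von_neumann([1, 0, 0, 1, 1, 1, 0, 0, 1, 0]))   # -> [1, 0, 1]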

Next up for your project is monitoring.  Bias creeps in.  You need to keep careful track or you'll end up with garbage.

Finally, you need security.  Ideal would be an old line printer hooked up to a totally offline box.  Every time you get enough bits, encrypt the privkey and print that along with the pubkey (or address).
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
Sounds like a fun project.  Very cool.  But why do you need to prove randomness?
It is unlikely to provide any business advantage in my opinion.
legendary
Activity: 872
Merit: 1010
Coins, Games & Miners
I've always pondered whether there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy rather than the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (which could be desirable for certain systems).

This is all very interesting, but just like the chicken idea above, none of this is truly random. Theoretically, the movements of E. coli are predictable. Every gust of wind in the atmosphere is predictable. Highly complex or chaotic systems may be unpredictable in practice, but they are still predictable in theory.
The only things which are theoretically unpredictable are quantum phenomena.

If you had perfect knowledge of the physical universe, plus an infinitely powerful computer on which to run your perfect simulations, you could predict which direction each E. coli will turn, but you would still have no idea when the next radioactive atom will decay. Or so the theory goes...

Although E. coli is "predictable" in principle, the computational power needed to predict each and every movement of an E. coli on a petri dish would be very high, and even then, the power needed to predict that TRNG would probably be far greater than the power needed to break ECDSA.

You could also use Brownian motion [1], which is a random phenomenon that also exhibits the qualities desirable in a TRNG, and has much more bandwidth than decaying atoms. (Again, low bandwidth is desirable in some TRNGs, so it is a trade-off.)

[1] http://en.wikipedia.org/wiki/Brownian_motion
full member
Activity: 187
Merit: 109
Converting information into power since 1867
I've always pondered whether there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy rather than the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (which could be desirable for certain systems).

This is all very interesting, but just like the chicken idea above, none of this is truly random. Theoretically, the movements of E. coli are predictable. Every gust of wind in the atmosphere is predictable. Highly complex or chaotic systems may be unpredictable in practice, but they are still predictable in theory.
The only things which are theoretically unpredictable are quantum phenomena.

If you had perfect knowledge of the physical universe, plus an infinitely powerful computer on which to run your perfect simulations, you could predict which direction each E. coli will turn, but you would still have no idea when the next radioactive atom will decay. Or so the theory goes...
member
Activity: 84
Merit: 10
PM for journalist,typing,and data entry services.
What are the benefits of a random number generator? Does this have any scientific or mathematical benefits?
full member
Activity: 187
Merit: 109
Converting information into power since 1867
It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if it is equal we throw it out).  Quantum mechanics tells us that while the average rate of decay can be calculated, the time between each individual decay cannot.

Is this a good algorithm?

I know that what seems intuitive is often wrong when dealing with things like this, so I may not be thinking this through correctly...

It would seem that while you cannot know how long it will be until the next detection, there will be an oscillating tendency:

Anytime you get a "0", it implies that the time was shorter than the previous detection.  While this is not a guarantee that the time is shorter than the average, it certainly is an indicator that the time is more likely to be shorter than the average. (If you average all the intervals when you get a "0", and compare it to an average of all the intervals, the average interval when you get a "0" should be shorter than the average of all intervals, shouldn't it?)

The reverse can be said about any instance where you get a "1".  This would seem to imply that after a "1", there is a higher than average chance that your next interval will be a "0" (and vice versa).

I suppose for these purposes the bias might not be significant enough to be a problem, but I can't help but wonder if there isn't a better solution.

I think it doesn't work that way. Radioactive decay is a Poisson process, and the time interval between decay events follows the exponential distribution, which is "Memoryless". I'm pretty sure that means that the length of one interval has absolutely no predictive value regarding the length of the next interval.



if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded

This requires two counts (events) per bit so the bitrate will be roughly CPM / 120 (CPM = counts per minute).

Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).
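If it helps, here is a rough sketch of that single-event scheme (Python; the rate below is a made-up figure, and in practice it would have to be estimated from the measured count rate and watched closely, since any error in the estimate shows up directly as bias):

Code:
import math
import random

RATE = 2000.0                    # decay events per second (hypothetical; ~120,000 CPM)
MEDIAN = math.log(2) / RATE      # median of the exponential distribution = ln(2) / rate

def bit_from_interval(interval_seconds):
    # One bit per event: longer than the median -> 1, shorter -> 0.
    return 1 if interval_seconds > MEDIAN else 0

# Sanity check with simulated exponential inter-arrival times
bits = [bit_from_interval(random.expovariate(RATE)) for _ in range(100000)]
print(sum(bits) / len(bits))     # should hover near 0.5 if the rate estimate is right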
legendary
Activity: 872
Merit: 1010
Coins, Games & Miners
I've always pondered whether there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy rather than the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (which could be desirable for certain systems).
donator
Activity: 1218
Merit: 1079
Gerald Davis
Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.


Ah okay, yeah that makes sense. So a pretty useless suggestion then...

Well, at least I learned something.  Smiley

It isn't useless, it just isn't a TRNG.  You could take block hashes and use the trailing 160 bits (to avoid bias due to the leading zeros).  That would give you 160 bits per block, or about 48 Mb since the genesis block.  You would now have a uniformly distributed sequence of random bits (or at least you could test it to verify that assumption). If it is uniformly distributed (and what we know of SHA suggests it is, but that could be tested) it has some use; however, as SpikeSgt pointed out, for cryptographic purposes you want uniformly distributed random values not known or reproducible by anyone else.

However, by applying a little crypto you could mutate that stream into something that nobody else can reproduce.  One example would be to take an HMAC-256 hash of the sequence (in 256-bit "chunks") using the raw truncated block data and a private key you know.  Now it isn't a TRNG, because if someone obtains your private key they could generate your private stream.  However, this did get me thinking.  Using a programmable smart card capable of HMAC, the key could remain inside the smart card.

Code:
256bit_random_output = HMAC-256(smartcard_private_key, "256 bit RAW input" + nonce)

The point of using a nonce would be to prevent someone who has access to the cards from producing previous values (i.e. the same raw block hashes would produce a different output each time they are input to the smartcard).

Still not a TRNG, but a pretty cool concept regardless.  Private keys can in theory be extracted from smart cards, but it is a difficult process and requires physical access to the card.  If the card is stolen then move your coins.  Don't anyone rush out and use this; I am just thinking out loud.  Still, in general it is an interesting concept. The blockchain is a large source of KNOWN entropy; a system which mutates that into unknown entropy is easier than one which attempts to produce unknown entropy natively.
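As a rough illustration of that mutation idea (Python, with hashlib/hmac standing in for the smart card; the key, nonce, and block hash below are placeholders, not anything real):

Code:
import hashlib, hmac, os

def mutate(block_hash_hex, key, nonce):
    # HMAC-SHA256 over the trailing 160 bits of a block hash plus a nonce.
    # The output is only as secret as the key, so this is keyed mutation, not a TRNG.
    raw = bytes.fromhex(block_hash_hex)[-20:]   # trailing 160 bits, skipping the leading zeros
    return hmac.new(key, raw + nonce, hashlib.sha256).digest()

key = os.urandom(32)                                 # in the smart-card version this never leaves the card
nonce = os.urandom(16)
fake_block_hash = "00" * 12 + os.urandom(20).hex()   # placeholder standing in for a real block hash
print(mutate(fake_block_hash, key, nonce).hex())     # 256 bits of keyed output per chunk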
legendary
Activity: 3920
Merit: 2349
Eadem mutata resurgo
http://phys.org/news202456660.html

... be careful, there may be periodic signals in there ...  Cheesy
hero member
Activity: 898
Merit: 1000
Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.


Ah okay, yeah that makes sense. So a pretty useless suggestion then...

Well, at least I learned something.  Smiley
legendary
Activity: 1400
Merit: 1005
I always thought that a microphone could work just as effectively for randomness.  Put a mic outside, record for 10 seconds, take the hash of that, voila!  Or just a straight sampling of it, like 10 bits, although the effective randomness would be fewer bits than that.
That's a neat idea. I'd assume the codec and/or file extension might not make it too random, though.
I don't see how the codec or file extension would affect the effectiveness of the hash... I mean, sure, the headers might be known, but random audio would still be made up of random bits, regardless of what codec is used.
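For what it's worth, a tiny sketch of the hash-the-recording idea (Python; "capture.wav" is a made-up filename for a clip recorded beforehand, and hashing only the PCM frames sidesteps the known-header concern):

Code:
import hashlib
import wave

# "capture.wav" is a placeholder for ~10 seconds of audio recorded from the mic
with wave.open("capture.wav", "rb") as w:
    pcm = w.readframes(w.getnframes())    # raw samples only; the file header is excluded

digest = hashlib.sha256(pcm).hexdigest()  # 256 bits out, but only as much entropy as the audio contained
print(digest)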

Why not use the feed from all those public webcams? Pixel and hue variations should be random enough as data.
It's public data, so by definition, not really safe to use.  Someone else could replicate what you are doing.  The whole goal with a good RNG is that no one else can replicate it even if they see the code you are using.

Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.
hero member
Activity: 898
Merit: 1000
Is it possible to use the hash of each new block as a source of entropy?
legendary
Activity: 3472
Merit: 4801
A better explanation is that each bit will require two independent intervals.

Imagine 3 particle detections: a, b, c.

Interval A is the time between a & b
Interval B is the time between b & c

if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded
[...] and a clock with at least 10 microsecond accuracy.  With a clock accuracy of only 1 millisecond the upper limit would be around 100 bps.

Ah. Ok.  This makes more sense.  Thanks for the clarification.
donator
Activity: 1218
Merit: 1079
Gerald Davis
It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if it is equal we throw it out).  Quantum mechanics tells us that while the average rate of decay can be calculated, the time between each individual decay cannot.

Is this a good algorithm?

I know that what seems intuitive is often wrong when dealing with things like this, so I may not be thinking this through correctly...

It would seem that while you cannot know how long it will be until the next detection, there will be an oscillating tendency:

Anytime you get a "0", it implies that the time was shorter than the previous detection.  While this is not a guarantee that the time is shorter than the average, it certainly is an indicator that the time is more likely to be shorter than the average. (If you average all the intervals when you get a "0", and compare it to an average of all the intervals, the average interval when you get a "0" should be shorter than the average of all intervals, shouldn't it?)

The reverse can be said about any instance where you get a "1".  This would seem to imply that after a "1", there is a higher than average chance that your next interval will be a "0" (and vice versa).

I suppose for these purposes the bias might not be significant enough to be a problem, but I can't help but wonder if there isn't a better solution.

That is a good point.   What I wrote didn't accurately describe the conversion method and you are right, as described it probably would lead to some alternating-bit bias.  A better explanation is that each bit will require two independent intervals.

Imagine 3 particle detections: a, b, c.
Interval A is the time between a & b
Interval B is the time between b & c

if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded

This requires two counts (events) per bit so the bitrate will be roughly CPM / 120 (CPM = counts per minute).  It will actually be less than half due to the portion of the counts which need to be dropped because the intervals match.  The amount of the reduction will depend on the accuracy of the clock relative to the average interval.  

Some ballpark numbers:
To produce a throughput of 1 kbps (about a 10MB stream of random digits per day) would require a source and tube combination capable of 120,000 CPM and a real time clock with at least 10 microsecond accuracy.   Initially I am going to try for a much lower ~100 bps using the microcontroller clock with roughly 1ms accuracy.
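Out of curiosity, a quick simulation of that two-interval scheme (Python; exponential inter-arrival times stand in for real detections, and the intervals are quantized to a 10-microsecond clock tick so the dropped ties show up):

Code:
import random

CPM = 120000                     # counts per minute (the 1 kbps target above)
RATE = CPM / 60.0                # events per second
TICK = 10e-6                     # clock resolution: 10 microseconds

def simulate(n_events):
    # Pair consecutive intervals: A > B -> 1, B > A -> 0, A == B -> dropped.
    intervals = [round(random.expovariate(RATE) / TICK) for _ in range(n_events)]
    bits = []
    for a, b in zip(intervals[0::2], intervals[1::2]):
        if a > b:
            bits.append(1)
        elif b > a:
            bits.append(0)
        # ties produce no bit, which is why the real bitrate falls below CPM / 120
    return bits

bits = simulate(200000)
print(len(bits), sum(bits) / len(bits))   # bits kept and their mean (should be near 0.5)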





legendary
Activity: 2142
Merit: 1010
Newbie
Well, as OP already mentioned, radioactive decay is one of the few things in the universe that are truly random (at least as far as we understand physics today).

Are you sure? The longer a particle has been stable, the higher the chance that it will stay stable.
legendary
Activity: 3472
Merit: 4801
It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if it is equal we throw it out).  Quantum mechanics tells us that while the average rate of decay can be calculated, the time between each individual decay cannot.

Is this a good algorithm?

I know that what seems intuitive is often wrong when dealing with things like this, so I may not be thinking this through correctly...

It would seem that while you cannot know how long it will be until the next detection, there will be an oscillating tendency:

Anytime you get a "0", it implies that the time was shorter than the previous detection.  While this is not a guarantee that the time is shorter than the average, it certainly is an indicator that the time is more likely to be shorter than the average. (If you average all the intervals when you get a "0", and compare it to an average of all the intervals, the average interval when you get a "0" should be shorter than the average of all intervals, shouldn't it?)

The reverse can be said about any instance where you get a "1".  This would seem to imply that after a "1", there is a higher than average chance that your next interval will be a "0" (and vice versa).

I suppose for these purposes the bias might not be significant enough to be a problem, but I can't help but wonder if there isn't a better solution.
donator
Activity: 1218
Merit: 1079
Gerald Davis
1 uCi of Am-241 with no shielding produces an exposure of ~1.27 uSv per year assuming constant exposure at a distance of 1m.  That is less than 1/700th of the recommended annual exposure limit of 1000 uSv.  So "a little bit" in this case is almost zero, which is why it is used in smoke detectors to begin with.

Ah, I hadn't realized such a small quantity will suffice to get your RNG working.

Well it remains to be seen, but that is my hypothesis.  The reason for this specific tube (LND 712) is that it is very sensitive to alpha emissions.  The source is going to be permanently attached to a screen on the window end of the tube.  There are more sensitive tubes, but they are beyond what I am willing to spend on a hobby project.  Looking at the test results of other homemade Geiger counters, it looks like 1 uCi of Am-241 will register 100K to 120K CPM with this tube.  Assuming two counts per bit, that works out to ~800 to 1000 bits per second peak throughput.

Neither the source nor the tube, however, is going to be the bottleneck (at least up to 1 kbps).  The hard part is going to be getting a timing circuit which can register events with sufficient accuracy.  We are talking an average interval of 500 microseconds, so a timer with microsecond-scale accuracy (or at least tens of microseconds) is going to be necessary.  This is beyond the capability of most microcontrollers, and it is probably going to mean a dedicated real-time clock (something like http://www.maximintegrated.com/datasheet/index.mvp/id/4627/ln/en ).

As a proof of concept I am going to start out without an RTC, but that means much lower timer accuracy and lower throughput at first.  Something on the order of <3,000 CPM, which would produce ~24 bps of entropy.  Even that will depend on the microcontroller having true 1ms clock accuracy.  For the early test I am going to use a gas lantern mantle (thorium & beta emitter) as the particle source.
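Just to spell out the arithmetic behind those two operating points (Python; the CPM figures are the estimates quoted above):

Code:
# Back-of-the-envelope timing numbers for the two operating points above
for cpm in (120000, 3000):
    rate = cpm / 60.0                 # events per second
    mean_interval_us = 1e6 / rate     # average gap between detections, in microseconds
    bitrate = rate / 2                # two detections per bit, before dropped ties
    print(f"{cpm} CPM -> {mean_interval_us:.0f} us average interval, ~{bitrate:.0f} bps")
# 120000 CPM -> 500 us average interval, ~1000 bps
# 3000 CPM -> 20000 us average interval, ~25 bps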



full member
Activity: 187
Merit: 109
Converting information into power since 1867
A truly random generator is kinda impossible. Output numbers seem to be random, but there is some equation behind them... I don't think it's possible to make a truly random number. Maybe if you analyzed something minute like the position of an atom or quantum mechanics?

Well, as OP already mentioned, radioactive decay is one of the few things in the universe that are truly random (at least as far as we understand physics today).
member
Activity: 84
Merit: 10
PM for journalist,typing,and data entry services.
A truly random generator is kinda impossible. Output numbers seem to be random, but there is some equation behind them... I don't think it's possible to make a truly random number. Maybe if you analyzed something minute like the position of an atom or quantum mechanics?
full member
Activity: 187
Merit: 109
Converting information into power since 1867
1 uCi of Am-241 with no shielding produces an exposure of ~1.27 uSv per year assuming constant exposure at a distance of 1m.  That is less than 1/700th of the recommended annual exposure limit of 1000 uSv.  So "a little bit" in this case is almost zero, which is why it is used in smoke detectors to begin with.

Ah, I hadn't realized such a small quantity will suffice to get your RNG working. This is indeed negligible. I work with beta emitters far more dangerous than that practically every day, and I'm still standing... kinda   Undecided