Let’s start off with one of my favorite quotes from John von Neumann: “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number - there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.”

What von Neumann is getting at is that the “pseudo” in pseudorandom number generator (PRNG) is really a synonym for “not at all”. Granted, if you come into the middle of a good PRNG sequence, guessing the next number is nearly impossible. But if you know, or can guess, the seed that started the PRNG off, you know all past and future values nearly instantly; it’s a purely deterministic mathematical function.

This shouldn’t be taken as a rant against PRNGs, but merely as a reminder that when you use one, the un-guessability of the numbers that it spits out is only as un-guessable as the seed. And while “un-guessability” isn’t a well-defined mathematical concept, or even a real word, entropy is.

Almost anything that your computer wants to keep secret will require the generation of a secret random number at some point, and any series of “random” numbers that a computer generates will have only as much entropy, and thus un-guessability, as the seed used.

So how does a computer, a deterministic machine, harvest entropy for that seed in the first place? And how can you make sure you’ve got enough? And did you know that your Raspberry Pi can be turned into a heavy-duty source of entropy? Read on!

## Everything You Needed to Know About Entropy…

But first, to hammer von Neumann’s point home, here’s an algorithm for a PRNG that will pass any statistical test you throw at it: start counting up integers from one, hash the counter value, then increase the counter by one, append the new counter value to the previous hash, and hash it again. The result is a hash chain, a “random”-looking series of numbers that is entirely predictable - that’s the “pseudo” in PRNG.
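Here’s a minimal sketch of that hash-chain PRNG in Python; the choice of SHA-256 and the way the counter is encoded into bytes are my own assumptions, since any cryptographic hash would serve just as well:

```python
import hashlib

def hash_chain(start: int):
    """Yield an endless stream of 'random'-looking 256-bit values.

    Entirely deterministic: anyone who knows (or guesses) the starting
    counter value can reproduce every past and future output.
    """
    counter = start
    digest = hashlib.sha256(str(counter).encode()).digest()
    while True:
        yield int.from_bytes(digest, "big")
        counter += 1
        # Append the new counter value to the previous hash, hash again.
        digest = hashlib.sha256(digest + str(counter).encode()).digest()

# Statistically random-looking output, but starting at 1 gives an
# attacker almost nothing to search through.
gen = hash_chain(1)
print(next(gen))
print(next(gen))
```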
If I knew you were doing this, and that you’d only used up 100 or so numbers so far, it wouldn’t take me long to do the same thing and figure out where you were in the series. I’d have a much harder time cracking your series if I only knew that you were using the same algorithm but had started somewhere between 1 and 1,000,000 instead of at 1.

Claude Shannon‘s concept of entropy is essentially this: count up the minimum number of yes/no questions it would take to figure out whatever your secret is. That number is the same as the base-2 logarithm of the number of possible secrets that could have been drawn, or equivalently the number of digits it would take to write out the number of possible secrets in binary.

Let’s say you had started your counter in the hash chain at 1, and had used no more than 128 values so far. There are only 128 possible states that your PRNG machine can be in. Seven bits are enough to distinguish 128 possibilities, and if I started asking you whether the number was greater than 64, and subdividing the remaining interval each time, it would only take me seven guesses.

Entropy is a measure of the number of possible choices from which our secret value could have been drawn. It’s a way to measure hardness-to-guess and the strength of passwords, and it’s what people mean when they say that some procedure generates “kinda random” versus “very random” numbers.

A random number between 1 and 1,000,000 has just under 20 bits of entropy. (The English language has around 1,000,000 words - you should never lose at “twenty questions”, or use a single word as a password.) A good long-term password should probably have in excess of 128 bits of entropy.

Quick quiz: If you generate a 256-bit random number from a good cryptographic PRNG, it will have 256 bits of entropy, right? Wrong! If you didn’t pick the seed completely randomly from at least 2^256 possibilities, your number will have fewer than 256 bits of entropy.

Turn on your computer, or fire up a microcontroller. It’s running exactly the same BIOS as the last time you booted up, and probably all of the same OS and higher-level code. A computer is a deterministic machine running predictable code. How much uncertainty is there in the system? Maybe the wall-clock time is different, but that’s pretty easily guessable.
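To make the seven-guesses arithmetic concrete, here’s a quick sketch of my own (not from the original article) that computes the entropy of a secret drawn uniformly from N possibilities and plays the greater-than guessing game:

```python
import math

def entropy_bits(n_choices: int) -> float:
    """Shannon entropy of a secret drawn uniformly from n_choices values."""
    return math.log2(n_choices)

def guesses_needed(secret: int, low: int, high: int) -> int:
    """Count the yes/no questions a binary search needs to pin down
    a secret known to lie in [low, high]."""
    count = 0
    while low < high:
        mid = (low + high) // 2
        count += 1
        if secret > mid:      # "Is it greater than mid?"
            low = mid + 1
        else:
            high = mid
    return count

print(entropy_bits(128))           # 7.0 bits: seven yes/no questions
print(entropy_bits(1_000_000))     # ~19.93 bits: "just under 20 bits"
print(guesses_needed(97, 1, 128))  # 7 guesses, whatever the secret is
```

The first question it asks over the range 1 to 128 is exactly “is it greater than 64?”, and halving the interval each time bottoms out after seven questions, matching log2(128) = 7.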