Chris Angelico
[snip spurious answer]
Does adding 1 to a random
number make it less random? It adds determinism to the number; can a
number be more deterministic while still no less random?
Ah! I know. The answer comes from common sense:
I know you're being funny, but in fact adding a constant to a random
variable still leaves it equally random. Adding, subtracting, multiplying
or dividing by a constant just shifts or rescales the possible values a
random variable X can take; it doesn't change the shape of the
distribution.
In real numbers, that's correct. However, computers don't work with
real numbers, so there's the very, uhh, REAL possibility that some of
the entropy will be lost. For instance, integer division truncates away
the low-order part, and adding huge numbers to small floats results in
precision loss.
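
Here's a quick sketch of both effects in plain CPython (the specific
numbers are just illustrative):

import random

# Integer division throws away the low-order part of the number.
x = random.randrange(1000)   # say, 637
y = x // 10                  # 63 -- ten different values of x all map here

# Adding a huge constant to a small float rounds the small part away,
# and subtracting the constant back doesn't recover it.
r = random.random()          # somewhere in [0.0, 1.0)
big = 1e17                   # adjacent floats here are 16 apart
lost = (big + r) - big       # always 0.0 -- r has been rounded away entirely

print(x, y)
print(r, lost)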
I was deliberately playing around, but unfortunately there have been
many people who've genuinely thought things similar to what I was
saying - and then implemented them in code.
However, adding two random variables X and Y does change the
distribution. In fact, a very cheap way of simulating an almost normally
distributed random variable is to add up a whole lot of uniformly
distributed random variables. Adding up 12 calls to random.random(), and
subtracting 6, gives you a close approximation to a Gaussian random
variable with mean 0 and standard deviation 1.
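
Something like this shows it (just a sketch; approx_gauss is my name
for it, not anything in the stdlib):

import random
import statistics

def approx_gauss():
    # Each random.random() has mean 0.5 and variance 1/12, so the sum
    # of twelve has mean 6 and variance 1; subtracting 6 centres it at 0.
    return sum(random.random() for _ in range(12)) - 6.0

samples = [approx_gauss() for _ in range(100_000)]
print("mean %.3f, stdev %.3f"
      % (statistics.mean(samples), statistics.stdev(samples)))
# Prints something very close to: mean 0.000, stdev 1.000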
Yep. The more dice you roll, the more normal the distribution. Which
means that d100 is extremely swingy, but 11d10-10 is much less so, and
99d2-98 is quite stable. The more randomness you add, the more
predictable the result.
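
Quick simulation if anyone wants to check (stdlib only; the roll()
helper is just mine for this post):

import random
import statistics

def roll(n, sides, shift=0):
    # Roll n dice with the given number of sides, then apply the shift.
    return sum(random.randint(1, sides) for _ in range(n)) + shift

trials = 100_000
for label, n, sides, shift in [("d100", 1, 100, 0),
                               ("11d10-10", 11, 10, -10),
                               ("99d2-98", 99, 2, -98)]:
    results = [roll(n, sides, shift) for _ in range(trials)]
    print("%9s: mean %6.2f  stdev %6.2f"
          % (label, statistics.mean(results), statistics.stdev(results)))

# All three cover 1-100 with the same mean (50.5), but the spread
# shrinks as more dice go into the pot.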
"Does that seem right to you?" -- Jubal Early
ChrisA