I've been comparing various methods of generating random numbers on the ATmega328 and running the results through a couple of random number test suites to check their quality, and a very odd pattern has emerged that baffles me. One of the suites is called ent; among other things, it reads a file of random numbers byte by byte and lists the count and percentage of each byte value out of the whole. If you have a dramatic lack of randomness, this points it out pretty easily.
Here's the weird part: random(), with any seed, shows a distinct lack of the values 15 and 22. In a 20 meg file there are between 80,000 and 82,000 of every possible byte value (pretty good distribution) except for 15 and 22, of which there are only 331 and 296 instances respectively. That's absurdly low. At first I chalked this up to a weakness of the PRNG, but then I ran into it again elsewhere.
For instance: analogRead() on an unconnected pin gave a really bad distribution overall, and the byte value 15 occurred only 17 times in a 20 meg file. That's bonkers. Here's another example: take the two least significant bits of analogRead() on an unconnected pin, XOR them to get a single bit, and bitshift those bits into a long integer. A 20 meg file built with that method had byte value 15 occurring only 90 times, and byte value 22 never occurring at all.
At that point I figured the method I was using to pull the serial data off the USB (screen -L) was faulty, but when I put /dev/random through the same path, no such oddity appeared.
My next step is to program a 328 with single statements such as "unsigned long testNumber = random();" and disassemble the resulting hex file to see exactly what it's doing, but I figured I'd ask first and save myself the trouble.
Anyone have any ideas why this oddity occurs?