Random-number generation
Re: Random-number generation
I think the crucial difference here is that the LFSR genuinely makes a bit at a time, and only as a matter of usage do we collect 8 for a byte or 32 for a word.
As for Arlet's, it feels to me that it is byte orientated, in which case we might expect the byte-stream to have somewhat different statistical properties from an assembled word-stream. But perhaps not!
It's interesting to me too, that Arlet's scheme is empirical, without a mathematical underpinning. But it's certainly small and cheap, and apparently passes lots of tests.
(As a terrible idea, I did wonder about omitting the LDA to save a couple of bytes, and replacing the CLC with a SEC, in the mere hope that the generator is not called with worst-case values in A - in this case, perhaps an FF. As I say, a terrible idea.)
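To make the bit-at-a-time nature concrete, here is a Python sketch (mine, not from the thread) of a Fibonacci LFSR using taps (32, 22, 2, 1), a commonly cited maximal-length choice, with a helper that assembles eight successive bits into a byte:

```python
def lfsr_step(state):
    # Fibonacci LFSR, taps (32, 22, 2, 1): one genuinely new bit per step
    bit = ((state >> 31) ^ (state >> 21) ^ (state >> 1) ^ state) & 1
    return ((state << 1) | bit) & 0xFFFFFFFF, bit

def lfsr_byte(state):
    # only as a matter of usage do we collect 8 bits for a byte
    b = 0
    for i in range(8):
        state, bit = lfsr_step(state)
        b |= bit << i
    return state, b
```

Collecting 32 bits for a word is the same loop run four times longer; the underlying bit-stream is identical either way.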
Re: Random-number generation
No algorithm can produce truly random data; only a physical source can do that (such as counting the decays of a radioactive sample). That is why everyone uses the LFSR, even the military. An LFSR is also a building block for cryptographically secure encryption systems, and that is the big difference with your PRNG. On the other hand, since you created it through trial and error, it was almost certain to have problems.
For my part, I prefer algorithms that are based on math; then we can be confident there are no hidden problems.
- barrym95838
Re: Random-number generation
I'm all for rigorous "mathy" solutions when our nuclear codes are in jeopardy, but Arlet's spunky little solution seems like a great fit in so many other contexts. I admire his workmanlike spirit, and I personally look forward to further inventions from him.
Got a kilobyte lying fallow in your 65xx's memory map? Sprinkle some VTL02C on it and see how it grows on you!
Mike B. (about me) (learning how to github)
Re: Random-number generation
BigEd wrote:
As for Arlet's, it feels to me that it is byte orientated, in which case we might expect the byte-stream to have somewhat different statistical properties from an assembled word-stream. But perhaps not!
Re: Random-number generation
The mathy nature of the LFSR is a double edged sword. It means you can construct a set of linear equations to solve the polynomial and the state, based on a small number of output samples, using the Berlekamp-Massey Algorithm. In addition, the LFSR generator does not create high quality random numbers, as it quickly fails all modern test suites.
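To illustrate the attack, here is a sketch (mine, not Arlet's) of the standard binary Berlekamp-Massey algorithm recovering the recurrence of a toy 4-bit LFSR from a short run of its output; 2L output bits suffice to reconstruct an L-bit LFSR:

```python
def berlekamp_massey(bits):
    """Return the linear complexity L and the GF(2) connection
    polynomial c (coefficients c[0..L]) of a binary sequence."""
    n = len(bits)
    c, b = [0] * n, [0] * n
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy between the current prediction and the actual bit
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L, c
```

Feeding it 32 bits of a sequence satisfying s[i] = s[i-1] ^ s[i-4] (polynomial x^4 + x^3 + 1) recovers L = 4 and the exact recurrence, which is precisely why a bare LFSR is predictable.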
Re: Random-number generation
Arlet wrote:
The mathy nature of the LFSR is a double edged sword. It means you can construct a set of linear equations to solve the polynomial and the state, based on a small number of output samples, using the Berlekamp-Massey Algorithm. In addition, the LFSR generator does not create high quality random numbers, as it quickly fails all modern test suites.
For example, the US KW-7 encryption device used only a single LFSR (a 39-bit one).
Re: Random-number generation
Quote:
if we add logic gates to make it nonlinear and this is only possible precisely because there is no pattern in the LFSR.
Re: Random-number generation
I think it's interesting, though, that your scheme's core has 2^32 states whereas a 32-bit LFSR has only 2^32-1. (Acorn's BBC Basic uses a 33-bit LFSR, which is of course just one bit better, and is suitable for 32-bit outputs. It should fail some tests, though, for the distance between consecutive values, compared to a true 32-bit random source.)
Re: Random-number generation
Indeed, and not only is the missing state bad from a uniformity point of view, it can also be annoying in practical applications. For instance, if you want to create an effect where you toggle every pixel on the screen in random order, it is nice if there are no missing values. Also, it's easier if there are no bad seed values that you have to watch out for.
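The missing state is easy to see at toy scale; this sketch (my illustration, not thread code) runs a maximal 4-bit LFSR with taps (4, 3) and shows it cycles through 15 states, never visiting zero:

```python
def lfsr4_step(state):
    # 4-bit Fibonacci LFSR, taps (4, 3): x^4 + x^3 + 1, maximal length
    bit = ((state >> 3) ^ (state >> 2)) & 1
    return ((state << 1) | bit) & 0xF

seen, s = set(), 1
for _ in range(15):
    seen.add(s)
    s = lfsr4_step(s)
# all 15 nonzero states appear; the all-zero state is unreachable
```

A counter-style update such as the chained-add generator in this thread is a permutation of all 2^32 state values, so it has no such hole and no forbidden seed.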
Of course, the 2nd half of my random generator messes up the uniformity a little bit, for the sake of code density. If desired, the output function can be replaced by a bijective 32->32 bit function that preserves uniformity. That would require a few more instructions.
I am currently working on a RNG optimized for the AVR processor that uses 64 bit state, and produces a 32 bit random value. My strategy is to come up with some simple ideas, test them to see which one works best, and then make multiple copies with tiny changes, like changing ADC into EOR, swapping two operations or values. Each of these copies is then evaluated and if I find an improvement, I keep that one as the new base. If reordering things doesn't result in improvements, I try adding an operation. It's a guided form of evolutionary programming. Breeding digital orchids.
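The search loop described above can be sketched in miniature (entirely my toy illustration, not the actual AVR code): here a candidate is just a multiplier and a rotation for a mixing function, a mutation flips a multiplier bit or changes the rotation, and fitness is a crude chi-square on the low output byte; a real run would score candidates with a full test suite.

```python
import random

def mix(x, mult, rot):
    # candidate output function: multiply then rotate (parameters evolved)
    x = (x * mult) & 0xFFFFFFFF
    return ((x >> rot) | (x << (32 - rot))) & 0xFFFFFFFF

def score(mult, rot, n=4096):
    # crude fitness: chi-square of the low byte over a counter input;
    # lower is better (stand-in for a real statistical test suite)
    counts = [0] * 256
    for i in range(n):
        counts[mix(i, mult, rot) & 0xFF] += 1
    e = n / 256
    return sum((c - e) ** 2 / e for c in counts)

def evolve(steps=200, seed=1):
    rng = random.Random(seed)
    best = (0x9E3779B9, 13)        # arbitrary starting candidate
    best_s = score(*best)
    for _ in range(steps):
        mult, rot = best
        if rng.random() < 0.5:
            mult ^= 1 << rng.randrange(32)
            mult |= 1              # keep the multiplier odd (bijective)
        else:
            rot = rng.randrange(1, 32)
        s = score(mult, rot)
        if s < best_s:             # keep improvements as the new base
            best, best_s = (mult, rot), s
    return best, best_s
```

The "add an operation when mutations stop helping" step would extend the candidate from a parameter tuple to a small program, but the keep-the-improvement loop is the same.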
Re: Random-number generation
> if you want to create an effect where you toggle every pixel on the screen in random order, it is nice if there are no missing values
In this case you might want a random permutation, or you'll find yourself waiting a long time!
I do recall, back in the day, filling a Beeb's high resolution screen (20k bytes, 160k pixels) with, in effect, a bar graph of bucketed values from the random number generator. One hopes to see a broadly even growth in the single-pixel-wide bars, but not a precisely even growth. (In fact, the degree of unevenness of the growth is a measure of how good the pseudo random numbers are, as you no doubt know.)
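For the visit-every-pixel effect, a Fisher-Yates shuffle gives the permutation directly; a minimal sketch (my code, assuming pixel indices 0..n-1):

```python
import random

def pixel_order(n, seed=1):
    # Fisher-Yates: every index exactly once, in uniformly random order
    rng = random.Random(seed)
    order = list(range(n))
    for i in range(n - 1, 0, -1):
        j = rng.randrange(i + 1)
        order[i], order[j] = order[j], order[i]
    return order
```

This hits each pixel exactly once in n steps, whereas drawing raw random indices needs about n ln n draws before the last pixel finally comes up (the coupon-collector wait).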
Re: Random-number generation
I remember an interesting effect I did on my Atom high res mode. On a clear screen, I drew the two diagonals across the screen, and then had a machine code loop that basically did
Code: Select all
loop:
EOR (ptr), Y
STA (ptr), Y
INY
BNE loop
... and repeat for next pages until entire screen was done ...
When done over and over, it creates some interesting patterns.
I wonder what it would look like with ADC instead of EOR.
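That loop is effectively a running prefix-XOR over screen memory; a Python model (my sketch) of one pass:

```python
def eor_pass(mem, a=0):
    # models: EOR (ptr),Y / STA (ptr),Y with A carried across bytes
    for i in range(len(mem)):
        a ^= mem[i]
        mem[i] = a
    return a
```

Iterating a prefix-XOR over a sparse initial pattern builds Pascal's-triangle-mod-2 (Sierpinski-like) structure; an ADC variant would propagate carries between neighbouring bits instead of keeping each bit column independent, so the patterns should look quite different.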
Re: Random-number generation
I notice that in my translation of Arlet's algorithm, where I forgot a double carry, there are fewer repeated patterns: 9 instead of 12.
Quote:
Found '00000000100010101010100001110011' at index 0 !
Found '00000000100010101010100001110011' at index 6852B3F7 !
Found '00000000100010101010100001110011' at index BD7D473F !
Found '00000000100010101010100001110011' at index E9154CB2 !
Found '00000000100010101010100001110011' at index 217CA96FD !
Found '00000000100010101010100001110011' at index 324CC6D91 !
Found '00000000100010101010100001110011' at index 34BA75919 !
Found '00000000100010101010100001110011' at index 3C1ABA1E1 !
Found '00000000100010101010100001110011' at index 4630BEC41 !
Found '00000000100010101010100001110011' at index 7AC32606F !
Re: Random-number generation
Arlet wrote:
Quote:
if we add logic gates to make it nonlinear and this is only possible precisely because there is no pattern in the LFSR.
In the case of your algorithm, the logic gates alter the internal state, making its behavior hard to predict mathematically.
We see this with the carry: with no double carry there are fewer repeated patterns, without really knowing why.
Re: Random-number generation
Quote:
In the case of your algorithm, logic gates change the internal state of the algorithm
Code: Select all
CLC
LDA #$45
ADC seed+0
STA seed+0
ADC seed+1
STA seed+1
ADC seed+2
STA seed+2
ADC seed+3
STA seed+3
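For reference, the state update above can be modeled in Python (a sketch; `seed` holds the four bytes in little-endian order, and the 6502 carry propagates between the additions):

```python
def step(seed):
    # CLC ; LDA #$45 ; then chained ADC/STA over seed+0 .. seed+3
    a, carry = 0x45, 0
    for i in range(4):
        t = a + seed[i] + carry
        a, carry = t & 0xFF, t >> 8
        seed[i] = a
    return seed
```

Note how each byte receives the running sum of all lower bytes plus the carries, which is what spreads changes upward through the state.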
Re: Random-number generation
(Fun fact: you'd need 226 bits to have a chance of shuffling cards into any possible order!)
Slightly OT, I found The Unreasonable Effectiveness of Quasirandom Sequences rather interesting. Short version: just keep adding 1/phi and you get a nice evenly spaced walk around many possible values. A 40 bit or 48 bit version might be interesting, and would be extremely cheap. I would expect it to fail a good test of randomness - but perhaps not too badly.
1/phi is 0.61803398875 or $9e3779b97f4a as a scaled value.
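A 48-bit version of that additive recurrence is only a few lines (my sketch; one caveat I'd add is that the constant must be odd to visit all 2^48 values, since the period of x += c mod 2^n is 2^n/gcd(c, 2^n), so the low bit is forced on here):

```python
PHI48 = 0x9E3779B97F4A | 1     # 2^48 / phi, forced odd for full period
MASK48 = (1 << 48) - 1

def weyl48(x):
    # additive (Weyl) recurrence: x' = (x + 2^48/phi) mod 2^48
    return (x + PHI48) & MASK48
```

On a 65xx this is just a six-byte add with carry. The outputs are very evenly spaced, which is exactly why it should fail a serious randomness test while still being handy for sampling.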