Alex1 wrote:
When I state a period for my algorithms, they remain random throughout that entire period. You have to do the same, otherwise it's cheating for the challenge.
I too could create an algorithm with a period of 2^1024 that is only random for the first 100 KB.
It's not cheating. It's just a different category. I mention both the period and the PractRand results. The minimum period matters if you want to be sure to avoid bad seeds where the period is really short. For instance, if you initialize an LFSR with an all-zero state, it gets stuck there, which is highly undesirable. When I say the period is guaranteed to be 2^48, it means you don't have to worry about any bad seeds.

As far as random quality is concerned, most 6502 users would happily pick a decent algorithm that's smaller & faster over one that's super high quality. When PractRand says it's good until 1 GB, it doesn't mean that it suddenly gets really bad after that. It just means that it has seen enough data for subtle patterns to start to emerge. If you toss a coin that comes up heads 50.00001% of the time and you toss it often enough (roughly on the order of 1/eps^2 tosses to spot a bias of eps), you eventually realize that the coin is biased, but that doesn't mean it's useless. There are many applications where such a coin would be perfectly acceptable.
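To make the bad-seed point concrete, here is a minimal sketch using a generic 16-bit Galois LFSR (taps 0xB400, a textbook maximal-length configuration, not any specific generator from this thread): the all-zero state maps back to itself, while every non-zero seed cycles through all 2^16 - 1 non-zero states.

Code:
#include <stdint.h>
#include <stdio.h>

/* Minimal 16-bit Galois LFSR used only to illustrate the bad-seed problem.
 * Taps 0xB400 give a maximal-length sequence; this is a textbook example,
 * not the generator discussed above. */
static uint16_t lfsr_step(uint16_t s)
{
    uint16_t lsb = s & 1u;
    s >>= 1;
    if (lsb)
        s ^= 0xB400u;           /* apply feedback taps */
    return s;
}

int main(void)
{
    /* All-zero seed: the state maps to itself, so the output is stuck. */
    printf("step(0x0000) = 0x%04X\n", (unsigned)lfsr_step(0));

    /* Any non-zero seed walks through all 2^16 - 1 non-zero states. */
    uint16_t s = 1;
    unsigned long period = 0;
    do {
        s = lfsr_step(s);
        period++;
    } while (s != 1);
    printf("period from seed 0x0001: %lu\n", period);
    return 0;
}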
Quote:
I let my Raspberry Pi (only one core used) run for 70 days to prove that my LFSR64 + LCG64 passed the 16 TB test
That's fine if you only want to test one particular algorithm. If I'm trying to optimize for cycles, I run hundreds of different trials, sometimes thousands. I can't run hundreds of variations if each attempt takes weeks to run.
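For anyone trying the same kind of sweep, a common approach is to stream the candidate's raw output to stdout and pipe it into PractRand with a capped test length, so each trial stays short. A minimal sketch; the xorshift32 here is only a hypothetical stand-in for whatever variant is being screened, and the RNG_test command in the comment is the usual stdin form (exact option names can differ between PractRand versions).

Code:
#include <stdint.h>
#include <stdio.h>

/* Hypothetical candidate generator: a plain xorshift32 as a stand-in for
 * whatever variant is actually being screened. */
static uint32_t state = 0x12345678u;

static uint32_t candidate_next(void)
{
    uint32_t x = state;
    x ^= x << 13;
    x ^= x >> 17;
    x ^= x << 5;
    return state = x;
}

int main(void)
{
    /* Stream raw 32-bit words to stdout so the program can be piped into
     * PractRand, e.g.:  ./candidate | RNG_test stdin32 -tlmax 1GB
     * Capping the test length keeps each trial short, so hundreds of
     * variants can be screened instead of doing one multi-week run. */
    uint32_t buf[1024];
    for (;;) {
        for (int i = 0; i < 1024; i++)
            buf[i] = candidate_next();
        if (fwrite(buf, sizeof buf[0], 1024, stdout) != 1024)
            return 0;           /* stop when the pipe is closed */
    }
}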
Quote:
1 TB seems like very little to me to prove that an algorithm is random
1 TB for a 6502 system is exceptionally good.
Quote:
Time is not an issue; just use a system that can stay on for a long time because it is doing something else in the meantime anyway.
It's an issue for me, because having the test running on my PC slows it down for everything else, including the other random generators I'm testing.