
It's 128 bits per card; over a 52-card deck that's vastly more than the ~226 bits (log_2(52!) ≈ 225.6) a uniform shuffle actually needs.




You could use something like 12 bits of randomness per card (roughly log_2(n^2), which for n = 52 is about 11.4) to get the probability of reusing a number down to a manageable level, check whether you've reused a number (basically free once you've sorted), and then repeat the whole process if you did.
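For a 52-card deck, that retry loop might look like the following (a minimal sketch; the function name is mine, and I check for collisions with a set rather than a post-sort scan, which is equivalent but shorter):

```python
import random

def shuffle_by_random_keys(cards, bits=12, rng=random):
    """Shuffle by sorting on short random keys, rejecting and retrying
    on any key collision so the result stays exactly uniform."""
    while True:
        keys = [rng.getrandbits(bits) for _ in cards]
        if len(set(keys)) == len(keys):  # no reused number: accept
            return [card for _, card in
                    sorted(zip(keys, cards), key=lambda kc: kc[0])]

deck = list(range(52))
shuffled = shuffle_by_random_keys(deck)
```

With 12-bit keys and 52 cards, a collision happens on roughly a third of attempts, so the expected number of retries is small.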

Or you could have a lazily calculated infinite-precision random number per card, using more like 6 bits per card in expectation. Other than using more (and annoyingly variable) memory, this may well be faster than a properly unbiased Fisher-Yates.
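A sketch of the lazy-bits idea in Python (class and method names are mine; a real implementation would pack the bits rather than keep a list of ints). Each key behaves like a uniform real in [0, 1), but its binary digits are only drawn when a comparison actually needs to look at them:

```python
import random
from functools import total_ordering

@total_ordering
class LazyRandom:
    """A uniform random real in [0, 1) whose bits are generated on demand."""
    def __init__(self, rng=random):
        self.bits = []
        self.rng = rng

    def _bit(self, i):
        # Generate bits lazily, only as far as a comparison requires.
        while len(self.bits) <= i:
            self.bits.append(self.rng.getrandbits(1))
        return self.bits[i]

    def __lt__(self, other):
        # Compare bit by bit; terminates with probability 1 for
        # distinct draws, since some bit eventually differs.
        i = 0
        while True:
            a, b = self._bit(i), other._bit(i)
            if a != b:
                return a < b
            i += 1

    def __eq__(self, other):
        # Two independent draws differ with probability 1.
        return self is other

deck = list(range(52))
shuffled = sorted(deck, key=lambda c: LazyRandom())
```

The sort only ever consumes as many bits of each key as it takes to break ties between the keys it happens to compare, which is where the low expected bit count per card comes from.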

Or you could assign a few bits per card, sort, and then recurse on each group of cards that sorted into the same bucket.
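The bucket-and-recurse variant might be sketched like this (the function name and the 3-bit bucket width are illustrative choices). It is equivalent to sorting on infinite-precision keys consumed a few bits at a time: cards that land in the same bucket are exactly the ones whose keys tie so far, so recursing on them draws further key bits:

```python
import random

def bucket_shuffle(cards, bits=3, rng=random):
    """Shuffle by scattering cards into 2**bits random buckets,
    recursing on any bucket that received more than one card."""
    if len(cards) <= 1:
        return list(cards)
    buckets = [[] for _ in range(1 << bits)]
    for card in cards:
        buckets[rng.getrandbits(bits)].append(card)
    out = []
    for bucket in buckets:
        out.extend(bucket_shuffle(bucket, bits, rng))
    return out

deck = list(range(52))
shuffled = bucket_shuffle(deck)
```

The recursion terminates with probability 1, since any group of two or more cards eventually draws differing bits.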

In summary, there are lots of genuinely unbiased solutions (assuming a perfect binary RNG), and they all boil down to something roughly equivalent to rejection sampling: draw random bits, and draw more (or start over) whenever a tie would bias the result.



