Mathematicians have a general aversion to anything that uses the digits (or bits) of a number, because digits are usually too provincial: numbers that are 'interesting' in one base tend to be boring in every other base, yet they still catch people's eyes. The aversion is like a self-defense mechanism for warding off quacks, and it sometimes catches legitimate uses as well.

Take Cantor's diagonalization argument, for example, as used to show that R and Z have different cardinalities. It's a legitimate use of digits, but huge numbers of incorrect counter-examples (which are really just examples) surface around it by misusing decimal expansions. Even the expansion used in the actual diagonalization needs careful handling: since 0.10000... = 0.09999..., decimal expansions aren't unique, so the construction has to make sure the number it produces isn't just the other expansion of something already in the list. A quick sketch of the standard fix follows.
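For what it's worth, here's a minimal Python sketch of that fix (the names `diagonal` and `digit_rows` are mine, purely for illustration): pick every new digit from {4, 5}, so the constructed number contains no 0s or 9s, has a unique decimal expansion, and still differs from the k-th listed number in its k-th digit.

    # Minimal sketch of the diagonal construction, sidestepping the
    # 0.0999... = 0.1000... ambiguity: the new digit is always chosen
    # from {4, 5}, so the constructed number has a unique expansion
    # (it contains no 0s or 9s) and differs from the k-th number in
    # its k-th digit.

    def diagonal(digit_rows):
        """digit_rows[k] is an iterable of the decimal digits of the
        k-th real in some claimed enumeration of (0, 1)."""
        out = []
        for k, row in enumerate(digit_rows):
            d = list(row)[k]                # k-th digit of the k-th number
            out.append(5 if d != 5 else 4)  # differs from d, never 0 or 9
        return "0." + "".join(map(str, out))

    # Example: the diagonal of these three rows is 1, 9, 9, so the
    # construction prints 0.555, which differs from row k at digit k.
    rows = [[1, 2, 3], [0, 9, 9], [0, 0, 9]]
    print(diagonal(rows))  # 0.555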

Your use seems fine (and since you're dealing only with integers, the expansion is unique), but most of the time, when a mathematician sees someone using digits, they get a sense of unease and wonder whether there's a better way to do it.



