
I have had this same experience of finally understanding a mathematical idea by seeing it implemented in a programming language. You can always eventually understand a program, because there can't be any ambiguity; otherwise the compiler couldn't decide how to compile it. Math isn't supposed to have ambiguity either, but it creeps in because the notation carries so many conventions and unstated assumptions (granted, sometimes it's expressing something broader than can be captured in code). Higher mathematics as a field seems to me like a programmer who uses single-letter variables, never writes comments, and really likes clever bitwise operators.
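To make that concrete with a toy example of my own (not from the thread): a notation like f²(x) can mean f(f(x)) or (f(x))² depending on the author's convention, but code forces you to pick one reading and spell it out:

    import math

    # In notation, f^2(x) may mean composition or pointwise squaring,
    # depending on convention. In code the two readings must be written
    # as distinct functions; the ambiguity cannot survive compilation.
    def f_iterated(f, x):
        return f(f(x))      # f^2 read as f applied twice

    def f_squared(f, x):
        return f(x) ** 2    # f^2 read as the square of f(x)

    print(f_iterated(math.sin, 1.0))  # sin(sin(1.0)) ~= 0.7456
    print(f_squared(math.sin, 1.0))   # sin(1.0)**2   ~= 0.7081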



Every math paper I've read or written has spelled out definitions for all the notation it uses, often with a dedicated section called "Notation". Don't you read math papers? Or do you mean applied math papers? Applied math is more like engineering, so the authors get sloppier with notation; I'd guess the same thing happens when computer scientists do maths.

But that isn't really a problem with math; it's a problem with sloppy people. Or you just didn't read the notation sections explaining everything.


As a programmer and occasional electronics hobbyist, most of the math I encounter is in whitepapers, physics/EE texts, and the like; I don't really read papers in pure mathematics. Perhaps the difficulty is due to sloppiness by non-professional mathematicians.



