
I think you've got this backwards. You don't need to understand the notation; you need to understand the math. If you don't understand the notation, you probably don't understand the math in the first place, so learning the notation alone won't help. Learning how a matrix is written is useless without understanding what a matrix is.

You need to find out exactly what kind of math the paper uses and learn that; the notation will come along with it. If you're interested in machine learning, then you should probably start with:

* Proofs and logic (a general requirement for understanding math)

* Linear algebra

* Statistics

* Some multivariable calculus



Partly, perhaps. For me, a large part is the notation. I know conditionals, I know loops (as in for loops), I know SQL... but if you had shown me logic, set theory, a sigma, or, uh, whatever the sigma-but-for-multiplication is called (a capital pi, apparently), I wouldn't have known what you meant a few years ago. It's not only about understanding the math; it's also the notation.
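
(Once someone finally told me, it turned out both map directly onto loops. A minimal sketch in Python; the function names are mine, just for illustration:)

    # big sigma: sum of f(i) for i from 1 to n, accumulated in a loop
    def big_sigma(f, n):
        total = 0
        for i in range(1, n + 1):
            total += f(i)
        return total

    # big pi: same loop, multiplying instead of adding
    def big_pi(f, n):
        product = 1
        for i in range(1, n + 1):
            product *= f(i)
        return product

    print(big_sigma(lambda i: i * i, 4))  # 1 + 4 + 9 + 16 = 30
    print(big_pi(lambda i: i, 4))         # 1 * 2 * 3 * 4 = 24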

I still don't know how to read things like curly braces or matrices or a dozen other constructs I don't even know the name for, and figuring it out is usually more work than it's worth. As a result, I don't understand Wikipedia articles that are supposed to be introductory and understandable to the masses; because the writers are being fancy, the articles are only understandable to a privileged few. Often the concepts, if I do bother to understand them or look them up elsewhere, are rather simple.
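
(My best guess at the curly braces I keep running into, written out in LaTeX; both turned out to be simple once someone explained them:)

    % set-builder notation: "the set of all natural numbers x below 5"
    S = \{\, x \in \mathbb{N} \mid x < 5 \,\}

    % a piecewise definition: "|x| is x when x is non-negative, else -x"
    |x| = \begin{cases} x & \text{if } x \ge 0 \\ -x & \text{if } x < 0 \end{cases}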

It also doesn't help that people use notation inconsistently or ambiguously. Sometimes a line over some math means the average (or "mean", to use the ambiguous lingo) and sometimes it marks an infinitely repeating digit.
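
(The same bar, two unrelated meanings, in LaTeX:)

    \bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i    % the sample mean
    0.\overline{3} = 0.333\ldots = \frac{1}{3}  % a repeating decimal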

Or the lingo: a friend of my girlfriend was baffled that I never had X in school and didn't know what it was (I think the English word is "derivatives"). She kind of explained it to me and I kind of understood, without recognising it. Later that evening I used it, unknowingly, in some programming. When I showed my girlfriend what I'd made, she pointed it out to me.

Heck, I recently used logarithms fairly intuitively even though they were never properly explained to me. I just knew the formula for bits of entropy, log(possibilities)/log(2), by heart (which, it turns out, is just log base 2 via the change-of-base rule) and managed to write something that converts a number from any base into any other base (e.g. base 13 to base 19). I can do math; it's just that nobody bothered to tell me about the fancy symbols. To me, the symbols mainly seek to obfuscate, look smart, and ensure job security.
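
(Roughly the idea, sketched in Python; not my original code, and the function names are just for illustration. The log trick tells you how many digits you'll need; the conversion itself is repeated division:)

    import math

    def digits_needed(value, base):
        # change of base: log_b(value) = log(value) / log(base);
        # a positive integer takes floor(log_b(value)) + 1 digits.
        # (floating point can be off by one near exact powers of base)
        if value <= 0:
            return 1
        return math.floor(math.log(value) / math.log(base)) + 1

    def to_base(value, base):
        # repeated division yields the digits, least significant first
        if value == 0:
            return [0]
        digits = []
        while value > 0:
            digits.append(value % base)
            value //= base
        return list(reversed(digits))

    def from_base(digits, base):
        # Horner's rule folds the digits back into a plain integer
        value = 0
        for d in digits:
            value = value * base + d
        return value

    # base 13 -> base 19, going through a plain integer in between
    n = from_base([12, 0, 5], 13)  # 12*13^2 + 0*13 + 5 = 2033
    print(digits_needed(n, 19))    # 3
    print(to_base(n, 19))          # [5, 12, 0]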


Pretty much every math textbook I've read has defined the notational conventions early on.

(I'm not saying this is the case with every math text, just the ones I've read that are intended to teach the reader the subject.)

I see all these comments about reading papers with confusion, and I get the impression that the commenters skipped the prerequisite textbooks for which they're the right audience; that's generally how you pick up notation.


I second this. The notation was driven by the concepts, problems, and approaches. If the notation feels bizarre, it's because you don't see how it improves or clarifies the underlying concepts, problems, and approaches.

Learning the notation itself can be useful as a way to point you towards relevant concepts, e.g., “Where did this symbol come from and why did it become the convention? What concept/problem/approach motivated it?”

But if your goal is to memorize a bunch of symbols, you're gonna have a bad time.


I agree: learning the specific notation becomes trivial once you understand the underlying mathematics.



