Hacker News

Someone should create a wiki to collect these types of math books/resource recommendations.

Basically a definitive list of math for AI/CS, with subpages for the various branches (ML, NLP, Haskell-esque typed FP, etc.), focused on self-learning or even hobbyist entertainment rather than on suitability for formal university classes - something that's hard to discern from browsing Amazon reviews, which are mostly anecdotes from people's old college days.

I remember when I started down the relearning-math rabbit hole last year and found, via the search feature, so many HN threads each recommending different math books.

It also doesn't help that a hundred new math books are written each year, thanks to the backwards university-fueled incentive system that rewards writing new ones.

I ended up spending a ton of time hunting down the best ones for each subject - which always seems like a great opportunity for optimization if someone takes a crack at it.

Although once you get past the basics of math, I've found a good general rule is to get one of the Dover [1] math books for the particular subject. These were written largely before the 1990s but are almost always still relevant, and they're consistently my favourites - notably, they're frequently far more succinct than the ones by university professors.

[1] https://www.amazon.com/s/field-keywords=dover+math



Wikis didn't catch on for whatever reason, but GitHub "Awesome ____" lists did:

https://github.com/sindresorhus/awesome

You can find one for machine learning:

https://github.com/josephmisiti/awesome-machine-learning


i suggest that it's more important to collect results (theorems) than texts. course descriptions on university sites would be a good thing to scrape (and maybe run NLP over?) for both.

WARN: the following is ranty.

it turns out that, no matter where you find them, the results in math are always the same. Hahn-Banach is Hahn-Banach, whether you read Folland or Rudin.

moreover, language and notation are very consistent across sources, and math tends to be very "optimized" for /human/ learning. i haven't read much ML theory (yet?), but my (uninformed) gripe about it so far is that the math feels "noisy" in much the same way that Java is often called a "noisy" language: things are often expressed using the minimal mathematical machinery (e.g. multivariate calculus rather than geometry for gradients). as a result, i find the ML stuff imposes a lot of cognitive overhead to decipher equations that could be written more simply.

so while the ML community might challenge the programming community to learn more maths on a regular basis, as someone who studied math in school and has historically worked as a non-ML developer, i would challenge the ML community to develop the theory in such a way that it encourages a more intuitive understanding of mathematics rather than the bare minimum (multivar calc + linear algebra) machinery.
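to make the "noisy notation" point concrete, here's a toy example of my own (not taken from any particular ML text): the gradient of a loss through a linear map y = Wx, written index-by-index the way ML writeups often do, versus in compact form:

```latex
% index-heavy style, common in ML writeups: with y = Wx,
\frac{\partial L}{\partial W_{ij}}
  = \sum_k \frac{\partial L}{\partial y_k} \frac{\partial y_k}{\partial W_{ij}}
  = \frac{\partial L}{\partial y_i} \, x_j

% the same fact in compact form: a single outer product,
% no indices to track
\nabla_W L = (\nabla_y L) \, x^\top
```

both say exactly the same thing, but the second is the one you can actually think with - and it's the kind of statement i wish the theory led with.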

----

for all this i shall plug one (perhaps lesser-known?) multivar calc text that i found recently and i think looks pretty good:

Casper Goffman's _Calculus of Several Variables_


Metacademy might be what you are looking for.

It's easy to find textbook recommendations just by searching "best textbook for x". It's true most people just recommend the best-known books that they're familiar with, and it's hard to compare two textbooks side by side. But searching does filter out the garbage, of which there is a lot.

Best advice is to look at the table of contents and see what it covers and doesn't cover.


> Someone should

By all means, go for it!



