No one (well, few professionals at least) will reinvent the wheel when it comes to standard scientific computations and methods. Like numerical math, linear algebra, etc.
Looking through the problem sets in the link, the majority seems to be asking for just that.
If you're wondering whether or not someone knows how to transpose a matrix, or find the eigenvalues, let them do that on the whiteboard. No need to leetcode-ify such problems, because with 99.99% probability they'll provide you with solutions that are subpar compared to industry standard packages. There's more to these problems than time and space complexity (numerical stability, for one).
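To make that concrete, here is a rough sketch (mine, not from the site) of what hand-rolled answers tend to look like next to the NumPy/LAPACK calls people actually ship. Note the toy power iteration only recovers the dominant eigenvalue and says nothing about conditioning:

    import numpy as np

    def transpose(A):
        # Whiteboard-style answer: list-of-lists transpose.
        return [[row[j] for row in A] for j in range(len(A[0]))]

    def dominant_eigenvalue(A, iters=500):
        # Whiteboard-style answer: power iteration, which only finds the
        # largest-magnitude eigenvalue and ignores stability entirely.
        A = np.asarray(A, dtype=float)
        v = np.ones(A.shape[0])
        for _ in range(iters):
            v = A @ v
            v /= np.linalg.norm(v)
        return v @ A @ v

    B = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
    print(transpose(B))               # hand-rolled
    print(np.transpose(B))            # what production code uses

    A = [[2.0, 1.0], [1.0, 3.0]]
    print(dominant_eigenvalue(A))     # ~3.618, largest eigenvalue only
    print(np.linalg.eigvals(A))       # LAPACK-backed, all eigenvalues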
EDIT: Also, you'll potentially lose a lot of high-quality candidates if you suddenly start to test people on methods they haven't worked with or seen in quite a while.
If you ask something like "please show us the equations for a support vector machine, and how you can compute an SVM", you could fail even world-class ML scientists if they haven't touched those for 10 years, which is a very real possibility in the current ML scene.
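(For reference, and as my own illustration rather than anything from the thread: that question is usually after the standard soft-margin objective, roughly minimize lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w . x_i + b)), and a from-memory answer tends to be a throwaway subgradient-descent sketch like the one below; all names and defaults here are made up for the example.)

    import numpy as np

    def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
        # Batch subgradient descent on the regularized hinge loss:
        #   lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w . x_i + b))
        # X: (n, d) features, y: labels in {-1, +1}.
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(epochs):
            margins = y * (X @ w + b)
            active = margins < 1                    # margin violators
            grad_w = lam * w - (y[active] @ X[active]) / n
            grad_b = -np.sum(y[active]) / n
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b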
I'd say that almost every ML interview I've had, or been part of, has been more of a big-picture whiteboard interview. Specific programming questions have ranked quite low on the list of things to prioritize.
I really enjoyed Andrej Karpathy's Zero to Hero videos, and I like the idea that you don't know something until you build it, so I made this site. I probably should have come up with a better title, because it's meant as a learning tool, not as interview prep like LeetCode.
First off, thanks! This does look like a fun way to learn things.
Secondly, FWIW, when I read the term 'exercises' in the HN title I interpreted that to mean exactly a learning tool and not interview prep. The term "Challenges" in the website title is maybe a little less specific.
I can appreciate that. It wasn't until I implemented a few matrix factorization routines that I appreciated the decisions that go into Eigen, etc. It wasn't until I tried it with SIMD that I appreciated the speedups and knew where to look to coax them out.
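Even something as small as an unblocked Cholesky makes that point. Here is a toy sketch of my own (not from the parent comment); everything a library like Eigen layers on top of it, blocking for cache, vectorized kernels, robust handling of barely-positive-definite inputs, is exactly what this version lacks:

    import numpy as np

    def cholesky_unblocked(A):
        # Textbook column-by-column Cholesky for symmetric positive-definite A,
        # returning lower-triangular L with A = L @ L.T.
        # No blocking, no SIMD, no error handling.
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        L = np.zeros_like(A)
        for j in range(n):
            L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
            for i in range(j + 1, n):
                L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
        return L

    A = np.array([[4.0, 2.0], [2.0, 3.0]])
    L = cholesky_unblocked(A)
    print(np.allclose(L @ L.T, A))   # True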
> No one (well, few professionals at least) will reinvent the wheel when it comes to standard scientific computations and methods. Like numerical math, linear algebra, etc.
> because with 99.99% probability they'll provide you with solutions that are subpar compared to industry standard packages.
Most orgs need drivers, but they interview as if they were hiring mechanics. If I'm a driver, I'm expected to drive different vehicles. Sure, I know how to do basic stuff like changing tires or oil, but I'm not going to know how to fix the engine or anything else under the hood, right?
This kind of interview question comes from the minds of software developers, because implementing things is the only thing they know how to do. When faced with a new area of knowledge, their instinct is to implement it in Python or some other language and imagine they have "learned" it. It doesn't occur to them that implementing things isn't that helpful for most math topics.