
I believe the fundamental problem is that most organizations interview in a style similar to testing in college: ask a question about a technical topic, evaluate whether the answer is correct. This doesn't correlate especially well with how well a candidate will program if you hire them, but it's the method people learned in school for evaluating someone's knowledge of a topic, so it's what they use. Most developers, and most managers, have almost no training on how to interview well in order to determine who would be a good candidate, so they fall back on the only halfway-similar experience they have.

Now, this raises the question of why our society's method of educating people (or of determining whether that education has been effective) correlates so poorly with their real-world performance, but that's a whole 'nother conversation.




The weird part is that this is almost exclusively found in organizations that think of themselves as "tech"-y.

I work somewhere on the border between neuroscience and machine learning. Interviewing in a "neuro" group usually involves a lot of talking--what you've done on project X, how would you approach Y, what do you know about Z. Coding does come up, but in the context of stuff that I've done or would do on the job.

Applying for a very similar job in a tech company usually starts with me reversing strings on a whiteboard or abusing C++ templates to calculate factorials or something....


Exactly. I work in Business Intelligence, with a background in both accounting and programming, and whiteboard coding never comes up in interviews. It's domain knowledge that matters, which in my case is mostly finance/accountancy.



