
I was talking to a friend of mine about just this issue recently. He and I have both been on both sides of the interview table at different times. I have asked interviewees to solve toy programming problems and the funny little logic puzzles, just as I have been asked the same sort of things in the past. But eventually I too came to the realisation that these kinds of things bore little, if any, real relevance to the work the interviewee was actually going to be doing. Is it usually going to be the case that the guy you hire will have to write his code with unusual speed, under the pressure of three or more people looking over his shoulder and judging his answer, without the benefit of Internet access or Intellisense or what have you?

My approach is to throw out the whiteboard coding and logic questions whenever I can and to look for examples of a prospective employee's real-world work, namely contributions to open source (GitHub makes this easier to check), and failing that, private real-world examples of code they can share with me directly.

I realise that not everyone will be able to provide such examples of real-world work, and yes, looking over someone's pre-existing project will probably take more time than seeing if they can reverse a string. But maybe this is also part of the problem: hiring practices in some places have gotten lazy.

I can't think of a way to get rid of FizzBuzz entirely, in 100% of cases, or of the guy who has to escape the fire spreading from one side of the island, and so on, but I really wish I could, because it just doesn't feel right.
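
To be concrete about how little these exercises test: FizzBuzz is usually stated as "print the numbers 1 to 100, but 'Fizz' for multiples of 3, 'Buzz' for multiples of 5, and 'FizzBuzz' for multiples of both." A typical answer is only a few lines; here is one in Python, purely as an illustration of the kind of thing candidates are asked to produce at the whiteboard:

    # FizzBuzz: print 1..100, substituting "Fizz" for multiples of 3,
    # "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)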




I'm constantly surprised, when I read articles like this one or when I conduct interviews myself, that no one considers how the employee deals with a project as a whole. Toy logic problems and algorithm optimisation tell you nothing about how good that person is at actually working as a software engineer. Do they care about unit tests and code coverage? If they find a process missing - no continuous integration or something - do they just mumble and make do, or do they find out why and, if necessary, implement it? Are they capable of delivering quality software, or can they merely write neat, optimised algorithms?


Those are questions a manager should be asking, i.e., outside the scope of the interview I'm conducting. They're also the easiest to fake: everybody knows the answer is to say they write test cases for every change before they check it in. That's far easier to memorize than any coding puzzle solution.

I've had some frustrating experiences where an obviously talented candidate was nearly passed over because they didn't use the right software engineering buzzwords, while at the same time the manager came close to overruling the technical veto because they felt some other idiot sounded like a good fit.

Teaching someone to write unit tests is, imo, far more likely to succeed than teaching someone to understand recursion.
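
To make the contrast concrete, here is roughly what I mean by the two skills, sketched in Python; the reverse_string helper and the test names are made up for illustration, not taken from any real codebase:

    import unittest

    # The "teachable" skill: a plain unit test for a hypothetical
    # reverse_string helper. Mechanical once you've seen the pattern.
    def reverse_string(s):
        return s[::-1]

    class TestReverseString(unittest.TestCase):
        def test_reverses_characters(self):
            self.assertEqual(reverse_string("abc"), "cba")

        def test_empty_string(self):
            self.assertEqual(reverse_string(""), "")

    # The "harder to teach" skill: the same task done recursively,
    # which only clicks once base case / recursive case thinking does.
    def reverse_string_recursive(s):
        if len(s) <= 1:
            return s
        return reverse_string_recursive(s[1:]) + s[0]

    if __name__ == "__main__":
        unittest.main()

The first is a pattern you can show someone in an afternoon; the second depends on a way of thinking that some people take much longer to pick up.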





