Hacker News

I don't want to hold up whiteboard interviews' focus on algorithms as the be-all, end-all. They're a poor proxy for day-to-day skills: the hard stuff is organizational, knowing all the things that can go wrong, and technical design that's both flexible under changing conditions and easy to work with. None of those is really amenable to probing in a 45-minute interview, or even several hours of pair programming.

But it's about friction: I want my coworkers focused on the genuinely hard problems, not spending a day writing a BFS. The current interview process does manage to probe for that.
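For concreteness, the kind of exercise I mean is a graph traversal along these lines; a minimal sketch in Python, with an adjacency list I made up for illustration:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search over an adjacency-list graph.
    Returns the order in which nodes are visited from `start`."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

# A small undirected graph for illustration.
graph = {
    "a": ["b", "c"],
    "b": ["a", "d"],
    "c": ["a", "d"],
    "d": ["b", "c"],
}
print(bfs(graph, "a"))  # ['a', 'b', 'c', 'd']
```

Ten minutes of work if you know it, a lost day if you're reinventing it under deadline.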

Going a bit deeper, the whiteboard interview process is a good proxy for ability to prepare over the medium term (a month or three of consistent studying should give you as good a chance as anyone to get into a generalist position at a prestige company) and of IQ. The latter is controversial and most organizations can't test for it directly (owing to legal concerns), but a relatively high IQ is a core requirement of technical roles, and whiteboards provide a solid proxy for that when coupled with the opportunity to prepare for them beforehand.

That said, I'd always go for someone who has the ability to deliver on complicated, large projects over someone great at whiteboards or who has a high paper IQ. It's just that it's pretty much impossible to evaluate for that in a way that works on a general application pool.




This is a very acute answer. I actually do think well-structured algorithm questions that are variants of common ones might be strongly g-loaded, i.e. they test knowledge + IQ.

I would say big companies that hire lots of new or generalist programmers benefit greatly from them. These types of questions test for intelligence at scale without being an outright IQ test. In fact, I think I read somewhere that Google engineers' performance was strongly correlated with how well they did in their interviews.

Sure, these questions might not test for conscientiousness, curiosity and general agreeableness, but that's why you have the other portions of the interview.

All this being said, I don't necessarily think these questions work at most companies, where the engineering teams are significantly smaller and you can spend a lot of one-on-one time probing knowledge and experience and reviewing coding exercises. However, I would still say basic DFS/BFS and using a hash to solve a problem are a must. I use them on occasion, even as a generalist.
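The hash question I have in mind is something like the classic two-sum screen; a minimal sketch, with inputs chosen for illustration:

```python
def two_sum(nums, target):
    """Find indices of two numbers summing to `target`,
    in one pass and O(n) time via a hash map."""
    seen = {}  # value -> index of where we saw it
    for i, n in enumerate(nums):
        complement = target - n
        if complement in seen:
            return seen[complement], i
        seen[n] = i
    return None  # no pair sums to target

print(two_sum([2, 7, 11, 15], 9))  # (0, 1)
```

The point isn't the puzzle itself; it's whether the candidate reaches for the hash map instead of the O(n²) nested loop.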


I think the goal of hiring is always: "We have a particular business problem, and need to determine: if we pay this person will they be able to help us solve it?"

As much as I think certain companies shoot themselves in the foot by spending their interviews asking trivia questions about e.g. Rails (because that's what they use), it's not unreasonable that they might simply place a higher value on someone who can be productive in their Rails world immediately than on someone who, while excellent, nevertheless needs a week or several to onboard onto the ecosystem. In other words, they think the goal question is answered in the affirmative more strongly for someone already in the ecosystem. A dangerous assumption, but possibly valid. There are always tradeoffs in the specificity and category of your questioning, but interviewers need to keep in mind how their questions contribute to answering the overall goal.

For larger companies, there are more problems to solve, and more likely a need to get people working on them sooner rather than later, so the incentives push toward addressing the goal of hiring (if not for all positions, at least for many) by answering a simpler question, "is this person smart and gets things done?" (which is just high IQ + conscientiousness), and hiring en masse. If you put such people to work on any of the problems, you can rightly expect they will advance them some amount, even if they first have to ramp up on a programming language / framework / whatever feature of the problem's environment, and as time goes on they can shift around the company to where they're even more effective.

In addition to being fast, this approach is also very fair with respect to people's backgrounds. Suddenly it doesn't really matter whether you have a thousand widely used GitHub repos, a 10-year experience head start or a fresh bootcamp certificate, a degree or not, or prior knowledge of the framework; people from the "wrong background" can still be hired if they can demonstrate they are sufficiently smart and conscientious to start working on one of the various problems you needed done yesterday.

The fact that we have to proxy this with whiteboard hazing sucks; it's expensive for everyone involved compared to just asking for ACT/SAT scores or the results of a previously taken IQ test when available, and giving an IQ test when not. In my own interviewing I try to proxy for "smart & gets things done" (because that's all my teams at my current company have needed) in a less asymmetric/aggressive way, while answering other questions, since we can't hire everyone. But even aggressive whiteboarding is better than every job interview screening for highly specific backgrounds and/or trivia knowledge...

And I agree that BFS/DFS is appropriate, though needlessly so (i.e. there are even simpler questions that take less time), as a first step in answering the business goal which is simply verifying: can this person who says they can program, actually program? Unless you're willing to train people to program, you have to start the cutoff somewhere. It's modestly appropriate to answer the question of: does this person know their craft's basic tools?


> Going a bit deeper, the whiteboard interview process is a good proxy for ability to prepare over the medium term (a month or three of consistent studying should give you as good a chance as anyone to get into a generalist position at a prestige company) and of IQ.

This is where I have a problem. Why are we the only industry that has this expectation of months of preparation? Engineers don't do it. Doctors don't do it. Professors don't do it. Why?



