Hacker News
Ask HN: Should we change the way we do software interviews? What's stopping us? (hxngsun.wordpress.com)
51 points by d0paware on March 25, 2018 | 90 comments



It's entirely bullshit, and I regret being on both sides. I used to do coding screens, but I feel like some moderate knowledge of syntax and a decent education is enough to skip the rest. The whole cargo cult of coding screens came up during the 00's when Google made it look important, and you're correct that it's largely an exercise to convince the team that they're special.

Software engineering is the science of avoiding coding! And the code that these elite teams produce is often pure shit anyways.


The things they ask you to code too... anyone who would ever let you commit that code to the codebase isn’t someone you’d want to work with.

I’ve been trying to cook up plausible examples but it’s a challenge.


Do any of the solutions I've personally suggested or mentioned here in this thread seem reasonable to you though? I was hoping we could at least talk about trade offs or pitfalls of various things people have suggested.

I feel like maybe I screwed up here with the thread, since it got flagged early on for having "Ask HN" in it, and people responded with a knee-jerk reaction, going straight to the comments section. Not sure if it's possible to get a mod to rename this and remove the "Ask HN". I originally tried to paste my content directly into the self-post textbox, but at ~9k characters it exceeded the 2k limit.


At my work we give coding exercises to interviewees over screen-sharing, with lots of time and as much interaction as they want. The questions we pose are simple, do not require years of algorithmic experience, and generally involve nothing beyond what somebody with a couple of introductory software courses could write. Candidates choose their own programming language and environment.

The problems can be solved naively with a straightforward solution, or much more optimally after some further considerations.

The approach is absolutely not b.s., and we observe:

* some candidates completely misunderstand the very straightforward problems posed and their solution has no hope of addressing the issue

* some candidates have a rough idea but can't write the two nested for-loops required, for example, for the naive solution to one particular problem

* some candidates don't ask for clarification or confirmation that they understand the problem correctly even when they'd benefit -- others do and make better progress

* some candidates don't consider the full range of allowed inputs for the problem statement and assume the happy path. This is plenty forgivable, but when it's suggested or pointed out, some candidates find an easy way to resolve it, others jump through interminable hoops, and others have a communication deficit that doesn't let them discuss the issue at all

* when discussing perceived flaws in implementations, some understand the language used to discuss software, others have a hard time communicating or answering questions about their implementations -- e.g., when one candidate wrote a recursive function all of whose branches ended up calling the same function recursively and we asked "when will your recursive function ever stop?" in many forms, they could not see or understand the intent of our question

* some candidates misunderstand writing a solution addressing a particular test case vs a general solution

* a few candidates are able to see the more optimal solution and write a good second algorithm based on that approach

* unfortunately, very few candidates resort to stepping through code in their IDE to see or demonstrate what happens
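The never-terminating recursion described a few bullets up can be illustrated with a minimal hypothetical example (the function names and scenario are mine, not the interviewers'):

```python
# Buggy: every branch recurses, so the call never terminates.
# Python eventually raises RecursionError when the stack limit is hit.
def countdown_bad(n):
    if n > 0:
        return countdown_bad(n - 1)
    return countdown_bad(n)      # the "base case" still recurses


# Fixed: one branch returns without a recursive call.
def countdown(n):
    if n > 0:
        return countdown(n - 1)
    return 0                     # base case actually stops
```

A candidate who can't answer "when will your recursive function ever stop?" is missing exactly the distinction between these two versions.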

We've eliminated many candidates we feel would have been a poor match, based on inability to understand requirements, to consider all possible input situations, to write naive algorithms, or to communicate effectively about the interview problems. (Note that we also let candidates continue addressing the problem as "take-home" after the interview if they show promise.)

One junior candidate we recently hired in a South Asia office did quite well on the algorithms portion, but unfortunately did not know the Linux/UNIX/macOS command line, and assumed she was going to fail that portion of the interview -- she'd worked only in Windows contexts before. We pointed her to a Linux-in-the-browser site and just asked her to Google/search for solutions to the various fairly basic operations we asked her to perform on the command line -- she was surprised we'd let her do that. She struggled through and found solutions, and that's exactly what we wanted to see. Nothing exposes a candidate's suitability more than an open environment with a problem to solve and the opportunity to discuss it.


You have described a process that helps your company eliminate candidates efficiently.

Your company has become very good at finding what people are not good at.

Your company is not alone in that approach and this is exactly where the tech industry interview process fails.

It looks for the wrong thing.


I would fail utterly at a real coding interview. I've been out of school too long for that crap. And also, my tolerance for it has dropped as well.

But, I'm now a dev manager so I get to sit on the other side. I've been doing interviews in the last week for a dev position and I don't subject candidates to coding. I talk with them about what they work on, one of my senior devs sits in and goes even deeper -- learning what they know without even really asking formal questions, for the most part. Past that I care more about them being smart enough and willing to learn new things, and easygoing enough to get along with the rest of the team. Hell, that's at least half of what I care about to be honest.


I encounter people who can make pleasant conversation about programming all day... but literally can’t seem to actually program in practice. Do you see that? How does your interview check for that?


My experience has been that devs who struggle will struggle with a simple for loop. This seems consistent with what I've heard from an Amazon bar raiser, and with the existence of fizz buzz. I'd be reluctant to give up this one sanity check: asking a for-loop question in real time, on a whiteboard or on paper, preferably in pseudocode.
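For reference, the fizz buzz sanity check mentioned here is typically something like this (a minimal Python sketch):

```python
def fizzbuzz(n):
    """Return the classic sequence for 1..n: multiples of 3 become
    "Fizz", multiples of 5 become "Buzz", multiples of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

Anyone comfortable writing a basic loop with a couple of conditionals clears this bar in a few minutes, which is exactly what makes it a useful floor rather than a real test.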


At one point for phone screens, I just asked people to code up strcmp. I got some amazingly bad answers.
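For context, strcmp compares two strings and returns a negative, zero, or positive value depending on their order. A reasonable phone-screen answer, sketched here in Python to mirror C's semantics (C returns the difference of the first mismatching bytes; any negative/zero/positive convention is acceptable):

```python
def strcmp(a: str, b: str) -> int:
    """Return <0 if a sorts before b, 0 if equal, >0 if a sorts after b."""
    for ca, cb in zip(a, b):
        if ca != cb:
            return ord(ca) - ord(cb)
    # One string is a prefix of the other (or they're equal):
    # the shorter string sorts first.
    return len(a) - len(b)
```

The common "amazingly bad" answers tend to forget the prefix case entirely, or return only True/False.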

Some of this code-screen mistrust seems to come from the flood of poorly-qualified candidates; maybe there's a more diverse pool than before, but it comes at the cost of lots of wannabes.


I have not yet run into anyone who could deep-dive into the details of what they're working on and talk about the good, bad, and ugly, who turned out to be incapable of coding. Maybe I've been lucky, but I think it's pretty hard to fake that level of knowledge.

For remote developers I do deviate slightly and actually ask them to do a little coding, but that's because it's always someone in Hyderabad, and the cultural differences make a deep-dive conversation with body language a lot harder to pull off. So I ask them to whip up a demo program that scores a game of bowling, take as long as necessary, and bring the results to the interview to show me what they did. Not much, but it works (and scoring a bowling game is actually not a bad test -- not as trivial as it sounds, but it doesn't take a whole evening of work just for an interview).
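A sketch of the bowling-scoring exercise (my own illustrative version, not the commenter's actual test) shows why it's not as trivial as it sounds -- strikes and spares pull bonus pins from later rolls:

```python
def bowling_score(rolls):
    """Score a complete ten-pin game from a flat list of pin counts."""
    score, i = 0, 0
    for _frame in range(10):
        if rolls[i] == 10:                    # strike: 10 + next two rolls
            score += 10 + rolls[i + 1] + rolls[i + 2]
            i += 1
        elif rolls[i] + rolls[i + 1] == 10:   # spare: 10 + next roll
            score += 10 + rolls[i + 2]
            i += 2
        else:                                 # open frame
            score += rolls[i] + rolls[i + 1]
            i += 2
    return score
```

A perfect game (twelve strikes) scores 300, a gutter game scores 0, and rolling five on every ball scores 150 -- three handy cases for a candidate to reason through.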


Great approach! What you describe is pretty close to my understanding of what a good hiring process should be like.


I interview in the same way and have never had a problem with false positives. I think you overestimate the ability of people to bullshit their way through without triggering any red flags for the interviewers.


Honest? Smart? Positive attitude? Fun to work with? Willing to learn new stuff? Learn new stuff on your own because you like coding?

Ok. We're done. Now check references.

We'll follow that up with an audition, show up on a Saturday and pair up with team members for some random hacking, see how you handle working closely with people on a problem.


I actually love the idea of teaming up with some people to do some random hacking. That sounds like a super fun interview! But it also sounds really time-consuming for the team members as well as the candidate, especially if the session falls flat.


Setting aside why you think "likes to code" is an important metric when evaluating a candidate for a _job_, is it common at your company to interview people on weekends?


A job as a developer is making technology do things people want. It is important to both like people and technology -- and be able to work in both seamlessly.

Whenever people are available we can see how they'd work out. Sometimes it's a Saturday, sometimes a Tuesday. I think you're reading far too much into my comment than was intended.


We absolutely should. Here's what I'm doing to help change it:

I'm launching a new venture that will try to help connect great engineers with great companies. Companies with great teams, competitive benefits, and the belief that engineers shouldn't have to go through bullshit interview practices and shouldn't have to study 150 hours for a round of interviews.

The flow is pretty simple: a quick Skype/Hangouts call to learn about you, what you've done, what you want, and what you like/don't like.

Next, you'll get access to a git repo. In that git repo is a project with real code and a few features/bug fixes for you to implement. You commit against that repo just like you would at work. After a certain amount of time, you'll lose access to that repo.

One or more engineers will review the repo with no knowledge of your age, gender, work history, etc. Based on that feedback, we match you with companies where we think there will be a good fit. If there is mutual interest, you meet the companies in person for a final round where you can't get railroaded with bullshit questions.

Email me @ interviewingisbroken@gmail.com if any of these are you:

1. You're in the Bay Area or NYC and interested in fair interviews with great companies

2. You're in the Bay Area or NYC and work for a company that wants to offer fair interviews to get great engineers

3. You're a great engineer who would like to get paid to help create interview projects or review candidates' code -- we're paying up to $100/hr. Great side-hustle opportunity.

I wasn't planning on announcing anything about this yet but this post got me so fired up.


It's really easy to say "interviewing is broken" and get a bunch of people excited and agreeing. It's almost like saying "politics is broken": everybody agrees, but nobody agrees on how to fix it.


I can tell you we've gotten rave reviews from the companies we've partnered with, and almost all of the candidates we've tested with prefer our process to the alternative. I can't fix interviewing for everybody, but I think we're on a pretty good track for at least some of the world. I'll take that, for now.


Love this, although not living in that area.

Just to chime in: I'm a PhD student in CS/machine learning with a competitive enough CV to interview anywhere, and I have gone through the motions at many companies previously (e.g. for interning), and there is nothing I hate more than coding interviews.

It literally makes me hate companies, it turns me off from interviewing at a place, and it makes me deeply resent them even if I get an offer after a ridiculous series of interviews.

After you have subjected me to a terrible/demeaning interview process, I might still work for you, but I will never be loyal or feel any inclination to contribute more than I need to for personal gain, because you have established the nature of the relationship via the interview. Sorry for the rant.

I hope you succeed, so very much.


I think you need to grow up some and experience this from the other side. In the past I've felt the same kind of contempt for this process, but after having been on the other side, I respect the difficulty of accurately identifying good candidates.


Honestly, I don't think I do. I have interviewed many people for very code-heavy research projects, and not once have I felt that asking them to do a whiteboard coding exercise would have contributed meaningfully to my evaluation.


Here is an idea for you, aspiring recruiter: you want great engineers to interview? Pay them $150/hour to work on the GitHub repo you want to use in the interview.

Everyone wins -- engineers don't get their time wasted, and interviewing companies are quite interested in not wasting theirs.


I addressed this in another comment, but just to respond to you: we've done test interviews where we paid candidates $80 an hour. The expense works into our model, so I would anticipate seeing some form of that in the future.


This is essentially how WordPress does interviews. It's the best process I've seen.


So your solution is to have people work a second job for free, with the hope that maybe they'll get a job after a while?


We're still new to this, but we have tried a few interviews where we pay the candidate $80 an hour (~$160k salary annualized). On average, good candidates complete the projects in about 3.3 hours, so that's a ~$270 expense to us, which we've been fine with.

Candidates who did the unpaid version saw the clear value: they invest 3-5 hours once, and potentially cut out 20+ hours of first round interviews, or even all the studying like the OP did.

Nobody that we've tested in the beta complained about a few hours of free "work" - they were actually excited about it.


I really like that you compensate candidates for their time. I can't think of anyone else who follows that model presently, so thank you for keeping in mind being fair to prospective hires. Are you all SF-only?


We're just in SF/NYC right now, but this model travels well. Hopefully you'll see us in a lot more places by the end of the year.


London please. I'll be a customer.


Do any of those SF/NYC companies hire remote employees?


To be fair to the OP, I think there are ways to structure throwaway projects so that they contain flaws and are useless to a real company, but are complex enough to gauge experience, familiarity, etc. I don't think GitHub would be optimal, since then people could clone these throwaway projects and create brain dumps.


We actually pay contracted engineers (up to $100/hr) to write these projects. They're usually recreations of interesting problems those engineers have had to implement or work on in their careers. Some are just forks of open-source projects.

We don't turn around and use the code the candidate wrote. If you consider the cost to us to write the project AND the cost to have it reviewed N times, we'd be overpaying drastically for that feature to be implemented.


Why don't you get the companies who are hiring to have their own engineers set up and write some code?

Code review works both ways.


> In that git repo is a project with real code and a few features/bug fixes for you to implement.

Is this from you or from the company? If it’s from you what kind of project is it? Web development? It seems like the technology stacks are so diverse it’s hard to give a coding test that’s applicable to a wide range of companies.


We have a library of these projects. We add more all the time. We built the first set. Now we're paying engineers to build these based on really interesting problems they've solved in their careers.

We don't reuse the projects across different stacks. We might borrow some of the underlying themes or principles, but they're different projects for different stacks, each written by somebody with very precise domain expertise.


So this is just minor tweaks (not that anonymity is a bad idea, per se) to the already despised status quo.

How is this going to broaden the application pool to include neurodiverse candidates, and not just pander to the credentialist notion that rote learning to get a piece of paper makes you the best candidate?


Can you help me understand how I can help "broaden the application pool to include neurodiverse candidates?"

We're only in private beta now, but the goal is that when we launch, it will be open for any person to apply to. I don't see our model ever being something that "panders to the credentialist notion that rote learning to get a piece of paper makes you the best candidate." It's the exact opposite, really.


I'm on the autism spectrum, and I like hearing about your approach.


This is a good idea, please make it happen (and allow more locations!)


I personally like the way we currently do software interviews. I got a nearly perfect SAT/GMAT through rote preparation (despite not doing so well in school generally), and brought this skill to bear on software engineering interviews. As such, I’ve had job offers at some of the best companies in the world, something my pedigree otherwise wouldn’t have allowed. I’m not able to get a job at McKinsey or BCG, no matter how well I do in case interviews, because they weight academics and prestige of school highly.

I get that many people, especially seniors, are annoyed by the general software engineering process because they feel entitled to certain positions. But for the most part it seems to make things as meritocratic as possible (aside from holding back certain populations because of bias, something people seem to be working on fixing).

So yeah I agree several day-long interviews for a company is draining and a huge time suck, but aside from that relatively minor inconvenience it puts everyone on a level playing field, and those who are “too good” for the game are free to opt out.


> I get that many people, especially seniors, are annoyed by the general software engineering process because they feel entitled to certain positions

I’m annoyed by the process - as both interviewer and interviewee - because I don’t think that it has much to do with the actual work.


Absolutely. I've seen companies send an intern to conduct a technical interview for someone with over 10 years of experience. The intern's feedback was that the guy had a lot of experience but made a syntactic error on a whiteboard exercise, so it was a pass. I don't know who gets to dictate shit like that, but it's absolutely broken. For reference, two days later I overheard the same intern saying that JavaScript was fully synchronous.

As a fun aside, I've seen people send out HackerRank evaluations for entry level positions and I think that's ludicrous.

The best interview I have had was me Skyping with one of the devs and he gave me a link where we were able to share a browser session (for the life of me I can't remember the name of the site), but he described the problem and as I typed he could see what I typed. There was no emphasis on result that I can remember, just the process of writing code, talking it out loud and seeing code evolve and asking whether certain edge cases should be considered.

What's most broken, I think, is the fact that interviewing is an uneven, broken experience that's different everywhere you go because no one really knows how to interview. I haven't seen any company take metrics on whether their process is good or helpful or filters out too many people or not enough.


Agreed. The best two interviews I've had was one that was similar to your experience. The other was being invited to pair with a team member for an afternoon.

Interviews where people just wait for you to make a mistake should be not a thing anymore.

(Looking at you, Google)


Personally I'm very defensive about more academic interview processes (like Google's) because I'm a college drop-out and feel some amount of insecurity being ranked among peers with better academic backgrounds who might excel at such tests. In truth it is a personal failing of mine, and something I'll eventually get over through study/training the next time I'm looking for a job (hopefully years from now).

Sure it would be great if we found some super-magical process that didn't irritate or inconvenience incumbent and new developers alike. But you probably can't please everyone, and in the meantime we need to do the best we can because hiring still has to happen. Companies with slightly better processes will be rewarded (hopefully) and companies with more painful processes will face some penalty from losing candidates that would move their company forward. Hopefully this is enough of an incentive to keep companies continually working towards better hiring practices. But I kind of doubt it.


I can tell you from experience that people with college degrees go through the exact same preparation process; they aren't better off just because they took an algorithms course two decades ago.


I agree. But I think there is a skill to taking tests that can become highly developed in a college environment (where three or four final exams are taken each quarter). You're also more likely to get a fully grounded education, covering things like database and operating-system semantics (compared to the self-taught web developer who began, and remains, at a much higher abstraction level). It's my understanding that these questions are sometimes thrown in and can be crucial to a candidate's success.

Of course this applies more to fresh grads.


Most thinking people hate them. Everyone at my job believes they wouldn't be able to pass the interview again for a job they've held for years.


I actually really like the way that my company does software interviews. We do an initial resume check and screener phone call. After that, we send prospectives a take-home code sample that they can work on. The problems are very small and well-scoped and they can take as long as they want to work on it.

We do an initial code review. If we have notes for improvement, we give the candidate feedback and ask if they'd like to do one more pass at it. If they don't want to, but we think the general thinking and instincts behind the logic are sound, we bring them in for an in-person where we ask them to go over their code review: where the pitfalls are and where they think they could potentially improve.

That gives us the most realistic sense of how well a candidate is going to do at the company, since this is essentially our process.


I like that it's transparent, and I've gone through a similar enough process several times, but as much as it gives your company a realistic sense of future performance, I think it's unfair to candidates. One job made me do this with two onsite interviews and then I waited three weeks to hear back. They were comparison shopping during that time, but as a candidate it was stressful.


My impression of the big-4 interview process I went through for my current position wasn't at all that it was about having done a lot of preparation in the form of HackerRank problems, etc. On the one question that most closely resembled something from my preparation, I did worse than on the others, because I didn't actually think the solution through properly and mostly repeated what I "knew" was the answer.

My conclusion was that it's mostly about being able to solve novel problems in code (which means preparation matters less) and explain the reasoning behind the solution. It is of course possible that I was just lucky in having good interviewers, in which case I guess that would answer the question: more of that, and fewer syntax checks or find-the-one-true-answer questions.


Most people hate them when they get rejected, consider them fair when they pass. Like anything, they are mostly subjective and very gameable (compare it to getting accepted at a college)

It's not remotely scientifically objective, but most things in life aren't. I'd liken it more to dating. Both parties are evaluating each other on innumerable criteria through their own subjective lenses, and trying to reduce that to a scientifically quantifiable measure is dubious.


There are real results based on how successful someone is on a job. Being dismissive that there is a problem because you think people only complain when they don't get a job is simplistic and incredibly naive.

If an interview process is completely detached from what it takes to succeed at a job, what good is it?


What makes you say it's completely detached from what it takes to do the job? The only evidence that I see cited often is some study by Google that says success in the interviews didn't predict performance, but Google continues to use the same model for interviewing. I would think an organization that big would have people devoted to improving their interviews and would change it if they thought it had bad outcomes.


I especially love being asked questions from university CS courses that I completely forgot in the 20 years since I graduated - because they were completely irrelevant in the real world. It's like reliving the exam nightmare throughout your entire life.


I'm getting a bit frustrated with the talk about interviews, as we repeat the basic message every now and then, but nothing changes. I feel we all agree that the current status quo is not good, but so far there has not been any good alternative:

- Whiteboards are unnatural. Besides, you don't want to check their knowledge of university-grade algorithms or data structures.

- Take-home problems are an artificial setting: you can't be absolutely sure candidates did the work themselves, and you have no idea how they mesh with other people. And you probably get some false negatives because they get stuck and can't ask a colleague.

- On the other hand, if you're using real code from your project, you're just asking for unpaid work. And paying every candidate is a significant expense.

- Using a proxy measure like HackerRank or a Github portfolio excludes people from minorities or who only want a job and not a mission.

- Not doing any coding opens you up for people who can talk, but not code.

Have I missed anything? Is there a good alternative to find the right people? I'd love to improve my company's interviewing process, but sooner or later we always hit one of the mentioned problems.


> - Not doing any coding opens you up for people who can talk, but not code.

I think candidates should definitely code at least fizz buzz or something as a sanity check. It probably shouldn't take more than 10 minutes.

> - Problems to take home are an artificial setting

How do you feel about potential solutions I have mentioned in the post, e.g. exposing the candidate to a project during the on-site for 3 or 4 hours?

I'm also not a huge fan of take home projects unless they can for sure be time-boxed to be less than 4 hours, and it would be ideal if you actually got paid for the 4 hours of time.

> - On the other hand, if you're using real code from your project, you're just asking for unpaid work. And paying every candidate is a significant expense.

Paying every candidate is an expense, but is it less expensive than hiring the wrong candidate and then needing to fire them? It seems like other comments here have already pointed out that they pay something like 160k annualized salary hourly rate, which seems pretty reasonable.


> How do you feel about potential solutions I have mentioned in the post, e.g. exposing the candidate to a project during the on-site for 3 or 4 hours?

It's certainly another interesting idea, but off the top of my head I see several drawbacks. The first is NDAs, as not all code may be exposed to non-employees. Secondly, it will disrupt your team for a bit, because at least one developer has to be available at all times and can't really join discussions or go into deep-focus time. Finally, and most importantly, I think it's impossible for such an interview to have internal consistency: every candidate will probably solve a different problem (or do you keep a bug open just for interviews?) in a very different social situation. If you're the nervous type, you suddenly have a whole room full of full-blown developers watching you sweat. Internal consistency, however, is one of the most important themes in interviews for me. Everyone deserves the same chance.

> Paying every candidate is an expense, but is it less expensive than hiring the wrong candidate and then needing to fire them? It seems like other comments here have already pointed out that they pay something like 160k annualized salary hourly rate, which seems pretty reasonable.

But is it less expensive than other methods of interviewing? If a classic whiteboard has a slightly worse chance of hiring the wrong person but no additional cost, it might still be better, all in all. Additionally, the expense depends on how many people you interview for a given position (is there a filter interview beforehand, or do you invite every reasonable resume?). A better approach that we sometimes use is to hire uncertain but hopeful candidates for a paid internship. If they're good, they get a full contract; if not, we haven't lost much. Also, it's an easier decision after several weeks than after four hours. However, we can't do this for everyone.


> But is it less expensive than other methods of interviewing?

I'm going to assume you are including the costs of firing people as part of "interviewing" after they have been hired and found to be a bad fit.

I'll take a stab at this. $160k annualized works out to about $77 an hour, if we're assuming 260 work days a year and 8 hours of work a day.

$77 * 4 hours of a day = $308 additional cost per interview. I'm going to assume you're already doing some panel interviews with at least 2 people, so I won't include their cost in there. And there's potential savings if we only have 3 people interviewing for the 4 hours instead of 4 or more.

Let's say I am trying to hire for 20 positions, and I am willing to interview n=15 times per position.

20 hires * (15 interviews * $308 per interview) = $92400 additional cost, whether I get 0 or 16 successful hires out of this process.

If we think about the cost of onboarding someone with 3 other employees who are getting paid about the same over a period of 3 months:

308 * 3 employees helping with onboarding * 65 work days * 8 hours * 10% of time spent onboarding = $48048 cost of onboarding 1 employee

Let's define a bad hire as someone who contributed no value and we'll treat the salary we paid them up until the point we fired them as a sunk cost. It takes maybe 4 months (260 * 4/12 = ~87 work days) to figure out that someone should be fired if it's not super obvious?

Let's say we would normally have hired only half of our target candidates (10 out of 20). I'll also be conservative and say the new process reduces the number of bad hires by 10%. So, assuming 1 in 10 of our actual hires was bad:

~87 work days * $308 per day + $48048 cost of onboarding 1 employee + ~$5000 for health benefits = $79844 cost of firing 1 person, not including the cost of other employee benefits, the effect on morale, etc.

So here you're adding a cost of about $12556 dollars. I think the cost of firing someone is actually a lot higher than what I have calculated here, though I don't know what the actual figures are for that.

But I feel that if candidates at least enjoy the process more (which based on this thread, it seems like a lot of people here would), then isn't that a drop in the bucket in the company's bottom line for something pretty valuable?

There's obviously a lot of other intangibles at play here and we can model this all sorts of ways. Let me know if I've made some egregious error.
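The back-of-envelope model above can be written out as a short script, reusing the comment's own figures (including its use of the $308 per-interview figure as the daily rate in the onboarding and firing estimates):

```python
HOURLY = 160_000 / (260 * 8)            # ~$77/hour at a $160k annualized salary
PER_INTERVIEW = round(HOURLY) * 4       # $308 for a 4-hour session

# Extra interviewing cost: 20 positions, 15 interviews each
extra_interviewing = 20 * 15 * PER_INTERVIEW          # $92,400

# Onboarding: 3 colleagues, 65 work days, 8 h/day, 10% of their time
onboarding = PER_INTERVIEW * 3 * 65 * 8 * 0.10        # $48,048

# Firing a bad hire: ~87 wasted work days, plus onboarding and ~$5k benefits
firing = 87 * PER_INTERVIEW + onboarding + 5_000      # $79,844

net_added_cost = extra_interviewing - firing          # ~$12,556
```

Laying it out this way makes it easy to swap in your own assumptions for salary, interview count, or onboarding overhead.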


My opinion is we need to have more variety of interview options. Everyone has their strengths and weaknesses and any given Dev has to have some kind of idea of what they excel at. Some people are really good at tough algorithms.

I recently went through a job search and abandoned some coding tests partway because they took too much time, and the questions I was asked seemed more like a random candidate filter than a test of developer knowledge.

My personal opinion is to make an easy coding exam similar to Triplebyte or other apply-once, talk-to-many-companies services, and randomly accept a portion of candidates.

That way people will know it's not their skill and can optimize for market exposure rather than arbitrary coding problems.

Then have an in-person interview where you see whether communication can happen. Ask them to walk you through the code they used on the entrance test.

Can the industry move towards something like match day for doctors, but more ongoing?


Whiteboarding is a waste of time in my opinion. The problems are recycled anyway (reverse singly linked list, needle in haystack search, etc). It doesn’t give you a useful signal about the person’s coding ability when they can just do rote memorization on a toy problem.


I try to ask a simple question that pretty much any half decent developer should be able to solve quickly. Then we have a conversation about it and discuss why they did it, possible problems, how they'd handle new requirements, etc. It's more important to me that they can speak intelligently and collaboratively than memorize a bunch of puzzles. It is tough to determine if a candidate will be a good fit in the hour they give us.


For pure algorithm questions I would agree, but being able to use a whiteboard to sketch out solutions to wider problems can be useful. Remember, some people are more visually biased and may find visual aids like this useful.


There are way too many problems to perform rote memorization.

Maybe that's the approach one takes when practicing, but hopefully they are seeing the pattern and learning algorithms to apply to solve each class of problem - or they already have learned that through their career, and such problems are just practice & exercise.

It's not about whether you can solve it, it's about how you solve it. But unfortunately, both interviewers and candidates focus too much on the former.


These two below resonate after hiring dozens of engineers:

> – What kind of quality work does this person produce in a normal work environment where someone isn’t breathing down their neck?

> – Can I work with this person on projects that span weeks, months, or years?

No matter how well they answer questions, how they whiteboard, how high their IQ is, or how well they do homework, none of it tells you anything about their work quality or actual on-the-job performance.

Just ask them a for loop question and give them a contract to work on something. If it works, maybe hire them.
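By "a for loop question" I mean something at roughly this level (my own made-up example, not a prescription):

```python
# Roughly the difficulty bar of a "for loop question":
# sum the even numbers in a list.
def sum_evens(nums):
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n
    return total
```

If someone can write and explain that comfortably, the contract period tells you everything else.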


Interesting, have you had success with this approach? I might imagine there could be some stigma associated with this since why waste time if there's no guarantee of full time employment?

In California, it is at will employment anyway but I get the feeling that contractors are easier to fire than employees. Correct me if I am wrong.

On the other hand, candidates who are actually good will do very well and get through without any problems. I feel like it would take at least 2 to 3 weeks to find red flags, then at least 3 months to determine if someone is a superstar, and at least 6 months for everyone else to shine through.


I've thought a lot about this (and I'm sure others have too); I think candidates generally fall into two buckets:

1. Can code:

    a. Gifted: these have a natural talent for coding and can crank out code at the drop of a hat.

    b. Persistent: have moderate-to-good talent for coding; they make up the gap with gifted coders through sheer persistence and a relentless drive to make themselves better.

2. Cannot code / haven't coded in a long time.

How can you test for coding? In my view it's certainly not some academic question or the in-vogue programming interviews book. That has clearly proven to be a red herring. Also, at least in the Bay Area, there are set questions that FANG and other companies use which are shared all over the internet and amongst friends who work there. So the interview becomes a casting process for a role, where the script is the questions you have "rehearsed". This is an absolute facade and needs to stop!

I ask simple programming questions and look for "naturalness" in the way they answer and talk through the solutions. No complicated brain teasers or the smug "look at me, I know this obscure/complicated algorithm invented before I was born" questions.

Coding is one facet; the other is design. No matter how good a coder you are, if you have bad design skills it doesn't matter. How do you test for design?

1. Resume experience: Talk through technical issues/challenges, pros/cons etc. of projects they've done before

2. A simple-to-moderate design question: if someone lists distributed systems as one of their skills, ask them a question on it. A decent amount can be determined about a candidate by the way he/she answers.

I'm refining my process after each interview I take and would love feedback.


I create a sample project based on real world coding problems we face everyday. The actual class is a skeleton with non working methods and comments describing what the methods should do.

I then have another test class with simple failing unit tests. They have to write the code to make the tests pass.

Then I give them more complicated failing unit tests that they also have to make pass without causing the first set of tests to fail.

We do pair coding and they can ask me any business question they have if they don't understand a requirement.

They can use Google if they need to during coding.


Thanks, that sounds like a good way to screen someone.


I've been through and used several different styles of interview, and the one that stands out as "the best" at predicting whether a person will fit on the team is just to bring them in and work with them for a day on your actual product.

1) You don't do problems you've already solved that they have to come up with same answer to.

2) You don't do algorithm / problem testing that they're never going to use on the job.

3) You don't do whiteboarding except as you would in real life.

Engineering teams are people working collaboratively to solve complicated problems where nobody knows the "right" answer. Any interview that's going to tell you something is going to measure along those metrics, and not "can you invert this binary tree on a whiteboard."


I'm assuming you pay a fair rate for that day of work?


Of course.


There are tons of threads on HackerNews on this and they all usually follow the same exact pattern of discussion.

Everyone hates interviews and agrees they are useless, but in the end we settle for the status quo.


There is no right answer to this. Every company is different, so there is no right way to do software interviews.

Enterprise SaaS, Consumer Social and ML API companies will all have different working environments. Add on to that options of remote vs on-prem, CI/CD vs TDD, customer integration requirements, staff numbers, perks etc... and you have completely different needs for workers and teams.

A team earns their hires, and people will naturally gravitate to environments that suit their work styles.


> there is no right way to do software interviews.

In the context you presented this in, I agree with you. However, interviewing is a skill that must be learned or you are going to make loads of beginner mistakes. In that sense there are many right things you should be doing in an interview and whatever your company style is, you're going to want to use at least some of those best practices in your interviews if you want to find the best candidates in the shortest time.


Isn't this true for anything though? More experience = better (on average).

In theory there is an optimum between a new hire's ideal workplace and the hiring company's style of work. As with most things, it's about knowing yourself and identifying in others whether there is a match.


Sure it's true for anything. Which is why it's surprising when 95% of the engineering interviews you go on the interviewer doesn't have even a remote clue about interviewing best practices. Which is also why you get so many whiteboard coding interviews. "That's what happened to me once in an interview, so I'll just do the same thing".

It's the very reason I mentioned it. Too many people haven't figured out the very obvious fix: get some training in how to interview!


I always ask them to read code and tell me what it does. 100% better than whiteboard coding because you can hear them working through it.
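A toy example of the kind of snippet this works well with (my own, not the commenter's): hand it over cold and ask what it returns and why.

```python
def mystery(xs):
    # Candidate is asked to narrate what this does as they read it.
    seen = set()
    out = []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

The answer (it de-duplicates while preserving first-occurrence order, so `mystery([3, 1, 3, 2, 1])` gives `[3, 1, 2]`) matters less than hearing the candidate reason about the set/list interplay aloud.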


I conduct this style of interview. It's easy to complain, but much harder to propose better options. Most alternative hiring proposals do not meet the fundamental criteria of Big 4 hiring: they must reject approximately 99% of applicants while consuming an average of two hours of engineer-time per candidate.


In a sense maybe this is how it should be. The interview should reflect the company's culture. The candidate is trying to make a decision just as much as the company is.


I enjoy the coding interviews. I would interview just to do them.


I would like to offer a partial list of practical ways to make software engineering interviews better.

1. As a candidate, if you are in a strong enough position to refuse the job, then refuse to do white board exercises with no resources. No one codes like this in real life so refuse to do it in an interview. Be willing to walk out. I just say "I'm a good programmer and your interview process is broken. If you want to hire good programmers it's your interview that needs to change. We can start that right now and talk shop as programmer to programmer about coding, or I can leave. I'm happy with either option but I'm not going to waste your time or my time with whiteboard exercises that don't find the best programmers. How often do you use composition over inheritance here? I've been learning a lot about that and it's really helping me write better code. Can I show you some of my recent code where I've been doing that and get your opinion on where it could be improved?" Remember that your job on the interview is not to follow their script. It's to prove to them that you are the best candidate for the job. Go off script and take control if they are bad at interviewing. If good programmers are regularly walking away from these types of interviews, the interview process will change.

2. Unless you've been trained to interview people, you are probably really bad at it. Think about how bad someone is at coding who never did it before and has no training. Interviewing people is no different: it is a skill that takes time and effort to learn. Get trained up, and practice with role playing with colleagues. As much training as you have time for. Educate yourself in your free time on a regular basis with articles about best practice. Adding more words won't make this sink in more, but I will end by saying I can't stress this point enough. You are bad at interviewing. You need to fix that before anything else if you want to hire good devs.

3. How are you measuring success as an interviewer? This is a really difficult thing to measure because how can you find out that your interview process is rejecting candidates who would make great employees? I'm not going to try to answer this here, but it's an excellent question to think about. One thing that might help is the next point.

4. Whatever your first impression is of the candidate, spend a good part of the interview trying to prove yourself wrong. Candidate seems like a perfect fit for the team and a great coder? Instead of relaxing because you're sure you already found the right person, see if you can figure out why they might be a bad fit or if their coding skills aren't as good as first impressions. Same for someone who seems weak at coding. Maybe they are just nervous. Can you make them relaxed enough so their awesome skills can come through? Maybe in both cases your first impressions will be proven correct. But if not you might have just found a diamond in the rough that other companies are skipping over.

I hope this very incomplete list is enough to get folks interested in following up on point 2.


Really good points here, especially #3. In the past, we've looked at all the candidates we hired who turned out to be high performers (gets stuff done in a timely manner, writes tests, pushes back against unreasonable deadlines, doesn't require too much hand-holding) and looked at how they scored on the rubrics and who interviewed them. We have even been able to tell whose hires have been relatively not so great, but this takes at least 4 months and usually 6 to 8 months to figure out.


Isn't this obvious blog traffic generation against HN guidelines? I'm flagging it.


Just because it’s a link to a blog? That’s the whole point of HN to like to blogs and other content. Which guideline is it breaking?


That was never "the whole point" of Ask HN posts. The standard is, and had been for years, to just post your question in the submission. That being said - I don't see anything in the guidelines addressing that.


I tried to paste all the content in but it exceeds 2000 characters, I thought providing my experience might help generate more discussion.

Does it make more sense to paste the link into the body of the text and just ask the question?


No and why do you think the current system is good?


I did not make any comment regarding the "goodness" of "the current system".


I can move this to pastebin? Is that better?



