Hiring Is Broken (medium.com/evnowandforever)
163 points by zerogvt on Jan 17, 2019 | 307 comments



Yikes... Interviewing sucks but half the reason interviewers ask these types of questions is to see your attitude and how you respond. Writing an Instagram clone in Angular doesn't really tell me much about your problem solving skills when faced with a unique problem.

> How many people can actually write BFS on the spot, without preparing for it in advance?

Ughhh, should I tell him?

> why would they ask me this question, what does breadth-first search have to do with front-end development

Tree data structures are really common in front-end development, like the DOM or JSON.


This feels condescending.

I've been developing for well over a decade as a full-stack engineer. I've worked as a successful software dev at some big corporations (like Intel, and yes, it was full-time for several years) and almost always outperformed my peers. I work in finance now where everything is about managing portfolios for people - lots of numbers and heavy calculations.

I have no fucking clue how to write a BFS. I've never needed to know how to write a BFS. I will probably never need to know how to write a BFS. Coders like me write business applications where these things are literally never an issue. If there's something I need to know and don't, I simply research it. That's the point he's trying to make. Don't interview people on algorithms they will never ever use. It's as simple as that.

There are more productive ways to interview someone; for example take a bug in your product and see if the candidate can fix it. Or if you're worried about sharing proprietary info, then make a sample program that simulates a bug or feature that your application will use and observe the candidate working on that.


A breadth-first search is a pretty basic algorithm, but even if you don't have it "memorized", the question is simply: here's a data structure, we need to traverse it in a certain order. Traversing data structures is so common that, even without a formal CS education, I'd expect an engineer with years of experience to have the strategies in a sort of unconscious memory, like a rock guitarist who plays the right chord progressions without knowing what they're called.

A candidate who freaks out upon being asked that question is not a candidate whom I'd trust to be a good problem solver. Solving a problem with simple constraints is something that happens all the time. If you don't know how to approach that, you could just "look it up", but how would you even know what to look up?


> Solving a problem with simple constraints is something that happens all the time.

I disagree. My experience has been that these problems are exceptionally rare.

They're so uncommon that programmers get excited when presented with such a problem. It feels like "real programming" as opposed to fixing bugs, writing prototypes, or writing business logic.

> I'd expect an engineer with years of experience to have the strategies in a sort of unconscious memory, like a rock guitarist who plays the right chord progressions without knowing what they're called

This is fantasy. You're comparing the average programmer to musical savants? What?

You happen to have memorized a bunch of algorithms for traversing data structures. That's great, but nothing about that is "unconscious memory". The fallacy is expecting everyone else to have memorized the same thing you have memorized.

> If you don't know how to approach that, you could just "look it up", but how would you even know what to look up?

Google for "tree traversal" and spend 10 minutes researching common algorithms. Pick one that fits. Spend 20-30 minutes writing it (hopefully with some tests). Worst case you've spent an hour.

Someone who memorized the algorithm could probably write it (with tests) in half the time. And that's okay because how often are you doing tree traversal? And even if you're doing tree traversal all the time then you'll have memorized all these techniques in 1 month anyway.


> This is fantasy. You're comparing the average programmer to musical savants? What?

They didn't describe a musical savant. They described someone who learned to play the guitar without any background in music theory. The analogy was not someone playing a chord without knowing the name of the chord, it was someone playing a chord progression without knowing the formal musical basis for that progression.

That kind of intuition is very common in guitarists, and I thought the analogy was rather apt.


> They described someone who learned to play the guitar without any background in music theory.

I have no idea what guitarists and other musicians you know. I know quite a few and they've had to put in serious work on music theory (most from a young age). Many also started out with instruments like early piano lessons and transitioned to guitar.

It's quite literally a language you need to learn (memorize); kinda like maths, actually. Some things may seem intuitive, but only savants can compose actual music without training in music theory.


I would guess that the _vast_ majority of guitarists learn that "G D Em C" sounds good _long_ before they understand that it's an I V vi IV progression. I started off playing guitar with three open string power chords, and I didn't have any understanding of theory for years of playing.


No one is talking about composing music without a classical background. We're talking about playing music without the background.

Composing music is to playing a chord progression as writing a naive algorithm implementation is to designing an optimal algorithm.


> The fallacy is expecting everyone else to have memorized the same thing you have memorized.

I don't see why any memorization should be necessary. I have never written a BFS algorithm. Yet I can immediately think of two different ways of implementing it. In fact, when reading the article I googled BFS to make sure it wasn't referring to something else, because BFS seemed so simple to implement. All you have to do is place every node's children in a FIFO queue and process the nodes in the order they are added. You could also use two arrays, one for the current level of nodes and one for the next. It is only slightly more complicated if you are traversing a graph rather than a tree, as you need to track visited nodes to avoid repeats.
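The FIFO-queue version really is only a few lines. A rough Python sketch (assuming the tree is just a dict mapping each node to its list of children):

```python
from collections import deque

def bfs(tree, root):
    """Visit nodes level by level: process them in the order they were enqueued."""
    order = []
    queue = deque([root])
    while queue:
        node = queue.popleft()            # FIFO: oldest node first
        order.append(node)
        queue.extend(tree.get(node, []))  # enqueue children for later
    return order
```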

Also, the utility of BFS and DFS in frontend seems hard to ignore: the majority of the data structures are trees (HTML, XML and JSON), and these are the two ways to traverse them.

> They're so uncommon that programmers get excited when presented with such a problem.

What about "process these things in this order" is uncommon? If anything, the constraints are far simpler and easier to understand than most of the business logic I implement.


> I don't see why any memorization should be necessary. I have never written a BFS algorithm.

Neither have I, but we both have prior knowledge that accounts for solving most of the problem. We know how trees are implemented. We clearly know what BFS is and what's "tricky" about it.

Someone who hasn't dabbled with tree traversal has to grok a lot of information in the heat of an interview.

> Also, the utility of BFS and DFS in frontend seems hard to ignore: the majority of the data structures are trees (HTML, XML and JSON), and these are the two ways to traverse them.

I'm not 100% sure about frontend, but I haven't heard of anyone writing tree traversal algorithms for a DOM tree. That's all handled by either browser APIs (document.querySelectorAll?) or lower level libraries (ReactDOM?).

> What about "process these things in this order" is uncommon? If anything, the constraints are far simpler and easier to understand than most of the business logic I implement.

That's exactly the point. Think about it... there are literally dozens of ways to implement a BFS. The currently accepted standard way of writing a BFS is the most concise and efficient.

A minuscule amount of your business logic is as concise/efficient. Most of the time your business logic will balance finding the best solution with meeting milestones (milestones usually win).

So let's say someone doesn't have any experience traversing trees (why would they? you rarely have to, if ever). Let's also say that they haven't memorized what BFS is and what the constraints are. You're presenting them with a new problem and expecting them to come up with the most efficient/concise solution to that problem. The only people who get rewarded here are people that simply memorized the solution or the solution space. In effect you've validated nothing.


> So let's say someone doesn't have any experience traversing trees (why would they? you rarely have to, if ever). [...] You're presenting them with a new problem and expecting them to come up with the most efficient/concise solution to that problem.

I did some of those interviews (not specifically with BFS, but with similar problems) and my answer to that is: Nope, I do not care about the "most efficient/concise solution to that problem". I just need some evidence that you are trying to solve the problem. I am here to help -- after all, for the interview to be useful, I need to see your work.

So I'd formulate the problem (if the person forgot what those 3 letters mean), draw a diagram if the candidate is having difficulty, and give increasing hints toward the solution ('How would you do it manually? Which node would you visit first? What about the next one?' and so on...)

Even then, a surprising number of people would just give up. It is kinda weird -- it looks like when they hear some words they just think "OH NO THIS IS DIFFICULT" and stop even trying.

Those get rejected. We often have difficult problems. If you give up at the first mention of algorithms, you get 0%.

And if you have not memorized what BFS was, and I had to draw a diagram of the tree and trace the path, and give you a hint about maybe using some sort of queue for unprocessed problems and you did not even finish the solution at the end -- you'd still get 80% and might very well get an offer.


> So let's say someone doesn't have any experience traversing trees (why would they? you rarely have to, if ever).

Ummm... what? Basic DFS of trees with nested foreach loops is incredibly common. It is not a generic, depth-blind traversal, but it most certainly is tree traversal.
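To put it concretely (a made-up toy example): iterating a two-level structure with nested loops is already a fixed-depth, depth-first traversal.

```python
catalog = {"fruit": ["apple", "pear"], "veg": ["kale"]}

names = []
for category, products in catalog.items():  # depth 1: the category nodes
    for product in products:                # depth 2: each category's children
        names.append(product)
# Each category is visited fully before moving to the next one: depth-first.
```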

> You're presenting them with a new problem and expecting them to come up with the most efficient/concise solution to that problem. The only people who get rewarded here are people that simply memorized the solution or the solution space. In effect you've validated nothing.

If somebody gives up on solving such a simple problem, why would I ever want to hire them?

> The only people who get rewarded here are people that simply memorized the solution or the solution space

Says who? You are making a bunch of assumptions about why the question is being asked.


I think the point was, the algorithm is so basic that a moderately experienced software engineer could invent it - it doesn't need to be researched.

> as opposed to fixing bugs

bug: "takes too long to find a <file / json field / html node>"

It's a real problem.


> I think the point was, the algorithm is so basic that a moderately experienced software engineer could invent it - it doesn't need to be researched.

Right. My point was the problem is perceived as "basic" because the commenter memorized solutions in the problem space. If it's not something you deal with regularly (or have interest in) then it isn't as basic as it seems.

People who haven't memorized the most efficient/concise solution also freak out because they know their whiteboard solution is going to be judged against the industry-standard solution.

> bug: "takes too long to find a <file / json field / html node>"

This is almost always an issue with an underlying library. The better question is why your programmers are wasting time writing slow BFS code when they should be using well tested solutions.

Example: I recently needed to write some AST transformation code. Did I write my own AST node traversal code? Hell no! I picked a library that did the job. I spent about 20 minutes catching up on state of the art AST traversal (so I'm not blindly importing a bad solution) and let the library do the work.


> Google for "tree traversal" and spend 10 minutes researching common algorithms.

The point is you wouldn't know what to search for. If you don't know anything about cars, you wouldn't be able to search for "alternator fault". The best you could do is "engine problems" which won't help you solve the actual problem.


You're implying that someone who can't invent a BFS on the spot also doesn't know what tree traversal is. That's not true at all.


If you can program [1], you can do BFS - no need to "invent" anything. Given a tree like this [2], asking you to "print the numbers in order" is not a difficult question, it's a trivial one.

I would expect any interviewee that calls themselves a programmer to talk me through a reasonable answer to that question. Forget about what BFS means - you don't even need to know what a tree is. Looking at the image is all you need.

[1] https://blog.codinghorror.com/why-cant-programmers-program/

[2] https://upload.wikimedia.org/wikipedia/commons/3/33/Breadth-...


What drives me a little nuts is all the people on this thread saying BFS is simple, as simple as counting, as if that somehow strengthened their argument. It does the opposite: BFS is indeed simple, so simple that you can simply explain it to a candidate and ask them to implement it, just like you would do with an actual developer (or, more likely, that they'd just do for themselves based solely on the Wikipedia page).

I'll go a little further and say that if you've qualified a candidate as capable of busting out the routine Javascript required to wire up typical front-end code, if you can't teach that developer how to implement BFS within the confines of a single interview, you the interviewer share some of the incompetence. Certainly that would be the case for a manager or team lead!

The reality is that BFS is for most jobs (frontend AND backend) a trivia question, and a status-seeking mechanism for interviewers.


THANK YOU. 16 years I've been writing code and I can count on one hand the people I've met who realize this. I get a little emotional about it because I've seen so many exceptional programmers crumble in interviews with trivia questions.

If an interviewer wants to test problem solving skills with a BFS, no problem:

- Draw a tree on the whiteboard (interviewers writing on the whiteboard themselves is highly undervalued; it calms the candidate)

- Explain what a BFS is. It's easy to draw out each step in a BFS algorithm.

- Ask the candidate to write some pseudo code that implements it.


I do, every time! I even drew a little diagram a couple of times and traced the path!

And they still failed it! It is not about trivia or what BFS is, it is all about can you reason about algorithms.

(Disclaimer: our actual interview problem is slightly different, but it is still a very basic CS one)


> And they still failed it! It is not about trivia or what BFS is, it is all about can you reason about algorithms.

Okay.. so now this is about you and a bad candidate rather than the person pointing out that interviewing is broken. Got it.


Well, if at least one person reads the discussion, and then starts inventing the algorithms instead of giving up right away, I'd be happy :)

Because someone saying "How would I know? If, and when, I need to know how tree-shaking is implemented, I will go look it up." makes me sad. The tree-shaking algorithm was not given to us by ancient gods. As a programmer, your job is to solve problems no one has solved before. Interviewing may indeed be broken, but the original article was not showing any evidence of this.


> the question is simply: here's a data structure, we need to traverse it in a certain order

Because BFS/DFS are so easy to interchange, I'd strip ordering out entirely: here's a collection of stuff where each element may also point to other stuff in the collection, starting with this element search through the stuff to see if you can find a certain element.

If I was going to ask this problem, I'd help a nervous-looking candidate by contrasting the problem (or leading with) a simpler still one: here's a collection of stuff where each element can point to at most one other thing, starting with this element see if you can find a certain one in the collection.

Linear search is, technically, an algorithm. Graph search just extends it. All these people railing against algorithms in interview problems as non-applicable surely have at least written a for loop over an array and had an if statement somewhere inside, right? Linear search. When I've given interviews I don't try to make them based on "algorithmic knowledge"[0] but at the same time I really don't like the trend of thinking of algorithms in general as "oh, that's someone else's job, I never use those" or "I can only implement algorithms by memorizing them."

[0] When I was just starting to get asked to give some I lazily asked a colleague for a reference problem, they sent me https://leetcode.com/problems/jump-game/description/ and I read the problem, immediately recognized that the general approach of graph searching would solve it[1], and coded up the 12-15 lines or so it took, made some tests, fixed some minor mistakes... Total start-finish time was 10-15 minutes, not the fastest. There's also a solution to that problem that doesn't require thinking of it as a graph, too, but I think it's harder to get the insight. Anyway, nice problem, I thought, I'll only give a problem if the candidate has 2x the time it takes me to solve it as a buffer for interview nervousness/whatever, but after giving it to a few people only one of them managed to solve it in like 50 minutes after creating a pretty verbose and pseudo-coded BFS system. (My initial approach used DFS, but it doesn't matter here -- though for the jump game 2 sequel problem which asks to find the shortest number of jumps, BFS makes sense, and it's easy to take the solution for the first problem and make minor alterations. General algorithms like "search" are great and widely applicable.)

[1] In retrospect, before I saw the problem I was writing a simple game of go client as a side project, and had recently implemented a function to, given a point on the board, determine if there's a stone there and if so determine if it's part of a larger group of stones and return their coordinates. So in some sense the problem was already 'primed', and I've written a lot of graph search skeletons for both fun and profit. It's interesting to hear that some people go decades without doing so...
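To make the "graph search just extends linear search" point concrete, here's my own sketch: the loop shape is the same, only where the "next items" come from changes.

```python
def linear_search(items, target):
    for item in items:                # visit each element once
        if item == target:
            return True
    return False

def graph_search(neighbors, start, target):
    """Same idea, but the next items come from the graph itself."""
    seen = {start}
    frontier = [start]
    while frontier:
        node = frontier.pop()         # stack gives DFS order; a queue gives BFS
        if node == target:
            return True
        for n in neighbors.get(node, []):
            if n not in seen:         # don't revisit nodes (handles cycles)
                seen.add(n)
                frontier.append(n)
    return False
```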


This problem can be solved with my favorite algorithm in the whole world, union-find [1], yaaay! :-p

It is hard to say if U/F would be more efficient for this problem. U/F can perform its two operations, `union(a, b)` and `find(a)`, in O(lg n) runtime complexity, or, with some more effort (path compression), in nearly constant time (a and b being node indexes). But for this problem, a first pass over all indexes would be required to build the disjoint set from the input.

I think for small inputs DFS or BFS would be more efficient, and have the plus of not needing extra storage (U/F would require a second array the same size as the input). For large arrays, probably U/F would win, since the input array may potentially encode a lot of edges (say, if each index contains a large enough number) and DFS runtime complexity is O(Edges + Nodes).

Anyway, I think U/F can be really useful in the context of job interviews; there are many problems that can be reduced to connected components after some massaging.

Here's an implementation I did a while ago [2]. Even though I love the algo, I don't really remember the details about path compression... time to refresh my memory I guess :-D

[1]: https://algs4.cs.princeton.edu/15uf/

[2]: https://gist.github.com/EmmanuelOga/bcafad14715a3f584e97


Neat structure! I consulted my copy of Skiena's Algorithm Design Manual. He says: "Union-find is a fast, simple data structure that every programmer should know about." Well... Now I do. :) It is simple, a 'backwards' tree with pointers from a node to its parent, which lets you union two separate trees together by just taking the shorter one's root and pointing it at the taller one's (or vice versa, but this way preserves log behavior). The path compression optimization seems to just be an extra pass in find(), after you have the result, to re-parent each item along the path to the found root parent so that any future finds() of any of those items will only have one lookup to reach the component root.
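For reference, a minimal version of what's described above (union by size plus path compression; my own sketch, not the gist linked upthread):

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))  # each node starts as its own root
        self.size = [1] * n

    def find(self, x):
        # Walk up to the root, then compress: re-parent the path to the root.
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:  # attach the smaller tree under the larger
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
```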

For the jump game problem, I'd expect DFS would still win almost all the time even when there are many branches because it doesn't have to explore every element of the input array except in the worst case (and you can greedily explore a neighbor without having to discover your other neighbors first), whereas to build a complete union-find structure would. Still, the fastest solution in general is probably just the single pass: if you have the insight that you can keep track of a 'max reachable index' and update it at each step to max(current-max, i + A[i]), bailing with failure if you hit an index > the current max and having no more jumps, or bailing with success if your max becomes >= the final index. (I never had this insight myself.) It could be slower of course, e.g. if the array was something like [bigNum/2, lots of 0s, bigNum/2 again in the middle, lots of 0s, end at index bigNum].
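The single-pass approach, as a sketch (for the first Jump Game problem: can you reach the last index?):

```python
def can_jump(nums):
    """Single pass: track the farthest index reachable so far."""
    max_reach = 0
    for i, jump in enumerate(nums):
        if i > max_reach:                 # this index is unreachable
            return False
        max_reach = max(max_reach, i + jump)
        if max_reach >= len(nums) - 1:    # the end is already in reach
            return True
    return True
```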


> I have no fucking clue how to write a BFS. I've never needed to know how to write a BFS. I will probably never need to know how to write a BFS.

If I give you a description of what a BFS is, you as a programmer should be able to write one. A BFS is rather trivial. You shouldn't have to know how to write it, you should be able to figure it out. The same goes for implementing a linked list or the FizzBuzz problem.

As a programmer, you need to make up algorithms on the spot to solve business problems. You don't know the solution beforehand, you figure it out, that's your job. It's the key skill that basically separates you from a typist.

> There are more productive ways to interview someone; for example take a bug in your product and see if the candidate can fix it.

This is a different skill. It's an important skill, but finding errors in an existing structure and fixing it is different from being able to create that structure in the first place.


Depends on the algorithms. Some people can brute force something basic, and the brute force answer will usually be slow and stupid at scale.

I'm working on something at the moment, and one of the key features is that while there's a fair amount of data, none of the structures are particularly big - and are very unlikely to ever become big.

So my thought process is that stupid brute force algos are absolutely fine for this domain.

If I have performance problems I can look into optimising them.

If I had hundreds of terabytes of data to start with, I'd take a different approach, and I'd research - not invent, because I don't care to reinvent a wheel if someone has already produced a much better solution - more efficient algos.

If none of the above work, then I'd consider improvising something and testing how it performs.

Would this pass your interview process or not? Do you want someone who's going to brute force an answer to your toy problem and think they've solved it with something that works but is inefficient, or someone who has memorised a collection of standard answers but maybe can't improvise something new, or someone who is going to check what's already available to save time, or someone who can improvise a super-efficient answer and do it even if it's not needed?

Who would you pick?

Do you really think this question has a simple answer?


> Would this pass your interview process or not?

Yes, that sounds very reasonable.

> Do you want someone who's going to brute force an answer to your toy problem and think they've solved it with something that works but is inefficient, or someone who has memorised a collection of standard answers but maybe can't improvise something new, or someone who is going to check what's already available to save time, or someone who can improvise a super-efficient answer and do it even if it's not needed?

First of all, let's talk about what I don't want. I don't want someone who can't solve the most basic problems. I'm not interested in all these details at this point in the process. Don't get stuck in analysis paralysis.

> Do you really think this question has a simple answer?

No, but it's not the question I am asking. My question would be, can you solve basic problems? I'm not looking for 100% accuracy in testing, that's impossible. Surely some kid fresh out of college will get an "unfair advantage" with something that's still on their mind, while some genius may have a bad day and fumble the test. I wouldn't personally pick BFS as a test either, but the fact that you should be able to solve it remains. The fact that "you may never need it" is irrelevant.


You're asking the wrong question.

For all software businesses there is only one question to ask candidates: "Can you help us ship working software that solves the problem of our customer?"

What I can tell you is that there are a number of books written on this subject if you need help identifying the correct interview questions to ask.


But this question does not really tell me anything. It only has one possible answer, which is "Yes!"

If I had a magic truth-detecting wand, it might be a good one, but in practice, candidates may be lying or honestly overestimating their ability.


That's just the type of answer I look for.

You're not someone who would look at the screen and just say "I don't know" (or thank me for my time and storm out).


The problem is that here you are optimising your test for https://en.wikipedia.org/wiki/Chutzpah.

Which is great and all, but it's not a quality that correlates particularly well with the stated problem you are hiring for.

Now if you were hiring for the marketing department…

Repeat after me:

A good hiring process is one that will not be affected by the luck of the draw nor the personality of the candidate.

I've been doing this for a quarter of a century and I will tell you right now that the best engineers of my generation would indeed thank you for your time, never come back and quietly advise the extensive network of young people they mentor to avoid your company like the plague.


Yes, I'm starting to realise that finding a candidate who's able to problem-solve something basic really is "luck of the draw".


I feel like you misunderstand me entirely.

You're going to have a really hard time finding good hires if you restrict yourself to the pool of people who can "problem-solve something basic" according to your definition of "something basic".

Maybe ask: "How can this person help the team ship working software that solves the problem of our customer?"

It's less adversarial and opens your recruitment up to that pool of folk who have literally decades of experience answering that question.


It's more like: There are so many candidates with impostor syndrome. I don't want to hire another turkey.

So, I will come up with a little problem which is something related to the work they'd be doing - like, a real thing they would have done if they were hired a month ago. (hmm, I wonder how they would have fixed jira-123)

I'm not looking for "the answer" - I'm actually hoping they don't know it (if they do, great: here's another...), but.. do they have what it takes to solve a problem? How much hand-holding would they need? Am I able to have a technical discussion with them? Even, where are they on the passive<->arrogant scale?

This, among other questions, will answer "How can this person help the team ship working software that solves the problem of our customer?"

Job history and qualifications don't tell me this.

If a candidate feels they're too important or experienced to do this, then by all means walk away.


> I have no fucking clue how to write a BFS. I've never needed to know how to write a BFS. [...] If there's something I need to know and don't, I simply research it.

The problem with that is when the best solution to a problem is a BFS, you may never recognize or realize it, because you know nothing about BFS. You won't know what to research for.

I see it when people who don't know what calculus is go to enormous efforts to develop half-assed workarounds that only sort of work.

I see it in my own work when I didn't know about a class of techniques, so I invented some crappy solution. For example, reinventing bubble sort when I could have used quicksort.

BFS is awfully basic knowledge. How do you know you've never needed a BFS? Maybe you never needed a BFS because a linked list is your go-to data structure? Maybe you've been reinventing bubble sort. (I'm not the only programmer who incompetently reinvented bubble sort, not even close. I just didn't know any better.)


> I see it when people who don't know what calculus is go to enormous efforts to develop half-assed workarounds that only sort of work.

cf. https://escapethetower.files.wordpress.com/2010/12/tais-mode...


At least in my case, I'd like to think I absorbed the habit of recognizing sub-optimal traversals of data (for example, yesterday I was writing a parallel iterator and made the intentional choice to "waste" effort duplicating the map result rather than having to synchronize threads on mutable boundaries), but I don't remember all the buzzword jenga that was in my data structures class in uni.


I've often been able to dramatically improve other people's code by selecting the right data structure (array, list, tree, hash, whatever) where the original programmer clearly did not know about them. They were able to make the code work, but not very well.

It's hard to do research when one doesn't recognize there's a problem, nor know what question to ask.


> I have no fucking clue how to write a BFS. I've never needed to know how to write a BFS. [...] If there's something I need to know and don't, I simply research it.

>The problem with that is when the best solution to a problem is a BFS, you may never recognize or realize it, because you know nothing about BFS. You won't know what to research for.

Just because someone doesn't know how to write BFS code doesn't mean they don't know what it is.

> BFS is awfully basic knowledge. How do you know you've never needed a BFS? Maybe you never needed a BFS because a linked list is your go-to data structure? Maybe you've been reinventing bubble sort. (I'm not the only programmer who incompetently reinvented bubble sort, not even close. I just didn't know any better.)

Never had to write a BFS in all my years of programming. I am one of those programmers who write glue code and are not "real" programmers by the definition of some here on HN.


> Just because someone doesn't know how to write BFS code doesn't mean they don't know what it is.

Actually, it does mean they don't know what it is. BFS stands for "Breadth First Search". If a node in a data structure has two links, one going down in the data structure, and one going sideways, breadth first means going sideways first. "Depth First" means going down first.

That's the algorithm. It ain't rocket science. There's no trick involved.
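For instance, on DOM-style nodes with a "down" link (first child) and a "sideways" link (next sibling), the two orders fall out of which link you follow first (my own illustration; the Node class is made up):

```python
class Node:
    def __init__(self, value, child=None, sibling=None):
        self.value = value
        self.child = child      # link going down (first child)
        self.sibling = sibling  # link going sideways (next sibling)

def depth_first(node):
    """Down first: follow the child link before the sibling link."""
    if node is None:
        return []
    return [node.value] + depth_first(node.child) + depth_first(node.sibling)

def breadth_first(root):
    """Sideways first: finish a whole level before going down."""
    order, level = [], [root]
    while level:
        next_level = []
        for node in level:
            order.append(node.value)
            child = node.child
            while child:                # walk the sideways chain of children
                next_level.append(child)
                child = child.sibling
        level = next_level
    return order
```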


You think every programmer has a CS degree?


That raises the question: if they don't, shouldn't they still get familiar with CS? Honestly, how hard is it to read Grokking Algorithms?

Look, I think the dev hazing ritual that is current hiring processes sucks, but Algos and Data Structs are the language that we speak. We have to be familiar with them.


I don't have a CS degree. But I know the freshman algorithms. It's the basic tools of the trade.


I don't. The ability to figure out how to write BFS still seems pretty basic to me.


The theoretical interview was never about writing a BFS. It's about how you approach answering the question.


The fallacy here is that you are actually getting to observe people's problem solving approach when asking them questions like this (vs their ability to talk) and that your subjective perception of someone's problem solving ability in those situations maps to actual job performance at all (vs your own biases).


The entire point is that I _am_ gauging their ability to talk with me through something they may or may not be uncomfortable with. This has worked well in the past and most people deal with this favorably.

If their response is "this is stupid, no one has to know how to do this" and they storm out with the same indignation that is present in half of these comments, then they are not the ones who dodged a bullet.


Is that a fallacy with basic algorithm questions, or with interviewing in general? How does another category of question provide objective feedback about problem solving ability instead?


I hear this a lot, but I've been in interviews where, yes, it really is about writing the BFS (or whatever algorithm they were asking for). I can remember one, where I was asked to write an algorithm that, given a vector of points, calculates the euclidean distance between every pair of points. I wrote out a one-liner in MATLAB that accomplished the task in about 2 minutes, but then they wanted to see it in C++. Then I wrote it out in C++, which I don't know so well, so it ended up being in pseudo-C++. Then we spent the remainder of the time quibbling about syntax errors and missing import statements. It was very clear to me the interviewer only wanted to see perfect, compiling C++ on the white board, and had no interest in my problem solving or thought process.
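For what it's worth, the naive version of that task is only a few lines in Python (a hypothetical reconstruction of the exercise, not the actual MATLAB one-liner from the interview):

```python
import math
from itertools import combinations

def pairwise_distances(points):
    """Euclidean distance between every unordered pair of points,
    keyed by the pair of indices."""
    return {
        (i, j): math.dist(p, q)
        for (i, p), (j, q) in combinations(enumerate(points), 2)
    }

pts = [(0, 0), (3, 4), (0, 4)]
d = pairwise_distances(pts)
print(d[(0, 1)])  # 5.0  (the 3-4-5 triangle)
```

Which rather makes the point: the logic is trivial; quibbling over C++ syntax tests something else entirely.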


> It was very clear to me the interviewer only wanted to see perfect, compiling C++ on the white board, and had no interest in my problem solving or thought process.

It depends on the position you apply for, I'd say.

Some jobs require C++ and algorithmic skills. If you don't know how to write a basic algorithm in C++, then you might not be the best fit - and you might not want to join the company as a junior developer.


FWIW, this example was for a company that refused to tell me what the job description was and what the position was. They didn't tell me ahead of time which languages they preferred or what kind of work I would be doing. It's a company everyone knows and is renowned for their secrecy. While they refused to tell me what they were looking for, it became clear during the interview they were looking for something very specific.


I'm fairly certain I know which company you're talking about - I'd recommend you redact the specific question they asked, because I'm aware that they do check online (including HN) for interview questions that were asked. You have plausible deniability as long as you don't name the company, but given the NDA they throw at you it's probably better to be safe than sorry :)

That being said - sorry to hear that was your experience. That shouldn't have happened, but it does.


Well, at least you knew upfront there was a high chance of it being a waste of time.


> Some jobs require C++ and algorithmic skills. If you don't know how to write a basic algorithm in C++, then you might not be the best fit - and you might not want to join the company as a junior developer.

If it's down to syntax errors, and the logic is correct, what the heck does it matter? Almost everyone writes code in an IDE that will deal with those syntax errors.


Let's say you have 2 people in front of you and they both design the same algorithm and come up with the same solution (sometimes that's really the case).

One writes code that cannot be compiled while the other's does. Who scores higher?

Now, let's say you are a big company and have 100 applications for that position and do the math...


No I got that point. You missed my point; it's completely unnecessary when there's a more productive way to observe someone approach a problem.

EDIT: I noticed you edited your post so it looks less inflammatory. Not cool.


Not cool that he made it less inflammatory? I think you're missing the goal of HN here.


Disagree. Editing a post like that without leaving a message about why it's edited appears disingenuous.


It does make the reply look more brash, though. Good that [s]he made it less inflammatory, but it would have been courteous to add "edit: made the tone a bit nicer" or something.


That all depends on the intention. If the intention is to make my response look more hostile, then I would say it goes against the conduct of HN. They could have replied or added a remark in the edit. The poster could have said "editing, sorry for getting too heated" or something if the intention was to deescalate, but this was a ninja edit.


The edit button exists for a reason. Your regard for what is and is not cool is of little consequence. No one owes you an explanation.


And no one owes you a white boarded BFS implementation which bears no resemblance to how they would approach a problem in the real world.


Let us all be grateful that you were never asked for one then.


Oh I know you weren't asking, I was merely responding to the incongruence apparent in your own expectations of the world.


If your first contact with a potential hire is asking them asinine, pointless riddles meant to throw them off balance, you are going into the process with an adversarial relationship.

Which is probably the root problem. Almost all tech hiring is carried out like a war between companies that want results and candidates who don't want to be treated like shit.


I never understood this. What does that even mean, and what is the good and the bad way to approach the question?

Because BFS has two approaches: 1.) I know BFS or a similar example and will write it with no thinking. 2.) I am figuring it out in my head while saying things to keep you pleased.

You don't need to know an employee's internal thought process. Practically, BFS tests whether you are able to write a BFS, which is cool by me. It is simple enough that people who were really unable to learn it should be filtered out.

(It is ok to not know the name, obviously. Just the ability to find a thing in a data structure should be the question.)


I'd like to say that I appreciate this comment even though I disagree. It's true that you can be a perfectly good or even above average developer without ever using BFS, and interviewers shouldn't pretend this is not so. It's also true that software development is a problem solving job. You are paid for your ability to think for yourself and solve a problem. BFS is artificial but also a nice self-contained exercise that any developer worth his salt should be able to figure out.


Yeah, I've seen some real horror stories about bad interview processes in the past, but this seems more like the guy outing himself as a defensive and impulsive person who immediately shuts down when he doesn't "know" something.

I can't remember Dijkstra's algorithm either, but I would happily try and write a 15 line brute force recursive maze solver in an interview.

Of course it's not something I expect anyone does day to day, but it's also the kind of simple problem that I'd expect almost anyone with a CS 102 level of knowledge to be able to reason out by just taking a few minutes with a pencil and paper. As an interviewer, even a brute force solution would demonstrate a good willingness to look at problems using your fundamental skills and reason about something you don't have a ready made solution to.


> I can't remember Dijkstra's algorithm either, but I would happily try and write a 15 line brute force recursive maze solver in an interview.

You'd fail the interview. The type of interviewers that ask this style of question are never interested in seeing the naive brute force approach.


My experience has been different. Usually the best thing to do is to write the naive approach first and then let the interviewer guide you towards what they think you could improve.


This.

Implement -> Evaluate -> Refactor

I'd rather have someone who could caveman the first approach, recognize the inefficiencies, and improve the execution than someone who rattles off a memorized algorithm to a common problem.

If the interviewer is wanting the eloquent solution right out of the gate, then the manager might be hiring for the wrong position.


This works best when you've formed a rapport with the interviewer. If you haven't then you've already bombed.


Have you ever tried to implement the KMP (https://en.wikipedia.org/wiki/Knuth%E2%80%93Morris%E2%80%93P...) pattern matching algorithm yourself?

I tried doing it -- even with the knowledge of how it works, it took me at least a day to get to an acceptable implementation.


OK, so you are just going to assume that any interview you can't pass is broken?

There have been interviews I have failed and instead of blaming the interviewers I took the time to actually study the problem I was given. This has helped me become a better programmer.


No, i was just presenting a counter argument. I never had to actually implement KMP outside of uni though.

Interviews that require extra knowledge outside of what the job actually asks for are definitely broken.


That has not been my experience as an interviewee. Implementing the naive solution for small n has been well received in my interviews, and from there (imperfectly) optimizing it is a collaborative experience.

I don't doubt you've experienced differently. But if you have, I think it's likely the interviewer, not the question. Lots of interview questions can be abused, not just basic algorithms and data structures questions.


It's incredibly frustrating to be in a room with someone who will pop whatever random trivia they want at you and watch you struggle against artificial constraints to appease them.

I'm loathing the potential need to ever interview for a tech position again, because even on trivial things like Error trait impls in Rust I end up opening 3+ pages of documentation to verify I knew what I was doing, or often figure out better solutions with methods not on my brain's hot path of solving the problem.

There's nothing pleasant about having your tools taken away from you and then being looked down on when you are like a fish out of water. The OP even talks about it when describing the drain of technical interviews - nobody wants to be judged at their worst by strangers. It's humiliating.


> > How many people can actually write BFS on the spot, without preparing for it in advance?

> Ughhh, should I tell him?

Yeah, that's where I stopped reading. This isn't specialized knowledge. If you know how to program, you can write a BFS. The entire algorithm structure is in the name. The only extra details you need to remember or figure out are "use a FIFO for tracking what to check in the future" and "make sure you don't go backwards". Neither is some great secret. They're obvious if you sit down and think through (or talk about with the interviewer) the structure specified in the name of the algorithm.
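Those two details really are the whole thing. A sketch on a hypothetical cyclic graph (Python for brevity; an illustration, not the one true implementation):

```python
from collections import deque

def bfs_find(graph, start, target):
    """Return True if target is reachable from start, searching breadth-first."""
    queue = deque([start])          # FIFO of what to check in the future
    visited = {start}               # "make sure you don't go backwards"
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return False

# A small graph with a cycle: 1-2, 2-3, 3-1, 3-4
g = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
print(bfs_find(g, 1, 4))                  # True
print(bfs_find({1: [], 2: []}, 1, 2))     # False - unreachable
```

The `visited` set is what keeps the cycle 1-2-3 from looping forever.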

This person is either not a good programmer or has a very poor attitude about solving new problems.

Hiring isn't broken. I'd reject this guy too. Knowledge about a specific technology is a lot less valuable than general knowledge and willingness to explore.


My sentiment would be: "hiring is broken, but I'd reject this guy too".


You know what is broken? Empathy. Every individual has certain skills and strong and weak points. To that effect, have empathy and understand the person being interviewed is not in as comfortable a position as the interviewer. An uncomfortable situation, where everyone is peering at what you do, messes up some people's thinking.

Programmers need a zen place to concentrate and focus. "Homework" assignments would be a good measure to test skills, and questions about the decisions made in them and how the candidate would improve the code given the use cases would be more comfortable - now we are on a level playing field. My two cents.


Unfortunately, homework assignments are being abused to the point of being absurd.

Around here, the trend is requiring a 16-20 hour assignment just after a brief 15 min phone call, almost before starting the process at all.

Not only is it a ridiculous amount of effort to ask of someone who may already be doing 8 hours a day of programming, but it's also a problem for the company. Either they have to spend significant effort evaluating each candidate's submission or -more likely, sadly- they only give it a perfunctory look and discard many on first glance. And considering that often the person doing the evaluation has their own tasks to do and is doing it in some spare time, they will tend to not make much of that effort.

The result is the company still needs to do significant effort and the candidate gets frustrated because they have had to spend a significant amount of time and, after having to wait for a couple of weeks -at the very least-, they get generic and useless feedback saying simply that they did not meet the expectations or whatever.

I would not mind at all doing a two-three hour on-site assignment. If you're there, they will at least have to make a similar effort as you are. I find this much more fair both as a candidate and as an employer.


> I would not mind at all doing a two-three hour on-site assignment. If you're there, they will at least have to make a similar effort as you are. I find this much more fair both as a candidate and as an employer.

I've had this exact thought. I got interview homework last year, was told to spend "no more than 4 hours" on it, etc, was mildly interesting. The next interview we'd have a discussion about the code and I'd present my rationale for various design decisions etc. But then I got the email that the interviewer had reviewed the code with an engineer and they decided not to move forward because there was a mistake and they "thought there would be more testing"! I felt that it was quite imbalanced to have spent 4 hours on this task, only to have it rejected without discussion after a short review.

So in the future I'm going to suggest that I'll be happy to do whatever homework assignment as part of a pairing exercise. That way I can see what it's like to work with them too.


If a recruiter/hiring manager sends one of these nonsense assignments to me, I refuse them outright and thank them for their time. It only took me one frustrated night trying to bang one of these out after a long day of work to learn my lesson. This method of testing a potential hire is lazy, ineffective, and frankly rude IMO.


The strategy here is to set aside one hour to code, write the code, send what you have after an hour and say what remaining work needs to be done, along with reasonable estimates of remaining time (noting that you're too busy to spend actually doing it).

Your success rate will be no less than someone who spends 15 hours doing whatever bullshit assignment there is.


I did exactly what you just suggested. I never heard back from that particular recruiter. Bummed me out at the time, but now I just laugh at it.


I find empathy hard here. If the author had said they froze or panicked and blanked on BFS, then that's understandable. But that's not what happened here.


> I find empathy hard

"If only the author did these very precise things that they just so happened not to do, THEN I would find empathy". I'm pretty sure that's not how empathy works.


> This person is either not a good programmer or has a very poor attitude about solving new problems.

He probably isn't a good "programmer." He is a web developer, and has projects that are basically CRUD input and output.

He can stitch together various different components to build a functioning end product. This is a fine skill for most agency type jobs, and he will likely have a long prosperous career as a contractor.


You don't have time to design an algorithm on site; you MUST write it down exactly right, right now, without errors. That's the problem.


That's not how interviews work. You are not alone in the room being judged by an all-seeing eye. You are talking to a human, and each of you is evaluating whether you want to work with the other.

As an interviewer, I'm always happy to talk to the candidate about the question I ask. I'm perfectly happy to answer "What is a breadth-first search? Why does it solve this problem?". I'm perfectly happy to talk about algorithms with them, and ask leading questions if the candidate explains where they're having difficulty.

As a candidate, I'm not interested in working with an interviewer who doesn't approach the interview the same way. Remember, all interviews are bidirectional.

This person's stated reaction to the question was anger that it was asked. Most likely masked in person, but even masked tells us a lot. It tells us he didn't actually talk to the interviewer. If he had talked to the interviewer and got garbage back, he would have been angry about how awful the people were, not that some technical question was asked. If he'd talked to the interviewer and got useful information back, he wouldn't have thought it was a bad question. The only options left are trying to stumble through it silently and getting it wrong, or sullenly saying "I don't know" and moving on.

And that's the problem. A good programmer isn't measured by being able to make things. A good programmer is measured by being able to work with others and make a positive contribution in a group. Everything this person is describing is how to make a bad first impression during an interview. Maybe he's a great programmer, but he's utterly failing to actually convey that in the important aspects during the interview.

I wouldn't hire him, and this writeup is exactly why. He needs to demonstrate that he's someone people want to work with. This article does the opposite.


>As an interviewer, I'm always happy to talk to the candidate about the question I ask. I'm perfectly happy to answer "What is a breadth-first search? Why does it solve this problem?". I'm perfectly happy to talk about algorithms with them, and ask leading questions if the candidate explains where they're having difficulty.

This is something I feel a lot of interviewees aren't aware of, you DON'T have to know EVERYTHING.

I didn't know what a BFS was (I am primarily a front end web developer), so in an interview, I'd simply ask "Could you explain what that is? I'm sure I can find some way to do it."

As an instance, I looked up "Breadth First Search" on google just now, and saw that it's just a way to search a tree one generation/level at a time. Once I knew that, the naive approach is simple (I'm sure there is a better way): start a queue with node(0,0), and loop until the end of the queue; if you don't find the correct value, add the node's children to the queue, and keep going.
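On a grid that naive approach looks something like this sketch (Python; the maze, with "." cells and "#" walls, is a made-up example):

```python
from collections import deque

def shortest_steps(grid, start, goal):
    """Fewest moves from start to goal on a grid of '.' cells and '#' walls.
    Brute-force breadth-first search; returns -1 if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, 0)])     # (cell, steps taken so far)
    seen = {start}
    while queue:
        (r, c), steps = queue.popleft()
        if (r, c) == goal:
            return steps
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), steps + 1))
    return -1

maze = ["..#",
        ".#.",
        "..."]
print(shortest_steps(maze, (0, 0), (2, 2)))  # 4
```

Because the queue is FIFO, the first time the goal is dequeued is guaranteed to be via a shortest path.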

I totally feel for the guy, I've been rejected more times than hired, that's for damn sure, and each time is a HUGE blow to the ego, but you have to pick yourself up. I feel he is a great contract developer (not a dig, I am also a contract developer), but not necessarily someone I would hire full time.


> Once I knew that, the naive approach is simple (I'm sure there is a better way).

No, that's it. That's why so many are ragging on him; it really is that simple.


I have asked HR persons (in large corps) and got a clear, direct answer: you MUST solve two tasks in the interview; it's a necessary condition. If you don't solve them within the hour you are rejected. In small companies conditions are much more relaxed, but in large corps, where the interviewer screens hundreds of applicants, he/she just doesn't want to waste time.

This system is corrupted, but it looks like it suits big corps just fine. I have never even been asked about my real experience in a large corp interview. They just don't care about your github. Small companies ask about experience, but unfortunately some small companies copy the FANG approach to interviews; cargo cult is magnetic.


I've had a successful programming career since the 80s. Including a fair share of browser front end programming.

As far as I remember, I have never written any BFS code, and I would also find it absurd to be asked to do so in an interview. There is great library code that does this in any reasonable situation.

If you're the kind of shop that would rather write your own code than use third party libraries, then that's a really great sign that I shouldn't work there.


If you're the kind of shop that would rather write your own code than use third party libraries, then that's a really great sign that I shouldn't work there.

Do keep in mind that you're eliminating the places that wrote that "third party library" in the first place. Libraries don't just grow on trees (though they can help you navigate a tree).

Just a few weeks ago I had to knock out a test framework, creating an API from scratch (basically an object model on top of some websockets messages), because there's no library that's going to do what we need. Oh, I friggin' abused Python's import keyword to hell and back, but there was a lot of stuff that just needed to be written from scratch. Given there are a few tree-like structures, I'm sure I briefly weighed depth-first vs. breadth-first before hammering out the code. And this is nothing exotic, just a small company working in the industrial controls industry. And such a task is nothing exotic, it's why they hired me. But I think to effectively create something like this in an efficient manner, having at least a rough idea of something like BFS is table stakes.

One is not being asked to implement a red-black tree on a whiteboard, just something that I think a competent software developer could come up with from first principles. A company asks you, "which direction will you be searching, and how would you do that?", and you'll reject them. I mean this with due respect: you're probably right, neither party will want you to work there, and the filter worked. Hurray?

I'll tell you what grates me, though: the companies that insist they need someone like that, and then the new employee finds out that the culture is so borked, that said employee will never get to actually do that stuff. "We need test infrastructure." Turns out the reason they don't have any isn't for lack of someone to write it, it's lack of culture to do anything with it. As just one example.


If I had to write a fresh impl of BFS, I for sure wouldn't just go off what my algorithms book said a decade ago. I'd reference how other languages and frameworks have implemented and matured the problem in actual deployment and use their versions as the basis of mine (assuming license compatibility).

Writing code in a vacuum is humongously dangerous, even for simple things like piping strings. It's why so much C software ends up chock full of overflows, segfaults, and myriad security vulnerabilities.


All good reasons for not getting stuck in front of a whiteboard redoing the unprotected, unoptimized core of just the basic algo.

Wouldn't it be nice to be able to use references / existing implementations while at the board, being watched and timed?


At least then it wouldn't be purely artificial. I'm not surprised at all that there are so many people "bad" at technical interviews when they are fundamentally designed to leave you a fish out of water. Nobody likes to flounder, but some people it shakes much more than others.

But I wouldn't think for a second those people would be incapable of being good team members or productive coders. You just let them stay in their comfort zone and approach uncertainty in the ways they are comfortable with. They will know the best way they can approach solving a problem, and if you were testing for that instead in these interviews, they wouldn't be so harmful to participation in tech by "minorities" such as women or people of color.


>> I have never written any BFS code,

I'm primarily a backend dev who did a little js a couple of weeks ago that required me to populate a dynamic tree with the results of an api call.

Although I think I went depth first, so I guess technically you're right....


I can't write BFS off the top of my head, but I can probably do it after reading the Wikipedia page on it. The last time I interviewed (for the company I'm working for now), I faced a similar question, and said up-front, "It's been a decade since I've had to implement that, so let me Google for a quick sec." I proceeded to do so, skimmed the Wikipedia article, and wrote the code in the shared editor we were using. I got the job, so the interviewer must have seen my candor and quick comprehension as positives. But I know (based on the parent comment here, as well as discussion with colleagues) that this sentiment is not universal.

If I had claimed a lot of algorithm-heavy experience on my resume, I would have expected my response to be met much more harshly. But, as my experience was focused more on API design and interactions with business stakeholders, it wasn't a useful question to gauge my competence. However, it was useful for gauging my personality. Like everything, context is vital.


Absolutely this. The better question is, given several data sets, how would you approach traversing them. What is your intuition, and what are the limits of your awareness of how to approach the problem.

That is actually informative about your programming ability, not regurgitating buzzwords (though BFS is a light one) or rote memorization.

Like just hearing these kinds of questions is infuriating because I often immediately ask things like "is the data processing complex enough to justify threading it? what are the synchronization points? if the data processing is variable, we probably want a job pool, etc". The performance of code is almost always nonintuitive until you have an implementation done in order to optimize it, and questions about searching graphs are almost always these "optimize light" problems where they want you to really know how to do the navigation right because of my precious 10 cycles per branch, but don't want to even consider the operating environment that could influence the decision in anything but a purely academic setting.


I was taken aback by the BFS thing at first as well, but after thinking about it, I'm not sure I would be so confident about writing one if I didn't already know the basic structure due to the 50 or so times I had to implement some version of it in college. I don't feel like I can judge, because I was forced to implement most of the basic data structures and traversals so many times that I can usually re-create them based on knowing 3 or 4 steps and filling in the gaps myself. I think a lot of software engineers also had this experience, and so they expect it of others, but if nobody forced you to write a path-finding algorithm inside a composite PR quadtree in college then you are at a huge disadvantage for interviews, but probably less of a disadvantage for actual practical programming.

Be honest, how many of us implement our own data structures these days? I sure don't. I just build on or use whatever version of map comes with the language 98% of the time.


> Tree data structures are really common in front-end.. like the DOM....

99.999% of front end developers do not solve those kind of problems.


Also there are front-end languages (XPath, CSS selectors, etc) specifically for these types of structures, so you don't have to traverse them.


Yeah, just pile on another layer of code you don't understand. And then wonder why it's slow or doesn't work the way you expect.


I know, so crazy, that's why Facebook was written in x86!

Fun fact: I'm lying.


Software developers always have to work on top of some sort of abstractions throughout our career, correct?


That doesn't mean you shouldn't have a reasonable understanding of what those abstractions are doing, though.


For the purpose of selecting a DOM node in a performant manner, you really don't need to know how CSS/XPath traverse the DOM. You just need to know that querySelector exists.

In general, I think it's enough to know what your abstraction does, rather than how it does it.


There's a huge difference between using an abstraction with vs. without understanding what it does. Often the latter leads to more layers of abstraction piled on top to "fix" the problem, until everything sorta works and is barely workable at the same time.


"All non-trivial abstractions, to some degree, are leaky."

https://en.wikipedia.org/wiki/Leaky_abstraction#The_Law_of_L...


It's building on top of the abstractions, not depending on someone else's implementation of all the abstractions...


And if the candidate will be writing a new DOM traversal API, this is a valid exercise.

As it stands, any of the libraries that do DOM manipulation for me also traverse the tree for me ... so, again, why do I need to implement a toy BFS in an interview?


To debug the traversal algorithm when it breaks in production?

Also, if you know that all the companies do this, why don’t you just study beforehand? Tree traversals come up multiple times in the article. I understand the author’s frustration but not their (early) insistence on not studying for an important interview. When they did study later, I wonder how seriously they took it, based on the tone.


>To debug the traversal algorithm when it breaks in production?

And what's the likelihood of that happening with a well-used framework? Maybe the interviewer wants to see my ability to debug problems - great let's do that instead. You want to see how I work? OK, let's pair on something. Pinning this to "implement BFS at the drop of a hat on a whiteboard or in this project" is bullshit.

>Also, if you know that all the companies do this, why don’t you just study beforehand?

Fine. I'll implement a few solutions to the problem in the languages I'm familiar with and put it on GitHub. Now the interviewer can check it out and we don't have to waste time at a whiteboard.


> what's the likelihood of that happening with a well-used framework?

In my experience, higher than you'd hope.


You have heard of the monkey - ladder - banana experiments?

Just because the interview culture has evolved to this awkward unrepresentative generally unhelpful filtering method, is no justification for "well just do it that way and you'll be fine".

If I'm the new monkey, and someone smacks me for touching the ladder, I'm going to ask why and push back if there's no good answer.


Because one day you might need to, and you'll write 100 lines of code to do it. If you had known what a BFS is and how to write it, you'd be able to accomplish the same task in 10 lines of code.


Counter argument: I could google it, read about it, and implement it - all of it in half an hour with a computer with internet. Not on a whiteboard. Which is a fucking stupid interview process.


Very unlikely in my opinion. Either you’re told about it beforehand, inherit a project that’s using it, or introduced to it during your PR. Especially in places that ask about them during an interview...

I don’t think it’s about the BFS, it’s about attitude.


Mostly because it is simple enough and does not require special knowledge except knowing what a tree is. Coming up with an original situation is harder and would put the candidate in a more random position.

Maze solving could be done breadth-first too. It is an algorithm that can be used in many different contexts.


Yikes... Interviewing sucks but half the reason interviewers ask these types of questions is to see your attitude and how you respond. Writing an Instagram clone in Angular doesn't really tell me much about your problem solving skills when faced with a unique problem.

How many software developers work on "unique problems" that involve hard computer science problems compared to the number that are just using an existing framework to solve business problems?

Tree data structures are really common in front-end.. like the DOM....

And most of the time you're going to end up using a pre-existing implementation....


Most projects require you to solve a small, simple, unique problem once in a while. And any long-term position requires you to regularly learn not-so-simple new things. Depending on your existing knowledge, the BFS question tests either whether you were able to learn BFS in the past or whether you can solve a small unique problem.

There are positions that never involve anything harder than BFS, but I would not say they are the majority.


In 20+ years of development, including 12 doing cross platform C work, two maintaining a bespoke development environment for Windows Mobile (compiler, VM, IDE, etc.), I've only once had what I consider a problem that needed any type of complex CS algorithm. That was to evaluate an algebraic expression in a string in C using the "Shunting Yard Algorithm".

And the 20 years of professional development was after I had been a hobbyist writing 65C02 and x86 assembly language....


Do you really consider breadth-first search a complex algorithm? It literally is "you have a structure of related data; find the object with a given id by systematically going through the structure". That is all it does. Finding whether a directory contains a file bigger than x is an example of breadth-first search. That sort of thing.

It makes perfect sense not to remember the name, and that should be OK. But it really is not something complex - it is pretty much the simplest algorithm used to explain the concept of an algorithm to students.


I’m not saying that, given a scenario, I couldn’t figure out a way to do it. But at the level of a job that I would be applying to after all these years, would a company really be more concerned about whether I could write a breadth-first algorithm, or about how well I could design a highly available, fault-tolerant, redundant system and write maintainable code, my mentoring and leadership capabilities, my experience with Domain Driven Design, my years of experience designing and developing “cloud first solutions” (yeah, I groan a little bit when I write that), and certifications?

My current employer asked me no programming questions even though I supposedly came in as a “Senior developer” in my small company.

He was more concerned about everything I mentioned and how could I help mature the organization.

Heck, he didn’t even care that I didn’t know any $cool_kids front end framework.

Whenever my last day on my current job comes and I don’t expect that to be for a few years, I’ll probably end up working for a consulting company as an overpriced “digital transformation consultant”, “implementation consultant”, or “solutions architect”. Do you really think they are going to ask me about my leetCode capabilities?

The same should be true for anyone at a certain stage of their career.


I would kind of expect that if you can design a highly available, fault-tolerant, redundant system, then you would figure it out. I can see how you could find the question too easy to let you show what you can do, but that is the opposite situation. A lot also depends on the position. If the position is not a leadership position and does not require the specialized knowledge you have, I don't think the company does wrong by not asking about it. Then those easier questions have a place. I assume that an overpriced "digital transformation consultant", "implementation consultant", or "solutions architect" writes no code, or only a little of it. As such, a code-focused interview makes no sense. It is, however, different for a position that involves writing a lot of code.

I don't get insulted over basic questions, mostly because I worked in a company that did not ask programming questions. As a result, I had to work with a few people who could talk design and maintainability and seemed great socially, but turned out to be unable to write code except in the simplest situations. It was not good, and the harm was long-term, largely because of the resentment that built up in a team that had to do someone else's work while that person was treated as a superstar. Such a person needs a strategy to mask their inability, and those strategies are all toxic - masking one's own inability by blaming others, etc.

What are the alternatives out there? Take-home assignment, fizzbuzz, a simple algorithm, trivia questions, requiring you to already know the exact technology they use. Someone complains about every one of these. There is no hiring process that makes everyone happy and fits everyone, but IMO, as long as a company does not go to some crazy extreme, it should be fine. If the candidate has to balance red-black trees, then it is very clearly too much; figuring out whether a string is a palindrome is not.

There is also something good to be said about a repeatable hiring process, where the company can compare how people did in the interview with how they did in real life. As such, it will contain some generic or easy questions.


If you are getting paid to solve business problems, then why not have candidates solve a simple version of a business problem?

My interviewing process for developers is giving them a skeleton of a class with corresponding unit tests that they have to make pass.

Then they get another set of unit tests that they have to make pass without breaking the first set.

The code models a simplified version of real world types of problems we had.


That sounds good, really. It is better than asking about breadth/depth-first search. An added bonus is that the potential employee gets some idea of what he will be doing daily before making a decision.

Another way to screw up hiring is to select for an algorithm-loving geek who seeks algorithmic challenges and can recite obscure edge cases - and then put him in a position where he has to solve normal business problems or maintain unit tests, because that is what the job is. And then wonder why the genius is demotivated and does not seem to produce.

But I still don't think either of these search questions is outrageous or a deal-breaking question. It is within the acceptable range of questions - partly because it is so common that I would expect a person to quickly google it if they had heard it twice already and not figured it out in the stress of an interview.


After 20+ years I'd expect you to be able to make it up on-the-fly.


I've only ever implemented BFS for coding challenges. Do you find yourself reaching for it on a regular basis in front-end development?


I've used a version of it before, to traverse a cached client-side hierarchy of domain objects.

The inevitable response to that is "well that's an exception, it's only occasionally needed." True, as far as it goes. But if you exempt all of these "rare special cases," you've suddenly exempted 10% of the work. That 10% is among the trickier bits (though the hardest is always organizational and product).

I also hate to say this, because it's snotty, but as far as obscure algorithms or tricky, complicated algorithms go... BFS and DFS don't really fall into those categories.


>But if you exempt all of these "rare special cases," you've suddenly exempted 10% of the work.

I don't think that's the issue though.

The issue, as I see it, is not that you had to figure out how to implement BFS. I think many of us are confident we could do that under professional working conditions. If I had a day and some data structures to poke around with, I'd write a killer BFS supported by tests if I needed to (I've had to do similar things in the past).

The issue is that interviews expect us to recall this kind of specialized and rare problem solving like it's a day-to-day thing. We as candidates are being judged on how well we can come up with a novel solution on the spot to something that many of us will need to do once or twice in a career, and will be able to solve under completely different conditions.

In short, for most candidates, these kinds of questions don't test anything realistic, and a lot of people have an issue with that. It's like judging whether you want to go to a chef's restaurant based on how well they did on Chopped! They might do well under pressure simply because they get a lot of practice with it - they're always in the weeds because their restaurant is poorly run. A terrible dining experience might translate to a win on a cooking competition show, because the show isn't testing the chef on the experience they provide you. Just as these interview questions don't test you on how you'll actually be interacting with code and solving problems day to day.


I don't want to hold up whiteboard interviews focused on algorithms as the be-all, end-all. They're a poor proxy for day-to-day skills: the hard stuff is organizational, knowing all the things that can go wrong, and technical design that's both flexible to changing conditions and easy to work with. None of those are really amenable to probing in a 45-minute interview, or even several hours of pair programming.

But it's about friction: I want my coworkers to be focused on the genuinely hard problems, not spending a day writing a BFS. The current interview process does manage to probe that.

Going a bit deeper, the whiteboard interview process is a good proxy for ability to prepare over the medium term (a month or three of consistent studying should give you as good a chance as anyone to get into a generalist position at a prestige company) and of IQ. The latter is controversial and most organizations can't test for it directly (owing to legal concerns), but a relatively high IQ is a core requirement of technical roles, and whiteboards provide a solid proxy for that when coupled with the opportunity to prepare for them beforehand.

That said, I'd always go for someone who has the ability to deliver on complicated, large projects over someone great at whiteboards or who has a high paper IQ. It's just that it's pretty much impossible to evaluate for that in a way that works on a general application pool.


This is a very astute answer. I actually do think well-structured algorithm questions that are variants of common ones might be strongly g-loaded, i.e. they test knowledge + IQ.

I would say big companies that hire new people or generalist programmers benefit greatly from them. These types of questions test for intelligence at scale without outright being an IQ test. In fact, I think I read somewhere that Google engineers' performance is strongly correlated with how well they do on their interviews.

Sure, these questions might not test for conscientiousness, curiosity and general agreeableness, but that's why you have the other portions of the interview.

All this being said, I don't necessarily think these questions work at most companies where the engineering teams are significantly smaller and you can really spend a lot of 1-1 time probing knowledge and experience and reviewing coding exercises. However, I would still say basic DFS/BFS and using a hash to solve a problem is still a must. I use them on occasion, even as a generalist.


I think the goal of hiring is always: "We have a particular business problem, and need to determine: if we pay this person will they be able to help us solve it?"

As much as I think certain companies shoot themselves in the foot by spending their interviews asking trivia questions about e.g. Rails (because that's what they use), it's not unreasonable that they might just more highly value having someone who can be productive in their Rails world immediately against someone who, while excellent, nevertheless needs a week or several to onboard the ecosystem. Or in other words, they think the goal question is answered in the affirmative more strongly for someone already in the ecosystem. A dangerous assumption, but possibly valid. There are always tradeoffs to be made in the specificity and category of your questioning, but interviewers need to keep in mind how they contribute to the overall goal's question.

For larger companies, it's more likely that there are more problems to solve, and more likely a need to get people working on them sooner rather than later. So the incentives start pushing toward addressing the goal of hiring (if not for all positions, at least for many) by answering a simpler question - "is this person smart and gets things done?" (which is just high IQ + conscientiousness) - and hiring en masse. If you put such people to work on any of the problems, you can rightly expect they will advance them some amount, even if they first have to ramp up on a programming language / framework / whatever feature of the problem's environment, and as time goes on they can shift around the company to where they're even more effective.

In addition to being fast, it's also very fair with respect to people's backgrounds. Suddenly it doesn't really matter if you have a thousand widely used GitHub repos or not, have a 10-year experience headstart or just graduated from a bootcamp, have a degree or not, or already know the framework or not; people from the "wrong background" can still be hired if they can demonstrate they are sufficiently smart and conscientious to start working on one of the various problems you needed work on yesterday. The fact that we have to proxy this with whiteboard hazing sucks; it's expensive for everyone involved compared to just asking for ACT/SAT scores or the results of a previously taken IQ test when available, and giving an IQ test when not.

In my own interviewing I try to proxy for "smart & gets things done" (because that's all my teams at my current company have needed) in a less asymmetric/aggressive way while answering other questions, since we can't hire everyone. But even the aggressive whiteboarding is better than every job interview screening for highly specific backgrounds and/or trivia knowledge...

And I agree that BFS/DFS is appropriate, though needlessly so (i.e. there are even simpler questions that take less time), as a first step in answering the business goal which is simply verifying: can this person who says they can program, actually program? Unless you're willing to train people to program, you have to start the cutoff somewhere. It's modestly appropriate to answer the question of: does this person know their craft's basic tools?


> Going a bit deeper, the whiteboard interview process is a good proxy for ability to prepare over the medium term (a month or three of consistent studying should give you as good a chance as anyone to get into a generalist position at a prestige company) and of IQ.

This is where I have a problem. Why are we the only industry that has this expectation of months of preparation? Engineers don't do it. Doctors don't do it. Professors don't do it. Why?


I did use BFS/DFS or things like find/union in my job. But I'd say it's mostly about how quickly you can wrap your head around abstract concepts.

An analogy from finance: if you take a pen and a paper, and think about it for a few minutes it'll be clear that buying a stock and a put option is equivalent to buying a call option (and a bond, since you're also locking in some capital). This is fine, and most people can do that. But the brilliance happens only when such things are immediately obvious to you, in the same manner as you don't need to consciously decode English while reading this sentence. It's another question whether this or that particular company really needs brilliance.


No, but it's a really easy algorithm; it really shouldn't be too hard to implement given a description of it.

(not accounting for the people who are just bad at writing anything on the spot in a high pressure interview situation, of course)


It doesn't matter if it's a really easy algorithm if it's trivia or unrelated to the job.


How do you know that implementing a feature very close to how BFS works won't be necessary in the job? You can't. At this point asking if a candidate can make BFS is akin to asking if they can count. Same thing for inverting a binary tree or doing fizzbuzz. They're general and easy enough that anyone with a programming background should be able to figure them out.


Given that the hiring process has time constraints (however you do hiring, you can’t use infinite time, so you need to pick what you’ll include in the process), why spend time testing the candidate’s ability at something that only might be necessary in the job? Wouldn’t that time be better spent testing something you know will be necessary?


You ask about things you know will be necessary as well. In an interview the time shouldn't be spent entirely on academic programming questions. They're there only to gauge problem solving skills in real time in a stressful environment, which work sometimes is. Who says you can't fit any of the above mentioned programming problems in 10 to 15 minutes?


Wouldn’t asking real-world questions that the person will definitely run into during the job also gauge problem solving skills? Unless problem-solving isn’t part of the real job, in which case it’s not worth testing for.

If your point is that you do a mix of questions that are definitely relevant and some that are academic and might be relevant, why not just do 100% known-relevant questions? What’s the value add of asking questions that are less than 100% relevant?


Here's a simple way to quantify the argument: go find the most popular Node or React front-end projects on Github, and then find out how many of them --- in their own source code, not in the vast, unending dependency tree NPM generates --- contain a breadth-first search.

My guess is: not very many.


I’ve had to write tree searching algorithms at multiple companies, although I usually find myself reaching for depth-first search instead. It comes up often when dealing with hierarchies (think things like company org charts for HR software, for example).
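As an illustration of the kind of hierarchy traversal described here, a short depth-first sketch in JavaScript (the org-chart shape and names are made up for the example):

```javascript
// Depth-first, pre-order walk of a hypothetical org chart, where each
// node is { name, reports: [...] }. Collects every name into `out`.
function collectNames(node, out = []) {
  out.push(node.name);
  for (const report of node.reports ?? []) {
    collectNames(report, out); // recurse into each subtree before siblings
  }
  return out;
}

const orgChart = {
  name: 'CEO',
  reports: [
    { name: 'CTO', reports: [{ name: 'Engineer' }] },
    { name: 'CFO' },
  ],
};
console.log(collectNames(orgChart)); // ['CEO', 'CTO', 'Engineer', 'CFO']
```

Recursion makes DFS the natural fit for hierarchies like this, which may be why it comes up more often than BFS in practice.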


I stumbled a bit on "write BFS" at a FAANG interview. The specific question was to write it for the Facebook friend graph. I just blurted out my reaction, which is that this was a terrible idea (given the space requirements). His response, "Well, pretend it's not.". Ugh.


Huh, what's terrible about it? BFS is exactly what I'd want.

Let's make up a synthetic problem, like "the closest friend which has property X". Then you'd first check all of your direct friends for X, then check friends-of-friends, then friends-of-friends-of-friends, and so on, until you either find a friend with property X or run out of time/space.

This is the classic BFS problem, and I can see how it would make a good interview question.
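A hedged sketch of that "closest friend with property X" search in JavaScript (the `friends` callback and demo network are invented for illustration):

```javascript
// BFS from `start`, returning the first node (by hop count) matching `pred`.
// `friends` is a hypothetical function mapping a person to their friend list.
function closestMatch(start, friends, pred) {
  const visited = new Set([start]);
  const queue = [start];
  while (queue.length > 0) {
    const person = queue.shift();
    if (pred(person)) return person;
    for (const friend of friends(person)) {
      if (!visited.has(friend)) {
        visited.add(friend);
        queue.push(friend);
      }
    }
  }
  return null; // exhausted the reachable graph without a match
}

// Tiny demo network: alice -> bob, carol; bob -> dave
const network = { alice: ['bob', 'carol'], bob: ['dave'], carol: [], dave: [] };
console.log(closestMatch('alice', p => network[p] ?? [], p => p === 'dave')); // 'dave'
```

Because nodes are dequeued in hop-count order, the first match is guaranteed to be a closest one, which is exactly the property this problem needs.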


It depends a lot on the specific question, and the connectivity of the graph, but in general, BFS can use space proportional to the size of the entire graph, which for the FB friend graph is huge. Even on a machine with a lot of RAM, you shouldn't assume this will work.

DFS or IDFS can generally use space proportional to the diameter of the graph, which is far smaller.

That caveat turns out to be so bad that I've never seen BFS used in practice outside of a classroom. And indeed, I first thought the point of the question was to elicit this complaint. The interviewer wasn't on that page, though.

The problem being asked was considerably more complex than "closest friend with property X". I don't recall the details, but perhaps it was something more like "find the ten shortest friend paths to (a unique but unknown) someone with property X, where those paths share no nodes".
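For comparison with plain BFS, here's a sketch of iterative-deepening DFS ("IDFS" above) in JavaScript; memory stays proportional to the current depth limit because only the current path is tracked. The helper names and demo graph are invented for the example:

```javascript
// Depth-limited DFS: checks `pred` at each node, recursing no deeper than
// `limit`. Only the current path is kept, so memory is O(depth), not O(frontier).
function depthLimitedSearch(node, pred, neighbors, limit, path = new Set()) {
  if (pred(node)) return node;
  if (limit === 0) return null;
  path.add(node); // track only the current path, not all visited nodes
  for (const next of neighbors(node)) {
    if (!path.has(next)) {
      const found = depthLimitedSearch(next, pred, neighbors, limit - 1, path);
      if (found !== null) return found;
    }
  }
  path.delete(node); // backtrack
  return null;
}

// Iterative deepening: rerun the depth-limited search with growing limits,
// so the first match found is also a closest one (by hop count).
function iddfs(start, pred, neighbors, maxDepth = 10) {
  for (let limit = 0; limit <= maxDepth; limit++) {
    const found = depthLimitedSearch(start, pred, neighbors, limit);
    if (found !== null) return found;
  }
  return null;
}

// Demo on a small chain a - b - c - d
const chain = { a: ['b'], b: ['a', 'c'], c: ['b', 'd'], d: ['c'] };
console.log(iddfs('a', p => p === 'c', p => chain[p], 5)); // 'c'
```

The trade-off discussed below applies: shallow levels get re-explored on every iteration, which buys the much smaller memory footprint.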


Agree re "depends a lot on specific question", but the problem you specified still sounds very much like BFS, especially the "shortest path" part.

My assumption for the Facebook graph would be that there is basically no way we can traverse it all, so your only hope is to find the path without expanding all the nodes. DFS will not work for that at all, but both BFS and IDFS may give you practical results.

This leaves the question of BFS vs IDFS, and that depends heavily on the details of the problem. For example, if the graph is already in RAM, then IDFS would be the best. But if the graph is not already in RAM and you have to fetch it (from a database or remote API), you'd definitely want caching between successive IDFS rounds. And if you do that, then you might as well do BFS -- approximately the same memory performance, and much easier code.

As for usage, while BFS itself is not used that much, its more advanced versions, Dijkstra and A*, are used all the time in graph traversals -- for example, in many computer games, navigation apps and robotics planners.

(And back to the original topic: if we had a conversation like this during the interview, then you would likely get a good score from me, even if I was fully convinced that BFS is the only way to go. After all, I am not testing for a specific bit of trivia -- I am testing for the ability to reason about algorithms.)


> DFS or IDFS can generally use space proportional to the diameter of the graph, which is far smaller.

Maybe, but for "find the closest friend with property X" basic DFS is completely useless. It's likely to give you some distant rando with property X. Fast, sure, but not useful if you're looking for a friend.

Besides, if you have cycles in your graph, DFS also needs to keep track of which nodes you've already visited, or it's never going to end. Or use iterative deepening, which you probably want to use anyway to prevent ending up with some distant rando. Slower than BFS but consumes less memory.

> "find the ten shortest friend paths to (a unique but unknown) someone with property X, where those paths share no nodes".

Ah, but that's a completely different problem.


As above, "IDFS" == "iterative deepening DFS".


Yeah, I figured that out some time after I succeeded in digging through some old memory for the phrase "iterative deepening". It's been a while since I've had those classes. I do remember that at the time I didn't see the point in iterative deepening. Surely breadth first was faster than redoing the same work every time? But if you see it as a more memory-efficient way to simulate breadth-first through depth-first search, it makes sense.

Still, intuitively I'd expect that which one is actually more efficient depends a lot on the shape of your search space, the cost of your memory and the cost of traversing your network.


If you're generating nodes in an abstract graph (e.g., moves in chess), iterative deepening can be incredibly space-efficient, whereas BFS can rapidly consume an exponential amount of space.

As you point out, IDFS (or IDA*, etc) work far better if you have a means to avoid re-exploring the same nodes repeatedly.

Nonetheless, run-time can theoretically be greater, since you're iterating. Theoretically, though, each iteration will take N times longer than the prior, which means that the running time up to the final iteration doesn't matter that much (because it's such a small fraction of the total). The extra bookkeeping required by BFS can easily outmatch that in running time.

Definitely, it all depends.


Getting around the DOM is really simple: querySelector and querySelectorAll do the work for you.

Accessing properties of a known object is better handled with the selector pattern. I guess you might use BFS to implement a search feature for user-generated content on the fly? I don't think it's relevant to most FED roles making CRUD apps with backend search APIs.


Ughhh, should I tell him?

I agree with the sibling commenter: responses like these are very condescending -- and basically prove the main point of the original article.

And BTW:

Writing an Instagram clone in Angular doesn't really tell me much about your problem solving skills when faced with a unique problem.

Neither do your made-up puzzle problems.


There's always the interviewing apologists out in force whenever someone brings up how broken the process is; this post is no exception.


Have you been on the other side -- of a too-optimistic interview? You know, you had a great-sounding candidate, had a nice, non-technical interview, and hired them.

Then the person could not really pull their weight. They did a few simple PRs, and it all looked good. You gave them a more complex task, and they just could not make it work. After teaching them basic CS concepts for a while, you give up and try to move them to backend -- and they do no better there. Then to data analysis -- no luck. You really do not want to fire them, but this seems the only way forward.

The resulting experience is painful and time-consuming for the team. You wasted many weeks trying to teach that person, and nothing good came of it. You promise that in the future your interviews will always contain technical questions, and no one who does not know about big-O complexity will be hired.


And asking people tricky questions doesn't tell you about their problem solving skills either; it just tells you that person is a good talker. Justifying tricky tree-traversal interview questions by comparing them to the DOM? That is a stretch. In any case, beyond _solid_ coding skills, the thing that makes someone a great member of a dev team isn't coding skills; it is a whole pile of professional best practices, social skills, and general passion for continuous improvement. It is simply not possible to test for those things in an interview.

The reality is that interviews are broken. Not because of this guy's subjective perception that it is so, but because as an employer you just aren't getting the SNR to justify typical interviews. People who cling to that approach anyway are largely, IMO, motivated by ego and cargo cult (lack of understanding and creativity). In addition, in that context, interviews are also broken because, in being worthless, they are demeaning to the candidate who isn't great at tap dancing on command.

What does work is past performance and actually working together. Both are problematic data to get at, so I don't think there are easy answers here. We do a resume review to see if they even claim to have the expertise we're interested in, a very short and simple "gut check" coding exercise (not tricky, just checking they can actually write decent code and tests), a 30-minute phone conversation where we check that both parties are aligned on what we're looking for, contract-to-hire, then exercise extreme discipline in parting ways with people who aren't great before converting to W2. Our SNR is pretty good. A lot of people don't want contract-to-hire, so this system has cons. YMMV of course.


I think my current company does a decent job of this. Yes, we'll ask you design questions on a whiteboard, but any algorithms or code implementations are done on a computer, and Google is encouraged.


The thing that appears to me, most overpoweringly, is the bitterness.

I understand, I've felt bitter about interviews before, I'm sure some interviews were unfair. But I also recognize this bitterness is a weakness of mine, unproductive, unattractive.

The way the author indulges in this resentment would make me very cautious about recommending him regardless of technical capability. An employee who has low tolerance for the massive randomness/unfairness in the world probably would get sick of most companies pretty quickly.


The frustration mounts over time - give the person some slack, they’re probably just venting. Most everyone hates the interview process. The way I see it, even if you’re perfectly qualified and not nervous, you have a 40-60% chance of passing an interview - it only takes a few coin flips for someone actually competent to go “hey, this shit I’ve worked much of my life for is arbitrary and random”.

The issue I think the poster is having is that they just don’t understand how algorithm interviewers think - it’s totally reasonable to think, oh, frontend usually has a DOM tree, so they should know BFS (even though you’d rarely if ever write your own traversal algorithm - leave that to the people actually researching it, not the person building a frontend).


This was my impression as well. There were likely real issues that made the process annoying, but the tone of the post is basically "how dare you ask me those stupid questions at the interview you invited me to?".

On site interviews usually evaluate not only technical competency but also whether that person is OK to work with, especially during crunch times. Being able to point to a few github projects is good, but not sufficient. When faced with a question one cannot answer staying positive and trying a few options is often sufficient to get off the hook.


It is insulting to someone with tens of thousands of lines of public code in open source. You can work at a tech giant for half a decade and slip under the radar with no way to prove or disprove your competency there unless you can get references from the job. But for those who have put out tons of work for free, being asked grade-school pop-quiz questions by the very people who want them to write code is an insult to their expertise.

You don't ask a published drug researcher with patents and papers to assemble simple molecules out of building blocks during an interview. But in software, even senior engineers are asked to regurgitate things from CS202 for decades on end instead of having their accomplishments acknowledged or their legitimate capabilities discussed.


I only skimmed this rant, but if one interview is terrible, then the interviewing party might be an asshole.

If every interview you ever took is terrible, then you are probably the asshole.


The "surrounded by idiots" theorem.


What if you’re only “surrounded by idiots” while in an interview, but otherwise well-regarded amongst family and friends, and most of their acquaintances you meet? Or other strangers, whether on the train/bus, at meetups or online?

On a more serious note, I’ll offer as a point of contrast The Sane Society[0], which puts forth that society can itself be sick–despite its being the status quo–detracting from the wellbeing of individuals.

[0] https://www.goodreads.com/book/show/40717990 by Erich Fromm https://en.m.wikipedia.org/wiki/Erich_Fromm


An employee who has low tolerance for the massive randomness/unfairness in the world probably would get sick of most companies pretty quickly.

Another way to look at it that they're a ... human being, attempting to come to terms with a truly toxic and dehumanizing system.

In response to which my gut instinct is not to defend the system, as would appear to be yours. But to recognize, as the original author does, that the system truly is fucked.

And if we're to maintain our health and sanity -- at some point one has to simply decide to "just say no", and start looking for and building an alternative.


You only failed 5 times; try a few more and you will master the interview process! Maybe the statistics are that 1 in 6 smart guys get the job, and you just had to try a little bit more.

I got this advice after failing a Google interview in the last round.

"Success consists of going from failure to failure without loss of enthusiasm."

Winston Churchill


> unattractive.

You're hiring a software developer, not an actor portraying a generic love interest.


It's likely they're referring to the person's attractiveness as a potential candidate, not their physical appearance.


And yet, it boils down to the same issue, and the issue with interviews in general: they're highly subjective in ways that often end up exclusionary (specifically, racist and sexist), with a significant amount of smoke and mirrors (of varying levels of annoyance) designed to distract both sides from what's actually happening.


Here it was quite clearly used to express the idea of "makes other people less inclined to give you a job" rather than anything vaguely subjective.


> I really wish companies would be more transparent about their candidate rejection reasons

I always wish this too. At my previous job (also at BigCorp) a candidate reached out to ask why they hadn't been selected, so I started putting together a little email trying to give polite feedback and encouraging them to apply again in a few years (they were great, but we needed a more senior role and had no open junior positions at that time). I asked my boss to review it and make sure it was okay, and he told me that under no circumstances do we ever give candidates feedback, because if you say one wrong thing it's a lawsuit waiting to happen. Having been on the candidate side, where I really wanted to know how I could have done better, I was a bit angry on this candidate's behalf.


> under no circumstances do we ever give candidates feedback because if you say one wrong thing it's a lawsuit waiting to happen

Wow, how widespread is this? I've _always_ asked for feedback and I've gotten meaningful feedback maybe once.


Universal, for all the companies I've worked at.


Seconding this.


I got very detailed feedback after an interview with Netflix.


I think it's fairly widespread. I got rejected from Amazon with a "reach out in 6-12 months," but without a reason for the rejection.


This is common and necessary. There are a ton of federal and state laws about what can be considered in hiring, and if one steps an inch off the legal path, the results are dire. Written feedback is the worst. One simply can't do it, and if you were employed at Yahoo and did any interviewing, you went through that training.


I would really love an answer whether this is ACTUALLY the case - it appears to be everyone’s assumption, but is it actually true?


Most people take feedback very poorly, so if you do provide someone with a reason why they weren't qualified, they may read it in a very negative light. Saying something polite like "you're overqualified" might be read as age discrimination if the candidate is over 40.

Even just ignoring any liability, someone who's angry about being rejected might also respond with reasons that they are qualified or whatever and just burn up more of everyone's time. Then sometimes the reason that a candidate wasn't picked has nothing to do with them, and that's not really something you should share.


> sometimes the reason that a candidate wasn't picked has nothing to do with them, and that's not really something you should share

Strongly disagree. Personally, it would be such a weight off my mind to hear that I was passed over for someone with a few more years of expertise in the target domain, or for a well-known expert, or that the company is suddenly no longer able to bring on another person.

Not having to guess if it was something I said behaviorally, or if I didn’t solve a problem to their satisfaction, would remove so much frustration.

I’m a smart person–I realize that there are many other smart people out there, even smarter than me! It’s perfectly reasonable and expected that I might be competing with some of them, and it is perfectly fair and good that they win out.


> is over 40.

A step beyond that, a surprising number of people think that that particular law applies to any age. Even if it's thrown out it may cause legal or reputation problems anyway.


In my experience as a hiring manager I never had that imposed on me. At the same time, though, I try not to be too open, because you never know; some people take it personally. Sometimes people do ask, and I forward a reply through HR. I do make an exception for junior candidates who are just starting out and could use a lot more pointers.


Surely you’re open to a lawsuit anyway: candidate doesn’t get a response to a feedback request and sues because they’re a minority, no response must mean discrimination, right?


Nope. I live in an at-will state. Even then, when letting someone go, if you give a reason, you may have to prove that reason in court. If you don't give a reason, then you won't have to prove anything. Therefore, a simple "you don't fit into future plans" is always the safest reason, and nothing more. It sucks and it's cold, but it's the harsh truth. That's similarly why you see very few companies and people apologizing for misdeeds, even when they're really sorry: it opens them up to liability.


> Surely you’re open to a lawsuit anyway: candidate doesn’t get a response to a feedback request and sues because they’re a minority, no response must mean discrimination, right?

No, because all you have to do is demonstrate that you followed standard procedures.

Your lawyers would respond to their lawyers with appropriate policy documentation, and their lawyers would tell their client that there likely isn't a case.

The only way there would be a case would be _if_ the person could find others who had received feedback.


> Surely you’re open to a lawsuit anyway

Of course. But every piece of information the ex-employee has is potential evidence that could be used to support a discrimination claim, and since you don't know what other information they have or will be able to gather, nor how any feedback you provide will look in light of that, the simple solution is to provide nothing.


What the company is trying to do is decrease the attack surface.


Hiring is hard. It's very difficult to figure out what combination of activities/questions will correlate to actual job performance, not to mention uncover someone's day-to-day personality when their guard is up.

With that said, my interview process basically is:

* 30 minute intro call to dive into the CV a bit and see if it's worth everyone's time to go further

* 1.5-2 hour pair programming exercise using the actual language & framework they'd most likely be using for the job

* If that goes well, 30-60 minutes to meet with the team and figure out if everyone will get along

That's it. No whiteboarding, no FizzBuzz, no algorithms. If we make a mistake we try to recognize it within 6 weeks of the person starting in that position.


> * 1.5-2 hour pair programming exercise using the actual language & framework they'd most likely be using for the job

This advantages people who have actually used that language/framework for prior jobs. If that's a thing you want to optimize for, then that's fine. If you want to evaluate people as more generic software developers, it's not so great.

That's why whiteboarding algorithm questions can be useful. They're sufficiently abstract to generalize to lots of different 'stacks' and experience levels. If you just want to get generally competent developers first and then figure out where to put them later, then this type of question can make a lot of sense, which is why Google, Facebook, et al. use them.


Algorithm questions may be more “general” than something like the actual work being performed, but frankly most programmers aren’t writing optimized low-level algorithms on a day-to-day basis. All you get is people who’ve memorized the algorithm tricks.


It's not as simple as just memorizing "algorithm tricks". There are too many possible questions to just memorize all the answers, you need reasonably strong CS fundamentals, and also to be fairly smart. I think that's part of why they're used: actual IQ tests are legally fraught, but giving someone what amounts to a programming IQ test is not.

It's certainly true that studying these kinds of questions helps, but if you're not too bright, I don't think that would take you very far.


Strong CS fundamentals means understanding all the various patterns and tricks in algorithms and data structures, and how they can be applied to problems outside of where you originally learned them. That is it, when applied to an actual software job outside of CS research work. Ask any self-taught person how much they need “CS fundamentals” to blow the pants off any business need for an app and they’ll tell you probably very little.

When you’re doing tools for programmers, it gets a lot more complicated. That is not most jobs.


My process is pretty much the same as above, with the difference that I let them choose whatever framework/language they feel most comfortable with (with the obvious caveat that I might not be able to help them if they get stuck).

If you are a good dev, you will pick up any language/framework quite quickly if you want to (there are exceptions, like switching to a completely different paradigm, e.g. from JS to Haskell, but that is not our case).


Generally-speaking if we're hiring someone to work in React or, say, Kotlin for Android, they need to know at least a bit about the framework they're working with. The process would be different for co-op positions/new hires directly out of school, and obviously we'd tweak it for a clearly awesome developer who is in the process of retooling.


Is that really true? Learning a framework or language isn't especially arduous. My team works in C++. I have zero C++ requirement for hiring on my team because in my experience every good hire can pick it up very fast. If I restricted myself to people who can sit down and run with C++ then I'm winnowing my pool of candidates.


This seems like a reasonable process. For me personally, though, I have a big portfolio of side projects with open-source code, and it irks me when a company asks me to prove my programming skills by wasting my time on some silly tests when, if they just read my code, we would be able to skip them. Of all the places I've worked, the companies with the better programmers were the ones that actually looked at your past code and had hardly any exercise or test.

I think the 6-week "probation" period is an excellent idea. Since most of the time you can recognize if they are a good fit or not.


One problem I have with this is that tons of people like to push the "GitHub is my resume" thing, but 75% of the time they have 25+ "forked" copies of major open source projects with maybe one or two extra commits on top.

It can be a real drag to try and suss out who actually has real contributions or real original work on their GitHub. Our approach for hiring is to actually have people cherry pick out one or two items they're really proud of and have them send a brief overview / sample.

That approach is also nice, because it gives you a great interview starting point to dig into a technical item that they (should be) comfortable with and passionate about.


Your cherry picking idea is fine, but your reasoning for it is bs. If someone wrote the code (or read it enough to grok it), asking them some questions on the internals will sift out who just copied some free repos.


I think that's reasonable too. I wasn't trying to say "GitHub is my resume", because on my site I point out specific projects that I made and link them to the repo.

Making the person pick out projects they are proud of is a great idea.


A great portfolio is a huge plus (and I've hired people purely based on their open source contributions, usually remote), but as mentioned elsewhere your previous code tells me very little about what you're like to work with day-to-day. We usually do something like "here's our mobile app, let's start the process of building it in React - walk me through the process as you go".

We all know devs who write beautiful code but are impossible to work with ;-)


> the companies with the better programmers were the ones where they actually looked at your past code and had hardly any exercise or test.

I don't do side projects. I work 40+ hours a week on a computer; when I get home, I spend my time exercising, with family and friends, or just doing nothing.

If a company doesn't think I'm "passionate" because I don't have a Github account so be it.

Also, I'm not going to do a dog and pony algorithm white board coding test. I'll draw out architecture on a white board but if you want me to code, sit me behind a computer and test whether I can solve a real world business problem.

I'm most likely being interviewed to either write/architect a bespoke internal app or yet another software as a service CRUD app. I'm not being interviewed to create software to put a man on the moon.


Sorry for the misunderstanding. I didn't mean that everyone should have side projects. But if you do, why not judge some of it based on that? And if you don't have open-source code to show, then some kind of test seems appropriate.


That code may be the result of months of polishing and cooperation.

An hour of pair programming lets you know what it's like to work with someone. Are they reasonable to discuss things with, etc. On top of learning about their programming and problem solving skills.


It's also the result of me calmly working instead of nervously taking part in an interview. The one thing I've noticed is that it's very hard to judge someone's personality in these kinds of situations.


Yeah, that's a fair point. Interviews are a little insane however you do them, and still you need to figure out some way to hire people.

I do like this "homework" interview variation: You get to work on a task at home before the interview at your own pace for a few days. Then you come in and talk about your design choices, and are asked to add a feature or two to your code.


> the companies with the better programmers were the ones where they actually looked at your past code

How common is it to actually be able to do this? I've worked at Google and Amazon and Goldman Sachs; I can't show any of that code. I've occasionally dabbled in coding little scripts or projects at home, but nothing terribly challenging or complicated.


I'm not sure how common it is. But if someone does have open-source code that you can review and they'd like to showcase, why not do that? I guess that's all I was trying to imply.


Usually my suspicion is that the code is just a cloned repo or similar.

I have however dug into github code that seemed legit, with the candidate claiming to be one of the primary developers of a popular Xbox360 emulator. I surprised him by quickly finding obscure bugs, including a couple buffer overflows and a way to cause a math exception in the host process.

It does feel kind of unfair to those whose code is locked away under NDA.


A pairing exercise isn’t only testing your programming skills, it also tests your ability to work with other people and get on with them, which is not only important but much harder to train than the technical side of things.


Which I understand, but I also see this as flawed since the person being interviewed will have their guard up and be nervous. This will skew what their personality is really like.


If the interviewer has any clue, they will know that and account for it in their judgment. Almost everyone is nervous and will not perform at their best.


It can test many things, especially if you physically sit together. I used to have a coworker with an unusually severe fart problem.


I think developers complain way too much about the interview process. The fact of the matter is that it's very clear what types of questions will be asked in most technical interviews, and there is a wealth of resources available to prepare. The ROI on spending a couple of weeks preparing is definitely worth the possibility of a substantial raise at a big technology company.

I can't think of another industry that is as high-paying as software development, with such a low barrier to entry (formal education matters less and less) and such transparency regarding the interview process (books, blogs, literal guides from the companies themselves). And HN is flooded with posts like this every single month. Is the process perfect? No. What is? At the end of the day this just looks like entitlement.


It is a total failure of the industry to ask people who are productively writing software one day to waste dozens of hours learning to dance an artificial interview game, just to get hired somewhere and start writing productive software again weeks later.

If the candidate has no resume and nothing to prove they have any skill, then yes, give them the rigamarole. But don't ask a staff artist to speedpaint the view out the window with crayons in a lined-paper binder in half an hour. They give you their portfolio, you inspect it, and if they are accredited and recognized you don't waste their time on quiz bowl.

Arguing about it, though, is mostly pointless. The awareness is out there that these companies are throwing away not just the on-hand candidates they turn down for failing their charade game, but also the legions of potential employees turned off by the theatrics and immaturity of software hiring. The OP is absolutely right to say interviewing in tech is draining, because you go from thinking yourself competent on the outside to interviewers trying to beat you into a sobbing mess on the inside. It's adversarial from the start at almost every company, it certainly contributes in large part to the lack of women participating in development, and businesses seem to be just fine with the outcomes: they know what they are doing is awful, but so long as they are "hip" and popular they can get away with it, with an endless list of potential hires to put through the rigamarole.


> if they are accredited and recognized

This jumped out at me. What do you think is analogous to that accreditation in the software industry?


A portfolio of shipped code, preferably open source. No different from how an artist might have to jump through a lot of shit to get hired if they don't have a substantial body of work; if they do, they are often fast-tracked into employment.

That doesn't preclude spending some time to ensure your candidate isn't plagiarizing their cited works, but that is a problem other industries have solved well for years without turning the process into an adversarial circus.


> What is? At the end of the day this just looks like entitlement.

I had the same impression, particularly driven by these quotes:

> From this day on, I was even more disappointed — both with myself and tech hiring process — rejection, rejection, rejection. It honestly feels as if I am a complete failure and an unhirable candidate. How is this possible — I have received emails from people all over the world, who praise and give me thanks for the work I’ve done on my open-source projects and tutorials (and I say this as humbly as I can); people who see me as an expert on Hackhands; co-workers, friends and acquaintances from meetups, hackathons, conferences who apparently think I am a decent programmer—but I cannot pass a single tech interview. How?

TBH, the author sounds like someone who went to the interviews completely unprepared, believing that just because they have some open-source projects and tutorials, they deserve to be hired.

Nobody deserves anything. Most of my friends who went up for interviews with FAANG (or whatever is the latest acronym) studied and prepared for months even before applying.


> Nobody deserves anything.

I'm pretty sure doctors don't have to go get their high school bio textbooks and review cell structures to get hired at a new hospital.

It is absolutely insulting to ask someone who can prove a substantial history of productive software to spend hours doing trivial brain teasers. If you have open source projects under your ownership or substantial contributorship that the company you are applying to is using, and they still treat you like a wet graduate coming off a brief affair with Java Swing, you should feel insulted.


> I'm pretty sure doctors

Doctors have to go through over a decade of accredited education and training to even apply to be a doctor at a hospital. It is a restricted and regulated profession. Software development is nothing of the sort.

> absolutely insulting to someone

Wow, this really sounds like entitlement.

You don't deserve anything. Nobody knows how you developed that "productive software". Did you have help from a friend? Did you take 10 years longer than someone else would? Did you copy some of it? Did you work alone, and would you be useless if you had to interact with others and work as part of a team or project?

Sure, when you're a widely known developer with plenty of real accolades (not emails or tweets), then you can expect to be treated differently.

Until then, don't expect to be treated any differently just because you have some code on github.


I know where you're coming from, but I don't actually think it's very clear what types of questions will be asked in most technical interviews, and that is what makes it so frustrating and soul-crushing. There is always something I do not know, so the kinds of questions I face during interviews are seemingly endless.

If I'm applying for a Front-End Developer position, the interviewers can ask CS fundamentals questions, or things I maybe learned by reading Cracking the Coding Interview or doing HackerRank problems; or they might ask specific questions about any frameworks or libraries listed on the job posting, or more software architecture / system design questions, or about the software development life cycle: what is Waterfall, describe Agile, how do you write testable code, the deployment process, security vulnerabilities, etc.

Maybe all of that doesn't sound so bad to you, but for a young, mostly inexperienced developer who is trying to get a better job, it is daunting.


I only skimmed the parent article, but I tend to agree with you.

I would add that a lot of the job entails quickly learning new things and solving new problems. For example, just yesterday I had to work with RavenDB, which I'd never seen before. The complaint that you shouldn't need to know how to, say, write a BFS because you never have to do that IRL has only surface validity, when you consider that much of your job amounts to doing things you've never done IRL.

In other words, if you can't (re) learn certain CS fundamentals in preparation for an interview, perhaps you'll have trouble on the job as well.

This isn't to let interviewers off the hook, however. Standard whiteboard problems like "reverse a linked list" are still suboptimal, IMHO, and even with these, there are better and worse ways to handle them. Do you focus on the interviewee's reasoning and use the problem as a basis for deeper discussion, or simply on their end result? Do you treat senior and junior interviewees the same? Do you seek out curiosity and the ability to learn, or a specific knowledge set? IMHO, if the latter, you're probably doing it wrong.
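For context on how small that "standard whiteboard problem" actually is: a minimal Python sketch of iterative linked-list reversal (the `Node` class and function name are mine, chosen just for illustration):

```python
class Node:
    """A singly linked list node."""
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def reverse(head):
    """Reverse a singly linked list in place; returns the new head."""
    prev = None
    while head:
        # Re-point the current node backwards, then advance.
        head.next, prev, head = prev, head, head.next
    return prev
```

The interesting interview discussion is usually not the code itself but the follow-ups: what happens on an empty list, and how you'd do it recursively versus iteratively.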


> The ROI on spending a couple of weeks preparing is definitely worth the possibility of a substantial raise at a big technology company.

If you're a junior, that makes perfect sense. As a senior developer, you can spend as much time as a junior preparing for these entry-level coding tests and perform just as well. And that's the problem.

I'm a significantly better developer in all ways now than I was when I was a student and could rattle off answers to these questions without breaking a sweat.


Should a couple of weeks of preparation be necessary when you have spent years doing the work every day of your life? This 100% biases your interviews toward people willing to spend 3 weeks practicing problems instead of doing meaningful things, especially against those who happen to have children to take care of.


> I think developers complain way to much about the interview process. The fact of the matter is that it's very clear what type of questions will be asked for most technical interviews and there is a wealth of resources available to prepare. The ROI on spending a couple of weeks preparing is definitely worth the possibility of a substantial raise at a big technology company.

If you can learn to be an expert interviewee in a couple of weeks, then what's the point of the questions? If they're testing such simple, easily gained but useless knowledge (as evidenced by your needing to study for it), then how do they even filter good candidates from poor ones?


Can we get a (2016) added to this?

Also previous discussion: https://news.ycombinator.com/item?id=11579757


> The first round started with introductions, followed by a coding exercise — write a maze solving algorithm. What the fuck?! Seriously? Mind you, this was an interview for a front-end developer position and I am not a recent college graduate anymore.

If you balk at breaking down a relatively simple and straightforward problem into approachable steps that a computer can interpret, then what exactly do you think that programming is? You don't need to _know_ the algorithm. You need to be able to think like a problem solver. And if you can't do that, then what exactly are you doing?

> Anyway, that did not go too well, obviously, because it has been a long time since I took an Artificial Intelligence course in college

You should be able to do this even if you've never done it before. You have a start, you have an end, and you have decisions to make to get from the start to the end. That's the literal definition of programming.

The only people I know who think that basic pathfinding is some voodoo dark-magic artificial intelligence problem, rather than a simple, straightforward algorithmic process you should be able to do in your sleep for the rest of your life after the first week of Intro CS 100, are not software engineers.
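To make the point concrete, a maze-solving question of this kind typically reduces to a short breadth-first search over a grid. A minimal Python sketch (the grid format, function name, and return convention here are my own choices, not anything from the article):

```python
from collections import deque

def solve_maze(grid, start, goal):
    """BFS shortest path through a grid maze.

    grid: list of equal-length strings; '#' is a wall, anything else is open.
    start, goal: (row, col) tuples.
    Returns the shortest path length in steps, or None if goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        # Explore the four orthogonal neighbours.
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None
```

Even without remembering the name "BFS", the shape of the solution (a queue of positions to visit, a set of positions already seen) is the kind of thing the parent comment argues any working programmer should be able to reconstruct.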


> If you balk at breaking down a relatively simple and straightforward problem into approachable steps that a computer can interpret, then what exactly do you think that programming is?

Programmers tend to be good at putting their head down and solving a problem. They tend to be very bad at live coding for someone whose job is to judge you.

A hiring process that measures you on the latter rather than the former is broken. It's like interviewing a novelist by asking them to extemporize a speech for you.

It's very annoying that every attempt to make the case against the latter is met with people equating it with the former. They're very different skills.

Maybe for some people evaluating performance in live coding is a good proxy for evaluating their programming ability in general, but it's pretty obvious that there are a ton of excellent programmers for which it's a terrible proxy.


> Programmers tend to...

Please don't generalize about people this way. It's not useful, and it's almost certainly a bad idea to hand-wave away not being able to think through something trivial with an audience.

If someone asks you to identify whether a bird in a photo is having a good day or a bad day, then it's reasonable for you to say "Wait, what?" Because that's a research problem that covers a whole host of different subjects (bird psychology, for one) and doesn't have a solution.

If someone asks you to implement efficient image segmentation, something that you probably don't know how to do unless you have a PhD in computer vision, then it's reasonable for you to say "I don't know how to do that but I can go home and research how to do it".

But OP wasn't asked to solve a hard problem. OP was asked to solve an incredibly simple problem and got angry about it.

> It's like interviewing a novelist by asking them to extemporize a speech for you.

OP's given scenario is more like interviewing a novelist by asking them to put a handful of words into a grammatically sound sentence. If you can't do that, then they don't want your novel.


"I really wish companies would be more transparent about their candidate rejection reasons."

There is just too much legal risk from doing so. Too much risk of wording being misinterpreted and being turned around to be discrimination etc. To provide feedback you'd have to have so much oversight and absolutely ridiculous levels of paranoia about the potential interpretations of what you'd said that the risk is just not worth it.

The value giving feedback to candidates provides to the company is next-to-nothing as well, especially for the larger companies.


That's definitely true. I've only heard of a couple of cases where people pursued rejection feedback and got it.

One of them was very important though. A friend of mine in college graduated from a degree program with a 100% employment rate but for 2+ years could not get a job. Anywhere. He was a really charming, funny and smart guy. Interviews always went really well...but nobody ever hired him.

One time, a guy who interviewed him called to tell him that he was sorry but they'd gone in another direction. He basically broke down on the phone and begged the guy to help him understand WHY this kept happening to him after 2 full years.

The culprit, it turns out: my friend had a DUI on his record. He'd never actually had a DUI, but he was pulled over one night after working at a bar and the officer arrested him for DUI anyway. There was nothing in his system, but he had to stay overnight anyway for some reason. The arrest was never removed from his record, and he had no idea it was showing up in background checks.

Once he got it corrected, he got a job almost immediately and is doing really well now, 15 years later.

I honestly wonder what would have happened if nobody ever told him about the problem.


Everyone says this, but do you know of any actual situations where a company's feedback led to a lawsuit? Even an unsuccessful or settled one would be interesting.

I'm also not totally sure about the minimal value of feedback. I interviewed at IBM several years ago, and actually did get useful feedback from the hiring manager. I was referred by someone way up the food chain and the feedback was fairly non-controversial (unlike my background, the job was much more business than technical), so maybe this was unusual.

However, the fact that someone was willing to spend 5 minutes explaining their decision, and outlining some steps I could take to make myself more competitive, gave me a more positive impression of IBM than their 3,000 different Watson commercials put together.


To give you a counterexample, though: I've worked at a big corporation in the past and ended up in management, hiring engineers for my team. It was at that point I found out that the engineers who traditionally conduct interviews had never been trained on what an interview is for, how it should work, or specifically what not to say or do. I found that out because the interviewers would openly report things that were blatantly discriminatory, and interviewees would send in comments like "They sat me in a room for 3 hours, 3 hours of back-to-back technical interviews, and no one so much as offered me a cup of tea".

We never had any lawsuits, but from that perspective I totally understand how the company I worked for (employs over 100k employees) saw every employee as a walking liability.


"There is just too much legal risk from doing so"

How about having them sign an agreement that the candidate would under no circumstances sue the company for the rejection? I would sign it in a heartbeat if I received some feedback.


Most states explicitly do not allow a person to give up their right to sue based on discrimination. Such a contract would be unenforceable, and perhaps even used as evidence in a discrimination case.


There is a hiring infrastructure which is sort of like those dating apps that say they want you to find true love but really want you to keep dating. I am thinking of a lot of HR at companies, recruiters, and people who enjoy interviewing. I have worked at startups that weren't really growing that fast but were always interviewing to fill in the churn. Interviewing feels productive; it feels like the company is growing. They say: we only accept the best, so any test is reasonable. This tends to discount people's accomplishments and the idea that smart people can learn.

The internet makes it possible to screen a zillion candidates, so like dating it feels like there are a million fish in the sea. Why value anyone's time when there are so many to consider? I was thinking about putting in my CV for a very senior role, and before you could even submit it they made you take this web IQ test. Seriously? I would worry about anyone who would go through that for a job that expected more than ten years of high-level experience. Pass.


The core problem IMHO is that a great many of the people hiring developers are not themselves developers and have little to no technical knowledge. Companies depend on these ridiculous hiring processes because many decision makers believe it allows them to evaluate candidates that they have no other way to evaluate. They see Google using these approaches and figure if it is good enough for Google it is good enough for them -- not realizing that Google is an entirely different situation with needs that don't match their own (note to 99% of companies in existence: you are NOT Google and Facebook). I speak from experience. I have been hiring and managing IT staff for 20 years. I also happen to have 25 years of experience in development and infrastructure. One of my priorities when taking over an IT division is to get rid of these useless tests, and I have received pushback from internal recruiters, HR, and non-technical management every time I have attempted to do this. It requires moving a good percentage of the hiring process from non-technical management to the IT staff, and it's not an easy change. Recruiters and non-technical management see it as a threat to their positions. Until there is a realization that you have to know something about a topic to judge the skillsets of others, I don't see this improving (I expect that to happen... never).


I disagree with the OP's premise. Yes, nobody is going to write a BFS or `Math.pow()` on the spot, but it's way too easy for people to bullshit about their experience to rely on that alone, so BFS and `Math.pow()` are what we've got to work with. While I'm a proponent of giving candidates a bit of homework and making the interview a pairing session, there are plenty of people who complain about that being free labor (even if it isn't). OP mentioned earlier that looking these things up is easy enough, so what's the harm in looking them up before the interview so you can get that out of the way?

Also, failing five face-to-face interviews smells of personality or culture-fit issues, not tech issues, IMO. I've interviewed tons of people over the years, and the ones that pass the phone screen but fail the tech screen usually failed due to poor culture fit (or padding their knowledge a little too much). I would see if it's possible to message one of your past interviewers and ask for an honest opinion of what they thought of you. While you're unlikely to get a response (giving interview feedback is, sadly, a big HR no-no), some people might budge.

Good luck?


I'm not exactly a gifted programmer or anything, but a BFS seems pretty reasonable in an interview, depending on the amount of rigor/efficiency being demanded. I remember doing it in one of my very first interviews, where I was asked to implement a website crawler and hold onto URLs. Having written a lot of crawlers while scraping sites for info before every other site had an API, this was super fast for me and was a very practical problem. It's not like they're asking for a minimum spanning tree of a graph, where you have to choose between Prim and Kruskal and make sure you remember the details stored near the neurons holding your memory of the 3rd week of sophomore year, 2nd semester, when you had a really nasty cold. If you've at least heard of breadth-first vs. depth-first, it should be simple.

I do this problem constantly on my laptop when I'm looking for files and start getting annoyed at how long a search is taking: you start applying -maxdepth to a find command, fumble through directories, and do a for loop. This is how I wind up doing a crude variant of k-means clustering using cut, sort, uniq, and some creative use of awk when trawling through a bunch of log files.

But I do feel that anyone pretty serious about programming as a career understands there's little to be gained by more or less bitching about the state of hiring, and a lot more to be gained by spending a few hours working on some problem sets occasionally. We're all busy and have to keep our technical knowledge up to date, of course, but almost every well-paid professional has to do this (doctors and lawyers are required to do this to even practice, in fact). I don't bother interviewing for FAANGs because my brain's fried and burnout makes working on interview prep an order of magnitude harder, but I don't go around blaming everyone for acting in their own self-interest either.


In this regard, a lot of interviews are designed to test soft skills.

I know many recruitment teams aim for soft skills first because they don't want to bother their technical staff if a candidate is not a good match.

Many algorithm questions are intentionally designed to see how you tackle an uncommon problem, or even an unfair situation, and whether you have a strategy for solving it. Of course, a candidate invited to interview for a front-end position at a decent company knows how to tackle his common tasks, regardless of how complex they are in comparison.

However, that is definitely not true of all interviews. Many companies just copy these interview formats without understanding the actual idea behind them, or how to facilitate these interviews correctly with different candidates from different cultures in different situations.

But you shouldn't go to interviews with this assumption. I'm sure at least one of those BigCorps had a professional recruiter.

After all, hiring a professional recruiter is as hard as hiring a good software engineer.


This is a good answer. I was talking to someone at a company that actually used questions like this.

He asked me a question like "You're shrunk down to a couple inches in height and put inside a blender. How do you escape?"

My response was "Is there anything else in the blender? Can I see anything nearby?"

He told me that I "already passed", but if it was an interview, we'd probably have a 1 or 2 minute conversation about it and then move on. The fact that I didn't start rattling off ideas, but instead asked smart questions, was what he was looking for.

It's easier to take a measure of someone if they can't figure out what they're being judged on. If they know how they're being judged, they optimize for just that one thing.


I believe the fundamental problem is, most organizations interview in a style similar to testing in college: ask a question about a technical topic, evaluate if the answer is correct. This doesn't correlate especially well to how well a candidate will program if you hire them, but it's the method people learned in school for evaluating someone's knowledge of a topic, so it's what they use. Most developers, and most managers, have almost no training on how to interview well in order to determine who would be a good candidate, so they fall back on the only halfway-similar experience they have.

Now, this raises the question of why our society's method of educating people (or determining if that education has been effective) is not very well correlated to their real-world performance, but that's a whole 'nother conversation.


The weird part is that this is almost exclusively found in organizations that think of themselves as "tech"-y.

I work somewhere on the border between neuroscience and machine learning. Interviewing in a "neuro" group usually involves a lot of talking--what you've done on project X, how would you approach Y, what do you know about Z. Coding does come up, but in the context of stuff that I've done or would do on the job.

Applying for a very similar job in a tech company usually starts with me reversing strings on a whiteboard or abusing C++ templates to calculate factorials or something....


Exactly. I worked in Business Intelligence, and having both an accounting and a programming background, I can say whiteboard coding never comes up in interviews. It's more domain knowledge that's important, which is mostly about finance/accountancy.


Breadth-First Search (BFS) and Depth-First Search (DFS) are actually both pretty interesting algorithms (although relatively simple for the basic cases - there are some variations), partly because of their applications.

It's actually initially a bit surprising that BFS has such a simple algorithm using a queue, as mentioned elsewhere in this thread and in the Wikipedia article about it [1]. Might be non-intuitive on first look, but then turns out to make perfect sense.

[1] https://en.wikipedia.org/wiki/Breadth-first_search

https://en.wikipedia.org/wiki/Depth-first_search
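

For the curious, here's a minimal sketch of that queue-based approach in JavaScript. The node shape `{ value, children }` is my invention for illustration, not from the article:

    // Queue-based BFS, as described in the Wikipedia article linked above.
    // Visit nodes level by level: dequeue from the front, enqueue children
    // at the back, so shallower nodes are always visited first.
    function bfsOrder(root) {
        const queue = [root];
        const order = [];
        while (queue.length > 0) {
            const node = queue.shift();        // take from the front
            order.push(node.value);
            for (const child of node.children || []) {
                queue.push(child);             // add children at the back
            }
        }
        return order;
    }

The surprising part is that swapping the queue for a stack (pop from the back instead of shifting from the front) turns the same loop into a DFS.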


And this being front-end, ie. JS, you can just use map/flat in a loop to get breadth first traversal.

   function bfs(list, cb) {
       while (list.length > 0) {
           // scan the current level for a match
           let found = list.find(cb);
           if (found) return found;

           // no match: descend one level by collecting every
           // node's children (assumed here to be in a `cn` array)
           list = list.map(node => node.cn || []).flat();
       }
   }


Interesting, thanks. The Go Programming Language book also has a couple of nice examples of DFS and BFS, in the middle chapters. One is for sorting courses by prerequisites, and the other is for a web crawler, IIRC.


Many (most?) programming jobs are about understanding the tools and libraries of an ecosystem well enough to build something useful.

When I interview people, I want to know if they can think. I want to know what their attitude will be like (to work with/near them). I want to know what gets them excited (in the context of work, mind you).

The older a candidate is, and/or the more things they have done in their life, the more accumulated wisdom they will have. This will translate into making better choices earlier (about approaches to problem solving). Testing against university compsci concepts is more likely to get you a candidate who will ignore a well-tested and commonly understood library in favor of rolling their own (less maintainable) code.


This guy probably got a negative reaction because he says "hiring is broken", when what he meant was "my strategy for getting hired is broken." The goal of "hiring" is for companies to satisfy their needs, not to guarantee that every smart person that comes their way gets a job. A "hiring is broken" article coming from the position of a hiring manager would carry more weight.

Also, I've never heard of this guy; I'm not sure his handful of Medium articles and cookie-cutter tutorials is sufficient justification for his belief that Google should be lucky to talk to him.


I've had a 100% success rate at getting an offer once I realized the problem was me. (And I've had some bad interviews in my life before I got there, for sure.)

If the company doesn't extend an offer, it's because I didn't show them the value I can deliver their organization. Either I didn't present myself as a teammate that makes the group stronger or I didn't present the technical skills to make them realize the talent I bring individually.

Maybe I'm lucky -- it's a low sample size.

But I suspect that we all can do better at interviewing and at selling ourselves. Traditionally, that's a weakness in engineers.

It's also why, when I'm interviewing, I try to remind applicants what I need: give me something, anything I can use to sell you to my manager -- to argue "yeah, this cool guy is the right business choice". Help me help you. If you can't, you won't get hired.


You are lucky: it's possible to fully show the value you can deliver and still lose out to another candidate who they decide will deliver more.

I agree that it's common for interviewees to sell themselves poorly.


Have you read his code? Make a judgment after you have. Be empathetic.


No, and I guess I could. But why should I? More importantly, why should/would a prospective employer?


Are you asking why someone who is hiring a programmer would want to read the code a candidate has written? Because it gives a good idea of their coding style, technique, and ability to tackle problems, put together a solution, and ship that solution, all of which are very pertinent when hiring a programmer.


Why interview anyone then? Pick a resume out of a hat and hire that candidate.


Because obviously you spent time judging the guy by his "cookie-cutter tutorials". If you are a really good judge of a coder, you would judge the coder by his code, not by the tutorials he writes.


Usually the people who say hiring isn't broken are the ones who benefit most from it. Algorithms and programming are not one and the same. They overlap, but knowing algorithms tells me nothing about your knowledge of programming.

You know what I would be more interested in candidates knowing? How to solve problems that have no immediate answers. For example: understanding how to architect a program without using a framework; debugging a bug even though you know nothing about the library you are using (i.e., being able to read code and trace the problem); and understanding the respective contributions of algorithm and architecture to the overall performance of a system.

You can memorise algorithms. You can’t memorise problem solving and architecture. Albert Einstein said something along the lines of, “Don’t memorise knowledge that you can look up in a textbook.”

What have you learned since becoming a programmer? Did you push your knowledge of programming or are you rushing to the next hype framework? I see many more of the latter than the former.

Knowing algorithms is like optimising the flow of water to one tap in the house without understanding how the layout of the pipes affects the overall output. And that's the problem when you only ever work on one small section of the program your entire career.

Apologies for the rant..


I was asked to write a function findSum(array1, array2, sum) that returns true if two numbers from two sorted arrays add up to a given sum. I was able to solve it with my eyes closed using a double for loop — O(n²) time complexity. In the remaining time I was able to reduce the complexity by converting one array to a hash map, then, for each element of the other array, subtracting it from the sum and checking whether the difference is a "key" of the hash map.
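
For reference, a sketch of that hash-map approach (a hypothetical reconstruction; the exact code isn't given in the comment):

    // O(n + m) findSum: load one array's values into a Set, then for
    // each value v in the other array check whether (sum - v) exists.
    function findSum(array1, array2, sum) {
        const seen = new Set(array2);
        return array1.some(v => seen.has(sum - v));
    }

For example, findSum([1, 3, 5], [2, 4], 7) returns true, since 3 + 4 === 7. (Since the arrays are sorted, a two-pointer walk from both ends would also work in O(n + m) without extra memory.)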

OK, I can accept the fact that in some environments, knowledge of BFS (and the ability to implement it from scratch) can be considered a must-have skill for a front-end developer. (Not in all environments. But granted, in some environments).

But can someone please explain how the need to implement a function like findSum in less than O(n²) -- and in particular, the ability to whip out the hashing trick necessary to achieve it, while standing in front of a shitty whiteboard with strangers staring at you -- can reasonably occur in an actual front-end engineering role?


Yikes as an engineer and not a hiring manager, I do NOT want to work with this person, and I would thank the hiring manager that passed.

It's not even just the attitude. It's the competence too. If you can't code a BFS on the spot, sorry... that feels like far too low of a bar.


You really should walk into your office tomorrow and ask your teammates to write out BFS on a sheet of paper in half an hour (and take away their phones/laptops). I have a feeling you'd be quite surprised at the outcome.


It really depends on what you do, but I'd expect your average graphics programmer to be able to do it in a heartbeat.


> If you can't code a BFS on the spot, sorry... that feels like far too low of a bar.

I would guess 99% of developers I've worked with couldn't do this on the spot and I've worked on a lot of teams at a lot of companies lol.


I could mention something about the prevalence of this skill, but I don't want to spoil the surprise...


Tech job interviews: the Tinder of labor marketplace

https://medium.com/@carlosreutemann/tech-job-interviews-the-...


Why does nobody get it?

The coding interview is not about memorizing algorithms!

The coding interview is about demonstrating that you can think from first principles and DERIVE an algorithm!

If you understand the problem and the strategy for solving it, you don't need to remember the algorithm.


A lot of those algorithms were research questions that people spent a large amount of time on. The best way to prepare for these types of interviews is doing tons of practice problems until you get the pattern recognition to know what algorithms and data structures apply and fill out the code. Some things like DFS are pretty easy to derive from scratch, but others require more thought and very few people will be able to do it under pressure in an interview.


Are you seriously trying to tell me that you remember the code for quick sort?

I know what it does. Mechanically. I can visualize the recursive divide-and-conquer in my head.

I can implement it in any language.

All you have to understand is WHY the algorithm works.


I know how quicksort works well enough to write out some pseudocode quickly. Then once I've got the structure I can implement it. Same with lots of other algorithms, but if I had never heard of quicksort there's no way I would be able to come up with it during an interview.
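
That pseudocode-first version might look something like the following (a clarity-over-efficiency sketch; not in-place, which a real implementation would likely be):

    // Divide-and-conquer quicksort: pick a pivot, partition the rest into
    // smaller and not-smaller, recursively sort each side, concatenate.
    function quicksort(arr) {
        if (arr.length <= 1) return arr;
        const [pivot, ...rest] = arr;
        const smaller = rest.filter(x => x < pivot);
        const larger = rest.filter(x => x >= pivot);
        return [...quicksort(smaller), pivot, ...quicksort(larger)];
    }

Once the structure is down, the in-place partitioning is a refinement, not a new idea.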


Maybe some very talented candidates will do this, but most will memorize a large toolbox of algorithms that people invented over a very long period of time and that are known to be optimal for certain uses. And then the test is recognizing a problem that needs one of these algorithms. Very few questions let someone derive an optimal solution from first principles on a whiteboard in 1 hour with an audience.


You assume you are being tested on deriving an optimal solution.

What you are actually being tested on is whether you can derive A solution. ANY solution. And if it is inefficient, you can figure out what needs optimisation and where.

The very fact you recognize the inefficiency and you can speak up with ideas on how to make it better is data for the interviewer.

Perfect is the enemy of good enough.


But in an arms race where one person does this and the other person recognizes it as being solved optimally by an algorithm they know, who looks like a better candidate?


I interview people a lot. If you get to a solution to one of my questions too quickly, you get a very small amount of credit and I'll move on to another question until I find one you don't know. If you finish the interview having been familiar with all my questions, at best I'm bringing you back for another interview; at worst you'll be discounted, because I'll get the impression you've memorized everything you know and you'll be stuffed the first time we give you something new.


You are unique in that respect then in my experience. I don’t do this in interviews and afaik have never seen this kind of bias against people who study for interviews. It doesn’t make sense to me to discount people who prepared.


It's not about discounting people who have prepared, what I'm looking for in an interview is: (1) Does this person approach problems in a systematic and logical way, (2) do they ask the right questions to clarify the task, (3) can they explain to others how their thought processes work, (4) can they formalize a discussion into a written description of the problem, (5) can they discuss in detail the nuances of their solution and relate that back to real world situations.

If I ask a question and the candidate knows the most optimal solution then it's very difficult for me to assess most of those questions - because they aren't actually thinking through the problem, they're just remembering the solution and I primarily want to hire people who can figure out solutions - even if takes them a little bit longer, because it's indicative of their ability to actually go beyond what's already established in the field.


We may just be using different definitions of what it means to prepare / know the solution. Most of the "thinking through the problem", in my experience, is mapping the problem to a known class of problems you have tools for. That is what software engineering often is; creating entirely new and untested patterns is rare in my experience (and if it isn't, that could be indicative of an opportunity for improvement). And someone who has studied more will have more examples to map to, and will likely do so faster. Disregarding a candidate because they have more examples and recognized them faster means passing over a candidate who will often provide you an industry-standard solution, in favor of one who may be unaware that someone has already solved the problem. It also seems an unrealistic goal, since most engineers I know who do work through things from first principles don't do so with observers, on a whiteboard, in 45 minutes.


On the other end of the spectrum, I have little software engineering experience but having dabbled in competitive programming, I can solve almost all algorithmic interview questions optimally within minutes if not seconds. I still failed half of my interviews though (including Google even though I believe I answered every question perfectly). Perhaps it's due to my presentation, or my lacking portfolio, or perhaps the positions have simply been filled.

On another note, OP seems to rely on recruiters to approach him? Wouldn't it make sense for OP to actively apply to positions which may have better fit?


This candidate reminds me of someone who went on a date and is bitter they didn’t get a call back for a second, despite their qualities that obviously made them the right choice as a partner. Dating must be broken...


Totally agreed!

This dude must play League of Legends.


The 2016 discussion on this very article https://news.ycombinator.com/item?id=11579757


I'm going to join in and say: you should be able to puzzle out the pseudocode for a BFS, not from "memory", but by thinking about the structure of a tree and asking yourself what would be required to touch each node at each level before moving on to the next level. It's not fantastically difficult, and if you can figure it out just by reasoning about trees then you're a smart problem solver that I'd like to work with.


I think the real reason programming interviews have become torturous is to discourage people from changing jobs. It's a kind of unspoken anti-poaching agreement.


I used to feel like the author of this post, but trees have come up at each of my last few jobs.

BFS is a trivial algorithm and you really should know it. It is only a few lines.

Do you work with things that have dependencies? Do you work with hierarchical data? What about hierarchical APIs? Tree traversal is a powerful tool, and not knowing how to use trees is a sign you may take the stick-and-rock route when it comes to solving problems.


Is it just me, or is HN, normally a much calmer and more reasonable forum than almost anywhere else on the internet, actually a little bit flamey and ranty when you get on the topic of tech interviewing? I've seen reasoned conversations about world politics, war, gender issues, etc. etc. on here, and never as much sturm und drang as on this issue. I do not necessarily exempt myself from this statement, btw.


It is very calming to see some more people in the comments are sharing this observation as well.

So many comments seem to be projecting onto, and reading into, the OP's personality.


Sunk cost fallacy/hazing mentality. "If I spent 6 months of my life memorizing programming problems to get a job at Google, then everyone should have to".


I have a similar impression.


Although, still interesting to read the comments.


The Google interview really isn't that bad. In fact it's kind of unfair, in the sense that someone who just did leetcode for a month can pass it while also being a terrible coder.

It's true the interviews don't test your real coding abilities. The signal is probably very low that you aren't going to go in and write spaghetti algorithms (though at least they will be fast).


>In fact it’s kind of unfair in the sense that someone who just did leetcode for a month can pass it while also being a terrible coder.

That isn't unfair, that's just a bad interview.


It is very unfair to people that have other obligations than doing leetcode for a month.


If you can pass a Google interview just from doing leetcode for a month, you got some bad interviewers there.

Last time I interviewed there I got pushed on just about every aspect of knowledge I had, going back over a decade+ of experience.

There was an awful lot of bullshit trivia questions buried in it, though, and a bizarre obsession with apparently expecting me to know every argument flag for a CLI off the top of my head.


If I had to solve a BFS problem I wouldn't bitch about it like this guy, because the only thing you need to know is how to move across the tree. The rest you can come up with; it's not that hard, and it shows your thinking process.


Yeah, precisely. BFS and DFS are part of a very small collection of algorithmic questions I consider fair game — but I would never ask them directly. Instead I'm likelier to ask something like pretty printing a directory structure, so a function that prints out something like

    bin/
      sh
    usr/
      bin/
      local/
        bin/
      var/
      ...
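
An answer might be a short depth-first walk, something like the sketch below (the `{ name, children }` entry shape is my assumption for illustration):

    // Pretty-print a directory tree: depth-first, indenting two spaces per
    // level. Entries look like { name, children }, where `children` is
    // absent for plain files; directories get a trailing slash.
    function prettyPrint(entries, depth = 0) {
        const lines = [];
        for (const e of entries) {
            lines.push("  ".repeat(depth) + e.name + (e.children ? "/" : ""));
            if (e.children) lines.push(...prettyPrint(e.children, depth + 1));
        }
        return lines;
    }

It exercises the same recursion-over-trees instinct as a DFS question, but against a structure every developer has actually seen.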


Here we go again. This bickering is never ending. Personally I have stopped caring.


One of the things that I wish we could meet in the middle on is getting the option to use your favorite text editor or IDE during these interviews.


This article is from 2016. We are in 2019, I'm curious what happened to the author, did he eventually found a job? switched career?


I don't get the entitlement of people who try to interview with Google. The ROI of a few months (or in my case years) of preparation for a Google interview is huge. In Google you are supposed to work on things for a year before it comes to fruition (and you get promoted), but people are patient. Getting to know a few algorithms is like real work: you put in the effort and you get a nice paycheck at the end.


Well firstly, Google's plan is not to hire people who are well prepared; it's to hire people who are smart and familiar with working day to day with the concepts used in Google's interviews. If a significant number of people spent months prepping for the interview, Google would have a serious problem with their interview process -- unable to distinguish the truly talented engineers from the merely well-prepared.

But secondly, if months of prep is required to work at Google what you're saying is they're selecting for young people, rich people, male people - ie, all the people that can afford to be spending their spare time on interview preparation as well as holding down a day job. Go explain to a mother of 2 that the reason she can't get a job at Google is because she can't spend her evenings and weekends on hacker rank despite a job at Google never really requiring work at evenings and weekends.


What you say may be true, but there's a loophole Google applies specifically for women: at least in the Zurich office, interns are 90% female. Google also created a new category, STEP interns (interns in their first years of university). They generally just need to be young and have good reviews, and then they only need to get through one interview when converting to full-time engineer.

As for old people: don't even try, Google has a huge age bias.


Why is this article popping again now? It's from over 2 years ago.


Something the author might not realize. It might be name discrimination for why he's struggling.


Why is it that people who post on Tedium find it hard to hire people or get hired?

Maybe the problem is that they are writing on Tedium.

https://www.youtube.com/watch?v=_f4oJ-DQdSY


I stopped reading in the middle.

I sympathize quite a bit with his frustration, but he does a good job of making it difficult to sympathize with him. He goes on about how there are books written on how to interview at Google, yet it's patently clear he didn't even try to read one of them. It's one thing not to know BFS on the spot because you didn't want to prepare. It's another thing to be totally shocked that they asked the question. A few minutes of searching the Internet, or casually browsing the table of contents of several books, would give you an idea of what they want you to know.

And not to defend Google or other companies, but frankly, BFS is not some super-complicated academic problem that only smart people can code on the fly. It is one of the more basic recursive[0] algorithms there is. So even if you haven't memorized it, it's not ridiculous to expect you to "derive" it.

And equating maze solving with AI. Really?

I've gone to interviews without preparation, just because "Why not?" However, I don't go on a rant when I do poorly on it.

[0] Actually, it need not be recursive.


BFS isn't really suited for recursive solutions. Trying to do so usually amounts to trying to write BFS with a stack only.



