
A big issue in modern hiring, IMO, is the combination of increasingly divergent, specialized platforms and languages with an overemphasis on keyword-driven hiring.

When I first got into the industry you had:

---

* old fogey LISP programmers

* old fogey FORTRAN programmers

* old fogey COBOL programmers

* Assembly programmers

* C/Win programmers

* C/Unix programmers

* C/Mac programmers

* some people on the cutting edge looking at this new "C++ thing"

* everyone else

---

I mean, sure, you had some specializations, but nothing like what you see now. Now you have:

---

* young retro hipster LISP programmers

* JavaScript/Client-Side programmers

* JavaScript/Client-Side/jQuery gurus

* JavaScript node.js programmers

* Python programmers

* Python/Django programmers

* Ruby developers

* RoR developers

* Java/EE programmers

* Java-Android programmers

* Clojure programmers

* Scala programmers

* Perl programmers

* .NET/C#/Win32 programmers

* .NET/C#/Silverlight programmers

* .NET/C#/WP7 programmers

* Objective-C programmers

* C programmers

* Embedded C programmers

* C++ programmers

* Embedded C++ programmers

* Go programmers

* D programmers

* Game Industry programmers (usually C++, often treated as a wholly separate category)

* PHP programmers

* etc

* etc (this is maybe 10% of the list I could generate without getting into the really obscure stuff)

---

Of course a spectacular programmer can pick up a new language/platform quickly, and is usually learning non-day-job programming languages on his or her own time anyway, but I've still experienced firsthand, and heard anecdotally about, many situations where spectacular programmers were passed over without a second glance because they didn't have "at least X years of KEYWORD-HEAVY-SPECIFIC-TECH". Viable, commercial-quality languages and platforms are diverging far beyond the ability of even the most passionate programmer to regularly use more than a small minority of them.




Just the other day I saw a job requiring "3+ years iPad experience".

I understand the appeal of hiring somebody who needs minimal training, but they may still not be the best pick. When it comes to a long-term position, it is better to hire the great developer who takes two weeks to get up to speed with some language or technology than it is to hire the mediocre developer who happens to have used Node.js or whatever.


But that in itself isn't new...

When Java was young, I remember many tales of Java job advertisements demanding experience extending back before the language existed.

My impression is that the current insanity isn't so much the absurdity of the bureaucratic demands but the degree to which people take them seriously...


I'm pretty sure these instances (which I remember hearing about too) are just examples of how out-of-touch HR is. Someone says:

  I need a developer with 10 years of professional
  experience, and he needs to know Java.
HR hears:

  Developer with 10 years of Java experience
That, and we all know that sometimes the job listings are just there so that they can say they tried to find other applicants, when the friend-of-a-friend is actually the 'best fit' for the job (i.e. those job listings are designed so applicants fail).


That is nothing. DHH posted a tweet about a company that required 7 years of Rails experience.

The key to overcoming that is to not give a fuck and just apply anyway. I applied to countless jobs right out of university that turned me down, but that didn't matter, since more than one didn't. Nearly all of them expected more experience than I had, but again, it's just meaningless HR talk.


I suspect this has a lot to do with recruiters/HR being the intermediary. If I hire a programmer, I'd of course prefer that he know the platform he's working on and hit the ground running, but I'd much rather have a good programmer who can learn a new platform and become productive in a month than a mediocre one who knows the platform but will be mediocre forever.

The problem, however, is that if I am a software engineering team leader, there's no way I can talk (or have anybody from my team talk) to every candidate out there to see if they're good. I will have to use somebody to screen people, even before they get to the interview. So recruiters, etc. come in. And it's very easy for them to operate on the keyword basis and very hard for them to find out who's good and who's not. So they do what's easy, since otherwise they'd be out of business or seriously hurt their competitiveness.


It's not just an HR issue. I've come across a number of situations where developers are making hiring decisions based on arbitrary criteria. E.g.:

* The firestorm over Deviant Art's 'we only hired 0.0000001% of people' blog post/HN discussion.

* I went through the application process at a start-up where they were "really excited" about me after the first interview, but dropped me after the second because of a couple of stupid trivia questions. Trivia questions related to callable classes in Python, a feature that, while interesting, is (from what I can tell) little-used and extremely easy to pick up (it took me 5 minutes of looking at the docs; see the sketch at the end of this comment).

* I've seen groups within my own company pass up people with potential because they are 'too old to be junior developers.' (In this case, it was someone with some development experience, though not much, who was changing careers and who showed promise in his interview answers.)

* I know particular people within my own company who are really smart, but also really harsh on interviewees. The tune seems to be along the lines of, "I'm a rockstar programmer, and if you're not one too, then you're an idiot and I don't want to work with you."

* I know of places where it's generally admitted that current employees would not pass the interview process for the job that they do, even though no one thinks that they are inadequate for the position.

These are all examples of developers also doing a horrible job at hiring: basing their decisions on arbitrary criteria that have little to do with whether or not someone is a qualified candidate.
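
(For anyone who hasn't run into that particular bit of trivia: a "callable class" is simply a class that defines __call__, so its instances can be invoked like functions. A minimal sketch, with a made-up class name, of roughly the kind of thing they were asking about:)

  class Multiplier:
      """Instances behave like functions because of __call__."""
      def __init__(self, factor):
          self.factor = factor

      def __call__(self, x):
          # Runs whenever an instance is called with ().
          return self.factor * x

  double = Multiplier(2)
  print(double(21))        # 42 -- the instance is "called" like a function
  print(callable(double))  # True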


On the trivia-questions angle, I do think developers who've been working with a particular set of technologies for a while tend to highly value deep knowledge of the particulars of that technology, and underestimate the competence of someone who doesn't come from exactly the same background. It's not so much that they feel the detail is important as that it becomes a heuristic: if you don't know X and call yourself a Y programmer, you must be incompetent or faking. That's a tough call to make given all the permutations of experience people can have, and it excludes anyone whose technology/stack experience is even slightly different.

The traditional example is interviews for C coding jobs, where interviewing developers gravitate towards standards-lawyering questions. At least that has the advantage of being a fairly stable community, though: being a "C coder" carries a certain set of cultural assumptions, and it might even be true that missing certain trivia questions is a reliable litmus test for whether you're "really" a C specialist.


This is why I never make a judgment based on the answer to a single trivia-like question. I will jump all over the map and ask about as many different aspects of a topic as I can, hoping to map out the depth of a candidate's knowledge. If I am getting mostly blank stares, I then give them the opportunity to decide what to tell me about the subject, in case they can surprise me. Unfortunately, the vast majority of the time candidates are unable to convince me that they have deep knowledge of a subject. I want to find qualified people as much as these people would like to be considered qualified. But sometimes I feel like I throw a bunch of stuff at the candidate and NOTHING sticks.

I am beginning to believe that the majority of people who consider themselves experts or experienced are really just novice or intermediate and don't realize how deep the rabbit hole goes.


Exhibit B: "We code in Ruby, but are open to engineers with strong backgrounds in any interpreted or functional languages (Python, Lisp, Scala, Erlang...). Even a Java background might be OK, if you can show us that you are still learning and growing as an engineer. No C#/.NET engineers though: there are limits!" - ad for Principal Engineer at Slideshare


My point wasn't to paint all developers as being bad at hiring. My point is that HR shouldn't be used as a scapegoat for what's wrong with technical hiring.

I understand that it can be frustrating to feel like some non-technical person is blocking you from talking to someone you can relate to and prove your value to. But it is just as frustrating to finally get to someone technical who can understand you, only to find that that person is using equally arbitrary criteria to evaluate you.


I messed up the formatting on that.

I agree with your point, and posted that job ad in support of it: disqualifying someone because they use C#/.NET is arbitrary and stupid, and that rule apparently came from engineering.


These guys are trying to be funny, but in fact they are probably just limiting their available pool of talent. It's not a big deal - they probably need just one person, so they'd find that person anyway. But it does look somewhat childish.


Out of curiosity, how old were the candidates who were deemed too old to be junior devs? I'm in a similar situation myself (turning 29), and am concerned about my age being bad juju.


Add Node.js, MongoDB, Backbone.js, Redis, Neo4j, CoffeeScript, HTML5, CSS3, Python, Django, Zend, Spring, Hibernate, web services, etc.

They want to hire perfection, but are themselves imperfect.


I agree completely. This is especially true when the KEYWORD-HEAVY-SPECIFIC-TECH is something that isn't easy for a programmer to gain meaningful experience in outside of work, like cloud computing, distributed software, or MapReduce architectures (the programming model is easy to sketch, as below, but the operational experience at scale is what you can't get at home).
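
(For illustration, a toy word count, the canonical MapReduce example, in plain Python. This only shows the programming model; real MapReduce distributes the map and reduce phases across a cluster, and that's the part you can't practice on a laptop:)

  from collections import defaultdict

  def map_phase(documents):
      # Map step: emit a (word, 1) pair for every word in every document.
      for doc in documents:
          for word in doc.split():
              yield word, 1

  def reduce_phase(pairs):
      # Shuffle + reduce step: group pairs by key and sum each group.
      counts = defaultdict(int)
      for word, n in pairs:
          counts[word] += n
      return dict(counts)

  docs = ["the cat sat", "the cat ran"]
  print(reduce_phase(map_phase(docs)))
  # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}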


George, I really like your laundry-list post - and you're right, it could be much longer (at the very least, break "web programming" down into frameworks). And it inspires a thought that is only tangentially related to the OP's point: the utility of fragmentation. There's a curious phenomenon whereby, when you list all of the languages out like that, they all seem so... equivalent.

And yet they are only equivalent in the same way human languages are: there is a shared core of ideas that are expressible in any language, and then there are things which are easier to express in some, and which cannot be expressed in some others.

The proliferation of computer languages seems to be strong evidence to the claim that programmers do NOT believe in the equivalence of languages. And perhaps pg is right, that we already have a winner, Lisp, and most people are just too blind to see it.


I have encountered a few (too few) companies who recognize the equivalence of languages, and recognize that moving from, say, Python to Java isn't very difficult and isn't nearly the same gulf as trying to move from C to Java.

That said, the proliferation of language/framework-of-the-week projects is a serious drag on the industry. It's too easy right now to roll your own framework/JVM language/interpreter. The temptation is too great to manufacture one to solve your sort-of-unique problem instead of adapting an existing one to your needs through modifications or extensions. In addition to clouding the buzzword pool, it hurts the position of existing languages, because the energy that could have gone into improving language/framework X now goes into developing Y based on X instead.


So we are in agreement and the question becomes: how can we debunk the belief that these languages offer any real difference in productivity?

Presumably if we could show this then the world's programmers could focus on porting old code into "Codesperanto" and everyone would be happy(er).


Why did you list Ruby/RoR users as developers but list everyone else as programmers? I'm learning RoR so I'm curious if there's a specific reason for your wording.


I would never have noticed I did that if you hadn't pointed it out. It wasn't intentional, so I'm not sure how that happened.

FWIW I use programmer and developer (and coder, and...) interchangeably so that wasn't meant as a slight on Ruby programmers or to imply that they are somehow different from anyone else on the list.



