Real-life tests are THE best thing to send job candidates. They scale well (you don't have to spend personal hours on them) and you get real information.
This applies even to sysadmins. We have a favourite: set up a VM with a slightly-broken application in a slightly-broken Apache and Tomcat, and get them to ssh in and document the process of fixing it. Even people who aren't a full bottle on Tomcat will give useful information, because we get an insight into their thought processes. I recommend this to all.
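For anyone wanting to replicate this, a minimal sketch of the kind of first-pass triage we'd hope to see in a candidate's notes might look like the following - the service names, ports and log paths are assumptions about a hypothetical Debian-ish VM, not part of the actual test:

    # Are both services even running, and listening where they should be?
    systemctl status apache2 tomcat9
    ss -tlnp | grep -E ':(80|8080|8005)'
    # What do Apache and Tomcat each return on their own?
    curl -sv http://localhost/ -o /dev/null
    curl -sv http://localhost:8080/ -o /dev/null
    # Proxy/vhost errors usually land in the Apache error log,
    # webapp deployment errors in catalina.out.
    tail -n 50 /var/log/apache2/error.log
    tail -n 50 /var/log/tomcat9/catalina.out
    # Catches syntax errors in the Apache config before restarting anything.
    apachectl configtest

What we actually grade, though, is the written reasoning between those commands, not the commands themselves.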
(I note we've just done a round of interviews where we get a nice-looking CV and conduct a technical grilling. Hideous waste of time for everyone involved. All CVs should be regarded, on the balance of probabilities, as works of fiction. Do a remote self-paced test like this. You won't regret it.)
> Real-life tests are THE best thing to send job candidates
I agree, but only if you're allowed to use references/google/etc and given a reasonable amount of time to accomplish it. I've had a "real-life" test where I wasn't allowed to verify or look up information, or where I was given a very short time to execute, and I've always thought those were absurd.
The idea that you would ever have to do something like this in a bomb-defusing type of scenario, with no internet and a ticking clock, would only test whether you have already solved the problem before, not whether you could investigate a new problem.
>I agree, but only if you're allowed to use references/google/etc and given a reasonable amount of time to accomplish it. I've had a "real-life" test where I wasn't allowed to verify or look up information, or where I was given a very short time to execute, and I've always thought those were absurd.
Employers that do this are ridiculous. As are educators. What is this, preparation for when coders are kidnapped by terrorists and forced to recite how to tar/untar stuff on command? (relevant xkcd, of course)
In my experience, they're usually startups or smaller companies that are trying to "copy google" in their hiring practices, but don't fully think out how they're doing it. It's not a good sign.
Well Google does it too. Their 'gang bang' interviews are notorious for weeding out people that don't perform well under short term pressure like that. "just write an algorithm for a stack on this whiteboard with a big O complexity analysis and memory usage pattern in 15 minutes while we all stare at you. What? You don't work well like this? Well this is how elite Googlers work all of the time so GTFO"
To be fair, a lot of people are capable of becoming that type of employee after gaining comfort at the company, and after building relationships with their co-workers. To expect someone to do that in front of strangers while under pressure doesn't necessarily correlate with a programmer who can evolve into such a person.
I've been in two interviews where they asked me to code without googling or looking at the reference.
Personally, I think it's as absurd as writing shell scripts on a computer with no `man`, or without being able to refresh your memory with `<command> --help`. Good luck with that!
I was in an interview where they asked me to configure specific firewall rules on a notepad with no access to a terminal, so I couldn't check man iptables for things like how to match certain source/destination hosts/IPs. And I was not too familiar with the opts (was it --dest, --dst, or -d? and how do you calculate the network mask for this range of IPs?). It was also for a junior position, so I thought they were taking the piss and simply told them so.
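For the curious (and purely as an illustration - the exact rules obviously depend on the scenario they posed, and the addresses here are made up), the flags in question look like this:

    # -s/--source matches the source address, -d/--destination the
    # destination; --src and --dst are accepted aliases.
    iptables -A INPUT  -s 192.168.1.0/24 -d 10.0.0.5 -p tcp --dport 22 -j ACCEPT
    iptables -A OUTPUT -d 192.168.1.0/24 -j DROP
    # The /24 covers 192.168.1.0-192.168.1.255, i.e. netmask 255.255.255.0.

Exactly the sort of detail that's a five-second man page lookup and a coin flip from memory.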
It gets a bit complicated when your job candidates may find your test interesting enough that they post a complete walk-through of the solution online.
The candidate was kind enough to avoid posting the name of the company (to hopefully hide the walk-through from Google queries for "crack me hiring test company name"), but an HN discussion might then arise around the walk-through, in which the company would be named, and well, there goes your test.
It's hard to accurately run "take home" tests when hiring because of just this.
If you're not changing your tests on every interview batch you're doing it wrong.
You're asking a potential candidate to spend hours cracking your program but you can't be bothered to change it for each batch?
I expect any company applying this interview technique to at least put some effort into it and even then it pays off since a batch could contain hundreds of candidates and you only have to build the program once.
This ^ "You only get out of it, what you put into it." goes both ways - employers should expect to extend some effort if they expect their candidates to extend some effort.
I strongly suspect that sometimes the goals of the person responsible for hiring are not quite aligned with those of the company they work for.
Example:
Company's goal: Hire an exceptional candidate.
Lazy or overworked hiring manager's goal: Reduce the size of the stack of thousands of resumes I have to go through, using any means possible that will make it appear like he/she actually did the work to find the candidate.
"Lemme see, use keyword search to ditch the ones without 'Github' on their resume'. Now ditch the ones who don't have a college degree. Still some resume's left? OK, how about tossing the ones without a CS degree?"
Run a filter with enough conditions and you can get that pile down to nothing pretty fast. If that is still not enough filtering, ask the candidates to do something very time consuming. This is a great way to weed out people who are very busy or exceptional at what they do and don't need to jump through your hoops.
The problem is that this makes it difficult to compare evaluations. If you want to have as objective an evaluation of candidates as possible, you need to subject them to exactly the same process.
I'm surprised they didn't make him sign an NDA for it...
Anyone who claims to have years of Unix experience can easily do anything they want using man pages. That said, I agree with you on the general premise: one must be allowed to look up the information they need to solve a problem.
>>I've had a "real-life" test where I wasn't allowed to verify or look up information, or where I'm giving a very short time to execute, and I've always thought those were absurd.
You certainly haven't worked at major IT services firms here in India. I spent the early part of my working years (around 2006) coding purely by reading books and discussing things with folks on an internal mailing list, because only managers and levels above were allowed internet access.
>>The idea that you would ever have to do something like this in a bomb-defusing type of scenario, with no internet and a ticking clock, would only test whether you have already solved the problem before, not whether you could investigate a new problem.
This is unfortunately true even for coding interviews. Most algorithm questions are impossible to answer unless you have rote memorized the algorithms and their implementations.
If you have to give white-board coding interviews, the goal should be to have a question that's complex, potentially elegant and straightforward.
By complex, I mean the question should be difficult enough that you can't easily hold every parameter, case and step in your head: I want to see how you deal with situations that you have to mentally walk through and issues that you might lose track of.
By potentially elegant, I mean both the brute-force and optimal solutions should fit on a small whiteboard, requiring no more than 2-4 variable declarations and nothing more complicated than a nested for loop (or some sort of recursion). The optimal solution shouldn't require putting together pieces that they aren't really expected to know/use, i.e. nothing more complex than arrays, maps, strings and ints, plus whatever other "data structures" the candidate ought to know given their background. For instance, if they claim to be a crypto expert, they should know how to use a black box that gives them private and public keys.
By straightforward, I mean you don't want the candidate to need an "aha" moment. If you're asking them to write a max-flow variant, you'd better be explicit that you don't mind a really brute-force solution, or most of the interview will be spent trying to remember complex algorithms. Is the algorithm one that you can understand by walking through test cases and figuring out what steps you took in your mind to go from input to output? If it takes more than that, you're going to get a few false positives, i.e. "Wow, she was great! No one's answered this so fast!" (because she just spent 20 hours doing this on a homework assignment), and many false negatives.
Yeah, but B5Geek's is a pretty entry-level thing. I've encountered whiteboard tests where you have to recreate an algorithm that someone spent their PhD thesis creating. It seems like a good way to find "geniuses", but it just isn't practical.
In a lot of ways, it's a matter of identity crisis for software developers. We're all, industry included, not quite sure whether we're mechanics, carpenters, architects or scientists.
I've taken a whiteboard test that asked me to come up with an algorithm that was difficult for me. I got the job even though my solution was horribly inefficient (and I knew it), and the interviewer later told me that the point wasn't to see what algorithm I knew - it was to see if I acted like an ass when I didn't know stuff.
Not sure that's a good test either. When tasked to figure out something as part of a job, by a manager one knows and trusts, who takes feedback of "I don't know how to do this but I will investigate it", a person may act one way, whereas being tasked with someone's PhD thesis by a stranger on an interview may lead to a very different response.
I know I am far less tolerant of bullshit interviews than I once was, and would have far fewer qualms vocalizing what I'm thinking - "I don't know. If I saw this in real life I'd check Google, because it reads exactly like the sort of contrived problem used in interviews and thus is well documented, and likely solved quickly by those who have memorized it. I am not one of those. I'll take a stab at it now if you just want to see how I reason through it, but if coming up with an efficient, correct implementation is part of your criteria we can probably save both of us the trouble" - is that considered acting like an ass?
True. Testing for sysadmins is easier: you're after a way of thinking. So a competence test ("can you do what you claimed?") with a freeform "keep notes on your thinking along the way" is quite informative. You still need to interview, of course.
I confess I don't know how I'd apply this to developers. They pass fizzbuzz, OK - what do you do next?
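(For reference, the fizzbuzz bar I mean is roughly this - a throwaway shell version, nothing more:

    for i in $(seq 1 100); do
      out=""
      (( i % 3 == 0 )) && out="Fizz"
      (( i % 5 == 0 )) && out="${out}Buzz"
      echo "${out:-$i}"
    done

It filters out people who can't program at all, but says nothing about whether they can build anything.)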
> I confess I don't know how I'd apply this to developers. They pass fizzbuzz, OK - what do you do next?
I know what I would do, just have a 10 minute conversation with them about technology. I can easily figure out just where a person is in regards to their development as a programmer just by listening to them talk about what they're paying attention to in the tech landscape. If they don't blink when I mention Hacker News, I know they're on the right track. Do they know what Go is, or Haskell? Do they have an opinion about Node.js?
These criteria are intended to gauge motivation and enthusiasm in the absence of an impressive track record. If they're capable but not highly motivated, then that's fine so long as you're only hiring them to do stuff inside their comfort zone. We have a Microsoft guy just like that. If they're motivated but not capable, then with coaching they'll be able to do anything. The best way to discover someone's motivation is to get a gander at what they're thinking about when they aren't working.
If you're looking for someone who is motivated and capable, you can eventually find someone like that, but you'll have a hell of a time keeping them. Developers have so many options these days that you need at least 3 of the following - competitive salary, decent perks, fun work environment, a glamorous field - in order to hold on to them for longer than a year.
Most companies are lucky to manage one, if they're very lucky two. Since salary is often the easiest thing to manipulate and honestly the most important to the employee, err on the side of paying too much. It's expensive but not having competent developers is far more expensive.
Not infallible - I could talk tech near-perfectly (I am a native speaker of BS), but I am most definitely not a coder pretty much at all - but a good start.
Also, using HN and its fads as a measure of motivation sounds a little elitist. There are really awesome and motivated coders out there who couldn't care less about politics, mainstream tech media news articles, gadgets, Apple-does-this, Google-does-that, cool new language or JS toy of the week... these coders are likely focused on their code instead.
Programming is a wide field, there's much more to it than the front page discussions on HN can cover.
Of course, if you're looking for a Web developer to make "cool new stuff" for the latest fadgets..
You make light of web development, but that's driving the enormous demand for programmers in recent years and all of our salaries. I bumped my pay $20K when I took this job and I'm now asking for $30K more after a year. Most fields calcify after a few years and salaries level off as the managers do their thing. But making "cool new stuff" and the "latest fadgets" is still really hard. It might be the fashion industry, but business is booming.
Speaking as an iOS developer who's had a hand in hiring, it's the same as the "slightly broken Apache and Tomcat" scenario someone posted for sysadmins. We ship people a slightly broken app with a few common-yet-googleable problems, then ask them to build a tiny thing on top of that and ship it back to us.
The ability to solve common problems acts as a good proxy for experience. If they can't google how to fix a retain cycle or a broken constraint, we know they probably don't know what they're talking about.
The ability to build a good, small system on top of the broken app acts as a good proxy for development talent. If they can't easily set up a dead-simple delegate to do an asynchronous web call, they can't hack it as a mobile developer.
Most importantly, if they don't care enough about the job to take two hours to whip up a super-simple app, then we don't have to waste our time interviewing them.
For iOS developers I found that a good test was to download and parse some XML or JSON API of a web service and display the result in a UITableView. It's surprising how many people can't accomplish this.
For developers that have the academics but not the experience, I've always said that the industry needs to adopt the apprenticeship programs that other trades have, where you do a mixture of work for the company and trade schooling run by the industry to teach you the specific skills you need.
For example, an IBEW apprentice will do a lot of gruntwork on the job site. Pulling wire, for instance. But they'll get exposed to how things are done, to trade industry expectations, and what a project looks like from start to end. Part of the week, they spend in a classroom learning the specifics of their trade. Not just the high level theory, but the nitty gritty, based on real world applications and problems. This apprenticeship lasts from 2-5 years, depending on the trade.
I don't see why this model wouldn't work really well with programming.
Apprenticeship requires a constant stream of mentors. I'm not sure there is an ample supply of professional programmers who would be interested in taking on the responsibility of mentoring in addition to their full-time jobs.
It really depends on what those professionals are used to. I did an apprenticeship as a software developer in Switzerland. That is, 1 year of school followed by 3 years alternating between the company and the school. So once you start at the company you already have a decent amount of knowledge in programming, databases, networking, etc., and are supposed to be able to work on your own.
You don't get constant mentoring at the company, but you can ask one of the developers or fellow apprentices when you're stuck with one of the tasks you get assigned. Or, for example, any time a developer had a small task that he felt I couldn't do by myself, or that was too time-critical, he just asked me to sit beside him and watch, listen and take notes while he did the work.
How it works is different in any company but the outcome should be the same: real world experience on real projects with real clients and their time/budget constraints.
That means, once you're finished and a professional yourself, it isn't a question of interest but a completely normal thing to help out the apprentices. Because that's how you got to the point where you are.
We may laugh, but starting in technical support gives you an idea of how your users are actually using your product, whether your documentation is accurate and helpful, what the pain points are for people not in the department you're destined for, and how the product works.
It also quickly weeds out the people who can't effectively communicate before they get to the programming team.
We've been rotating the sysadmins through customer service. Not taking calls (our CS calls tend to be long and detailed, and a n00b isn't going to help), but being there for the CS people. It's been marvellously instructive. It also cheers the CS people that someone cares. Now we need to get the devs doing the same.
When did block-release training or work experience start getting called co-op, given that in various areas of the world a co-op means a cooperative like Mondragon?
I have only seen it (co-op) used in the last few years.
Algorithm tests only measure the number of man-hours candidates spend on CareerCup daily, or how good they are at rote memorizing things.
There are many ways to check how good a candidate is at doing their actual work. For instance, you can take a candidate to a code review and see what kind of input they contribute to the discussion. This alone will give you a good idea of how well they understand code and quality. There are other ways, like asking them to write unit test cases for a class - that would reveal a lot about how they think with regard to breaking and fixing things.
One more thing I do after the basic checks is pair programming: picking a totally new problem and working with the candidate to see how they think, how they work, and how good their communication skills are.
There are many other ways too. Recently I've been given coding assignments for very practical applications, though those turn out to be a little difficult and time consuming. And at times the expectations are very high, like asking the candidate to complete a whole application in a very short time. It can be hectic alongside your current job and a family. But generally those are a good indication of how well the candidate will do at their actual work.
A nice simple test that I like to give as a first-wave elimination:
Have a computer set up and running (all properly configured). Pull the network cable out of the jack a little bit (so it looks like it's plugged in but isn't).
Ask the person being interviewed to show me an IP being used by microsoft or google. (so ping/dig/nslookup/etc)
Let the person know that (a) the computer is in working condition (i.e. no drivers are missing) (b) the network works (i.e cables are good, switch is good, DHCP is enabled, etc.)
(c) tell them that this is a test to determine their troubleshooting skills
It is always disappointing to see how few ever open a term/cmd window (depending on the OS). 90% of participants just try to open a web-browser and type in "what is the IP address of google"
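For contrast, the two-minute version from a terminal (a Linux box here; the interface name and tools are assumptions about the test machine, and the Windows equivalents would be ipconfig/nslookup):

    ping -c1 8.8.8.8        # no reply, so the problem is below the browser
    ip link show eth0       # "NO-CARRIER ... state DOWN" points straight at the cable
    dig +short google.com   # once the cable is reseated, this answers the actual question

The point isn't the specific commands, it's whether the candidate reaches for any of them.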
>Let the person know that (a) the computer is in working condition (i.e. no drivers are missing) (b) the network works (i.e cables are good, switch is good, DHCP is enabled, etc.)
So you lie to them? I don't understand what the point of this test is. My first inclination is to open up a term and ping google but I would be pretty annoyed that your "first-wave" test involved actively lying to a candidate.
If this is a user-facing tech-support position, then it is a hard job requirement that the candidate be able to handle being lied to by the user. I don't care what they say: they didn't check the cable, they didn't reboot, and they most certainly didn't "not do anything" that might have caused the problem.
>If this is a user-facing tech-support position, then it is a hard job requirement that the candidate be able to handle being lied to by the user.
Yes, I agree. Are we supposed to also assume that our bosses and interviewers lie to us, too? At no point does he say that he is emulating a user that needs support help.
Well, if not lying, you should always entertain the possibility that they are mistaken. The best confirm and reaffirm constantly. You know what they say about assumptions.
To be fair, the first thing you do is look for relevant buttons to click on if there are any, the next thing is to google the problem, and the thing after that is to check it's all plugged in.
All he said was that everything was in working order. He didn't say that the cable was fully seated into the receptacle. Your first inclination would challenge your assumption and you would likely follow the stack down to the wire. At that point you'll note that the link lights are off and reseat the cable.
This means - to me - that the network... well, works. And it clearly does not. If this was a Windows machine I'd probably wonder what the hell was going on with the Ethernet (!) sign in the bottom right of the tray and if it was a decent version of Ubuntu (11.x or earlier) then you'd also notice it quickly.
But still. He's pretty obviously leading the person astray by giving him the solid impression that the machine is in good working order. (which it is not) If you mean to say that he didn't CLEARLY specify that the "cable was fully seated..." I don't know what to tell you. That's a "gotcha" question. Not a good diagnostic one. Unless this is for a $12/hour CSR position. In which case, OK fine.
Every day, I have QA, tech support, my boss, and other developers tell me something that is not actually true, or simply omits any of the relevant information necessary to fix the problem. This is something you live with on a daily basis in a technical job involving complex systems that not everyone understands the entirety of.
If you can't solve a problem with only some information, and some of it incorrect, then how are you supposed to be able to help?
You are looking at this like it is a single-variable problem. This "test" should be easily solvable by a candidate. It is also a very bad one if it involves lying to the candidate. These two factors are not mutually exclusive.
No statement in that list is a lie; the computer is in working condition, the network itself works, the cable works, the switch it is connected to is working, DHCP on the computer is enabled. The cable being unplugged doesn't break any of those.
I don't know why this is downvoted. This is Hacker News, a hacker always uses the least amount of effort (the least number of lines of code for example) to achieve his goal.
Windows key + R / "cmd" / Return / "nslookup google.com" then when that fails, "ping 8.8.8.8" and when that fails, "Ipconfig /ALL", then ask "Do you use Dhcp or are ip's assigned?" as I reboot the computer and check the physical connections at the same time.
Doesn't giving someone a non-working computer that you tell them is working make you look like you haven't bothered to test it? As a candidate that'd be a warning that I'd be working with lazy idiots.
You should always remember that hiring is as much about persuading the best person to say yes to a job offer as it is about weeding out the bad people.
The question was "show me an IP being used by microsoft or google"
If you are asserting that google is not using the IP of its public-facing DNS service, this company is full of horrible tech people, so never mind: I don't want the job.
I love this approach. I am good at what I do, and appreciate the opportunity to prove it directly. Spending a couple hours solving a technical problem or writing an essay is much easier for me, not to mention sending a more reliable signal, than trying to figure out what networks / buzzwords / whatnot various people in an opaque organization are going to want to see. Not to mention, it communicates to me that the people doing the hiring know what they want and how to get it, which is encouraging.
I believe there is a happy synergy here, by the way. A candidate who welcomes this type of test -- who sees a direct work sample as the easy and efficient way to prove themselves to you -- is probably someone who knows their work is good.
It's an effective strategy from the employer's side, but what about the applicant who is asked to spend hours on some test for every job he/she applies for? Are you paying these applicants for the time they spend jumping through these hoops?
And if your "technical grilling" fails to identify strong candidates, you're obviously asking the wrong questions or, at the very least, not asking the right ones.
As a candidate, I'd much rather they ask; I find that companies that don't ask for code are almost by definition unlikely to ask the right questions. If you must screen candidates for skillset, the less you have to rely on a proxy for that skillset the better off you are. As an employer, I find code samples irreplaceable, and I'm comfortable screening out candidates who decline to take the challenge out of principle or limited time.
I would rather spend hours on some test than be judged on a whiteboard problem, or any other problem which has nothing to do with programming on an actual computer that has an internet connection.
For the most part I don't complain if asked for an hour (or less) worth of work. I mean, I probably spend more time researching the company, working on the cover letter, replying to communications, and scheduling follow-ups. I'd much rather spend an hour at home with resources I know, than an hour in a room with a whiteboard.
The technical grilling was the thing that the people with apparently-strong CVs weren't passing.
Your objection is a fair one, but having come into the company through the process I describe I am fond of how it let me (a) show that I could actually do the stuff I claimed to be able to (b) show my thought processes and how I would deal with having received a box in this condition. So I was quite pleased to do it as a candidate. Certainly less of a waste of time than a clashing interview.
It's not hoop-jumping to ask a candidate to directly demonstrate how they would go about doing the actual job they're applying for. If you were interviewing a cook, would you hire them having never tasted their work? I doubt it.
The way I do this is the real coding test is only administered to final candidates. That's a bit kinder since only a few people with a serious shot at the position end up having to commit the extra time and since we observe the whole process (and discuss) it doesn't waste tons of our time either. Whiteboard and discussion can reveal a lot but IMO there's nothing so revealing as actually watching someone work.
As an aside I would much rather be interviewed this way than at a whiteboard. At least for me there's a certain amount of my programming skill that's nonverbal and flows easier for me at a keyboard.
> If you were interviewing a cook would you hire them having never tasted their work? I doubt it.
A cook is someone who can be hired with very little experience. A chef is someone who is trained professionally. And plenty of chefs are hired without a "taste test" based on their academic and employment history.
They're hired on a provisional basis, true, but they are paid while they prove themselves.
I was paid for a day's work: after the initial interview to test that I knew some really basic programming (think Fizz-buzz, but slightly more interesting), they brought me in to solve a real problem in their app. And I did, and they hired me. It was quite brilliant, actually, and they paid me for the day's work at a decent rate. I was impressed.
The point is that you'll only get applicants up to a certain level. Once an applicant has enough skill and knows it, and also values his time, he'll take one look at your 14 hour challenge and laugh.
So in the end you're only hurting yourself because you'll mostly get naive and easily manipulated people who are fine for the lower ranks but absolute poison once they move up. That's if they don't wise up and leave first, costing you time and effort to find and train replacements and scramble to rediscover and distribute the lost knowledge they left with.
But then again, such a long duration challenge is already very telling of the company culture. If you can't respect other peoples' time, then your company is already on shaky footing.
he'll take one look at your 14 hour challenge and laugh
These things shouldn't be designed to take 14 hours. They should be designed to take one, maybe two or three . . . if you know what you're doing.
One of the other posters took so long because he didn't know what he was doing. Well, perhaps that is an uncharitable way to say it, but you know what I mean. :)
The test I describe should take an hour or two - it's a test of "can you actually do these things your CV claims you can?" Which is nontrivial, but I enjoyed taking it (fix a broken thing!) and feel fine about giving it to others.
>"can you actually do these things your CV claims you can?"
Yeah but how many questions must you ask before you are certain? Otherwise you're rolling the dice that the candidate has simply seen that problem before but has terrible habits otherwise.
Reduce the time required and you risk more false positives. Increase the time required, and you risk the skilled ignoring you.
But the process described in the post above is clearly about proficiency, so a challenge is not supposed to last 14 hours. ;)
And I admit you have a point. The thing is, people really do have good-looking CVs and then can't code BFS in Python. Probably balancing the 'technical grilling' is a hard-to-master skill in and of itself.
Seems like this sort of test should be very effective at finding people who are already reasonably good at the job you're hiring for, but gives you little to no information about how quickly they learn, their willingness to try new things, etc.
Given how fast things change, even if you imagine the person staying for only a year or three, learning abilities are nontrivial.
Totally. YMMV. That's why you look for stuff on the CV like willingness to experiment, play with OSes at home, "describe your home network", etc. This doesn't replace the interview, it just verifies claimed competence and helps show thinking ability.
"Describe your home network" is an interview question now? Jeez, guess I shouldn't leave my job, I'll never get another one.
I am going to go ahead and challenge the assumption that your willingness and eagerness to learn on the job is not really correlated with how you spend your time outside your working hours.
I think it's an excellent question. Not because a candidate needs to have something special about their network (although if they do, that's a plus and a chance for them to talk about their interests) but because if they can't describe something they own, why should anyone think they will have the communications skills to describe something they're going to be working on?
I mean as extra points that help the candidate look interesting. It's like the "other interests" section on a CV: you should treat it as an opportunity to market yourself and add a hook to catch the interviewer's interest.
As an interviewer I've done something like this when needing to hire for a very straightforward job with very little time and an effectively endless supply of candidates.
This addressed the problem of our time crunch, but I doubt it helped the selection process overall.
I think this sort of testing is subject to all the same issues as most any other method of interviewing, but limited to a much smaller sample size.
Basically, it's arguably good for the interviewers and a coin flip for everyone else.
As an interviewee, I haven't encountered a test like this, but I expect my response would almost surely involve questioning the architecture decisions that led to such a problem in the first place.
In this case, as a sysadmin I was pleased to be able to show my chops in a realistic test. And our previous hire was also quite pleased to get a real test. But of course, there is potential to abuse it.
In this case, being able to Google how to fix a slightly-misconfigured Apache and Tomcat counts as passing the test with flying colours. Becoming a Google-Certified [whatever] Engineer at the drop of a hat is something we regard as a serious positive in a sysadmin. Open book tests are never easy.
Someone still calling them sysadmins instead of devops on Hacker News is a rare thing. :)
I know many so called sysadmins who could not pass this test in a reasonable time. Sometimes people without even basic script-fu or networking knowledge are considered sysadmins. And this drives me crazy.
Yeah. I'm thinking of Joel Spolsky's 199/200 who are just the same bozos cycling through interviews until someone hires them by accident ...
"devops" is fighting words in our team. Damn, if I could get our devs who'd like "devops" on their CV to give a hoot about the "-ops" half of that buzzword ... when a dev shows awareness of ops issues with their shiny new thing, I make a point of mentioning their name positively to the dev manager. "X knows their stuff. Give them more good stuff."
:) When devs are in the loop it always comes down to the "working in localhost" debate. I like to say developers are working or living in their isolated bubble. And whenever their code comes out of that bubble and reaches the real world, ops are on fire.
And the ops guys are always at fault.
I think every developer should be given sysadmin 101 and 102 lessons.
I'd just like to note here that the sysadmins know damn well when a dev tries to blame them for something the dev did ... and we NEVER. FORGET. So don't do that.
I'm not sure how to educate devs to think ops. Perhaps have their phone ring at 3am when stuff breaks? I think that would close the feedback loop nicely.
(Let me note again: I LOVE the devs who think operational issues, who think end-user issues - what customers are actually like - who think "developer of the whole thing from go to whoa" and not just "coder on my PC." As I'd hope we all do here.)
As both a developer and admin, it's a little hard for me not to think across the aisle. However, I can think of developers who basically refused to think outside their silo - they tend to fit the type of person that really doesn't like to go too far outside the bounds of their workday tasks. Not saying that everyone is this way, but you wouldn't ask John Carmack to try to think about continuous integration and deployment stuff while he's stuck reading physics books and a bunch of papers, would you? Yes, good devs tend to have the curiosity and, more importantly, the concern to think beyond just their immediate day-to-day concerns of hyper focus and in-depth knowledge.
Sometimes you just need to hire more people or ask the rest of the team to accommodate someone that's that one random oddball. A management book I read mentioned Phil Knight talking about how the Bulls had room for one Dennis Rodman and only one, and made it clear that nobody else can pull the stunts he does without disrupting the harmony.
There's a huge difference between a dev that does something a bit out of ignorance and one out of indignance.
Some companies have developers spend some time in SRE (I believe Google practices this sometimes) so they can gain some insight, but it may not be the best idea for a lot of orgs. It's part of why most orgs that do some form of devops well tend to take a lot of concerns off the table by using things like AWS. Meanwhile, siloization helps people maintain some sanity and focus in larger orgs where there's so much BS work on top of your technical duties.
Sometimes the culture is short-sighted and people are at odds over goals, though. I've been penalized by managers and peers for not paying enough attention to my dev duties (which were pretty meh) while I was busy helping support and sales bring in and retain $2 million in accounts - the same people who later named me on calls as their informal engineering MVP.
But really, being aware of what other people care about in their job is a contentious issue that I mostly think boils down to personality and general ideas of teamwork.
But presumably that happens some time down the road, long after you have made your decision. Either that, or said secrets are published and heard about by an audience that is tuned to hear about interesting things quickly, and who may also be exactly who you're after.
I remember a website providing failing VMs for this purpose. I'm not even sure it wasn't just a suggestion... but if someone knows such a site, feel free to leave the URL.