Organizational Skills Beat Algorithmic Wizardry (dadgum.com)
193 points by sbierwagen on June 20, 2013 | 67 comments


I agree with the OP, but I still think the power of algorithms and data structures should not be underestimated, as they power most of the software everyone (including "organizational" coders) uses, e.g. databases.

For example, I'm writing a pure JS database (https://github.com/louischatriot/nedb), and using indexing I was able to speed it up hundreds of times. That's clearly not something I could have done without knowing about self-balancing binary search trees.
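As an illustration of why an index makes that kind of difference (this is a sketch, not NeDB's actual code): an unindexed lookup scans every document, while a sorted index answers the same query with a binary search. A self-balancing BST gives the same O(log n) lookups while also keeping inserts cheap; a sorted list stands in for it here.

```python
import bisect
import random

# A toy "collection": documents stored in arbitrary order.
docs = [{"_id": i, "name": f"user{i}"} for i in range(100_000)]
random.shuffle(docs)

# Unindexed lookup: O(n) linear scan over every document.
def find_scan(collection, _id):
    for doc in collection:
        if doc["_id"] == _id:
            return doc
    return None

# Indexed lookup: keep documents sorted on the indexed field and
# binary-search the keys in O(log n).
class SortedIndex:
    def __init__(self, collection, field):
        self.docs = sorted(collection, key=lambda d: d[field])
        self.keys = [d[field] for d in self.docs]

    def find(self, value):
        i = bisect.bisect_left(self.keys, value)
        if i < len(self.keys) and self.keys[i] == value:
            return self.docs[i]
        return None

index = SortedIndex(docs, "_id")
```

On 100k documents that's roughly 17 key comparisons per lookup instead of an average of 50,000, which is where the "hundreds of times" faster comes from.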

I also think that knowing algorithms does make you a better programmer, even if you don't use them on a daily basis.


There really are two classes of programmer: the ones that build "tools" for other developers, and the ones that build end-user software.

The first ones need to be fluent in every type of algorithm and data structure. The other type has the task of modeling real-world processes in a computer (which means "concept organization and translation" rather than pure algorithms).

The first ones deal with primitive types, and very rarely more than that. The second ones need to model all the entities of a real-life problem and their complex interactions (complex in the sense of combinations and organization).

That's why I think there will always be a gap between those two developers. They are dealing with very different kinds of tasks, yet their job titles are the same.


"There really are two classes of programmer: the ones that build "tools" for other developers, and the ones that build end-user software."

Actually, the second group is occasionally saying, "WTF was the first group thinking?" after spending hours trying to get the tools to work for the real world. (Try spending a few hours/days writing wrappers for PKI software with shitty APIs...)


This distinction is artificial. I know I do both every day.


So do I, and those who build basic tools very often also make stuff for end users, but there are a lot of programmers who only do end-user software based on existing tools. They think basic tools are something someone else makes, and if there is no such tool, then what they tried is "hard to do". It never occurs to them that they could make the tools themselves, or they don't have the skills to do it.

The distinction is real, in that many programmers never do build basic tools, while it is artificial in the sense that there is no reason for this other than lack of skill and an organizational culture that says this is not something you are supposed to do.


I would think the point is that if you have a job where someone doesn't need to do both, which is many corporate programming jobs, you shouldn't hire for both skills.


There really are two classes of programmer: the ones that build "tools" for other developers, and the ones that build end-user software.

Where exactly is the distinction? How would you classify Monodevelop, or git/svn, or the C++ STL, or SQLite, or Emacs, or the Unix shell utilities, or Photoshop?


This Yegge article from 2007 is relevant:

http://steve-yegge.blogspot.com/2007/06/rich-programmer-food...

"If you don't know how compilers work, then you don't know how computers work."

Becoming better at the first class of programming that you describe will also make you better at the second class.

And a lot of times, "modeling the world" is a pitfall of object-oriented programming, and something that you don't really want to do. But it's very tempting if you've only ever programmed in Java.


Knowing how compilers work is very useful foundational knowledge, but that's different from knowing, off the top of your head, the underpinning algorithms needed to implement a compiler!


Really? They sound exactly the same to me. What, in your mind, does "knowing how compilers work" consist of, other than "the underpinning algorithms needed to implement a compiler"?


The exhaustive list is longer than I have patience for; here's one example: understanding how the parse and lex steps work, and what they accomplish, is useful without knowing how to pseudocode a recursive descent parser. It's worthwhile to write said parsing algorithm once in your life; it's not necessary to keep that knowledge warm in your head throughout your career.


Hmm, I guess I don't know what "understanding how the parse and lex steps work" would consist of that didn't involve being able to pseudocode at least some kind of parser, of which the recursive-descent type is the simplest. But maybe I'm blinded by having spent too much time writing parsers. Or maybe you mean "knowing how" in some kind of total back-of-the-hand mastery sense, as opposed to "you could eventually get it to work"?
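For the record, "the simplest kind of parser" being discussed here really is small. The following is an illustrative sketch (grammar invented for the example) of a recursive descent parser for "+", "*" and parentheses that evaluates as it parses, with one function per precedence level:

```python
import re

# Grammar, one function per precedence tier:
#   expr   -> term ("+" term)*
#   term   -> factor ("*" factor)*
#   factor -> NUMBER | "(" expr ")"
def parse(src):
    tokens = re.findall(r"\d+|[()+*]", src)  # the "lex" step
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        if expected is not None and peek() != expected:
            raise SyntaxError(f"expected {expected!r}, got {peek()!r}")
        pos += 1
        return tokens[pos - 1]

    def expr():
        value = term()
        while peek() == "+":
            eat("+")
            value += term()
        return value

    def term():
        value = factor()
        while peek() == "*":
            eat("*")
            value *= factor()
        return value

    def factor():
        if peek() == "(":
            eat("(")
            value = expr()
            eat(")")
            return value
        return int(eat())

    result = expr()
    if pos != len(tokens):
        raise SyntaxError("trailing input")
    return result
```

The structure mirrors the grammar directly, which is why it makes a reasonable "write it once in your life" exercise.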


It's true there is little overlap between these two categories. But I found that the low-level work I do (such as the db mentioned above) helps me understand better and be faster at high-level, "real world" work (such as my startup's website and API tldr.io).


>I also think that knowing algorithms does make you a better programmer, even if you don't use them on a daily basis.

IMO this helps even at a very basic level, because it lets us consider "hey, I wonder if there's an algorithm out there that'd help me with this" and gives you the impetus you need to go looking.

Kind of a "you know what you don't know" situation of sorts.


In my experience, some of the smartest guys, the best at algorithms and puzzles, make some of the worst coders. No one can understand their code, it's brittle, and their work eventually gets replaced by code written by someone with half their IQ but a decent sense of organization.


We are all mortals at the end of the day. You can choose to study algorithms or you can choose to study clean code. Few have the time to do both. I've noticed that it isn't just algorithms though. People that are really good at reasoning in code (maybe from studying algorithms so much) have a much higher tolerance to code complexity and produce much worse code as a result.

That said, I've worked with a few people who got both sides very well and they were a treasure to work with.


You don't have to "study" clean code. You just have to overcome the tendency to write code in an overly smartypants way. Coincidentally, in my experience, there seems to be a correlation between people who love to study algorithms and people who love doing this. :)

I can reduce that 20 line function into one line of maps / filters / reduces too, but I know I shouldn't if it's going to take 20 minutes for someone else (or me 6 months later) to get through it.

Basically, don't aim to reduce Kolmogorov complexity, aim to increase readability. These two aren't completely mutually exclusive anyway.
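To make the point concrete (the task and field names below are invented for illustration), here are both versions of a hypothetical "total spend of active customers with a valid email" computation. They produce identical results; only one of them can be skimmed six months later.

```python
from functools import reduce

customers = [
    {"name": "a", "active": True,  "email": "a@x.com", "spend": 10},
    {"name": "b", "active": False, "email": "b@x.com", "spend": 20},
    {"name": "c", "active": True,  "email": "",        "spend": 30},
]

# The "one line of maps / filters / reduces" version: compact, but the
# reader has to unpack it inside-out.
dense = reduce(lambda acc, c: acc + c["spend"],
               filter(lambda c: c["active"] and "@" in c["email"], customers), 0)

# The longer version: each condition is a named, skimmable step.
def total_active_spend(customers):
    total = 0
    for customer in customers:
        if not customer["active"]:
            continue
        if "@" not in customer["email"]:
            continue
        total += customer["spend"]
    return total
```

The dense version isn't wrong, and with well-named helper predicates the higher-order style can be just as readable; the problem is cleverness for its own sake, not the combinators themselves.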


> You don't have to "study" clean code. You just have to overcome the tendency to write code in an overly smartypants way.

This is a great point. As an industry, we seem to be impressed by complex solutions (even if the complexity is unnecessary and the code unmaintainable). Writing clean, obvious code is often met with a "Duh, I see why it was written that way", but without understanding that it took a long time to get the complex solution down to that point.


I agree with the sentiment of your comment, and especially like your last two sentences. Maybe I'm just expanding on your last sentence.

I have to object to "20 line function into one line of maps / filters / reduces". I don't know what the 20-line function looks like, but I find use of classic higher order functions instead of loops makes code drastically quicker to read accurately.

As with human language, a larger vocabulary can be used in an "overly smartypants" way, but it can also be used to more quickly and robustly convey information between people (or to your future self).


Clean code is more than just not writing obtuse code. There is a lot of literature on the subject spanning advice about patterns and principles to cookbook recipes for problems and so forth. The challenging part with clean code is that most of it is fairly subjective. More of an art than a science. We know clean code when we see it, but we cannot always reproduce it ourselves.

If you haven't read the literature and do not practice with the different techniques, you are probably not writing clean code.


> in my experience, there seems to be a correlation between people who love to study algorithms and people who love doing this.

Very well put.


This seems very unlikely.


The absolute best people I know at "algorithms" are either CS theoreticians or straight-up mathematicians. They can program, perhaps even better than the average coder, but they don't have quite the same whole-system view as a systems person or professional software engineer.

If you're solving problems that require algorithmic ingenuity, you absolutely want these people. And if you can pair them with a good engineer who can integrate their algorithm into a non-trivial system, that's when the magic happens.


It's true, if you're lucky enough to have someone who can write organized code that replaces the original clever junk. https://news.ycombinator.com/item?id=1221756 (ignore the top comment)


Precisely so. I consider myself to be a pretty good coder because I'm good at keeping track of many things going on at once (I used to be a chip designer in a prior life), but I'm 100% sure I'd flunk the puzzles that companies interviewing for coders throw out in their interviews.

Good to see a realization by Google that those puzzles aren't all that useful: http://qz.com/96206/google-admits-those-infamous-brainteaser...

Google might be one company where algorithmic wizardry might be expected to be more important than most, so that's saying something...


I think the only reason algorithmic interviews are still so common is the difficulty of assessing the real skills of the candidate. I have worked with guys really good at algorithms, but that is in no way related to how good they are at building complex projects.


There's a difference between asking an interviewee to implement an obscure data structure and expecting them to know their basic algorithms and how much complexity certain data structures add. I.e., I'd expect any coder to understand the performance trade-offs between a hash table and a binary search tree, and why you'd use one over the other. This stuff matters in day-to-day backend code whenever you're dealing with a non-trivial dataset.
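The trade-off in question can be sketched in a few lines (illustrative only; a sorted list plus binary search stands in for a balanced BST, which has the same O(log n) lookup but also cheap inserts):

```python
import bisect

# Hash table: constant-time point lookups on average, but no ordering.
ages = {"ann": 34, "bob": 29, "carl": 41}
bob_age = ages["bob"]  # O(1) average
# A query like "all names between 'a' and 'bz'" would have to scan
# every key of the dict.

# Ordered structure: point lookups drop to O(log n), but range
# queries come for free.
names = sorted(ages)  # ['ann', 'bob', 'carl']
lo = bisect.bisect_left(names, "a")
hi = bisect.bisect_right(names, "bz")
in_range = names[lo:hi]
```

That is essentially the whole decision: hash table for pure point lookups, tree (or other ordered structure) when you need range scans, ordered iteration, or predecessor/successor queries.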

However writing off in-depth knowledge of algorithms as esoteric or a waste of time is disingenuous in my opinion. I've seen lots of coders write over-engineered, far too complex code, but they were usually not great at algorithms.

Studying up for a Google interview was a transforming experience for me as a programmer - I realized how much I was missing and how little I'd let myself forget the basics since college. It made me a much better coder and made it much easier for me to learn new languages and systems. Understanding CS fundamentals gives you a great platform to work off, and understanding the complex issues makes you a better coder for the day to day stuff.


When I interview, I use this question in a reverse fashion.

I ask how you would implement a b-tree index (for example). The right answer is: "I wouldn't. DB vendors have put years into getting this right."

If you want this sort of gimmick to stop, when you are in an interview, you should just have the balls to ask how it is relevant to your potential job. Their response will give you a good idea if you really want to work there.

As an interviewer, your litmus test on questions like this is that they should (a) be answerable by all your existing employees and (b) reveal the thought process of the applicant. If they don't do that, find some better questions.


That may work for you and I don't mean this as personal criticism, but hostile interview techniques like that are enough for me to not accept a position if it's offered. Likewise, if I were interviewing someone and the candidate asked hostile questions to see if I "had balls" that would be a no hire. Just my opinion, of course. YMMV.


I think you misread my post slightly.

I consider the trick questions from Google interviewers to be hostile. The only reason not to call them on it in the interview is that you want the job and think that you can get it by being nice rather than honest.

"having balls" != "hostile". "having balls" == "telling the truth"

The point I was trying to make as an interviewer is that you really want to find out what someone is going to do when you ask them to do something that they think is stupid. Because eventually you will.

As an interviewee, you really want to find out what happens if you have to challenge the management.

You can be both honest and nice. It is just harder than trying to answer the question directly.


I think my company does it right.

After a phone screen we ask candidates to complete two problems at home at their leisure. One is a simple filesystem traversal problem where we ask you to build in some runtime extensibility, and the other is a binary space partitioning problem (2D bounding box query) with a very difficult performance requirement (low 10's of ms max for 1M records).

All of our interviews are conducted by more senior software developers. Before we ever interview you we read through your code. These aren't huge tasks, but they're enough to show whether or not you use good organization. Then comes the interview...

We do use the first problem as sort of a litmus test. We figure that runtime instantiation of an arbitrary class which implements a known interface is a little on the edge of the realm of what you'll be doing day-to-day, but it's close enough that if given the problem to take home you should know how to Google the right runtime API calls to make it happen. And we're pretty flexible. If you misunderstood the extensibility requirement and when asked how you'd make it "runtime extensible" you can spit out something marginally close to "MyInterface blarg = (MyInterface)Class.forName(pluginFQDN).newInstance()," or even "I'd Google 'instantiate class by name in Java'" you're still doing more or less fine. You're doing very well if you can tell me how to add the plugin to your tool in such a way that it doesn't leak implementation details to the user either through its use or its installation procedure.

We absolutely do not use the second problem in anything close to a pass/fail manner. If you get the optimal solution, awesome. If not, no problem. Either way, let's talk about it. Let's work together. Let's have an in depth technical discussion about the various pros and cons of different techniques to solving this problem. If I say something which in the context of our discussion is by common-sense reasoning obviously wrong, will you call me out on it (preferably politely)? If I say something that you don't understand will you just nod and smile, or will you say "wait, what?" Or the single biggest problem I see surprisingly often - are you capable of in some way communicating a semi-complex idea to another person who is "skilled in the art," either verbally, on the whiteboard, or otherwise? These are the things we're evaluating on the second problem, not if you can pull a kd/quad/r/other-tree out of your ass.
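None of the company's actual interview code appears in this thread, but as an illustration of the kind of structure that beats a linear scan on that second problem, here is a sketch of a 2D bounding-box query using a uniform grid. A kd-tree, quadtree, or r-tree is the classic answer and adapts better to clustered data, but a grid is far simpler and often fast enough:

```python
from collections import defaultdict

# Brute force: test every point against the box, O(n) per query.
def query_scan(points, xmin, ymin, xmax, ymax):
    return [(x, y) for (x, y) in points
            if xmin <= x <= xmax and ymin <= y <= ymax]

# Uniform grid: bucket points by cell, so a query only visits the
# cells the box overlaps instead of every point.
class Grid:
    def __init__(self, points, cell=64.0):
        self.cell = cell
        self.buckets = defaultdict(list)
        for x, y in points:
            self.buckets[(int(x // cell), int(y // cell))].append((x, y))

    def query(self, xmin, ymin, xmax, ymax):
        c = self.cell
        out = []
        for cx in range(int(xmin // c), int(xmax // c) + 1):
            for cy in range(int(ymin // c), int(ymax // c) + 1):
                for x, y in self.buckets.get((cx, cy), ()):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        out.append((x, y))
        return out
```

With 1M points spread over a large area and small query boxes, each query touches a handful of cells rather than the whole dataset, which is the kind of thing that gets you into the low tens of milliseconds.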


Asking candidates to complete problems at their leisure is a very good idea. But recruiters can bungle this up too.

For something that's purely procedural in nature, asking for a complete and proper object-oriented solution would not be such a good idea. Likewise for forcing the candidate to use a purely procedural method for something that definitely needs OOP.


Fortunately we have good recruiters. Once they've identified a candidate they play purely a support role in the technical side of the evaluation process.

I'm not sure I follow that last statement 100%, but if you "go against the grain" all we'd care about is that you're able to reasonably articulate the "why" behind your decisions.


In the best interview I ever had, I was sat down in front of a computer and asked to complete two different classes and make the tests pass. This tests the ability to write code under a deadline and shows that the candidate knows how to use some programming tools: an editor, a testing framework, etc.


Very true. So much of the day to day job is everything except solving the very hard algorithms. But it does help to have that person around.

A separate thread (https://news.ycombinator.com/item?id=5911017) discusses Google giving up on some of the insanity. In the original NY Times (http://mobile.nytimes.com/2013/06/20/business/in-head-huntin...) article, Question 3 covers doing away with brainteasers.


I just had my ego smacked in another algorithmic interview with a large, fancy software company a few days ago (I took too long to get to a rough answer). I like to think I'm not stupid; my current day-to-day developing gig is heavy on thinking and mathematics, and I do OK with it, so it's more than just "organization". It just seems like if I spent a few hours a week entering TopCoder competitions, in 6 months I could ace these things. But what end-result skill is the interviewer really selecting for?


> It just seems like if I spent a few hours a week entering TopCoder competitions, in 6 months I could ace these things

Then why don't you just do it? Then you can forget about these algorithmic interviews once and for all and show your other, more job-relevant skills.


Absolutely.

There are times when a new algorithm (or actually a new data structure) will revolutionise the system. But taking advantage of that is an effort of reorganising the code.


I've always thought that companies like Google miss out on many talented, hard-working, passionate developers simply because those developers can't tackle the ridiculous programming questions in interviews. Then these same companies turn around and complain that there aren't enough qualified candidates.


The biggest problem here is that I don't see a clear way to assess the organizational skills of a candidate.


Have them refactor a 200 line program and then discuss with them why they did what they did.

Or given the same codebase have them add a feature and some form of testing.

There are many ways to simulate in a highly compressed manner the typical tasks of a programmer without resorting to algorithmic puzzles.


As a side note, these kinds of tasks are also pretty common at my university in the practical computer science courses.

Often, you aren't supposed to write a big thing at once, but you are given some partially working code, with some incomplete set of test cases. You then have to finish the implementation, and you are also encouraged to add more test cases.

I found it especially interesting that the test cases were incomplete on purpose - you are supposed to understand and solve the given task, not just to blindly write some code that happens to pass a given set of tests.


Try to solve with them a problem you currently have. If the candidate is adding value, it's a good sign.

It doesn't mean you shouldn't assess the candidate's technical skills.


I'd like that kind of interview. When I get asked to solve a problem that's totally artificial, the type of question that's designed for interviews and programming competitions, it throws me off track. I feel like I'm trying to read the interviewer's mind to know what criteria I'm being evaluated on. With real-world problems, there's no mind-reading necessary. It's easy to imagine what type of solution would get the job done. Plus, hearing a real problem would actually be interesting, and perhaps break some of the tension from being scrutinized.


I ask candidates a high-level design problem. So far it appears to be an effective way to identify engineers who can decompose a problem into smaller, digestible subproblems.


Ask them to write an essay.


From my experience, the ability to write good essays doesn't correlate especially well with programming ability.


I agree with you to an extent but I guess it comes down to how you judge a good essay. I wouldn't expect a literary masterpiece but if the candidate can convey their intended message in a clear, unambiguous and concise manner, I'd take that as a fairly good indicator of decent organisational skills.


Talk with them about the organizational struggles at their previous company.


The company I work for holds a summer school every year. And while I understand this does not apply to most tech job applications, I can assure you that after three weeks working together with the students we get a pretty good idea of who can organize their thoughts and divide problems into subproblems. So maybe the key would be to test technical skills for rather longer than a half-hour quiz.


That's something I closely relate to verbal expression. The first thing you need to structure is your thoughts, so I simply ask candidates to describe to me a system they've previously built. If the language is clear, then so are the thoughts, and so, most likely, is the code.


I'm afraid that assessing programming skills based on verbal expression makes you vulnerable to many errors and biases. Sure, you'll quickly filter out incompetent candidates, but along with them you'll most likely flush out some really good ones, because programming and communication are two different competences, and they don't have to be correlated. At least not positively ;) Plus, it's generally a bad idea to assess two competences on one observed behavior. You're "saving time" at the expense of clarity and precision of judgement.


Don't worry, my recruitment interviews also include algorithm and data structure quizzes.

But when I do, it's in front of a computer, and I'm sitting right next to the candidate while he types, to see his line of reasoning. Then, if he's in trouble, we talk about it, I give him hints, and see if he understands what I'm saying and fixes the code.


Are you really sure that this isn't a myth? I often have trouble expressing myself and finding the right words, but I am quite a good programmer.


Well, it's probably not equivalent, but it's a good hint (i.e. you can't express clearly something that isn't well structured in your mind). Plus, it also lets me test verbal skills themselves, which are extremely important as soon as you're working in a team.

Another way of looking at it is that "finding the right word" often means finding a good "fit" between abstract concepts, shapes, or feelings and a construct of known language items. That's precisely what you're doing as a developer.


OK, so it's based on reasoning full of assumptions that may be wrong, not on empirical evidence. For me, it's quite hard to think aloud when solving a problem, because it forces me to translate what goes on in my mind into words, which is quite distracting; I cannot fully immerse myself in the problem.


I suppose you're more of a "tool" developer than a "business process automation" one. In the latter role, you very often have to talk through the problem with your customer.

But it is an assumption indeed... only it is based on my personal experience.

As for the comment mentioning Google developers, the ones I see are good enough verbally to talk about their jobs at Google I/O.


That you're "quite a good programmer" might be a myth too :P


I don't think so, because I've developed some quite popular software and done well in programming contests.


Programming contests show nothing about your organizational skills, which is what companies should be looking for.

See https://news.ycombinator.com/item?id=5911235

Mathematicians perform very well on programming contests, but they are usually terrible software engineers.

Anyways, I was just kidding :)


I think that mathematicians don't do well in software development because they simply have much less experience than other devs :) I think that what's most important in software engineering is experience, motivation, intelligence, common sense, and good taste (aesthetics).


Google is full of engineers who did well in programming contests (they even run their own contest), and I don't think they are all terrible software engineers.



I don't think the problems in algorithmic contests are in the category of brainteasers. As far as I know, Google still relies heavily on knowledge of algorithms and problem-solving ability to hire its engineers.


Fair point. I still think there are two types of programmers: problem solvers and system architects.

The former (e.g. a data scientist) will be good solving a problem but bad building a big architecture. The latter, the other way around.

I guess the question is which kind of programmer you are and which skills to look for.


I think there are probably more than just two types of programmers.


"organizational skills". aka politics :/



