Recently TechCrunch published the article On Secretly Terrible Engineers. I’ve been thinking about it and digesting it for several days, because there are some things in it I agree with, but a lot that I don’t, and I’ve been trying to work out how to phrase my objections. David Harney does an excellent job of deconstructing and refuting the article on his blog, and I recommend reading it. I largely agree with what David’s written so I won’t rehash it, but I do want to talk about some issues I have with the article as a whole.
Let me summarise what I took from the article: “Stop wasting time asking irrelevant technical questions in interviews; if the candidate has worked in the industry for more than a handful of years, they must know what they’re doing.”
I’m going to address this in two parts. First, “Stop wasting time asking irrelevant technical questions in interviews.” OK, this I agree with. Too often when interviewing candidates we revert to asking them to implement FizzBuzz or Quicksort or some other algorithm that can easily be looked up and memorised before the interview. Now don’t get me wrong, if your company is developing a commercial FizzBuzz application or you require your engineers to repeatedly implement sorting, then by all means ask these questions, as they’re key to your business. But if you’re not doing either of those things, why not ask something more relevant to the problems your engineers face every day?
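To be clear about just how memorisable these exercises are, here’s a minimal FizzBuzz sketch in Python (the function name and shape are my own choice, not anything from the article):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings.

    Multiples of 3 become "Fizz", multiples of 5 become "Buzz",
    multiples of both become "FizzBuzz"; everything else is the number.
    """
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out


print(fizzbuzz(15))
```

A candidate can commit that to memory on the bus ride in, which is exactly why it tells you so little about how they’ll handle your actual codebase.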
Let me tell you about the best technical interview I’ve taken part in. A few years back I interviewed at an ecommerce company. As part of the interview process I was required to write some code, but instead of one of the usual contrived exercises the company provided me with a small sample of their production code and asked me to implement a minor feature. I then had to defend my solution in a code review with two of their engineers. This was a great test. For me, it gave an insight into their code base and engineering team, how the code was structured, and the sort of things I could be working on as an employee. For the company, the test showed not only whether I could code, but also whether I understood the requirements, whether I asked questions to uncover the implicit requirements that weren’t in the spec, how I went about designing a solution and, ultimately, whether I would be a good fit with their team. And all that from a problem that wasn’t any more complex than FizzBuzz or Quicksort. I don’t know whether the root cause of the standard contrived tests is laziness or lack of imagination, but it isn’t helping anyone. If coming up with a realistic test is proving hard, pick a feature from your backlog or issue tracker, or better yet use a feature you’ve just implemented…it’ll be something you’re familiar with, and you’ll be better able to judge how someone else approached the problem.
Now the second part, “If a candidate has worked in the industry for more than a handful of years they must know what they’re doing.” This is the part I have real problems with. The assumption that someone must know what they’re doing because they’ve been in the industry for several years is just wrong. Why is it wrong? Well, for starters, assuming that time spent doing equates directly with ability doesn’t stack up. True, you would expect that someone with 5 years’ experience would know more than someone with 1 year, but it depends on context. If the person with 5 years’ experience has been building web-based CRUD applications and the person with 1 year has been building desktop applications, but you’re building a mobile app, then who is the more competent? You don’t know; you have to test them. OK, you say, but what if you are building a web-based CRUD app? Surely then the first person would be the more competent? Maybe, but you still have to look at what they’ve done and what they’re able to do, and you can only do that by testing them.
The second problem is how we define competence. In the article the author seems to equate competence with the ability to write code, the argument being that since the candidates have been in the industry for so long they must be able to write code, and therefore they must be competent. But this argument is flawed. Writing code isn’t the end goal of what we do; it’s an artefact of how we do it. What I mean is that what we do is solve problems using computers, and sometimes the way we do that involves writing code. Equating knowing how to code with technical competence confuses the medium with the message; it’s equivalent to saying that knowing how to write English is sufficient to write great literature…it’s certainly a prerequisite, but there’s more to it than that. Similarly, the ability to code is table stakes in our industry, but it’s just a starting point.
So if coding is just a start, then what is a fair way to judge someone’s ability? Frankly, it’s all the other stuff that goes on around the code: knowledge of design patterns and architectural patterns, being able to design a solution to a given problem and, most importantly, knowing the trade-offs you’re making in your solution, what the alternatives are, and being able to explain why those trade-offs are the right ones for your solution. What it really comes down to is that competence isn’t about knowing any particular language or framework; that’s mostly just syntax which, if you have the right mental models of how everything works, you’ll pick up quickly anyway. Competence is about demonstrating that you have those mental models and can apply them…and that’s what you need to determine in a technical interview.