Jeremy Kahn

Making LeetCode interviews more inclusive

Something that has plagued me throughout my career is LeetCode-style interview challenges. For whatever reason, most of the tech industry has converged on this format as the canonical litmus test for whether an individual can code, or whether they are an Imposter Programmer. This is a shame, because LeetCode-style interviews are deeply flawed:

  • An astonishing number of false negatives. See: All the jobs you were qualified for but missed out on simply because you choked in the coding challenge.
  • Even more false positives. See: All the engineering coworkers you've ever had who were difficult to work with.
  • They are exclusionary to the point of being discriminatory against neurodivergent people. (This is the one that I'm personally most bothered by, because I am Autistic and therefore considered neurodivergent.)

Zooming out a bit, we all deserve better than this ineffective form of interviewing. This post will attempt to illustrate the many flaws with LeetCode interviews and offer some simple and more effective alternatives to use instead.


When I talk about LeetCode-style challenges, I'm talking about live-coding challenges where you're given a prompt like this, on the spot:

/**
 * Given a string s, return the longest palindromic substring in s.
 * @param {string} s
 * @return {string}
 */
var longestPalindrome = function(s) {

};

(This is a verbatim example from LeetCode.)

It's not an unreasonably difficult challenge, and I estimate that nearly all professional programmers could solve it given the necessary resources and time. The issue is that interview candidates are given a strict time limit (typically 45-60 minutes) and are actively scrutinized while they work. When they complete one challenge, they are given another, and another, until time is up. The experience is every bit as nerve-wracking as it sounds, and the sheer anxiety causes many people to freeze up and appear as though they genuinely can't code. That is typically not the case, though: a candidate who genuinely couldn't code probably wouldn't have gotten far enough in their career to land the interview in the first place.
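For context, this challenge is entirely solvable with a bit of quiet thinking time. One common approach, sketched here purely for illustration (and not as the one "right" answer an interviewer might be expecting), is to expand outward around each possible palindrome center:

```javascript
// Expand-around-center sketch: O(n^2) time, O(1) extra space.
var longestPalindrome = function (s) {
  let start = 0;
  let maxLen = 0;

  // Grow outward from left/right while the characters still match.
  const expand = (left, right) => {
    while (left >= 0 && right < s.length && s[left] === s[right]) {
      left -= 1;
      right += 1;
    }
    // The loop overshoots by one on each side, hence the - 1.
    if (right - left - 1 > maxLen) {
      maxLen = right - left - 1;
      start = left + 1;
    }
  };

  for (let i = 0; i < s.length; i += 1) {
    expand(i, i);     // odd-length palindromes centered on i
    expand(i, i + 1); // even-length palindromes centered between i and i+1
  }

  return s.slice(start, start + maxLen);
};
```

Arriving at this calmly on paper and actually typing it out flawlessly while a stranger watches a countdown are two very different experiences.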

The case for having these sorts of trials is typically "to see how the candidate thinks." This focus on the "how" obscures the "what" of the end result. Most companies are not in the business of selling how their employees think. They are in the business of selling what they produce. So wouldn't that be the better thing to focus on in an interview?

Interview exclusion

LeetCode-style interviews, by design, filter for a very specific type of mind: One that can performatively produce complete and working code on the spot. This would be completely acceptable if that's what an engineering job actually entailed, but it isn't. I've met many, many tremendously talented engineers from many walks of life in my career, and none of them work this way. Actual, professional code is generally produced through some mixture of quietly mulling over a problem, maybe going for a walk, talking it out with a coworker, sketching out some ideas, and, most importantly: Iterating over the course of hours, days, or even weeks, depending on the scale of the problem. No engineer regularly produces production-grade work in under an hour in their day job.

Some among us can summon the energy and the strength to conjure working code into existence within a time limit and under active scrutiny, but that is only a subset of the engineers worth hiring. The whole also includes introverts, people with social anxiety, people who didn't sleep well the night before the interview because they were nervous about it, people with ADHD, Autistic people, and many, many more. The LeetCode filter prevents these qualified people from passing the interview, and everybody loses out because of it.

To expand a bit on the Autism point, since I can speak to that personally: The complexities of living with Autism, and of working with those who do, are beyond the scope of this post. I also want to avoid speaking for all Autistic people, since it is an extremely varied and diverse group. However, the abbreviated version of what I want to say is that an Autistic interviewee being actively judged by a neurotypical interviewer is inequitable to the point of being exclusionary. When I'm trying to navigate a LeetCode-style challenge, I am also spending many mental cycles on distracting thoughts like the following:

  • Am I talking enough?
  • Am I talking too much?
  • Am I passing as normal?
  • Am I doing this the way the interviewer expects it to be done, or the way that makes sense to me? Which one should I be doing?
  • Should I be applying academic concepts for the sake of showing that I know them, or should I keep things simple?

Only the remainder of available mental cycles are actually going towards solving the problem, so you're only seeing maybe 30% of my actual coding ability. Many people might read this and say "just relax and focus on the challenge at hand." That is like telling a paraplegic to just believe in themself and stand up. This difficulty is the very essence of Autism and it cannot be willed away.

A better alternative

Given the many flaws of LeetCode-style interviews, how can we do better? I've got good news: It's pretty easy. The first thing you need to do is take a step back and consider the sorts of characteristics you're actually hiring for. I'll make the bold assumption that the engineers on your team aren't actually solving contrived code puzzles in their day-to-day (and if they are, feel free to stick with LeetCode interviews and please never contact me for a job). Instead, list out the technical duties they are actually performing to make the company successful. It hopefully looks something like this:

  • Understanding, improving, and extending business logic
  • Participating constructively in code reviews
  • Collaboratively problem-solving practical technical issues ("The API is experiencing a performance degradation, what do we do?")
  • Finding and fixing bugs

These are critical qualities that can be teased out of a much wider pool of candidates within an hour, simply by recreating relevant scenarios in the interview. Rather than handing interviewees an unimplemented function, show them your team's code and see how they might improve it. You'll learn what you need to know about them, and it will help them make a more informed decision about whether they and your company are a match. Let the candidate make a modest code change and see how they go about leading a Pull Request to get it merged. It's okay if you're working out of a separate repo that's meant specifically for interviewing. In fact, that's the best way to create a consistent and fair challenge for every candidate.

"This sounds hard to do."

It isn't, really. It requires a bit of upfront investment and experimentation to get a balanced and streamlined process, but hiring good people is inherently challenging and you need to put some work into it. Having a more thoughtful and effective interviewing process will save you money in the long run by setting up the candidates you actually want to hire for success.

If you simply must stick with LeetCode challenges, provide candidates the specific problems they will be given a week in advance instead of on the spot. Allow them to come to the interview with the solution already implemented. This will set candidates up for success by allowing them to solve the problem their way, on their schedule, and in whatever way works best for their unique situation. You can then collaboratively analyze the solution and extend it via a pairing session that they lead. This is not an unreasonable advantage, it is an accessibility accommodation. The great thing about accessibility accommodations is that they make things better for everyone. Having ample time to think about a solution ahead of time would be helpful in setting all candidates up for success. And in this case, it changes the nature of the interview to be more of an exercise in collaboration rather than a gate designed to keep people out of the company.

"But what if the candidate cheats by getting help?"

So what if they do? Getting outside assistance ahead of the interview won't help a candidate during an interactive review and pairing session on the code they provide. Also, what's the harm in getting some help? Is there a rule at your company that forbids engineers from getting assistance to develop a solution? As an interviewer, you should be looking for signals that indicate a candidate can produce results by any means necessary, be receptive to feedback, and iterate based on evolving requirements.

"It sounds like you're just bad at coding interviews and are taking it out on the world via a blog post."

You're right, I am bad at coding interviews. I will never be good at them, because I have a disability that makes them disproportionately difficult for me. Funny thing about that, though: I am actually a very capable programmer. I'm not egotistical enough to believe I am exceptional, but speaking objectively, my coding ability has never been called into question on any job I've ever had. I've never been too poor a programmer to get my job done well. Minimally, I am a "good enough" programmer.

If you're asking how I ever got a job given how bad I am at coding interviews, it's because I tend to get job offers from the exceptional companies that don't have LeetCode-style interviews as part of their hiring process.

We all deserve better

There are many, many exceptional programmers who struggle tremendously with LeetCode-style interviews. Max Howell, the creator of the mission-critical package manager Homebrew that millions of developers depend on, famously got rejected by Google (which itself uses Homebrew) because of such an interview.

There is something deeply wrong with this way of interviewing. It is depriving companies of excellent and diverse engineers, and it is making people miss out on good jobs that they actually deserve. It's time to go back to the drawing board (but not the whiteboard) on these ineffective hiring practices.
