Technical interviews are a notoriously stressful part of the job-application process. Is there anything in a candidate's background that will predict how well they'll do when confronted with a whiteboard and a difficult problem?

Interviewing.io is an organization that helps tech pros practice technical interviews (and interview anonymously with top tech firms). Aline Lerner, its co-founder and CEO, recently crunched data from those interviews and discovered that certain aspects of a tech pro's résumé may indeed influence interviewing outcomes.

Of course a candidate's background affects their technical-interviewing skills, you say; that's not a startling conclusion to reach. But Lerner did discover some startling things, at least if you operate under the assumption that certain core attributes of most tech pros' careers, such as total years in the tech industry or an advanced degree, can influence how well they perform when asked to solve a confounding conundrum on the clock.

Lerner found that only three attributes were "statistically significant": attending a top school, working at a top company, and taking classes on Udacity and Coursera. Other factors, most notably possession of a Master's degree, years of experience, and whether the pro had founded a startup, didn't seem to matter when it came to interview performance.

Nor is that the end of Lerner's conclusions. "For people who attended top schools, completing Udacity or Coursera courses didn't appear to matter," she wrote. "However, for people who did not, the effect was huge, so huge, in fact, that it dominated the board." The educational benefits of MOOCs (Massive Open Online Courses) aside, she thinks that people who participate in online learning tend to be "abnormally driven," which can help when the time comes to interview for a job. Lerner's blog post is well worth reading, especially the parts where she hypothesizes about why Master's degrees and years of experience have little effect on interview performance.

But are technical interviews actually a good barometer of a candidate's ability to slot into a particular workplace? That's a topic of considerable debate in some circles. In June 2015, Max Howell posted a widely shared tweet that neatly summed up many of the issues that developers have with whiteboarding problems in front of an interviewer:

Google: 90% of our engineers use the software you wrote (Homebrew), but you can't invert a binary tree on a whiteboard so fuck off. — Max Howell (@mxcl) June 10, 2015

Other developers have expressed their own reservations about whiteboarding. "How many people can actually write BFS [breadth-first search] on the spot, without preparing for it in advance?" Sahat Yalkabov wrote in a widely circulated Medium post earlier this year. "Again, I am not a recent college graduate anymore who has breadth-first search memorized, and aren't my open-source projects and my prior work experience at Yahoo enough to show that I can write code and deliver software into production?" Developers at work rely on Google, GitHub, and many other resources and tools to build their code; for that reason alone, asking them to quickly work through a problem on a whiteboard isn't an accurate predictor of how they'll perform once they're seated in a cubicle and asked to build a project over many weeks or months.

Despite those doubts, however, it seems that tech companies will continue to lean on technical interviews to select candidates. And as long as the practice continues, factors such as schooling will continue to influence how different candidates perform.
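For readers unfamiliar with the problem from Howell's tweet, "inverting" a binary tree means mirroring it by swapping every node's children. The sketch below is one common recursive solution, offered purely as an illustration; it is not presented as the answer Google expected, and the tiny sample tree is hypothetical.

```python
# Hedged sketch: mirror a binary tree by recursively swapping children.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Swap the left and right subtrees of every node; return the root."""
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node

# Hypothetical three-node tree: 1 with children 2 (left) and 3 (right).
root = invert(Node(1, Node(2), Node(3)))
print(root.left.value, root.right.value)  # 3 2
```

The whole exercise fits in a handful of lines, which is part of why the tweet resonated: the difficulty lies in recalling it under pressure, not in its inherent complexity.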
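The breadth-first search Yalkabov mentions is similarly brief once memorized. As an illustration only, here is a minimal BFS over a hypothetical adjacency-list graph (the graph and node labels are made up for the example):

```python
# Hedged sketch: breadth-first search over an adjacency-list graph.
from collections import deque

def bfs(graph, start):
    """Return the nodes reachable from `start`, in breadth-first order."""
    order = [start]          # visit order to return
    seen = {start}           # nodes already enqueued
    queue = deque([start])   # frontier of nodes to expand
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                order.append(neighbor)
                queue.append(neighbor)
    return order

# Hypothetical graph: a -> b, c; b -> d; c -> d.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs(graph, "a"))  # ['a', 'b', 'c', 'd']
```

Again, the point of the quote is not that the algorithm is long, but that reproducing even short textbook routines from memory on a whiteboard measures something different from day-to-day engineering work.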