The technology giant Google is known for hiring the best talent in the market, and equally known for its wacky interview questions. For years, Google interviewers used these seemingly random questions to "help" them hire great talent. Turns out, though, these questions didn't help at all.
“We found that brainteasers are a complete waste of time,” Laszlo Bock, senior vice president of people operations at Google, told the New York Times. “They don’t predict anything. They serve primarily to make the interviewer feel smart.”
Google's interview questions have been all over the internet. We found a list compiled by Lewis Lin, a Seattle-based job coach. It was fun to read through the questions; we've pulled out one of our favorites here:
Explain the significance of "dead beef"
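For readers curious about that one: "dead beef" refers to 0xDEADBEEF, a classic hexadecimal magic number. Because every letter in the phrase is a valid hex digit, the value spells a readable word, and it has long been used to fill uninitialized or freed memory so that bugs are easy to spot in a debugger. A quick sketch of what a candidate might say:

```python
# 0xDEADBEEF: all characters (D, E, A, B, F) are valid hexadecimal digits,
# so the 32-bit value spells an English phrase. Debuggers and allocators
# historically filled memory with it to make stray pointers obvious.
value = 0xDEADBEEF

print(value)              # its decimal equivalent: 3735928559
print(hex(value))         # back to hex: 0xdeadbeef
print(value.bit_length()) # fits exactly in 32 bits
```

Other hex-spellable values like 0xCAFEBABE (the Java class-file magic number) work the same way.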
Google took data from tens of thousands of interviews and compared how candidates scored in those interviews with how the candidates who were hired actually performed on the job. They looked for correlations between scores on specific questions, and between the overall scores given by specific interviewers, and later job performance, to see whether there were people at Google who were particularly good at hiring.
As mentioned before, Google found zero correlation between how well a candidate scored on brain-teaser questions and how well they performed on the job.
They also found that only one interviewer was consistently good at predicting whether a candidate would perform well. Think about that for a moment. Across thousands of interviews, Google learned that all but one of their interviewers were not predictive: you could essentially flip a coin.
One of the biggest takeaways from Google's story is that you can actually know, with some degree of certainty, which interview questions are working and which are not. Many hiring managers continue to use the same interview questions over and over, or pick new ones based on recommendations, a recent article they read, or how they're feeling on a given day. Google has shown that we can analyze the questions we ask and learn how predictive they are.
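The analysis behind this is not exotic: at its core, it is a correlation between interview scores and later performance ratings for the people you hired. A minimal sketch, using entirely made-up scores (the data, scale, and variable names here are illustrative assumptions, not Google's actual methodology):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical data: how hired candidates scored on one interview
# question vs. their later on-the-job performance ratings (1-5 scale).
question_scores = [3, 4, 2, 5, 4, 3, 5, 2]
performance     = [2, 5, 3, 4, 4, 2, 5, 3]

r = pearson(question_scores, performance)
print(f"correlation: {r:.2f}")
# Values near +1 or -1 suggest the question is predictive;
# values near 0 suggest it is noise, like Google's brainteasers.
```

Running the same calculation per question, or per interviewer, is how you would spot the one interviewer who actually predicts performance.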
There are lots of other specific lessons that can be gleaned from Google's experience, but we'll call out three:
What assumptions about what makes a good hire are built into your hiring process? How might you test those assumptions? What data do you need to start collecting to run the required analyses? Stay tuned for a post on Journeyfront's tips for running your own analysis on the effectiveness of your hiring process.