How to avoid bias while measuring skills

This blog post is based on the sixth episode of the data-driven recruiting podcast hosted by CodeSignal co-founders Sophia Baik and Tigran Sloyan. You can find and listen to this specific episode here or check out the video version embedded below.
One of the main reasons to go beyond resumes is to eliminate bias – while still measuring skills. It sounds straightforward enough, but good intentions don’t always end the way we think they will. Instead, we find that when companies begin to focus on eliminating one form of bias, they can inadvertently introduce a new one. Not exactly a slam dunk. Let’s walk through how this can happen.

Your team has acknowledged that relying exclusively on resumes or pedigrees isn’t going to work anymore. It’s time to start measuring skills directly and implement assessments. So far, so good. The problem shows up when you begin to create those assessments. Say you want to evaluate a candidate’s ability to write code by asking a question framed in a particular context. That context can very quickly introduce a bias against a particular demographic.

This is a common problem, and it isn’t limited to recruiting. In the world of education, the SAT is a standardized assessment designed to measure various skill sets. Although the test is widely accepted, its questions have been shown to reflect bias: a question framed around a subject like football, for example, instantly gives an advantage to people familiar with that sport.

Your best bet for any assessment is to avoid topics that certain groups are more familiar with than others. Which means the most obvious team to help you create an assessment (your own) may be part of the problem. Companies often turn to veteran staff for input on standardized questions, but those long-term employees will write questions based on what they know to be true, not based on assessment-design expertise or equal employment guidelines. The problem is compounded by the current makeup of a particular group. Take software engineers: we know that group is already heavily skewed toward a specific demographic.
It stands to reason that when you ask the current demographic to create questions meant to attract new people, those questions will be framed by that group’s viewpoint. That means you’re reinforcing the vicious cycle of recruiting and hiring the same kinds of people. Even with the best of intentions, your team may not be equipped or have the necessary training to create an unbiased assessment. Ideally, you want to work with professionals who have designed assessments before and can help you avoid the landmines. They’ll know how to limit context and maintain EEOC compliance, ensuring you won’t be favoring one demographic over another.

Bias has many shapes and forms, and it’s easy to replace one type with another. Focus on getting proper training or support, and work with third-party experts to ensure you can eliminate bias once and for all.