Algorithmic Discrimination in Employment
Employers use algorithms to help make all manner of employment decisions. Algorithms determine which people see which job postings: an algorithm's guesses about a person's race, gender, age, or other demographic traits decide which advertisements they are shown.
Once a person applies for a job, algorithms scan resumes, CVs, cover letters, and writing samples to determine which applicants are worthy of interviews.
When a person interviews for a job, algorithms may analyze their facial features, expressions, gestures, body language, and even voice to determine whether they are a good fit for the position and the culture.
Though algorithms are often assigned these tasks, they aren't inherently "better" at them than a human is; they simply make the same human decisions (filled with potential prejudices and biases) faster. In fact, the algorithms involved rarely look to the future when selecting a candidate: they're designed to look at a data set -- a record of what happened in the past -- and replicate the decisions it contains. This is a real problem. Many workplaces have long histories of prejudice against women, people of color, and other minorities, so these algorithms often (inadvertently) re-enact those prejudices.
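To make the mechanism concrete, here is a minimal sketch in pure Python with entirely invented data. It is not any real vendor's system; it simply shows that a model trained only to imitate historical hiring decisions will reproduce whatever pattern those decisions contain, with no notion of merit involved. The feature `school_a` stands in for any proxy attribute correlated with a protected group.

```python
# A minimal sketch (hypothetical data) of how a model trained to imitate past
# hiring decisions reproduces the bias baked into those decisions.

from collections import defaultdict

# Historical screening records: (years_experience, attended_school_A, hired).
# In this invented history, equally experienced candidates from "school B"
# were rarely hired -- the prejudice lives in the labels themselves.
history = [
    (5, True, True), (5, True, True), (4, True, True), (6, True, True),
    (5, False, False), (5, False, False), (4, False, False), (6, False, True),
]

def train(records):
    """'Learn' the past hire rate for each group -- nothing more."""
    totals, hires = defaultdict(int), defaultdict(int)
    for _, school_a, hired in records:
        totals[school_a] += 1
        hires[school_a] += hired
    return {group: hires[group] / totals[group] for group in totals}

def predict(model, school_a):
    """Recommend an interview when the group's historical hire rate exceeds 50%."""
    return model[school_a] > 0.5

model = train(history)
# Two candidates with identical experience get opposite recommendations,
# purely because the model replicates the historical pattern.
print(predict(model, school_a=True))   # -> True
print(predict(model, school_a=False))  # -> False
```

Nothing in the code mentions race or gender, yet the outcome diverges by group anyway: the past decisions are the only training signal, so replaying them faithfully is exactly what "accuracy" means to this model.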
Why is something so incredibly important to humans -- a job that could mean the difference between being comfortable and struggling to survive, or perhaps the fulfillment of a lifelong dream -- delegated to algorithms that aren't able to witness and appreciate a person's story?