Facial Recognition and Discrimination
And we already know that it's inaccurate, racially biased,
and probably unconstitutional.
Facial Recognition is already happening.
Yes, we know: Facial Recognition technology sounds like a bad episode of Black Mirror.
But police departments around the country are already using it.
And since legal scholars have already extensively shown that police use of facial recognition is inaccurate and discriminatory (and plenty of lawyers and judges worry that facial recognition violates privacy and free speech rights),
it's no surprise that advocates
from the left and the right
(and major figures like IBM and the Pope)
have called for its regulation
(or banning it outright).
Alongside the efforts of other activists and organizations, the IDH advocated for legislation that would prevent the Minneapolis Police Department from using facial recognition software. Ultimately, the ban passed!
While this is a huge victory for digital rights, there is presently no nationwide regulation of biased facial recognition software. Be encouraged, and remember:
The Fight is Not Over.
Part of ACLU MN/Stanford Race and Technology coalition that led the successful ban on racist facial recognition tech in Minneapolis.
Facial Recognition Harms
Facial Recognition raises concerns about human rights violations.
Facial recognition technology allows the police to identify and surveil people (a) without their knowledge, and (b) without time or location limits, meaning we live in a new era of potentially limitless, nonstop surveillance.
The chilling effect is a legal term describing how police surveillance suppresses people's First Amendment right to free speech.
Constant surveillance discourages people from sharing their opinions, associating with those who hold differing views, and participating in the political process, leading to a less vibrant and more tense socio-political landscape.