#NEIGHBORSNOTNUMBERS

Algorithmic Discrimination in Housing


So why is the public still asleep on the fight against facial recognition?
And why haven't programmers and policymakers fixed it?

The IDH has some ideas. And solutions.

(1)
There are so many legal and ethical problems with facial recognition that the public (and policymakers) can't keep track.

(Which is why the IDH created a method for categorizing them that even high school students can understand.)

(2)
While facial recognition technology's proven tendency to racially discriminate is a huge problem, it isn't the only one.
(And most advocates address only the issue of algorithmic bias.)

(Which is why the IDH method also categorizes ALL constitutional problems
with police use of facial recognition software.)


(3)
By addressing ALL of the legal and ethical problems with facial recognition in criminal contexts, we can build bipartisan coalitions to regulate the technology.


Facial Recognition
is already happening.

And we already know
that it's
inaccurate,
unjust,
and probably
unconstitutional.




Yes, we know: Facial Recognition technology sounds like a bad episode of Black Mirror. 


But police departments around the country
are already using it. 


And since legal scholars have already

-- extensively --

proven that police use of facial recognition is

inaccurate
and discriminatory

(and plenty of lawyers and judges
are worried that facial recognition violates

privacy and free speech rights)

it's no surprise that advocates
from the left and the right

(and major figures like IBM and the Pope)

have called for its regulation (or an outright ban).