
Algorithmic Ethics

IDH TV's Algorithmic Ethics collection is designed to get you up to speed on the latest video and multimedia assets the IDH has to offer on the subject.

This page offers a "crash course" on algorithms and the basics of how the IDH combats them, as well as information and insights into specific categories of algorithms. (We've also included a couple of our roundtable and town hall discussions, where the IDH and its partners discuss specific algorithms and algorithmic concerns.)


IDH TV Roundtables and Town Halls

Watch the IDH (Prof. Tom Freeman) and Coded Bias Director Shalini Kantayya discuss algorithmic bias and reform on Nebraska Public Television/PBS.

Full video is to your left.  Annotated video clips are here.

Stanford Race and Tech Fellow Elizabeth Adams (who is also on Forbes' 15 AI Ethics Leaders List) was kind enough to join the IDH for a roundtable on algorithmic ethics and tech company compliance.

Those videos -- and clips with annotations -- are here.

AI Ethics 101

Though a certain major movie franchise envisioned an earth that couldn’t stop the rise of robot overlords, the IDH has hope that we can stop this mechanized insurrection by calling for humans, rather than algorithms, to analyze other humans.

Not convinced algorithms are a problem? Still sound like science fiction?


Well, let us introduce you -- gently -- to how algorithmic discrimination is already becoming a way of life in employment decisions.

(And to Social Credit Scores, which is where we might be headed if we let algorithms continue unchecked.)

Here's how we describe the inherent problems with algorithms in our national pilot curriculum (for high school students) with the Anti-Defamation League and Bites Media.

A sneak peek at our lesson plan is here.

Want a hint at how the IDH gets students and communities across the country discussing algorithmic ethics (in a bi-partisan, multi-faith, and cross-cultural way)?

And here's the proof that this method is working in classrooms across the country.

IDH TV Infomercial on Algorithmic Factual Unreliability

High school lesson plan (used by ADL) is here.  

Complete teacher resources are coming soon!

IDH TV Infomercial on Algorithmic Bias

High school lesson plan (used by ADL) is here.  

Complete teacher resources are here.


IDH TV Infomercial on Algorithmic Storytelling Issues

High school lesson plan (used by ADL) is here.  

Complete teacher resources are here.

Facial Recognition

Health-Related and Medical Algorithms

Unfortunately, these algorithms and their biases infiltrate healthcare as well. Here, students explore how these algorithms and systemic failures affect the health care people receive.

Predictive Policing

In an ongoing effort to be more efficient, many police departments seek algorithmic tools to assist them. Many of these tools, however, operate with unintended bias and questionable efficacy.

Housing Algorithms

Many see algorithms as the way of the future; so many mundane tasks can be completed automatically and without the need for human interaction. That can't possibly be a bad thing, can it?

Employment Algorithms

Jobs are pivotal to our lives, but algorithms used for employment -- both to screen candidates and to advertise to potential candidates -- often perpetuate the very discriminatory hiring practices that companies hope to rid themselves of.


Politician Interviews

IDH interview with City Council Member Steve Fletcher
