Artificial intelligence (AI), which is being rapidly deployed and adopted, is producing some unexpected — and often harmful — results. Coded Bias, the latest film from Shalini Kantayya, digs into how data discriminates. It also showcases the women leading the charge to change the applications of AI.
“There are rules around taking people’s DNA and fingerprints, but there are no rules about obtaining biometric photos and keeping them in a database,” says Kantayya.
The film, broadcast on GBH 2 during Women’s History Month, features MIT Media Lab researcher Joy Buolamwini, whose groundbreaking research revealed that facial recognition software did not accurately identify darker-skinned faces or the faces of women. Cathy O’Neil, author of Weapons of Math Destruction, discusses her finding that some algorithms actually reinforce old biases and power dynamics.
The AI that drives facial recognition technology learns to identify patterns from immense data sets. The problem, as the film shows, is that those data sets reflect decades of inequality.
Without safeguards, AI can propagate systemic bias, says Kantayya. Yet AI is already being used to decide who gets hired, who qualifies for loans and who gets into college. It is used to compare the faces of city pedestrians to watch lists of suspected criminals, and to market education scams to residents of low-income neighborhoods while pitching SAT prep to higher-income groups.
“These technologies have not been vetted for racial bias or gender bias or discrimination or that they won't do unintended harm,” says Kantayya.
The film tells the stories of people who have fought back: an award-winning teacher who was at risk of losing his job because a mathematical model, not classroom visits, was being used to evaluate his performance, and community members who successfully banned facial recognition technology because it wrongly profiles people and violates civil liberties.
While we refer to it as “artificial intelligence,” says Kantayya, “machines that don't have morals or ethics or empathy cannot be described as intelligent. We are outsourcing our decision-making to machines.”
The film is available now at codedbias.com and will be broadcast on GBH 2 on March 22 at 10pm and March 28 at 6pm, and on GBH 44 on March 27 at 2pm and March 28 at 8pm.
Learn more about Coded Bias here.