Instead of TSA agents verifying passengers’ identities, several airports are testing facial recognition software. A camera snaps a photo of the traveler, and AI technology uses biometrics to compare the facial features in that photo to the passenger’s photo on their ID.
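For readers curious about the mechanics, this kind of one-to-one verification typically works by converting each photo into a numeric "embedding" and measuring how similar the two embeddings are. The sketch below is a minimal illustration of that comparison step in Python; the embedding values, the similarity measure, and the match threshold are all placeholder assumptions for illustration, not a description of the TSA's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def faces_match(live_embedding: np.ndarray,
                id_embedding: np.ndarray,
                threshold: float = 0.6) -> bool:
    """1:1 verification: does the live camera photo match the ID photo?

    In a real system, each embedding would come from a trained face-
    recognition model applied to the photo, and the threshold would be
    tuned to trade off false accepts against false rejects. Both are
    placeholders here.
    """
    return cosine_similarity(live_embedding, id_embedding) >= threshold

# Illustrative stand-ins for embeddings a face model might produce.
live = np.array([0.12, 0.85, 0.33, 0.41])
id_photo = np.array([0.10, 0.88, 0.30, 0.45])
print(faces_match(live, id_photo))  # True if similarity clears the threshold
```

How well the embedding model and threshold perform across different faces is exactly where the bias concerns described below come in: if the model makes more errors on some groups of faces, those travelers are more likely to be wrongly rejected at the checkpoint.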
The pilot project has expanded to 25 airports across the country, including Logan Airport in Boston.
Dr. Joy Buolamwini, founder of the Algorithmic Justice League and an expert in AI bias, told Greater Boston that facial recognition software has a built-in bias, often failing to identify people with darker skin tones.
"The National Institute of Standards and Technology released research showing that not just dark African-American faces, but also Asian faces were up to 100 times more likely to be failed by these systems than the faces of white individuals,” said Dr. Buolamwini. “There are also studies that show age bias and gender bias as well.”
With more people taking to the skies, the TSA claims facial recognition technology will make airport security screening faster and safer.
But Washington Post technology correspondent Geoffrey Fowler said the TSA has refused to release any data proving these claims.
“So, we really have to at this point just take their word that it is more accurate than people and speeding things up,” said Fowler.
While the U.S. is increasing its use of facial recognition software, the European Parliament recently passed a draft law, the AI Act, which restricts the use of this type of software.
As facial recognition technology’s use becomes more widespread in the U.S., Fowler worries we will become more like China, which has embraced the use of this technology in public places and for policing.
“The TSA says that the images it collects are not shared with other branches of government and it does not use them for law enforcement,” said Fowler. “But the question always is, what will be the next use, because we know it is never limited to just trials.”
Although the program is currently voluntary and travelers can opt out and have their ID read by an agent, both Dr. Buolamwini and Fowler have heard stories of passengers who faced additional scrutiny after declining the screening.
“The government has already made it very clear that the path and the roadmap is to make what we are seeing as a trial or a pilot mandatory,” said Dr. Buolamwini. “If you don't want this to be the default option, this is the time to have your voice heard. … This is the time to resist.”