President Trump claimed last week that Google searches are biased against him and perhaps should be regulated.
There's no evidence to support the president's accusation, but we do know that Google searches are biased in other ways.
We’ve seen several examples of the way Google’s search algorithms reflect and reinforce racist stereotypes. And Google is hardly alone. Algorithms increasingly determine what we see and have access to online and in the wider world.
These algorithms are often secret, unregulated, and — intentionally or not — they are often racist, sexist, or otherwise prejudiced.
Cathy O’Neil is a mathematician who has worked in academia and the finance sector. She’s written a book called “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.”
“I would argue that the internet itself is set up to classify us by gender, race, class, education level, consumer power, all that kind of thing,” she said. “The reason it's set up that way is because … we exchange our private data for services like e-mail and web browsing.”
She argues that our entire online environment is a prejudiced machine, and that these algorithms sometimes help companies break the law.
“There are anti-discrimination laws in housing, in employment, in credit … which are essentially being ignored in the context of big data,” she said. “Basically, all these industries have figured out that big data helps them target much more precisely the kind of customer they want and avoid the customers they don't want, and they're just going for it.”
Companies have taken a “catch us if you can” attitude with federal regulators, O’Neil said, adding that this kind of bias in computer models has real consequences.
“We're not just predicting the future … we're causing the future,” O’Neil said. “When we say, ‘You don't look like someone who's gonna pay back a loan,’ we're not just making predictions. We're actually refusing to let that person be part of the financial system.”
The solution? O’Neil says the public should push back.
“We should see more organized actions,” she said. “We just have blind trust in these algorithms that are actually quite terrible. What we need to do is demand that science is put into data science right now. We need to ask very basic questions of evidence like, ‘For whom does this algorithm fail?’”