When a call comes in to a Child Protective Services hotline, how should the government react? Is the complaint significant enough to merit an investigation? Should caseworkers be sent to the child’s home? Or is the call frivolous? Would the stress of an investigation do more harm than good?
These are tough questions, ones that counties and states throughout the country are trying their best to answer. One of them, Allegheny County, Pennsylvania, is looking to an algorithm for help.
It’s important to note that the algorithm, known as the Allegheny Family Screening Tool, doesn’t decide which kids are taken away from their parents. It’s not used as evidence in family court. Instead, it’s used solely to help call screeners decide whether to send caseworkers out to investigate a complaint.
And there are a lot of complaints and tips every year: 4 million referrals annually, according to Emily Putnam-Hornstein, an associate professor at USC, director of the Children’s Data Network, and one of the two people behind the algorithm.
For most calls, whether to investigate is a tough decision. Even Rhema Vaithianathan, a professor at the Auckland University of Technology (AUT), co-director of the Centre for Social Data Analytics, and the other person behind the Allegheny Family Screening Tool, was surprised when she sat in on some of the calls.
“I just couldn’t figure out which call was concerning, and which call wasn’t,” she says. “I was blown away by the process that they went through and how well they did it... it’s staggering to make these kinds of decisions about these children.”
So to determine whether or not to get involved, the screeners would look at computer records: What data did the agency have about the family? What services had they used? Were there any major concerns? They’d make their decision based on the call and that data. But every screener is different, and even when they’re trying their best, they’re still human. That’s where the algorithm comes in.
Putnam-Hornstein and Vaithianathan’s algorithm does the same thing the call screeners do: it looks at the data Allegheny County has access to and assesses whether the child might be at risk.
But the way the Allegheny Family Screening Tool actually does this is a bit different. Since it’s an algorithm, it can take in all the data the county has on the child and family and build its recommendation from the relationships between that data and how children in similar situations have fared in the past. All in a matter of seconds.
“It’s just by the numbers,” Putnam-Hornstein says. “We’re not necessarily predicting maltreatment — we’re predicting involvement with the child welfare system. We know that there are children who become involved with the child welfare system who are not maltreated. We know that there are children who are maltreated that never become involved with the child welfare system. What we’re trying to do is equip a public system with a tool that can help it make better decisions, so it can intervene earlier with high-risk families.”
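To make that concrete, here is a minimal sketch of the general technique, predictive risk modeling, in Python. The county’s actual features, model, and data are not described in this piece, so everything below is an assumption for illustration: a toy classifier trained on invented historical records, scoring a hypothetical new referral.

```python
# Hypothetical sketch of a predictive risk score. The real Allegheny Family
# Screening Tool's features and model aren't detailed here; this only
# illustrates the general idea: train a classifier on historical
# administrative records, then score each new referral in seconds.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic history: rows are past referrals, columns are made-up
# administrative features (prior referrals, services used, etc.).
n_cases, n_features = 1000, 5
X_history = rng.random((n_cases, n_features))

# Hypothetical label: did the case later become involved with the child
# welfare system? (Per the quote above, the target is system involvement,
# not maltreatment itself.)
weights = rng.random(n_features)
y_history = (X_history @ weights
             + rng.normal(0, 0.3, n_cases) > weights.sum() / 2).astype(int)

# Train once on history...
model = LogisticRegression().fit(X_history, y_history)

# ...then score a new referral. A call screener would see a number like
# this alongside the call itself.
new_referral = rng.random((1, n_features))
risk = model.predict_proba(new_referral)[0, 1]
print(f"predicted risk of system involvement: {risk:.2f}")
```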
The Allegheny Family Screening Tool was first used in 2016, and final results haven’t come in yet, but early signs look promising to the two scholars. Vaithianathan notes that the algorithm’s scores have done a good job of predicting the level of concern caseworkers reported when they visited kids at their homes. That doesn’t mean the tool is without critics, though.
Virginia Eubanks, for one, believes that it may be doing more harm than good. Eubanks is a professor of political science at the University at Albany, and she’s the author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which focused in part on Allegheny County’s algorithm.
Eubanks says that the major concern parents have is a “false positive” problem: they worry that the Allegheny Family Screening Tool sees harm where no harm actually exists. She thinks that because the data that drives the algorithm is collected mostly on poor and working-class families, it creates a feedback loop. Because they interact more with government programs and subsidies, poor parents are over-surveilled. Because of that surveillance, they have more run-ins with the system, and because they have more run-ins with the system, they come under more surveillance... and on and on.
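That loop can be sketched with a toy simulation. Every number and formula below is invented; the point is only to illustrate the dynamic Eubanks describes, in which recorded contact with public systems feeds back into higher scores and more scrutiny, while equivalent behavior by wealthier families stays invisible to the data.

```python
# Toy model of the feedback loop: two families with identical behavior,
# but one is visible to public systems and the other is not.
def risk_score(records: float) -> float:
    """Stand-in scoring rule: more records on file -> higher score."""
    return records / (records + 10.0)

records_poor = records_wealthy = 5.0  # same starting point

for year in range(5):
    # Using public programs puts poor families' lives into the data;
    # wealthy families' equivalents (private doctors, private care)
    # generate no comparable records.
    scrutiny = risk_score(records_poor)
    records_poor += 2.0 + 5.0 * scrutiny    # scrutiny itself creates records
    records_wealthy += 0.5                  # largely invisible to the system
    print(f"year {year}: poor={risk_score(records_poor):.2f}, "
          f"wealthy={risk_score(records_wealthy):.2f}")
```

Run it and the two scores drift apart year after year, even though nothing about the families’ actual parenting differs.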
“[These families] believe that it confuses parenting while poor with poor parenting,” Eubanks points out.
Eubanks wants us to remember that the mountains of data that child welfare agencies deal with every day aren’t objective or blind to who we are. They come from somewhere. And that somewhere is people. Fallible people.
“You can imagine the unexamined bias of a doctor or a nurse might lead them to see a black family and say, ‘Oh, I think that’s abuse,’ but to see a white family and say, ‘Oh, maybe they fell off a jungle gym,’” Eubanks says.
And this could mean that poor and working-class families might not want to use state-subsidized mental health care, because they’re worried about what might be written about them in a computer file somewhere.
“If we’re building tools that make families feel more isolated and increase their stress, we might be creating the kind of conditions we’re trying to eliminate,” Eubanks says.
Putnam-Hornstein sees it in a different light. “I think we have an obligation to the families to use the best data and the best science, so we can create an equitable experience.”
She points out that the Allegheny Family Screening Tool is only used at the hotline. The caseworker who goes out to visit the family doesn’t know the score.
Putnam-Hornstein also notes that if a child welfare agency has information about a family, it has an obligation to use that information to protect children. If the agency happens not to have similar information about a wealthy family, then it simply doesn’t have it. But it’s not going to ignore information that indicates a child needs help.
And Vaithianathan says that, with the help of the algorithm (which doesn’t factor in race), call screening decisions can be less influenced by racial bias.
It’s a difficult question, made even tougher by the fact that budgets for these services are being tightened throughout the country.
But the question remains: When the government gets that call, how should it respond?