The fate of the police reform bill passed by the state legislature earlier this month is up in the air. That's after Gov. Charlie Baker sent it back to lawmakers last week with changes. The governor takes issue with several of its provisions, including a limit on the use of facial recognition technology by police. Kade Crockford is director of the Technology for Liberty program at the ACLU of Massachusetts. She spoke with GBH All Things Considered host Arun Rath. This transcript has been edited for clarity.
Arun Rath: Let's break down the limits this bill would put on facial recognition technology, because we hear about this referred to as a ban on facial recognition technology. But it's not a ban, right?
Kade Crockford: It's not a ban, no. What the legislature sent to the governor's desk a few weeks ago, and unfortunately what the governor rejected, is common-sense regulation that would ensure that we can use this technology, with narrow limits and checks and balances in place, in serious criminal investigations, like investigations of murders and kidnappings, to try to identify persons in images, when police don't know who that person is, through the Registry of Motor Vehicles' existing facial recognition program. So it would prohibit most government agencies from acquiring facial recognition technology, but it would enable the Registry of Motor Vehicles to continue using facial recognition to perform fraud checks in licensing, and it would allow police to obtain warrants to ask the RMV to search against that database to try to identify people in serious criminal investigations. It would also allow police to get around that warrant requirement and go directly to the RMV for a search when there's an imminent threat of serious bodily harm or death.
Rath: So the police would have access to this, but it would just involve a warrant or an emergency authorization, like you just mentioned.
Crockford: That's right, yes. We were particularly concerned to see Gov. Baker, in his letter to the legislature, say that this technology has been instrumental in solving some serious crimes in Massachusetts. That was disturbing to us because it suggested that the governor may not understand what this legislation does, because nothing in this legislation would prevent the police from using this technology to investigate those types of serious crimes. We have reached out to the governor's office to offer to meet with him and his team to discuss their concerns about this section of the legislation. Unfortunately, we haven't heard back from the governor's office. We would really like to be able to reach a compromise to deal with this problem now, because this is a key social justice and policing issue that belongs in this bill. Most of what the legislature is doing with the police reform bill is looking backwards, trying to address problems that have become systemic and ingrained in the culture of policing in the commonwealth. This section deals with a problem that is forward-looking. We are trying to prevent problems from unfolding in Massachusetts, and we believe that's really crucial, not only because Black and brown people, who are overpoliced, are the most likely to be targeted with this software, but also because a number of studies, including one from MIT by the researcher Joy Buolamwini, show that facial recognition algorithms tend to be racially biased themselves, exhibiting error rates as high as one in three when evaluating the faces of Black women, for example. So it is extremely important that the legislature work with us and our other community partners to make sure we can get to "yes" on legislation that brings some common-sense regulation to this dangerous and currently unregulated technology.
Rath: To give us a sense of how these are current real-world concerns — Gov. Baker says that this technology has helped to solve crimes. Do you know what kind of things he's referring to there and what he might be concerned about this legislation disrupting?
Crockford: So it's actually unclear, unfortunately, because as I said, the examples that he gave in the letter rejecting Section 26 did not seem to make any sense, because this legislation, Section 26 of the police reform bill regulating facial recognition technology, would in fact enable police to continue using the technology in precisely the types of investigations Gov. Baker flagged. But, you know, your question raises a really important point, which is that far too often, and this is the case with facial recognition technology, we see the government in the Executive Branch, the police departments, the State Police, going ahead and adopting and using very controversial, and in this case sometimes racially biased, technologies without a public debate, without legislative authorization and without the attendant checks and balances, like a warrant protection, that a legislative process would produce. That is deeply concerning to us, because what we know in Massachusetts is that the police have been accessing the Registry of Motor Vehicles facial recognition system to perform these searches in criminal investigations for about 15 years now. They've been doing that entirely in secret, in fact, not even notifying criminal defendants that the technology was used in their cases to identify them, which we believe is a significant due process violation. What we're aiming to do here is to bring this technology under small-d democratic control, to subject it to very common-sense checks and balances, the same kind of oversight and accountability mechanisms that everyone expects from the criminal legal system: going to a judge to demonstrate that there's probable cause that someone may be involved in a serious crime and then taking the warrant that they obtain from that court to the RMV to ask for a search.
Rath: We know facial recognition technology has a disproportionate effect on communities of color. I remember, you know, getting Google to recognize my relatives. Going back to that point, we've seen this play out. But can you explain how this plays out in the real world, how it hurts communities of color in practice?
Crockford: Absolutely. So there have been a few cases. In fact, in June, The New York Times published a really horrifying story out of Detroit. Again, you know, the use of facial recognition by police in Michigan is totally unregulated, as it is in Massachusetts, and that has devastating consequences. So we know now of at least two Black men who have been wrongfully arrested by the Detroit Police Department, including Mr. Robert Williams, whose story was told in The New York Times in June. He was arrested in front of his two small children and his wife for a crime he didn't commit. He went down to the police station, and he said he felt like he was in "The Twilight Zone" in the interrogation room when the police handed him a still image from a video surveillance camera of the suspect in this theft and said, this is you, right? And Mr. Williams said, do you seriously think all Black men look alike? And the police let it slip that the error came from facial recognition. They said, oops, the computer must have gotten it wrong. So that's exactly the type of situation that we need to prevent from happening here in Massachusetts. And, you know, the ACLU is far from alone here. We have a coalition of nearly 60 organizations that are supporting these crucial reforms. A number of Boston Celtics players spoke out publicly on this issue on their social media accounts, including Jaylen Brown, the small forward, number seven for the Boston Celtics, saying the legislature needs to get to "yes" on this to protect Black and brown people and to ensure racial justice in the 21st century in the commonwealth.