The town of Brookline voted overwhelmingly to ban government use of facial surveillance technology during a town meeting Wednesday.
With 179 votes in favor, eight opposed and 12 abstaining, Brookline joined Somerville as the second Massachusetts municipality to bar its government from using the technology.
Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts, said the vote marks a huge victory for civil rights.
“We are losing control of our personal information because our technology has outpaced our civil rights protections in the law,” Crockford said in an interview Wednesday. “We hope that the state legislature on Beacon Hill will take note of all of this energy in communities across Massachusetts.”
In June, the ACLU launched a campaign to build support for a statewide moratorium on the government’s use of facial surveillance software. The legislation is currently before the Joint Judiciary Committee.
Crockford said she’s hopeful similar bans will be implemented throughout the state.
“Communities are saying we should be in control, we should be dictating how, if at all, these dangerous technologies will be used by our town and city governments,” Crockford said. “We hope that the legislature will listen and will take action to protect all of us throughout the Bay State.”
Supporters of the technology argue that it could have applications for security, public health and the greater good.
A video tweeted Wednesday by Patch.com reporter Jenna Fisher shows town meeting member and Brookline Police Sergeant Casey Hatchett making the case for facial surveillance technology during the meeting.
“This ban goes too far,” Hatchett said in the video. She argued that while the technology may have flaws, it has legitimate uses, such as “identifying those who are unable to identify themselves.”
Last May, a bipartisan group of congressional lawmakers condemned facial surveillance technology as “invasive and inaccurate.” Shortly after, a review by Georgetown Law’s Center on Privacy & Technology found cases in which police misused the systems, contributing to false identifications and arrests.
“Face surveillance is dangerous when it works, and when it doesn’t,” Crockford said, citing a 2018 Massachusetts Institute of Technology study that found significant gender and racial bias in the technology, and a 2019 University of Essex study that found an 81% error rate when the technology was used by London’s Metropolitan Police.
“And even if it worked perfectly, it would present a whole set of separate concerns because we can't leave our faces at home,” Crockford said.