After tens of millions of people saw computer-generated, sexually explicit images of Taylor Swift, politicians’ attention is turning to a growing threat: women and girls being exploited through fake pornographic images.

White House Press Secretary Karine Jean-Pierre called the incident “alarming.”

Few laws prevent someone from creating or sharing this type of non-consensual content, even as widely available artificial intelligence tools make it an easy task.

Protections in Massachusetts are especially thin: it is one of just two states in the country without any law against revenge porn, the other being South Carolina. But a pending bill could make the practice illegal in the commonwealth.

The revenge porn bill, or An Act to Prevent Abuse and Exploitation, addresses situations where sexually explicit images or videos are shared publicly without the subjects’ consent, even if they consented to creating them. (Current Massachusetts law already criminalizes the surreptitious recording of someone in a state of undress, or “upskirting.”)

State Rep. Tricia Farley-Bouvier of Pittsfield introduced an amendment, which was adopted into the bill last month, that also criminalizes the creation of non-consensual deepfake pornography. Explicit images of an individual that are fabricated through digital means without consent would be considered a form of revenge pornography.

“It’s really a violation of a person to do something like that," Farley-Bouvier said. “Even with these high-profile cases that we’ve been hearing about, some of the bigger [social media] companies have taken them down and stopped their spread. But once something’s on the internet, it’s never, ever gone.”

Rep. Tricia Farley-Bouvier, center, proposed the amendment that would also specifically criminalize the distribution of digitally created pornography.
Sam Doran/State House News Service

While fake nudes of celebrities have circulated online for years, they used to require specialized photo editing software and some skill to produce. But with widely available AI-powered tools, alarmingly realistic images can be quickly made with minimal effort — putting anyone at risk.

In New Jersey, a 14-year-old high schooler called for federal legislation to address AI-generated pornographic images after photos of her and more than 30 female classmates were manipulated and shared last year.

“Any one of us can have any picture that’s already out there in the public,” Farley-Bouvier said. “They can be quite convincingly changed to be pornography and then shared widely. And it doesn’t take a lot of technical ability to do it. There are just programs out there, just like you can take red eye out.”

The Massachusetts revenge porn bill includes staggered penalties: a first offense carries up to a two-and-a-half-year prison sentence and a $10,000 fine, while subsequent offenses carry up to a $15,000 fine and a prison sentence of as long as 10 years.

Farley-Bouvier said any AI-created explicit content of children would fall under both child pornography laws and the revenge porn bill. One exception included in the bill is if the act occurs between two minors, in which case a judge would have the option of a diversion program for the offending minor.

The bill cleared the Massachusetts House unanimously in January and is now awaiting action from the Senate before it can ultimately head to Gov. Maura Healey’s desk.

State Sen. John Keenan, a co-sponsor of the bill who represents Norfolk and Plymouth, said a similar bill passed through the House in the 2021-2022 session, but the term ended without the House and Senate resolving their differences.

He said he is “confident” the bill will pass this year, and that high-profile cases such as Swift’s have helped raise awareness.

Similar laws have raised First Amendment questions in the past, and the ACLU of Massachusetts is monitoring the bill.

Rachel Davidson, a staff attorney for the organization, acknowledges the real harm that deepfakes can cause, but cautions that constitutional protections still apply to their creators.

“[T]hey are entitled to the same constitutional protections as speech generated in any other way. That means that unless a deepfake falls outside the protection of the First Amendment — because, for example, it constitutes defamation, a true threat or obscenity — lawmakers must narrowly tailor any regulation to meet a compelling government interest,” she wrote in a statement to GBH News.

Hema Sarang-Sieminski, deputy director of Jane Doe Inc., which advocates for those impacted by sexual and domestic violence, said the non-consensual sharing of explicit images can lead to further sexual violence against the person depicted.

“The likelihood of increased stalking, further sexual assault, this kind of sharing does create a kind of isolation that is akin to sexual assault,” Sarang-Sieminski said.

These experiences can be traumatic and can destabilize survivors and their ability to maintain economic security, housing, and safety, Sarang-Sieminski said. She hopes the bill that passed the House will become law.

Gail Dines, an anti-porn scholar and activist, said the effect these AI-generated images can have on people is “profound.” She said it can cause depression, fear, anxiety and even job loss.

“Taylor Swift got them down because she’s Taylor Swift,” Dines said. “If you’re not Taylor Swift, it’s almost impossible to get rid of them because they start being shared across all different platforms. So, even if you try and get them down, it’s almost impossible.”