SCOTT DETROW, HOST:

For years, Meta tried to curb the spread of false information and hate speech on Facebook and Instagram. Here’s CEO Mark Zuckerberg speaking at Georgetown University back in 2019.

(SOUNDBITE OF ARCHIVED RECORDING)

MARK ZUCKERBERG: You know, no one tells us that they want to see misinformation, right? That’s why we work with independent fact-checkers to stop hoaxes that are going viral from spreading.

DETROW: But this past week, Zuckerberg said those efforts were a mistake.

(SOUNDBITE OF ARCHIVED RECORDING)

ZUCKERBERG: We tried in good faith to address those concerns without becoming the arbiters of truth. But the fact-checkers have just been too politically biased and have destroyed more trust than they’ve created, especially in the U.S.

DETROW: Zuckerberg is now getting rid of professional fact-checks and other speech rules on Facebook and Instagram in the U.S. He made this announcement, of course, just weeks before Donald Trump returns to the White House, and he nodded to the political climate as he did. We’re going to spend a few minutes zooming out to talk about why this shift happened and what it means for information on the internet. I’m going to do that with NPR’s Huo Jingnan and Shannon Bond, who both cover how information travels around online. Hey there.

HUO JINGNAN, BYLINE: Hello.

SHANNON BOND, BYLINE: Hello.

DETROW: Before we dive in, I will note that Meta is an NPR sponsor, but we report on them like any other company we cover. Let’s start with this, though. We just heard what sounds like a really big evolution by Mark Zuckerberg. So how did we get to this place?

BOND: Yeah. I mean, I think the thing we can say about Mark Zuckerberg is that he consistently reacts to pressure from the public and from people in power. You know, this whole thing about fact-checking and content moderation on Facebook all started after the 2016 election and the revelations that Russia had used Facebook and other platforms to try to manipulate American voters. And so under a lot of pressure, Facebook introduced these measures meant to deal with false information and this kind of manipulation, including partnering with outside fact-checkers.

You know, and then I think from there, content moderation really hit a peak during the pandemic and the 2020 election. January 6, 2021, turned out to be a turning point, right? Facebook and other sites banned Trump. And the backlash from many folks on the right, who had been complaining that companies like Meta were stifling their speech, really escalated after that. It turned into this coordinated campaign casting content moderation and fact-checking as censorship. And so I think what we can see from this week’s announcement is that it really signals the success of that campaign. Here is Zuckerberg embracing a view we have long heard from Trump and his allies.

DETROW: Yeah, to the point where Zuckerberg said in this announcement that fact-checkers were part of the problem. Let’s take a moment and fact-check that. How exactly did the system work, and what influence did the fact-checkers have on Facebook?

HUO: Fact-checkers are like tipsters to Meta. They monitor the platforms, they see what kinds of dubious claims are bubbling up and getting traction, and then they check them out. They either add context or debunk claims as needed. But ultimately, it is up to Meta to decide what to do with that information. Does it take the post down? Does it make it harder to see, or does it penalize the user in some way? Maarten Schenk, a co-founder of Lead Stories, one of Meta’s fact-checking partners here in the United States, says this was the first time he heard Meta had any problem with the program.

MAARTEN SCHENK: In all the years that we worked with Meta, they never complained or talked to us about any bias at all.

HUO: As for Zuckerberg’s accusation of political bias, Meta has long exempted politicians from being fact-checked on its platforms. That said, some of the perception of bias might stem from the fact that claims circulating among conservatives get debunked more often. But there is no data to support the idea that conservatives are being unfairly targeted on the platform.

DETROW: Did we get any sense of how much this was actually working, how effective this was in terms of actually curbing the impact of false or misleading information?

HUO: So research shows that fact-checking does affect what people believe about the individual claims that get checked, although the effect only lasts a couple of weeks if it isn’t repeated. And it doesn’t add up to, say, changing support for politicians who repeatedly lie, at least not in the United States. Still, that does not stop fact-checkers from becoming targets for those who are fact-checked, including Trump.

DETROW: Right. So going forward, Meta says it is replacing fact-checks with community notes, which is, of course, the system that Elon Musk has championed at X. Any sense, Shannon, how effective that will be?

BOND: Let’s start with how community notes works on X. So users can propose notes on posts that they think might be wrong or misleading. Then those notes get voted on. And if enough people with different views agree, these notes might be shown on a post. But Schenk of Lead Stories says that’s an entirely different thing than professional fact-checking.

SCHENK: The shape of the world doesn’t care if social media users have consensus about it or not. That’s not how facts work.

BOND: And he also pointed out that many of the community notes we see on X actually rely on professional fact-checks - right? - to check the facts they’re talking about. And Meta’s fact-checking partners in the U.S. had to sign onto a code, you know, affirming transparency and nonpartisanship. These were professional journalistic organizations Meta was working with. When it comes to the people writing community notes, it’s much less clear who those users are or what their motivations might be. And then there are other shortcomings to the program on X. You know, many proposed notes never get shown at all, especially on contentious topics that people disagree about and where it’s hard to reach consensus.

DETROW: Ditching the fact-checkers got the most attention, but they were not the only changes that Meta announced. What were some of the other policies that the company changed?

HUO: So Meta also announced that it is going to dial back the automatic content filters, especially those pertaining to, quote, “immigration and gender.” So speech that previously was not permissible could get through now. Meta says the previous filters were, quote, “out of touch,” and that if something can be said in the halls of Congress, it should be allowed on the platform. Katie Harbath, a former Facebook employee who now works on platform trust and safety at the technology consulting firm Duco Experts, says the devil is in the details with this change, and we don’t yet have the full picture of how it will affect marginalized groups.

KATIE HARBATH: I think those folks could be particularly negatively impacted on this and may have to - may make decisions about not being on these platforms.

HUO: Meta has already changed its community standards, which now say it is OK to call women property and to call gay people mentally ill.

DETROW: Let’s talk about the politics of this. As I mentioned, this comes just weeks before President-elect Trump takes office. He has cheered these changes. He and other conservatives really pressured social media to walk these policies back. What is the broader political context that we need to know?

BOND: Well, I think Meta is very clearly trying to get on the good side of the incoming Trump administration. And we should remember, you know, it’s not just the pressure over these kinds of policies. The company is also facing a lot of regulatory risk, including an FTC lawsuit over its business practices, and so this is not the only move Meta has made. This week it also named Trump ally Dana White to its board, and Zuckerberg said it would move content moderators from California to Texas in, you know, what seems to be a clear bid to address these claims of liberal bias. Here’s how Brendan Nyhan, a political scientist at Dartmouth College, described all of these moves.

BRENDAN NYHAN: These were designed in the lab to appeal to Trump and his Republican allies. There’s no question.

BOND: And look, Scott, I mean, other tech companies are also making overtures to Trump, but I think what is striking here is just how overt Meta is being about this.

DETROW: Yeah. Yeah. That is NPR’s Shannon Bond and Huo Jingnan. Thank you to both of you.

HUO: Thank you.

BOND: Thanks.

(SOUNDBITE OF LANA DEL REY SONG, “WEST COAST”)

Transcript provided by NPR, Copyright NPR.