Donald Trump and Hillary Clinton said some things that were flat out untrue — or misleading — in the first presidential debate Monday night. (Check out NPR's comprehensive fact check here.)

It wasn't clear going in whether the moderator, Lester Holt of NBC, would do any fact-checking of the candidates during the debate. He did. (Before the debate, Fox News' Chris Wallace, one of this year's moderators, along with past moderators like Jim Lehrer and Bob Schieffer, had in one form or another rejected the idea of umpiring the candidates' statements in real time.)

In this election, though, with conspiracy theories and mistruths flying, fact-checking has taken center stage. It has been more than a sub-genre of journalism; it has been prominent in what would otherwise be straight-news stories — and even in the small banner at the bottom of your TV screen on cable news.

There's a tremendous appetite for fact-checking. NPR fans want it, as evidenced by this survey and the number of people who read our fact checks. And the format has grown at an astounding rate. There were 29 fact-checking brands in the United States in 2015, by one estimate, and 24 of them had been created after 2010.

And yet despite all that truth-squadding, lies live on, (un)healthy as ever. Politicians keep spouting falsehoods, and Americans keep believing them.

So has fact-checking failed? Has American politics at last reached its final destination — a desolate, dusty, God-forsaken, Cormac-McCarthian, "post-fact" landscape? We dug into the research, and we have an answer:

No.

However, fact-checking also isn't nearly as effective as many fact-checkers and fact-check aficionados would like it to be. Research shows that many people don't expose themselves to fact checks, and even those who do don't always come away believing the new information.

So here's what we know about what works, as well as what keeps people believing falsehoods that have been truth-squadded into the ground.

What works

First things first: If we're going to ask whether fact-checking works, the next question is what "works" means.

One simple measure of whether fact-checking is "working" is whether it better informs voters — and in particular, whether it changes the minds of voters who believe misinformation.

The answer is ... sometimes, but it's tough.

A few things help make a fact check stick in readers' minds, as Brendan Nyhan from Dartmouth College and Jason Reifler from the University of Exeter write in their comprehensive July overview of fact-checking studies (from which this article draws heavily). One tactic that seems to work is providing people with sources that share their point of view. Getting a Republican to refute a widely held Republican belief will probably be more effective than a Democrat refuting it, and vice versa.

Another tactic they list is graphical information: charts, for example, could make numerical fact checks stick better in readers' minds.

There's also evidence that providing people with an alternate narrative can change their minds better than offering a simple refutation. One simplified example from the research: Saying "the senator denies he is resigning because of a bribery investigation" is not that effective, even with good evidence that that's the truth.

A more effective version would look something like this: "The senator denies he is resigning because of a bribery investigation. Instead, he says he is becoming the president of a university."

Those aren't the only measures of a successful fact check. Another measure of whether fact-checking is "working" is whether it makes politicians change their behavior. There is some evidence that this works — consider Trump's (eventual) reversal on the birther question, or Chris Christie eventually giving up the story that he was appointed U.S. attorney for New Jersey the day before Sept. 11.

That's merely anecdotal, but some research bolsters this evidence. In one study, Nyhan and Reifler sent letters to state lawmakers in states where PolitiFact operates. Some were simply told that there was a study of fact-checking going on, while others were warned in their letters that "Politicians who lie put their reputations and careers at risk, but only when those lies are exposed."

Politicians who received the warning letter, it turns out, were less likely to subsequently be called out for inaccuracies and also received higher ratings from fact-checkers.

As for Lester Holt's real-time fact-checking in Monday night's debate, there isn't much research on the effects of that kind of in-the-moment check, according to Nyhan. However, it's reasonable to think it is helpful. For one, it reaches a broad audience — people saw fact checks Monday night without specifically seeking them out. Furthermore, repeating a false claim can make it more believable, so real-time fact checks can mitigate that by following false statements with refutations, as Lucas Graves, an assistant professor at the University of Wisconsin and the author of a new book about the rise of fact-checking, has said.

Those are the most obvious ways that fact-checking can either work or fail. But one more measure of fact-checking's success is whether journalists are paying attention to it, Graves says.

"There's every reason to think that the influence on readers and the influence on politicians is greater when there's consensus among journalists and when journalists are widely willing to treat something as debunked that's been thoroughly debunked," Graves told NPR. "And it's always important to keep accentuating that."

Not all journalists are fact-checkers, in other words, but if they all are aware of good fact-checking sources, they can more easily disseminate good information (and smack down the bad).

The hurdles in fact-checking's way

If one goal of the fact check is to clear up misconceptions, it's clearly not super-effective, as evidenced by the number of Americans who still believe falsehoods.

One obvious reason: It can simply be in a politician's interest to allow a falsehood to spread — or at the very least, it just might not be worth the time to try to stop it, as political scientist Anne Pluta wrote at FiveThirtyEight this year.

Then there's the fact that fact-check readers are a self-selecting crowd.

"We find that the audience skews toward people who are more politically knowledgeable, sophisticated and interested in politics," said Nyhan (who, a sharp reader will notice, has with his often collaborator Reifler, conducted a lot of research into fact-checking).

A recent study from Nyhan and Reifler found that 46 percent of subjects who scored high on a political knowledge test were "extremely" or "very" interested in reading a fact check, compared to 24 percent of people with low political knowledge.

Not only that, but there's a partisan divide in trusting fact checks. Among high-knowledge subjects, 59 percent of Democrats had "very favorable" opinions of fact-checkers, compared to 34 percent of Republicans. That's stark, and it's also unsurprising, considering how low Republicans' trust in the media is compared to Democrats'.

Another of the biggest hurdles: the ever-stubborn human brain.

"In general, humans are really good at ignoring information that cuts against their ideological preferences," Graves said. "That's true on the left and the right. It's true for more-educated as well as less-educated people. That's always been the case."

People really like believing what their side believes, and they seek out information that confirms those beliefs. In addition, they tend to incorporate (or ignore) new information in a way that supports their existing worldview. In one 2000 study Pluta cited, researchers surveyed people's knowledge about welfare. They found that the most misinformed people were also among the most confident in their misinformed ideas — and many of them resisted correct information when it was presented.

In fact, it's possible for fact checks to be counterproductive. Presenting people with an untrue claim, even while refuting it, can in some cases make people believe the untrue claim even more, as MIT political science professor Adam Berinsky found in a study about belief in (nonexistent) "death panels" in the Affordable Care Act.

Nyhan and Reifler similarly found a "backfire effect" when trying to correct a voter's information — the researchers found that "direct factual contradictions can actually strengthen ideologically grounded factual beliefs."

One final hurdle to fact checks is the supermassive, superfragmented media marketplace.

In light of the constant roar of information that people face, Graves said, "We need reporters more than ever to tell us whether claims are true or false because there are so many different sources of information today, because it's so much harder today to determine what's a reliable outlet and what isn't."

The upside of having a massive number of news outlets is that a massive amount of information is just clicks away. The downside is that the same is true of misinformation. More outlets mean more opportunities for a lie to be repeated, and more ways it can embed itself in a voter's head.

"A wild claim in the 1950s about the president's birthplace wouldn't have had the same purchase in political discourse that it has had," Graves said.

One more problem in fact-checking's way: Falsehoods get repeated a lot in news coverage. That repetition helps a lie stick in a viewer or reader's brain — as we wrote in a piece on conspiracy theories recently, it's easier to believe an idea if it's more familiar.

All of that may sound discouraging. But Nyhan cautions that just because fact checks have not succeeded 100 percent of the time, that doesn't mean they have "failed"; after all, it's impossible to know how common misperceptions would be without fact-checkers.

Is Trump a special case?

"Most politicians are very risk averse, very sensitive to the news coverage they get," Nyhan said. "They're worried about being portrayed as inaccurate or dishonest in public. They take the coverage they receive seriously."

This is how it often works: a politician says a falsehood, journalists point out the falsehood, and the chastened politician stops saying it.

But that's not how Donald Trump seems to work. PolitiFact's tally of its ratings for Trump, Clinton and past candidates shows that, to a remarkable degree, Trump has said things that were found "false" and "pants on fire" — as of mid-August, those statements made up just over half of the Trump statements PolitiFact had checked. It's not exactly a scientific measure, but it offers some quantification of what many have already pointed out: Trump has exhibited a remarkable inclination to shrug off fact checks.

Even when Holt tried to fact-check Trump on Monday night, Trump dug in his heels on positions that were easily disprovable.

Pundits have reacted with bemusement, then, to the fact that Trump still sometimes gets better marks from voters than Clinton on things like truthfulness.

One thing that helps Trump is that his party already distrusts the media and, relatedly — as we pointed out above — doesn't seem to get very excited about fact checks. Trump fans the flames of that distrust at every opportunity.

Moreover, there's a different potential reason why some voters see him as honest: his "tell it like it is" persona. That's what his supporters time and again have said they like about him.

Meanwhile, Clinton's reputation for dishonesty is rooted in part in whatever her motivations were for setting up a private email server — that is, something that is not really fact-checkable.

Long story short, fact checks aren't hyper-effective at making voters hyper-informed political wonks. And there's plenty of reason to think that the fact-checking burden will remain heavy for years to come, as people dig into their respective sides — and their respective sides' versions of the facts.

"For the foreseeable future, our politics are going to be highly polarized, so we're going to be highly vulnerable to partisan misperceptions," Nyhan said.

And that polarization can create people who not only exist in different belief systems, but who seem to exist in different realities. Or, as George Saunders more entertainingly put it in The New Yorker this year:

"Intellectually and emotionally weakened by years of steadily degraded public discourse, we are now two separate ideological countries, LeftLand and RightLand, speaking different languages, the lines between us down. Not only do our two subcountries reason differently; they draw upon non-intersecting data sets and access entirely different mythological systems. You and I approach a castle. One of us has watched only 'Monty Python and the Holy Grail,' the other only 'Game of Thrones.' What is the meaning, to the collective 'we,' of yon castle? We have no common basis from which to discuss it."

So. Polarization will breed both more facts to check and more stubborn minds to change — fun times ahead for fact-checkers nationwide.

Further reading:

  • Writing about the latest fact-checking research necessarily means digging into studies from Nyhan and Reifler, who are among the most prolific academics on the subject of fact checks. For a superdeep dive, this paper from them and Dartmouth postdoctoral fellow D.J. Flynn is a much more comprehensive review of the evidence. (And here's a more digestible guide that Nyhan and Reifler wrote in 2012.)
  • For more information on what makes conspiracy theories and rumors (two particular types of political misinformation) stick, check out our recent article.
  • FiveThirtyEight's Christie Aschwanden has likewise dug into how hard it is to change a voter's mind. Also at FiveThirtyEight, Pluta teases out the differences between misinformed and uninformed voters.
  • Not a fan of fact checks? You're not alone. University of Miami political science professor Joseph Uscinski has plenty of beefs with the form. For his blistering critique, go here.
Copyright 2016 NPR. To see more, visit http://www.npr.org/.