YouTube is cracking down on the spread of misinformation by banning misleading and inaccurate content about vaccines.
The video platform announced the change in a blog post on Wednesday, explaining that its existing community guidelines, which already prohibit the sharing of medical misinformation, have been extended to cover "currently administered" vaccines that have been confirmed safe and effective by the World Health Organization and other health authorities.
The site had previously banned content containing false claims about COVID-19 vaccines under its COVID misinformation policy. The change extends that policy to a far wider range of vaccines.
"We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," the company said.
YouTube says it has already taken pages down
YouTube says it now bans videos that claim that vaccines aren't safe or effective, or that they cause other health issues like cancer and infertility. In its announcement, the company pointed specifically to videos that inaccurately describe what ingredients are used in vaccines, as well as allegations that vaccines contain properties that can be used to "track" those who receive them.
There are some exceptions: Users are still allowed to share content related to their personal experiences with the vaccine, but only if those videos adhere to the site's community guidelines and the channel in question doesn't routinely encourage "vaccine hesitancy."
The new policy goes into effect immediately, and YouTube has already removed channels known for sharing anti-vaccination sentiments, like those belonging to prominent vaccine opponents Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and Robert F. Kennedy Jr.'s Children's Health Defense organization, CNBC reports.
The company says widespread enforcement will take time
But the company, which is owned by Google, warned that the more widespread removal of videos may take some time as it works to enforce the policy.
As big tech companies like YouTube and Facebook have tightened their restrictions on vaccine misinformation over the last year, many conspiracy theorists have migrated to other, less-regulated platforms. Rumble, another video-sharing site, has become a popular choice for far-right groups and other vaccine-resistant users, Slate reported in March.
But many channels that spread vaccine misinformation remain active on YouTube, and their videos continue to attract millions of views.
Editor's note: Google is among NPR's financial supporters.
Copyright 2021 NPR. To see more, visit https://www.npr.org.