As reported Wednesday in the New York Times, YouTube followed other major social media platforms in banning not just Covid-related misinformation but certain misinformation around all vaccines, including popular yet patently false videos about the alleged dangers of measles, hepatitis B, and other common vaccinations. Specifically, it banned the accounts of several individuals and organizations, including those of Joseph Mercola and Children’s Health Defense, the organization affiliated with controversial anti-vaxxer Robert F. Kennedy Jr. (the son of former Attorney General Robert F. Kennedy). Breaking down the action further, the company banned videos that
claim vaccines generally do not reduce rates of transmission or contraction of disease,
make false claims about what’s in the vaccines,
claim approved vaccines cause autism, cancer, or infertility, or
falsely state vaccines contain ID trackers.
YouTube, which is owned by Google, explained its move in a blog post:
Working closely with health authorities, we looked to balance our commitment to an open platform with the need to remove egregious harmful content. We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines.
Long before Covid-19, purveyors of anti-vaccine misinformation had found a comfortable home on YouTube. So many false or misleading videos circulated there that vaccine advocacy groups had effectively been pushed off the platform years earlier. Whenever the groups tried to get real information out, YouTube’s recommendation engine would immediately undermine them by suggesting false anti-vaccine content to viewers, as if the two were somehow equivalent beliefs.
Many parents were already wary or skeptical of vaccines based on falsehoods they had heard from family or friends. Being spoon-fed misinformation, especially immediately after watching a truthful video that might otherwise have cut against their existing suspicions, ironically only reinforced their false beliefs. And the fact that the platform itself had recommended the bad content lent further credibility to what should have been a non-credible source.
“It has been incredibly frustrating to try and share good, science-based information about vaccines on YouTube, only to have the algorithms then suggest anti-vaccine content to our viewers,” said Erica DeWald, communications director of Vaccinate Your Family, the nation’s largest non-profit advocate for vaccines. “We’re hopeful this is a positive step toward ensuring people have access to real information about vaccines and will signal other social media companies to follow suit.”
The move comes after Facebook came under fire in July from President Biden, who criticized the platform for not doing enough to halt misinformation. “They’re killing people,” Biden had told reporters. “Look, the only pandemic we have is among the unvaccinated. And they’re killing people.” Asked later to clarify, Biden said he had read an article reporting that a majority of vaccine misinformation was coming from just 12 bad actors, dubbed the “disinformation dozen” by the Center for Countering Digital Hate for their role in spreading vaccine misinformation and internet hoaxes. Facebook had already taken some enforcement action against certain pages and accounts connected to the dozen.
Facebook vigorously protested the president’s characterization, but the message had apparently been delivered, and YouTube had heard it as well. Prior to yesterday’s ban, YouTube had focused on removing “borderline” videos from its recommendation algorithms, meaning people had to find them through specific searches. The problem was that links to these videos remained readily available and still spread rapidly across other social media platforms such as Facebook, where anti-vaxxers could simply post the links in their news feeds and comments.
Asked why the company didn’t act sooner, a spokesperson for YouTube explained that they had been focused on stopping Covid-19 related misinformation and only recently realized that false claims about other vaccines were contributing to unfounded fears about the Covid-19 vaccines. “Developing robust policies takes time,” the spokesperson said. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”
The move, late as it is, may have come just in time. According to some online misinformation watchdogs, anti-vax groups had been gearing up to use the uproar over Covid-19 vaccinations for school-aged children as a springboard to grow their online audiences. “Anti-vaccine activists have been very vocal about the fact that they saw Covid as an opportunity to undermine confidence in the childhood vaccine schedule,” said Renée DiResta, who researches anti-vaccine disinformation at the Stanford Internet Observatory. “Seeing YouTube take this action is reflective of the fact that it seems to be aware that that tactic and dynamic was beginning to take shape.”