Social media groups under pressure to root out vaccine misinformation


Social media companies are facing renewed pressure to clamp down on a small but dedicated group of anti-vaccine campaigners that researchers blame for flooding platforms with misinformation.

The Center for Countering Digital Hate has urged Facebook, Google and Twitter to ban 12 people it has found are responsible for about two-thirds of online anti-vaccine content, among them Robert F Kennedy Jr, son of the late Democratic senator Robert F Kennedy, and the alternative medicine entrepreneur Joseph Mercola.

The CCDH’s research comes as 12 Democratic state attorneys-general say they are planning to write to both Facebook and Twitter urging them to do more to curb anti-vaccine misinformation. And it comes a day before the chief executives of Facebook, Twitter and Google are all scheduled to testify in front of Congress about online extremism and misinformation.

Imran Ahmed, the chief executive of the CCDH, said: “Facebook, Google and Twitter have put policies into place to prevent the spread of vaccine misinformation; yet to date, all have failed to satisfactorily enforce those policies. 

“All have been particularly ineffective at removing harmful and dangerous misinformation about coronavirus vaccines, though the scale of misinformation on Facebook, and thus the impact of their failure, is larger.”

Facebook did not respond to a request for comment.

As supplies of coronavirus vaccine increase across the US, politicians and doctors are beginning to focus more on changing the minds of people who say they are reluctant to take one.

Recent polls conducted by the Kaiser Family Foundation think-tank show 42 per cent of Americans say they do not want to get vaccinated immediately — though that figure is down from 63 per cent in December. The KFF also found that those who are adamantly opposed to taking the vaccine were far more likely to read their news on Facebook than others.

Researchers say much of the anti-vaccine content on social media can be traced back to a handful of individuals or groups. In a report published on Tuesday, the CCDH sampled hundreds of thousands of anti-vaccination posts on Facebook and Twitter and found that 65 per cent of them could be traced back to just a dozen people. 

Mercola, the most prolific among them, has 3.6m followers across his Facebook, Twitter and Instagram accounts, all of which are active. In one of the sample posts highlighted by the CCDH, Mercola wrote on Instagram: “Forced vaccination is part of the plan to ‘reset’ the global economic system.”

Mercola’s office did not respond to a request to comment.

The second-most significant anti-vaxx influencer, the report found, was Kennedy, head of the anti-vaccination group Children’s Health Defense. While Instagram has already removed Kennedy, Twitter and Facebook have rejected calls to do so.

Kennedy told the Financial Times last week: “I never posted a single inaccurate statement on my Instagram [account]. All of my factual assertions were sourced to government databases or peer-reviewed publications.” He did not respond to a request to comment on CCDH’s report.

Social media platforms have tended to take a piecemeal approach to moderating misinformation, focusing on removing individual pieces of content that breach their rules rather than targeting the individuals repeatedly responsible.

Since last year Facebook has banned vaccine claims that have been debunked by public health officials. In February the social network said it was expanding those rules to prohibit more general false claims around vaccines, including that they are ineffective or toxic.

Twitter recently introduced a clearer strike system, whereby rule-breakers are hit with increasingly tough restrictions if they repeatedly post misleading content, with five strikes resulting in a permanent ban, for example. It said it was reviewing the accounts flagged by the CCDH report.

Google’s YouTube video website has a strike system too. It said in a statement that it had taken down more than 30,000 coronavirus vaccine misinformation videos since October under its policies, but did not respond to queries about the “superspreaders” of misinformation highlighted in the report.

