Facebook has vowed to tackle misinformation about the Covid-19 vaccine, but misleading messages remain easy to find

Facebook has struggled to tackle anti-vaxxer content for years. Late last year, it enacted new rules to address misinformation about the Covid-19 vaccine, after pledging to reduce the spread of anti-vaxxer content two years ago. But misleading and terrifying content about the Covid vaccines, as well as outright misinformation, continues to spread on the platform at a time when the stakes couldn’t be higher: Misinformation about the vaccine could mean life or death.

Four of the top 10 Instagram search results for ‘vaccine’ were anti-vaccine accounts, including ‘vaccinetruth’, ‘vaccinefreedom’, ‘antivaxxknowthefacts’ and ‘cv19vaccinereactions’, according to a series of searches conducted by CNN Business from multiple different Instagram accounts starting two weeks ago.

Shortly afterwards, Instagram updated its search interface on mobile devices to show three credible results, including the CDC’s account, followed by a “See More Results” prompt. Users who click that option are then presented with a slew of anti-vaccination accounts, in what is arguably the digital equivalent of shoving a messy room’s junk under the bed.

Some of those accounts have garnered significant followings, which raises the question of whether Instagram, which suggested them as top results for users simply looking for vaccine information, helped them grow their audiences. The “cv19 vaccine reactions” account, which is dedicated to documenting claims about vaccine side effects, has more than 77,000 followers. The account often shares baseless reports and insinuates unproven links between people receiving the Covid-19 vaccine and major health events, including stroke or miscarriage.

The fact that some of this anti-vaxx content remains in plain sight on the platforms highlights a controversial distinction in Facebook’s approach: a company spokesperson says Facebook specifically differentiates between misinformation about vaccines, which it fights hard against, and posts that express a more general anti-vaccine sentiment, which it allows on the platform.

In December, Facebook said it would remove claims about coronavirus vaccines debunked by public health officials, including baseless conspiracy theories that they contain microchips. Previously, Facebook’s policy banned misinformation about Covid-19 that “adds to the risk of imminent violence or physical harm.”

Public health experts have said they fear that misinformation about Covid-19 vaccines, and anti-vaccine content on social media more generally, could cause people to refuse the vaccine when they have the opportunity to get it. “If they are put off by falsehoods perpetuated through social media, we will have a real problem getting out of this pandemic,” said Dr. LJ Tan, chief strategy officer of the Immunization Action Coalition (IAC).

Joe Osborne, a Facebook spokesperson, said the company has been working to “reduce the number of people who see false information” about vaccines and that it is trying “to do more to address other misleading vaccine content” that falls outside of this policy.

Osborne added that the company removes claims about the Covid-19 vaccine that have been debunked by public health experts, and that it adds labels to, and reduces the spread of, other vaccine misinformation that its third-party fact-checking partners have determined to be false.

When a measles outbreak engulfed the US nearly two years ago, Facebook pledged to combat vaccine misinformation by limiting the reach of such content on its platforms, but stopped short of banning it outright. In March 2019, Facebook said it would “reduce the rankings of groups and pages spreading misinformation about vaccinations” by not including them in recommendations or predictions when users type in the search bar. But two months later, CNN Business found that Instagram was still surfacing posts from anti-vaccination accounts, and anti-vaccination hashtags, to anyone searching for the word “vaccines.”

While Facebook removed a large private group dedicated to anti-vaccine content in November 2020, CNN Business found that more than 20 anti-vaxxer groups remain on the platform, with memberships ranging from a few hundred to tens of thousands of users. (The company said the group it removed in November was flagged for violating its recidivism policy, which prevents group administrators from creating another group similar to one the company has already removed, as well as its policy against the QAnon conspiracy theory.)

In a search for the word ‘vaccine’ in Facebook’s Groups feature last week, three of the top 20 results surfaced by the platform led to groups promoting anti-vaccine content, including groups called “Say No Covid 19 Vaccine,” “COVID-19 Vaccine Injury Stories” and “Vaccine Talk: A Forum for Pro and Anti Vaxxers,” with over 50,000 members.

The list fluctuates. A few days later, none of these groups appeared in the top 20, though results 18 through 20 pointed to groups discussing vaccines or their side effects. Scrolling further down, it was easy to find other anti-vaxxer groups in the search results, including one titled “Unvaccinated and Thriving,” whose description makes broadly and consistently debunked claims linking vaccines to autism and other conditions and diseases.

It is unclear what drives Facebook’s search recommendations or why the results change from day to day. Facebook did not provide a clear explanation despite repeated requests for comment.

Dr. Wafaa El-Sadr, a professor of epidemiology and medicine at Columbia University’s Mailman School of Public Health, called misinformation about vaccines on social media “very dangerous” and said it could have “serious consequences.”

“We are in a race with the virus,” she said. “We must have everyone who is eligible for the vaccines vaccinated as soon as possible.”

A public Facebook group, which has more than 58,000 members, is devoted to reports of alleged “vaccine injuries and reactions.” Several recent posts on the group’s page contain links that have been marked as “false information” by Facebook’s independent fact-checkers or labeled “Missing Context. Independent fact-checkers say this information can mislead people.” One shared link, labeled false by independent fact-checkers, claimed that 53 people in Gibraltar died as a result of the Covid-19 vaccine. Despite the warning labels, members of the group continue to engage with these links, voice their doubts about Facebook’s fact-checkers, and share baseless stories or theories about vaccines being dangerous.

“A story doesn’t have to be accurate to change your mind. We’re fighting that now,” said Tan of IAC. “In the age of the Internet, science is not the most compelling story.”

Columbia’s El-Sadr warned people to be wary of anecdotes or individual stories they read in such Facebook groups, which may or may not be true and may or may not have any link to the vaccine.

“The vast majority of people have had completely uneventful vaccinations so far,” she said. “We have to keep reminding people of this. These vaccines have had a very safe profile and are incredibly effective.”
