By Elizabeth Dwoskin
As the coronavirus pandemic has raged across the United States, misinformation about vaccines and other health topics has been viewed an estimated 3.8 billion times on Facebook - four times as many views as authoritative content from institutions such as the World Health Organization and the Centers for Disease Control and Prevention, according to a study by the left-leaning global human rights group Avaaz.
The group also found that Facebook pages promulgating misleading health information drew even more traffic during the pandemic than at other times - reaching a one-year peak in April - despite Facebook's policy of removing dangerous coronavirus-related misinformation and reducing the spread of other questionable health claims. In addition, the group found, articles flagged as misleading by Facebook's own network of independent third-party fact-checkers were inconsistently labeled: 84 percent of the posts in Avaaz's sample carried no warning label from fact-checkers.
The report, which analyzed data drawn from Facebook's own metrics tools, adds fuel to critics' arguments that major technology companies cannot control the spread of harmful misinformation on their platforms, and in many cases amplify it.
"In the midst of a global health crisis and presidential election cycle, this report is useful because it adds to a growing list of evidence showing how the majority of problematic content is missed by tech companies' moderation systems, and therefore further amplified" by their algorithms, said Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism at Columbia University, whose research has also used Facebook's reporting metrics tool, called CrowdTangle.
The coronavirus crisis was supposed to showcase Facebook's most robust efforts to prevent harm on its platform. This spring, Facebook chief executive Mark Zuckerberg embarked on a high-profile push to provide better health information to Facebook's 3 billion users, many of whom rely on the platform for the majority of their news about the pandemic.
Zuckerberg launched a banner atop the Facebook app directing users to content from authoritative sources and introduced a new policy of removing harmful misinformation related to the coronavirus, such as the false claim that drinking bleach can kill the virus. For other types of health misinformation, including false claims from people opposed to vaccinations, Facebook does not delete the content but attempts to limit its spread by showing it to fewer people.
And yet, throughout the pandemic, Facebook's systems have failed to catch viral misinformation. For example, a conspiracy video called "Plandemic," which claimed that wearing a mask can cause people to develop covid-19, the disease caused by the coronavirus, was shared millions of times before it was removed.
The Avaaz study adds to anecdotal evidence that the company is falling short.
For example, the group pointed to an article that falsely claimed that the American Medical Association was "encouraging" doctors to overcount deaths from covid-19. The article was fact-checked by two independent fact-checking groups, which found it to be misleading. It received more than 6 million likes and comments, and 160 million estimated views, according to Avaaz.
"We share Avaaz's goal of limiting misinformation, but their findings don't reflect the steps we've taken to keep it from spreading on our services," said Facebook spokesman Andy Stone. "Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of covid-19 misinformation and removed 7 million pieces of content that could lead to imminent harm. We've directed over 2 billion people to resources from health authorities and when someone tries to share a link about covid-19, we show them a pop-up to connect them with credible health information."
A growing body of research has found that online misinformation about health influences people's beliefs and behavior. A recent study by King's College London of 2,000 people in the United Kingdom found that roughly a third thought the coronavirus was cooked up in a lab and that authorities were hiding the true death toll from the virus. Thirteen percent of respondents believed the pandemic was part of a global effort to force people to be vaccinated. People who held these beliefs were more likely to get their news from social media and to violate lockdown rules, the study found.
More than a third of Americans say they won't get a coronavirus vaccine when one is developed, according to a recent Gallup poll.
To conduct its study, Avaaz first assembled a sample of 82 websites that fact-checkers had identified as sources of health misinformation, about both the coronavirus and other topics. The group then tracked how problematic articles from those sites were shared across Facebook.
One big finding, which tracks with other research, is that certain Facebook pages act as "super spreaders" of viral misinformation - repeat offenders responsible for a large share of problematic content. The 42 pages that Avaaz identified as super spreaders collectively have 28 million followers, and their content generated an estimated 800 million views. Many of the spreaders are groups with a long history of opposing vaccination.
The Avaaz researchers were constrained by the data that Facebook makes available, a challenge confronting all outside researchers who use Facebook's metrics tools. Publishers and other interested parties can purchase access to CrowdTangle to see how their pages, and specific pieces of content on those pages, performed in terms of clicks, likes, comments and shares.
Likes, comments and shares are considered engagement metrics. But one basic metric that CrowdTangle does not publish, despite pleas from researchers and publishers, is the number of times a post was merely viewed - that is, seen by users who did not comment on it or click the like button.
Avaaz extrapolated the number of views from the engagement metrics, calculating that a Facebook post draws roughly 29.7 times as many views as interactions.
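By that estimate, for example, a post with 100,000 likes, comments and shares would be credited with roughly 2.97 million views.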
The Washington Post