Facebook’s fact-checking assertions and policies have been called into question after the British Medical Journal (BMJ), one of the world’s most prestigious medical publications, was labelled by the social media network as a spreader of misinformation.

Background

On 2 November 2021, the BMJ published an investigative article, based on a whistle-blower report, which raised data integrity and regulatory oversight concerns surrounding Pfizer’s pivotal phase III Covid-19 vaccine trial.

The article included material revealing a host of poor clinical trial research practices which, The BMJ argued, could affect data integrity and, of course, patient safety.

As one would expect, the article was subjected to external peer review and to The BMJ’s usual high level of editorial oversight.

The censorship

Soon after publication, readers began reporting a variety of issues when trying to share the article. Some were unable to share it at all, whilst many others reported having their posts flagged with a warning reading “Missing context … Independent fact-checkers say this information could mislead people.”

Administrators of groups in which the article was shared received messages from Facebook informing them that such posts were “partly false.”

Others were informed by Facebook that people who repeatedly share “false information” might have their posts moved lower in Facebook’s News Feed.

Readers were also directed to a “fact check” article produced by a Facebook contractor named Lead Stories.

Facebook’s fact-checking policy says that accounts which repeatedly share misinformation will face restrictions, including having the distribution of their posts reduced.

In effect, the policy means that responsible and authoritative sources must avoid resharing information that Facebook has flagged as misinformation if they wish to keep their accounts free of restrictions.

Facebook explains what the “missing context” label means and even gives examples:

“‘Missing Context’ is designed for content that may mislead without additional context. For example, cropping a video clip to take out certain words, changing it from ‘I would support this candidate if…’ to ‘I support this candidate.’ Or claiming that funding for a government program has been ‘zeroed out’ when its funding has been dramatically reduced but not eliminated.”

The Open Letter

The BMJ responded with fury, posting an open letter to Mark Zuckerberg. In the letter, they claimed that the fact-checking performed by Lead Stories was inaccurate, incompetent and irresponsible.

They pointed out that, other than labelling the article as misinformation, Lead Stories failed to provide any assertions of fact or any explanation of what the BMJ article got wrong.

The BMJ said that the title of the fact-checking statement, “Fact Check: The British Medical Journal Did NOT Reveal Disqualifying And Ignored Reports Of Flaws In Pfizer COVID-19 Vaccine Trials”, did not make any sense, and that the first paragraph inaccurately described the BMJ as a “news blog” (which presumably was either intended to belittle the BMJ as a credible source or perhaps just showed a complete lack of understanding by Lead Stories of what the BMJ is).

Whilst Lead Stories’ fact-checking article contained a screenshot of the BMJ article with a stamp over it stating “Flaws Reviewed,” the letter pointed out that Lead Stories did not identify even a single falsity or untruth in the BMJ article.

The BMJ concluded their open letter with the following words of advice to Facebook:

“Rather than investing a proportion of Meta’s substantial profits to help ensure the accuracy of medical information shared through social media, you have apparently delegated responsibility to people incompetent in carrying out this crucial task. Fact checking has been a staple of good journalism for decades. What has happened in this instance should be of concern to anyone who values and relies on sources such as The BMJ.”

It’s not about the facts but rather about what people make of the facts

Following the open letter, Facebook removed the label from the article and accepted that it “probably” should not have been flagged.

Lead Stories replied with an unrepentant statement on its own website. It justified its claims about The BMJ’s article on the basis that the article’s headline, “Covid-19: Researcher blows the whistle on data integrity issues in Pfizer’s vaccine trial”, was being used by anti-vaccine activists as “proof” that the entire clinical trial was fraudulent and the vaccine unsafe.

Lead Stories also explained that the censorship was justified because, at around the same time as the BMJ article was published and unbeknown to the BMJ, a spike of fake news was going viral on social media claiming that the CEO of Pfizer had been arrested for fraud.

By its own admission, then, the decision to censor the BMJ article was made not because the article contained misleading claims or because it presented facts in an unfair or unbalanced manner, but rather for the dubious reason that members of the public might misunderstand the article as supporting anti-vaccination claims. In other words, Facebook does not believe that internet users are smart enough to understand an article by The BMJ, so instead it labelled the article as misinformation even though, admittedly, the article contained no misinformation at all.

Are fact-checking statements themselves facts?

Clearly not. According to Facebook, assertions of false information, misinformation or disinformation made following fact-checking are not facts. They are only opinions, even though they are stated as facts and carry with them various sanctions.

Recently, journalist John Stossel experienced similar treatment to that of The BMJ. He made a video arguing that California’s wildfires were mostly caused by poor government management. Facebook censored it as “misleading” and linked to another Facebook fact-checking organisation, Science Feedback, which put the following sentence in quotation marks, as if it were something Mr Stossel had said in his video: “Forest fires are caused by poor management. Not by climate change.”

The problem was that he never said those words in his video. After abortive attempts to resolve the issue with Facebook, Stossel decided to sue the company for defamation.

In its defence, Facebook stated that the labels it affixes to Facebook posts following recommendations by its fact-checkers “…themselves are neither false nor defamatory; to the contrary, they constitute protected opinion.”

Unfortunately, this is not the way Facebook represents those “opinions” to internet users. To users, fact-checks are presented as statements of fact, framed so as to contradict another statement of fact.

On its website, Facebook explains that: “Each time a fact-checker rates a piece of content as false, Facebook significantly reduces the content’s distribution … We … apply a warning label that links to the fact-checker’s article, disproving the claim.”

Whilst in both the BMJ case and the John Stossel case Facebook did not even attempt to disprove facts, its claim that its fact-checking labels constitute protected opinion places it in a position similar to that of a journalist.

This makes it difficult to see how Facebook can, at the same time, claim to be a neutral platform entitled to the defence of Section 230 of the Communications Decency Act, which generally provides immunity for website platforms with respect to third-party content.

This post was originally published on the Social Media Law Blog and is reproduced with permission and thanks.