My Experience Having Scientific Discussion Censored By Facebook
March 16, 2021
By Lisa Petrison, Ph.D.
In addition to running the Paradigm Change websites, I am the founder and administrator of a group called Mold Avoiders on Facebook.
The group was founded in 2015 and exists to allow those who are pursuing mold avoidance as described in the book A Beginner’s Guide to Mold Avoidance to discuss whatever topics seem particularly relevant to them.
Most of the people in the group have a history of ME/CFS, which has been acknowledged to be closely related to – or perhaps the same thing as – the phenomenon that has become known as “Long Covid.”
In addition, a high percentage of the group members have reported experiencing serious bouts of either formally diagnosed Covid or a Covid-like illness and having difficulty fully recovering from it.
Issues related to Covid are thus of particular relevance to Mold Avoiders group members, and group posts on the topic generate a considerable amount of interest.
Medical journal papers are frequently discussed in the group, which includes a substantial number of Ph.D.’s, M.D.’s and other people with science backgrounds.
On December 10, 2020, a Mold Avoiders group member posted a link to a peer-reviewed article in the medical journal Antiviral Research, published in June 2020.
The article is indexed on PubMed (PMID: 32251768).
The title of the article is “The FDA-approved drug ivermectin inhibits the replication of SARS-CoV-2 in vitro.” Here is the abstract:
Although several clinical trials are now underway to test possible therapies, the worldwide response to the COVID-19 outbreak has been largely limited to monitoring/containment. We report here that Ivermectin, an FDA-approved anti-parasitic previously shown to have broad-spectrum anti-viral activity in vitro, is an inhibitor of the causative virus (SARS-CoV-2), with a single addition to Vero-hSLAM cells 2 h post infection with SARS-CoV-2 able to effect ~5000-fold reduction in viral RNA at 48 h. Ivermectin therefore warrants further investigation for possible benefits in humans.
As an introduction to the topic, the Mold Avoiders member – who posts in the group via her public Facebook page “Chemical Free Gal” – wrote the following:
Does anyone find it interesting that an Anti-parasite drug is being used successfully to treat the current virus?
I was reading some articles from Italy and a few other countries and they all recommend using Ivermectin as a prophylactic agent and to treat infected patients.
“Ivermectin is an inhibitor of the COVID-19 causative virus (SARS-CoV-2) in vitro. A single treatment able to effect ~5000-fold reduction in virus at 48 h in cell culture. Ivermectin is FDA-approved for parasitic infections, and therefore has a potential for repurposing. Ivermectin is widely available, due to its inclusion on the WHO model list of essential medicines.
https://www.sciencedirect.com/science/article/pii/S0166354220302011?
I believed that this was an interesting article with potential relevance to the general understanding of the etiology of the disease, as well as to possible future treatment strategies.
I therefore approved the post and kept a close eye on the thread.
To my recollection, the discussion (which has been deleted by Facebook) centered solely on why Ivermectin might be helpful for the illness and on the methodology described in the paper.
I did not observe any group member make a suggestion that people contact their doctor to try to get the drug prescribed off-label or attempt to obtain the drug illegally for personal use.
It was solely a discussion of the science.
On December 15, 2020, Facebook sent me a notification that they had removed the thread and had given me an “Administrator Violation” for having approved it to begin with.
The notification read:
Group Quality
We want to help you keep your group safe. Group Quality lets you know if content in your group goes against Facebook policies, and what you can do about it.
WARNINGS
Admin violations
These are violations that were posted or approved by an admin or moderator. If more admin violations occur, we may disable your group.
You approved one piece of content that Facebook later removed for violating Community Standards.
A link was then supplied to a separate page showing a picture of the removed post. The introduction stated:
Violations Lisa Petrison Approved
Chemical Free Gal’s post goes against our Community Standards on misinformation that could cause physical harm.
I immediately wrote the following note to Facebook through the “Groups Support” section:
Hi. I received an administrator violation for approving a post that FB thought could cause harm (see attachment). The post was a link to a Pub Med indexed article and it was just a general discussion about the topic, without any suggestion that people try to acquire the drug prior to it being approved. FB says that if this happens again, it will close the group. How can I find out what kinds of posts are and are not acceptable to you, since this one was just a discussion about a general peer-reviewed article indexed in Pub Med? Thanks!
Following is the response that I received:
Moreover, I would really appreciate it if you could take part in our short customer experience survey. Your feedback would be important to us as we strive to provide the best customer experience to our users.
Best regards,
Ivo
I wrote back:
Hi Ivo,
I don’t mind that the thread was deleted. That is fine.
What concerns me is that FB is saying that I committed an “Administrator Violation” for having approved the post and that if something like that happens again, the group will be shut down.
I still do not think that there was anything wrong with the post because it was just a general discussion of the science of peer-reviewed articles that had been listed on Pub Med.
The original post said nothing about the idea that people should try to get the drug and take it themselves. It was just a discussion of the science, which I feel should be allowed.
And I did not see any comments that suggested that people might want to get the drug and take it either.
So I do not feel that this was a post that should be classified as having the potential of causing people harm. It was just a general science discussion.
Again, I don’t mind that you deleted it. But it is worrisome that you are saying that you may close my whole group because I approved something that I still do not think is in violation of FB’s own policies.
So I would like some further guidance on this.
Thanks very much for your help.
Best, Lisa Petrison
Following is the response that I received:
Hi, Lisa.
Thanks for contacting the Facebook Groups Admin Support Team. My name is Rui and I’ll be glad to help you today. We hope you are well.
I’ve read the case and I can see that you reported a situation related to content that was removed and the respective message that was sent regarding your Group being disabled.
As my colleague explained, all the decisions are subject to appeal and if the appeal is successful, then the Group will be cleared of those infractions.
That being said, these messages about your Group being potentially disabled are sent automatically and they do not necessarily correspond to the situation of your Group, so for now there is no need to worry.
Just keep managing your Group according to our Community Standards and all shall be fine.
We hope this clarifies the case and we wish you all the best.
If you have a different question, or encounter another issue, please create a new support request. Let us know how we did by completing the survey you receive.
Sincerely,
Rui
Facebook Groups Admin Support Analyst
I am not certain whether the Chemical Free Gal page protested the removal of the original post from the Mold Avoiders group.
My reading of the correspondence with the Facebook support staff suggested to me that there was nothing I could do either to get the post reinstated or to clear the Administrator Violation that had been levied against me.
I remained confused about how to avoid future Administrator Violations, and thus how to prevent my group from being removed from Facebook, since I did not believe that I had done anything inconsistent with Facebook Community Standards to begin with.
Since that time, I have checked the Group Quality section of the Mold Avoiders group numerous times. Occasionally the Admin Violation against me will seem to have been removed, but then the next time I check, it will be listed again.
On January 30, 2021, I found on the website OversightBoard.com a description of a case that sounded similar to what I had experienced. The article stated:
The Oversight Board has overturned Facebook’s decision to remove a post which it claimed, “contributes to the risk of imminent… physical harm.” The Board found Facebook’s misinformation and imminent harm rule (part of its Violence and Incitement Community Standard) to be inappropriately vague and recommended, among other things, that the company create a new Community Standard on health misinformation.
In October 2020, a user posted a video and accompanying text in French in a public Facebook group related to COVID-19. The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which refused to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorized and promoted remdesivir. The user criticized the lack of a health strategy in France and stated that “[Didier] Raoult’s cure” is being used elsewhere to save lives. The user’s post also questioned what society had to lose by allowing doctors to prescribe in an emergency a “harmless drug” when the first symptoms of COVID-19 appear.
In its referral to the Board, Facebook cited this case as an example of the challenges of addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.
Facebook removed the content for violating its misinformation and imminent harm rule, which is part of its Violence and Incitement Community Standard, finding the post contributed to the risk of imminent physical harm during a global pandemic. Facebook explained that it removed the post as it contained claims that a cure for COVID-19 exists. The company concluded that this could lead people to ignore health guidance or attempt to self-medicate.
The Board observed that, in this post, the user was opposing a governmental policy and aimed to change that policy. The combination of medicines that the post claims constitute a cure are not available without a prescription in France and the content does not encourage people to buy or take drugs without a prescription. Considering these and other contextual factors, the Board noted that Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards.
The Board also found that Facebook’s decision did not comply with international human rights standards on limiting freedom of expression. Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content.
The Board also found Facebook’s misinformation and imminent harm rule, which this post is said to have violated, to be inappropriately vague and inconsistent with international human rights standards. A patchwork of policies found on different parts of Facebook’s website make it difficult for users to understand what content is prohibited. Changes to Facebook’s COVID-19 policies announced in the company’s Newsroom have not always been reflected in its Community Standards, while some of these changes even appear to contradict them.
The Oversight Board overturns Facebook’s decision to remove the content and requires that the post be restored.
In a policy advisory statement, the Board recommends that Facebook:
* Create a new Community Standard on health misinformation, consolidating and clarifying the existing rules in one place. This should define key terms such as “misinformation.”
* Adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook’s threshold of imminent physical harm.
* Increase transparency around how it moderates health misinformation, including publishing a transparency report on how the Community Standards have been enforced during the COVID-19 pandemic. This recommendation draws upon the public comments the Board received.
I had not even known that there was such a thing as a Facebook Oversight Board before finding this website (and had not been notified of that fact by the Facebook support staff who responded to my queries).
I wanted to submit an appeal to the Oversight Board myself, but I found in a description of the Appeals Process that this could only be done within 15 days of receiving a response from Facebook staff.
On February 24, 2021, the Daily Mail (the largest-circulation newspaper in the UK) published an article about the potential use of Ivermectin for Covid, titled “Drug used to treat lice and scabies could cut Covid deaths by up to 75%, research suggests.”
On March 5, 2021, The Wall Street Journal ran an opinion article by the WSJ Editorial Board on Facebook’s censorship of scientific discussion, titled “Fact Checking Facebook’s Fact Checkers: The media giant is employing left-wing vetters to limit scientific debate.”
The article concluded:
Scientists often disagree over how to interpret evidence. Debate is how ideas are tested and arguments are refined. But Facebook’s fact checkers are presenting their opinions as fact and seeking to silence other scientists whose views challenge their own.
We’ve been leery of proposals in Congress to modify Section 230 protections that shield internet platforms from liability. But social-media giants are increasingly adding phony fact checks and removing articles flagged by left-leaning users without explanation. In short, they are acting like publishers in vetting and stigmatizing the content of reputable publishers. The legal privileges that enable these companies to dominate public discourse need to be debated and perhaps revised.
On March 9, 2021, Children’s Health Defense (an organization led by attorney Robert F. Kennedy Jr.) published an article headlined: “Facebook Censors Scientific Debate: As the social media giant continues to vet content and silence opposing views, roughly two dozen states have introduced bills that would allow for lawsuits against the platform for censoring posts.”
One would think that after all this, Facebook might have reviewed its policies with regard to censoring discussions of scientific findings and made some alterations to its algorithms.
However, I have continued to hear from people who as recently as last week have been punished by Facebook in various ways for discussing science-based issues related to Covid in a factual and responsible way.
This is especially concerning to me personally, since I continue to operate in a climate of fear with regard to discussing anything at all in my group.
This type of censorship of sober scientific discussion runs contrary to everything that I, as a trained scientist, believe science is about.
I thus felt it necessary to share the details of my own experience publicly, so that others who may be affected by or concerned about this development will be aware of it.
About The Author
Lisa Petrison is the founder of Paradigm Change and Mold Avoiders.
She holds a Ph.D. in marketing and social psychology from the Kellogg School of Management at Northwestern University.