
Fundamental Science: The anti-vaxxer who cried wolf

by Olavo Amaral

The fable tells of a young shepherd who, feeling lonely in the countryside, decided to cry “wolf” to attract the villagers’ attention and gain some company. The peasants came running the first few times, but soon tired of the prank and stopped responding. The day the wolf actually appeared, the cries went unheeded and the sheep – as well as the shepherd, in some versions – ended up devoured.

In December 2021, the editors of The BMJ (British Medical Journal), one of the world’s oldest and most respected medical journals, published an open letter to Mark Zuckerberg. In it, the editors complained that Facebook’s fact-checking system was flagging as “missing context” shares of a story the journal had published about irregularities at sites running a clinical trial of Pfizer’s vaccine in the US. The company stood by its label, arguing that the article was being shared alongside false claims, that the problems affected only three of the study’s 153 sites, and that the allegations came from a source who had interacted with anti-vaccine activists on Twitter.

Even those who do not follow the vaccination debate have probably received some alarming news in a family or building WhatsApp group. Vaccines against Covid-19 will turn us into genetically modified organisms and allow human beings to be patented. Or perhaps they will implant microchips that make us controllable by 5G antennas. Or simply cause a wave of hospitalizations that does more harm than the disease itself.

None of this information is true, which has led to growing pressure on social media platforms to be more proactive in removing misinformation. But with billions of posts shared on Facebook and nearly a million hours of video uploaded to YouTube every day, checking content is a task far beyond the capacity of any human being – or even an army of them.

Not surprisingly, much of the work has been delegated to algorithms – not just on vaccines but on numerous issues related to Covid-19. YouTube’s policy pages list dozens of categories of content not allowed on the platform, ranging from claims about the risks of vaccines or masks to assertions that drugs such as hydroxychloroquine and ivermectin are effective or safe treatments for the disease.

These policies have been enforced across platforms – and have caught not just random cranks but profiles with scientific credentials. Twitter recently suspended the account of American immunologist Robert Malone, who calls himself the inventor of mRNA vaccines (a somewhat exaggerated title, although Malone has made important contributions to the field), after he claimed that Pfizer’s vaccine caused more illness than it prevented. It was not Malone’s first controversial statement; he has been gaining popularity with the anti-vaccination movement for months.

Who watches the watchmen?

The decision to bar an immunologist from talking about vaccines, however, has raised questions about the social network’s authority to arbitrate scientific topics. Malone is smart enough not to mention microchips, and his statements focused on the balance between the risks and benefits of vaccines. From what information survives, his arguments do indeed seem off base, with claims of neurological and reproductive harm that lack empirical support. Even so, it is hard to judge the merits of his suspension, since his tweets disappeared along with his profile, and Twitter did not bother to explain its reasons.

Algorithmic censorship applies not only to people but also to specific pieces of content. A debate between scientists for and against the use of ivermectin in Covid-19 was recently removed from YouTube and later restored by the platform, which suggests the takedown was the work of an algorithm designed to flag content containing the word “ivermectin”. It is nothing new, incidentally, for early-treatment activists to refer to the drug by spellings such as “ywermmeqtynah” or simply “i” in order to evade the machines.
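To make the fragility of this kind of filtering concrete, here is a minimal, purely hypothetical sketch of a keyword-based filter – not any platform’s actual system, with an invented blocklist and function name – showing how trivially altered spellings slip past it.

```python
# Purely illustrative sketch of a naive keyword filter; not how YouTube
# or any other platform actually works. Blocklist and names are invented.
import re

BLOCKLIST = {"ivermectin"}  # hypothetical set of flagged terms

def should_flag(post: str) -> bool:
    """Flag a post if any blocklisted word appears as a whole token."""
    tokens = re.findall(r"[a-z]+", post.lower())
    return any(token in BLOCKLIST for token in tokens)

print(should_flag("New preprint on ivermectin dosing"))     # True
print(should_flag("New preprint on ywermmeqtynah dosing"))  # False: the misspelling evades the filter
print(should_flag("Ask your doctor about 'i'"))             # False as well
```

Exact-token matching is precisely what the obfuscated spellings exploit; more elaborate systems rely on fuzzy matching or machine-learned classifiers, which bring their own errors in both directions.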

The platforms’ filtering even extends to scientific articles. Posts linking to studies published in scientific journals – not necessarily prestigious ones, but in theory peer-reviewed – have been removed from social networks on the grounds that they promote unapproved treatments. It is true that there is a huge universe of low-quality studies on Covid-19 (or on anything else); but if academic science itself cannot perform effective quality control, it seems unlikely that delegating the task to social media algorithms will solve the problem.

Criticism of vaccines obviously carries potential public health risks. More than that, it is well documented that a useful technique for undermining scientific consensus is to create the impression that a genuine debate exists when the scientific community has already settled the question, as in the cases of smoking or global warming. It is far from obvious, however, how to tell when legitimate debate gives way to manufactured doubt – and if that is hard for scientists, it is only to be expected that algorithms and fact-checkers will sometimes get it wrong.

Vaccines, for example, have real side effects, even if serious ones are rare. So far, it seems undeniable that the benefits of vaccination against Covid-19 have outweighed the risks – which makes questioning their effectiveness, as in a recent technical note from Brazil’s Ministry of Health, border on flat-earthism. When it comes to issues such as the vaccination of children, however, where the data are scarcer and the potential benefits smaller and harder to estimate, treating the debate as closed seems reckless, not least because new evidence may yet change the recommendations.

The anti-vaccination movement clearly knows how to exploit these doubts, and it is no accident that questioning childhood vaccination has become a banner of Brazil’s federal government, eager to please its base. Turning the issue into an information war against Bolsonaro and his cronies, however, can backfire, strafing reasonable opinions and discarding evidence before we have even had the chance to discuss it. And amid the wolf-cries of the anti-vaxxers, the chance of drowning out warnings about real dangers grows ever greater.

The Joe Rogan Effect

The issue has no easy solutions. On the one hand, most of the content removed from the networks probably does deserve to be deleted. On the other, it is debatable whether algorithms or platform guidelines are competent enough not to throw the baby out with the bathwater. Beyond the technical ability to identify relevant opinions, there are questions about the scope of the platforms’ power. As figures such as journalist Glenn Greenwald have repeatedly argued, it seems risky, to say the least, to delegate the job of deciding what may be said – including in scientific debates – to the most powerful corporations in human history.

And even if the corporations were entirely well-meaning and the algorithms perfect, the question of whether censorship works as a strategy would remain. Days after his suspension from Twitter, Robert Malone was interviewed on The Joe Rogan Experience, Spotify’s most popular podcast, whose average audience is estimated at 11 million people per episode. In his conversation with the comedian and UFC commentator, Malone repeated his claims about vaccine risks, described the situation created by social media as “global totalitarianism” and a “mass psychosis” comparable to that of Nazi Germany, and gained more attention than ever.

The interview was removed by YouTube but remains available on Spotify. A campaign is now under way to remove it from there too, and it has won the support of artists such as Neil Young and Joni Mitchell, who pulled their music from the streaming platform. But after being seen and heard by millions of people, the interview is unlikely to disappear – not least because a Republican congressman has already entered it into the US Congressional Record as a way of “fighting Big Tech censorship”. Since then, Malone has also gained space on the conservative Fox News television network, at protests against vaccine mandates, and in WhatsApp groups around the world. And it is not unlikely that attempts at cancellation will only bring more spotlight, through the phenomenon commonly known as the “Streisand effect”.

Beyond attracting attention, the effort to censor disinformation ends up handing conspiracy theorists, on a silver platter, evidence of collusion to suppress the truth, thereby reinforcing their worldview. And even for people with reasonable doubts about vaccines, censorship and a lack of transparency can deepen those doubts by eroding trust in institutions. Executing enemies of the regime creates martyrs – and even the most foolish dictatorships know that sometimes the best option is to let them speak.

Going beyond censorship

For these and other reasons, the British Royal Society, one of the oldest scientific societies in the world, has just recommended, in a report on the online information environment, that governments and social media platforms not rely on content removal as the solution to the problem of scientific disinformation. The document also proposes that the scientific community support fact-checking initiatives, to help identify and build consensus in areas where it is not yet evident.

Even the Royal Society concedes that freedom of expression is not an absolute value and that there may be valid limits on the circulation of some information. Beyond the obvious cases, however, it seems best not to leave the arbitration of scientific truth to social media. That means letting a lot of people talk without being right, but it seems necessary if we are not to suppress valid uncertainties or silence dissenters from the consensus who have something relevant to say. Treating complex issues with a wartime mentality, after all, ends up producing civilian casualties – and you never know when friendly fire may turn against you.

*

Olavo Amaral is a professor at the Institute of Medical Biochemistry Leopoldo de Meis at UFRJ and coordinator of the Brazilian Reproducibility Initiative.

Subscribe to the Serrapilheira newsletter to keep up with more news from the institute and from the Ciência Fundamental blog.

Source: Folha
