Facebook Papers: how the social network failed to act against racism

For at least two years, Meta (formerly Facebook) has discussed how to avoid the normalization of racism and improve the experience of black people on the social network, but the adoption of an effective policy accelerated only after the murder of George Floyd in the United States and the global rise of the #BlackLivesMatter movement.

Until November of this year, when the company announced that it would begin measuring racial data, structural changes to the platform's policy were hampered by the lack of data and of effective actions to support minorities, according to employee reports recorded in internal documents from 2020 and 2021.

The reports appear in the so-called Facebook Papers, internal documents sent to the US Securities and Exchange Commission (SEC) and provided to Congress in redacted form, with names hidden, by the lawyers of Frances Haugen, a former Meta employee.

Folha is part of the consortium of media outlets that had access to these papers and analyzed the documents on racial policy in partnership with Núcleo Jornalismo.

Black Americans are among the groups most engaged on the social network. In 2019, as shown by an internal company survey, the social network found that harmful viral videos, such as those depicting police brutality, affected even the offline lives of black users.

The study highlights that "police videos were a constant source of stress for black participants," which even changed their behavior off the network. Memphis police have used fake profiles to monitor black activists, the report notes.

The problem also involves algorithmic discrimination, a challenge for many technology companies.

In September, for example, some users of the network were asked whether they wanted to "continue to watch primate videos"; the label had been attached to a video of a black man. The artificial intelligence behind the recommendation system, initially programmed by people, associated characteristics of that image with the label.

“Our teams work every day to improve the experiences of marginalized communities that use Instagram and Facebook,” Meta said in a statement. The group emphasizes that it has assessed possible civil rights implications for new products, according to a recent report.

Both algorithmic discrimination, skewed by the worldview of those who program the systems, and the exposure of black people to low-quality content were recurring topics of discussion in recent years. However, the issue only became urgent after the anti-racist uprising in the United States, according to the released documents.

Days after Floyd's murder, an employee suggested tracking any "disproportionate impact" on black users, one report shows. From there, a series of discussions began on how to create a racial justice policy so that the company would not help perpetuate systemic racism.

One of the obstacles for Facebook, cited in a number of documents, was the lack of statistics on black users with which to build a policy. The social network does not intentionally collect data on users' race, which, according to employees, made measurement difficult.

"While presumably we don't have a policy designed to put minorities at a disadvantage, we definitely do have emerging policies, practices and behaviors that do," says an employee in a post in a group called Integrity Ideas to Combat Racial Injustice. The author's name was redacted by Haugen's legal counsel.

“We must comprehensively study how our decisions and how the mechanics of social media support and do not support minority communities,” he says.

Although machine learning systems can implicitly infer the race of many users, as one participant noted in the same document, overcoming the lack of direct measurement of black users became the objective the company pursued in order to create a policy.

“A more cynical view is that part of the reason we avoid measuring race is because we don’t want to know what our platform is really doing — particularly on Facebook. If you can’t measure it, you can’t act,” says one employee.

The creation of an internal "oversight board" was also considered, similar to the external board that rules on posts falling into the gray areas of the platform's usage and removal policies. It would be composed of employees from around the world and would act as an "official voice in small group deliberations."

In September 2020, a document titled "Allowing Measurement of Social Justice with US Zip Codes" was circulated internally, suggesting metrics for identifying black users without invading their privacy. It was the first practical measure related to algorithms.

The idea was to cross-reference US zip codes with US census data (the American equivalent of Brazil's IBGE) in order to measure how the company's products perform across race and ethnicity in the corresponding neighborhoods.
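In practice, this kind of measurement treats the census racial breakdown of a neighborhood as a probability estimate for the users who live there, and then weights product metrics by those probabilities instead of labeling any individual. The sketch below is a minimal illustration of that general idea, not Meta's actual method; the zip codes, census proportions and metric values are hypothetical placeholders.

```python
# Minimal sketch of zip-code-based racial measurement.
# All figures below are hypothetical; Meta's real method is not public in detail.

# Hypothetical census table: share of residents by race for each zip code.
CENSUS_BY_ZIP = {
    "30310": {"black": 0.72, "white": 0.18, "other": 0.10},
    "98052": {"black": 0.03, "white": 0.61, "other": 0.36},
}


def estimated_race_distribution(zip_code: str) -> dict:
    """Return the census racial breakdown of a zip code as probabilities.

    This is a group-level estimate only: it says nothing definitive about
    any individual resident, the caveat Paulo Rená raises below.
    """
    return CENSUS_BY_ZIP.get(zip_code, {})


def aggregate_metric_by_race(users: list[dict]) -> dict:
    """Aggregate a product metric (e.g. exposure to harmful content per
    session) by race, using probabilistic weights rather than per-user labels."""
    totals: dict[str, float] = {}
    weights: dict[str, float] = {}
    for user in users:
        dist = estimated_race_distribution(user["zip"])
        for race, p in dist.items():
            totals[race] = totals.get(race, 0.0) + p * user["metric"]
            weights[race] = weights.get(race, 0.0) + p
    # Weighted average of the metric per estimated racial group.
    return {race: totals[race] / weights[race] for race in totals if weights[race] > 0}


if __name__ == "__main__":
    # Made-up usage example: two users with a fictional "harmful content" metric.
    sample = [
        {"zip": "30310", "metric": 0.9},
        {"zip": "98052", "metric": 0.4},
    ]
    print(aggregate_metric_by_race(sample))
```

The relevant property of such an approach is that race is never stored for any individual user; only neighborhood-level probabilities enter the aggregate figures, which is the privacy argument the internal document makes.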

For Paulo Rená, an activist at Aqualtune Lab and a university professor, a mathematical inference based on a person's neighborhood, income and surname can be a welcome alternative, as long as it takes nuances into account.

"We have to be careful with these inferences, because if I have an average saying that 90% of the people in a group are one way, that does not determine that the person I am analyzing falls within that 90% and not the 10%," he says.

Almost a year later, in November of this year, Meta announced that it would finally cross-reference this data and use it statistically on the platform, a step toward a racial policy. The company says it consulted more than two dozen privacy and civil rights experts to define a method that is non-invasive and does not identify individuals.

For Indian researcher Ramesh Srinivasan, a Harvard PhD and founder of the Digital Culture Laboratory at UCLA, in California, an effective solution is to share the platform's moderation power.

He says Facebook cannot behave like a technological black box.

“It doesn’t need to publish its source code, but it does need to cede the power of content moderation to many different organizations that have the expertise to understand racial issues, gender issues, trans queer issues as well as geographic issues,” he suggests.

OTHER SIDE

When contacted, Meta said it "is constantly evolving based on the issues we encounter on our platforms and the feedback we get from the community." The company says it released a report in November detailing its progress on a civil rights audit carried out in 2020.

"The report highlights our work to develop a review process for product teams, in which we assess the potential civil rights implications of new products. A key part of that effort is working closely with civil rights experts. It is essential that all communities have positive and safe experiences on our platforms, and we will continue working to achieve this goal," the company says.
