LinkedIn conducted social experiments with 20 million users over five years

LinkedIn has conducted experiments with more than 20 million users over five years, which, while intended to improve the platform for members, may have affected the job opportunities available to some users, according to a new study.

In experiments conducted around the world from 2015 to 2019, LinkedIn randomly varied the proportion of weak to strong contacts suggested by its “People You May Know” algorithm, the company’s automated system for recommending new connections to its users. The tests were detailed in a study published this month in the journal Science, co-authored by researchers from LinkedIn, the Massachusetts Institute of Technology, Stanford University, and Harvard Business School.

LinkedIn’s algorithmic experiments may come as a surprise to millions of people, because the company did not inform users that the tests were being conducted.

Tech giants like LinkedIn, the world’s largest professional network, routinely run large-scale experiments in which they test different versions of app features, web designs, and algorithms on different groups of people. The long-standing practice, called A/B testing, aims to improve the consumer experience and keep users engaged, which helps companies make money through premium membership fees or advertising. Users often have no idea that companies are running the tests on them.
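The mechanics of A/B testing are simple in principle: each user is deterministically assigned to one arm of an experiment, and outcomes are compared across arms. The sketch below is a hypothetical illustration of that bucketing step, not LinkedIn’s actual code; the function and experiment names are invented.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one arm of an experiment.

    Hashing the (experiment, user) pair gives every user a stable,
    pseudo-random assignment without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same arm of a given experiment,
# so their experience stays consistent for the test's duration.
arm = assign_variant("user-42", "pymk-weak-ties", ["control", "more_weak_ties"])
```

Because the hash is keyed on the experiment name as well as the user, a user’s arm in one test is independent of their arm in any other.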

But the changes made by LinkedIn hint at how tweaks to widely used algorithms can become social engineering experiments with life-altering consequences. Experts who study the social effects of computing said that conducting long, large-scale experiments that could affect people’s job prospects raises questions about industry transparency and research oversight.

“The findings suggest that some users had better access to job opportunities or a significant difference in accessing those opportunities,” said Michael Zimmer, associate professor of computer science and director of the Data, Ethics and Society Center at Marquette University in Milwaukee, Wisconsin. “These are the kinds of long-term consequences that need to be considered when thinking about the ethics of engaging in this kind of ‘big data’ research.”

The Science study tested an influential theory in sociology called “the strength of weak ties,” which holds that people are more likely to get jobs and other opportunities through casual acquaintances than through close friends.

The researchers looked at how LinkedIn’s algorithmic changes affected users’ job mobility. They found that relatively weak social ties on LinkedIn were twice as effective as stronger ties in helping users land jobs.

In a statement, LinkedIn said that during the study, it “acted consistently” with the usage agreement, the company’s privacy policy and member settings. The privacy policy notes that LinkedIn uses users’ personal data for research purposes. The statement added that the company used the latest “non-invasive” social science techniques to answer important research questions, “without any experimentation on the members.”

Microsoft-owned LinkedIn did not directly respond to a question about how the company weighed the possible long-term consequences of its experiments on users’ employment and economic status. But the company said the study did not disproportionately benefit some users.

The purpose of the research was “to help people at scale,” said Karthik Rajkumar, an applied research scientist at LinkedIn and one of the study’s authors. “No one has been put at a disadvantage in finding a job.”

Sinan Aral, an MIT professor of management and data science who was the study’s lead author, said the LinkedIn experiments were an effort to ensure that users had equal access to employment opportunities.

“Doing an experiment on 20 million people and then releasing a better algorithm for everyone’s job prospects as a result of the knowledge you learn from that is what they’re trying to do,” Aral said, “instead of anointing a few people to have social mobility and others don’t.”

Experiments with users of large internet companies have a troubled history. Eight years ago, a Facebook study was published describing how the social network silently manipulated which posts appeared in users’ news feeds to analyze the spread of negative and positive emotions on its platform. The week-long experiment, conducted with 689,000 users, quickly generated a backlash.

LinkedIn’s professional networking experiments were different in intent, scope, and scale. They were designed by LinkedIn as part of the company’s ongoing effort to improve the relevance of its “People You May Know” algorithm, which suggests new connections to members.

The algorithm analyzes data such as members’ employment history, job titles, and ties to other users. It then estimates the likelihood that a LinkedIn member will send an invitation to a suggested new connection, as well as the likelihood that the new connection will accept it.
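A recommender of this kind can rank candidates by the product of those two probabilities: a suggestion is only useful if the invitation is both sent and accepted. The snippet below is a minimal sketch of that ranking idea, assuming the two probabilities are already estimated upstream; the names and numbers are illustrative, not from the study.

```python
def connection_score(p_invite: float, p_accept: float) -> float:
    """Rank a candidate connection by the probability that the
    suggestion leads to an actual new tie: the member sends an
    invitation AND the candidate accepts it."""
    return p_invite * p_accept

# Hypothetical candidates with (p_invite, p_accept) estimates.
candidates = {"alice": (0.30, 0.80), "bob": (0.50, 0.20)}
ranked = sorted(candidates, key=lambda c: connection_score(*candidates[c]), reverse=True)
# alice scores 0.24 and outranks bob at 0.10, even though an
# invitation to bob is more likely to be sent.
```

The design choice matters: scoring on the joint probability penalizes suggestions that members would click but that would likely be ignored on the other end.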

For the experiments, LinkedIn adjusted its algorithm to randomly vary the prevalence of strong and weak ties that the system recommended. The first wave of tests, conducted in 2015, “had more than 4 million experimental subjects,” the study said. The second wave, held in 2019, involved more than 16 million people.

During the tests, people who clicked on the “People You May Know” tool and viewed its recommendations were assigned to different algorithmic paths. Some of these “treatment variants,” as the study called them, caused LinkedIn users to form more connections with people with whom they had only weak social ties. Other adjustments caused people to form fewer connections with weak ties.

It is not known whether most LinkedIn members understand that they may be subject to experiments that could affect their job opportunities.

LinkedIn’s privacy policy says the company may “use personal data available to us” to research “workplace trends, such as job availability and skills needed for those jobs.” Its policy for outside researchers seeking to analyze company data clearly states that those researchers will not be able to “experiment or perform tests on our members.”

But none of the policies explicitly tell consumers that LinkedIn itself can experiment and test its members.

In a statement, LinkedIn said, “We are transparent with our members through our research section of our user agreement.”

In an editorial statement, Science said, “It was our understanding, and that of the reviewers, that the experiments performed by LinkedIn operated under the guidelines of their user agreements.”

After the first wave of algorithmic testing, researchers at LinkedIn and MIT hit on the idea of analyzing the results of those experiments to test the theory of the strength of weak ties. Although the decades-old theory had become a cornerstone of social science, it had never been rigorously proven in a large-scale prospective experiment that randomly assigned people to social connections of varying strength.

External researchers analyzed aggregated data from LinkedIn. The study reported that people who received more recommendations for moderately weak contacts generally applied for and accepted more jobs — results that fit the weak tie theory.

The 20 million users involved in the LinkedIn experiments created more than 2 billion new social connections and completed more than 70 million job applications that led to 600,000 new jobs, the study said. Weak tie connections proved more useful for job seekers in digital fields such as artificial intelligence, while strong ties proved more useful for jobs in industries that were less reliant on software, the study said.

LinkedIn said it has applied the findings about weak ties to a number of features, including a new tool that notifies members when a first- or second-degree connection is hiring. But the company made no study-related changes to its “People You May Know” feature.

MIT’s Aral said the study’s deeper significance was that it showed the importance of powerful social networking algorithms — not just in amplifying problems like disinformation, but also as fundamental drivers of economic conditions like employment and unemployment.

Catherine Flick, a senior researcher in computing and social responsibility at De Montfort University in Leicester, England, described the study as yet another corporate marketing exercise.

“The study has an inherent bias,” Flick said. “It shows that if you want to get more jobs, you should be more on LinkedIn.”

Translated by Luiz Roberto M. Gonçalves
