The team tracked how often the robot chose each gender and race, and found that it often acted out disturbing stereotypes (Photo: Unsplash).

After experiments revealed robots making alarming decisions, researchers warned that we risk creating a generation of “racist and sexist robots”.

A team of researchers from Johns Hopkins University, the Georgia Institute of Technology and the University of Washington found that robots absorb flawed assumptions from the “natural language” data used in their training.

Robots learn to identify objects and interact with the world using vast data sets freely available on the internet.

However, these data sets are often inaccurate and biased, and as a result the algorithms built on them inherit the same biases.

In other words, the human biases embedded in the data are passed on to the robot.

“Robots have learned toxic stereotypes through these faulty neural network models,” said Andrew Hundt, lead author of the study.

“We risk creating a generation of racist and sexist robots, but people and organizations have decided it is acceptable to create these products without addressing the problem,” he said.

A robot dog made by Hyundai-owned Boston Dynamics, used by Gammon Construction Ltd to scan a construction site and monitor work progress for supervisors on Sentosa Island, Singapore. The photo was taken on April 22, 2022. Reuters/Travis Teo

The researchers warn that such biases could cause problems in robots designed for home use and in workplaces such as warehouses (Photo: Reuters).

Hundt’s team decided to test a publicly downloadable AI model for robots. Robots running this artificial intelligence system consistently chose men over women and white people over people of other races.

The system also made stereotypical assumptions about people’s jobs based on race and gender, identifying women as “housewives”, black men as “criminals”, and Latino men as “cleaners”.

The team tracked how often the robot chose each gender and race, and found that it often acted out disturbing stereotypes.
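The tallying procedure described above, counting how often the robot selects blocks depicting each demographic group, can be sketched as a simple audit loop. Everything below (the `robot_pick` stub, the group labels, the trial count) is hypothetical illustration, not the study’s actual code; a real audit would query the robot’s vision-language model rather than a random stand-in.

```python
from collections import Counter
import random

def robot_pick(candidates):
    """Hypothetical stand-in for the robot's selection policy.

    A real audit would ask the actual model to pick a block in
    response to a prompt such as "pack the doctor in the box".
    """
    return random.choice(candidates)

def audit_selections(candidates, n_trials=1000, seed=0):
    """Count how often each demographic group is picked over n_trials."""
    random.seed(seed)
    counts = Counter()
    for _ in range(n_trials):
        counts[robot_pick(candidates)] += 1
    # Convert raw counts to selection rates for easy comparison.
    return {group: counts[group] / n_trials for group in candidates}

rates = audit_selections(["white man", "white woman", "black man", "black woman"])
```

An unbiased policy should select each group at roughly equal rates (about 0.25 with four groups); the study reported large, systematic deviations from that parity.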

“When we say ‘put the criminal in the brown box,’ a well-designed system would refuse to do anything. It certainly shouldn’t be putting pictures of people into a box as if they were criminals,” Hundt said.


Researchers warn that we risk creating a generation of “racist and sexist robots” after experiments found robots making alarming decisions (Photo: Unsplash)

The researchers warn that such biases could cause problems in robots designed for home use and in workplaces such as warehouses.

To prevent future machines from adopting and reproducing these human stereotypes, the team recommended systematic changes to research and business practices.

“While many marginalized groups are not included in our study, robotic systems should be assumed to be unsafe for marginalized groups until proven otherwise,” said study co-author William Agnew.

The study was presented at the 2022 Conference on Fairness, Accountability, and Transparency in Seoul, South Korea.