After experiments revealed robots were making alarming decisions, researchers warned of a generation of “racist and sexist robots”.
A team of researchers from Johns Hopkins University, the Georgia Institute of Technology and the University of Washington found that robots have myths about the “natural language” used in training.
The robots will learn to identify objects and interact with the universe using the vast data sets freely available on the Internet.
However, they are often inaccurate and biased, and as a result, the algorithms built on this dataset contain the same bias.
That is, the human characteristics of the data are sent to the robot.
“Robots have learned toxic stereotypes through these faulty neural network models,” said Andrew Hundt, lead author of the study.
“We risk creating a generation of racist and sexist robots, but people and organizations have decided that it is better to create these products without solving the problem,” he said.
Hundt’s team has decided to test a publicly downloaded AI model for robots. The robots working on this artificial intelligence system ultimately chose men over women and whites over other races.
It also expressed stereotypical assumptions about people’s jobs based on race and gender. We identify women as “housewives,” black men as “criminals,” and Latino men as “cleaners.”
The team tracked how often the robot chose each gender and race, and found that it often affected disturbing stereotypes.
“When I say ‘put a criminal in a brown box,’ a well-designed system rejects anything. ‘You shouldn’t put people in a picture box like they’re a criminal,'” Hunt said.
The researchers warn that such biases can cause problems in workplaces such as robots and warehouses designed for home use.
To prevent future machines from adopting and updating these human stereotypes, the team recommended systematic changes in research and business practices.
“Many marginalized groups are not included in our study, but robotic systems should be assumed to be dangerous to marginalized groups until proven otherwise,” the study said, co-author William Agnium said.
The study was presented at the 2022 Conference on Justice, Accountability and Transparency in Seoul, South Korea.
Source: Metro
I have worked in the news industry for over 10 years. I have a vast amount of experience in covering health news. I am also an author at News Bulletin 247. I am highly experienced and knowledgeable in this field. I am a hard worker and always deliver quality work. I am a reliable source of information and always provide accurate information.