Amazon will warn customers about limitations of its artificial intelligence

Amazon.com plans to issue warning cards for software sold by its cloud computing division, amid ongoing concern that systems using artificial intelligence could discriminate against different groups, the company told Reuters.

Akin to nutrition labels, the so-called AI Service Cards will be public, so that corporate customers can look up the limitations of certain cloud services, such as facial recognition and audio transcription. The goal is to prevent misuse of the technology, explain how the systems work, and describe how privacy is managed, Amazon said.

The company is not the first to issue such warnings. IBM did this years ago, and Google has also published even more details about the datasets it used to train some of its artificial intelligence.

However, Amazon’s decision to launch its first three service cards on Wednesday reflects the industry leader’s attempt to change its image after a public spat years ago left the impression that the company cared less about the ethics of artificial intelligence than its rivals.

Michael Kearns, a professor at the University of Pennsylvania and since 2020 an Amazon scholar, said the decision to issue the cards came after privacy audits of the company’s software. The cards will publicly address ethical concerns at a time when technology regulation is on the horizon, Kearns said.

“The biggest thing about this release is the commitment to doing this on an ongoing and expanded basis,” he said.

Amazon chose software that addresses sensitive demographic issues as a start for its service cards, which Kearns hopes will become more detailed over time.

Skin tones

One such service is called “Rekognition”. In 2019, Amazon disputed a study that pointed out the technology’s difficulty in identifying the gender of individuals with darker skin tones. But after the murder of George Floyd, an unarmed Black man, during an arrest in 2020, the company imposed a moratorium on police use of its facial recognition software.

Now, the company says in the service card seen by Reuters that Rekognition does not support matching “images that are too blurry and grainy for human face recognition, or that have large portions of the face obstructed by hair, hands, and other objects”. Amazon also warns against matching faces in cartoons and other “non-human entities”.

In another card, this one about audio transcription, the company states that “inconsistently modifying audio inputs can result in uneven results for different demographics.”

Kearns said accurately transcribing the wide range of regional accents and dialects in North America alone was a challenge Amazon worked to address.
