Brian Adam
Professional blogger, vlogger, traveler, and explorer of new horizons.

Voice assistants have racial issues

One of the first Amazon voice assistants.

Racial biases are not exclusive to facial recognition; they also affect voice assistants

Researchers at Stanford University in the United States studied the main voice recognition systems on the market (Apple, Google, Amazon, IBM and Microsoft) and found a pattern: racial bias. To carry out the study, the researchers interviewed 42 white and 73 black people and used the voice recognition systems to transcribe what they were saying.

According to the study, the voice assistants of these technology companies make far fewer mistakes recognising the voice of a Caucasian person than that of an African-American person. The report shows that about 2% of the words spoken by white people were unreadable by these systems; in the case of African-American people, the figure rises to 20%.

The conclusions, published in the journal ‘Proceedings of the National Academy of Sciences’, highlight that these systems misidentified the words of white people about 19% of the time, while for black people the error rate rises to 35%.

The Microsoft system performed best in the study, misidentifying around 15% of words from white speakers, compared with 27% from black speakers. Second, according to the researchers, is the Amazon system, which failed to correctly identify about 18% of the words of white speakers and 36% of those of black speakers.

Amazon is followed by the Google Assistant, with an error rate of 20% for white speakers and around 36% for African-American speakers, and by IBM, with errors above 20% for white speakers and almost 40% for black speakers.

For its part, Apple's system performed worst, failing about 23% of the time with white speakers and 45% with black speakers.
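The percentages above are word error rates, the standard metric for speech recognition accuracy. As a minimal illustrative sketch (not code from the study itself), the word error rate can be computed as the word-level edit distance between a reference transcript and the system's output, divided by the length of the reference:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length,
    computed via word-level Levenshtein edit distance."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One wrong word out of five yields a 20% error rate.
print(word_error_rate("the quick brown fox jumps",
                      "the quick brown box jumps"))  # 0.2
```

A system that, on average, scores 0.35 on transcripts from black speakers and 0.19 on transcripts from white speakers is getting roughly one in three words wrong for the former group, which matches the disparity the study reports.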
