Voice assistants have a racial bias problem
Racial bias is not exclusive to facial recognition; it also affects voice assistants
Researchers at Stanford University have studied the main voice recognition systems on the market, from Apple, Google, Amazon, IBM and Microsoft, and found a clear pattern: racial bias. To carry out the study, the researchers interviewed 42 white and 73 black people and used each company's speech recognition system to transcribe what they said.
According to the study, these companies' voice assistants make far fewer mistakes recognising the voice of a white person than that of an African-American person. The report shows that about 2% of the recordings from white speakers were unintelligible to these systems; for African-American speakers, the figure rises to 20%.
The conclusions, published in the journal ‘Proceedings of the National Academy of Sciences’, show that these systems misidentified the words of white people about 19% of the time, while for black people the error rate rises to 35%.
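The percentages in the study are word error rates, the standard metric for judging a transcription system: the number of word substitutions, deletions and insertions needed to turn the system's output into the true sentence, divided by the length of the true sentence. As a minimal sketch (this is the generic metric, not the researchers' exact scoring code), it can be computed with a word-level edit distance:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference word count,
    computed via word-level Levenshtein distance."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution or match
    return dp[len(ref)][len(hyp)] / len(ref)

# One wrong word in a five-word sentence gives a WER of 20%
print(word_error_rate("hello world how are you", "hello word how are you"))
```

On this scale, the 35% figure for black speakers means that, on average, more than one word in three had to be corrected to recover what was actually said.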
Microsoft's system performed best in the study, misidentifying around 15% of the words of white people, compared with 27% for black people. Second, according to the researchers, is Amazon's system, which failed to correctly identify about 18% of the words of white people and 36% of those of black people.
Amazon is followed by the Google Assistant, with an error rate of 20% for white people and around 36% for African-American people, and then IBM, with an error rate of more than 20% for white people and almost 40% for black people.
For its part, Apple's system showed the lowest performance, failing about 23% of the time with white people and 45% of the time with black people.