Facebook disabled and began investigating its artificial intelligence recommendation feature after the app showed an automated prompt asking users whether they wanted to “keep seeing videos about Primates” beneath a video featuring Black men, posted by a British tabloid.
On Monday, Facebook apologized for what it called an unacceptable error and said it is working to prevent the issue from happening again.
The Daily Mail had posted a video showing Black men in altercations with white civilians and police officers. It contained no footage of monkeys or primates.
Darci Groves, a former content design manager at Facebook, received a screenshot of the prompt from a friend and posted it to a product feedback forum for current and former Facebook employees. A product manager for Facebook Watch replied to her post that the company was investigating the cause and likewise considered the prompt unacceptable.
A Facebook spokesperson, Dani Lever, said: “As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Race is a highly sensitive issue, and big tech companies have faced scrutiny for years over how their AI systems handle it. Facial recognition technology, in particular, has been shown to perform worse on people of color, misidentifying them more often, and has been widely criticized as biased.
In 2015, Google Photos labeled photos of Black people as “gorillas.” Google apologized and promised remedial measures; it then blocked terms like “gorilla,” “chimp,” and “monkey” from the product’s labels.
Facebook trained its face and object recognition algorithms on its image archive, the world’s largest collection of user-uploaded photos. The company also tailors content to users’ previous browsing and viewing habits, sometimes recommending related content to keep them watching. It is unclear how widely the “primates” label was applied.
Other race-related problems have plagued photo-sharing apps, notably Facebook and Instagram. In July, for example, three members of the England national football team were subjected to racist abuse online after missing penalties at the European Championship.
Racism has surfaced not only on social networks but also inside the companies that run them. In 2016, Facebook CEO Mark Zuckerberg asked employees to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter” in a communal space at the company’s headquarters in Menlo Park, California. And last year, after President Donald Trump published a post about the killing of George Floyd in Minneapolis, hundreds of Facebook employees staged a virtual walkout to protest the company’s decision to leave the post up.
Facebook subsequently hired a vice president of civil rights and commissioned a civil rights audit. Its annual diversity report, released in July, found that Black employees made up 4.4% of its US workforce, up from 3.9% the year before.
“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,’” Groves said. She added that leaders at the company do not treat solving these racial problems as a priority.