The video, posted by The Daily Mail on June 27, 2020, shows clips of Black men and police officers. An automatic prompt asked users if they would like to “keep seeing videos about Primates,” even though the video had no connection to primates.
“As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make,” Facebook said in a statement to The New York Times. “We apologize to anyone who may have seen these offensive recommendations.”
A former content designer at Facebook flagged the issue after a friend forwarded a screenshot of the prompt. A product manager for Facebook Watch reportedly called the error “unacceptable” and said that the company would look “into the root cause.”
“This was clearly an unacceptable error and we disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again,” Facebook spokesperson Dani Lever said in a statement to USA TODAY.
Facebook said it immediately disabled the A.I. feature responsible for the error.
Technology companies have faced similar issues in the past, and critics have argued that facial recognition technology is biased against people of color.
In 2015, Google Photos mistakenly labeled pictures of Black people as “gorillas.” Google apologized and said it would fix the problem, but Wired later found that its solution was simply to block the words “gorilla,” “chimp,” “chimpanzee” and “monkey” from searches.
Facebook did not respond to a Fox News request for comment.