Facebook turned off its topic recommendation feature after the company’s AI mistook black men for “primates” in a video on the social network, as reported by The New York Times.
Facebook users who recently watched videos from a British tabloid featuring black men were shown an automatically generated prompt asking if they wanted to “keep watching primate videos.”
The video, dated June 27, 2020, was published by The Daily Mail and showed clashes between black men and white civilians and police officers. Although humans are among the many species in the primate order, the video had nothing to do with monkeys, chimpanzees, or gorillas.
A Facebook spokesman said it was “clearly an unacceptable error,” adding that the recommendation software had already been disabled.
“We apologize to anyone who may have seen these offensive recommendations,” Facebook said in response to a request from Agence France-Presse. “We disabled the entire topic recommendation feature as soon as we realized this was happening. We will investigate the cause and prevent this error from happening again.”
Facial recognition software has long been criticized by civil rights advocates, who point to accuracy problems, particularly with people who are not white.
A screenshot of the recommendation was shared on Twitter by former Facebook content design manager Darcy Groves. “This ‘keep watching’ suggestion is just not acceptable,” Groves tweeted, addressing her former Facebook colleagues. “Blatantly.”