Jimu News reporter Hu Xiuwen
On September 4, Al Jazeera reported that Facebook had announced it was disabling its topic recommendation feature after its software mistook Black men in a video on the social network for "primates."
According to The New York Times, in recent days Facebook users who watched a video from a British tabloid featuring Black men saw an auto-generated prompt asking whether they would like to "keep seeing videos about primates."
A Facebook spokesperson called it a "clearly unacceptable error" and said the recommendation software involved had been taken offline. "We apologize to anyone who may have seen these offensive recommendations," Facebook said in response to an AFP inquiry. "We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again."

Civil rights advocates have slammed facial recognition software, pointing to problems with the software's accuracy, especially for non-white people.
While humans are among the many species in the primate family, the video had nothing to do with monkeys, chimpanzees, or gorillas. Darci Groves, a former content design manager at Facebook, shared a screenshot of the recommendation on social media. "This 'keep seeing' prompt is unacceptable," Groves said. "It's outrageous."