The social media giant founded by Mark Zuckerberg apologized for an "obviously unacceptable error," but the company has previously faced accusations of racial bias.
Facebook announced that it would disable its topic recommendation feature after its software mistakenly labeled Black people as "primates" in a video on the social network.
A Facebook spokesperson called it an "obviously unacceptable error" and said the recommendation software involved had been taken offline.
"We apologize to anyone who may have seen these offensive recommendations," Facebook said in response to an inquiry from Agence France-Presse.
"We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent it from happening again."
Facial recognition software has been criticized by civil rights advocates, who point to accuracy problems, particularly with people who are not white.
According to the New York Times, Facebook users who in recent days watched a video from a British tabloid featuring Black men saw an automatically generated prompt asking whether they wanted to "continue to watch videos about primates."
The video in question, published by the Daily Mail in June 2020, was titled "White man calls cops on black men at marina."
Although humans are among the many species of primates, the video had nothing to do with monkeys, chimpanzees, or gorillas.
Darci Groves, a former content design manager at Facebook, shared a screenshot of the recommendation on Twitter.
"This kind of 'keep watching' prompt is unacceptable," Groves wrote on Twitter, directing the message at former colleagues at Facebook.
“This is ridiculous.”
The company has faced a series of controversies in recent years.
In 2020, hundreds of advertisers joined the "Stop Hate for Profit" campaign, organized by social justice groups such as the Anti-Defamation League (ADL) and Free Press, to press Facebook to take concrete measures against hate speech and misinformation following the death of George Floyd, a Black man who died after being detained by police.
In a 2019 Al Jazeera article, Philadelphia-based freelance journalist and media studies professor David A Love also claimed that Zuckerberg's company was willing to "support hate groups, white nationalists and right-wing extremists."