Facebook has apologized after its users were met with an automated prompt asking if they would like to "keep seeing videos about primates". The video in question was a British video featuring Black men and had nothing to do with primates. After an investigation, Facebook disabled the A.I.-powered feature responsible for the message.
The company apologized for the incident on Friday, calling it "an unacceptable error", and promised to scrutinize the recommendation feature to prevent such mistakes in the future. Facebook's spokeswoman, Dani Lever, said in a statement, "As we have said, while we have made improvements to our A.I., we know it's not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations."
This is not the first time Facebook has been caught up in racial controversy. Its employees once staged a virtual walkout to protest the company's handling of posts relating to the killing of George Floyd. In 2016, some Facebook employees crossed out the slogan "Black Lives Matter" written on the walls of the company's headquarters and replaced it with "All Lives Matter".
Facebook has since hired a vice president of civil rights. Nonetheless, it is too early to tell whether the company has put its racial controversies behind it.