Facebook apologises for mistakes in removing hate speech

Indo-Asian News Service
San Francisco, Publish Date: Dec 30 2017 11:46AM | Updated Date: Dec 30 2017 11:46AM

Facebook has apologised after an investigation exposed inconsistencies by moderators in removing offensive posts reported by the social network's users.

The investigation reported by ProPublica this week showed that in one case Facebook censors, called content reviewers, approved a picture of a corpse with the statement "the only good muslim is a f...... dead one", while another post stating "death to the Muslims!!!" was removed.

In an analysis of 900 posts, the US-based non-profit investigative newsroom found that content reviewers at Facebook often make different calls on items with similar content, and do not always abide by the company's guidelines. 

The posts were submitted to ProPublica as part of a crowd-sourced investigation into how Facebook implements its hate-speech rules.

ProPublica asked Facebook to explain its decisions on a sample of 49 items.

People who submitted these items maintained that Facebook censors had erred, mostly by failing to remove hate speech, and in some cases by deleting legitimate expression.

Facebook admitted that its reviewers had made a mistake in 22 cases, but the social network defended its rulings in 19 instances. 

In six cases, Facebook said that the users had not flagged the content correctly, or the author had deleted it. In the remaining two cases, Facebook said it did not have enough information to respond.

"We're sorry for the mistakes we have made... They do not reflect the community we want to help build," Facebook Vice President Justin Osofsky was quoted as saying by ProPublica.

"We must do better," he added.

Facebook, according to Osofsky, will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018, in an effort to enforce its rules better, the report said on Thursday.
