Facebook Moves Faster in Preventing Harmful Content

A new report from Facebook shows how far the company has gone in preventing and taking action on content that goes against its community standards.

The company released its industry-leading Community Standards Enforcement Report (CSER) for the first quarter of 2021. The report not only shows the lengths Facebook has gone to in preventing harmful content in its communities; it also shows that the social media giant is keen on protecting users' safety, privacy, and dignity.

Readers can see new data on how Facebook enforced 12 of its policy areas on its platform, and 10 on Instagram, from January to March 2021. The report also shows improvements in prevalence metrics and in moderation operations across different Facebook products.

As part of its commitment to independent oversight, the company has engaged the auditing firm EY to audit the latest report.

Facebook is expanding its reporting efforts by adding prevalence metrics for Instagram in areas such as adult nudity and sexual activity, and violent and graphic content.

The network is also succeeding in reducing the prevalence of hate speech on its platforms, especially Facebook. That prevalence is now declining in part because Facebook has made changes to the content shown in News Feed.

As time goes by, the online giant keeps getting better at combating harmful content. It claims it is becoming more efficient at enforcing its community standards through a multipronged approach that includes the use of Artificial Intelligence (AI), human moderation, user reporting tools, and collaboration with external experts.

Hate speech on Facebook has declined for the third quarter in a row, a result that very much pleases the team. In addition, Facebook announced today that it will launch a Transparency Center to provide a single destination for information about its integrity efforts.