Another day, another stop on Facebook's never-ending apology tour.
On Tuesday, that takes the form of the social network’s latest transparency report, which, for the first time, includes stats on the vast amounts of content the company removes for violating its community standards.
This content is only a fraction of the total content Facebook removes every year. It's the everyday material taken down for violating the company's "community standards," the social network's rules prohibiting hate speech, graphic violence, nudity and all the other stuff Facebook doesn't want you to see.
There's no mention, for one, of the company's efforts to better safeguard user data. Though in the weeks following the Cambridge Analytica disclosures Facebook has notified users whose data was misused and worked to identify other developers who may have mishandled data, the report is silent on those efforts.
Similarly, there's little mention of misinformation or fake news, another thorny issue Facebook has struggled to combat. Though the report's sidebar links to a help center article and a post from Mark Zuckerberg on the topic, the report itself adds nothing new.