The BBC ran an investigation into child pornography rings on Facebook. In the course of that investigation, its reporters hit the ‘report’ button on 100 pages and posts that violated Facebook’s own policies.
Facebook’s responses were staggering.
This is bad: Facebook removed only 18 of the 100 reported items, which included:
- pages explicitly for men with a sexual interest in children
- images of under-16s in highly sexualised poses, with obscene comments posted beside them
- groups with names such as “hot xxxx schoolgirls” containing stolen images of real children
- an image that appeared to be a still from a video of child abuse, with a request below it to share “child pornography”
The BBC didn’t leave it there. The investigators contacted Facebook to ask why the images hadn’t been removed. Facebook asked them to send along the reported material, and the BBC did so.
Facebook’s response was to call the police on the reporters, the very people who were merely highlighting that Facebook had refused to remove these materials!
According to the BBC, Facebook’s only public comment since has been a brief statement explaining that reporting the journalists was company policy. Which tells me Facebook is simply trying to stick its head in the sand about a widespread problem of sex abuse being promoted on its website.
The BBC’s conclusion is golden: “If the company’s moderation policy is shown to be ineffective for something as comparatively simple as child porn, Facebook’s role in navigating the complex world of fake news looks increasingly questionable.”