Is it still safe to use Facebook?


By Dr Rosalind Jones, Lecturer in Marketing
Department of Marketing, University of Birmingham



A recent Channel 4 Dispatches programme – Inside Facebook: Secrets of the Social Network – saw an undercover reporter gather footage while working as a trainee content moderator at Facebook’s largest centre for UK content moderation.

Facebook has hit a new low: the programme exposed serious failings in its data management and content protection. Through poor content moderation, Facebook effectively shields those carrying out criminal activity, gives a platform to far-right hate speech and racist content, and allows footage of child abuse to remain in circulation.

Shocking content remains on Facebook for years, despite being flagged by users as inappropriate. As long as the message posted alongside a video or image does not condone the content, trained moderators leave it up. Such posts can remain on the platform for a prolonged period; footage of one child abuse victim was still being shown six years later. During the programme, an NSPCC official viewed the footage and was visibly upset, observing that as long as these videos remained on Facebook, the victims continued to suffer from the abuse.

Among those interviewed was venture capitalist Roger McNamee, a former mentor to Mark Zuckerberg and an early investor in Facebook. He has since distanced himself from Zuckerberg and the company, speaking out against its practices and describing Facebook’s business model as one that relies on extreme content to make money from online advertising.

These facts are certainly shocking, but I was more aghast at three underlying issues:

  1. Essential moderation work is outsourced to another company, in this case CPL Resources Plc. in Dublin, which has worked with Facebook since 2010. By now, Facebook should be alert to the dangers of outsourcing such important components of its business. It is very difficult to ensure adequate employee resources and appropriate training at arm’s length; here, these failings caused long delays in even the most urgent cases.
  2. There appears to be no policy or regulation agreed between government(s) and social media networks to clarify what constitutes ‘free speech’, what should be moderated, and what should be referred to government authorities. Social media platforms and government(s) need to engage swiftly in debate, develop policies and produce a clear set of guidelines to ensure consumer safety.
  3. Worrying views were expressed by content moderators, who revealed that their role was not to ‘regulate’ or ‘control’ content, and that serious cases, such as when someone is in immediate danger, were not always referred to the appropriate authorities. This reinforces the need for Facebook (and other platforms) to reassure consumers that procedures are in place to protect the most vulnerable and that data is managed appropriately by well-trained employees.

What occurs online affects our society offline. There is plenty of research evidence to suggest that the more violent imagery we consume, the more dulled we become to its shock value, feeding an appetite for ever more grisly viewing. As the market leader, Facebook has a great opportunity to lead the way and be a responsible business. In addressing the issues of online moderation, Facebook needs to build a stronger alliance with its outsourced partners or bring the work in-house, ensuring compliance for a better and safer world. As a Facebook user myself, I await its response to what the documentary has brought to light.

