A Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, and Child Protection

2018 is the year of the techlash, in which public opinion has hardened towards Internet platforms seen as having failed to adequately address the online manifestations of a range of social problems. Platforms are increasingly being asked to take proactive measures to prevent misinformation, hateful speech, terrorist content, and copyright-infringing content from appearing online in the first place.

The paradigmatic case of such proactive content filtering is one in which larger platforms, in particular, already have considerable experience: the automated filtering and removal of child sexual abuse (CSA) material by matching uploads against hashes of known illegal images. But can platforms do more to prevent child sexual abuse than can be accomplished through such automated means?
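For readers unfamiliar with the mechanism, the hash-matching approach can be sketched in a few lines of code. The Python sketch below is purely illustrative: the blocklist contents, sample hash value, and file path are hypothetical, and production systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, whereas the cryptographic hash used here for simplicity only catches byte-identical copies.

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal images, as would be
# supplied to a platform by a hotline or clearinghouse. The value below
# is a placeholder, not a real entry.
KNOWN_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_image(path: str) -> bool:
    """Return True if the file's hash appears on the blocklist."""
    return sha256_of_file(path) in KNOWN_HASHES

if __name__ == "__main__":
    upload = "incoming/upload.jpg"  # hypothetical uploaded file
    if matches_known_image(upload):
        print("Hash matches known-image list: block and report.")
    else:
        print("No hash match: file passes this automated check.")
```

The point of the sketch is that this kind of filtering is only as good as the hash list: it can catch recirculated copies of already-identified images, but by design it cannot address abuse that has never been reported, which is precisely the gap this initiative asks platforms to consider.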

In the United States, an answer to this question has been forced by the passage of FOSTA/SESTA, which narrows platforms’ safe harbor protection from liability for users’ content. Although originally touted as a narrow measure targeting child sex traffickers, the final law also makes platforms liable for promoting or facilitating consensual adult sex work, and in practice some content that does not relate to sex work of any kind has also been removed.

We propose to promote a more evidence-based approach to the question of what platforms can do to help reduce child sexual abuse, beyond the removal of manifestly illegal content, by convening a two-part multi-stakeholder dialogue on this topic, with the objective of producing a set of model terms of service for Internet platforms with respect to child protection.

Many platforms already include child protection policies in their content policies or community standards; however, these can be vague and unpredictable in their application even within a single platform, let alone across platforms. Smaller platforms may not have well-developed policies on this topic at all. Even at mid-size platforms, trust and safety teams are typically composed of staff who deal with other forms of abusive content, such as spam and fraud, but who lack dedicated expertise in child protection.

Although his 2018 report refers to policies on sexual content generally, rather than to child protection policies specifically, U.N. Special Rapporteur David Kaye notes that the application of such policies has resulted in the removal of resources for members of sexual minorities, and of depictions of nudity with historical, cultural, or educational value. CSA prevention resources have also been removed in some cases.

The first convening of this two-part initiative will bring together stakeholders including mental health professionals, representatives of the sex industries, child protection workers, human rights experts, and survivors of child sexual abuse, in a private gathering with platform representatives to discuss and suggest best practices for policies that would protect children while avoiding unintended impacts that infringe on the human rights of children or others.

This first convening is specifically designed to help industry participants fulfil the U.N. Guiding Principles on Business and Human Rights, which require companies to “Conduct due diligence that identifies, addresses and accounts for actual and potential human rights impacts of their activities, including through regular risk and impact assessments, meaningful consultation with potentially affected groups and other stakeholders, and appropriate follow-up action that mitigates or prevents these impacts.”

At the second convening, which will be held at the Internet Governance Forum, a document prepared on the basis of the discussions held at the first convening will be presented for broader community feedback. The anticipated outcome of this second meeting will be the publication of a set of model terms of service for Internet platforms with respect to child protection.

About Prostasia Foundation

Prostasia Foundation is a new 501(c)(3) nonprofit child protection organization, formed in the wake of the passage of FOSTA/SESTA, whose mission is to ensure that the elimination of child sexual abuse is achieved in a manner consistent with the highest values of the society that we would like our children to grow up in.

About Jeremy Malcolm

Jeremy Malcolm is Prostasia Foundation’s Founder and Executive Director. Jeremy led the development of the Manila Principles on Intermediary Liability while he was Senior Global Policy Analyst at the Electronic Frontier Foundation. He is also a member of the Multistakeholder Advisory Group of the Internet Governance Forum.