Best practice principles for sexual content moderation and child protection
Version 0.3—July 3, 2019
1. Prevention of harm
Sexual content should be restricted where it causes direct harm to a child. Indirect harms should not be the basis for blanket content restriction policies unless those harms are substantiated by evidence, and adequate measures are taken to avoid human rights infringements.
2. Evaluation of impact
Companies should evaluate the human rights impacts of their restriction of sexual content, meaningfully consult with potentially affected groups and other stakeholders, and conduct appropriate follow-up action that mitigates or prevents these impacts.
3. Transparency
Companies and others involved in maintaining sexual content policies, databases, or blocklists should describe the criteria for assessing such content in detail, especially when those policies would prohibit content that is lawful in any of the countries where such policies are applied.
4. Proportionality
Users whose lawful sexual conduct infringes platform policies should not be referred to law enforcement, and their lawful content should not be added to shared industry hash databases, blocklists, or facial recognition databases.
5. Context
Before deciding to restrict or to promote lawful sexual content, consideration should be given to the context in which it is posted, and to whether there are reasonable grounds to believe that the persons depicted have consented to be depicted in that context.
6. Non-discrimination
Content moderation decisions should be applied to users based on what they do, not who they are.
7. Human decision
Content should not be added to a hash database or blocklist without human review. Automated content restriction should be limited to confirmed illegal images identified by a content hash match.
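To illustrate the division of labour this principle calls for, the following minimal Python sketch (all names are hypothetical) gates automated restriction on a match against a database that can only be populated after human review; everything else is routed to a human moderator. Real deployments match perceptual hashes (such as PhotoDNA) drawn from shared industry databases; a cryptographic hash stands in here for simplicity.

    import hashlib

    # Hypothetical in-memory store of hashes of images that a human
    # reviewer has confirmed to be illegal.
    confirmed_illegal_hashes = set()

    def add_to_hash_database(image_bytes, reviewed_by_human):
        """Add an image's hash only after human review (first sentence of principle 7)."""
        if not reviewed_by_human:
            raise ValueError("no additions to the database without human review")
        confirmed_illegal_hashes.add(hashlib.sha256(image_bytes).hexdigest())

    def automated_decision(image_bytes):
        """Restrict automatically only on a confirmed-illegal hash match;
        refer everything else to a human rather than removing it automatically."""
        if hashlib.sha256(image_bytes).hexdigest() in confirmed_illegal_hashes:
            return "restrict"                # confirmed illegal image: automated removal
        return "refer to human moderator"    # any other restriction needs a human decision

The point of the sketch is the asymmetry: automation may act only on the narrow, human-confirmed case, while every novel or ambiguous item falls through to human judgment.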
8. Notice
Users should be notified when their content is added to a hash database or blocklist, or is subject to context-based restrictions.
9. Remedy
Internet platforms should give priority to content removal requests made by persons depicted in images that were taken of them as children, and provide users with the means of filtering out unwanted sexual content.