Facebook Moderators Told To "Err On The Side Of An Adult" When Reviewing Potential Child Sexual Abuse Material

Recent reports have shown that Facebook’s content moderators are instructed to “err on the side of an adult” when they are unsure of the age of a person depicted in potential child sexual abuse material (CSAM).

According to the New York Times, Facebook’s training materials for content moderators tell them to “err on the side of an adult” when they cannot determine the age of the person in a photo or video suspected of being CSAM.

Facebook and other companies are required to report any child abuse material found on their platforms to the National Center for Missing and Exploited Children (NCMEC). Many tech companies employ content moderators to review any content flagged as possible CSAM.

According to The Verge, the policy instructing Mark Zuckerberg’s moderators to “err on the side of an adult” was created for Facebook’s content moderators at the third-party contractor Accenture. It is also mentioned in a California Law Review article published in August.

Interviewees also discussed a policy known as “bumping up,” with which they disagreed. The policy applies when a content moderator cannot easily determine whether the subject of a suspected CSAM photograph is a minor (“B”) or an adult (“C”). In such cases, moderators are instructed to assume the subject is an adult, which means fewer images are reported to NCMEC.

Antigone Davis, Facebook’s head of safety, confirmed the policy to the Times, saying it was intended to address privacy concerns around adult sexual imagery. Davis told the NYT that “the sexual abuse of children online is abhorrent” and that Facebook employs a rigorous review process that flags more potential CSAM than any other company.