Facebook and Instagram Are Launching a Tool to Help Other Sites Remove Child Abuse Images
Photo credit: Leon Neal / Getty Images. Article by Mack DeGeurin. Gizmodo – February 27, 2023.
Facebook and Instagram are taking some of their strongest steps yet to clamp down on the child sexual abuse material (CSAM) flooding their social networks. Meta, the parent company of both, is creating a database in partnership with the National Center for Missing and Exploited Children (NCMEC) that will let users submit a “digital fingerprint” of known child abuse material: a numerical code derived from an image or video rather than the file itself. […]
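The “digital fingerprint” is a hash: a short numerical code computed locally from the file, so only the code, never the image or video itself, has to be submitted. The sketch below illustrates the idea with a plain cryptographic hash in Python; it is an illustration under that assumption only, since deployed matching systems typically rely on perceptual hashes (for example Microsoft's PhotoDNA or Meta's open-source PDQ) so that resized or re-encoded copies of the same image still match.

```python
import hashlib
from pathlib import Path


def fingerprint(path: str) -> str:
    """Return a hex digest computed from a media file's bytes.

    Only this digest would leave the device; the image or video itself
    is never uploaded. Illustrative only: real CSAM-matching systems use
    perceptual hashes so near-duplicate copies also produce a match.
    """
    digest = hashlib.sha256()
    with Path(path).open("rb") as f:
        # Stream the file in 1 MiB chunks so large videos need not fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Hypothetical local file name, used here purely for illustration.
    print(fingerprint("example_image.jpg"))
```

Participating platforms can then compare hashes of uploaded content against the shared database and act on matches without the underlying files ever being exchanged.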
Related articles:
NCMEC, Google and Image Hashing Technology – Google Safety Centre
Meta Backs New Platform To Help Minors Wipe Naked, Sexual Images Off Internet – Forbes
‘Take It Down’: A Tool for Teens to Remove Explicit Images – US News