An estimated 89% of sexual solicitations of children by predators occur in chat or instant messages. Microsoft is determined to help change that with the release of “Project Artemis.”
Project Artemis is a tool to help identify predators in online chat. It was “developed in collaboration with The Meet Group, Roblox, Kik and Thorn,” a nonprofit that builds technology to protect children from sexual abuse.
The tool is designed to evaluate conversations, looking for communication styles and patterns predators use to target children. According to Microsoft, “the development of this new technique began in November 2018 at a Microsoft ‘360 Cross-Industry Hackathon,’ which was co-sponsored by the WePROTECT Global Alliance in conjunction with the Child Dignity Alliance.”
Once deployed, the tool “evaluates and ‘rates’ conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review. Human moderators would then be capable of identifying imminent threats for referral to law enforcement, as well as incidents of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC).”
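In pseudocode terms, that triage flow might look like the sketch below. This is purely illustrative and does not reflect Project Artemis’s actual implementation; the `risk_rating` field and the threshold value are hypothetical stand-ins for the probability rating the tool assigns and the cutoff each company configures.

```python
# Hypothetical sketch of the threshold-based flagging flow described above.
# The rating model itself is the hard part; this only shows the routing step.

FLAG_THRESHOLD = 0.8  # each implementing company sets its own cutoff


def triage(conversations, threshold=FLAG_THRESHOLD):
    """Return conversations whose risk rating meets the threshold,
    i.e. those that would be sent to human moderators for review."""
    return [c for c in conversations if c["risk_rating"] >= threshold]


conversations = [
    {"id": 1, "risk_rating": 0.12},  # low probability, not flagged
    {"id": 2, "risk_rating": 0.91},  # exceeds threshold, flagged
]
flagged = triage(conversations)
# Flagged conversations go to human review; moderators then decide
# whether to refer a case to law enforcement or report it to NCMEC.
```

The key design point the article describes is that the tool only produces the rating; the threshold, and every decision after a conversation is flagged, stays with the implementing company’s human moderators.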
The tool will be freely available through Thorn “to qualified online service companies that offer a chat function.” Interested parties can contact Thorn directly at [email protected].