

Microsoft Corp. has built an automated system that can flag when sexual predators are trying to groom children in the chat rooms of video games or through messaging applications, it said this week.
Project Artemis, announced Thursday, works by looking for common patterns of communication known to be used by predators who target children. If a pattern is detected, the system flags the conversation for review by a human moderator, who then determines whether to inform the police.
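Microsoft has not published Project Artemis's actual signals or scoring method. Purely as an illustration of the flag-then-review flow described above, the sketch below uses made-up patterns and a made-up threshold: conversations accumulate a risk score, and anything over the cutoff goes to a human moderator rather than being acted on automatically.

```python
import re

# Hypothetical patterns and weights -- NOT Project Artemis's real signals,
# which Microsoft has not disclosed. Each pattern adds to a risk score.
PATTERNS = {
    r"\bhow old are you\b": 1,
    r"\bdon'?t tell (your )?(mom|dad|parents)\b": 3,
    r"\bsend (me )?a (pic|photo)\b": 2,
}
REVIEW_THRESHOLD = 3  # assumed cutoff for escalating to a moderator

def score_conversation(messages):
    """Sum pattern weights across every message in a conversation."""
    score = 0
    for msg in messages:
        for pattern, weight in PATTERNS.items():
            if re.search(pattern, msg.lower()):
                score += weight
    return score

def flag_for_review(messages):
    """True if the conversation should be queued for a human moderator."""
    return score_conversation(messages) >= REVIEW_THRESHOLD
```

The key design point mirrored from the article: the system only flags; the decision to involve police stays with a human reviewer.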
Microsoft has been testing Project Artemis for a while, using it to monitor chats on its Xbox platform. The company is now considering incorporating it into other services, such as Skype.
“‘Project Artemis’ is a significant step forward, but it is by no means a panacea,” Microsoft’s digital safety chief Courtney Gregoire wrote in a blog post. “Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems. But we are not deterred by the complexity and intricacy of such issues.”
Microsoft said it developed Project Artemis alongside The Meet Group Inc., which operates social media sites and chat services such as MeetMe, LOVOO, Skout and Tagged; Roblox Corp., creator of the popular massively multiplayer online game of the same name; Kik Interactive Inc., creator of the Kik messaging app; and Thorn, a nonprofit organization that promotes the use of technology to protect children online.
Thorn said it will begin licensing Project Artemis this week for free to “qualified online service companies” that offer a chat function.
Microsoft previously developed a tool called PhotoDNA that’s used by law enforcement agencies to find and remove known images of child sexual exploitation. The tool works by converting illegal images to a digital signature called a “hash” that can be used to find copies of the same image when they’re uploaded somewhere else online.
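PhotoDNA's hashing algorithm is proprietary and perceptual, meaning it can match images that have been resized or recompressed. The sketch below only illustrates the general hash-and-match idea using a cryptographic hash, which, unlike PhotoDNA, matches byte-identical copies alone; the function names and the hash database are assumptions for the example.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # SHA-256 stands in for PhotoDNA's proprietary perceptual hash;
    # it only detects exact duplicates, not re-encoded variants.
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes, known_hashes: set) -> bool:
    """Check an upload against a database of known-illegal image hashes
    (in practice supplied by law enforcement or clearinghouses)."""
    return image_hash(image_bytes) in known_hashes
```

The point of the signature approach is that services can screen uploads against the hash database without ever storing or distributing the illegal images themselves.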