AI giving rise to disturbing new crime trend

And law enforcement's current methods of protecting children haven't caught up with the latest technology.

SAN ANTONIO — As the growth of artificial intelligence continues across various industries, law enforcement is noticing another disturbing trend: the use of AI in the creation of child sexual abuse material, or CSAM.

The National Center for Missing and Exploited Children says it received over 36 million reports of CSAM in 2023, up 12% from the prior year. Within that statistic are cases of CSAM that are AI-generated.

“It's a major epidemic here in the United States and throughout the world,” said Craig Larrabee, special agent in charge with Homeland Security Investigations (HSI). Larrabee said HSI has historically investigated child pornography cases, but the manner in which CSAM images and videos are created has changed over time. 

Currently, investigators are seeing an increase in AI-generated CSAM nationwide.

“It's a drain on law enforcement, the courts and everything else to try to go after these criminals,” Larrabee said.

Part of the challenge in prosecuting these types of cases is that many state laws pertain to real-life victims, not virtual ones. Federal law is also a bit murky on how cases can be prosecuted. In addition, investigators have the challenge of distinguishing a real video from an AI-generated one, tying up valuable resources and manpower. 

Regardless of the challenges, Larrabee said AI-CSAM cases often involve perpetrators who are victimizing others in real life.

“When we do those cases, we’re also finding victims,” Larrabee said. “That person is not staying in that fantasy realm; they're also involved in sharing pictures or videos of actual children who have been victimized.”

When it comes to protecting children against AI-generated CSAM, current methods have not caught up to the technology, leaving perpetrators free to grab photos off the internet, or use photos of children they know, to create pornographic images.

“Once that picture is gone and out of your phone, it can go anywhere,” Larrabee said. “Anybody can manipulate it. It can be sent out to anybody. So that's a big concern of ours.”
