
'I don't want to live in fear anymore': Texas girl victimized with deepfake nudes pushes for federal law

Elliston Berry was 14 when a classmate created fake pornographic images of her that spread on social media. She's pushing for the TAKE IT DOWN Act.

DALLAS — Going into her sophomore year at Aledo High School, Elliston Berry is healing from what happened to her last year.

In October 2023, a student took her Instagram photos and used artificial intelligence to create fake nude images of the 14-year-old. She said nine other classmates were victimized too.

"I had woken up with many, many text messages from my friends just telling me that there were these images of mine circling around," said Elliston.

"It was so realistic. It is child porn," said Anna McAdams, Elliston's mom. "We really could not take care of her. We, in that moment, were helpless. [...] More and more pictures were coming out throughout that first day into the second day."

Anna said the school, the sheriff's office, and Snapchat couldn't stop the spread of the photos.

For more than eight months, they couldn't get the social media platform to take down the images, and they didn't know how far they had spread.

"It wasn't until I went to Washington a couple of weeks ago and Senator [Ted] Cruz realized those pictures are still up there. I spent eight-and-a-half months, and he was able to get ahold of somebody at Snapchat and they immediately, within 24 hours, took the accounts down."

For the teenager who made the deepfake pornography, Anna said, "He has probation, a slap on the wrist, and then he gets done at 18. It'll be expunged. But these pictures could forever be out there of our girls."

That's why the mother-daughter duo is speaking out about what happened to Elliston.

"I was a freshman, and I was only fourteen," said Elliston. "Even today, I'm still fearful that these images will resurface. Ever since that day, I've lived in fear."

Elliston and Anna, alongside Senator Cruz, are pushing for federal change through the TAKE IT DOWN Act, which stands for "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks."

The bipartisan legislation would protect victims of both real and deepfake non-consensual intimate imagery (NCII). If signed into law, it would:

  • Criminalize the publication of NCII or the threat to publish NCII in interstate commerce
  • Require websites to remove NCII within 48 hours of notice from the victim, and to make reasonable efforts to remove copies of the images
  • Protect good faith efforts to assist victims
  • Protect lawful speech

Senator Cruz hopes to get the bill signed into law by the end of the year. Elliston and Anna say they will continue to share their story to protect others from being victimized.
