Law enforcement officers race to stop child pornography with artificial intelligence
WASHINGTON (AP) — A child psychiatrist who altered a first-day-of-school photo he saw on Facebook to make a group of girls appear nude. A US Army soldier accused of creating images of children he knew being sexually abused. A software engineer tasked with generating sexually explicit hyperrealistic images of children.

US law enforcement agencies are cracking down on a troubling spread of child sex abuse images created through artificial intelligence technology – from doctored photos of real children to computer-generated graphic depictions of children. Justice Department officials say they are aggressively pursuing criminals who exploit AI tools, as states race to ensure that people who generate “deepfakes” and other harmful images of children can be prosecuted under their laws.

“We need to signal early and often that this is a crime, that it will be investigated and prosecuted when the evidence supports it,” Steven Grocki, who heads the Justice Department’s Child Exploitation and Obscenity Section, said in an interview with The Associated Press. “And if you’re sitting there thinking otherwise, you’re dead wrong. And it’s only a matter of time before someone holds you accountable.”

The Justice Department says existing federal laws clearly apply to such content and recently brought what is believed to be the first federal case involving purely AI-generated images — meaning the children depicted are not real, but virtual. In another case, federal authorities in August arrested a US soldier stationed in Alaska, accused of running innocent images of real children he knew through an AI chatbot to make the images sexually explicit.

The pursuits come as child advocates work urgently to curb misuse of the technology to prevent a flood of disturbing images, which officials fear could make it harder to save real victims. Law enforcement officials fear that investigators will waste time and resources trying to identify and track down exploited children who don’t really exist.

Meanwhile, lawmakers are passing legislation to ensure local prosecutors can bring charges under state law for AI-generated “deepfakes” and other sexually explicit images of children. Governors in more than a dozen states have signed laws this year cracking down on digitally created or altered images of child sexual abuse, according to an analysis by the National Center for Missing and Exploited Children.

“We’re playing catch-up as law enforcement with a technology that, frankly, is moving much faster than we are,” said Ventura County, California, District Attorney Erik Nasarenko.

Nasarenko promoted legislation signed last month by Gov. Gavin Newsom that makes it clear that AI-generated child sexual abuse material is illegal under California law. Nasarenko said his office could not pursue eight cases involving AI-generated content between last December and mid-September because California law required prosecutors to prove the images depicted a real child.

AI-generated images of child sex abuse can be used to groom children, law enforcement officials say. And even if they are not physically abused, children can be deeply affected when their image is made to appear sexually explicit.

“I felt like a part of me was taken away, even though I wasn’t physically raped,” said Kaylin Hayman, 17, who starred in the Disney Channel series “Just Roll with It” and helped promote the California bill after becoming a victim of “deepfake” images.
