AI “nude” image case closed by Pensacola Police Department

Pensacola police say they have closed the case of an 18-year-old Pensacola man who admitted to creating dozens of “nude” pictures of local high school girls and young women using an app that exists to “undress” people, and he won’t be charged.

While investigators charged his ex-girlfriend, 17-year-old Jaylyn Lee, with releasing the images, they say the former Washington High School student who made them did not violate the laws they applied to his case, which included “promoting an altered sexual depiction” and “possessing, controlling or viewing child pornography.”

Police say part of the problem is that apps like “Undress Me,” the one the teenager used to create the photos, are legally allowed to exist. They say creating or possessing AI-altered images isn’t illegal, but sending them is. They say they would like to see lawmakers close the “loophole” that prevented them from charging the man.

“Laws always tend to lag behind technology, and that’s what’s happening in this case,” said Mike Wood, public information officer for the Pensacola Police Department. “If you’re going to allow people to do it and have it on their phone, what do you think they’re going to do with it? It should be illegal for any of them.”

First Judicial Circuit State Attorney Ginger Madden said her office has been working with police to evaluate possible charges against the man at the center of the case, and they agree he has not broken any laws.

“They had looked at it, I think mostly under child pornography, and they had also tried to assess it under the AI act, which requires promotion or dissemination, and I don’t think that kind of nudity is against the law unless it’s disseminated,” Madden said.

Why some say the police are applying a double standard to the case

Some parents aren’t buying it, however, and believe that police and prosecutors are selectively enforcing the law to protect the teenager, whose family holds high positions of political and legal authority. The News Journal is not releasing his name because he has not been arrested.

“I’m confused that it’s not considered sexual for him, but it is for (his ex-girlfriend),” said Julie Harmon, whose daughter is depicted in one of the AI-generated images. “Why is she accused of disseminating and promoting AI sexual images when he is not accused of creating sexual images? I think it all has to do with his name.”

Police say they considered arresting the man under the same statute that led to his ex-girlfriend’s third-degree felony charge, but said they couldn’t because he didn’t “promote” or share the images, as far as they know.

Florida enacted a “deep fake” law in 2022 to address the use of AI in creating altered sexual depictions, but criminal charges under it require the images to be “promoted,” meaning shared and/or transmitted. Police say his ex-girlfriend broke the law when she shot a video of the altered images she had on her phone and sent that video to others, including the depicted victims.

Police say they also considered charging him under state child pornography laws. However, they say the images are not “sexual” enough to meet the legal standard of possession or representation of child pornography, even though they depict “nudity” and many of the girls were minors at the time.

The parents say police told them that while detectives believe the young man was using the photos for his own personal sexual gratification, the nudity alone did not make them “sexual.” According to police, he altered images the victims posted on social media by “undressing” them and adding AI-generated genitalia.

“Her crime is promoting AI-generated sexual photographs of an identifiable person. I said, ‘But they were his photos that he created,’ and they said, ‘That’s the gap in the statute that makes us so frustrated,’” Harmon said, recounting her conversations with investigators.

“‘The law she was accused of violating prohibits the promotion or transmission of the images because they depicted the girls in the nude. Sexual conduct is not required under this statute, as it is with child pornography. The problem is that the statute does not prohibit making or owning the images, which is all we could prove he did.’”

However, police admit they do not know whether he shared the images with others because they did not seize his mobile phone. They said there was not enough evidence that he had committed a crime to justify a warrant to seize and review the phone.

Attorney Autumn Beck Blackledge believes police are taking the wrong legal approach in how they apply state child pornography and child abuse laws. Blackledge’s daughter is also depicted in an altered nude image created by the young man, and they, along with other victims, want him held accountable.

“If promoting an image containing child sexual conduct is a crime and Miss Lee can be arrested for it, then under the same statute so is possession, control or viewing,” Blackledge said. “That’s what (the man) did. Not only did he generate the child pornography, he showed it to Miss Lee on the phone, and he showed it to other young people in this community who either weren’t interviewed or lied about seeing it. He uploaded the photos he created to the photo-sharing site Visco, and he possessed more than three pornographic images, which under the law indicates he intended to distribute them.

“Child pornography is defined in 827.071 and specifically includes depictions of ‘female genitalia,'” Blackledge continued. “… These photographs contain identifiable female genitalia of children. If the girlfriend could be charged, then he should be.”

About a dozen parents and victims contacted the News Journal about the case. While not everyone wants to go on the record or see the young man harshly punished, most want to see some accountability and believe his family connections influenced the outcome of the case.

Both Madden and police say that’s not true, despite skepticism among some parents.

“While it may be fair for her to be charged, it’s unfair that she was charged and he wasn’t,” Blackledge said. “This course followed by law enforcement will encourage continued abuse of women; it does not deter anti-female behavior but adds to that abuse by arresting only the woman and not the man. To be clear, (the man) created these photos with a paid service, saved them, displayed them, uploaded them and yet is not being held responsible.

“What is clear is that the PPD officers did not issue any subpoenas to retrieve (the man’s) phone, which would certainly prove that he did everything he is accused of and more. A full and complete investigation was not done,” she said.

How the “Undress Me” app works

Police and parents say the man used the “Undress Me” app to create the altered nude images. The app’s policy states that it can only be used by adults, and that users may not upload images of another person without that person’s consent or images of children.

The website appears to be primarily for the creation of personal “adult content”. Creators make deliberate decisions at every step of the process. Users can X-ray an image for free, but must pay if they want to “undress” someone with it.

There are other AI-generated options besides “undress” that are sexually explicit. Although authorities agree that the teenager’s intention was likely to create content for sexual gratification, they say this is not illegal. Police say this case is the third such case in Florida involving students and AI-generated sexual content.

“It would seem to us that if someone has a lot of naked pictures, there’s a sexual gratification that comes from that,” Madden said. “It would seem like common sense, but unfortunately the law doesn’t seem to support it. There needs to be either a change in the law or a law created to address this, and I think there will be because it’s happening more and more often.”

Parents agree that there needs to be changes in the law, but believe the outcome of this case sends the wrong message to victims.

“This is clear sexism,” Blackledge said. “I guess it’s still true that when a kid from a prominent family is accused of something that abuses women, the first thing people still think about is the bright future for this male child, you know, an Eagle Scout who’s headed to Auburn University, and how we wouldn’t want to mess that up over some naked photos of some girls.”

What’s next for the girl charged with sending the pictures

Madden said Lee is scheduled to be arraigned in juvenile court in December.