Miami Teens Arrested for Creating AI-Generated Nude Images of Classmates
March 14th, 2024: Two teenagers from Miami, Florida, aged 13 and 14, were arrested on December 22, 2023, for allegedly creating and sharing AI-generated nude images of their classmates without consent.

According to a police report cited by WIRED, the teenagers used an unnamed “AI app” to generate the explicit images of male and female classmates, ages 12 and 13.

The incident, which took place at Pinecrest Cove Academy in Miami, led to the suspension of the students on December 6th and was subsequently reported to the Miami-Dade Police Department.

The arrests and charges against the teenagers are believed to be the first of their kind in the United States related to the sharing of AI-generated nudes.

Under a 2022 Florida law that criminalizes the dissemination of deepfake sexually explicit images without the victim’s consent, the teenagers face third-degree felony charges, the same level as crimes such as car theft or false imprisonment.

As of now, neither the parents of the accused boys nor the investigator and prosecutor handling the case have commented.

The issue of minors creating AI-generated nudes and explicit images of other children has become increasingly common in school districts across the country.

While the Florida case is the first known instance of criminal charges related to AI-generated nude images, similar cases have come to light in the US and Europe.

The impact of generative AI on matters of child sexual abuse material, nonconsensual deepfakes, and revenge porn has led to various states tackling the issue independently, as there is currently no federal law addressing nonconsensual deepfake nudes.

President Joe Biden has issued an executive order on AI directing agencies to report on banning the use of generative AI to produce child sexual abuse material, and both the Senate and House have introduced legislation, known as the DEFIANCE Act of 2024, to address the issue.

Although the naked bodies depicted in AI-generated fake images are not real, they can appear authentic, potentially leading to psychological distress and reputational damage for the victims.

The White House has called such incidents “alarming” and emphasized the need for new laws to address the problem.

The Internet Watch Foundation (IWF) has also reported that AI image generators are leading to an increase in child sexual abuse material (CSAM), complicating investigations and hindering the identification of victims.
