Rep. Bilirakis Introduces Bill to Prohibit AI-Generated Child Pornography

Joseph Holzman
February 20th, 2025

This article explores the background and makeup of Rep. Gus Bilirakis's (R-FL) bill "To amend title 18, United States Code, to prohibit child pornography produced using artificial intelligence" (H.R. 1283).

On February 13, 2025, Rep. Gus Bilirakis (R-FL) introduced the bill “To amend title 18, United States Code, to prohibit child pornography produced using artificial intelligence” (H.R. 1283). This bill, if passed, would amend Title 18 of the United States Code, which covers federal crimes and criminal procedure.


What Does Child Pornography Have to Do with Artificial Intelligence?


Most people are aware of the capabilities of artificial intelligence (AI) through the widespread use of AI chatbots such as OpenAI’s ChatGPT and Google’s Gemini. These chatbots center on text-based tasks: they can help you write emails, debug code, and edit essays. However, the capabilities of AI extend beyond text. AI-powered image and video generation tools have become increasingly sophisticated and accessible to the general population in recent years. Tools such as DALL-E, MidJourney, and Stable Diffusion can now produce strikingly realistic images. This has raised ethical concerns over the use of AI to generate pornographic content, particularly content involving children.


Pornographic content involving children—both images and videos—is classified as Child Sexual Abuse Material (CSAM). CSAM is illegal under current U.S. federal law. The Protection of Children Against Sexual Exploitation Act of 1977 (S. 1585) is the cornerstone bill that outlawed the production and distribution of CSAM; later amendments extended the prohibition to possession. After passage, the bill was codified into Title 18 of the United States Code in 1978.


What Is Rep. Bilirakis Trying to Achieve With H.R. 1283?


One of the most significant amendments to Title 18 was the Child Pornography Prevention Act of 1996 (CPPA), which criminalized computer-generated and virtual CSAM even when no actual child was involved. However, in Ashcroft v. Free Speech Coalition (2002), the Supreme Court struck down key parts of the CPPA, ruling that because no real children are involved, the First Amendment protects purely fictional depictions, such as digital artwork or computer-generated imagery, unless they are presented as real or used for criminal purposes. As a result, Title 18 as it stands today does not explicitly ban AI-generated CSAM. It is this loophole that Rep. Bilirakis is seeking to close with H.R. 1283.


What Roadblocks Could H.R. 1283 Face?


Because H.R. 1283 would challenge the landmark Ashcroft ruling, it will likely face heavy constitutional scrutiny under the First Amendment. Opponents may argue that it sets a dangerous precedent for restricting digital content. H.R. 1283 must also define AI-generated CSAM in precise legal terms. While AI image and video generation has advanced significantly, what qualifies an image as realistic or sexualized remains a major legal gray area. Law enforcement and policymakers must therefore develop mechanisms to differentiate AI-generated CSAM from real CSAM.


With regard to legislative and political barriers, H.R. 1283 could face pushback from both lawmakers and lobbyists. Although child protection has historically been a bipartisan issue, some lawmakers, particularly those hesitant to expand AI regulation, may resist government intervention in private-sector technology. Lawmakers on both sides of the aisle may also oppose the bill over free speech concerns. On the lobbying side, Big Tech and AI companies such as Google and OpenAI may lobby against the bill if it creates substantial liability risks. Social media companies such as Meta and X may also push back if criminalizing AI-generated CSAM results in stricter content moderation requirements.