Georgia Legislators Target AI-Generated Child Sexual Abuse Material
Proposed legislation in Georgia seeks to criminalize the distribution of AI-generated sexually explicit images of children, with penalties of up to 15 years in prison, regardless of whether the depicted child actually exists. Senate Bill 9, which has garnered bipartisan support, appears poised to pass the Georgia General Assembly.
This bill arrives amid escalating apprehension surrounding the misuse of artificial intelligence to generate child sexual abuse material, a growing concern nationally. Similar legislative efforts have gained traction in states like California and Tennessee. The Georgia bill unanimously passed the House Technology and Infrastructure Innovation Committee and proceeded to the House Rules Committee, signaling strong support.
According to co-sponsor Sen. Emanuel Jones, the legislation is intended to deter AI crimes that are becoming increasingly prevalent across the country.
This legislative push gained urgency following the arrest of Ronald Richardson in Gilmer County. Richardson, a Pepsi vendor with access to Gilmer County High School’s soda machines, allegedly used his position to photograph young girls. Authorities claim he used AI to transform their photos into child pornography, which he then distributed online. Richardson was charged in January with multiple counts of sexual exploitation for possessing child pornography. The case, believed to be the first of its kind in Georgia, underscored the need for laws specifically addressing AI-generated child sexual abuse material. “There’s already been one case and it’s just a matter of time before others. We need to get ahead of it,” Jones stated.
The investigation began in December, when a student alerted the school resource officer to Richardson’s requests for videos; the Gilmer County Sheriff’s report detailed how Richardson frequently offered the student free soft drinks before requesting explicit images. Parents of eight alleged victims, aged 12 to 17 at the time, have since filed a lawsuit against Richardson and Pepsi.
Richardson was temporarily removed from his route but was later reinstated because Pepsi had difficulty finding replacement drivers, according to the lawsuit. Following further reports in December, Richardson was terminated; he is now in jail facing multiple charges of sexual exploitation. Although he was not employed by the school district, he held an access badge permitting him entry to both the high school and the middle school.
Gilmer County Sheriff Stacy Nicholson stated that numerous minors were victimized in the case. “All of the sexually explicit (nude) photographs of the minor children that Richardson is alleged to have possessed are actually normal snapshots which he captured from various social media pages and then altered (or had altered) using AI to make the images appear nude,” Nicholson said.
Richardson was denied bail, and the case was transferred to the DeKalb County District Attorney’s Office.
Sen. John Albers, R-Roswell, who chaired the Senate Study Committee on Artificial Intelligence and co-sponsored SB 9, emphasized the bill’s aim: “SB 9 is designed to modernize and strengthen the legal provisions concerning obscenity and the use of emerging technologies in criminal activities, thereby offering better protection to the citizens of Georgia from evolving threats and inappropriate materials.”
In addition to state-level actions, the U.S. Senate unanimously passed the Take It Down Act on February 13, which would criminalize the distribution of nonconsensual, sexually explicit images and videos. This act would also mandate that technology companies remove such images within 48 hours of receiving a victim’s notice. First Lady Melania Trump recently advocated for the bill’s passage at a Capitol Hill roundtable.
Thomas Kadri, an assistant professor at the University of Georgia School of Law, noted the current lack of legal avenues for removing sexual deepfakes. He added that states like Georgia could also create civil liability laws allowing victims to sue perpetrators.