By Redaccion
In response to the rise of AI-generated child sexual abuse material (CSAM), Ventura County District Attorney Erik Nasarenko has teamed up with California Assemblymember Marc Berman, who recently introduced Assembly Bill 1831. The new legislation aims to address the escalating threat posed by artificial intelligence (AI) in the creation of lifelike, illicit content involving children. CSAM refers to visual depictions of sexually explicit conduct involving minors.
“This legislation is in response to the dangerous convergence of artificial intelligence and child exploitation,” District Attorney Erik Nasarenko said. “As technology evolves, so must our laws. This bill sends a clear message that our society will not tolerate the malicious use of artificial intelligence to produce harmful sexual content involving minors.”
District Attorney Nasarenko is a co-sponsor of the bill, along with the California District Attorneys Association, Common Sense Media, SAG-AFTRA, and University of San Diego School of Law Children’s Advocacy Institute. This unique coalition demonstrates a broad range of support for efforts to curtail the sexual exploitation of children.
“The sexual exploitation of children must be illegal, full stop. It should not matter that the images were generated by AI, which is being used to create child sexual abuse material that is virtually indistinguishable from a real child,” said Assemblymember Marc Berman. “We must stop the exploitation of children. It is critical that our laws keep up with rapidly evolving AI technology to ensure predators are being prosecuted and children are being protected.”
Assemblymember Berman represents the 23rd California Assembly District, which includes southern San Mateo County and northern Santa Clara County in the heart of Silicon Valley. Berman has brought similar bills forward, including one regarding the creation of deepfakes, which are videos and images of real people manipulated to make them look and sound like someone else.
In the digital age, advancements in AI and computer technology enable the creation of highly realistic computer-generated content, including CSAM. This poses a significant risk, particularly to real children, including past victims whose images form the basis for the machine-generated content.
Hundreds of CyberTips from the National Center for Missing and Exploited Children are reviewed every month by expert investigators within the District Attorney’s Office Bureau of Investigation. One recent example of CSAM discovered by Bureau investigators involved an AI-generated video uploaded to YouTube, which showed a child engaging in intercourse with an adult male. This CyberTip could not be prosecuted because it was determined by Ventura County investigators that the content did not depict actual people.
In another example, investigators from the Ventura County Sheriff’s Office investigated the possession and transfer of CSAM among three individuals. It was determined one of the suspects was using his computer to create CSAM images. He admitted to creating and distributing made-to-order sexually explicit images of children for financial gain. Despite the obscene nature of the images and the fact that they appeared to depict young children, he could not be prosecuted under existing law because no actual child was depicted.
Existing law prohibits the creation, distribution, and possession of CSAM depicting an actual child. It does not address AI-generated images that depict the likeness of a child, even those that are virtually indistinguishable from an actual child, or those that are AI-generated to look like a known child or child celebrity. AB 1831 would prohibit the creation, distribution, and possession of obscene CSAM images generated by artificial intelligence, adding new restrictions to California law.
AB 1831 was introduced on January 12, 2024, and is expected to be heard soon in the Assembly Public Safety Committee.