
HARRISBURG: Senators Tracy Pennycuick (R-24), Scott Martin (R-13) and Lisa Baker (R-20) have introduced a bill to better protect young people against the serious threats posed by child abuse materials generated using artificial intelligence (AI).
The Senators noted there has been a startling increase in the amount of AI-generated child sexual abuse material (CSAM) being created and shared in recent years, including troubling cases in school settings.
Senate Bill 1050 would require mandated reporters – such as teachers, child care workers and health care providers, among others – to report all instances of CSAM they become aware of, including those produced by a minor.
“The damage inflicted by the creation and dissemination of these materials can be devastating. For minors, already in the formative stages of their lives, the trauma, anxiety, and fear caused by these acts can be life-altering,” Pennycuick said. “While our progress on this issue has been significant, we need to do more to protect our children from the dangers of CSAM, especially when these images surface in school settings.”
“Protecting children from emerging threats in the digital world is a critical need. The tools that criminals use to target young people are constantly changing and evolving, and the law needs to change to keep pace,” Martin said. “I appreciate all the stakeholders and families who have worked with us over the past several months to bring forward this comprehensive legislation addressing mandated reporting so we can ensure fewer children are subjected to these incredibly damaging materials.”
“Ensuring the safety and security of our children is of the utmost importance. While we have made great strides in addressing the use of AI in CSAM, we must continue these efforts by updating our child protection laws to require that these cases be reported to the appropriate authorities,” said Baker. “This bill is an essential first step toward targeting those who find novel ways of luring children, whether the producers are adults or fellow students.”
In the past year, the Senate has already taken important steps to combat sexually explicit materials created through AI by passing Act 125 of 2024 and Act 35 of 2025, which addressed deepfakes and sexual deepfakes. Senate Bill 1050, a bipartisan proposal, would build on these accomplishments and ensure cases involving CSAM are reported and investigated promptly.
CONTACT:
Liz Ferry (Senator Pennycuick)
Jason Thompson (Senator Martin)
Jennifer Wilson (Senator Baker)