
Independent MP Kate Chaney has introduced a bill to criminalize AI-generated child sexual abuse material, closing a loophole in current law that fails to cover synthetic content, with offenders facing up to 15 years in prison.
The bill would ban creating, distributing, or training AI models to produce exploitative content, while exempting law enforcement's investigative use. With AI-generated child abuse material surging globally, Chaney argues Australia must act now to prevent perpetrators from exploiting undetectable synthetic imagery.
Currently, while possessing or sharing child abuse material is illegal, there is no specific prohibition against downloading or sharing the AI tools used to generate such content.
Chaney highlighted the growing accessibility of these technologies, noting that some of the most popular AI generators attract millions of visits online.
“[This] clearly needs to be done urgently and I can’t see why we need to wait to respond to this really significant and quite alarming issue,” said Chaney.
Chaney explained that while legislating AI is a difficult task, there are quick steps lawmakers can take to close existing legal gaps around high-risk applications, such as AI tools that create child abuse material.
The Bill's Main Terms
Chaney's bill would create new offenses for using a carriage service to access, distribute, or create child abuse material with AI tools. It would also criminalize sharing or scraping data to train AI models that generate such material.
Offenders would face up to 15 years in prison. The bill exempts law enforcement and intelligence agencies that use such technologies for investigative purposes, ensuring they are not hindered in their fight against child abuse.
“These tools enable the on-demand, unlimited creation of this type of material, which means perpetrators can train AI tools with images of a particular child, delete the offending material so they can’t be detected, and then still be able to generate material with word prompts,” Chaney explained.
This legal gap around AI tool misuse makes it even more difficult for police to track these crimes.
Legislators Face a Moral Test
With AI technology developing rapidly, lawmakers face a fundamental moral challenge. The risk of delay is glaring: child abuse material produced with AI is becoming increasingly difficult to detect and trace. These tools not only make policing harder but also help perpetrators evade justice by erasing all evidence of their crimes.
“I think it’s incredibly important to recognize that every AI generated image starts with a real child. That child is harmed somewhere in the process,” said Chaney.
“This legislation is a clear step toward plugging the gaps in current laws to protect our children from these evolving threats.”
Child Protection Experts Back the Bill
Child protection experts wholeheartedly support the bill, believing it addresses a critical gap in the existing legal framework.
Former police detective inspector Jon Rouse stated that while Australia's existing laws, under the Criminal Code Act 1995, address the prosecution of child sexual abuse content production, they do not yet cover the use of AI to create such content.
Colm Gannon, Australian chief of the International Centre for Missing and Exploited Children, also voiced his support. There is a strong consensus that AI tools for generating child sexual abuse material have no place in society, and the bill is a "clear and targeted step to close an urgent gap," Gannon said.
Attorney-General Michelle Rowland reaffirmed her commitment to combating child exploitation, pointing to the government's strong legislative framework, while acknowledging that laws must continually adapt to keep pace with technological threats targeting children.
With AI advancing quickly, Chaney believes the government must prioritize the bill in this parliamentary session.
“As Attorney-General, I am fully committed to combating child sexual exploitation and abuse in all settings, including online, and the government has a robust legislative framework in place to support this,” Rowland said.
As the AI landscape continues to grow, the need for strict and enforceable amendments to the Criminal Code Act has never been greater, safeguarding vulnerable children from the risks posed by these rapidly emerging technologies.