
Modern sex education must address AI risks

Act for Kids

The growing risks associated with artificial intelligence must be incorporated into age-appropriate sex education talks with kids, a leading child protection organisation has warned.

Act for Kids says parents, carers, schools and educators need to be informed and prepared, with generative AI emerging as a growing trend in online child abuse, exploitation and sextortion material.

The warning follows a spate of serious incidents involving Australian school students using AI technology to generate illegal child abuse material depicting classmates and peers.

In 2024, the National Center for Missing & Exploited Children (NCMEC) flagged a staggering 1325 per cent rise in incidents involving AI-generated child sexual abuse and exploitation content, according to its CyberTipline Report.

Similarly, Australia's eSafety investigators noted a 218 per cent increase in reports of AI-generated child sexual abuse and exploitation content between 2023 and 2024.

Act for Kids CEO Dr Katrina Lines said it was disturbing how rapidly generative AI had become accessible, easy to use and widely available across online platforms.

“Addressing this issue involves much-needed legislative change, but it also presents a new and important opportunity for parents and carers to educate themselves about the risks and impacts of AI,” Dr Lines said.

Act for Kids national ambassador and former detective inspector Jon Rouse APM has dedicated his career to protecting children from sexual abuse.

His research, conducted with the Canadian Centre for Child Protection, found child sexual abuse material distributed online could continue to circulate on the internet two decades after a case was investigated and marked as “solved”.

“Artificial intelligence adds another obstacle in the ongoing fight against online circulation of child abuse material,” Mr Rouse said.

Act for Kids is encouraging everyone who has kids or works closely with kids to talk with them about the appropriate use of AI, or to incorporate this into age-appropriate conversations about sex education and consent.

“Young people need to know that there are real-life consequences when AI is used to generate child abuse material – it’s a crime and they could face very real penalties,” Dr Lines said.

Mr Rouse added, “This is a stark reminder of the pervasive trauma this offending can cause survivors and, overarchingly, how significant the global failure has been to address it.”

 

Act for Kids’ tips on how to speak to kids about sex education:

• Start conversations with your child from a young age.

• Keep conversations open and age-appropriate – follow your child’s lead and use words they can understand.

• Use the correct anatomical words for body parts.

• Answer questions in a calm, casual manner.

• Ask your child what they already know about AI so you can ensure they have the appropriate information.

• Don’t make it awkward – it’s important to remember if you don’t talk to them, they may get their information online or from an unsafe or unreliable source.

• Talk regularly, rather than having ‘the chat’.

• Explain the importance of consent, especially in a sexual context – ‘yes’ means yes, ‘no’ means no, and ‘maybe’ means no.

• Seek resources if you’re unsure about a topic.

• Remind your child they can always ask you questions and talk to you.

For children needing support:

• Kids Helpline: 1800 551 800 or www.kidshelpline.com.au

• Lifeline: 13 11 14 or www.lifeline.org.au

If you are experiencing online harm or abuse, whether or not generative AI is involved, you can make a report to eSafety.


Contact details:

Melanie Whiting

0427 794 666

[email protected]