Deepfake Images at Sydney School Prompt Police Probe
Overview: Police investigating deepfake images from a Sydney school

Australian police are investigating reports that sexually explicit images—created by digitally altering the faces of female students from a Sydney high school—have circulated online. The reports have prompted concern among families, school staff, and government officials about the privacy and wellbeing of students in the digital age.

According to statements from Ryde Police Area Command, officers have commenced an investigation, and inquiries are ongoing. Several families attended Eastwood Police Station on Wednesday evening in connection with the case. The Department of Education has said the school is cooperating with authorities as the investigation proceeds.

What happened and who is involved

While details are still emerging, the incident reportedly began after a male student who was sent the images alerted his school. The matter has since drawn attention from both local police and education authorities, who are working to understand how the deepfakes were created and circulated, and to safeguard students’ privacy and safety.

Education officials emphasised that such content affects not just the individuals depicted but the broader school community, noting the harms associated with manipulated images circulated among peers and online networks.

Official response and next steps

A Department of Education spokesperson underscored ongoing collaboration with law enforcement and highlighted the need for support services at the school involved. Acting Education Minister Courtney Houssos said the issue is being discussed at departmental and ministerial levels, both in New South Wales and nationally, acknowledging that the problem extends beyond a single state.

Minister Houssos described deepfake risks as a nationwide challenge, stressing that schools are microcosms of broader social trends. She pledged that appropriate support would be provided to students and staff while authorities investigate, determine responsibility, and work to prevent recurrence.

NSW laws: strengthening penalties for AI-generated sexual imagery

In a proactive step, the New South Wales Parliament recently strengthened laws to curb the creation of intimate images using artificial intelligence. The Crimes Act 1900 was amended to make it illegal to produce sexually explicit deepfakes of a real, identifiable person without their consent, with penalties of up to three years in jail. Attorney-General Michael Daley noted the changes were designed to keep pace with technology and to deter exploitation, particularly in school communities.

The Deputy Attorney-General and education advocates have highlighted the severe impact of deepfakes on young people, including risks to mental health and safety. The legal reforms aim to close gaps that allowed the covert creation of intimate images and to deter individuals from engaging in these harmful practices.

National context and ongoing response

Discussions at the federal level have also focused on tightening controls around AI-enabled manipulation. Communications Minister Anika Wells has signalled a crackdown on tools such as "nudify" apps, indicating a broader government strategy to address emerging online safety threats faced by students across Australia.

Education leaders have reiterated that technology literacy and digital citizenship are essential components of student welfare. Schools are encouraged to implement clear reporting channels, support services, and privacy safeguards to respond quickly when incidents like this emerge.

What families can do

Parents and carers are advised to talk with children about online safety, privacy, and the potential harms of deepfakes. Schools should review their policies on digital behaviour, ensure access to counselling services, and communicate with families about available resources. The investigation remains active, and authorities will provide updates as more information becomes available.