IEU deplores ‘deepfake’ sexual harassment of staff and students

The IEU is deeply concerned that members in several schools have been targeted by fake pornography generated by students.

IEU General Secretary David Brear said the IEU has recently provided support for several teachers who were targeted with AI-generated fake pornography by students at their schools.

“These have been devastating events for the staff involved and their families,” David told ABC News.

“It remains a real problem that some schools and school systems have not taken incidents seriously enough, thereby redoubling the trauma suffered by those targeted. 

“Schools must make it an immediate priority to educate students about the responsible, lawful use of AI and social media. Students must be made aware that this sort of vile behaviour is criminal and that it will be reported to police. These discussions should take place as part of a broader discussion around respect for others, in particular respect for women.”

ABC News quoted the IEU Victoria Tasmania branch in its reporting on the topic.

The prevalence of such incidents has risen dramatically.

The ABC News report quoted a victim of deepfake porn who said, “perpetrators often don't think about the lifelong consequences this type of material can have on a person”.

“This has the capacity to affect every aspect of your life, from your employability, from your future earning capacity, from your reputation to your emotional and mental and physical health,” she said.

"It's reached points where I'm like I want to change my name and travel overseas and forget this ever happened.  

"You can't escape it. It always follows you." 

The Age has reported that there are estimates that there were 95,820 deepfake videos online in 2023 – “550 per cent more than in 2019”.

“Various reports estimate that between 96 and 98 per cent of those are non-consensual pornography. The biggest deepfake porn sites generate millions of hits a month.”

In October 2023, the national eSafety office warned schools to overhaul their safety policies after the deepfake phenomenon arrived in Australian schools from the US.

The eSafety Commissioner, Julie Inman-Grant, warned that the phenomenon was far more widespread than commonly understood.

In mid-June, a teenage boy was arrested after fake nude images of 50 girls at a regional Victorian private school were shared on social media.

Faces of the girls, in years 9 to 12 at Bacchus Marsh Grammar, were used in images of nudes generated by artificial intelligence and put on Instagram, then shared by students on Snapchat.

The Herald Sun reported that a 15-year-old student from Catholic boys’ school Salesian College, in Chadstone, was expelled in early June for using AI to produce explicit ‘deepfake’ images of a teacher at the school.

The Oxford Dictionary defines a deepfake as “a video of a person in which their face or body has been digitally altered so that they appear to be someone else, typically used maliciously or to spread false information”.

It is imperative that students and schools understand that manipulating anyone’s likeness and sharing it is a reprehensible act, a gross breach of trust and privacy. It is also an illegal action that should be met with criminal prosecution.

Incidents of such behaviour must not be swept under the rug or treated as harmless pranks.

Too many schools and employers aren’t taking the issue seriously enough.

At one boys’ school where a group of students were caught creating ‘deepfake’ pornographic images of staff, union members raised the alarm with school and employer leadership, emphasising the severe impact of the incidents upon educators and the school community.

The perpetrators were very briefly suspended and told they could not bring smartphones to class. But when they returned to lessons, they defied the smartphone ban, without penalty. Then mates of the perpetrators physically harassed staff who had bravely spoken out about the misbehaviour. The lack of accountability created a hostile, unsafe learning environment and exacerbated bullying behaviour towards both staff and students.

Many staff at this school have expressed dissatisfaction with how the cases were handled, feeling that the ‘consequences’ for students simply didn’t adequately address the seriousness of the situation or prevent future misconduct.

Members at that school have stressed the importance of ensuring educator safety and upholding trust in educational institutions, calling for stronger disciplinary measures, policy reviews, and increased support for vulnerable teachers.

To help improve the situation at the school, the IEU offered to provide supportive staff discussions, known as Women’s Rights at Work (WRAW) chats, designed to empower and support impacted staff. School leadership and the employer rejected this proposal.

Urgent action is needed to address these sorts of disciplinary shortcomings and to better support affected teachers. Surely students suspended for manipulating digital images cannot be allowed to bring devices to class that can take photographs and edit images.

Policies governing student use of AI and smartphones need to be made fit for purpose in the 21st century. Smartphones are powerful computers that can be used for egregious harassment, and the entire education system needs to get up to speed with their destructive potential.

But the first step for every school and employer is simple: treat the creation and distribution of deepfakes as the awful criminal acts they are. 

 
