
The rise of deepfake pornography in schools: ‘One girl was so horrified she vomited’

about 11 hours ago

‘It worries me that it’s so normalised. He obviously wasn’t hiding it. He didn’t feel this was something he shouldn’t be doing. It was in the open and people saw it. That’s what was quite shocking.’

A headteacher is describing how a teenage boy, sitting on a bus on his way home from school, casually pulled out his phone, selected a picture from social media of a girl at a neighbouring school and used a “nudifying” app to doctor her image. Ten years ago it was sexting and nudes causing havoc in classrooms. Today, advances in artificial intelligence (AI) have made it child’s play to generate deepfake nude images or videos, featuring what appear to be your friends, your classmates, even your teachers. This may involve removing clothes, getting an image to move suggestively or pasting someone’s head on to a pornographic image. The headteacher does not know why this particular girl – a student at her school – was selected, whether the boy knew her, or whether it was completely random.

It only came to her attention because he was spotted by another of her pupils, who realised what was happening and reported it to the school. The parents were contacted, the boy was traced and the police were called in. But such is the stigma and shame associated with image-based sexual abuse and the sharing of deepfakes that a decision was made that the girl who was the target should not be told. “The girl doesn’t actually even know,” the head said. “I talked to the parents and the parents didn’t want her to know.”

The boy on the bus is just one example of how deepfakes and easily accessed nudifying technology are playing out among schoolchildren – often to devastating effect. In Spain last year, 15 boys in the south-western region of Extremadura were sentenced to a year’s probation after being convicted of using AI to produce fake naked images of their female schoolmates, which they shared on WhatsApp groups. About 20 girls were affected, most of them aged 14, while the youngest was 11. In Australia, about 50 high school students at Bacchus Marsh grammar school in Victoria reported that their images had been faked and distributed – the mother of one student said her daughter was so horrified by the sexually explicit images that she vomited. In the US, more than 30 female students at Westfield high school in New Jersey discovered that deepfake pornographic images of them had been shared among their male classmates on Snapchat.

It’s happening in the UK, too. A new poll of 4,300 secondary school teachers in England, carried out by Teacher Tapp on behalf of the Guardian, found that about one in 10 were aware of students at their school creating “deepfake, sexually explicit videos” in the last academic year. Three-quarters of these incidents involved children aged 14 or younger, while one in 10 incidents involved 11-year-olds, and 3% were younger still, illustrating just how easy the technology is to access and use. Among participating teachers, 7% said they were aware of a single incident, and 1% said it had happened twice, while a similar proportion said it had happened three times or more in the last academic year. Earlier this year, a Girlguiding survey found that one in four respondents aged 13 to 18 had seen a sexually explicit deepfake image of a celebrity, a friend, a teacher or themselves.

“A year ago I was using examples from the US and Spain to talk about these issues,” says Margaret Mulholland, a special needs and inclusion specialist at the Association of School and College Leaders. “Now it’s happening on our doorstep and it’s really worrying.” Last year the Times reported that two private schools in the UK were at the centre of a police investigation into the alleged making and sharing of deepfake pornographic images. The newspaper said police were investigating claims that the deepfakes were created at a boys’ school by someone manipulating images taken from the social media accounts of pupils at a girls’ school. The children’s commissioner for England, Dame Rachel de Souza, has called for nudification apps such as ClothOff, which was investigated as part of the Guardian’s Black Box podcast series about AI, to be banned.

“Children have told me they are frightened by the very idea of this technology even being available, let alone used,” she says. It’s not easy to find teachers willing to speak about deepfake incidents. Those who agreed to be interviewed by the Guardian insisted on strict anonymity. Other accounts were provided by academics researching deepfakes in schools, and providers of sex education. Tanya Horeck, a professor of film and feminist media studies at Anglia Ruskin University, has been talking to headteachers as part of a fact-finding mission to uncover the scale of the problem in schools.

“All of them had incidents of deepfakes in their schools and they saw this as an emerging problem,” she says. In one case, a 15-year-old girl who was new to a school was targeted by male students who created a pornographic deepfake video of her. She was so distressed she initially refused to go to school. “Almost all the examples they told me about were boys making deepfakes of girls,” says Horeck. “The other thing that I noticed is that there’s this real tension around how they should handle these issues.

So some teachers were saying, ‘Yeah, we just get the police in right away and students are expelled’ – that kind of approach,” Horeck says. “Then other teachers were saying, ‘Well, that’s not the way to handle it. We’ve got to have more of a restorative justice approach, where we’re talking to these young people and finding out why they’re doing these things.’

“So there seems to be some kind of inconsistency and uncertainty on how to deal with these cases – but I think it’s really hard for teachers because they’re not getting clear guidance.” Laura Bates, the founder of the Everyday Sexism Project, says there is something particularly shocking about deepfake images.

In her book The New Age of Sexism: How the AI Revolution Is Reinventing Misogyny, she writes: “Of all the forms of abuse I receive they are the ones that hurt most deeply – the ones that stay with me. It’s hard to describe why, except to say that it feels like you. It feels like someone has taken you and done something to you and there is nothing you can do about it. Watching a video of yourself being violated without your consent is an almost out-of-body experience.” Among school-age children, the impact can be huge.

Girls and young women are left feeling violated and humiliated. School friendship groups are shattered, and there can be a deep sense of betrayal when one student discovers another has created a deepfake sexualised image of them and shared it around the school. Girls can’t face lessons, while teachers with little training do their best to support and educate. Meanwhile, boys and young men are being drawn into criminal behaviour, often because they don’t understand the consequences of their actions. “We do see students who are very upset and feel betrayed and horrified by this kind of abuse,” says Dolly Padalia, the CEO of the School of Sexuality Education, a charity providing sex education in schools and universities.

“One example is where a school got in touch with us. A student had been found to have taken images of lots of students within the year group and was making deepfakes.

“These had then been leaked, and the fallout was quite significant. Students were really upset. They felt very betrayed and violated.

It’s a form of abuse. The police were involved. The student was removed from school and we were asked to come in and support. The school responded very, very quickly, but I would say that’s not enough. In order for us to really be preventing sexual violence, we need to be more proactive.”

It is estimated that 99% of sexually explicit deepfakes accessible online are of women and girls, but there are cases of boys being targeted. The charity Everyone’s Invited (EI), which collects testimonies from survivors of sexual abuse, has run into at least one such case: “One student shared with the EI education team that a boy in their year group, who was well liked and friends with many of the girls, was targeted when another boy created an AI-generated sexual image of him. That image was then circulated around the school, causing significant distress and trauma.” EI also flags how these tools are being trivialised and used in disturbing ways, such as the “changing your friend into your boyfriend” filter. “On social media platforms like TikTok and Snapchat, they are increasingly accessible and normalised.

While this may seem playful or harmless to some, it reflects and reinforces a culture where consent and respect for personal boundaries are undermined.” Against a backdrop of widespread misogyny in schools, a growing number of teachers are also being targeted, EI and others report: “It is something that, as a society, we urgently need to confront. Education has to stay in front of technology, and adults must feel equipped to lead these conversations rather than shy away from them.” Seth James is a designated safeguarding lead – a senior member of staff with overall responsibility for child protection and safeguarding within a school – and the author of the DSL Blog. “For everyone working in schools, it feels like new sets of challenges and risks are constantly being thrown up by technological developments,” he says.

“AI generally – and particularly deepfakes and nudify apps – feel like the next train coming down the track.

“‘More education’ is appealing as a solution to these sorts of challenges – because it’s intuitive and relatively easy – but on its own it’s like trying to hold back a forest fire with a water pistol. And likewise, the police seem completely overwhelmed by the scale of these issues. As a society we need broader solutions, and better strategy.” He continues: “We should all try to imagine how we would have felt 20 years ago if someone had suggested inventing a handheld device which could be used to create realistic pornographic material that featured actual people that you know in real life.

And then they’d suggested giving one of these devices to all of our children. Because that’s basically where we are now. We’re letting these things become ‘normal’ on our watch.” Jessica Ringrose, a professor of sociology of gender and education at University College London’s Institute of Education, has worked in schools on issues including masculinity, gender inequality and sexual violence. She is also co-author of a book called Teens, Social Media, and Image Based Abuse, and is now researching tech-facilitated gender-based violence.

“The way that young people are using these technologies is not necessarily all bad,” she says, “but what they need is better media literacy.” She welcomes the government’s updated relationships, sex and health education guidance, which “recognised that misogyny is a problem that needs to be tackled in the school system”. However, she says: “They need to put the dots together. They need to join up a concern with gender and sexual-based violence with technology. You can’t rely on Ofcom or the regulators to protect young people.

We need proactive, preventive education.” Where is the government in all this? “Our new relationships, sex and health education guidance will make sure that all young people understand healthy relationships, sexual ethics and the dangers of online content such as pornography and deepfakes,” a Department for Education spokesperson said. “As part of our Plan for Change mission to halve violence against women and girls, we are also providing schools with new funded resources to help teachers explain the law and harms relating to online content as part of their age-appropriate lessons.” Ringrose stresses the urgency: “These issues are happening – non-consensual creation and distribution of images is happening.

These technologies are at people’s fingertips. I mean, it’s super-easy for any kid to access these things.” She is sceptical about efforts to ban smartphones in schools and worries they will make it harder for young people who may be targeted with abusive imagery to seek help. “Abstinence around things like technology doesn’t work,” she says. “You actually have to teach people to use it properly.

We need to engage with this as a really important element of the curriculum.” Which takes us back to the boy on the bus, where this story began. He was stopped because a girl on the same bus had recently had a lesson in school about online safety as part of her PSHE (personal, social, health and economic) curriculum. She recognised what he was doing and told her teachers. Education works.

In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.

Do you have an opinion on the issues raised in this article? If you would like to submit a response of up to 300 words by email to be considered for publication in our letters section, please click here.
