
Science & Technology in Policing

New Research Highlights Public Attitudes Towards Non-Consensual Sexual Deepfakes

Publication

A new study commissioned by the Office of the Police Chief Scientific Adviser and conducted by Crest Advisory reveals concerning public attitudes towards non-consensual sexual deepfakes, a growing form of technology-enabled abuse.

The nationally representative survey of 1,700 people aged 16 and over in England and Wales found that around 25% of respondents agreed with, or felt neutral about, the legal and moral acceptability of viewing, sharing, creating or selling sexual or intimate deepfakes, even when the person depicted had not consented.

Key Findings

  • Prevalence and Exposure: 67% of respondents said they had seen, or might have seen, a deepfake. Most described the content as humorous (43%), scam-related (43%) or political (42%). However, 21% admitted to viewing a sexual deepfake of someone they do not know, and 14% had seen one of someone they know.
  • Attitudes and Risk Factors: Those who considered sexual deepfakes acceptable were more likely to:
    • Hold misogynistic views
    • Actively watch pornography
    • Feel positively about AI
    • Be younger males
  • Public Concern: Six in ten people reported being ‘very’ or ‘somewhat’ worried about having a deepfake made of them, with women more likely to feel this way.
  • Harm Perception: While 92% agreed sexual deepfakes are harmful, many respondents believed other crimes, such as phone theft, are more damaging. This underestimation may discourage victims from reporting abuse.
  • Legal Awareness: Only 14% were aware of current legislation, despite the Government having introduced a new offence earlier this year that carries penalties of up to two years in prison for creating sexually explicit deepfakes.


Why This Matters for Policing

Deepfake technology is becoming cheaper, more accessible, and increasingly normalised. The vast majority of deepfake videos online are sexually explicit and disproportionately target women and girls. Crest Advisory’s review of existing evidence shows the psychological impact of deepfake abuse can mirror the effects of sexual assault.

Report author Callyane Desroches, Head of Policy and Strategy at Crest Advisory, said:

“The creation of deepfakes is rapidly rising and becoming increasingly normalised as the technology to make them becomes cheaper and more accessible.

Existing surveys have focused on political or financial deepfakes. We have sought to fill this gap to better understand public attitudes towards the growing use of deepfake technology as a form of abuse against women and girls.

While some deepfake content may seem harmless, the vast majority of video content is sexualised – and women are overwhelmingly the targets.

We are deeply concerned about what our research has highlighted – that there is a cohort of young men who actively watch pornography and hold views that align with misogyny who see no harm in viewing, creating and sharing sexual deepfakes of people without their consent.

People under the age of 45 are more likely to be aware of and exposed to deepfakes. And at the same time, there is a lack of awareness about the legal implications of creating and sharing deepfakes.

More research is needed to understand how we can best educate the public about the harm that this content can cause – and how what is being watched online impacts a person’s attitudes and actions offline.

Lastly, we want to look further into what policing and technology companies require to effectively enable intelligence led policing in this new and evolving space.” 

Paul Taylor, Chief Scientific Adviser for Policing, added:

“This research was commissioned as part of a wider programme to support policing in understanding and responding to the evolving threat of deepfakes, and to align with Government and policing priorities to improve the response to online harms and halve violence against women and girls (VAWG) within a decade.

Crest Advisory’s work provides a scientifically grounded and nationally representative view of public attitudes towards deepfakes, highlighting concerns around the growing use of deepfake technology as a form of gender-based violence against women and girls. Their focus on the psychological and emotional impact of this abuse is essential to influencing the approach required across the justice system to tackle this threat.”

Read the full Crest Advisory report for detailed findings.