Deepfakes as a Violation of Consent

TL;DR

Young Woman Leader Dibya Dahal argues that our image is a part of our identity, and so when sexual digital forgeries are made without a person’s consent, their identity and agency are both stolen.

Sexual digital forgeries, or “deepfake” technology, have moved beyond novelty or satire: the overwhelming majority of deepfakes are now pornographic in nature and disproportionately target women.

This escalation in the creation of sexual digital forgeries raises pressing questions about identity ownership and consent.

We generally accept that sexual digital forgeries wrong the person depicted because they violate their identity and their right to consent, but how is this so?  

Image: A picture of a young woman evolves into glitching pixels.
Credit: Reihaneh Golpayegani / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

How are pictures linked to our identity? 

Identity is not just a matter of public recognition, of how we are identifiable to others; its importance lies in its connection to personal agency and self-representation. Sexual digital forgeries fracture this link between the social and the personal by producing representations that appear to be “real” but sit outside the victim’s control and agency.

Our images are not a neutral record of appearance. They are bound up with our personal (and in this case sexual) identity, and the images we choose to share are a way of choosing how we wish that identity to be represented. By fabricating sexualised depictions from real photographs, these forgeries appropriate and distort that identity, presenting the person in a way they never agreed to be portrayed. The harm here arises from the absence of consent to create a sexual image at all.

Digital forgeries undermine the integrity of a person’s public and private self, and they do so in a way that cannot be dismissed as harmless fantasy: the forged representation trades on the person’s actual image. But what reasons do we have for holding that pictures of ourselves are linked to our identity, and thus to our agency? Philosopher Claire Benn (2024) develops an argument that there are two ways an image can be ‘of us’: through resemblance or through a material connection to a real photograph.

Benn’s analysis of consent in deepfakes offers a thoughtful starting point for viewing forged images as a violation of consent. She argues that an image can be ‘of us’ in two ways: (i) through identifiability and (ii) through material connection (it is causally linked to a photograph of us). Identifiability refers to the ability of others to recognise the person in the image, while material connection refers to the fact that forged images are derived from actual photographs of people: there is a causal link between the victim and the final image that persists even when the image is altered.

Why consent matters

A sexual digital forgery retains the physical likeness of the person whose visual, identifiable identity it appropriates. However extensively the image is altered, it remains linked to the identity of the person whose likeness was taken. Sexualising an image should therefore require separate consent, distinct from agreeing to be photographed or to take part in sexual activity.

The harm is not only about whether a sexual act happened; forging images without permission also harms a person by stealing their likeness, and with it their identity. This distinction matters because sexual digital forgeries often start from ordinary, non-sexual images – for example a holiday photo from someone’s social media, or a publicity still of a celebrity at an event – which are then altered into explicit content. The harm befalls adults, who can choose whether to consent, and children, who cannot. If images are ‘of us’ in Benn’s sense, then using someone’s likeness or photo to create a sexualised forgery wrongs them. Recognising non-consensual sexual digital forgeries as a violation of consent makes clear that they are another alarming attempt to seize control of the victim’s autonomy over their own identity.

For women, the issue intersects with longstanding societal patterns of objectification. Feminist critiques of pornography describe how pornography sexualises inequality, treating women’s bodies as raw material for others’ gratification. Sexual digital forgeries take this even further, intensifying this problem as they bypass the need for a woman’s participation entirely, all the while still presenting her as a sexual object. 

Ultimately, sexual digital forgeries collapse the distinction between digital creation and personal identity. They make it possible to fabricate sexualised versions of people without their involvement yet still tethered to them through identifiability or material connection. By removing a woman’s ability to consent to how her image is used, sexual digital forgeries strip her of ownership over her own identity. They replace her self-defined sexual representation with one forced by others, reducing her to an object rather than recognising her as an autonomous person. In doing so, sexual digital forgeries undermine not only her control over her likeness but also the basic respect owed to her as a human being.

References

Benn, C. (2024). Deepfakes, Pornography and Consent. [online] PhilArchive. Available at: https://philarchive.org/rec/BENDPA-5 [Accessed 4 Aug. 2025].