🔬 Research Summary by Giuliana Luz Grabina, a philosophy undergraduate student at McGill University, with an interest in AI/technology policy regulation from a gendered perspective.
[Original paper by Lucas T. Kweilin]
Overview: The increasing ease of use and availability of deepfake technology indicate a worrying trend in technology-facilitated sexual abuse. This article argues that while deepfake technology poses a risk to women in general, victims of domestic abuse are at particular risk because perpetrators now have a new means to threaten, blackmail, and abuse their victims with non-consensual, sexually explicit deepfakes.
Introduction
In 2018, a video featuring Barack Obama making derogatory statements, swearing, and acting out of character circulated widely online. This deepfake warned of the dangers of deepfake technology and urged viewers to be “more vigilant with what we trust from the internet.” While most discussions of the harms of deepfake technology focus on its misuse to manipulate elections, spread misinformation, alter public opinion, and threaten national security, considerably less attention has been paid to the harms of non-consensual sexual deepfakes, which constitute the majority of deepfakes shared on the internet. Indeed, a study conducted by Sensity AI found that only 35 deepfake videos featured politicians, whereas 96% of deepfakes were non-consensual sexual deepfakes, 99% of which depicted women.
In this article, author Lucas T. Kweilin examines the harms that deepfake technology poses to women—particularly those who are victims of domestic violence—and proposes that non-consensual deepfakes and other types of image-based abuse can be best regulated under a uniform federal law rather than through inconsistent and unenforceable state laws.
Key Insights
Beyond Public Figures: Non-Consensual Sexual Deepfakes
Deepfake technology uses artificial intelligence and facial mapping to merge, combine, replace, and superimpose images and video clips, creating authentic-looking fabricated videos known as deepfakes. Some of the earliest non-consensual sexual deepfakes posted online featured female celebrities, including Taylor Swift, Scarlett Johansson, Gal Gadot, and Kristen Bell.
Since then, the author argues, deepfake applications have extended beyond sexual deepfakes of celebrities and political figures. The technology is often employed to produce manipulated pornographic videos of ordinary women and girls without their consent. Although it is difficult to determine the prevalence of non-consensual sexual deepfakes, studies have found that deepfake apps have been used to generate fake nude images of more than 68,000 women. Other studies have found that non-consensual sexual deepfakes have targeted at least 100,000 people, including underage children. More recently, TikTok users have been targeted. As the author emphasizes, “the targeting of TikTok is especially concerning because nearly a third of users are under the age of 14, and some have already found videos of themselves to appear on websites like PornHub.”
False Realities, Real Harms: Deepfakes and Domestic Violence
The author categorizes the use of deepfake technology to create non-consensual sexual deepfakes as violence against women, arguing that deepfakes provide a relatively new means to perpetrate domestic violence. Domestic violence is violence between people who have, or have had, an intimate relationship.
When a current or former intimate partner attempts to control or dominate a relationship through physical, sexual, or psychological abuse of the victim, such behavior is known as intimate partner violence (IPV). As the author emphasizes, domestic violence goes beyond physical violence: it can involve a pattern of domination or coercive control that undermines the victim’s autonomy, social support, equality, and dignity.
Image-based sexual abuse describes abuse in which perpetrators fabricate or distribute private sexual images of victims (primarily women) without their consent. According to the author, deepfakes are used against women in much the same way as other forms of image-based sexual abuse: they strip women of their sexual autonomy.
While anyone can become a victim of non-consensual sexual deepfakes, even without a real compromising image ever having existed, the author argues that victims of domestic abuse are particularly vulnerable. Perpetrators now have a virtually limitless platform on which to control, blackmail, intimidate, harass, and abuse their victims. It is also common for perpetrators to disseminate, or threaten to disseminate, compromising media to the victim’s family, friends, employers, coworkers, and peers.
A Gap in Legislation: Progress and Limitations
Fortunately, deepfakes have forced lawmakers to pay closer attention to technology-facilitated abuse. California and Virginia were among the first U.S. states to impose criminal penalties on those convicted of distributing non-consensual deepfake images and videos. Several states, including Illinois, Texas, Washington, and California, have also adopted biometric privacy laws that allow people to take civil action against anyone who uses their identifiable images without their consent.
However, as the author highlights, despite two previous attempts to introduce bills at the federal level, the ENOUGH Act of 2017 and the SHIELD Act of 2019, no federal law protects victims of non-consensual pornography. States are also ill-equipped to handle cases of non-consensual pornography effectively.
In fact, a recent study found that the states that have enacted statutes regulating non-consensual pornography use inconsistent language, fail to provide comprehensive protection for victims, and do not hold producers and distributors accountable, allowing them to evade consequences.
Moreover, deepfakes evade most state revenge porn laws because the nudity depicted in the videos is not the victim’s own body, which places such cases outside the scope of prosecution. Likewise, if deepfakes are created for monetary gain, attention, or clout rather than for revenge, a state’s non-consensual pornography statute may not apply.
Between the lines
This paper highlights the urgent need for lawmakers to adapt and update legislation to keep pace with technological advancements and address the unique challenges that deepfake technology poses to women, particularly victims of domestic abuse. While AI technologies can reveal the gaps in our existing laws, it is up to legislators, policymakers, and government officials to ensure that the victims of these abuses do not fall through the cracks.