
WA bill would add explicit ‘deepfakes’ to child pornography laws

HB 1999 would close a loophole in state law by outlawing the use of a minor’s face to digitally fabricate sexually explicit content.


by Ashli Blow


Caroline Mullet was only 15 when her friends fell victim to deepfakes – realistic-looking photos and videos fabricated using artificial intelligence.

Mullet told lawmakers in testimony earlier this year that a boy in her grade took photos of her friends at her first high school homecoming dance in September 2023 and subsequently used AI to alter the images, making the students appear nude.

“He laughed about it and sent it to his peers as though it was funny,” Mullet, the daughter of Sen. Mark Mullet, D-Issaquah, told lawmakers at a House Community Safety, Justice, and Reentry Committee hearing on Jan. 29. “I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself.”

Under current Washington law, though, if someone’s face is used to make pornographic content without their consent, the victim has no legal recourse.

“If you have sexually explicit conduct, the law prohibits people from using or sharing that,” said Russell Brown, executive director of the Washington Association of Prosecuting Attorneys. “But if you take and modify it using a real person’s face, putting that on sexually explicit conduct, there’s a loophole there.”

A proposed bill in the Washington Legislature, House Bill 1999, would address this loophole. It would expand criminal penalties under current child pornography laws to include instances in which an identifiable minor’s image was used to digitally fabricate explicit content. The bill would also provide a civil cause of action for adult victims by expanding the Uniform Civil Remedies for the Unauthorized Disclosure of Intimate Images Act.

Though the bill addresses only deepfakes in which the victim is identifiable, prime sponsor Rep. Tina Orwall, D-Des Moines, says she sees potential to expand it to cover non-identifiable victims.

“I think that’s gonna take work at the state and federal level,” Orwall said. “I felt like the first step was to build on people that are being harmed that are identifiable.”

The topic drew national attention recently after Taylor Swift became the subject of sexually explicit deepfakes that circulated across social media. The Guardian reported that it was her fans, rather than any government agency, who forced social media companies to take the fakes down.

Since its emergence in late 2017, deepfake content has grown at a startling rate. Sensity AI, a security firm formerly known as Deeptrace that provides deepfake-tracking technology to other companies, found that the number of deepfake videos online roughly doubled every six months between 2018 and 2020, with more than 85,000 such videos online as of 2020. The company says the technology is used primarily to create sexually explicit content: in 2019, it reported that 96% of deepfake videos online were pornographic, with the vast majority targeting high-profile celebrities.

Orwall said she’s prioritized supporting survivors of sexual assault in her work, and was the prime sponsor of the 2023 bill that established civil causes of action for victims of nonconsensual disclosure of intimate images, also commonly known as revenge porn.

Orwall said she began to focus on the issue of fabricated intimate images after learning about it at a workgroup at the White House.

“I think it’s just becoming a bigger issue, we’re seeing more of it and it’s just so damaging,” Orwall said. “This is a form of sexual abuse, it truly is.”

According to the New York Institute of Technology, the most common form of deepfake is face swapping, in which AI is used to convincingly superimpose one person’s face onto another body. This type of deepfake is commonly used to create sexually explicit content of someone without their consent, as happened to Caroline Mullet’s friends.

“Everyone was just kind of upset, they felt like they had been violated,” Sen. Mullet said of the incident at his daughter’s high school.

As AI technology has rapidly become more sophisticated, deepfake creation has become easier. In its analysis of deepfake creation on the messaging platform Telegram, Sensity found that users could generate “stripped” images at no cost, with the option to pay roughly $1.50 to remove an image’s watermark. As of July 2020, Sensity reported, seven Telegram channels devoted to this content had more than 103,000 members combined.

When someone becomes the subject of deepfake porn, the consequences can be severe, regardless of whether the image is real or digitally fabricated. Research on the emotional impact of revenge porn published in the journal Feminist Criminology showed that victims suffer PTSD, anxiety, depression, suicidal thoughts and other mental health harms similar to those resulting from sexual assault.

Proponents of this bill hope to discourage the creation of this content and ease victims’ suffering by providing a means for legal recourse.

“This is our opportunity to give survivors really a path to justice, but also make a clear statement that these are harmful and damaging and we really don’t want to tolerate this in our state,” Orwall said in the Jan. 29 hearing.

Brown, of the Washington Association of Prosecuting Attorneys, is involved in a workgroup for the bill. He says some people have raised concerns about whether it would infringe on free speech.

In 2002, the U.S. Supreme Court ruled in Ashcroft v. Free Speech Coalition that provisions of the Child Pornography Prevention Act of 1996 prohibiting sexually explicit computer-generated images of nonexistent children were overbroad and infringed on freedom of speech.

In other words, while the Constitution does not protect child pornography, simulated or artistic depictions of sexually explicit scenarios involving no real minors remain protected under the First Amendment.

“So the question becomes what happens if you merge the two?” Brown said.

According to Brown, HB 1999 is distinguishable from Ashcroft because deepfakes involve the faces of real, identifiable people, not AI inventions of people who don’t exist. “The more you make it look like, and the more it is like an actual image … that doesn’t hold the same protection because you’re using an actual person,” Brown said.

Brown said the bill’s workgroup will continue to address concerns surrounding free speech and intellectual property as the session progresses.

The group has also been considering questions posed by representatives of the technology and media industries, which use this type of AI to alter images for purposes such as creating body doubles in movies.

However, the bill’s supporters say these industries likely won’t be affected because the language narrowly targets sexually explicit altered images that are intended to do harm and in which the victim is identifiable.

“I think sometimes they look at a portion of the bill, and we’re like, ‘You gotta look at the bigger context,’” Orwall said.

Supporters of the bill say that everyone should have a legal avenue to prevent their face from being used in pornography without their consent.

“If we see someone like Taylor Swift impacted … it almost makes it more intimidating and sad for someone who feels like they have no path for justice,” Orwall said. “We don’t want to see anyone harmed; we don’t want to see one of the most important role models of this generation harmed either.”

At least 10 states, including New York, California and Texas, have laws against deepfake pornography. If HB 1999 is enacted, Washington would join this wave of legislative action, responding to growing demands for stronger safeguards against the misuse of artificial intelligence.

HB 1999 has passed out of the House and will now begin moving through the Senate.

“AI is something that seems like it’s quickly escalating, and so I think we would really like to see something move this year,” Orwall said. “For me, it’s standing up for survivors and just making sure they’re not harmed, and they feel they have some kind of recourse.”