Deepfakes Are at the Center of a New Federal Bill

A day before the Senate Judiciary Committee grilled tech company CEOs about child safety online, bipartisan lawmakers introduced a bill that would allow victims to sue people who create and distribute sexually explicit deepfakes under certain circumstances.

The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would allow victims to sue if those who created the deepfakes knew, or “recklessly disregarded,” that the victim did not consent to their creation.

The federal bill, introduced on Tuesday, came nearly a week after deepfake pornographic images of Taylor Swift flooded X. The social media platform temporarily removed the ability to search for Swift’s name after the explicit content was viewed tens of millions of times.

Only ten states currently have criminal laws against this form of manipulated media. If the DEFIANCE Act passes, it would become the first federal law to protect victims of deepfakes.

“Nobody—neither celebrities nor ordinary Americans—should ever have to find themselves featured in AI pornography,” said Sen. Josh Hawley, one of four legislators who introduced the bill, in a press release. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court. This bill will make that a reality.”

Nonconsensual pornographic deepfakes are alarmingly easy to create and access. “Starting at the very top, there’s a search engine where you can search ‘How do I make a deepfake’ that then will give you a bunch of links,” Carrie Goldberg, an attorney who represents tech abuse victims, previously told TIME. Deepfake software takes photos of a person and swaps their face onto pornographic videos, making it appear as if the subject is engaging in sexual acts.

A 2019 study found that 96% of all deepfake videos were nonconsensual pornography.

The movement to address deepfakes appears to have mounting support. Last May, Rep. Joe Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would criminalize the non-consensual sharing of deepfakes; no action has been taken on that bill since its introduction. Americans seem to overwhelmingly support federal action against deepfakes: 84% say they favor legislation that would make non-consensual deepfake porn illegal, according to recent polling by the Artificial Intelligence Policy Institute.

“Deepfakes that spread misinformation, cause defamation, or commit copyright infringement—those fit fairly neatly into our framework of laws designed to address such harms,” Syracuse University professor Nina Brown, who specializes in the intersection of media law and technology, told TIME Wednesday. “At the same time, laws are not enough. Social sharing platforms need to commit to investing resources in ensuring that deepfakes aren’t allowed to exist on their platforms.”
