Politics

How Congress is combating the rise of non-consensual AI pornography

Political momentum is growing to regulate the spread of explicit and non-consensual deepfakes, as the issue of digitally altered images has gone from a potential threat to a reality.

Several bipartisan bills introduced in Congress aim to mitigate the spread of explicit, non-consensual images made using artificial intelligence (AI), a problem that has not only plagued public figures and celebrities, but also ordinary people and even children.

“Last year was really when it came to the fore, when it became clear we have a big problem,” said Ann Olivarius, founding partner of McAllister Olivarius, a transatlantic law firm specializing in racial and gender discrimination cases.

In January, graphic AI-generated images depicting Taylor Swift circulated online, drawing mass attention to the issue. The outcry prompted lawmakers and the White House to press platforms to enforce their rules and stop the spread of such images.

While the spread of the Swift deepfakes highlighted the rise of non-consensual AI pornography, the problem has become more widespread. Schools have even been forced to deal with a new form of cyberbullying and harassment, as students create and spread deepfakes of their peers in a largely unregulated space.

“It’s impacting a lot of everyday people,” Olivarius said.

Lawmakers have been victims as well. Rep. Alexandria Ocasio-Cortez (D-N.Y.), one of the lawmakers spearheading a bill to combat explicit deepfakes, spoke about being targeted by non-consensual explicit deepfakes herself in an April interview with Rolling Stone.

The issue is drawing support from lawmakers across the political spectrum. One of the bills, the Defiance Act, is led by Ocasio-Cortez and Senate Judiciary Committee Chairman Dick Durbin (D-Ill.), while another, the Take It Down Act, is led by Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.).

Olivarius said the support from both sides is overwhelming.

“It looks like we might finally have something here that lawmakers can agree on or enough to actually pass,” she said.

The two bills aim to approach the issue from different angles. The Defiance Act, introduced in March, would create a federal civil cause of action that would allow victims to sue individuals who produce, distribute, or solicit deepfakes.

The Take It Down Act, introduced last month, would create a federal criminal violation for publishing or threatening to publish digitally altered, non-consensual images online. It would also create a process that would allow victims to force technology platforms to remove explicit, non-consensual deepfakes depicting them.

Durbin spokeswoman Emily Hampsten said the two bills are complementary and that her team is in discussions with the offices of the other bill’s sponsors.

While there is bipartisan support for the bills, there could still be an uphill battle to get them passed — especially in the months leading up to a contentious election with the power of the White House and both chambers at stake.

Durbin, the Senate majority whip, brought the Defiance Act up for a unanimous consent vote in June but was blocked by Sen. Cynthia Lummis (R-Wyo.), a co-sponsor of the Take It Down Act.

Lummis spokeswoman Stacey Daniels said the senator “supports the intent of the Defiance Act” but “remains concerned that this legislation contains overly broad language that could unintentionally threaten privacy technology and stifle innovation, while failing to protect victims.”

Daniels said Lummis’ team is working with Durbin’s team to try to resolve the issues.

“Senator Lummis supports the Take It Down Act for its more tailored approach that ensures people who knowingly produce or distribute deepfake pornography are held accountable,” Daniels said in an email.

Olivarius said the civil remedies built into the Defiance Act are “very powerful” because they would empower the individuals affected to take action themselves. The Take It Down Act, she said, is “much narrower.”

Carrie Goldberg, a victims’ rights attorney, called the Take It Down Act an “interesting new approach” but pointed to potential hurdles in how it would be applied as a criminal law.

“I’m pretty skeptical of laws that just give power back to the government,” Goldberg said.

“It then becomes a question of whether law enforcement is going to take this seriously,” she said.

At the same time, Goldberg said, one of the goals of a bill like this is to make clear that the conduct is illegal, which in itself could deter lawbreakers.

She also said technology companies could argue that Section 230 of the Communications Decency Act overrides the bill’s notice-and-takedown provision. Section 230 shields platforms from liability for content posted by third parties.

“But since this is a federal law that conflicts with another federal law, it will be interesting to see how this plays out,” Goldberg said.

Another bill to combat non-consensual explicit deepfakes was introduced by Sens. Maggie Hassan (D-N.H.) and John Cornyn (R-Texas) in May. The legislation would create a criminal offense for sharing deepfake pornographic images and videos without consent and would establish a private right of action allowing victims to sue those who share such images.

Olivarius urged Congress to act on the issue, highlighting its disproportionate impact on women and its devastating, even potentially fatal, effects, citing cases in which victims died by suicide after altered images of them were released.

“Society hasn’t done much to show that many people care about women,” she said. “That [support for the bills] is unusual. I think that’s great, and I hope we can get it on the books as soon as possible.”

However, given the potential obstacles posed by Section 230, Goldberg said Congress should prioritize repealing the controversial provision in order to help victims.

“The best way to address so much harm that is happening on platforms is for the platforms themselves to share the costs and responsibility,” Goldberg said.

“Power needs to be transferred to the people, and they need to be able to sue or demand the removal of content from platforms,” she added.



This story originally appeared on thehill.com.
