
Google bans advertisers from promoting deepfake porn services



Google has long banned sexually explicit ads — but until now, the company hasn’t banned advertisers from promoting services that people can use to make deepfake porn and other forms of generated nudes. That’s about to change.

Google currently prohibits advertisers from promoting “sexually explicit content,” which it defines as “text, image, audio, or video of explicit sexual acts intended to titillate.” The new policy also prohibits advertising for services that help users create this type of content, whether by altering a person’s image or generating a new one.

The change, which will take effect on May 30, prohibits “the promotion of synthetic content that has been altered or generated to be sexually explicit or contain nudity,” such as websites and apps that instruct people on how to create deepfake pornography.

“This update explicitly prohibits ads for services that offer to create deepfake pornography or synthetic nude content,” Google spokesperson Michael Aciman told The Verge.

Aciman says any ads that violate the company’s policies will be removed, adding that Google uses a combination of human reviews and automated systems to enforce them. In 2023, Google removed more than 1.8 billion ads for violating its policies on sexual content, according to the company’s annual ads safety report.

The change was first reported by 404 Media. As 404 notes, while Google has already banned advertisers from promoting sexually explicit content, some apps that facilitate the creation of deepfake pornography have gotten around this by advertising themselves as non-sexual in Google ads or on the Google Play Store. For example, one face swap app did not advertise itself as sexually explicit on the Google Play Store, but did so on porn sites.

Non-consensual deepfake pornography has become a persistent problem in recent years. Two Florida high school students were arrested last December for allegedly creating AI-generated nude photos of their classmates. Just this week, a 57-year-old Pittsburgh man was sentenced to more than 14 years in prison for possessing fake child sexual abuse material. Last year, the FBI issued a statement about a “surge” in extortion schemes that involved blackmailing people with AI-generated nudes. While many AI models make it difficult – if not impossible – for users to create AI-generated nudes, some services allow users to generate sexual content.

There may soon be legislative action on deepfake pornography. Last month, the House and Senate introduced the DEFIANCE Act, which would establish a process through which victims of “digital forgery” could sue people who make or distribute non-consensual deepfakes of them.




