Jenna Ortega is the latest victim of a deepfake nude app that ran ads on Meta platforms, using a photo taken when she was 16


Deepfake apps are increasingly being used to create and promote sexually explicit images. The latest victim is Jenna Ortega, the young actress best known for her role in “Wednesday.” She was reportedly depicted in a manipulated, suggestive manner using a photo taken when she was only 16.

The ads in question promoted an app named Perky AI, which boasted the ability to digitally undress women using AI. NBC News reports that the creators behind Perky AI managed to run 11 ads featuring a blurred, fake nude image of Ortega. To make things worse, they reportedly capitalized on her underage appearance to advertise the app’s problematic features. These ads circulated on Facebook, Instagram, and Messenger, mainly in February 2024.

But wait, there’s more. Perky AI didn’t stop at offering images of Ortega without clothes. It also promised the ability to modify her clothing based on textual prompts like “Latex costume” and “Batman underwear,” according to NBC News.

NBC News also tried contacting RichAds, the entity behind these controversial ads, but the company has reportedly remained silent on the issue. Meta’s spokesperson, Ryan Daniels, did respond, stating that “Meta strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images.”

Meta reportedly suspended the Perky AI app’s page after the incident. By that point, however, it had disseminated over 260 ads across Meta’s platforms between September and February. Although Meta had previously suspended 30 of these ads for failing to meet its advertising standards, the specific ads depicting an underage Ortega somehow slipped through the cracks. The full reach of these ads remains unknown, but one ad on Instagram alone recorded over 2,600 views, according to NBC News.

Apple, for its part, removed the Perky app from its App Store, and it had never been listed on Google Play in the first place. However, the problematic app remains usable for anyone who already downloaded it. It charges either $7.99 per week or $29.99 for a 12-week subscription, according to NBC News.

Other deepfake issues

This incident has sparked a significant outcry over the ethical use of artificial intelligence in advertising and the responsibilities of social media giants in regulating content. But sadly, this isn’t the first time something like this has happened. Scarlett Johansson and Tom Hanks both had deepfake versions of themselves appear in unauthorized ads. Taylor Swift was recently the victim of explicit deepfakes that spread across the internet. And last year, we reported on extortionists who use Facebook photos to create AI nudes.

All these incidents highlight several broader issues. The most obvious one is privacy and dignity, both violated by the creation of images like these. Then there’s the issue of licensing and consent, especially when your likeness is used in unauthorized ads. There’s also the oversight of content and advertising on social media platforms. Sure, Meta doesn’t allow “AI-generated non-consensual nude images,” but more than once, such ads have somehow slipped through.

Exploiting deepfake technology, especially against minors, raises serious questions about the ethical boundaries of AI applications. We should also question how effective existing regulatory frameworks are at protecting individuals’ rights and dignity in the digital age.

[via NBC News; image credits: Chris Roth, CC BY-SA 2.0, via Wikimedia Commons (edited)]
