As an adult content creator with a large online following, Isla David has dealt with trolls before. Typically, her work fighting bad actors on the internet involves trying to remove photos of herself that have been posted elsewhere without her permission, like a photo used in an ad for “horny MILFs in your area” that she has been trying to take down for years. She's also used to trolls editing her image without her consent. “Whether they remove clothes, change my waist, [or] work on my bust and hip size,” she tells Rolling Stone. “They just generally treat me like human Play-Doh to mush.”
On February 2, however, David found herself in a somewhat unusual position. A friend informed her that one of her photos, a sultry image of her drinking a bottle of Highland Park Scotch while wearing a white button-down, which she had posted on the r/whiskyporn subreddit in 2021, had gone viral in what she refers to as “the darkest corners of the internet,” such as 4chan. The image was edited not to remove her clothes, but to add them.
The photo had been artificially manipulated, says David, to make it look like “some weird parody of a woman.” Her waist and thighs had been reshaped to look smaller, while her head had been enlarged to almost twice its size. The person who created it had also dressed her in a demure white A-line dress and surrounded her with three adorable children, all the same height and wearing similarly pristine white outfits. Far-right influencer Ian Miles Cheong had tweeted it with the caption, “When given photos of thirst traps, AI imagines what could have been if they were raised by strong fathers,” a post that had garnered seven million views.
At first, David was amused, particularly by the poor quality of the AI rendering. “It looks like I'm sucking my AI baby's brain through his ear,” she says. But that amusement turned to an overwhelming sense of dread, particularly when she saw the responses to Cheong's tweet. “The implication [of many of the replies] is that I'm something other than a whole person,” she says. “That I'm a broken creature and if I put on a long dress and had babies, everything would be solved. And I resent that, because my worth is not about the images I choose to put online.” The manipulated image, she says, was “an attempt to violate me and my bodily autonomy, regardless of whether you add or remove clothes.”
The photo was part of a larger campaign spearheaded by 4chan trolls using artificial intelligence software to dress women online more modestly. Under the name “DignifAI,” a thread on the hate forum /pol/ sums up the “mission” as follows: “We put clothes on degenerate women for fun, come join. The goal is for people to see that a degenerate lifestyle is ultimately fruitless.” The thread also includes links to tutorials for using Stable Diffusion for this purpose, as well as instructions to “include the name of [woman] in the post so anons can @ them,” making it clear that the campaign is aimed at targeted humiliation.
DignifAI first went viral on X (formerly Twitter) with a post by far-right influencer Jack Posobiec, who on Friday posted four examples of the tool being used on what he referred to as “e-girls,” a pejorative term for women with a public presence on the internet. A companion X account for DignifAI features manipulated versions of Instagram models, as well as celebrities such as Miley Cyrus and Doja Cat, and has amassed 28,000 followers.
In a message to Rolling Stone, the person behind the DignifAI X account denied being behind the image Ian Miles Cheong had tweeted (which makes sense, given that the editing is noticeably worse than most of the photos on the DignifAI account). But in general, DignifAI reflects an ongoing movement among trolls to use AI technology to strip women of their bodily autonomy.
Deepfake nude photos and videos of women have gone massively viral alongside the rise of artificial intelligence, with independent researcher Genevieve Oh finding that nearly 143,000 deepfake porn videos were posted without women's consent last year. While celebrities such as Taylor Swift have been heavily targeted, with a deepfake porn video of the artist garnering 45 million views on X last week, non-famous young women and children have also suffered from the terrifying phenomenon, including a 14-year-old girl from New Jersey, who, with her mother, is currently advocating for stricter legislation governing AI-generated sexual abuse material.
Although several states, including Texas and New York, have passed laws criminalizing the spread of deepfake porn, it remains readily available on search engines like Google and Bing, according to a recent analysis by NBC News. And while almost all women on the internet are vulnerable to the technology, celebrities and online content creators are particularly at risk, due to the sheer volume of their media available online. With the latest campaign to dress up “degenerate” women — that is, those who don't fit a very narrow definition of how women should behave in public — adult content creators like David are now facing a whole new method of humiliation.
From experience, David says, she knew that taking legal action was “pointless,” having spent years trying to use DMCA requests to remove non-consensual images, to no avail. But she says seeing Cheong's tweet was a “very similar feeling” to seeing her image reused without her permission on tube sites or shady internet forums, or picked apart by misogynists.
“It's just another attempt to make me feel bad about the person I am online,” she says. “Whether it's calling me ugly, complaining that my ass is too big or not big enough, or too lumpy or not lumpy enough, at the end of the day, it's all about finding fault with the body I've put online… At the end of the day, what they're trying to do is take away your ability to consent and take away your bodily autonomy online.”
Ignoring the adage not to feed the trolls, David decided to fight back. She retweeted Cheong's post with the caption: “This might be the funniest attempt to neg me I've ever seen. I had to share. Scroll down for commenters who are clearly terrified of death by snoo-snoo” (a euphemism for sex with strong women, popularized by the show Futurama). In a sense, she's gotten the last laugh: since Cheong posted her AI-manipulated image, she says she's seen a 594.4 percent increase in earnings from her OnlyFans in the last day alone, as well as thousands of new followers on X. But she also acknowledges the sad reality that her experience is not an isolated one — not just for women in her industry, but for any woman who dares to maintain a public profile online.
“Just being human and making a product that people want shouldn't make this your reality,” she says. “But yeah, regardless of the size of your account, regardless of the nature of the content you're creating, you're vulnerable to something like this. You can be angry about it, or you can laugh about it. And those are the only two options. Or you can't be a public woman online. Which for me, at least, is not an option I'm willing to take.”
From our partners at Rolling Stone: https://www.rollingstone.com/culture/culture-news/dignifai-4chan-shame-women-1234961851/