The Online Degradation of Women and Girls That We Meet With a Shrug

Alarms are blaring about artificial intelligence deepfakes that manipulate voters, such as the robocall that sounded like President Biden and went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.

Yet there’s actually a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98 percent of deepfake videos online were pornographic and that 99 percent of those targeted were women or girls.

Faked nude imagery of Taylor Swift rattled the internet in January, but this goes way beyond her: Companies make money by selling advertising and premium subscriptions for websites hosting fake sex videos of famous actresses, singers, influencers, princesses and politicians. Google directs traffic to these graphic videos, and victims have little recourse.

Sometimes the victims are underage girls.

Francesca Mani, a 14-year-old high school sophomore in New Jersey, told me she was in class in October when the loudspeaker summoned her to the school office. There the assistant principal and a counselor told her that one or more male classmates had used a “nudify” program to take a clothed picture of her and generate a fake naked image. The boys had made naked images of a number of other sophomore girls as well.

Fighting tears, feeling violated and humiliated, Francesca stumbled back to class. In the hallway, she said, she passed another group of girls crying for the same reason — and a cluster of boys mocking them.

“When I saw the boys laughing, I got so mad,” Francesca said. “After school, I came home, and I told my mom we need to do something about this.”