Opinion Today: Why do we let tech companies profit off nude deepfakes?

Monday, March 25, 2024

It was huge news when fake nude images of Taylor Swift spread on the internet early this year, but the problem goes far, far beyond her case. Artificial intelligence is making it cheap and easy to generate fake nude photos and videos that can humiliate anyone from a congresswoman to a global celebrity to an underage girl.

These deepfake videos and photos are proliferating on the internet. One study found that 98 percent of deepfake videos online were pornographic and that 99 percent of those targeted were women or girls. Companies profit by selling advertising and premium subscriptions on websites hosting these graphic videos, which sometimes seem to show public figures weeping as they are raped. And Google directs traffic to these websites. Meanwhile, victims have little recourse: Our legal system lags so far behind the technology that it's not even clear a law has been broken, even when the victim is a child.

This exploitation is the topic of my latest column, which I've been working on for months.

I learned about the scale of the sexual deepfakes problem last fall from a young executive, Breeze Liu, whose world collapsed when a friend tipped her off that there was a nude video of her — taken without her knowledge — on porn sites. That video was turned into deepfake sex videos, and ultimately this nonconsensual imagery ended up on at least 832 links. Devastated, she climbed a tall building and prepared to jump. Fortunately, she instead decided to fight back and develop technological solutions to help victims like her. I admired that pluck.

I also interviewed a high school sophomore in New Jersey, Francesca Mani, who was called to her school office and told that one or more boys in her class had used a "nudify" website to take a clothed photo of her and generate a fake nude version. Like Liu, she has chosen to fight for better regulation of the sector.

A lot surprised me as I reported this column — about the capabilities of the technology, the recklessness of the search engines that support this exploitative ecosystem and the courage of victims willing to speak up in hopes of sparing others. But perhaps what surprised me most was that society apparently believes that prominent women and schoolgirls alike must simply accept being degraded by explicit fake photos or videos.

We all have an interest in creating a society in which women and girls aren't demeaned so that tech companies can profit.