
Opinion | Deepfake Porn Sites Used Her Image. She’s Fighting Back.



This transcript was created using speech recognition software. While it has been reviewed by human transcribers, it may contain errors. Please review the episode audio before quoting from this transcript and email transcripts@nytimes.com with any questions.

nicholas kristof

I’m Nicholas Kristof. I’m a columnist for “The New York Times.” We’ve been hearing a lot lately about the potential dangers of AI. That includes deepfakes. Before the primary this year, voters in New Hampshire got a robocall that sounded a lot like President Biden.

ai

Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.

nicholas kristof

There’s also a fake video of Taylor Swift endorsing Donald Trump.

catalina marchant de abreu

This video is obviously fake, and it’s been manipulated. If Taylor Swift had ever endorsed Donald Trump at the Grammy Awards, I think this would have been all over the news by now.

nicholas kristof

But I think there’s a bigger problem that we hear much less about, and that is deepfake nude photos and videos. Deepfakes are AI-generated imagery that often uses real people’s faces and identities. There was one study that found that 98 percent of deepfake videos online are pornographic. And of those, 99 percent of the time those targeted are women and girls. I first got a sense of the scale of this problem from an activist who was trying to fight it.

breeze liu

My name is Breeze Liu, and I am a survivor of online image abuse.

nicholas kristof

Her story begins back in April 2020 when she was just 24 years old.

breeze liu

I got a message from a friend of mine. He said, drop everything and get on a phone call with me right now. When I called him, he said, I don’t want you to panic. But there is a video of you circulating on Pornhub. And for someone like me who doesn’t even watch porn, I thought it was a joke. So I said, this is not funny, and then he said, I am not kidding. I’m sending you the link. You have to look. I received the link, and there I was. I was on Pornhub.

It was one of the most devastating moments of my entire life. It was a video of me recorded without my knowledge and without my consent. When I saw the video, my head just completely went blank, and I went to the rooftop of my apartment building because the shame was so overwhelming, and I didn’t want to live anymore. So I was getting ready to kill myself and jump off the rooftop of my building. I didn’t do it because I didn’t want my mom to see me dead. But there were moments in the aftermath in which I felt like I would have been better off dead.

nicholas kristof

For Breeze, this began as one real video, but it ended up being deepfaked, with more than 800 links all across the internet. In the article I wrote on this topic and in this conversation, you’re not hearing from some of the underage girls who were targeted, and you’re not hearing some celebrities talk about it, and that’s because of shame. People are humiliated when they’re shown fake, incredibly graphic videos of themselves being raped. In general, there’s been some reluctance on the part of victims to speak out. And unfortunately, that tends to perpetuate the problem.

breeze liu

Society puts invisible shackles upon our minds, silencing us through shame. I remember when I was talking to the police and the lawyers, I mean, I just lost my voice, and I completely froze because it was so devastating. I literally couldn’t even talk about this without shaking and having panic attacks. What it feels like is that you’ve been murdered. You died. A part of you has permanently died. And in order to seek justice, you’re forced to look at the cadaver part of yourself over and over and over and over again, and then you never get justice.

nicholas kristof

So Breeze poured herself into an effort to try to get the video and the deepfakes off the internet. She contacted hundreds of sites. She’s lobbied platforms to get them to stop linking to these sites and directing traffic to these sites. But it’s just an uphill struggle because these companies are monetizing her.

breeze liu

I did ask Pornhub to take it down. They took it down after I found a lawyer. But some of the other malicious websites just didn’t even respond, despite our relentless efforts to ask the platforms to take it down. They refused to address the issue. They refused to take it down.

nicholas kristof

The deepfake companies made a mistake in targeting Breeze because she is very savvy about technology and about Silicon Valley. She comes from that world. And so she devised her own solution. She started her own company called Alecto AI, which has an app that uses facial recognition technology to do reverse image searches. It tells people where their images show up on the internet, and it helps connect users to platforms to try to take non-consensual images down.

breeze liu

I decided to create my own solution because I ran into walls everywhere I went. Unless I changed the system, justice wouldn’t even be an option for me.

nicholas kristof

There are a couple of categories of players in this area. There are the deepfake companies; that category makes money off ads and subscriptions. Then there’s another category, the search engines, like Google and Bing, which direct traffic to those websites and apps, and they make money because they accept ads from those deepfake companies. And so Google is very much a part of that really sordid ecosystem, and victims have almost no recourse.

I reached out to Google and Bing to try to get their side of the story. Google agrees that there is room for improvement, but no one affiliated with the company was willing to actually talk to me on the record about it. Google did give me a statement that said, quote, “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected,” and a Bing spokeswoman said something quite similar. But look, count me unimpressed.

Google does know how to do the right thing when it wants to. If you ask Google, how do I kill myself? Then it doesn’t go and give you step-by-step instructions. Rather, it leads you to a suicide help line. So Google can be socially responsible when it wants to be. But in this case, it just seems to be completely indifferent to companies that go out of their way to humiliate women and girls and make money off it. What’s kind of astonishing is that this is a kind of non-consensual sexual harassment and violation that isn’t clearly illegal.

Basically, the problem is that the technology has advanced much, much more quickly than the law, and so there isn’t a law at the federal level that clearly covers this. There would be an avenue for civil damages, for victims to sue Google or sue a deepfake company. But Section 230 of the Communications Decency Act protects tech companies from those civil lawsuits, or it appears to protect them. It seems to me that the best remedy is not so much in criminal law but in amending Section 230 so that companies could be sued, and so the companies would have to police themselves.

Tech companies like Google are willing to bolster deepfake companies, whose entire business model is about producing fake sex videos, and these companies wouldn’t really exist if Google weren’t directing traffic to them and making them profitable. So I’m hoping that some Google executives and board members are listening, and maybe they’ll search their consciences right now.


