Update: The Deepfake Law

Episode Transcript

Speaker 1

Hey, this is Olivia Carville, one of the hosts of Levittown.

Since we released this series, the US has moved closer to passing a bill that would crack down on deepfake pornography.

The US House passed the Take It Down Act with a near-unanimous vote, and now that bill is heading to President Trump's desk.

I spoke about this legislation with the folks over at the TechStuff podcast, which is produced by our partners Kaleidoscope and iHeart.

We wanted to share my conversation with them here with you, our Levittown listeners.

So here it is.

Speaker 2

Welcome to TechStuff, a production of iHeart Podcasts and Kaleidoscope.

I'm Oz Woloshyn, and today Karah Preiss and I will talk to Bloomberg's Olivia Carville about the Take It Down Act and what it means for the future of the Internet.

It's a landmark bill aimed at combating AI harms, specifically deepfakes.

Speaker 3

They're used in scams, they're used in spreading misinformation online, and I'd say most notably, they have been used in the non-consensual creation of porn.

Speaker 2

Right, and that's what this legislation is all about.

This week Congress passed the Take It Down Act, which aims to crack down on the creation of revenge porn, i.e., pornographic images that are shared non-consensually.

The Act specifies that those who distribute revenge porn, whether, quote, real or computer generated, could be fined or subject to prison time.

It's had rare backing from both sides of the political aisle and from First Lady Melania Trump.

As of Wednesday afternoon, the time of this taping, the bill heads to President Trump, who's likely to make it law.

Speaker 3

Here to walk us through the Take It Down Act and what it means for tech companies is Olivia Carville, investigative reporter for Bloomberg News and co-host of the podcast Levittown, which is a must-listen, agreed, wherever you get your podcasts. It covers the rise of deepfake porn.

It also happens to be a co production of Kaleidoscope.

Olivia, welcome to TechStuff.

Speaker 1

Thank you so much for having me.

It's great to be back with Kaleidoscope's team.

Speaker 2

Thanks, thanks for being here, Olivia.

You've been tracking this bill for a long time.

When did the push for legislation on deepfake pornography begin?

Speaker 1

I mean, it has been a very long journey to get here.

We've seen quite a lot of states across the US rolling out legislation to try and target deepfake porn since this revolution really began a number of years ago now. At the moment, more than twenty states across the country have introduced new laws.

But one of the criticisms we heard time and time again, and something we raised in the Levittown podcast, is the fact that there was no federal law criminalizing this across the US.

And this bill was first introduced last summer, in twenty twenty four. It's bipartisan legislation.

Senators Cruz and Klobuchar put it forward and it unanimously passed in the Senate, but unfortunately it stalled in the House last year, and that led to a lot of frustration from the victims.

Earlier this year, we saw it once again: Take It Down was reintroduced and unanimously passed in the Senate, and then earlier this week, in very exciting news, it also passed in the House.

And we're talking a vote of four hundred and nine to two, and that's kind of remarkable given the polarized political climate we're living in right now.

The bill is en route to President Trump's desk and there's a lot of expectation that he's going to sign it soon.

Speaker 3

So just to go back for a second, what is the Take It Down Act?

And what does it say?

Speaker 1

So, the Take It Down Act is actually an acronym for a very long piece of legislation: the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act.

Speaker 3

I think whoever came up with Take It Down made it pretty easy to remember. Great. Yeah, you know, it's an acronym.

Speaker 1

Yeah, so it is an acronym.

And the law really does exactly what that title implies, which is provide a way to ensure this content can be taken down from the Internet, because that's where it's particularly harmful: when it starts to be shared across high schools and in friendship groups.

So the law goes after two main parties.

One, it makes it a crime for offenders to knowingly publish deepfake pornography or intimate images, whether they're real or created with AI, and if they do, they can serve up to two or three years in prison, depending on whether the individual in the photo is an adult or a minor.

And then it also challenges, or holds to account, the technology companies, the social media platforms where this content is often shared and disseminated, and it forces them to remove these deepfake images within forty-eight hours of being notified of them.

Speaker 2

I have two questions for you, Olivia.

Firstly, as this phenomenon becomes more and more ubiquitous, what will this law mean practically if you discover you're a victim?

What will it allow you to do that you can't do today?

And secondly, you mentioned the liability of the platforms.

How does this intersect with Section two thirty?

Speaker 1

So for a victim of deepfake porn, a young person who maybe finds or discovers that fake, pornographic, non-consensual images of them are circulating online, this law now gives them a path forward to get those photos taken down, to get them scrubbed from the internet, finally.

It enables them to file a report with the social media platform or the website or app where these images have been published or disseminated, and to inform them that it's deepfake porn, that it's non-consensual, and that they want it removed. And then within two days it has to be removed, and the FTC, the Federal Trade Commission, is responsible for holding those companies to account to get that taken down.

The other thing it gives victims is a path to justice.

It's a way to go after the offenders who publish this content, or even threaten to publish this content, against the survivors.

Well, you ask about two thirty, and that's a great question, because this is one of the only pieces of consumer tech legislation where federal regulators have been able to come in and actually sign a law into place that impacts young people using these platforms. Section two thirty comes from the Communications Decency Act.

It's a very controversial piece of legislation and it really did change the Internet.

And it was written into law back in the mid nineties.

And don't forget that that's before Facebook was even created.

This law, which governs all these social media platforms, was written at a time before social media even existed.

And what it does is it provides an immunity shield.

So these platforms are not responsible for the content that is uploaded onto them.

So for anything that is posted on Facebook, Instagram, Snapchat, TikTok, or Twitter, now X, the platforms themselves cannot be held legally responsible for that content or for the choices they make around removing it or allowing it to stay up.

Under this law, the platforms are being held to account to take down deepfake porn, to take down this specific form of content.

And that's why it's so controversial, and that's why there are critics of this act: some people think that this law will be weaponized or abused, and that it's going to result in the platforms taking down a lot more content than what this legislation covers.

Speaker 3

Wasn't Section two thirty in part introduced because of concerns over online pornography?

Speaker 1

So two thirty was first introduced because, at the time, judges and the legal system were ruling that platforms were liable for any content that was posted on their sites.

And that meant that if a platform decided to remove harmful, grotesque, vile, or violent content, say someone being cyberbullied or punched, or content about drugs or alcohol, content that they just didn't want to share with their other users, if they took that down, they were actually being held responsible for that decision in the legal system.

Judges were saying they would be held accountable and legally responsible for removing content and people could sue the platforms for doing so.

So the law was written to actually protect the platforms and enable them to moderate their content to try and make the Internet a safer space.

It's kind of counterintuitive when you think about it, because unfortunately what's resulted is that it's enabled these platforms to have so much power over the content that's up, and enabled them to wash their hands and say, this isn't our responsibility.

We can't be held legally liable for this.

We're effectively walking away.

Speaker 3

And it necessitated a law like this one to come into play, I mean, in a certain sense.

Speaker 1

Yeah, I mean it definitely did.

And here the law is relatively narrow.

We're not talking about any form of content.

We're talking only about content that involves non-consensual intimate imagery, whether that's real or created by AI.

So that enables people who see photos of themselves which have been manipulated using technology to undress them or turn them naked or put them into sexual acts, which is something we explored in Levittown.

Those images and that content can be taken down with this act.

Speaker 2

Some tech companies and adult websites, OnlyFans, Pornhub, Meta, already have policies in place where users can request that revenge porn be taken down.

What will change from a user or victim point of view once this becomes law?

Speaker 1

Yeah, you're right.

I mean, even NCMEC, the National Center for Missing and Exploited Children, has a tool which is actually called Take It Down, which does exactly the same thing.

It enables people to plug in a photo or a hash, which is like a unique ID of each image, to say, I don't want this online, I'm a victim of this, please remove it.
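
For the technically curious: a hash in this sense is a short digital fingerprint computed from the image file, so a victim can report an image without the image itself having to be uploaded anywhere. Here is a minimal sketch of that idea in Python, using the standard library's hashlib; the SHA-256 choice and the function name are illustrative assumptions, not a description of how NCMEC's actual tool computes its IDs, and real takedown systems may use different or perceptual hashing schemes.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Compute a SHA-256 hex digest of an image file.

    Illustrative sketch only: the digest acts like a unique ID for an
    exact file, letting a service match reported images against new
    uploads without the intimate image itself being transmitted.
    """
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()

# Hypothetical usage: print(image_fingerprint("photo.jpg"))
```

The design point is privacy: matching fingerprints rather than images is what lets a reporting tool work without ever collecting the sensitive content itself.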

But the law regulates this, and it makes it a federal law to say you have to remove it, and you have to remove it within two days.

So I guess it's just applying a stricter approach to this, so the platforms know they have to comply and they have to get that content scrubbed from their websites.

Speaker 2

There's an amazing moment in the Levittown podcast where one of the high school students realizes she's been a victim of deepfake porn.

Her father's actually a police officer, so they try and figure out whether there's any legal recourse, and the response from the police is basically, there's nothing we can do.

It's kind of amazing, in the arc of your career as a reporter, that the law is actually changing in real time in response to the stories that you've been covering, these very moving, horrifying stories.

What do the victims think about this law, and what's been the response among your sources?

Speaker 1

The victims have been waiting for this for a very long time.

When you think about the origin story of Take It Down, it was when Elliston Berry, a young teen from Texas, actually went to Senator Cruz's office and told him that a deepfake image of her had been circulating on Snapchat, and she had asked the platform to remove it, and after a year the platform still hadn't taken that image down.

That's what really sparked this particular piece of legislation.

And we've seen young, teenage, you know, high school students, college students speaking before Congress, pleading for a law like this, asking for help to find a path to get these images removed from the internet.

Because, unfortunately, you know, in teenagers' lives today the digital world is ubiquitous. They exist within it, and they move between the online world and the offline world.

They don't call their friends on the phone, they don't call their parents on the phone.

You know, they'd be more inclined to send a DM through Instagram or a message on Snapchat.

And when you, and your social fabric, exist within the digital world, that means that when images like this are shared, everybody sees them.

And I think that's the real harm here: the photo is created, it's fake, it looks unbelievably, convincingly real, and it gets shared to everyone in your social network within seconds.

These young women have been fighting for help and support, some at the state level and they've been successful, but really they wanted this at the federal level.

So for a lot of the young women, I think it's been like a sigh of relief that finally we're here, and that they and other young women who have been victimized or had their images weaponized in this way have a path to justice, but also a path to get those photos removed from the Internet once and for all.

Speaker 3

Well, this all sounds like a very positive thing, and it has bipartisan support.

Are there people arguing against it?

And are there criticisms of the bill, despite its overwhelmingly positive reception?

Speaker 1

There definitely are. As is the way when it comes to social media or consumer tech, there is an ongoing tension, like a push and pull, between privacy and safety.

You have those who, you know, prioritize safety and say protecting children online is the most important thing we can do.

And then you have those who value privacy and say, if we're going to create safety regulations or rules that in any way weaken our privacy, you know, that's a bad thing to do, because privacy is something that we need to prioritize as well.

And so in this case, you do have free speech and privacy advocates criticizing this law for being unconstitutional, saying that it could chill free expression, that it could foster censorship, and that it could result in what they describe as knee-jerk takedowns of content.

And what I mean by that is: these platforms, and I'm talking about Meta, Snapchat, TikTok, have grown so big, and we're talking billions of pieces of content uploaded on a daily basis, that if you're going to enforce regulation or legislation that says they have to take down certain content within forty-eight hours, and say they get flooded with millions of requests on a daily basis, they are not going to have the bandwidth to actually review each request, and that could result in them just deciding to remove everything that gets reported to them.

And that is what free speech and privacy advocates fear: that it's going to result in a level of censorship we haven't seen before, because no one's been able to really adjust two thirty since it was written into law.

We've also, interestingly, seen some criticism coming from the child safety advocacy space, and they've come out swinging, saying that while this bill, this legislation, is necessary, it's far from game changing, that it's taken too long to get here, and that the penalties aren't severe enough, so this is going to put a lot of pressure on local and state authorities, prosecutors, and law enforcement to actually go after the perpetrators in a more severe way.

Because when you look at Take It Down, we're talking two years in prison for publishing an intimate image of an adult, deepfake or real, and up to three years for a minor.

Speaker 2

What about the tech companies? I mean, are they viewing this as the first battle line in the fight over the future of Section two thirty?

Have their lobbyists been active on this issue, and how are they preparing for this extraordinary new set of responsibilities that will come with the passage of this bill, assuming it does get signed by President Trump?

Speaker 1

Well, the tech companies, a lot of them, actually do have rules in place that say non-consensual intimate or sexual images can't be shared.

I mean, even on Meta's platforms alone, it's against the rules to post any nude photos.

But in this case, now that they're being kind of forced to do so by regulation, Meta's come out in support of this, saying, you know, we do think that deepfake porn shouldn't exist on our platform, and we will do what we can to take it down.

I think that, from the platforms' perspective, they don't want fake photos, fake naked photos of teenage girls, shared on their platforms; that's not a positive use case of their networks at all.

They don't want their users sharing or distributing this content.

And now they're being told, and held to account, to ensure that it's taken down within two days.

And it'll be interesting to see how the companies internally are responding to this, what the process is going to be, and whether it's actually going to change anything.

Speaker 2

Olivia, just to close. I mean, you've had kind of an extraordinary run this year, putting out the Levittown podcast and also an extraordinary documentary called Can't Look Away, which Bloomberg produced and distributed, about the harms of social media.

Can you sort of take a step back and describe this moment? Because one thing that Karah and I talk about and think about is that five years ago, the idea that the law might catch up to the tech companies, and that there would be enough social pressure to insist on changes to protect users from harm, seemed like a fantasy.

But in this moment, there seems to be some promise that it's actually happening.

Can you speak about that?

Speaker 1

I've been covering the dangers of the digital world for Bloomberg for going on almost four years now, and I have been terrified by what I've seen online.

And I'm not talking just deepfake porn and, you know, witnessing the real-world consequences of these photographs being shared among teenagers in high schools. I'm talking the impact on the young women who are targeted, but also the young men who think that it's normal to create and share photos like this, who think it's a joke.

The way in which teens and this generation are kind of warped by technology, I think we don't fully understand what the long-term consequences of that are going to be.

But the harms of the digital world exist far beyond deepfakes, and that's what we were exploring in the Can't Look Away film. The film explores the other ways in which social media can harm kids, from recommendation algorithms pushing suicide-glorifying content, to content that is going to lead to mental health harms or eating disorders.

It explores the ways in which kids have been targeted by predators online who want to sell them drugs, and in many cases they think they're buying pills like Xanax or oxycodone, and the pills turn out to be counterfeit, laced with enough fentanyl to kill their entire household, and parents are discovering their children dead in their bedrooms.

So it's been a really difficult topic to explore, but also just such a crucial one.

This is one of the most essential issues of our time, and I think that this has been a challenging yet very rewarding area to explore.

And I know there's a lot of criticism of the Take It Down Act, but regardless of the controversy, most people agree this is a step in the right direction.

And I think this act is a good thing.

But it's very narrow.

You know, we're only talking about removing content that is non-consensual intimate imagery.

We're not talking about all the other content that could potentially harm kids.

So while the fight here is a win and we should celebrate that, the broader concern around protecting our children in the online world is ongoing.

Speaker 3

Olivia, thank you. Thanks, Olivia.