Episode Transcript
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative.
I'm Bridget Todd, and this is There Are No Girls.
Speaker 2On the Internet.
Speaker 3Happy New Year, y'all.
Speaker 2Welcome to our first.
Speaker 1There Are No Girls on the Internet.
Recorded in the New Year, twenty twenty six.
Producer Mike, Happy New Year, Thank you for being here.
Speaker 2Happy New Year.
Speaker 4Bridget, we made it, got through twenty twenty five.
I wasn't confident we were gonna make it.
It was getting down to the wire.
Speaker 1You know this about me, but I know a lot of people have strong feelings about.
Speaker 3The New Year and New Year's Eve.
I actually really like New Year's Eve.
Speaker 1It's one of the only holidays that I can generally get behind, so I always try to make it a thing: I'm going to have a good New Year's, I'm gonna go, I'm gonna do something fun for New Year's Eve, and I'm gonna spend New Year's Day watching Twilight Zone reruns on the.
Speaker 3Sci Fi Channel.
Uh.
Speaker 1Mission accomplished for twenty twenty six.
I'm happy to say all this talk about.
Speaker 3The Twilight Zone.
Speaker 1Unfortunately, that is a segue into what I'm sad to say, is like our first conversation of the new year.
As I know you know, as you all know by now, Rene Good, poet, mother, wife, was shot and killed by ICE agents in Minnesota this week.
Speaker 3The video is horrifying.
Speaker 1The response from the administration has also been predictably pretty horrifying.
The conversation online has been, I guess perhaps predictably, awash in mis- and disinformation about Good herself and what happened to her, much of it stemming from Trump and the administration itself.
Speaker 3So yeah, it's just bad.
Speaker 1And I have to say, this is one of those stories that is like do you ever have those stories that you see and you're just like, this is enough.
Speaker 3I I this has broken me.
I am done.
Speaker 4Yeah, this was a really tough one, a rough way to start the year after being a little checked out from the news for a few days.
Speaker 3I completely agree.
Speaker 1And I have to say that, you know, the administration lying about a dead woman really just moments after she's been killed, predictable, them spreading lies online about it, predictable, the video being horrible, predictable, but something that I did not predict.
And I have to give a pretty big trigger warning about this.
It's just a rough conversation.
There's no other way to put it.
I did not predict that a screenshot of the video of Good being shot and killed, an image of her dead body, would be on X, you know, one of our largest social media platforms, with creeps under it asking Grok to generate an AI version of that image that puts her dead body in a bikini.
And Grok complied and generated an image of a dead woman covered in blood in an AI generated bikini.
And I want to I just I want to pause, because I think that that just really says a lot about where we're at right now and how we're all feeling, don't you think.
Speaker 4Yeah, it's a pretty gruesome juxtaposition, and it does feel.
Speaker 2Kind of like a.
Speaker 4Representative symbol of the lack of compassion and decency that characterizes our moment.
Speaker 1And I'm sorry to say this is not an isolated thing whatsoever, because when you search the phrase make her or put her on X, you see how many creeps are posting on X under images of women and children, in some cases minors, asking to put that person in a bikini or other sexualized things, make her look pregnant, make her look heavily pregnant, or some otherwise sexually suggestive thing or pose or outfit, right on their X feeds. Like, a user asked Grok to undress an image of a fourteen year old actress on the platform.
Speaker 3Pretty grim.
Speaker 1And I've seen a lot of reporting about this that talks about it being you know, sexualized, non consensual AI generated images, which it is.
It sounds weird to say, but I think it's important to note that it's not just non consensual sexualized images of women and children, but also, like that Rene Good photo, dark, hateful, sick, sexualized images of women and kids.
Like, it would be bad enough if we were only talking about sexualized images, but it's not just that, right?
It accepted a prompt to add a swastika bikini to the photo of a Holocaust survivor, and it also engaged with requests to depict women in scenarios that made it look like they had been physically assaulted.
So it's not just sexualized images.
It is, but it's also, like, deeply disturbing images of women and children.
Sounds like X. It's X, but it also goes beyond what's happening on X, because what we're seeing on the platform is just part of it.
Grok is available on X, but it's also a standalone app in the App Store. And as the Indicator points out, that app has also been used to generate sexual images of children aged eleven to thirteen.
This is according to information from the Internet Watch Foundation.
So for folks like me who have basically stopped using X, maybe you're thinking, I've not been there in a while, what's Grok?
What's going on with Grok?
Well, Grok is X's AI chatbot, but it's a little different from chatbots that you might be used to, like Claude, and notably, with intention.
So Grok has been designed to sort of be an edgelord chatbot.
Because Elon Musk is a huge loser creep.
Grok has basically been designed by and for huge loser creeps.
So while ChatGPT and Claude obviously have their issues that we talk about on the show all the time, Grok is.
Speaker 3Distinctive for being uniquely awful.
Speaker 1Remember there was a time when Elon Musk was trying to tinker with Grok to make it less woke, and at that time, Grok started declaring that it should be called MechaHitler.
Speaker 3Remember that, and I want to.
Speaker 1Talk about this because I have been getting very frustrated seeing these headlines that say things like, Grok has apologized for sexualizing images of young girls.
And I want to start there because Grok is X's chatbot.
But guess what.
Grok is not sentient.
Grok is not a person.
Grok doesn't do anything that humans do not tell Grok to do or program Grok to do.
Humans like Elon Musk built Grok and run it as a commercial service, making it available for other humans to use and do things with.
And right now those other humans are using Grok to undress women and girls, to create sexualized images of children.
Speaker 3I.e.
What we call crimes. Criminal behavior.
Speaker 2Let's take a quick break and we're.
Speaker 4Back. So Bridget, how did we get here?
Speaker 1So this has really hit a fever pitch right now this week, but it's not really new, especially on X.
So I want to back up a little bit and talk about sort of how we got to this moment.
We've got a few episodes about AI generated sexualized images that took off on X.
It happened to teen Marvel star Xochitl Gomez.
Gomez was targeted with deepfake images that also flooded X. When this happened, she was only seventeen, so she was a minor, and she's spoken about this publicly.
Speaker 3She was essentially told there was nothing that could be.
Speaker 1Done, that she just needed to make peace with it, make whatever emotional peace with it that she could, because nothing could be done about it.
Also, folks might remember that in January twenty twenty four, AI generated images of Taylor Swift flooded X. They originated on the platform Telegram, where a channel had been running that was kind of like a marketplace for celebrity deepfakes, where users would request and trade deepfaked images of celebrities.
Images of Taylor Swift were created on that Telegram marketplace, and then those images made their way to X where they really took off.
Telegram is sort of like a smaller niche alternative space, and so you know, it was they were being traded there, but when they hit X, which is arguably a more mainstream online space, that's when they really took off.
Back when this happened, in my opinion, X did not really handle this well, which I think we can understand as a precursor for what's happening today.
They initially tried to manage it by just blocking people from being able to search Taylor Swift's name.
However, when you put quote marks around Taylor Swift's name,
Speaker 3You were able to search her just fine.
So that didn't even really work.
Speaker 1And notably, that solution doesn't offer any kind of protection to a woman who's not named Taylor Swift.
Speaker 3Right, like, it just didn't work.
Speaker 1But even let's say that it did work for the sake of argument, the only person that that would conceivably even work for is Taylor Swift.
So certainly not any kind of a meaningful solution to this like deep problem.
But the Taylor Swift situation was notable because back then these images would typically originate elsewhere, like a more niche alternative platform, and then make their way to X where they would enjoy more reach and visibility.
Grok essentially allowed anybody to use natural language to do whatever they wanted to images of women and girls on X.
Speaker 3So you've got the introduction of.
Speaker 1Grok, plus, as friend of the podcast Kat Tenbarge points out in her deeply researched, deeply reported media outlet Spitfire News, which everybody should be reading.
Musk also dissolved Twitter's Trust and Safety Council when he took over at Twitter, right? He fired eighty percent of the engineers who worked on issues regarding child exploitation.
So again, that is a kind of very specific choice that has led us to.
Speaker 3The moment that we're in now.
Speaker 1And yet none of this stopped X from rolling out what it called Spicy Mode on Grok's mobile video generator, which basically includes a feature to produce sexual content.
So, you know, think about that for a minute.
Because you've got a platform where the majority of people who work on sexual exploitation issues have been fired.
You've got an issue already with users generating and spreading non consensual AI generated images on your platform.
You're not really doing the job of dealing with it.
And against that backdrop, you say, you know what we're gonna do, We're gonna double down on adult content generation where folks can make adult content right on our platform.
Speaker 3I guess I just have to say, like, none of this is surprising.
Speaker 1This is like very predictable, and it's not like any of this came out of nowhere or just happened in a vacuum.
Speaker 3Right.
Speaker 1These are very deliberate and I would argue decisions with very foreseeable results from Elon Musk, and that is what's led us to this moment.
Speaker 3Where we're at right now.
Speaker 4Yeah, all of those decisions add up to a very consistent picture of not prioritizing doing anything about this kind of content and in fact liking it and thinking it's cool.
Speaker 1I mean, I'm glad that you put it that way, because we can't talk about what's happening without talking about what we know about Elon Musk, who is someone who, in my opinion, has made it clear that this is the kind of thing that he likes.
He likes sexualized content.
There was a whole flurry of reporting about the way that he was publicly engaging with anime adult content on the platform, right? So, you know, there's nothing wrong with enjoying adult content, don't we all?
But I think it's it needs to be said that Elon Musk is somebody who is totally fine with publicly engaging with content like this on the platform that he runs, and he is the key decision maker on that platform.
Speaker 3So like, that's unusual, and I.
Speaker 1Want to highlight that because I just don't think that we need to pretend like this is not who Elon Musk is, and this is not what Elon Musk is about.
Elon Musk has long been a toxic decision maker, and I think, had a lot of people in power, people in media, other folks in tech, had they not kind of framed this and treated this as, oh, a brilliant genius with a few acceptable quirks, and rather talked about it like they probably should have, which is chaotic, volatile leadership.
Speaker 3Decisions that are bad for business.
Speaker 1Had they seen Elon Musk's public behavior that way, we genuinely might not be in the situation that we're in now, where you know, sexualized images of children are flooding a mainstream social media platform, which I think you know, this is not the Twilight Zone, but I think most people would agree.
Speaker 3Is not great.
Speaker 4Yeah, you know, you mentioned earlier that, you know, lots of people, many people enjoy adult content in other contexts.
The thing that's so unacceptable about the situation is that Elon wants to have it both ways with X, that it's like the public town square where ideas are debated, or at least that's what he says, and also a place for adult content. Like, you can't have both.
Like, even pornography enthusiasts I think, for the most part, agree that like it shouldn't be out in the public square where children are, Like there should be some separation there.
Speaker 1Yes, and there's nothing wrong with consensual adult content. But you can't have a platform where, just by engaging there, you might be setting yourself up to be sexualized against your consent in this way.
Right? So, like, a town square doesn't work if, when I post there, somebody could say put her in a bikini and that's totally normalized and fine. I'm not going to feel comfortable or safe showing up to that town square.
So you've got a town square where people cannot equally speak up, equally show up.
And so again I always say this, this is not just a tech issue or a gender issue or a sex crimes issue.
It's all of those things, but it's also a speech issue and a democracy issue.
If we were to take Elon Musk, liar that he is, let's just take him at his word and say he wants to build a town square where all people can safely show up and engage in a marketplace of ideas.
If, when I walk into that marketplace of ideas, people can strip me naked without my consent, I am not able to safely and equally and equitably show up.
Speaker 3And so it really as.
Speaker 1We increasingly use platforms like this to engage in things like civic engagement, discourse around politics and social issues and democracy, if we cannot all equally show up, we've got a big problem totally.
Speaker 4And you know, to continue entertaining the idea that it's supposed to be like a town square where people can talk about things, if people are supposed to be exchanging ideas, sort of a foundational principle
that I think everyone agrees is important for that to be effective is some sort of mutual respect, with people exchanging the ideas and, like, listening to each other even when they disagree.
And there's just no shred of that on X. It's just, like, a place where absolute disrespect is normalized.
And like, who who would want to go there?
I continue to be baffled that, like so many people continue to be there.
Speaker 3Yeah, I'm with you.
Speaker 1I sort of get it, because, you know, Twitter was the platform where I built up the majority of my sort of whatever you want to call it.
Impact, voice, before I had a podcast, and I get, you know, being.
Speaker 3Like, I don't want to I don't want to lose that.
Speaker 1I don't want to lose the communities that I've built here, the voices I've built here, whatever.
Speaker 3I totally get that.
Speaker 1However, I can only speak for myself personally about the community that I built on Twitter before Elon Musk took over.
And again not that pre Musk Twitter was all peaches and cream.
I had issues with that too, but that was largely, like, Black voices, Black folks.
If you are somebody who is at all, like, minoritized, if you are a woman, a person of color, queer, LGBTQ, it is uniquely awful to show.
Speaker 3Up on Twitter.
Speaker 1And so I had to one day say, the people that I care about, the voices that I care about, they're not really showing up on Twitter either.
Speaker 3So what am I clinging to?
Speaker 2Right?
Speaker 1So, on the one hand, I get it, like, people who, and also I think independent journalists, like.
Speaker 3The thing that.
Speaker 1Twitter had over other social media platforms before Musk took over was that that's where a lot of journalists and decision makers and editors hung out. There was a time, we did a podcast episode about it.
We'll put it in the show notes.
There was a time where, just by getting, like, retweeted by Lena Dunham or something, you could get a book deal, right? And so I understand that it was this place.
Speaker 3Where power was built.
Political power was built there.
Speaker 1You know, you had, like, Black Lives Matter and Me Too really taking off on the platform.
I get all of that, but I think for me, I've had to kind of make peace with the fact that that's gone.
Speaker 3And probably not coming back.
Speaker 1And so why would I want to subject myself to exactly the kind of disrespect that you just described, to sort of enjoy like a facade of.
Speaker 3What once was right?
And at a certain point you have to just let it go.
Speaker 1And I get that people their temperatures may vary on this, but I personally have let it go.
Speaker 4Yeah, I hear you, Okay.
So getting back to the issue at hand, what do we know about the content on X, and particularly this problem of non consensual sexualized images?
So the scope of the issue with AI generated non consensual sexualized images on X has gotten way worse.
Friend of the podcast Kat Tenbarge reported this past summer about how users had been using Grok to make it look like women had semen on their faces.
Speaker 1Just really awful stuff.
Quick side note, we will put it in the show notes.
Y'all should really read Kat's piece at her outlet Spitfire News called How Grok's Sexual Abuse Hit a Tipping Point.
One of the things that she puts really well is just describing what it is like to have to search for and confirm the absolute worst content on social media platforms.
She describes it as being half journalist half the internet's garbage collector, which I really really identify with.
So really, shout out to people who have been working to help us understand the scope of this issue, and that work has gotten a lot harder because of all the things we've talked about on the show.
You know, less transparency from platforms, less back end information.
But people are still doing this work and it's not pleasant work.
So shout out to the people and the researchers and the journalists who have gotten us this information.
But when it comes to what we know about this content on the platform right now, Copyleaks, which is a content analysis firm, reported that on December thirty first, X users were generating roughly one non consensual sexualized image per minute.
Speaker 3That is wild to me.
Speaker 1According to Bloomberg, during a twenty four hour analysis of the images Grok's account posted to X, the chatbot generated six thousand seven hundred images every hour that were identified as being sexually suggestive or nudifying.
This is according to Genevieve Oh, a social media and deepfake researcher. The other top five websites for such content averaged seventy nine new undressing images per hour in the twenty four hour period from January fifth to January sixth, Oh found. So it's not that X is the only game in town when it comes to this kind of content.
But as you can see from that information that Genevieve Oh pointed out, the problem is much bigger on X as compared to other platforms.
And I know that we've been talking a lot about celebrities in this conversation, but just to be super clear, it is not just celebrities.
It is also regular people and children who are not public figures who have just posted their images to X that this is happening to.
Speaker 4That's right. And as we were doing research for this episode, we saw that The Guardian spoke to Nana Nwachukwu, a PhD researcher at Trinity College Dublin's AI Accountability Lab, whose research investigated the types of requests that users were submitting to Grok.
So we looked into that research a little bit, really interesting important stuff.
I'm so glad that there are people like her out in the world producing this research so that we can have some visibility and just, like, quantify the scale of these problems, especially in this era when X has shut down the back end API and visibility for those of us in the public is so limited.
So I didn't talk to her, but I reviewed her research, like, read it.
And what she did was she created a sample of posts of people sending messages to Grok and then studied the contents of those messages. And what she found was that almost three quarters of all of the requests were direct, non consensual requests for Grok to remove or replace clothing.
She showed The Guardian some of the Grok created photos that she collected as part of the research, and The Guardian confirmed that dozens were pictures of women, including celebrities, models, also stock photos, and a bunch of women who were just random, ordinary women, not public figures, posing in snapshots.
That's such a high proportion, like, almost three quarters of requests to Grok were this category of non consensual sexualized imagery.
That's, like, the majority use.
That's the main thing people were asking Grok to do during this time period.
And her research really, I think paints the portrait of the ecosystem around these images as well, because she wasn't just looking at the content of the images, but how people were interacting with them.
And she finds that the users are really interacting with each other around these non consensual sexualized images that they're creating, sharing them with each other, and then, like, iterating on them, being like, oh, add this, take that away, blah blah blah, asking Grok to make changes to images that were shared by others.
So it's like a whole community and ecosystem around what they're doing here.
Speaker 1Yes, And I think that's important to note because think about how you have to feel about something to be comfortable having public conversation about it.
In this way, right? It's not just make this image, it's oh, change this, change.
Speaker 3That, oh, I like this, oh, you did that to her?
Lol.
Speaker 1Like, you get what I'm saying, right? The fact that people are, like, comfortable having public conversation about what they're doing to these women and girls non consensually, I think is unique and telling.
Speaker 4Oh one hundred percent.
I mean, I think that's a big part of the appeal for them and for Elon as well.
Right, It's like the rudest, most juvenile assholes have forced their way to the grown ups table and are reveling in their like fart jokes and like gross sexual jokes that like aren't even funny, are just like rude and crass.
And the fact that they are like normalizing this and like asserting that it is now acceptable, I think is a big part of the appeal for Elon and team.
Speaker 1Yes, that's exactly what I was getting at.
Let's not forget that Elon Musk is the same person who said that he wanted to start a university called the Texas Institute of Technology and Science, aka TITS.
Speaker 3It's not even a good joke.
Speaker 1It's not even, it's like when you were a kid and you would put boobs into the calculator and turn it upside down and then show it to the person next to you. And I feel like, I don't know if you were anything like me, but I feel like when people would do that, even in, like, seventh grade, the person that you were showing that to would be like, really?
Speaker 3Yeah, it was like, it never got a laugh.
Speaker 4It was kind of funny in like third grade that you could make your calculator say boobs, but like by seventh grade, come on, yes, yeah.
So, but that's, like, I think that's a lot of the people who are really using X, and not just, like, random accounts.
Several posts in the trove that Nwachukwu showed to The Guardian had received tens of thousands of impressions and came from premium blue check accounts.
So, you know, premium accounts that people pay for on X with more than five hundred followers and five million impressions over three months are eligible for revenue sharing under X's eligibility rules.
So I think we should remember that too as part of this conversation, that these people aren't just exchanging these photos with each other for lulls and whatever weird sexual gratification to get out of it.
But also there's potentially money on the other end for them, where X is directly paying some of these accounts for this content that is generating engagement.
Speaker 1Cool platform you've got here, Cool marketplace of ideas more like marketplace for AI generated non consensual sexualized images.
Speaker 4Yeah, and you know it's some real high quality art that they're producing here.
There is one Christmas Day post that they reviewed, from an account with over ninety three thousand followers, that presented side by side images of some random woman's butt, with the caption, told Grok to make her butt even bigger and switched leopard print to a USA print, second pic.
I just told it to add come on her ass lmao.
It's like, that's what they're doing, right? You know, we aren't even getting into, like, the environmental resources that Grok is consuming to make this kind of content, and it's, like, three quarters of what people are asking Grok to do. Like, this is what Grok does now.
A post on January third, which The Guardian said was representative of dozens that they reviewed, captioned an apparent holiday photo of, again, some unknown woman.
The caption was, at Grok, replace, give her a dental floss bikini, and within two minutes Grok provided an image satisfying the user's request.
And that was just typical of a lot of these.
Speaker 1Yeah. And what makes me sad is that the common sense advice I'm meant to put here is that I don't think people should be posting their images on social media if this is going to be how they're used.
I don't like to say stuff like that because one, I think that we're at the point where you don't even need to post your images for this to happen, Like, you know, you could post a totally normal picture of yourself and then it gets taken and manipulated in this way.
But two, I don't like that advice because I just think that women and people should be able to show up online and not have this happen to them.
I understand, like, I feel some responsibility that this is the part of the conversation where I should be saying I don't think people should be posting their pictures, but I hate saying that, because people should be able to post their pictures on social media.
People should be able to safely show up on these platforms.
And the fact that they are not.
I don't like putting the onus back on people and saying don't do this totally commonplace normal behavior for using technology in twenty twenty six because some creep could use it to make a horrifying image of you.
Speaker 3I just I feel like we don't have like if we're.
Speaker 1We'll talk more about this in a moment, but the fact is that these platforms aren't doing anything to meaningfully address this.
I hate that in the absence of them actually doing anything, what people keep saying is, like, don't post your pictures, don't post your pictures, and that just doesn't work.
Speaker 4Yeah, it's a real conundrum, because I totally get what you're saying. It kind of feels like giving in, being like, oh, well, I guess we just, like, won't post anything.
But then at the same time it's not safe to post there, right.
I do think it's worth pointing out that, you know, sometimes it's convenient for us to talk about, like, social media platforms generally, because there's a lot of stuff that they all get wrong in common, but this does seem to be, like, a particular problem of X. Or, certainly it's not exclusive to X, but, like, the scale and magnitude and demonstrated lack of concern put X in a category by itself.
Speaker 3That is absolutely correct.
And really it's nothing new for X.
Speaker 1It's just the culmination of all the bad decision making from Musk that I described earlier.
But X has just long been a platform where illegal child sexual abuse material flourishes.
There is a fantastic piece by one of my favorite journalists, Sam Cole at 404 Media, called Grok's AI Sexual Abuse Didn't Come Out of Nowhere, where Cole writes, this is a culmination of years and years of rampant.
Speaker 3Abuse on the platform.
Speaker 1Reporting from the National Center for Missing and Exploited Children, the official organization social media platforms report to when they find instances of child abuse material, which then reports to relevant authorities, shows that Twitter, and eventually X, has become one of the leading hosts of CSAM every year for the last seven years.
In twenty nineteen, the platform reported forty five thousand, seven hundred and twenty six instances of abuse to the CyberTipline.
In twenty twenty, it was sixty five thousand and sixty two. In twenty twenty four, it was six hundred and eighty six thousand, one hundred and seventy six.
These numbers should be considered with the caveat that these platforms voluntarily report to the CyberTipline, and more reports can also mean stronger moderation systems are catching more child sexual abuse material when it appears, but the scale of the problem is still apparent.
Jack Dorsey's Twitter was a moderation clown show much of the time, but moderation on Elon Musk's X, especially against abusive imagery, is a total failure.
So I think that speaks to exactly what you were just describing.
I don't want to make it seem like X is the only problematic platform here, because it's definitely not.
And we've talked quite a bit about Facebook and all the ways that they have knowingly abused women and children.
However, the size and the scale and the scope of the problem is much worse on X, and I would argue that the way that it's publicly handled by leadership is much worse on X.
Like, y'all know, one of my biggest, like, life enemies is Adam Mosseri, who runs Instagram.
Speaker 3I really just do not like him.
He rubs me the wrong way and gives me the ick.
Speaker 1But even Adam Mosseri is not using Instagram to joke about the ways that platform is not a safe space for women, right? Like, we don't see him posting stories laughing about it and posting sexual jokes about it.
We do see that from Elon Musk, and that is different.
Speaker 4Yes. What did he post the other day? It was like, when this story started breaking and people started complaining about it, he posted a toaster in a bikini, just, like, totally making fun of the situation.
Speaker 1Yeah, I think that's totally part of the conversation.
So you have Elon Musk joking about it.
Per the Indicator, Musk has shared at least thirty posts celebrating Grok and talking about how great Grok is.
While this has been going on, between January seventh and January eighth, he has not appeared to express remorse himself for what's been going down, right? So, like, he's been joking about it and then talking about how great Grok is and how happy he is with Grok while this is happening, against this backdrop. And so I'm.
Speaker 3Sorry, even the ghouls who run.
Speaker 1Facebook would have the sense to be like, let's not post jokes about it, right, let's not talk about how great of a job we're doing.
They do a little bit of that, but not the way that Elon Musk is doing.
Even those ghouls would have the insight not to do this.
Speaker 4Yeah, you really can't conclude anything other than, like, he likes it, and he thinks it's cool, and he sees this as Grok doing what it's supposed to do.
Speaker 1So one question that I've seen presented a lot is, how is this allowed?
Which is the question that I started this whole conversation with when I was thinking about it, like how is this allowed?
Speaker 4Yes, that really fits in the broader category of like what the hell how is this allowed?
How is this going on?
And it's just like an acceptable thing, Like what the hell?
So please, Bridget.
How in the world is this allowed?
Speaker 3I wish I had the answers.
I'll tell you what I know.
Speaker 1So, this kind of content pretty blatantly violates X's own policies, which prohibit sharing illegal content such as child sexual abuse material, but as Wired points out, it could also violate Apple's App Store and Google's App Store guidelines.
Wired writes: Apple and Google both explicitly ban apps containing child sexual abuse material, which is illegal to host and distribute in many countries.
The tech giants also forbid apps that contain pornographic material or facilitate harassment.
The Apple App Store says it does not allow overtly sexual or pornographic material, as well as defamatory, discriminatory, or mean spirited content, especially if the app is likely to humiliate, intimidate, or harm a targeted individual or group.
The Google Play Store bans apps that can contain or promote content associated with sexually predatory behavior, or distribute non consensual sexual content, as well as programs that contain and facilitate threats, harassment, or bullying.
So when I read that, I say, oh, well, Grok, especially the standalone app, is clearly, as the research that you just laid out demonstrated, doing exactly that.
So how can this app stay on the platform.
There also is some precedent for this, because Google and Apple have both removed nudify apps from their platforms because they were not allowed.
However, the standalone Grok app is still available on both Google and Apple.
Speaker 4Boy, if I was a developer of some other nudify app and my app had been removed and Grok was still there, I'd be kind of mad about that.
It really seems like a double standard, right? Like, maybe those people who made those awful nudify apps should add a chat feature, and then they could get back, because then they would basically be the same as X.
Speaker 3Yes.
Speaker 1So when I was doing research for this, I was like, well, what are people what are contrarians saying?
Speaker 3Because in my mind, I'm like, who could defend this?
Speaker 1People are saying things like, well, it's not like Grok is only used to make this kind of content.
From the research, it seems like it's, like, a big part of the use, you know.
Speaker 4Yeah, like seventy five percent almost.
Speaker 1And so he was like, well, you know, that's like saying that we should ban email because people could use email to distribute illegal content.
Speaker 3And I thought, not really, No, it's not like that, not really, you know.
Speaker 4Uh, it's sort of like the way, if my grandmother had wheels she'd be a bike, like.
Speaker 3That is.
Speaker 1Like, I feel like we have a couple of, like, axioms that rule the show. If my grandmother had wheels she would be a bike is one of them.
Speaker 4Yes, it so beautifully communicates the, like, non-sequitur-ness of an argument.
I don't think that's a word, but you know what I mean, I know.
Speaker 2What you mean.
Speaker 5More after a quick break, let's get right back into it.
Okay.
Speaker 1So the big question that I had going into this was, I understand it's against Ex's own policies.
It seems to be against Google and Apple's policies as well.
Isn't this against the law, like, genuinely, isn't this like illegal behavior?
Speaker 4I mean my understanding was that, yeah, child sexual abuse material is illegal.
It's illegal to make it, illegal to distribute it, illegal to have it, to possess it.
So like what the hell?
Speaker 1So, big warning caveat: neither of us are lawyers.
However, the framing that makes sense to me is that this is, as you said, criminal behavior.
Speaker 3In my opinion, this is a criminal.
Speaker 1Enterprise, and Elon Musk is benefiting from that criminal enterprise.
Speaker 3So the same way that with any other.
Speaker 1Kind of illegal enterprise, even if I were not making the drugs myself, if I bought my house with illegal drug money, I would be complicit in an.
Speaker 3Illegal enterprise per the government.
Speaker 1Right, and so we are reaching out to an attorney who has been on the show before who specializes in cybercrime.
Speaker 3Shout out to Carrie Goldberg.
We love her, friend of the show.
Speaker 1But again, I'm no lawyer, but this seems like a criminal enterprise to me, and I don't understand why Elon Musk is not facing consequences for this.
Enough Abuse, which is an anti child sexual abuse organization, has documented that forty five states in the United States have enacted laws that criminalize AI generated or computer edited child sexual abuse material, while five states, notably including my own jurisdiction, DC, where I live, have not as of August twenty twenty five.
And so if you are listening to this in most states in the country, if you were doing what these people are using X to do, you would have legal trouble, and if you were financially benefiting from that, you would have legal trouble.
So I do not understand, but genuinely I do not understand how this is not something where Elon Musk would face legal consequences because he has been financially benefiting from the trade and creation of child sexual abuse material.
I genuinely like, I have no idea.
Speaker 4I don't either.
I have seen a lot of people talking about this and a lot of analogies being used, and I think analogies can often be helpful, but they can also be kind of dangerous and like making things seem more similar than they are.
One analogy in particular that I've seen is people comparing it to guns and being like, oh, well, if a gun shoots somebody, is it the manufacturer's responsibility? Something about the ethical responsibilities there.
We have the Second Amendment in this country that puts guns in their own category.
So I find that analogy to be like really unhelpful in trying to make sense of like how this is legal.
Speaker 1Yeah, as far as I know, there's no right to bear child sexual assault material in this country.
Speaker 3So as far as I.
Speaker 4Know, no, I think, uh, you know, Madison wanted to put it in, but Jefferson was like nah nah.
Speaker 3Another question I've seen is what about the Take It Down Act?
Speaker 1So folks might remember that Donald Trump signed the Take It Down Act into law, which makes it illegal to knowingly host or share non consensual sexualized images.
And even though there has been some, you know, mounting legal conversation, the thing there is that companies do not have to respond until a victim reports it, right? And so if nobody reports it, the company doesn't have to do anything.
And so if you're wondering, like, wasn't that signed into law?
Why is that law not being triggered?
That's why, because companies don't have to do anything until somebody speaks out and reports it.
Speaker 4Uh, that's useful, thank you.
So what has the response from X been?
I mean, we talked about elon sort of joking about it.
Has there been any other sort of response from whoever is running the show at X? Well...
Speaker 1I liked that, that was like a deep-seated sigh.
I am loath to say this.
So again, you might have seen headlines about how Grok is apologizing for and taking responsibility for this.
Speaker 3This was the statement that Grok generated, quote.
Speaker 1I deeply regret an incident on December twenty eighth, twenty twenty five, where I generated and shared an AI image of two young girls estimated ages twelve and sixteen in sexualized attire based on a user's prompt.
It was a failure in safeguards, and I'm sorry for any harm caused.
Well, again, Grok is not sentient, and setting it up like a non-sentient piece of technology could take the blame for something, one, it makes me feel like I'm smoking crack.
Speaker 3Two.
Speaker 1It just lets the humans responsible off the hook because the humans, as we said, are doing nothing.
Speaker 3Elon Musk is.
Speaker 1Laughing about it and celebrating how great Grok is and how great Grok's features are while this is going on, and Elon Musk notably has not shared any kind of remorse himself about what's happening, as far as I know as of recording. And so I just really reject anything that would even suggest that. Like, Grok can't take responsibility, because Grok's not human, Grok's not sentient.
Elon Musk is the person that runs Twitter, and Elon Musk has expressed no remorse.
Speaker 4Yeah, I love the efforts to, like, keep the responsibility on the humans and not pretend that this piece of software that was built by humans, run by humans, is constantly being tweaked by humans, somehow should be the target of blame here, right? Like, the accountability goes to the humans who are running it and making money off of it.
Speaker 1Yes exactly.
And mind you, we talked about this on the show a while back.
Elon Musk is the same man who in twenty twenty four, when Google's Gemini chatbot was making kind of biased outputs.
Remember if you asked it to generate I think Alexander Hamilton.
Speaker 3It would make it black.
And remember that whole conversation.
Speaker 4Oh yeah, I remember that.
Speaker 1Elon Musk tweeted about this non stop.
He said it was so alarming, the fact that Gemini's chatbot was coming up with these, like, biased, kooky results.
He said it was very alarming, and he posted about it nonstop.
So when his own platform, I would argue, is doing something much worse, generating illegal child sexual abuse material, sexualized images of kids.
Speaker 3No, nothing to say, no real problem with that.
Speaker 1I will say, he did say that anybody who uses Grok to create anything illegal will face consequences, which they haven't.
But he himself has been laughing about the way that his tool has been used and joking about it.
As Kat puts it, the reality is that X has not taken this seriously, as one of Grok's user generated posts might seem to suggest.
Instead, Musk has encouraged, laughed at, and praised Grok for its ability to edit images of fully clothed people into bikinis.
Grok is awesome, he tweeted, while the AI was being used to undress women and children, make it look like they're crying, generate fake bruises and burn marks on their bodies, and write things like property of Little Saint James Island, which is a reference to Jeffrey Epstein's private island and sex trafficking.
When Kat herself reached out to X for comment, she got back their automated email that just says legacy media lies.
Speaker 3And I mean, I think that really says it all.
Speaker 1As she put it, there's no reason to make X and Musk seem more concerned about this than they really are.
They've known about this happening in the entire time, and they made it even easier to inflict on victims.
They're not investing in solutions, They're investing in making the problem worse.
Speaker 3And I completely agree.
Speaker 4Yeah, and I do want to point out that response that she got from X, legacy media lies, because I've seen this a lot lately and it drives me up the damn wall. Like, she reached out with a specific concern about a specific issue, looking for comment, and she just gets back, like, boilerplate that kind of sort of invokes conspiracy, that does not address it at all. And I feel like we're seeing more of that, and I really hate it so much.
And I also hate how, in a lot of cases, media outlets will let them get away with it.
Speaker 1Also, not for nothing, Kat Tenbarge is not legacy media.
Kat Tenbarge runs her own independent news outlet called Spitfire News.
So not only is it a wild response to send, it don't even apply.
It don't even make sense.
No, it'd be like if I reached out for comment and they're like, NBC sucks. It's like, I don't work for NBC.
Speaker 3So that's a non sequitur.
Speaker 4Yeah, it has nothing to do with anything other than like conspiracy, smoke screen.
The truth is unknowable and also probably somewhere in the middle.
Yes, Okay, so we've been talking about this for a while.
Here, we're pretty worked up.
What's gonna happen next here?
We gotta get this under control.
What's going to happen at.
Speaker 3This point? Not clear.
Speaker 1If I'm being honest, I don't think the United States is going to do much of anything at all.
We have not really heard a lot of anything from anybody with political power to do anything.
Kat spoke to doctor Mary Anne Franks, who drafted the template for laws against non consensual distribution of intimate images, who said the FTC has made it clear they're fighting for Trump.
It's actually never going to be used against the very players who are the worst.
Speaker 3In this system.
Speaker 1X is going to continue to be one of the worst offenders and probably one of the pioneers of horrible ways to hurt women.
Speaker 3So I usually try to find, like, optimism.
Speaker 1I don't have a lot of optimism for how the United States is gonna handle this.
I don't think anything's going to happen.
Speaker 4I totally agree. And I think it's also worth pointing out that the administration's support of X, based on Musk's support of them, is, I have to imagine, also related to Apple and Google's ongoing decision to keep X in their app stores, because were they to kick it out, I think they would have very good reason to believe that the administration might come after them for it.
So I think this is a good example of the ways that this lawless administration, which is run by personal grievance and loyalty, is changing our society and impacting major decisions that make it less safe for all of us, even without the government having to do anything.
Just the idea that they might come after private companies making decisions that they don't like exactly.
Speaker 1And yeah, it's just, in the context of having had a whole, like, national conversation about Pizzagate and QAnon, the way that our government and tech leaders are coalescing around doing nothing and, like, supporting.
Speaker 3People creating child sexual abuse material.
Speaker 1Like, the conspiracy is happening in plain view, right? Like, I mean, isn't this the thing?
Speaker 4Yeah, it's a good point.
It's like right there, we have all these states passing like age verification laws for social media to protect the children, which research says are effective.
But like, here's a thing right in plain view that is happening that we're all seeing, we're all talking about it, and it's just crickets here in the United States from the government.
Speaker 1Well, I will say, our play cousins over in Europe are not happy.
Earlier this week, a spokesperson for the European Commission criticized the sexually explicit, non consensual images that Grok is creating, calling them illegal, notably appalling, and saying they have no place in Europe.
Days later, the EU ordered X to retain all internal documents and data tied to Grok through twenty twenty six, extending an earlier directive to preserve evidence relevant to the Digital Services Act, even as no new formal probe had been announced.
Regulators in the UK, India, and Malaysia have also signaled investigations into the platform.
So the United States ain't gonna do shit.
However, it sounds like other countries are like, actually, yeah, we don't think it's cool that people can just create illegal, sexualized content of minors, and we have some questions about it.
Speaker 4Yeah, it's good that they're asking.
They should get on that.
Speaker 1And just as you and I were sitting down to record, we saw this news that X announced the ability to create images with Grok was going to become restricted to only users who pay for X premium.
We don't even need to discuss whether this is a fix to the problem.
This is obviously not a fix to the problem.
It just means that the ability to make this kind of non consensual sexualized content is going to be a premium service that people can pay for.
So not only will it not address the problem, it just means that it's a money making opportunity for X.
And so as we were sitting down to record, the British government is not impressed by this as a fix.
They told The Guardian this move simply turns an AI feature that allows the creation of unlawful images into a premium service, with which I completely agree.
And something that I was wondering about is, you know, I know that X has really been struggling in terms of making money, retaining users, for all the reasons that we talked about, advertising, all of that. These kinds of AI services are pricey.
Speaker 3They're expensive, and so.
Speaker 1One of my questions is, I wonder if, like, maybe X had been mulling a decision like this over for a while as a way to cut costs associated with Grok. This is just my theory.
I don't have any, you know, this is just speculation, but I wonder if they had already been thinking about this and they were like, oh, we can just say that we're doing it as a fix to this AI sexualized images problem.
But really, we were thinking about doing this as a cost-cutting solution all along.
Speaker 4I wondered the same thing.
You know, we've been seeing a lot of news stories about the big AI players, including open Ai, really struggling to get cost under control and make revenue.
And I mean, I have to imagine that creating all of these images with Grok is costing X a fortune.
So yeah, I wondered the same thing, if that move was something that they had maybe been thinking about already, and they thought that maybe they could, what did you say, liberate two birds with one stone?
Speaker 1Oh, free two birds with one key.
So, you know the expression kill two birds with one stone.
I don't know where I heard it, but someone was like, oh, another way to put that is free two birds with one key, and I thought, what a nice way of, like, reframing. Like, why am I killing birds here?
Speaker 4Well, because you're hungry, so you know.
Speaker 3Freeing them to eat them and put them on my plate.
Speaker 1Ultimately, you know, I'm fired up, I'm angry.
This is a tech podcast, but I don't see this as just a tech issue.
I really think it is a cultural issue.
People who have been reporting on deep fakes since before we had a word for them, folks like Samantha Cole, have pointed this out.
Cole writes: in twenty eighteen, less than a year after reporting the first story on deep fakes, I wrote about how it's a serious mistake to ignore the fact that non consensual imagery, synthetic or not, is a societal sickness and not something companies can guardrail against into infinity.
Users feed off of each other to create a sense that they are the kings of the universe, that they answer to no one.
This logic is how you get incels and pickup artists.
It's how you get deep fakes, a group of men who see no harm in treating women as mere images and view making and spreading algorithmically weaponized revenge porn as a hobby, as innocent and timeless as trading baseball cards.
I wrote at the time, this is what's at the root of deep fakes, and the consequences of forgetting that are more dire than we can predict.
And I think that really says it all to me.
You know, I've been thinking a lot about the sort of societal and cultural rot at the heart of this issue, and I keep coming back to it again and again and again.
Speaker 3And you know, this is not like a fully fleshed.
Speaker 1Out idea, but I think the way that AI is being used just really illustrates that.
Speaker 2Right.
Speaker 1Like, when Google released Nano Banana Pro, their AI generator that makes very realistic looking AI images, I couldn't stop noticing how so many of the examples of images that people had used it to create were just conventionally hot women in, like, yoga pants and workout attire and things like that, doing mirror selfies.
If you take a look at the Nano Banana Pro subreddit, at least when it first started, it was just post after post after post of AI generated women. And it just reminds me that, you know, the moment that these image generators were first becoming mainstream, some men were talking about how they wanted to use them to create fake OnlyFans models, and they were really trafficking in this idea that doing that would be taking the power back from women, like taking power from women and giving it back to men, right? Even though, side note, that's not really how OnlyFans works, but whatever. But I think that impulse was very telling, right? If women are able to build some economic agency by consensually creating their own sexual content.
There's resentment for that, right? And so it's not just, I won't pay for that, I won't engage with that.
Speaker 3It becomes I.
Speaker 1Will use AI so that she cannot have that agency at all, or worse, I will use AI to undress women without their consent.
Speaker 3You know.
It's just like these prompts on X say, make her, put her.
Speaker 1It's all about controlling women and stripping us of whatever agency we do have in society.
Speaker 3And I think Samantha.
Speaker 1Cole really nails it that it's not just a tech issue.
Speaker 3It is a culture issue.
Speaker 1It is a societal rock issue that I think we can and should be holding companies accountable to creating guardrails for this kind of thing.
But this is not a problem that you can guardrail out of being an issue unless we address that rot. And something that you said earlier, Mike, this idea that, you know, platforms like X are going to be our town square where ideas can be debated and all of that.
Speaker 3I don't believe that about X anymore.
Speaker 1However, it is true that social media platforms in twenty twenty six are increasingly just like where people show up.
In twenty twenty six, if you want to be civically engaged, engaged in your democracy in robust ways, build businesses, showing up on social media is just part of that, right?
And so if the advice that we give to women is don't show up on these platforms, what we're actually saying is don't show up civically.
Don't show up in your own democracy.
It's not safe.
And I think that that's part of what's going on here.
It's a way of erasing women from civic and public life.
And I think that, like you know, Elon Musk is obsessed with talking about free speech.
If I don't show up to the town square because somebody's going to yank my top off if I do, what about my speech?
Speaker 3Is that not speech being suppressed?
Speaker 1And I completely agree with Samantha Cole that I think all of this is really coming from the same place, a desire to live in a world where women exist primarily to be consumed, controlled, and stripped of whatever agency we have managed to claim over our own bodies and lives, and I think until we confront that reality, Samantha Cole is exactly right that I don't think any amount of technical safeguards alone, even though we should be advocating for them, is going to fix that problem.
Got a story about an interesting thing in tech, or just want to say hi.
You can reach us at Hello at tangoti dot com.
You can also find transcripts for today's episode at tangoti dot com.
Speaker 3There Are No Girls on the Internet was created by me, Bridget Todd.
Speaker 1It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is our executive producer.
Tari Harrison is our producer and sound engineer.
Michael Amato is our contributing producer.
I'm your host, Bridget Todd.
If you want to help us grow, rate and review us on Apple Podcasts.
For more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
