Ted Chiang's "The Lifecycle of Software Objects"

Episode Transcript

Wheel of Genre, the show for readers whose childhood trauma was when their Tamagotchi's battery died.

Each month we spin the wheel and land on a different genre, idea, or author.

And right now we are reading stories about AI.

That's artificial intelligence, and that includes this one, Ted Chiang's The Lifecycle of Fire.

I don't know why I clarified that. Ted Chiang said The Lifecycle of Software Objects.

I'm Zach and this week I am really interested in style.

I think this is one of the most interesting stories to think about, stylistically, that we have read.

I'm Bob and I've been Traumagotchi'd.

Zach, I hated this story.

I'm going to pause so everyone can leave their hate comments.

Let's count down. Count up, really. Type it out.

Thank you.

You can hit send.

I think everyone calls him Ted Chang, but it's Zhang.

But I think everyone just goes by Chang now.

So I think he is really interesting and brilliant, and I thought this was a very strange story that made me think a lot, more than many.

I thought a lot reading this, and I just hate the whole reality inside the story.

I think he's very interesting in what he's doing, but I don't want that future.

I hate the future, Zach.

I just hate the future.

So tell me more about that.

What about this story made you hate the future?

So both what I love about this, what Ted Chiang is doing, and what's inside the story is what's making me hate the future.

Basically, they're called digients, but they're like Digimon, little creatures that live in computers.

But now people have made them so they have their own genome.

So they adapt, they learn, they can be educated, they change and they develop.

But they've also created these robots. You know how people can have an avatar in a game?

These digital creations can have an avatar in real life.

So they enter the robot and wander around and have interactions with the people who are basically their owners, who can train them.

But what I hated, and I know this is what he's doing on purpose.

So he's very successful in this story.

But why do we ever have to get to the point in the future where we have to think about Digimon having sex?

Like, why?

Why do we have to think about that?

I don't want a future where one of our main concerns is these digital creations.

And then we have to think about whether they should be having sex or not.

You get left behind, Bob.

I know. Apparently so.

Well, I really want to give him credit for taking an idea that I think is not far-future, you know what I mean?

Creating a digital creature.

I mean, we started a show talking about Tamagotchis.

You know, the idea is there.

The idea is here. I have several cartridges of Pokémon.

I played The Sims a lot as a kid, you know what I mean?

Like we're there, but then saying, OK, extrapolate this out.

How can we make it better, you know, in terms of like technologically more interesting, more sophisticated?

And then what are the legal ramifications of this?

What are the economic ramifications of this?

How will the world also change?

To me, what I found so interesting about this story was that it felt more like an essay.

And I don't know how to say that, you know what I mean?

Like it felt to me like he wrote out a bunch of bullet points and then said, OK, we're going to put a loose narrative on top of this.

And all of the ideas are, to me, rock solid, really well thought out, with clear causal implications.

But it feels so different from the genre this is traditionally lumped in with, SF.

Like SF being an adventure, SF being fun.

I think one issue is we just read...

Klara? On this?

Oh.

No, no, no, no.

No, no, no.

Just, what's the triangle?

What's the arc, you know?

You mean the Freytag?

The Freytag pyramid, I mean.

I mean, SF being writing that follows the conventions of writing, you know what I mean?

Like fleshed-out character, fleshed-out setting, fleshed-out interiority, things like that.

I feel like this is heavy ideas, just wearing the coat of what we have grown to expect in writing itself.

Maybe I'm OK if sci-fi is really going to be, you know, a story as a vehicle to get an idea across.

I like it for being an idea heavy genre too.

Sometimes we get an adventure, sometimes we get, you know, a full arc, sometimes not so much.

But I think we just read Kazuo Ishiguro, and that's where everything moves you to a great degree.

And I feel like it's impossible to be moved in this story.

That's OK.

I think it's easy to be enraged or invested in this because, like you said, we're right on the cusp of this being very possible: having these things that can learn on their own and then putting them out into the world, making little Digimon that are now in the real world, wandering around in these weird robots that are basically slightly advanced vacuum robots.

But I was thinking, why does technology have to keep pushing?

Speaking of sci-fi, I'm really angry at sci-fi after reading this.

I just don't want to read sci-fi ever again after reading this short story.

That's how I feel.

I never want to read sci-fi again.

I'm so angry.

Why do we have to keep pushing things this far?

Like why should we be concerned with Digimon having sex?

That's Rule 34.

But now we're making it in the real world.

Like, just leave it there. The Internet is a stupid disaster.

Why does this stupid disaster now have to take over our real world?

I was thinking about, you know, an old episode from Next Generation with the Crystalline Entity, and it comes up a few times.

But when they finally kill it, did you ever see that episode?

There's this.

Yeah.

So from Star Trek: The Next Generation, the Crystalline Entity is this horrific thing that goes and destroys people, destroys whole planets.

And there's this mother whose son was killed by the Crystalline Entity.

And she's a really famous scientist.

Anyway, to spoil the whole episode, she eventually kind of tricks the whole crew into killing this thing.

And then they're like, why did you kill it?

We were trying to make a connection with it so we could finally understand it.

And she said, no, this thing has killed hundreds of thousands of people.

Just kill it.

And we're left in that episode thinking, Oh my gosh, she's terrible.

The lesson is no, you have to learn from this thing.

I am with her.

Just kill the thing.

I was going to curse there.

Sorry.

I'm just angry after the story.

It feels like technology is that Crystalline Entity.

People just keep saying, well, we just got to keep advancing it.

We just have to explore these things.

There's no way to stop it.

Someone else will do it at this point, which is very true, but why?

It just feels like technological advancement is using humans at this point.

There's no advancement that I'm really interested in anymore.

It's just more stupid AI slop, and it's going to be more stupid AI slop for generations.

Well, I think the idea of slop is really interesting when you think about what is happening here, because one of the biggest overall questions in this story is monetization.

And there are several times in it, you know, where they're talking about using selective breeding to determine which characteristics are going to continue into later software versions.

And they're growing more and more attached to these entities.

And these entities are, you know... I wouldn't call them very intelligent.

You know, they're like children.

Well, we can talk about what they are later.

Actually, I think I want to make this point first.

But there comes a point, there come several points in the story, where I'm like, what are we doing here?

You know what I mean?

And he explicitly addresses this: are we training employees?

Are we training products?

Are we training, you know, sex workers?

You know, what are we, what are we doing here?

But the idea of raising these digital children: they say they're raising them and trying to create higher and higher intelligence, but they don't seem to be any more capable than your average human child, you know what I mean?

And it just makes me wonder, like, why, when there are actual children in the world. And the characters themselves have this very conversation: one of the characters gets pregnant, and she says, oh, I'm not going to take care of these digients anymore.

But then everyone else seems to feel like there is an ethical obligation not to abandon these digital entities.

But it's like, I mean, you can just save them.

You know what I mean?

What it all boils down to is that the metaphor people use for these digients seems to be the only way they can think about them.

So at a certain point they think about them as like pets, for example.

Well, you know, then they act like pets all the time.

Then they kind of think about them as children, you know, and then they start studying and go to school.

So they make them look like students, like young adults, you know what I mean?

But the overall metaphor that no one seems to be able to get away from is that these things are alive, and that in a large way seems to restrict the kinds of ways they feel comfortable or not comfortable interacting with them.

Yeah, and there are kind of two camps.

I mean, so you mentioned some people who just totally walk away and they're like, what are we doing?

Some people just freeze these beings they've purchased, which have become their pets and kind of their weird surrogate children.

They freeze them when they get uncomfortable and then they're just suspended.

But that little personality could be renewed at any time.

But the people who continue become, I think, two main camps.

And it's our two main characters.

And we go back and forth between their two perspectives.

So one was a zookeeper.

Her name is Ana.

And she is going to be the one who will always protect them.

You know, no matter what.

She doesn't want to suspend them.

She doesn't want to freeze them.

She wants them to learn from their mistakes and always learn and grow up like a child.

The very end is her saying, go and do your homework.

Then there's the opposite side, who seems to go along with her this whole time. What is his name, Derek?

Or is his name Marco?

They're floating around in my head.

I think you're right.

Derek.

Marco.

Now you got me, you got me scared.

Maybe I got it wrong, but I think it's Derek.

Yeah, Marco, I think, is her boyfriend, but Derek is the other guy.

He wants to be her boyfriend.

But anyway, so Derek is also super into the idea of, you know, letting them grow up, letting them develop, but they eventually split off when he says, OK, I don't even know what these things are.

You know, we've called them pets.

We've called them robots.

We've called them children.

I don't even know what they are.

And it seems like they understand themselves more than I understand them.

He has two: one's called Marco and one's called Polo.

Polo is a copy of Marco, a clone of Marco, and Marco is telling him, I want to be a corporation, I want this, I want that.

And he eventually decides, OK, I am just coddling them, and I should let them go off and do this crazy thing, which we can talk about later.

He decides that they have to grow up and make mistakes on their own, and they can no longer be controlled or, really, raised.

They have to go off and raise themselves.

And I think that is a major place where our relationship with this kind of life form kind of diverges.

I'm not opposed to the idea in general of, you know, an artificially intelligent system gaining legal personhood by incorporating itself as a corporation.

I think that is a really interesting idea, but I guess I am a little confused about the ethical groundings of a lot of the commitments the story is making.

And it is going to sound really callous of me, really, really callous.

But I spent a lot of time thinking about it, and I'm not sure I can fully understand the ethical obligation not to torture AI.

Like that was a big thing in this.

Can you believe these sickos are taking our digients and, you know, torturing them just for their own sick, twisted pleasure?

But let's just take the opposite, like kidnapping a random person and torturing them.

You know what I mean?

Like silly serial killer stuff.

Well, that's obviously worse.

Well, why is that obviously worse?

Well, there seems to be something that humans have that these digital creations, which primarily live inside of a computer mainframe, don't.

These things, you know, they're programmed to feel pain.

But what does "feel" mean in this context?

And who, you know, who is the person?

What is the thing that feels the pain?

Or is it just a simulation of feeling pain?

I don't know, I feel like I'm probably getting the thumbs-down on this video from a couple of people, a couple of angry comments from this one.

But, I don't know, I just feel like the commitment to saying these things are equivalent to humans or dogs or living creatures or animals is in some ways covering something up.

It's glossing over all the ways in which these things are actually not living, not people.

I was thinking in a similar way. But the scene they describe, where one of the digients has been stolen, pirated, copied, and is now being tortured, and that video is circulating around...

I found that extremely disturbing.

Regardless of whether it's a human or not, no one should be doing that.

And it's just disgusting.

But it also made me think, like, yeah, it's very troubling.

It's terrible.

But I was wondering too, like, why should we even get to a point where that is possible?

I mean, why even make these things if we're going to allow them to be copied?

One of the other disturbing things... it's an interesting idea to let them be corporations, to let them incorporate.

You know, Marco and Polo really want to do that.

Another digient has also incorporated himself.

But I think it's strange that they can copy themselves and sell themselves.

So not only will they have rights, but as a corporation they will also be, I guess I could say, a digital entity, something that can be copied and pasted.

God is dead and we have copy-pasted him.

Zach, they can allow copies of themselves to be sold, and anyone can do anything with those copies.

And why even pursue creating these kinds of entities if they are allowed to do that to themselves?

Because it's not really just selling your own image.

It's selling a copy of something that can feel pain.

You're basically allowing them to enslave themselves.

Why create something like that?

That's really, I think, the tension with a lot of AI and robot stories: with most AI, I think the idea that anyone is operating off of is that they want labor done without paying for that labor.

You know what I mean?

Which is just another way of saying they want a slave, basically: someone who will work for them without money and whom they can tell to do whatever they want.

You know, that's, I think, the dream for most people who dream of AI, whether that's acknowledged or unacknowledged.

Ana says so.

Ana, one of the main characters in this, whose digient is called Jax, says the same thing.

Zach, she says it when she's talking to Pearson, who is looking to make a deal, basically, over how they can use these digients.

She realizes the fundamental incompatibility between Exponential's goals and hers.

Exponential, the company, wants something that responds like a person but isn't owed the same obligations as a person.

And so that's when she refuses to make this deal.

They're trying to create this option to have them out permanently; they need enough money to let these digients permanently come out into the real world.

Because right now they're all in kind of a digital space that mimics the world.

They need that money, and they're going to really sacrifice their values to get that money.

And so something is on the table about what they're willing to do with the digients, and if they agree to it, the digients can come out into the real world.

But she realizes, no, people just want to use them for free labor or other free things.

In a weird way, you know, the story is called The Lifecycle of Software Objects.

And, you know, that is a reference to the lifecycle of software, basically, or the lifecycle of a corporation.

But you know, there is a lot to be said about just how this really kind of mimics the life cycle of a human.

You know what I mean?

Like starting off in childhood, big dreams, playfulness, and then, you know, the toddler years, just learning language.

But then the studiousness of those teenage years: they start to get grumpy, they start to get stubborn and willful.

And then there comes a point where you say, OK, you are now a person.

We are thinking of you as a person now and not as a dependent.

Now you have to, you know, join the economy.

Now you have to get a job.

Now you have to become productive in some way, shape or form.

And there are many, many ways to do that.

And, you know, I think all options, including the oldest one, are of course explored in here.

Explored in a pretty disturbing way.

I mean, there's some uncanny horror, I think, and it's explored in a very deep, disturbing way.

It's a kind of uncanny horror because they can't even tell, really, developmentally, how old these beings are.

And then there is this, what is that company called?

They're called Binary Desire, I think.

And so Binary Desire approaches them to say, look, we'll pay a lot of money to give these digients what they compare to brain surgery, basically, to give them experiences that are sexual, but they don't want it.

And it becomes an argument: OK, is this allowing them to become more human?

If so, OK, interesting.

We can let them develop.

We can let them grow up.

But it seems like these companies are actually just trying to exploit them.

They are trying to connect them with different clients who want to be with these digients sexually.

And that feels like, you know... on the one side, OK, let's change the digients.

They're already beings.

Why don't we let them become more human?

But also, this is a deep, disturbing part of the Internet.

Why are we going to pursue that part of the Internet and make these digients exploitable?

Yeah, I like how she basically calls them out and says, look, you're trying to take a kink and not only make a new genre of kink, but make a new mainstream genre of kink.

You're trying to change human sexuality so that these objects you want to buy from us become valuable enough to sell at high profits.

And I thought it was a perfect illustration of the audacity of capitalism: to take the lowest common denominator of something and say, oh, we're going to break this taboo that everyone knows is not right.

And we are going to actually say yes, do this thing.

There's a market, yeah.

But I did think... I can't remember who made this point, but I thought it was so, so good.

I think it was Jax saying, you know, I want to do this.

I want to become sexualized, is what Jax was saying.

And they're like, but it's not natural, you know, it's not what you naturally do.

And Jax was like, why did you make it so that we have to eat?

And I thought that was such a good criticism of the whole project.

I mean, I don't know, just Jax correctly calling out that there's no natural or unnatural behavior for these digients.

It's all just programmed by the people who have been not just creating them but, through Darwinian natural selection, domesticating them and breeding them.

You know what I mean?

Like, like every single thing about them is not natural.

It's built into them.

So if someone is to say, well, you know, it's unnatural for them to simulate sexual desire, well, like he points out, it's unnatural for them to simulate hunger because they don't actually need anything.

So I get it.

So maybe it's good, if you've already created these things, you know, and they've already become sentient, they're already conscious.

They know who they are; they understand themselves better than anyone else.

Like Marco, one of the digients, says.

It's just... Chiang is really smart to use this creepy company coming in to give this option, because it's like, oh, this company has the ability to make them more human. But then it's also, oh, this just doesn't feel right.

I was thinking a lot about some of the other artificial intelligence books that we've been reading.

And, like, what some of the conclusions seem to be about the relationship between humans and artificial intelligence.

And kind of what it takes to tell if something is artificial or not.

And then also, what is empathy really doing for these relationships?

In Do Androids Dream of Electric Sheep?, it's decided that they were never human to begin with, so we've made a test to prove it.

They've tried to trick us, but we can do our best to try and find them out.

And in Klara and the Sun, it feels like... at the very end, Klara is this really wonderful character.

And as readers, we really grow attached to her story and we don't really care about some of the other people because she has the most empathy.

But it feels like at the end she lands on this idea that humans are not really going to care about an imitation.

Humans are not going to care about an artificial intelligence.

It's only... she says this.

I'll just quote her.

She says there was something very special, but it wasn't inside Josie.

That's the character, the girl Klara is going to have to replace.

It was inside those who loved her.

So it's in the relation, the object relation.

So it's in the other people around you who make you human.

And who can say, oh, that's a human, that's not a human.

It's not just one personality deciding that it's human or not.

But in this, it feels like Chiang is really interested in treating AI well, being patient, and letting them develop and grow up on their own.

And he said that people always talk about rights, you know, like what rights will artificial intelligence need to have?

But what he says we really need to put real effort into is our individual relationships with AI.

And he gives an example, saying: no matter what roles we assign AIs, I suspect they will do a better job if at some point during their development there were people who cared about them.

So I think he is putting up a new model: if these entities, these digients, can think, can feel, can change, then we need to really think about our relationship with them, taking them seriously and letting them grow up on their own.

So what are you proposing here?

Are you saying that the digients get whatever they want, or just that people should be very mindful in how they interact with them?

I think he's saying we need to be really mindful in how we interact with them.

But it's making me think about an artificial intelligence I've always found very wholesome.

You know, everyone loves him: Data.

Leave Lore aside, but Data is unique.

He's trying to understand what it means for him to be alive, trying to understand what it means for humans to be alive.

And maybe we like him because he's trying to be more human.

And maybe that's something I find distasteful about the digients.

You know, we can't even say exactly what they are.

So maybe that's unfair.

That's my own prejudice.

But something about Data is great.

He thinks on his own, he is very unique.

There is no one else like him, except for his brother, who's very different from him.

But he is a true individual.

And I think the digients are not.

They are... well, they start to be, because each one continues on its own little timeline.

But they are mascots, that's what they call them, who can be copied, copied, copied, ad infinitum.

And I feel like that feels quite a bit different from Data.

Very Walter Benjamin, "The Work of Art in the Age of Mechanical Reproduction," in the sense that Data is one of one.

He has that mystique, that allure about him, that sense of presence, because there's only one Data on the Enterprise.

But here, you know, we're simultaneously told that these are things with personhood.

They're people, I guess you could say.

And yet they can also be copied.

You can make 1000 copies of them with a single click of a mouse, you know what I mean?

So there's a sense in which the act of that kind of copy-pasting cheapens them.

It makes them lose their aura, lose their allure, lose their sense of weightiness.

Because, you know, of what value is any one of them if you can just make a dozen more, just like that?

This is hopping genres for a second, Zach, but I was really thinking about Tolkien writing "On Fairy-Stories," the essay, where he says that he's making a sub-creation.

And in making that sub-creation, he's praising creation.

People were attacking fantasy, saying it's a distraction, it's silly.

And he said, no, look, fantasy is my devotion to the whole world.

And people have said, after reading Tolkien: I feel like I'm more awake to the trees.

I am more interested in the breeze going through the grass.

You know, Tolkien makes you appreciate nature more.

I wonder, in this, when people are kind of making a sub-creation, because they are creators creating new things, creating life, does it feel the same as sub-creation, or is it different?

Wow, Yeah, that's really interesting.

I love the connection to sub-creation.

And I do think that you're right, or at least the idea that you just said, that it could be devotional.

I think that it can't be anything but devotional, given the amount of hours, weeks, months, years that they spend growing these digients.

But I do think it's a great open question how exactly their sub-creations impact the lives of these characters, or even us as readers.

I mean, I guess maybe every person would have to answer this in their own way.

But, you know, does it make me more aware of the needs of pets, of children?

Does it make me more aware of... well, it's not making me more aware of the breeze and the trees.

You know, we can rule that out.

It's not doing what Tolkien is doing.

But then the open question for me is: what is it doing?

How have I been changed by reading the story?

I think not only how have I been changed by reading the story, but also how have the actual characters been changed in doing their creating?

Like are they making something itself that is praising life or does it detract from life?

Yes, they have made living beings, but now these living beings can be copied, copied, copied.

Does it feel kind of like a reverse of the sub-creation?

Like an evil side to the sub-creation, I guess.

It made me think a lot.

We read Heinlein; we read Time Enough for Love.

And Lazarus Long was just a character who left you a little uncomfortable... pretty uncomfortable.

And for some reason, what they are doing in this story reminds me a little bit of what he's doing in that one.

We did a whole video on it.

But Lazarus Long at some point clones himself, and he has these two clones, and then eventually he has relations with these two clones because they're so attracted to him.

It's a very strange book, but it made me think about this a little bit, because these people are having ideas, right?

And they want to make these digients, which were originally just an idea.

OK, we have this idea.

Let's put them into a digital world where they can turn into something more than an idea, but they're still in the digital world.

Now let's bring them out into the real world.

Now let's expand their horizon so they can actually participate in the real world in a more meaningful way.

And then eventually they want to have sex with them.

It makes me feel as disgusted as when we had to sit for so many hours with Lazarus Long.

It feels kind of similar.

So in one way, you could say, well, it is a good sub-creation, because at least Ana, one of the main characters, really loves these things.

She loves the digients.

And so in that way, she is making a sub-creation out of an honorable love.

You know, she wants these things to live because life itself is the sacred thing and she's honoring that.

But then at the same time, it kind of feels like Lazarus Long, like just making things that are mirrors of himself.

And you have to wonder about it: her true goal was to care for animals, and failing being able to care for animals, she's now gone all in on caring for these digients.

But, you know, if I said that my dream was to be a psychiatrist and to help people, and failing that I became, you know, a Sims addict, I feel like it's fundamentally kind of the same move.

I'm just not sure, as an act of sub-creation... you know, I don't deny that caring for these digital entities can create a transformative effect for the people who care for them.

You know, I think it's very true that by having these digital entities be such a large part of these characters' lives, we see everything outside of the digients through how it affects them.

You know, the breakups, the uncertainty, the amorous feelings between characters, the desire.

But I'm not sure if that's enough to really justify devoting your life to these things, to the point where these things are the lens through which you look out onto the world.

You know, you see everything through the lens of: how will this affect my digients? How will this affect my digients?

You know, to return to Star Trek.

I think Star Trek is a real guiding light, or it should be, for life.

It feels like, if we really have a true digient, you know, and it is a person, you'll immediately recognize it as a person.

Like if we really met Data, even someone way less intelligent than Data, you're going to respect that person.

You're going to treat them like a person, and if you don't... you know.

But it feels like the fear about these digients is that they are the Borg.

You know, they're just repeated, repeated, repeated, repeated.

They're making themselves again, making themselves again, and they're going to change something fundamental about life, I think.

I guess that's just how I'm thinking about it.

My fear is that the future will bring us to the Borg, but more likely it will bring us toward Data.

You know, you come across Data, you're going to treat Data like a person immediately.

You know that's a person.

Well, this was an interesting one.

I want to read more Ted Chiang.

I think that we can do it.

I think that we can find an excuse somewhere along the line to knock out a few more of his short stories.

I'm looking forward to it.

All right.

Talk to you later, Bob.

I would love to read more.

Yeah.

Talk to you later, Zach.
