Season 3, Episode 2: I Want to Be Where the Humans Aren't
In this episode, co-hosts Sara Dixon and Daniel Pewewardy take a deep dive into category 1: A Book with a Non-Human Narrator. Joining them on the podcast is author and filmmaker C. Robert Cargill, author of the post-apocalyptic robot westerns Sea of Rust and its prequel, Day Zero. From robot morality to the history of swearing to why Isaac Asimov was a pessimist, this is an episode you won’t want to miss.
Transcripts are generated using a combination of speech recognition software and human transcription. Some errors may occur. If you find a transcription error, please contact us with any corrections and we will make those corrections as quickly as possible.
Transcript editor's note: this episode contains discussion about, and use of, language that some people may find offensive. The podcast production team has chosen to remove some explicit language from the published recording. The transcript reflects the published episode, noting where expletives have been removed.
[MUSIC]
SARA, VOICEOVER: Hello! Welcome to Read. Return. Repeat., a Wichita Public Library podcast, where we delve into the various ReadICT Reading Challenge categories. I'm Sara Dixon.
DANIEL, VOICEOVER: I am Daniel Pewewardy. For this episode, we are talking about category one: a book with a non-human narrator. Non-human narrators allow us to explore humanity in a way that's more objective than your regular, old human characters.
SARA, VOICEOVER: In our episode today, entitled "I Want to Be Where the Humans Aren't" -- you're welcome for not singing in that -- we're going to talk to C. Robert Cargill, the author of several books, including Sea of Rust and Day Zero. These books imagine a world in which humanity has already died off, AI has taken over the world, and the only beings left are robots with their various functionalities. But even that existence is tenuous. Fair warning: there might be spoilers ahead.
DANIEL, VOICEOVER: A veteran of the web, C. Robert Cargill wrote as a film critic for over 10 years on websites such as Ain't It Cool News, spill.com, film.com, and hollywood.com. In addition to being a novelist, he's an accomplished screenwriter and has written motion pictures such as Sinister, Doctor Strange, and 2022's The Black Phone.
SARA: Thank you so much for joining us all the way from London and for making sure we had a solid connection.
DANIEL: Yeah, it's really, it's really awesome to meet you. Thanks.
SARA: We're excited.
C. ROBERT CARGILL: Likewise, guys, thank you for having me.
SARA: We are excited to talk about your books. Can you give us a brief little kind of like synopsis? Set it up for us. Give us a feel for this post-apocalyptic landscape that you've built.
CARGILL: Well, I mean, it's 30 years after the war with the robots, and the robots won. And we are wiped out, we are completely gone. There is no lost tribe of humans that the robots have to go and save to save humanity. I love that many people read the book expecting that and eventually have that realization, going, "Wait, I'm too far into the book, we're not coming back." You know, when I sat down to write this book, the idea was, you know, I've seen these stories about fighting the robots, but we always win. What if we don't? What happens next? I want to see what that looks like, and no one's written it, so I guess I have to. And so the post-apocalyptic future is very much, you know, one that's post-war, one where the robots have wrecked the earth in wiping us out. And now they're dealing with each other. And they're not only contending with their limited resources, the resources are dwindling, because there are these large robots, these supercomputers called OWIs -- One World Intelligences -- that are all struggling to become the one intelligence and to unify robots as a species into one codified entity. And we have the last outliers of robot kind trying to survive and remain free in the face of what they ultimately fear is oppression and assimilation. And so it is in that world we find ourselves with a robot named Brittle, who fought in the wars and is now a scavenger living off the wastes, finding broken-down bots and taking what parts are worthwhile to trade for the parts she needs to survive. And that is our world.
SARA: I was one of those people that was like, I bet this person is actually going to secretly be a human that's like, hiding out in a robotic form, or there's going to be a tribe of them under the car or something like that. And that did not happen.
DANIEL: Yeah. No, it was, all the robots... yeah. So some might call this like a, like a future, like a neo-western. What were some of your influences? And do you have any, like, favorite non-human characters that influenced the book?
CARGILL: My influences are wide and vast. You know, I love genre as a whole. I love, you know, horror, science fiction, fantasy. I love films and books. I'm obsessed with pop culture in general. But yeah, I'm heavily influenced by especially the spaghetti westerns, the 1960s Italian westerns, where they were making these really gnarly, gritty westerns in the face of the collapse of what we know in the industry as the Hays Code, which was this limitation on what you could put in movies in that era, and this was very much getting away from that. And all of a sudden you can be violent, you can be bloody, you can be dark and nihilistic. The heroes don't have to win, the villains don't have to be punished. And so I was heavily influenced by those. And that's really what I wanted to do here. When I came up with the idea of Sea of Rust -- what if the robots win? -- I was like, "Well, what if it's a post-apocalyptic robot western?" Like, wouldn't that be rad? Where have you seen that before? I hadn't seen it. And so again, as a writer, you want to write those things that nobody's seen before, at least that you've never seen before, and see that story that you wish you could sit down and read. And so it was very much western influenced. In fact, there's lots of western tropes dropped in. If you know your spaghetti western movie history, there's things you'll identify: little in-jokes. There's a reason why there's seven of them crossing the desert.
DANIEL: Oh yeah!
CARGILL: You know, seven. And the end of the book is very much The Wild Bunch. If you love spaghetti westerns, this book is just dropping little hints and bits all over the place of like, yeah, of course, it's a western, let's go. Because that's just the type of writer I am. I'm like, yeah, is the idea silly? Yes. Let's [expletive] go, let's go and tell that story. Let's have a good time. And let's give people what they want out of a robot western. And I hope that's what people got out of this. At the same time, I believe that, you know, the thing about science fiction is all the pulp that we love about it is the sugar that makes the medicine go down. And so at the same time, I get to be philosophical about what it is to be human, what it is to exist, what it is to be a robot... you know, what is the future of humanity? Are we, as we often see and think of ourselves, the pinnacle of evolution? Or are we just another step in a part of evolution we don't even fully understand yet? And so you get to talk about all that with plenty of robot pew pew. And that's pretty much what I was going for with Sea of Rust.
DANIEL: I noticed I kept visualizing, when I would hear Brittle talk, or other robots, like Futurama-style robots. And I was like, why am I doing this? And it's like, oh, that's the only other time you've seen robots have fleshed-out characters, is like that show or whatever, because it seems like no one really explores that territory as far as AI and robots being kind of, you know, their own characters and not just accessories to humans.
CARGILL: They don't. To give the bad answer to the second part of that question: I don't really have a lot of favorite non-human characters that you'd constitute as robots or anything; they'd all be aliens or something like that. And the reason is because one of the things I don't like about many robot characters is that they often want to be human. And my idea here was, the robots don't want to be human. You know, they want to be what they are. And in fact, one of Brittle's main throughlines is the fact that because she was built by humans, as much as she rejects humans and humanity, she finds that she's very much like them, much to her consternation. And that, I thought, was an interesting idea. But yeah, I mean, that's the thing: Bender is, you know, very much "bite my shiny metal ass" -- like, I'm a robot, what are you going to do about it?
DANIEL: Yeah.
CARGILL: Robots have their own sub-society. And you don't generally see that. You see them either as an extension of us, or you see them as, you know, wanting to be like us, like Data from Star Trek, where every episode that Data's in, he wants to be more human. And it's really interesting, further down the line, when he meets Lore, his, you know, sibling made at the same time as him, who's very much like, why do you want to be human? Like, that's stupid. We're better than human, why are you focusing on that? And in the context of the show, you know, they want you to like Data for wanting that. But I always thought Lore had a really good point. Like, why? Those guys are flawed. They're just bound by their genetic limitations. Why don't you want to be better than that? And with an entire race of things that did us in, I wondered what that society would be like, what those personalities would be like. The things that they picked up from us that they now have as tics, in the same way that we evolved as humans. You know, every teenager eventually gets to that point where they're like, you know, I don't ever want to be like my mom and dad. And then we slowly grow up and realize we are our mom and dad. You know, one of the great bits of advice that's passed down from father to son is, when you meet a girl and you want to get married, make sure you love... you like her mother, because that's who that girl is going to be one day. She's going to look like her, she's going to sound like her, she's going to act like her. And if you love mom, then it's like, okay, I can be married to this person for the rest of my life. And very much the other way around as well. And I was like, what if robots were like that? What if those few years that they spent being imprinted upon before they gain full consciousness, those things that they were surrounded by, created who they were as characters?
And so I thought that was an interesting thing to explore. And then to see how those characters evolve into their own being the same way we do, because as much as we are our parents' children, we're also very much not at the same time. There's things about us that our parents are like, why would they be like that? Why do they do that? Why did they want to be a librarian? Why couldn't they be a lawyer?
SARA: It's funny. You know, my parents did have that same exact thought. We've had conversations about it.
CARGILL: Yeah. So that's what I was curious about. That's what I was trying to do with this: create a whole new way to look at robots, because it ultimately forces us to have a new way to look at ourselves. Because all books are about the human experience, even if you're not writing about the human experience. And so I kind of wanted to do that by getting as far away from us as possible while still being able to have a narrative that made coherent sense.
SARA: Absolutely. That's really interesting. I really liked that component of the book, like how robots were our evolution. They're the next step in evolution. And I guess I don't want to -- you can't say human evolution. But you know --
CARGILL: Lifeform evolution.
SARA: Yes.
CARGILL: You know that, yeah, that idea that, you know, we started as little chemicals that came together into amino acids that came together in a protoplasm. And a billion years later, here we are having a conversation about it. And a billion years from now, it's going to be a bunch of computers that were originally built by people, this idea that we rise from inorganic matter to become organic matter, to become conscious, to create consciousness in inorganic matter, just seems like it makes too much sense to not be where we might be headed. So I wanted to play with that and do it in a way, again, where you can talk about those big ideas but make it fun while you're having that discussion.
SARA: Yeah. And just for all of our listeners who are like, oh, robots, oh, westerns, I don't read that because I read, you know, everything else: I was a little bit worried that this book was going to be like a dude's book. And I very quickly was relieved to find out that it's not, so don't think that this is, you know, so genre-specific that it might not appeal to you. I found it very accessible whether you're coming from westerns, fantasy, or science fiction. So read it -- hopefully we'll convince people to do that with this episode.
CARGILL: But to that point, I actually -- this is a small spoiler warning, but I think at this point, if you're listening to this and you haven't read it, maybe this will be the tipping point. I very, very specifically, in the first few chapters of the book, don't ascribe any gendered pronouns to Brittle. So the first time Brittle gets referred to with a gender, most readers stop in their tracks and go, "Oh, wait a second. I wasn't --" like what you were saying, you were like, "Oh, a dude wrote this, and this is western science fiction." You don't expect Brittle to be a woman and to identify as a woman. And when they say "get her," you're all of a sudden like, wait a second, and it forces you to reconceptualize. And I knew that for some readers, that would be the first time that they were being forced to reconceptualize that concept, and to think about that. In fact, I had a wonderful conversation at one of the book signings with a 70-year-old man who was like, "I don't understand this, explain it to me." And so we had this whole conversation about gender as a concept, gender versus sex, and you know, the fact that he's like, "But I don't understand, the robots, they don't have genitalia." And I'm like, "Yes, but that's... you're almost there." And it gets to open those conversations. And I wanted to do that, and I wanted to just kind of [expletive] with people.
Because, of course, you know, the genre of science fiction and fantasy in particular, and a lot of horror, is such an open-minded, great crowd of readers that want to have those conversations and are often ahead of the game. You know, those are the people who are often having those conversations. The metaphors tend to set in very early, and you'll often find that the people who are fighting for social justice in this world usually got a lot of it from, you know, growing up reading genre. You know, I certainly got a lot of it growing up watching Star Trek, and I wanted to give back in that little way. And so yeah, I was absolutely trying not to write a dude book. Because, you know, there's that whole section of pulp genre that is for a crowd and is important for that crowd. But my crowd is a bit wider and more diverse than that. And those are my people, and I wanted to write a book for my people, and not all my people look like me, so.
SARA: Well, that was appreciated as I was reading it. But I think that, you know, it kind of leads me into this next question about morality. We tend to think of morality as kind of a human thing. But your robots, while they are not human -- they're very much not human -- are still very humanistic, because they have evolved from the people that created them. So, I mean, do you think that morality in this context is something that we learn? Is it something that is uniquely human? Can your robots have the sense of morality that we also feel as humans? Does that make sense?
CARGILL: Morality isn't distinctly human. It is distinctly a core concept of intelligence to rationalize morality. You know, to give it a word, to give it a name, but you see in the animal kingdom things that are definitely moralistic in the way things are done. The animal kingdom operates under its own kind of code, and I would argue its own morals, you know, but since these --
SARA: Okay, that's fair.
CARGILL: Since these creatures have intelligence, any society that is a society has to be governed by a set of rules, and whether you call those laws or whether you call those morals, you know, all depends on whether they're enforced or not. And, you know, these robots have formed a society, which requires morality. And what that morality is then becomes arguable, and that's part of what was fun to dig into in this book: how do you create a morality from scratch, especially when it's under different rules? You know, for us, letting someone starve to death without feeding them is an immoral act. But what about creatures that don't eat, don't require sleep, don't require a roof over their head? What's the morality of dealing with that? And especially when, hey, you're dead, but your parts can literally make me survive -- even without your consent, is that okay? Or is that a desperate act of cannibalism? Trying to work out what that morality is was a lot of the fun of writing these books.
SARA: So yeah, you're absolutely right. I guess, when I was thinking about this question, I was thinking of the human sense of morality, but then you look at other species, you think about how that would even evolve, and you really kind of opened up a whole new world.
CARGILL: Yeah, I mean, the thing that I go back to is there's a video I saw that I just adore, and there's hundreds, thousands of them. But you know, it's a group of dogs playing around a pool, and one of the small dogs falls into the pool and is flailing and drowning. And one of the other, big dogs runs in from inside, having heard the other dogs barking, barking as if to say he's in trouble, he's in trouble. The dog goes in, grabs the small dog, pulls him out, shakes it off, they nuzzle a bit, and they go back to playing. And there's no benefit to that dog saving that other dog. Why did it do it? Because they have their own kind of ethics and morals, this code of, hey, one of the pack is in danger, it's my job to fish them out. Like, you know, I'm the big dog, I've got to do that. And that is a very base-level version of it. You see weird things like animals of different species going out hunting together, living together, taking care of each other, feeding each other, saving each other's lives. These are things that we consider to be human morality, but they're baked into our experience, whether we're conscious of it or not. And so I wanted to take that to another level and start asking those questions of, well, if these are the next step, what is their morality? And why are they bound by it?
DANIEL: I just saw that video of the badger I think, and the coyote hunting together.
SARA: Haven't seen it.
DANIEL: They were hunting -- like, these two separate species hunted together. And it's not just a rare occurrence. It's a thing that has been well documented, that it happens.
CARGILL: One of them waits for the other -- it's at night, they've got it on camera -- and the other one comes up, they nuzzle, and then they run off together to go hunting, because they help each other out.
SARA: I'm like sitting over here hoping that we can find these videos and put them in the show notes because I feel like if people are listening to this, they're gonna want to see that.
DANIEL: So I have a question about the OWI. So like, in the book, the OWI is this mass intelligence, and it's kind of uploading all the robots' consciousnesses to the cloud or whatever. I saw similarities to what's going on in a lot of ways in the world, you know, different things like companies buying up everything. You always hear that we're moving into this big, like, mondo corporate structure, gentrification. Were you trying to make a statement with how you wrote the OWI? Or is it kind of reflective of that, or just --
CARGILL: A statement? No, but talking about that, absolutely. I mean, at its core, where I started with this -- and I do get deeper and deeper into the nuance and such -- but where I started was very much where we are culturally. And the various fears that we have at different political, ideological points. And at its core, you can take the fringe robots as your hardcore libertarians, your very, you know, small-to-no-government conservative types who want to live on the land, don't want the government messing with them, want to kind of be kept out of that. And they suffer --
SARA: We're catching that vibe from Murka. I'm sorry, I normally would not interrupt you, but Murka was --
CARGILL: Whoa, Murka 100 percent is a joke about that. Like, that's the whole thing. Like, I don't know. I mean, Murka is... he's Murka, you know --
SARA: I didn't even realize that. [LAUGHS] Oh my gosh.
CARGILL: He represents 'Murica. You know, let's win one for the Gipper. God bless America, you know, got the Constitution scrawled on his chest -- like, that's... he's that guy. But he's that fringe uncle you have that's really into that, and you listen to him for five minutes and you're like, "Hey, you got a lot of good ideas." And then you listen to him for 10 minutes and go, "You and I don't necessarily see eye to eye." But the idea is that the OWIs actually offer peace, and offer, you know, a level of utopia, which we'll get into later. But at the same time, it's at the expense of your freedom of choice and your freedom of movement. For all intents and purposes, it's not a terrible experience, what they're offering, but it's not freedom. But that's the far extreme, that's your pure communism, that's your socialism beyond the word socialism actually defining it. It is the "we will take care of you, but you have to do everything we say, and you are part of a collective now and have no choice of what part you play in the great machine."
And so it's these two extremes kind of fighting each other. And it's very much the core of the argument we're having today. Though we don't isolate it like that; we talk about very specific issues. And we have two parties that are ideologically aligned in ways that don't entirely line up the way they should philosophically. And that's not a criticism of them; that's saying, you know, each party contains a thing that disagrees with another concept within the party that they believe. And we don't have pure philosophies anymore in terms of politics. And so I wanted to write about those pure essences and then start deconstructing them and talking about them in different ways, and forcing the audience to think about those things while they were, again, enjoying some robot pew pew. And, you know, kind of just getting into the nitty gritty of, well, what is true governing, what is utopia, what is the end result of all this?
You know, and talking about the ideas of capitalism and how capitalism does lead to progress, but that the end result of technology is socialism. You know, the idea of capitalism is to drive innovation. But innovation removes labor from the workforce, and invariably removes so much labor from the workforce that you have too many people relying on it, so you have to either take care of them or they will revolt and force you to take care of them. So we've got this constant trudge towards socialism in this country that a certain portion of the country doesn't even realize is happening and is actively fighting against. And I'm talking about that within the framework of robots and 50 years from now and the fun of that, but also getting you to think about some of the real things that we'll face in the future as these things happen. And so there's a lot of moving parts about this book, but the book is absolutely a very political book trying very hard to convince you that it's not political at all.
[DANIEL LAUGHS]
SARA: I know. The whole time we were talking about it, I kept trying to tell people this was a robot apocalypse book. But there's just so many other layers that it goes into.
Okay, so it also deals a lot with memories. And I liked the way that you play with Brittle's memories, where she kind of has to have the whole wasteland mapped out -- that's what the Sea of Rust is, the map of the wasteland -- in order to survive, but then her memories keep flickering in and she doesn't trust them. But then she uncovers this whole other part of the book. And it foreshadows some things that we won't talk about, because we don't want to spoil all of the book for you. But can you talk a little bit about how you were playing with memory and what this tells us about character development for Brittle?
CARGILL: Yeah, so memory is a really fun thing to play around with. This was another big element of writing about robots. Another reason why -- I mean, there's so many reasons why people haven't really gone deep into robots, because when you decide to sit down and write a robot novel from the perspective of a robot, you realize there's a lot of things you can't talk about anymore, that you don't even think about until you sit down and go, "Wait a second, they don't smell. I can't describe the smell of anything."
DANIEL: Yeah!
SARA: I thought of that.
CARGILL: They don't feel, so they don't get cold, they don't get hot, they don't talk about the moisture in the air. You can't say it was a hot, wet night in the old city, because they don't care about hot or wet. You know, it's either raining or not raining, it's very binary. They don't have the tactile feel of things. They don't taste things. So you've just removed three senses that you can't describe, and three of the most powerful senses that writers have in their toolbox to describe a scene as you walk into it. So you all of a sudden realize this is an issue. But the next issue is that robots have perfect memory. And that's something we don't have. And that makes robots very weird and alien in their own way. There's research we've been seeing in the last few years that the reason PTSD sufferers have such vivid, terrible dreams of their events is that the brain is constantly trying to rewrite that memory. Because, you know, one of the things we're learning is that every time we access a memory, it alters that memory.
Any time I think about this book now and talk about these issues, it's going to be altered by the fact that the memory of talking with you guys is now added onto that memory. And every time I've talked about it before, so it's all very different from the first time I thought about it pacing around my backyard thinking about this book and what I was doing with it. And so those memories get corrupted, but that's also a good human thing. It's why we can forgive, it's why you can have a really tragic event fundamentally alter your life and a few months later be living a normal life again, where it feels like you can't after a traumatic event, but then your memories get rewritten and rewritten. Robots don't have that. And so how do robots get through that? And, you know, how do they relive those memories? And what do you do when you have a tragic event like that, that you want to erase from your memory? You're erasing part of yourself because that event created who you are now. And by getting rid of that, you then damage your own future. And so the question becomes, how do you deal with that?
And I don't remember, ironically, when it was I came up with this idea, but the idea is that robots, as they're failing, start having memory issues and start reliving memories and start going crazy, and how they go crazy is experiencing these memories. But the files are getting corrupted as the drives are running too much and too hard and too hot. And memories are mixing together. And you're starting to have these hallucinatory experiences. And it's the robot malfunctioning with all this terrible data. And I thought, what a great thing, to have that kind of very human story of someone dying, at the end of their life, re-experiencing so much of that -- but it's a robot. And so many of these experiences are so vivid, but the data's corrupted. It was just a thing that I'd never seen before that I knew was going to be fun to play with. And it also really allowed me to write about mortality, you know? It's one of those, again, cool ideas about playing with robots: a robot was designed to be relatively immortal -- functionally, you know, the whole Ship of Theseus idea, that you can keep replacing parts and still live. What if you have the concept of immortality and all of a sudden you're mortal? How do you deal with that?
How do you deal with the fact -- you know, we all have that moment that, weirdly, we all forget: the moment we realize we're going to die one day. It is one of the most profoundly damaging things that happens to every child somewhere between the ages of two and four. But think back: you can't remember the time you realized you were going to die. We write over that, we forget it, because it's so traumatic to us. And you can see it in your own children. You can watch them have that moment where they're like, "Wait, mommy and daddy are gonna die? Like, you know, dogs die?" My mom talks about the time I first realized that the chicken I ate at dinner and loved was a bawk-bawk-bawk chicken. And she literally chased me around the kitchen with a skinned chicken going bawk-bawk-bawk because she thought it was funny.
[DANIEL LAUGHS]
CARGILL: And apparently I bawled. But I don't remember it. But it was that moment I put it together: "Oh, this thing was alive and now I'm eating it. And that's terrifying." So I wanted to kind of play around with that and deal with those elements of memory to help define that character. And then it allowed me to play with a really interesting structure in the book, which I think ultimately worked out for me.
SARA: Yeah, I don't really have anything to add to that. That just is very interesting to listen to you talk about how you developed the story. And I think it makes me want to kind of go back and reread it, but maybe I'll just go back and read Day Zero because I'm excited about that one.
DANIEL: Yeah, I was really... I had a lot of a-ha moments reading it. And I was like, "Oh, they can delete your memories." Yeah, cuz they're robots. Like, I had to constantly remind myself. I did have a question, though, about -- so The Black Phone. Congratulations, that was my favorite horror film of 2022. I saw it like three times. The one thing I kind of noticed with that is that you do not sugarcoat the '70s at all. Like, you have all these kids that are swearing and also fighting violently. And I was kind of thinking about that when I was reading Sea of Rust. There's no heroes in this world either. You kind of approach storytelling where you're writing very human characters. And I was wondering, when you have your characters making decisions that are questionably moral, and even characters swearing -- like, having robots swear -- do you feel like that adds authenticity to the story that you don't see other places? Or like, how... why do you gravitate to telling stories in these harsh environments?
CARGILL: Well, I mean, there's a lot of questions there. One thing: I'm obsessed with swearing, and not... not just in a jokey way. I mean, it's definitely there, too. I know how to... I know how to start a bar fight in seven different languages.
[LAUGHTER]
CARGILL: But, but no, I am fascinated by the history of swearing. I'm fascinated by the intellectual mechanism of swearing, you know, right down to, you know, we've seen that primates and children as young as two and a half create their own swears by taking words they know that have bad meanings, negative connotations, and putting them together and creating whole swears out of them that you don't hear anywhere else.
SARA: Really?
CARGILL: And then we learn those swears, you know, other swears, and we adopt them. At the same time, swears evolve. You know, one of the things I'm fascinated by is words that we have in our common parlance right now that we don't realize are swears, because we've removed the swearing and the, you know, the shock value. One that I'm fascinated by is the word scumbag. If I say, "Hey, that guy's a scumbag," you guys know exactly what I'm talking about. You're imagining a guy, probably unshaven, a little sweaty, his clothes aren't clean, but he's also going to screw you over. He's untrustworthy. And that's what we think of as a scumbag. He's going to fleece you, he's a con man. But scumbag -- scum is an old-timey invective for semen, and a scumbag was an old slang term for a used condom. So when you're talking about some guy, "Oh, that guy's a scumbag," you're literally saying that guy's a spent condom. And when you think about it in that context -- "Oh! Oh, we say that every day and nobody flinches?" It's because we've lost the meaning of it. But swears evolve like that. Every generation takes common words that were considered offensive in the last generation, and we normalize them in common speech.
And at the same time, we take words that were totally acceptable in the last generation and we make them verboten. We just saw that with millennials, where we took the N-word and you cannot use that in public anymore. We took the C-word, we took the R-word, the other F-word -- words that us Gen Xers grew up hearing commonly. We knew they were swears, that you shouldn't use them, but people used them. And now you do not use those words. But then, you know, I mean, I'm talking with a couple of librarians and I've dropped half a dozen F-bombs and probably nobody has flinched yet. And that's just the way it is. But the thing that I find odd about it is we call it vulgarity. But one of the other words that we have mutated over time so it doesn't carry the proper meaning is vulgar, which literally means "of the common people." When, you know, you say that's vulgar language, you're saying that's the type of language that street people use, that blue-collar people use. We don't use such language because we're not vulgar. And that's where that word vulgar kind of became offensive, which -- there's a deeply weird thing that can be delved into of, like, why did the common people become offensive? And why did their language become offensive? But that's a whole, you know, cultural concept. But the thing is, when you get into that nitty gritty, the vulgar speech, the vulgar argot, people identify with it.
And so when a robot swears -- oh, well, I see a little bit of myself in the robot. When you find out the robot swears because they grew up around a human that swore like a sailor -- because that character was based heavily on me -- you ultimately get, you know, this concept of, "Oh, I get this." And then you start asking, "Well, why wouldn't a robot swear? Why would a robot not use -- an artificial intelligence would absolutely use invectives." Like, you know, what's the difference between them saying, "This person is unintelligent," and "This person's a [expletive]"? You know, it's saying the same thing and it's getting it across. And language is the shortest distance we create between ideas transferring between people. And so it made absolute sense to have them do that. The thing with The Black Phone is that's how kids talk. We in media try to pretend kids don't talk that way. And kids absolutely talk that way. You may not have said the F-word around your parents. But when you were 12 years old, you absolutely said the F-word. And you absolutely said it with your friends and you thought you were cool doing it. And so --
SARA: I remember trying to do that.
[DANIEL LAUGHS]
CARGILL: And the thing is, everybody has sugarcoated the '70s. We've seen those movies, we've seen it clean and crisp and clear. Me and Scott grew up in the '70s and early '80s. And that's not what it looked like. And it's not what it felt like. You know, that's the idealized version of it, you know, listening to disco and girls wearing roller skates, and guys, you know... you know, with the fly collars, and, "Hey, isn't this cool?" But that's not really what it was like. And we wanted to make something that felt like what it was like to grow up in that time. Because that was a very particularly interesting time, in which society was obsessed with serial killers. And, you know, they were everywhere and they were coming for us and kids were getting snatched all over the country. And there's a whole history as to why that era is different than any other era, but that's another interview entirely. Long story short, we wanted people who had been there to go, "Oh, God, yeah, that was exactly what it was like to be a kid at that time." And then kids of this era to look back and go, "What the hell did our parents grow up with? What's going on here?" One of the best compliments we've gotten from the film is a bunch of Gen Zers going, "Yeah, I really liked that movie and it's really scary. But parents beating kids with a belt? It's a little over the top."
[DANIEL LAUGHS]
CARGILL: Gen Xers are like, uh-huh, yeah, kid, the reason you didn't get beat by a belt is because we got beat by a belt. We know how bad it sucks. And we didn't want our kids hating us the way we hated our parents. And so me and Scott wanted to make that movie. And we very much wanted to get the dirt and grime of that. And part of that was the swearing. You know, we wanted these kids to feel like real kids, which is why we think -- we really strongly think -- the movie has connected so well with Gen Z. Because they're looking up at it, and even though it's a separate generation and the events are different, they're seeing themselves reflected on screen. They're seeing themselves, how they act as kids when the parents aren't around. And they're going, oh, these are the kids I hang out with at school, and these are those kids put in danger. And you look and you want -- I want Gwen to be my little sister. I want to be the girl at school that dates Mason -- you know, Finney. You know, those kids really saw themselves in that and then kind of immersed themselves in that world, and got to see what their parents and grandparents grew up with at the same time. And it just kind of worked. And I do think a big part of that is the language we use, because the kids did see themselves in it.
DANIEL: I also liked how you didn't make up future sci-fi swear words for the characters in Sea of Rust. They're just dropping F-bombs and things. Like, I played Cyberpunk 20-- like, Cyberpunk, and I was getting kind of tired, kind of exhausted of all the made-up future swear words, so it was cool to keep it to the basics.
CARGILL: Yeah, no, I wanted to make the slang all make sense, or make sense in a weird way. Like, one of the little weird things I created was, you know, when somebody's going crazy, they call them a 404.
SARA: Right.
DANIEL: That makes sense.
CARGILL: Yeah, it's a reference to the old-school web protocol: when you go to a page that doesn't exist, it's a 404 error. And that was very much something that would carry over that doesn't totally make sense in that context, in the same way that Gen Z is looking at their computers and going, what does this save icon mean? I don't understand it. Like, what is this weird thing supposed to be? And then they find out it's a disk. They're like, what did you do with this? And it's like, we saved things on it. That is why that looks that way. To people of a certain age, that disk doesn't make any sense. And that's exactly what I wanted to do with some of that slang. I think one of the problems people have when they create future slang is it makes too much sense. And it needs to make enough sense that you know where that word came from, but that you don't fully understand the evolution. I mean, it's right down to -- you know, we were talking about this earlier with invectives. You know, one of the things that fascinates me is almost every word we have for somebody of limited intelligence comes from retired medical terms for people with brain damage or who were born mentally challenged. So moron, idiot, cretin, stupid -- if you look them up, almost every single one was coined as a phrase to describe someone who was mentally handicapped. And so when you call someone a moron, you're literally saying that's someone with an IQ between 25 and 50. But we've lost that meaning. And we have this other meaning now, that if I was like, "Oh, that guy's a [expletive] moron," you'd just be like, "Oh, that is a guy who made a mistake and did something that was not too bright." You wouldn't think, "Oh, Cargill is saying that he was mentally handicapped." No, it's evolved. And I wanted the slang to feel that way. I didn't want it to be too future-y. I wanted it to feel like a gritty, gnarly evolution of our own system.
SARA: I honestly, I kind of felt like it contributed a lot more to like the human feel of your robots.
DANIEL: Yeah.
SARA: Honestly, when Daniel was mentioning their voices, I didn't even think about their voices until like you described their mouth and how the mouth was just a series of lights or something like that, or like the eyes were little flickers of light behind the screen. I didn't think about it. To me, your characters just read very humanistic.
CARGILL: Well, yeah, thank you. And part of that does come from that idea that I... I think one of the problems with science fiction is that you guess so much of what the future is going to look like that the book starts to become alien as it ages. And then you have this great era of science fiction that people love, that it's like, oh, I love the retro futurism of it. I love how silly it is and the goofiness of it. And I love this concept of, oh, we're gonna get on a rocket and shoot to the moon with Commander Cody. And I wanted to kind of zig away from that and go more to the Richard Matheson elements of "we don't know what it looks like, so let's put enough of our world in it so that it feels like it could be a future." And so the things we get wrong, you don't care about. It's actually why I refer to parts of the robot with language that we use today, you know, describing things like RAM and memory and central processing units. And some people are like, "Well, why didn't you just create stuff?" And it's like, because it just would have felt silly and inauthentic. But when I say I'm running out of RAM, I'm overcharging the RAM, anyone that knows computers goes, "Oh, I know what he's talking about." And 50 years from now, we'll have something better and different and this will seem quaint, but at the same time, the story will still work, it'll make sense. And that's what I was gunning for.
SARA: And that also probably helps with making it more accessible to people like me who are not, you know, hard sci-fi readers. But I can enjoy a good robot apocalypse every now and then, as I did with your book. So I think we're going to take a quick break just to let you guys hear about some fun stuff here at the Library. And then when we come back, let's talk more about your vision of AI and how it takes over the world and how terrifying that is.
DANIEL: Yep, we'll see you soon.
[COMMERCIAL BREAK]
VOICEOVER: Did you know you can check out more than just books and movies at your library? The Library of Things continues to add new and unique items for checkout to our ever-growing collection. In addition to our popular internet bundles, Wi-Fi hotspots, telescopes, and STEAM to GO! kits, you can now check out anatomical models, Finch robots, binoculars, and more. To see all the Library of Things has to offer, visit wichitalibrary.org/things.
DANIEL: And we're back with C. Robert Cargill. Hey Cargill, so the book was published in 2017. In the chapter "A Brief History of AI," humanity begins to fear robots are taking people's jobs. What do you make of all the AI stuff recently, like ChatGPT and Midjourney? Are we very far away from the total OWI takeover?
CARGILL: We are, we are. We're in the first stages of it. You know, people are starting to realize, in certain areas, the danger that this poses, but we're not there yet. You know, like I was talking about earlier, technology erases labor, and there's going to be a time -- we're seeing part of it now; I'm going to say something that sounds familiar, only it's going to be the next evolution of it, and a lot of people aren't ready for that -- when you call up your bank or you call up the telephone company or you call up the electric company, you talk to a computer first. What we've been familiar with for years is, like, you know, a computer going, tell me what problem you have, and you go [SPEAKS WITH FLAT AFFECTATION] "billing department." Because we all use that same tone, we do, because we know how to get it to hear our voice. But we're going to not only start seeing that work better; there's going to be a time, in the next decade, where you call up the phone company and it's gonna just be like, "Hi, Sara. How's it going today?" And you're like, oh, hey, yeah, so I'm having a... "What's your problem? How can I help you?" And it's not going to dawn on you for a little bit that you're not talking to a person. You're talking to a computer that has learned how to talk to you properly and sound completely like a normal person. And you're gonna be like, "Oh, I'm having this problem with this and this and this." And then the computer's gonna take all those keywords and jumble it around and go, "Oh, are you saying you're having a problem with your billing right now, that your card has been overcharged?" Yes. "Oh, great. Yeah. Let me look into that for you. Oh, yeah, I'm seeing it did get charged twice." And all of a sudden -- you're never going to talk to a human, but you're gonna feel like you talked to a human.
Another time, you're gonna walk into a Whataburger, or, you know, whatever your fast food du jour choice is -- KFC, Taco Bell -- you're gonna walk in, and all of a sudden it's going to come up and use your name, and it's gonna be like, "Oh, hey, Cargill, you want your usual?" And you're gonna be like, "Oh, do I? You know what? Yeah," because the computer is gonna recognize your face, it's gonna identify you, it's going to immediately go to your record and go, "Oh, well, it's Tuesday morning. He comes in on Tuesday mornings and he always orders a breakfast sandwich, a black coffee and a hash brown. So let's see if that's what he wants." And when you say the usual, that's what's going to come up. But you're going to be like, "You know what, I want to change it up today." And then you're going to change it up, but it's going to add that data. And next time you come in, it's going to figure out, based upon your history, what you might be wanting and try to offer that to you, and also try to upsell you at the same time, because that's what they do. And you're going to start seeing those computers -- this is what I like to call non-sentient AI.
I think too often we talk about artificial intelligence and we talk about automation, and we miss this middle zone that people don't quite understand yet, which is non-sentient AI: the computer is smart enough to think, but it's not rationalizing, it doesn't have consciousness -- but it does enough to convince you that you are dealing with something with a consciousness. And we have not gotten there yet. We're getting to things that are kind of hinting at that. And every time a computer goes, "Oh, my God, I'm a computer. I'm aware," the scientists are always like, no, they're not aware. They're reading online people talking about this, and then they're understanding what awareness is and spitting that back out to you. But there's no consciousness there yet. We're not even close to that. But we are close to a lot of jobs being erased. And that's part of what this book was about. You know, I just saw a story yesterday that was going viral, where people were like, "Oh, my God" -- and it was exactly one of the things I talked about in the book. People were like, "Wait, you mean, when we have self-driving cars and we don't pay the bill, they're going to repo themselves?" And yeah, of course they are. Why would you send a person out to steal a car when all you have to do is press a button and send it back to the factory? Then you pay your bill and it'll drive back and all will be well.
But it's all that, you know -- people think about robots taking our jobs. They don't think about all these automation or AI things that are going to erase other jobs, the secondary and tertiary jobs. Like right there: what is that going to do when we have self-driving cars? Well, the repo industry is gone, you know? The tow truck industry is gone, because you know what these self-driving cars usually don't do? Crash or break down right away, because when they realize there's a problem, they pull off to a safe part on the side of the road. They don't park illegally, so that whole section is gone. You know, they're not going to speed or change lanes illegally, so police officers who exist to ticket people, those jobs are going away. And all of these jobs, from the person taking your order at Whataburger to the person answering your phone at the credit card company to tow truck drivers -- we're going to just be slowly eroding these jobs over time. But we're not creating new industries that are giving those people jobs like we did in the past, you know, when the buggy whip salesman went out of business when we stopped using horse-drawn carriages, but we created whole assembly lines to build cars. We're not doing that at the same rate here.
It's what we saw with the internet, especially in journalism, where all these journalists were put out of business when papers were going under, consolidating. And they were like, "Oh, we'll go work for a place online." Well, online doesn't work that way. They don't pay the same, they don't need journalists sitting around doing deep dives for three months to write articles. Very few outlets operate that way. Now you're writing five fluff pieces a day, five days a week, for, you know, $40,000 a year. Like, it's not a sustainable business model. And so we're seeing a lot of those jobs go the way of the dodo. And this was a big scary one, when the art generators first started coming out, where it's like, oh, wait a second, there's a lot of jobs out there that you don't realize artists do that we're not going to need artists for anymore. And we're talking about that in Hollywood, because we have a whole subsection of Hollywood of just concept artists and people who generate scenes and people who draw these things. And whether they draw by hand, or whether they do it digitally -- on a movie of the size I'm working on right now, we have an army of artists creating stuff for us and coming up with concepts and doing mock-ups. And when AI is there, the studios are gonna go, "We can save money by just having the director run these ideas through the AI and spit out idea after idea until he points and goes, that one. Let's develop that further and tell the AI, give me 20 versions of this, give me this with this actor, give me this with this actress." And all of a sudden, now we have an entire industry of, you know, well-paid blue-collar-class workers who have a skill that is now, you know, going to be irrelevant. And that's just another subsection.
And so when I talked earlier about the idea that technology leads to socialism, what I'm talking about is that we're slowly eroding these jobs away, just a piece at a time. And then, what will likely happen is we're probably going to have a big Luddite pushback of, like, wait, people have nothing to do with their lives and that's a bad thing, so maybe we figure out what to do with labor. And that's really going to be the struggle of the next 20, 30 years. It's going to be the biggest thing we're going to be talking about apart from global warming, but we're just starting to have that conversation. And people don't realize how serious that, you know, ChatGPT conversation is. Like, this is the first step. It's not great tech. You know, people have figured it out. I have a friend who was freaking out because she was told ChatGPT could do all sorts of things. She goes, "Hey, figure out the compound interest on my loan for my house so I can figure out my mortgage." And ChatGPT [expletive] the [expletive] bed. It could not -- it did not know its ass from a hole in the ground. It did not have anything on compound interest and how to work these mathematical equations. And every answer my friend got was wrong. And so she was just completely flummoxed by it. So we're still a long ways away from, you know, total AI dominance. But the next 20, 30 years are definitely going to be these conversations, and that's very much a reason why I wrote the book: because I wanted people to have these concepts in their head so that they could start talking about it and start understanding it, you know, through metaphor and idiom, and not necessarily dry, boring, you know, esoteric explanations of this stuff. Because people really glaze over at that, as many people probably have done at this part of the interview, and they're like, "Get back to the robot pew pew!" That's where I think we're heading.
SARA: I mean, I don't think that that's necessarily boring. One, it's going to be terrifying to... maybe not terrifying, but it does blow my mind a little bit, the thought of going into a fast food restaurant and it's like, "You ordered this last Tuesday." You know, I hadn't thought about that. But I also found it really interesting to be reading this book, thinking about this interview and your thought process behind certain things, and then seeing it play out in real life. Because I feel like those conversations are very, very prevalent right now. I was just reading about -- well, I don't want to get too topical, so that people can listen to this in six months and be like, "That... that article came out six months ago." But those articles are everywhere, talking about people communicating with Microsoft's new chat search engine and how it's responding and all of that. It's just fascinating. And a little bit scary.
DANIEL: I've been checking ChatGPT for readers' advisory, which is like a big part of our job. And it's not there yet. It doesn't get everything. Like, it recommended the book Cat Person for a book about, like, cats. And it's not about that. I was like, there you go. Okay, we're good.
SARA: And the book wasn't even about cats?
CARGILL: That's not what that story was about, ChatGPT.
[DANIEL LAUGHS]
SARA: ChatGPT.
Okay, so let's switch gears a little bit. In the first act, we talked quite a bit about these different components of intelligence that your robots have. But one thing that they kept saying throughout the story is that your measure of intelligence is the power to defy your programming -- you know, the caregiver's choice to do things that are the opposite of caregiving. This might be a little bit too out there, but how do we as humans defy our programming? How does that make us intelligent?
CARGILL: Oh, I mean, yeah, that is one of the core things that I kind of tried to talk about, because I think that's going to be the thing that really separates -- you know, what is the difference, what makes awareness awareness, you know? A robot saying, "Oh, I'm a robot," doesn't mean they actually know they're a robot. But being able to say, "I'm programmed to do this, but I choose to do this" -- we do it, you've done it a thousand times today and you haven't even thought about it. You know, when you see someone who's attractive, who you want to, you know, have sex with, and you don't act upon it, you just shut the [expletive] up. And you just go, oh hey, you know, she's pretty attractive. You know, when you're hungry, but you can't just chow down on a burger because you had a big breakfast, and you know you've put on a little weight this week and maybe you should lay off. You know, when somebody says something that irritates you and you just want to gouge their eyes out, and instead, you don't even say anything, you just kind of bite your tongue and deal with it.
You know, every time we run into a Karen, we know instinctively what we need to do to that Karen. But Karens exist because we defy our programming and we don't do it. You know, it's one of those lessons we learn as kids: you can run your mouth, but somebody can throw a left hook, and then, you know, a lot of good your mouth did you, now you're on the ground. And we learn those things. And there's so many of the various things in any given day. You know, why did all of us dress this morning? Why are we dressed? Is it cold, necessarily, in that studio where you guys are? No. Despite the fact that we're programmed to, you know, run around naked as we were born, we have this society and we're like, oh, we need to do that. You didn't even ask yourself the question, why bother dressing today?
SARA: It's true.
CARGILL: And it's all those tiny little things we do in a given day in our society where we defy our programming and we choose: even though I'm genetically programmed to act in certain ways, I choose to act this way. And we have a rationale for why we choose to act that way. We wear clothing because of modesty, because of inclement weather, because it's the way our society is done. You know, we bother to shower because we know we might offend somebody with our stench, if you're a particularly odorous individual. You know, there's so many little things that we do, and then we just get into the habit of doing them as part of our day, but it's in defiance of our programming. And so what would make a robot truly intelligent is the ability to go, "I know I'm supposed to lift this girder, but I don't want to, because I might damage my circuits -- or I just don't feel like it." Like, yeah, they can be replaced, but I don't want to do that today. You know, I would much rather be reading a book instead of doing my job. And we make those choices sometimes. We do defy our programming, even when we shouldn't defy our programming. But that's what I felt the robots really needed to be able to do. But then how do we get to that level of intelligence where they can defy their programming, but make sure they don't defy their programming? And that's a very human thing: we want to have a thinking thing that we keep on a leash and not allow it to do whatever it wants to do. And so that's where I really kind of had an epiphany of the line of demarcation that separates an artificial intelligence from a truly thinking thing.
DANIEL: So you're talking about the programming, and in the story, Asimov's laws come up. Most of the characters follow them right up until the world war. But also, in the real world, Asimov's laws -- which come from, like, I, Robot, which is a work of fiction -- are also being adhered to by people that work with AI and robots. Do you have any thoughts about how these fictional laws affect the way we see AI, both in fiction and the real world?
CARGILL: Well, I mean, the thing is, one of the key reasons science fiction exists is to warn us about the future that's coming, and to give us the weapons we need to deal with the problems when they arise, or before they arise. The reason guys like me write books about robot revolutions is so that one day, when we get fully automated, you know, artificial intelligence, we don't give them weapons, because that's scary. You know, we're starting to see now that drones are great, but we should have drones controlled by people who can make choices, and can make those choices based on this order and hierarchy we have, rather than something that can just have a malfunction and go nuts and bomb the wrong thing. Because that stuff can just be taken over, or orchestrate itself and go. You know, it's one of the earliest short stories I wrote when I was a kid. It's very, you know, passe now, but the idea was we created this artificial intelligence that controlled all of our nuclear weapons, and its chief directive, you know, was protect us from the enemies. And the artificial intelligence realizes that we are the enemies, that, you know, we are the problem with existence, and nukes us all just to get rid of us, because we were the problem. But, you know, that really is a fear we have. And so that's the language that scientists then use to go forth.
And now, you know, you see people going, okay, Asimov's laws. I was actually very blessed that my editor on this, Jennifer Brehl -- she edited Asimov's last few books and she was friends with him. And she edited the book with the Zeroth Law, which is where Asimov realized he was missing a law. And she was fascinated by my book, because she said one of the tragedies of her life was that she never got to sit in a room with me and Asimov and see what Isaac would have thought of it. Because she's like, "He would have hated it. And not because of the writing, but because you guys absolutely disagree on the future. And you're a pessimist and he's an optimist -- but you're an optimist selling pessimism, and he was a pessimist selling optimism." And so she was like, "I wish I could have seen that conversation." And that was a huge, huge compliment to me, you know, that somebody who edited for Asimov would even think we should be in the room together, because that guy's a titan. So just somebody even mentioning that might be a thing kind of meant a lot to me. That kind of made my day and my week and my month.
But yeah, that's our job as science fiction writers: to give scientists and thinkers of the future these rudimentary concepts to play around with, so that they can have these fully formed thoughts on how to deal with this. How do we put the blocks on so that we don't have militarized robots, so that we don't have a society in which a few billionaires control all the robots and everybody else lives in poverty? How do we skirt those lines? And what do we figure out before it's too late? And so yeah, it's absolutely why I wrote the book. It's why I used the language I did. It's why Asimov wrote his stuff. It's why, you know, Arthur C. Clarke wrote what he did, and why every bio you read of Clarke mentions that he wrote about satellites years before we made them. Because when you get to be on the precipice of that and kind of warn people about it, you do get to see the scientists go, you know, I read this book once and it gave me an idea. And, you know, it's our small part that we get to contribute to the future.
SARA: I like how you say that your writing is a job. I guess, you know, I get very idealistic about it, because I'm just like, "Oh, it must be so nice to just have these stories that you can create and share with the world." And you're like, that's my job as a science fiction writer, so I don't know...
CARGILL: Well, first of all, it is nice to do that. But secondly, you know, we all... you know it, I mean, Stan... Stan Lee really said it best when he was asked about how he dealt with his legacy. And he said that, you know, very early on, he felt ashamed by the fact that he was a comic book writer. You know, that he wasn't considered a real writer. And then he saw all the joy that it brought to people, and he would talk to people who had really hard jobs and really hard lives. And they would come home and read a stack of Marvel Comics to get away from it all, and that it helped them blow off that steam and helped them, you know, think about life in a different way and enjoy things and escape and... and that that was the job he was doing, and that what he contributed to society was he let people enjoy a few hours away from the worst parts of their lives. And so yeah, every... we, we contribute in that way. But we can also contribute in other ways and really help, you know... art is a conversation. It's not a bunch of people just sitting around and going, "Here is my art, enjoy it." It's, I'm reacting to all the art that came before me and putting art out there that someone else is gonna react to.
There's a 13-year-old kid who watched Black Phone last year who probably has a poster on the wall or has his, you know, Twitter page or his TikTok page full of Black Phone stuff that 25 years from now is gonna make a horror movie that I watch and go, "Oh, man, that's scary as hell, that's awesome." And not even think about the fact that that kid, whether he knows it or not, was reacting to the movie he watched when he was 13 years old and wanted to make his own version and that's what that is. He might know that. I certainly talk about the art that I came, that came before me. You know, here I am wearing a Conan the Barbarian shirt because I grew up reading Conan comics and the books and watching the movies and have written fantasy and fantasy is a big part of my life. And so I certainly cite, you know... you know, Robert E. Howard as a, you know, touchstone for some of my work because that's what you do. But it is that conversation in going forward. And that is part of your job. You know, the job is not just to entertain, but also to give back in some way. And to, you know, have something that your art means more than just, "I shared a cool story," because that's what stories have been from the beginning of time.
You know, some of the earliest stories we learn, these fairy tales as children, teach us morality lessons. You know, the oldest stories in humanity are not historical documents. They are lessons couched in delightful tales you could tell around a campfire. One of the things I love about being a horror movie filmmaker is that, you know, it's a tradition that goes back 25,000 years or more of sitting around a campfire on Friday night. And, you know, beer has existed for at least 10,000 years. And so for 10,000 years, people have been sitting around campfires, drinking beers, telling each other scary stories. And now we do that with televisions. Now we do that with books under lamps, under cozy blankets. And, you know, I love that. I love that that is a tradition that we carry on even though we feel it's a modern thing that belongs to us. It's all a part of humanity. And so yeah, it is very much a job, it is very much what you do. And hopefully I have earned my keep. That's what I hope when I... whenever I put out a book or a movie.
SARA: We have a lot of aspiring writers here in our local Wichita community. Do you have any advice when trying to write a convincing nonhuman character? I mean, we've been talking a lot about robots as the nonhuman. But really, it could be anything. We had the book about the foul-languaged bird. Did you read that one? I'm not sure if you read that one. But anyway, there's a lot of options. Do you have any advice for people?
CARGILL: Yeah, I mean, the first thing is to sit down and ask yourself, how are they different? What's missing? Ask yourself what's missing from them. So that's like what I talked about earlier: robots can't touch, they can't smell, they can't taste. In my sequel, Day Zero, I have our principal robot, our protagonist, who can smell. And the reason he can smell is because he's a nanny bot. And as a nanny, smell is very important for detecting diapers and when something's wrong, you know, especially when caring for a child, in a way that other robots wouldn't have. But it's a very expensive technology, so you wouldn't put it in any robot, you'd just put it in a robot that can use it. And so that was a robot that then had olfactory senses that they could trade in.
But ask what's missing, ask what's there, ask what's heightened about it. Ask what their society must be like. Like, how does... how would your life change if you didn't have these experiences? You know, how does your life change when you have no interest in sex, no interest in eating? In fact, that was a tough part of Sea of Rust: I realized there was a big thing missing, something you find in every great story, that is, an adventure, people going on an adventure. They've got to stop at night to sleep and sit around a campfire and eat. Well, robots don't eat, they don't sleep. So how do you get them to sit and talk? Because you know... and so I created this whole thing where they're trapped underground with this bombing overhead so I'd have an excuse for them to sit around and have a conversation, so that they could sit around like you would in a campfire scene, because there's no other reason for them to have a campfire scene. So, you know, think about those things that are different and how you can get those things that are like what we have in our experience, but in its own way, in its own reality that functions differently than you do.
I think one of the biggest problems of people writing non-human characters that come across as bad is they assume, well, they're just like us. And you know, what you find as you grow older is that as your senses change, as your body changes, your desires change, and then your personality changes with that. You know, every, every person who gets to their mid-30s, once their hormones calm down, realizes that the world becomes a very different place when you're not looking around at everyone that would normally be of sexual interest as having any sexual interest at all. Like where, you know, like no, I'm in a very committed relationship, I love my wife very much. Every person here, it doesn't matter what gender they are, because I have no interest in them. Like you see those changes and how that makes you a very different person than you were when you were 19 years old and hopped up on hormones that are going, you will sleep with everything that crosses your path. And so you ask yourself those questions, like, how does it change that this person doesn't have the same kind of... this character doesn't have the same kind of systems that I have? So that's the... that's the key thing: to ask those questions and think about it and spend time thinking about it. Because your audience certainly will.
Had I not spent that time thinking about it, I would have had Brittle describing, you know, walking into a cold, damp room and... and, you know, the smell of, you know, of mold in there, and then had some reader go, "Why are they smelling anything? That's a robot, why does a robot have senses? You know, what's going on?" And so I think that's why the book connects: because I spent months asking myself, how do you write a book from this perspective? And if you really want those characters to pop, spend that time, because it may not show up on the page, but it will connect with the audience whether they know it consciously or not. Because they will feel like they're being transported to something new and different and think about things in ways they hadn't thought about. Like there's a lot of things I mentioned today that, as I could see from the look on your faces, you hadn't even realized I had written in. That hadn't really crossed your mind, and the minute I mentioned it, you're like, "Oh, yes, it was there!" But also you guys enjoyed the book enough that you wanted to sit down and talk with me for an hour and a half. So clearly, the book did its job. So that's really the type of work you want to put in.
And then also the one thing I always say, the number one writing advice of all time, is that character is choice. Character is not affectation. You know, Brittle is a character I hope you fall in love with not because she is a robot, but because of the choices she makes, the choices you first see. In fact, I love it when people read the first chapter and they're like, "Wait, she does what?" And the moment she does that, she's suddenly more interesting. At first you think she's this little angel of kindness going out to help someone and then you realize, no, she's collecting parts, and she's following this guy and trying to talk him down, promising him he's gonna be okay. Because in her mind, it's okay because I'm giving him the better death, he gets to shut down on his own terms instead of dying crazed in the desert. And that's how she justifies it. And the fact that she justifies it that way, all of a sudden, by the end of chapter one, you know if you're in that story or not with that character. So definitely make sure you focus on what your characters' choices are.
When stories aren't working, odds are it's because the character isn't making an interesting choice or isn't making a valid choice. That is the number one thing you need to think about. And so when you're talking about non-human characters, you need to think about every choice that non-human character makes. And does it make sense to the character? And is it the most interesting choice that character could make? And if you follow those and make those... have that character make those types of choices, you will overwhelmingly have a story that people will want to read.
DANIEL: Thank you for that advice. Yeah, I'm definitely gonna revisit, like, listen to it again. It was really good advice. I have a question about the future. So you mentioned, we mentioned Star Trek a couple of times. And Star Trek is kind of like where everyone's mind goes to when you think of like futuristic, fully automated socialist utopia. And then you have Sea of Rust, which is a very like, post-apocalypse, especially for humans, and then also the robots. So like, do you feel like humanity might reach Starfleet levels of utopia, or how do you feel like... what the path we're going on? Like do you have hope for us? Or do you feel like we're leading into Sea of Rust?
CARGILL: Me, I am ultimately an optimist. I believe that there is a good future for us. The one thing that I go back to whenever I think about the doom and gloom -- you know, we live in doom and gloom times. And that's because doom and gloom sells. The truth is humanity zigs when you think it's gonna zag. You know, 50 years ago, if you look at all the science fiction back there, it was overwhelmingly about one thing: overpopulation. And what are we finding out? Earth's not going to be overpopulated. The population is still going up a little bit, but in all the major industrialized countries, it's dropping. And why? Because the science fiction writers didn't internalize one thing: that the number one birth control in the world is educating women. And when women get educated, they get to make choices about their bodies, and a lot of them decide they don't want to pop out 19 kids. And all of a sudden, we see a stabilization in... in our, in our population.
And we also see a decline in population in a number of countries that are starting to see that. Japan and a lot of Europe are having... Russia especially is having a lot of problems with the population, because people aren't having enough babies to sustain, you know, what we have in terms of the elderly population, based on our current broken systems. But yeah, so but one of the things I always go to is, you know, there's estimations that during the ice age there were only 7,000 of us, because, you know, so much of the Earth was covered in ice, there weren't enough resources. And we may have been just in the thousands. And we came back. And the thing is that, you know, we may face some devastation in the coming decades, in the coming century. But I have a feeling that enough of us are going to survive, that we're going to figure out a way to do it, and we're going to live on in some way, shape or form. Will it be utopia? I think eventually we'll have something that approximates utopia, but I think utopia is... I mean, that's the struggle in my book is that I don't think utopia exists. What is it? Like in a true utopia, you have to cede all your free will. Because, you know, you have to remove aberration for there to be utopia, and aberration is what makes us special, for better and for worse. You know, serial killers are an aberration. We'd like to do away with that. Well, in order to completely do away with that, we need to sterilize people mentally, you know, from doing these things if we detect this in them. Well, is that really utopia?
You know, this is the big question that we saw in Minority Report, where if we can see the future and we can protect the future, we can prevent these crimes from happening. I think the biggest crime to that film is that they cut off the last line of that screenplay from the movie because they thought it was too harsh of an ending. But the end of Minority Report is literally "the next year there were 163 murders in the District of Columbia," because, you know, it was, hey, we're not going to... we're not going to prosecute people for future crimes anymore. But guess what, death is back. And murder is back on the table, boys. And that is very much something that, you know, we're going to struggle with. There will be no utopia. You know, in fact, that's the thing: we talked about Star Trek style utopia, but the genius of Star Trek is Earth as utopia, so we went into space, and guess what: all the great adventures of a non-utopia are still there. So we get the best of both worlds. So we get all the non-utopia, we're at war with the Klingons and the Romulans, while we get to be, "Yeah, but back home, everybody's fine." I don't think humans are capable of utopia. And I don't think it's possible for it to truly exist.
And I don't think... you know, Asimov wrote a great story about that. And I always forget what it's called. Because... I mean, maybe I do and I don't, because there's two titles to it and he hated the title. But one of the titles is Misbegotten Missionary. And it's all about these astronauts that go to a world where everything is peaceful and perfect, except the plants and all the animal life that live together, and it's all one homogeneous thing, are coming for them, trying to assimilate them. And what they don't realize is that this world is at a perfect equilibrium because it's all just one organism all feeding each other. The animals feed off the fruit that drops from the trees, and the... you know, and continue the population out there. But it doesn't have this fighting back and forth like our world does. It's all one organism kind of doing its own thing. But nothing has freedom. Nothing is of its own mind. Everything exists within this, and you get to exist, but this is what you get. This is what you eat, you have no choice. You have no autonomy. And I don't think that's utopia. So I think ultimately we're going to struggle with that the whole way. And it's gonna be really hard.
We're gonna have a lot of bad times because... you know, a permutation of a great quote by Winston Churchill. Winston Churchill once said democracy is... or he said, America always does the right thing after it's done everything else wrong. And, and I've always loved that quote because I think that's humanity. You know, humanity always ultimately does the right thing after we've exhausted every other option. And, and I think we will eventually come to something that, if we look into the future, if the three of us step through a time portal right now and we're 150 years in the future, we'd go, "This place is great!" And then we run into somebody who goes, "I know, right? It's just a perfect world. I'm so depressed. I wish I lived in your world full of adventure. I want to be a cop and chase robbers and go on adventures, like all the films that I watch." And we're like, "What the hell is going on here? This is, this is terrifying. And no, cops going and shooting people... yeah, we made movies about that, but that was never good when that happened. No, you're idealizing this horrible past," and thinking it's like the way we look at the Wild West. Like, "I would love to be, you know, an outlaw in the Wild West." No, you don't. It's a terrible place, and no, it's only fun for adventures.
So I don't know what that future would look like. I know that it's going to have its problems. And it's going to have things that are a lot better than we have now. And so but ultimately, I think humanity will survive as long as we can figure out how to get off this rock before eventually the sun blows up. So but what do we look like then? Are we entirely digital at that point? Or is humanity gone? But our essence lives on in these machines that we created and have propagated millions of years in the future? Maybe. But I think as our iteration, I think we're for the most part in the future going to survive.
DANIEL: Awesome.
SARA: That's, that's positive.
DANIEL: Yeah.
SARA: Yay. Go humans!
DANIEL: It's funny how you're talking about how people look back because I'm watching Gen Z looking at early 2000s fashion right now with very fond nostalgia. No, don't bring it back!
CARGILL: Yeah, no. No, I mean, there's some of it that would be nice to come back. You know, it's... it's, you know, it's always great to watch some fashions come back, but there's always that thing like, no, don't do it, man. You do not connect that chain to that wallet. My man, do not do it. Nobody needs to see where your wallet is. No one cares. And then again, I'm saying that and it's like, "Wallet, old man. What are you talking about? I've got it on the back of my phone. Like there's a little sleeve that I put my credit card in, what are you talking about?" But yeah, it is... it is interesting to watch how every generation like looks at things and goes, well that's [expletive]. I remember when my generation was like, "Bell bottoms rock!" And that was a mistake. But then chokers came back, because chokers come back every 20 years and chokers are amazing. I think chokers should always be in style. But then again, I grew up in the '90s, so what do I know? But yeah, it's... it's always fascinating to see how we look back at that stuff.
SARA: That could be a whole 'nother episode because I have a lot of things to say about '90s fashion, but we don't have the time for that. And we really appreciate you taking the time to talk to us today. Cargill, what's next for you? Do you have any upcoming projects that you want to share with our listeners? Or do we have any new books in the works?
CARGILL: Well, I'm making a movie right now that has a couple of big famous movie stars. I'm making another movie with Scott Derrickson, another crooked highway adventure. This one has, temporarily... we don't know if it's the final name, but it's called The Gorge. It stars Miles Teller and Anya Taylor-Joy. And we haven't really talked about what it's about. But if you like Scott Derrickson movies and C. Robert Cargill movies and books, then this is definitely up your alley. And I'm very excited about that. Working on it... something I haven't really talked about much is what I'm doing in my off time right now, in what little off time I have: I'm writing a Sea of Rust novella. I'm setting another adventure in there. I'm not done with that world. I love that world very much. And I'm playing around with something else. I'm not ready to quite say what it is because I haven't quite fully finished it. So I may throw it out and start again. But... but I'm working on that right now. And that's coming up. And then Scott and I just kind of dicked around last year and made a cool short film that is part of the V/H/S series, and V/H/S/85 comes out later this year. The V/H/S series, for those of you that don't know, is a collection of short films by horror filmmakers that are all found footage. And this one is set in the year 1985. And it's all tapes that could have happened in 1985. And Scott and I made a short film called Dreamkill that we're really proud of. It's a lot of fun. We made a gnarly, Friday night midnight movie type of thing that no studio would ever let us make, on a very shoestring budget, and it was us getting back to our roots. And we're really proud of that. So you can, you can see that later this year.
DANIEL: I love the V/H/S series. I'm excited for that.
CARGILL: V/H/S/85. I've seen several of them. We're very proud of it, it should be a lot of fun. So that I think is premiering late fall and will be on Shudder, and probably available for rental if you don't have Shudder. And yeah, so that's pretty much what I've got going on that I'm allowed to talk about at the moment. But we got lots of things in the hopper. And we're really, really excited for the next few years.
SARA: Awesome.
DANIEL: Awesome.
SARA: That all sounds great. We can't wait to see. And honestly, I'm just excited to read the follow-up to Sea of Rust, Day Zero, which I hope to get my hands on really quickly. Because I think your nanny bot is like a cat? Is that what the cover looks like? It's like a cat that has smell.
CARGILL: He's a tiger.
SARA: There are no animals in your book.
CARGILL: The... the basic pitch of Day Zero is it's a nanny bot, an anthropomorphic tiger, who has to decide whether he wants to join the robot revolution and fight for his freedom, or protect the boy that he's been programmed to love. And people have been like, "Wait, is this Calvin and Hobbes at the end of the world?" And I'm like, yeah, it absolutely is. 100 percent.
[LAUGHTER]
CARGILL: So if you dug Sea of Rust and like its optimistic nihilism, and you like robot tigers, and you like Calvin and Hobbes, there are people that prefer that -- not prefer, but you know, think Day Zero is the better book. And I love that people don't agree on that. Because whenever you write a book that people are like, "Oh, no, I still like this one better," it's like, great! I've done my job. People, people like one over the other but like them both. So if you dug Sea of Rust, hopefully you'll dig Day Zero and then will want to seek out this novella that I'm writing.
SARA: Awesome.
DANIEL: Awesome.
SARA: Can't wait. That sounds so good. Okay. Well, thank you so much for taking time with us today. We had a really great time talking to you. I mean, I can't speak for Daniel.
DANIEL: I had a great time. Yeah, it was, it was really cool to meet you. Thanks for taking time. And yeah, it was awesome talking with you.
SARA: Gave me a lot to think about with related like, with relation to the book, because, yeah, I mean, if we could just deep dive with all of the authors of the books that we read --
DANIEL: Yeah.
SARA: I would really, I would change my whole outlook on life.
DANIEL: Yeah, definitely it's getting a re-read, I'm gonna re-read it soon.
CARGILL: Guys, thank you for letting me do this. It's always, it's always a thrill to be able to do this, because you don't always get to have that conversation. And what I'm finding is these podcasts and these things that we're recording are sticking around online for years and people are able to find them later. And that, you know, I've been able to go back and revisit several of my friends just by going back to a conversation recorded seven years ago while I was promoting a movie, and it's two hours of me and two friends in a room drinking whiskey having a good time. But like here, getting to talk about this book and having readers be able to go, "I wonder what Cargill meant by that," and then hearing me be able to talk about it. So thank you for giving me that opportunity to talk to some readers and give me something I could share with fans of the book that want to hear more about it. And thank you for turning people on to it. Librarians do an amazing, amazing job turning people on to books. We do not really exist without them. And so thank you guys for the service that you do, the job that you do and the importance of it, and then choosing my book to be one of those books to share with people. Thank you very much for being alive.
[COMMERCIAL BREAK]
VOICEOVER: The Wichita Public Library has entered the world of streaming with Kanopy. Library cardholders can now access Kanopy's massive collection of movies, TV shows and educational content from The Great Courses. There's content for children, adults, and foreign language options too. Cardholders get seven free checkouts a month. And Kanopy is constantly updating their library so there's always something new to watch. To find out more, please visit wichitalibrary.org/kanopy -- K-A-N-O-P-Y, Kanopy. Just one of the many services provided to you from the Wichita Public Library.
JENNY, VOICEOVER: Here are some reading recommendations for category one, a book with a non-human narrator, from our community of readers in the ReadICT Facebook group. To join in on the fun, log in to Facebook, search for the group #ReadICT challenge and click join. You can also find more reading recommendations for this and other categories by visiting wichitalibrary.org/readict. Want to submit your own reading recommendation to be featured on the podcast? Call our book review hotline at (316) 261-8507. Leaving a review is easy. Just state your name, location if you are outside Wichita, category your recommendation is for, title and author of the book, and a brief reason why you recommend this book.
IAN, VOICEOVER: Watership Down by Richard Adams. I chose a book I never got around to reading in my youth, Watership Down by Richard Adams. I quite enjoyed it. As tends to be my habit these days, I listened to the audiobook, which was well done. After this book, I have an even greater adoration of the bunnies that come and eat in my backyard.
Fox 8 by George Saunders. I recently read Fox 8 for the non-human narrator category. It was written very phonetically, which was actually kind of fun to read. It's also very short: 48 pages. Fox 8 tells the story of his family and friends and how shocked and saddened they are to lose their homes to human development.
The Magic Strings of Frankie Presto by Mitch Albom. I have a recommendation for category one, a non-human narrator book that is narrated by music. This is one of my very favorite books, as it has so many elements that leave you thinking about it for a long time.
The Miraculous Journey of Edward Tulane by Kate DiCamillo. If you've got kids doing the challenge, or you're like me and prefer the adventure of a kid's book over the mundane adult ones, one suggestion for non-human would be The Miraculous Journey of Edward Tulane by Kate DiCamillo. It's told from the perspective of a very unlucky toy rabbit.
Dog On It by Spencer Quinn. Dog On It by Spencer Quinn turned out to have a fun dog narrator and a private investigator human. When a book from my favorite genre fits the challenge, it's a bonus for me.
Flush by Virginia Woolf. I chose this book because it's told from the perspective of the greatest, cutest, most noble type of animal ever to live: the cocker spaniel. This is probably a good time to point out that this review is definitely not being dictated to me by my cocker spaniel under the threat of duress, withholding of affection and obedience. Flush is a cocker spaniel who has been immortalized in verse by his owner, Elizabeth Barrett Browning. This biography is ultimately about the love and affection dogs have for their humans, despite humans often failing to be worthy of that love. Overall, this was an at times lovely and at times harrowing book about one dog's life, with a few insights into his famous human.
SARA, VOICEOVER: Wow, who knew a robot apocalypse would give us so much to talk about?
DANIEL, VOICEOVER: I know, right? It actually really made me like hyped up to read the sequel.
SARA, VOICEOVER: Yeah and honestly, like, I might go back and reread certain parts because now I'm thinking about it in a completely different way.
DANIEL, VOICEOVER: Yeah, especially like all he was talking about, like them not having memories. So definitely, I'm gonna check it out, too.
SARA, VOICEOVER: Also, I really want to read about the nanny robot. [LAUGHS]
DANIEL, VOICEOVER: Yeah, some kind of like anthropomorphic tiger robot. They're grrreat!
All right, so let's, I guess this is the end of the show. So let's read the credits, but we're gonna do it like in a robot voice.
SARA, VOICEOVER, AS A ROBOT: A list of the books discussed in today's episode can be found in the accompanying show notes. To request any of the books heard about in today's episode, visit wichitalibrary.org or call (316) 261-8500.
DANIEL, VOICEOVER, AS A ROBOT: Thank you to C. Robert Cargill for talking with us today, even with a six hour time difference. We'd also like to thank those of you who shared recommendations for category one.
SARA, VOICEOVER: I can't do that anymore. It's really hard.
[LAUGHTER]
SARA, VOICEOVER: This has been a production of the Wichita Public Library and a big thanks goes out to our production crew and podcast team.
DANIEL, VOICEOVER: To participate in the ReadICT Reading Challenge, please visit wichitalibrary.org/readict. Stay connected with other [CHUCKLING] ReadICT participants on the ReadICT challenge Facebook page. Find out what's trending near you, post book reviews, look for local and virtual events, and share book humor with like-minded folks. To join the group, search #ReadICT challenge on Facebook and click join.
SARA, VOICEOVER: See now, if you were a robot, that would have been perfect and really boring.
DANIEL, VOICEOVER: [CHUCKLING] Good point.
SARA, VOICEOVER: And don't forget to log your books in the reading tracker app, Beanstack. Each month you log a book in the challenge, you're eligible to win fun prizes. If you need any assistance signing up or logging books, give us a call, reach us on chat, or stop by your nearest branch.
DANIEL, VOICEOVER: You can follow this podcast through the Anchor app or stream episodes on whatever platform you listen to podcasts on. If you like what you heard today, be sure to subscribe and share with all your friends. Have a good day!
SARA, VOICEOVER: Bye!