Angry Planet - The Robot Revolution Is Already Here

Episode Date: July 3, 2020

This week we're joined by P.W. Singer, co-author of Burn-In: A Novel of the Real Robotic Revolution. The future is here, it's just not evenly distributed. Stray thoughts on the Bonus Army. The future of policing might be AI, and that's terrifying. Domestic terrorism is about to get even weirder than it already is. War has already changed. You can listen to War College on iTunes, Stitcher, Google Play or follow our RSS directly. Our website is warcollegepodcast.com. You can reach us on our Facebook page: https://www.facebook.com/warcollegepodcast/; and on Twitter: @War_College. Support this show: http://supporter.acast.com/warcollege.

Transcript
Starting point is 00:00:00 Love this podcast. Support this show through the ACAST supporter feature. It's up to you how much you give, and there's no regular commitment. Just click the link in the show description to support now. We have all of these sensors out there that are gathering data. We've got more unmanned systems, drones. We've got more people on email pushing me things from databases. I'm getting all this data being pushed to me. Isn't it great? No, actually it's the opposite. It's info overload, right? You're listening to War College, a weekly podcast that brings you the stories from behind the front lines. Here are your hosts. Hello and welcome to War College. I am your host, Matthew Gault. Today we are talking about a new kind of science fiction book, one that has lots of citations. The author of that book, P.W. Singer, or should I say co-author of that book, P.W. Singer, is here with us today.
Starting point is 00:01:22 Sir, thank you so much for joining us. Oh, I appreciate you having me on. Right. So the book is Burn-In: A Novel of the Real Robotic Revolution. What is Burn-In about? So Burn-In is a new kind of book. It's a cross of novel and nonfiction. Now, I know that sounds a little bit different, but what it is is on one hand, it's a techno thriller. You follow a veteran-turned-FBI agent who is hunting down a new kind of terrorist through the streets of Washington, D.C., of the future. And through that hunt, you get to see all the ways that, you know, D.C. will be changed and other ways it will be the same.
Starting point is 00:02:03 You follow them everywhere from Union Station to the White House to Starbucks, but you also, you know, not to plot spoil, follow them into a firefight and the like. But baked into the story are actually over 300 explanations. and predictions drawn from the real world with actually 27 pages of end notes in the back to document, hey, this is really from research. And so it might be a micro detail that you're documenting the way a certain drone looks. Hey, that's not what Singer and August Cole, who I worked with in Ghostfleets, the co-author on it. It's not what these guys dreamed up.
Starting point is 00:02:50 Here's the patent for it from Amazon. or it might be a certain type of weapon, the next sniper rifle. Wow, that sounds sci-fite. Nope, here's actually where it's from. Or it might be a macro issue that it's documenting an issue like what is algorithm bias or how does an AI work. And so the idea of the book is that you get the entertainment, you get the fun. And for a lot of people, that's going to be it.
Starting point is 00:03:16 And that's great. But for other people, it allows them to learn about these issues that are so crucial. Okay, what is AI? What's next with robotics? How are we going to use this stuff from science fiction in our real world? Most importantly, what are going to be the dilemmas and questions that we face, whether it's for someone out on the pointy end of the spear, you know, tactically deployed or, you know, even what are the questions they're going to face in their home with their kids? But also, what are the larger questions that we're going to have to face in terms of how it's going to shake up our politics, our security. whether it's national security or cybersecurity. So I'm a parent. I jokingly think about it as sneaking fruit and veggies into the smoothie. And this is the second book that you and August Cole have written this way, right? Ghost Fleet was the first one wildly popular, especially in the military community.
Starting point is 00:04:15 Is there something, and you kind of have, can I describe it as a think tank background? Is that accurate? Yeah, yeah. Okay. You kind of have a think tank background, trying to learn different ways to communicate with people in Washington, especially the military. Was there a conscious decision to use fiction to kind of get points across to them at some point? Like, how did that come about? So, Vernon draws from the experience that we had with Ghost Fleet, but takes it a little bit further.
Starting point is 00:04:48 So the origin of Ghost Fleet was really August. and I wanting to create for people the experience that we had had as kids reading early Tom Clancy, specifically Red Storm Rising. Both of us, we didn't know each other as kids, but we had similar experiences where we both read books like that. For me, it was in the back of my mom's station wagon on the way to beach vacation. It was back when you didn't have to wear seatbelts and all that kind of stuff. And so that was our original was, you know, we wanted to recreate that, that fun feel of early Tom Clancy. And yet what happened with Ghost Fleet is that, you know, it obviously struck a chord for a lot of people in terms of the fun. But within the military
Starting point is 00:05:39 community, it went a little bit further. We were invited to read the read not just sort of what's the fun it but hey what are the real world lessons out of this um and we ended up being asked to brief it uh we've counted actually over 75 different organizations and units uh you know it's everywhere from the white house situation room to the tank uh the the joint chiefs of staff meeting room inside the pentagon to um out at units uh you know 80 second airborne uh j sock pacific fleet you name it, all of these groups saying, you know, hey, this wasn't just entertaining. It was really useful to us. The Navy even named a $3.6 billion program, Ghost Fleet.
Starting point is 00:06:28 And so we hit upon this idea of what we call useful fiction, that it could be entertaining, but it could also be useful at the same time. August actually coined the term ficent, where if you think about it as human, you know, human intelligence or sigint, signals intelligence. Do you have these different tools out there of collection, of analysis that allow explanation, sometimes even prediction, that Fickett is a way of combining real-world research. So it's not pie in the sky, sci-fi, but taking that and putting it in the narrative, and that what that allows you to do is to get something more out of it.
Starting point is 00:07:09 So we had this experience with Ghost Fleet, and we were struck by it. We said, okay, let's take that further. And so with Burnin, from the very start, it was this idea of serving these two masters. Give people an entertaining read. We're just really excited to introduce the story, but also the main characters. I think people are going to really dig it. As a creator, you want people to, you want to share that excitement. But also, again, trying to explain things that we think people really do.
Starting point is 00:07:44 need to know. And not just me saying, oh, I think it's important to understand this. AI is at the center of everything from the new U.S. military, the national defense strategy. It's at the core of it. Oh, by the way, it's also in the Chinese plan. The China said we want to be the world leader in this space by 2030. And so, you know, you have all of this importance of it and yet you have a disconnect. They did one survey. And they found that, um, Only 17% of leaders self-reported that they even had a familiarity with AI and robotics, let alone the implications of it. And you know, you and I know that when people self-report, they're probably, you know,
Starting point is 00:08:30 overestimating. So the fact that it's only 17% with that idea of overestimating shows you've got a pretty big disconnect there. And so with Burnin, we blended it in from the very start, that conscious goal. and it might be, you know, a certain amazing item or vulnerability that we came across. It's not just, hey, all the technology is real. We've got the footnote to back it up. Even the attacks that take place in the story are drawn from the real world.
Starting point is 00:09:04 So I don't want to plot spoil too much. But, for example, the Internet of Things is the way the Internet's changing. It's changing the battlefield too. They call it the battlefield of things. All this massive amount of networking that's out there. But it also introduces new vulnerabilities that adversaries might go after. It changes the nature of cyber threats. It's not just about someone stealing secrets from you, stealing a battle plan, stealing a jet fighter design, stealing your email.
Starting point is 00:09:32 It's about creating kinetic change in the world, turning it truly into a weapon that causes physical damage. And in Israel recently, in the real world, someone went after water treatment systems to change the chlorine level. So they weren't stealing secrets from the water plant. They were changing the water coming out of it was the goal. So again, don't want to plot spoil. But for those listeners who have, well, if you live in the Washington, D.C. area, but this is true from anyone around the world. If you think that the little small towns up river,
Starting point is 00:10:16 so for the Washington, D.C., it'd be upriver and Potomac, little small towns in rural Maryland and Virginia. If you think they've got better cybersecurity on their water treatment plants than the Israeli government does, I got some bad news for you. And so you can sort of then in the fiction world play that and go, ooh, not just could this happen, but what would be the consequences of that?
Starting point is 00:10:38 And so it's that conscious weaving in or it might be a complex idea that you can make more real through the fiction. So something that I know a lot of people on the military side are starting to feel is that, hey, we've, we have all of these sensors out there that are gathering data. We've got more unmanned systems, drones. we've got more people on email pushing me things from databases i'm getting all this data being pushed to me isn't it great no actually it's the opposite it's info overload right and then there's another part i feel this every day too you know yeah yeah yeah so you feel it in everything you feel it um soldiers out in the field feel it you feel it when you wake up in the morning um and so uh there's a scene very early on in the book um this is what what you do with the fiction you
Starting point is 00:11:34 In nonfiction, you tell people the end of it, the bottom line up front. With fiction, you only choose from the early scenes to not ruin the story. So our main character goes into Union Station in Washington, D.C. and is trying to pick a terrorist out of a crowd. So think about what I just depicted there. Everyone listening to that can both kind of, you know, immediately visualize that. Hopefully their heart races just a little bit faster. Ooh, what an interesting scenario.
Starting point is 00:12:04 But the challenge is not just picking the terrace out of the crowd. They're getting all this data flooding them. They're getting everything from face recognition of everyone in the crowd that then matches up to their history, who has a criminal record. All this data. And at a certain point, it's too much. So it's not all that we bank on. But then there's a next twist that we're all starting to feel too. The way we're trying to answer that day.
Starting point is 00:12:34 data overload problem, again, whether it's for the soldier out in the field or it's for when you go shopping, is an AI sifts through all that data and gives you recommendations. And it might be on the military side, recommendations of which route to take. On the civilian side, same thing. There's both military and civilian versions of this a ways map, or maybe it's what to buy. But all these recommendations suffer from what we call algorithmic bias. It might steer you slightly the wrong way. It might do it unintentionally.
Starting point is 00:13:16 Bad data was entered in. The AI was designed the wrong way. Or maybe it steers you the wrong way a little bit intentionally. Hey, this route you have to go. You should go there because, oh, by the way, a restaurant paid us a little bit to make you go by it. So everything I just said, yeah. And oh, by the way, here's a pop-up of a sale just for you, Peter, because it's also going to start to be personalized. And so the point of this is that everything I just communicated hopefully made it feel a little bit more real.
Starting point is 00:13:46 And when you get the book, you get dropped into that. But for most people, they're not going to read an academic paper on algorithmic bias and understand how it works. but through taking that research and dropping it in the fiction, hopefully you get sort of the best of both worlds. You get not pie in the sky sci-fi. It's grounded. It makes your hopefully heart race a little bit more because it's playing in a reality that's familiar. But also you walk away from it learning about something that is important right now.
Starting point is 00:14:24 Let's talk about artificial intelligence specifically. is that a lot of the book turns around on it, machine learning, you know, algorithms, whatever we want to call it, although those are all separate different things. But anyway, what is the threat here? And I'm wondering if you've read that Henry Kissinger piece from 2018 and what your thoughts are on it. And if you were thinking about it at all as you were writing this. The short version is I've read it, but I was not thinking about it. There that's fair. I have apologies, but, you know, the Kissinger, or, you know, I don't know if this is a feature or a flaw, but Kissinger is not a major influence on Burnin book. I would say that's a selling point. Yeah. We'll leave it there.
Starting point is 00:15:18 So what, I think, you know, maybe I'll frame it this way. We are on the 100-year anniversary. of the creation of the word robot. I mean, that's kind of cool. You know, that's part of the concept of this. A hundred years ago, literally the word robot was created. It was created for what we would call a science fiction play called R-U-R. And the playwright took an older check word and sort of remade it.
Starting point is 00:15:51 And it was a check word for serf or servitude. And in his story, he had this concept. of mechanical servants that wise up and then rise up. And so ever since that first sci-fi treatment of the word robot, we've thought about the idea of a robot revolt. You know, it runs through the Terminator movies, the Matrix, you name it. And it doesn't just shape the sci-fi. It shapes the real world.
Starting point is 00:16:27 the concept of killer robots is something that has been debated everywhere from the United Nations to it consumes energy inside, you know, military law circles to, you know, there's scads of articles written on it, to, you know, the fear of existential threat. You've had over $5 billion spent on research and development into existential threat from AI and robots. There may one day be a revolt of the machines and, you know, we'll have to figure out, you know, when do we salute our metal masters or whatnot. But in your my lifetime, it's actually not the robot revolt we have to think about.
Starting point is 00:17:15 It's the robotics revolution. The part that we're not just going to live through, but already starting to live through, is an industrial revolution. of a new kind of tool that's reshaping everything from business to as a result, politics, to as a result, parts of society, even down to it's affecting home life in some situations. And of course, just like past industrial revolutions, it also affects war. And again, it might be the tools of war, the technologies, the tactics, all the way up to the doctrine all the way up to the ideologies that are out there. I mean, think back to the last industrial revolution. You know, it was, you know, mechanization. It was a story that, you know,
Starting point is 00:18:06 had economic winners and losers. And they were economic winners and losers in terms of workers, in terms of, you know, a whole new generation of industrialists, as we would call them, who became, you know, the kingpins, winners and losers in terms of regions, even nations. It's part of the story of the United States becoming a great power. Winners and losers that results in political shifts. It's hard to tell, I would say impossible to tell the story of, for example, the American Civil War without talking about the Industrial Revolution. And again, it's everything from how, you know, it affects these old, they called them literally the compromises. the compromises between the slave-owning South and the North become more afraid as the North begins to industrialize and move ahead. Oh, by the way, it's also part of the story of how the North wins the Civil War.
Starting point is 00:19:04 It's the Industrial North, you know, is able to bring such greater power to bear. It's part of the story of the ideologies of fascism and communism that, of course, become wrapped up within World War I and World War II and the Cold War. So, you know, again, think about all to know, maybe we just want to talk about tactics. You know, think about how individual tactics for military units change as they move from horses to mechanization. Okay, that all played out in the last industrial revolution. You don't think it's going to happen in the next one. And oh, by the way, we're already feeling that because this tool, you know, again,
Starting point is 00:19:41 is, it's like any other tool, but it's a little bit different in that it's an intelligent tool. And you might see it in how we're having more autonomous robotics out there, how, you know, our unmanned systems have moved from being completely remote control to able to do a little bit more on their own. It's changing the, even the manned systems that we're in, whether it's your car or the Navy ship out there that has one-tenth the number of crew because so many of the systems or robotic, to even more so it kind of weaves into everything around you like electricity did. So for military people, it's the way the command post is giving, you know, recommendations to them.
Starting point is 00:20:27 For civilians, it's the games that your kids play with, the toys that you play with. So that part, that industrial revolution of what it's like to live through, that's really what we're trying to wrestle with in this book. and give people the feel of, okay, what does it look like as we move the dial forward, not one week, not one month, a year, 10 years. Yeah, that's really interesting. I had a thought as you were explaining all of that, that we are, like you said, going through an industrial revolution right now.
Starting point is 00:21:09 And one of the only ways that we've been able to kind of articulate our anxiety about that is by taking those old stories of our anxieties and fears from the previous industrial revolution and kind of updating them, which is the way I see like a lot of robot uprising, like I Robot Matrix kind of fiction is. It's kind of still living in with those old ideas, not quite understanding what the future is really going to look like. And this is one of the first books that I've really seen, I think probably because it's so. thoroughly researched and technically based. We're really games out, like, how weird and different the future is going to be from what we all thought it was going to be and how hard, right? So one of the characters that I think really kind of takes center stage and that really kind of articulates these anxieties and speaks to this kind of autonomous revolution is TAMS. So who is Tams? And why was it important to include this character in the book?
Starting point is 00:22:17 So it's interesting. What is? Yeah. And that actually illustrates the fun, but also the hopefully informative side of it. And I think you hit it really well in terms of part of why it feels different and weird is because it's actually. all drawn from the real world. So it feels different when it's these real world places, be it a Starbucks, be it the national mall,
Starting point is 00:22:51 be it someone's home, that you see how they are in some ways the same. In some ways, they're subtly changed by this new technology. And in other ways, there's, you know, full in your face visible changes. And of course, it's the same thing, whether it's the features of, again, a coffee shop or your home or a military unit deploying out in the field. But it's also when you talk about the politics, the business, again, there's some things that will never change. You know, there's some things about war, of course, that will never change no matter what technology that you have.
Starting point is 00:23:34 And then there'll be these subtle shifts and then there'll be these things that seem so different. and the fact that, you know, they're all taking place in the familiar, I think, really drives that home. And that's what TAMS is as part of that. So TAMS, short for tactical autonomous mobility system, you know, military, we've got to have an acronym. And TAMS is essentially what happens when you play forward the real technologies of today. You play forward on the software side, the decision aids. There's military versions of this, but for a lot of people, it's going to be their Siri or their Alexa. Play that forward.
Starting point is 00:24:15 What comes after that? And then on the physical side, it's, again, play forward. Take what's in prototype stage right now and move that forward. So, for example, a lot of people have probably seen the really cool YouTube videos of like Boston Dynamics, robots doing parkour. And remember, those are like for a lot of people, the most popular ones are from, you know, 2017. Say, okay, hold it.
Starting point is 00:24:46 If that was in 2017, you know, what is it like in 2023? What is it like in 2030? So play those forward. And that's TAMS. It's basically what comes next. TAMS is a technology that our main character, Kegan, is assigned to do a burn-in on. And the title of the book, Burnin, is actually a play on a concept.
Starting point is 00:25:15 A burn-in is a real term from engineering of when you push a technology to the breaking point in order to learn from its failure. So, for example, when you take a new watch underwater to see, okay, how deep can it really go? So Keegan is assigned to work with TAMS. And through that, we get to show this incredibly important thing for all our future, you know, whether you're in the military or whether you're at home. What is human machine teaming? What does that relationship look like? What does the machine do well?
Starting point is 00:25:52 What does it not do well? But also, how does it affect the humans? And you and I and the readers and our main character, Keegan, all fall into something that's very much at the heart of the world. this, which is that Tams is a technology. We never say it is anything other than that. And yet, everyone treats it as a character. And Kagan, who's a veteran, who's, you know, used robotic systems out in the Middle East, knows that you shouldn't do that. And again, this is drawn from the real world, for example, units in Afghanistan when their Pachbot, which is a little robot that was used to find roadside bombs, when it would get blown up, they would give it a funeral.
Starting point is 00:26:41 In Iraq, Pachbot got stuck in the mud, and someone, a soldier, ran out under heavy machine gunfire to rescue their Pachbop. That's systems back in the mid-2000s. that remote controlled, didn't look like a person, didn't talk to you. You know, so how do we, how do we emotionally connect, react to ones that can talk to you, that are more autonomous? We can't help ourselves. And look, I've already done this, for example, to my Alexa. I'll admit it.
Starting point is 00:27:18 I've yelled at Alexa. I don't yell at my stove. I yelled at Alexa when it didn't do something exactly that I wanted. I was having a bad coronavirus day. And so I just, it again is this way of, you get to play in science fiction, this notion of, you know, human and machine and human putting emotions onto machine.
Starting point is 00:27:43 I mean, you get the, it's sci-fi. It also has, you know, a little bit of that, that buddy cop element to it. But it's a real world issue that we got to think about. And again, how will we connect and use our machines to the people around us, whether it's a bad guy, are they going to take advantage of that, to our kids? I think the way that you've written Tams is very interesting.
Starting point is 00:28:12 And the relationship between Tams, such as it is, between Tams and Keegan is very interesting. Because especially at the beginning, Kegan is kind of constantly trying to put responsibility for Tams's behavior onto it. and Modi, who is the, what did you describe him, like Tams's handler, is always quick to tell her anything that goes wrong with this robot is on you because you're the one that's teaching it. It is only taking, you know, it's a collection of the inputs that you have given it. And I think that's not something I typically see in like this buddy cop outsider scenario, right? Typically the robot partner would have much more personality. Not saying that Tams doesn't, but that it's just, it's written in a way that like, these are the kind of robots that I see coming up.
Starting point is 00:29:08 Like, this is the kind of robot that makes sense to me. And there's some other great early scenes with Tams where she puts it through its paces. Can you kind of talk about those and like how it reacts to? what we call them, uh, training stress. Yeah, yeah. And what it gets after is a couple of things is, um,
Starting point is 00:29:33 you know, we always get new technologies and tools, you know, whether it was, um, someone swapping a farmer, swapping a shovel for the hammer at the factory assembly line or for, um,
Starting point is 00:29:45 you know, someone in war swapping the spear for the musket or now, you know, the, the, the M4 for the keyboard of cyber weapons. Um, always get that kind of swap. The difference now is that tool is intelligent. And that means a couple of
Starting point is 00:30:03 things that we play with. One, as you know, is that it's always learning. It's always getting slightly better from its continual updates and upgrades, but it's also learning because it's always watching and watching you and reacting to you. And so, you get everything from hold it. Am I actually training my replacement? Because it wasn't as good as me at the start, but it's getting better. I just fixed this philosophy, fix that. And you know, you see the, that thing in the back of people's head everywhere from, well, you know, obviously people in the Air Force have vibed around that, around, you know, what's played in the different generations of unmanned aerial. systems that have gotten smarter and smarter and you need less people controlling them, to radio DJs actually trained the AI that have already replaced them. I'm not talking year 2030 happened in 2019. And so you've got that.
Starting point is 00:31:15 There's a great Simpsons bit about that. Have you ever? Yeah. I mean, that's what so far is all of this is like drawn from either like really dark sci-fi. or like humorous stuff and yet we're living it right now. But then the other part to give you that kind of dark, funny thing is that idea of something that's always watching and learning from you. That's what it's like as a parent with your kids, right?
Starting point is 00:31:38 And, you know, you try and do the best, but then you sometimes screw up and then you're like, hold it. And so, you know, I think of that 1980s, you know, it was meant to be a dark commercial, but we all make fun of it. The, I learned it from watching you, dad, right? That's us with our machines right now. too. And so you get that with, you know, Kegan and Tams throughout all of it. But then there's another part of intelligent machines that, as you mentioned, the scene early on where, so she's a veteran.
Starting point is 00:32:07 Before she's going to go out with this system, she wants to put it through its paces. She wants to see, you know, what works, what doesn't. So she takes it to a training ground that she'd been through and has it go through that. And you get all the like, okay, here's what it's, good at. Okay, here's what it's not good at. Here's how it's learning. But one of the other things with intelligent machines is they call it the black box problem. And what it is is that they work in ways that not only we can't understand, but also they can't communicate to us. The value of that is that, you know, if we could understand what the machine was doing, we could do it ourselves. We wouldn't need it. And so it generates improvements that we and it don't understand. The real world version
Starting point is 00:33:02 of this that people might is Google, Google auto translate. Every so often, they'll pop out a new version and they'll say, this one is better. We're not really sure why it's better, but it's better. And so it feels very sci-fi, but that's what it is. And so you get these moments in that training scene. It's not just that we get to see kind of what robots are good at, they're not good at, but also you get this idea of when it improves, when it does something, it can't always explain why it did it, even if it's a little bit better. And pull back and think about everything from like, hold it. Ooh, that's creepy.
Starting point is 00:33:42 That's cool. For people interested in like, you know, business or military, how do you best buy and use something that you know is kind of. better, but you can't explain why. The scariest part of that to me is that the machine lacks the ability to communicate its improvements to us. That, I mean, that's, I just, I have, can I, can I, can I, can I add to that? It's, it's, it's what it's thinks its improvements are.
Starting point is 00:34:19 And, you know, and that's, again, that's also part of the, again, that's what they call the explainability problem is that go back to what I said before about. algorithmic bias. It thinks, and maybe the programmers think this is an improved. This is better. But one, it might not actually be. It might be a yield a different kind of bias. So, you know, we're going to continue to have fun with this. You'll get for fans of the TV show, The Office, they'll remember when Michael and Dwight were in the car with the GPS and they, you know, followed what the GPS told them to do. And they drove it right. into a lake. You know, that's an example of this, you know, you think it's better, but maybe it's not.
Starting point is 00:35:02 The darker version of this in the real world is that they've used AI to, already, to screen out everything from who should get bank loans or not to certain kinds of medical treatment for heart conditions. And in these situations, in these two different cases, they yielded actually racist recommendations. It was, for example, screening out African Americans from the list of who should get a bank loan. No one told the AI to be racist, but it was. And so, you know, you have to be mindful of that in the real world. And again, we've, we've, you know, this is not made up. This is, this is real stuff. But then you also have, we're people. And each of us has different thoughts on what we think is best for us in our own situation. So, for example, you know, this sort of gives you a hint at
Starting point is 00:35:58 Kagan, our main character, the first time the robotic system, she meets Tams, like how it's been programmed and how it's learned from all of these different iterations, like with your Siri or your Alexa right now, it speaks to her in a voice. And as I said, Keith, She's a she, a little bit different than most techno thrillers. She's like, no, uh-uh, that's not, you're not going to do this, this gender stuff with me, right? You know, think of how we, we took a technology, a tool, Alexa, et cetera, and made it, you know, an observant sounding woman. Kegan, you know, veteran, FBI agent, mom of a kid, she's like, no, no, no, no, reprogram. That's not what, you know, you may think that's what's the best.
Starting point is 00:36:50 that's not how I go. Or she's a veteran. It sometimes gives slightly long, over-polite answers because that's the way people in one situation wanted. She's like, no, no, no, no, we need brevity. You know, if I'm in the middle of a firefight, I need quick answers. So she's constantly reprogramming it for her own utility. We are going to pause there for a break. We are on with Peter Singer talking about his new book, Burnin. Welcome back to the show War College. listeners, we are on with Peter Singer talking about his new book, Burnin. Let's talk about, let's switch tracks a little bit and talk about threats and the threats that you present in the book. Who is your antagonist and what do you see as the battle spaces of the future, which I guess
Starting point is 00:37:42 burn in is more of a domestic threat experience, right? Ghost fleet is kind of the big picture military threat. But Burnin is very much about domestic politics and domestic conflicts. So what do you Yeah. So if you think of the future, sorry. Yeah. If you think of Ghost Fleet was the war abroad, Vernon is the war at home. And, you know, again, not plot spill too much, but you have sort of three layers of bad guys, so to speak. One, you know, one. One, you have criminality, which, you know, we may be in a world of robots and AI, but, you know, guess what? They're still going to have criminals. Sorry to pop the bubble for people. And the driving forces will be all the same forces that are out there. You'll still have, you know,
Starting point is 00:38:39 inequality. You'll still have greed, et cetera. So what does it look like in a world where it's being shaped by these forces, but also criminals are using some of these new technologies too? And that's, you know, theme number one real world side, which is the, this new wave of technology, it has, it may be super advanced, but has low barriers to entry. It'll be widely proliferated. And so again, you've got that. Second layer is the strange coalitions that are starting to come together of different groups, extremists, terrorists, etc., who, might have what seemed like wildly different ideologies, but we're seeing them cooperate more and more. Because in a sense, they all share the same enemy.
Starting point is 00:39:37 They share the same enemy of change of the future, but also a future that has certain values packed into it. Real world side, we've seen, for example, neo-Nazis collaborating with far-left anti-vaxxers and Russian government information warriors. That's in the real world. That's playing out right now. And in the book, we similarly have a network of extremist terrorists who are up to no good. And then the final theme that we're seeing out there, again, on the real world security side that we play with in the book, is that because of these trends, you also get what are called super empowered individuals. And this is not, I need to be clear. Again, you know, the rule of fiction, but with, you know, research is that this is not a story of superpowers, you know, like you put on a gauntlet and you beat everything.
Starting point is 00:40:45 you know like an Avengers or whatever know what we're talking about is the idea that individuals now have access to threats to power that even governments didn't have a generation back so excuse me in particular the what's driving this is actually how the internet is changing The internet was once about communication between people. Now it's more and more about the operation of our systems. You know, we have connected cars. We have connected houses. We have connected power grids for people in the military, you know, connected devices that range from missiles to the power systems on a base.
Starting point is 00:41:33 You name it. GPS, everything connected. The problem is that we're making all of the same mistakes that we did with. security on the original internet. We're not baking security in. And so it creates these vulnerabilities that allow individuals to carry out attacks that the Soviet Union couldn't have even dreamed of back in the day. And so not to plot spoil too much, but what if someone could create versions in the real world using cyber means of the biblical plagues, the 10 biblical plagues.
Starting point is 00:42:13 Not dreamed up because guess what? We've got the end notes to show how each of this might be possible. And so these are the layers of bad guyness that is out there. But one final twist on all of this, one of the main characters who's at this, I mean, you've got some, you know, I would call them sort of bad bad guys to sound like, you know, Donald Rumsfeld and sort of a weird, you know, the layer of adjective upon adjective. But what I'm getting at is that you also have, because of all these changes, there'll be certain bad guys that you might be a little bit empathetic towards their ideology.
Starting point is 00:42:54 You might go, does he actually have a point? And again, I think that makes it feel a little more uncomfortable than just, hey, it's a bad guy for being bad. sake. Right. I think that one of my favorite parts of the book, and it's, again, it's hard to kind of talk about them and him specifically without really spoiling things. Because I think a lot of the, a lot of what drives the plot or drove the plot and the pacing forward for me was learning more about them and their motivations.
Starting point is 00:43:30 Like that was the mystery that I wanted solved. So we won't give too much away. Yeah. Right. But I do think that they speak to the kind of the crisis that is burgeoning around all of this stuff right now in our real world. There's going to be winners and losers. And what happens, and this is something I think about a lot lately, it's like what happens when the economic losers of our system are still highly trained and highly emotional. motivated and feel they have no connection to society or to the lovers of power anymore beyond violence. And what does that look like? And I feel like, you get to see a little bit of that.
Starting point is 00:44:18 Yeah. So, you know, again, we're going through an incredible period of transition, you know, economic transition, political transition, social transition. And, you know, we will plot spoil this way. There is no pandemic. Burden is not a story of a pandemic. So you can get that, you know, escape us read without having to think about coronavirus. But the forces that it explores, I do believe, have been accelerated by the pandemic, whether it's obviously the economics, the distrust.
Starting point is 00:44:58 But even more so, these technologies. that we're talking about, they're being rolled out faster because of everything that's going on around us. We've seen greater levels of remote work. One of the characters in the book, the husband of the main character, is actually doing remote work, but we get to through the book see the effect that it has on a marriage, the effect that it has on parenting, the effect that it has on his politics. So I think people kind of, you know, connect to that. But also, you know, think about like, feels like telemedicine to the area I'm familiar with. In a couple of weeks, we went to the level that the telemedic medicine industry thought it would be 10 years from now. So we sped up. We jumped ahead.
Starting point is 00:45:48 Or it's robotics deployed in certain roles. We've seen robotics deployed out and, you know, everything from policing curfews to cleaning subways and hospitals. Again, and so the point of all this is that, or you know, think about AI surveillance of society, you know, that's rolling out at a much greater level. We're not going back, right? Even when we make it through the pandemic, we're not going back. And so these forces are there. They're sped up.
Starting point is 00:46:21 And as you put your finger on, for a lot of people, it, it feels overwhelming. And it might be because it's economically overwhelming. It might just be because it sort of socially feels like it's overwhelming. We can feel all that around us. We can also feel it taking place in a nation that's divided. Again, politically, economically, you name it. And so you know, you think back to the similar periods in our history of, you know,
Starting point is 00:46:49 whether it was a Great Depression or whether it was the 1960s, you know, these are really challenging times, all the more so if you're incredibly divided. And that transition, again, you know, that's what's interesting to me and I think important to all of us. Yes, maybe one day there will be a revolt to the robots, but we got to figure out this earlier mode. And again, you have to figure out in everything from how do you deal with it in your parenting and, you know, this balancing act that we all have now of really cool, awesome toys and gadgets
Starting point is 00:47:26 for our kids to, oh, do I need to think about my kids' privacy at a whole new way? To, no, maybe it's someone in the military. Okay, you know, what, what of this new technology that we're getting? How should I use it to get the best out of it, you know, the best out of it to lead my unit, the best out of it to defeat the bad guy, but also hold it. What does it mean that the bad guy might have that very same technology? Or if I'm using this technology, what are the new vulnerabilities or problems that I've introduced into everything from how my unit interacts to maybe the bad guy taking advantage
Starting point is 00:48:06 of it and going after this new weakness? Everything I just said could apply to something in a business environment. And so, yeah, it's an incredible transition period. And what we're trying to do is on the nonfiction side, give people a little bit of that familiarity to help them navigate it. Okay, these are some of the terms. These are some of the issues that you're going to have to deal with. But on the fiction side, one, for some people, they're going to enjoy it, hopefully. You know, just a fun, escapist, cool read.
Starting point is 00:48:43 But for other people, it's actually the packaging in the fiction that will make it feel more real to them and help them understand it. So, you know, those, you can read all the white papers that you want on what percentage of jobs might be automated or changed. I feel like it becomes more real when you see a character experiencing that change. or it just might be that, you know, frankly, more people are going to read it and stick into reading of it. You know, we've, we've joked that no one ever said, man, this is such a great PowerPoint. You ought to take it on your next vacation and read it by the pool. Now, it's funny. I was talking about that.
Starting point is 00:49:32 And someone in the military responded to me, he's like, yeah, but you've never seen my PowerPoints. I was like, oh, okay, chouch. maybe your PowerPoints people read on vacation, but most people don't. But we will do that with a novel, right? We will do that. We will talk about like, hey, you know, what's a book you're reading? I'm about to go on a trip. And, you know, hopefully that mix again of useful and fun fiction is a key takeaway of burn it.
Starting point is 00:50:01 Spoilers, that guy's PowerPoint is bad. They're all bad. Hey, you know, I, it's funny. when I was serving back in the Pentagon, there was an army officer in the cubicle beside me, and he had stuck to his wall. It was a play on the George Patton quote where it was no bastard ever won a war for his country
Starting point is 00:50:30 by making powerpoints. You win wars by making some other dumb bastard make powerpoints. That's good. Um, on the, I've got, you got time for just a few more. Absolutely. Okay. On the, the fiction thing, I kind of want to stress for our, the non-military listeners that we have, that there is, I think about Star Wars a lot.
Starting point is 00:50:57 Oh, who does it? It's one of our coronavirus escapes right now with my kids. Um, where, uh, they had not seen all the movies. So we're going back and watching all the movies. And, man, there are some, you know, the incredible highs of Empire Strikes Back and Rogue One. And, oof, there are some incredible lows. And I think everybody knows what I'm talking about in terms of young Anakin. But I was not like a big, you know, I grew up nerdy, but I was not a big Star Wars person.
Starting point is 00:51:33 And I was not a big Star Wars person until I started. really reporting on the military. Because, and I quickly figured out that, like, I had to watch these movies and understand them because I was going to run into, like, a Pentagon report or a lieutenant colonel that was going to use metaphors about the Battle of Hoth that I didn't understand because I had not seen it. And it is surprisingly ubiquitous across all branches that people will communicate to each other. in Star Wars metaphors because everyone's seen it, and it's a very quick way to express complicated issues. You know, this is, there's even a book about this that's a whole bunch of, you know, nonfiction essays that are, that are Star Wars military connected, you know. Max Brooks edited it.
Starting point is 00:52:28 I can't remember the exact title. But fiction does have a place when we're talking about this stuff. And in like you said it's a- Strategy strikes back, I believe. Strategy strikes back, yes. It's a good book. And it is, these books are important. Books like that and like yours are important because they allow you to quickly communicate these complicated ideas, right?
Starting point is 00:52:52 I guess that's more of a thought than a question. Well, look, and we can give a, there's a short version, which is that it's fun, it's different. the long wonky version is that actually one of the oldest technologies of communication, actually maybe the oldest, is the story. Our human brains are wired to react to stories and we therefore can't help ourselves. we put ourselves into those stories. We imagine ourselves in that scene or we imagine ourselves either as the character or reacting to the character.
Starting point is 00:53:47 And that is why they've been the oldest way of sharing information. The information might be a lesson of what to do or what to avoid. it might be morality it again and you know you think I'm you know
Starting point is 00:54:06 it's literally you know military codes were taught via I mean like codes of behavior and honor were taught by
Starting point is 00:54:16 the story of the Elliott to you know you mentioned the Star Wars one you know every every we get different forms of communication
Starting point is 00:54:26 you know we're 2,000 years back almost exactly the concept of a basically a guy named Pliny gathered 200 facts from 200 scholars. That was the very first
Starting point is 00:54:43 encyclopedia. Almost exactly 1,000 years ago there was it was in ancient Japan. The very first novel was written. then 30 years ago, maybe a darker day, PowerPoint enters the market.
Starting point is 00:55:07 The point is we've had these different ways of sharing information. Stories, though, is the longest and arguably the most effective. And there's not just me saying it. Actually, the research shows that people from the field to psychology, communications like find that narrative, again, because the way the human brain reacts is actually as or in some cases more powerful an influencer or not just the general public, but the most senior policymakers than the most canonical academic sources. You know, they're finding that when presidents or generals are talking about what to do,
Starting point is 00:55:48 they're referencing sometimes consciously, sometimes unconsciously, stories rather than that, you know, deep report on the theory of international relations. And so, you know, one, it affects us. Two, as I talked about, it, you're more likely to share it with someone else. You're more likely to read it than a white paper. Third, it doesn't just affect your brain, kind of how you process. It also affects your emotions. And again, that means it circles back.
Starting point is 00:56:20 You're more likely to sort of take it to heart. or you're more likely to actually take an action. So I can write a white paper saying there's this vulnerability in the supply chain. But the reality is the last book that it goes fleet, that's the one that sparked the investigations both in the GAO and the U.S. military to do something about the supply chain vulnerabilities. And so that's one of the other aspects of this we hope to get at is that you can be both predictive but also preventative. And so if story has that power, wouldn't it be great if instead of it being set in a galaxy far, far away and, you know, relying on, I hate to break people, but there's no such thing as midaclorians on, you know, fiction like that. But instead, it's actually drawing on real world research, but you still get the story. Wouldn't it be great that you could get both, that someone can walk in the shoes of someone else, experience this future work?
Starting point is 00:57:22 learn from it, but it's actually they're pulling from real world research. I think a really classic example of this is Ronald Reagan watching the anti-nuclear film the day after, which aired on ABC in 1983, I think. Then writing, he wrote a journal entry, because he kept a diary. He wrote a journal entry about how depressing and affecting the movie was. and then began to, in real life, pursue drawdown, nuclear drawdown with Russia. So my question, do you...
Starting point is 00:58:05 There's another Reagan moment of a similar fiction impact on real world, which is he watches war games, and the next day presses his national security advisors, hey, could this really happen? What are we doing about it? And that is actually the origin of the very first U.S. government cybersecurity directive. But this, you know, there's a long history of fiction influencing the real world, you know, whether it's Reagan to scientists, getting an idea of building the first flip phones.
Starting point is 00:58:42 They were watching Star Trek during a break at their lab. But the influence that mode is in-in-in-in-in-in-vis. and unintentional. You know, the creators of war games did not say we, we want to change cybersecurity. They also weren't drawing from research. And so, you know, you've always had that. I think what we're getting at now is can it be an intentional tool? And if that's the case, what are the key topics?
Starting point is 00:59:22 that people, you know, ought to know about, ought to learn from. But, you know, again, there's, there's, you got to serve two masters here. It doesn't work if it's not interesting. It doesn't work if it's not a great story. You know, if war games, if Reagan had turned off war games after the first scene and said, this is boring. Or, you know, for us, you know, our hope is that, you know, just like you were saying, that, you know, you want to follow Keegan and Thames as they hunt down this dastardly villain
Starting point is 00:59:51 And, you know, if you're not plucking away at that entertainment part of our brain, you don't get to the good stuff. It's going to go back to, you know, if the smoothie tastes disgusting, no one's going to drink it to the bottom and get the good stuff. Last question. Is anyone teaching robots how to tell stories? Yes. Even more so, they're teaching robots to lie. And again, that's a wonderful. real world sci-fi thing to play with, which is, what does it mean to be in a world where
Starting point is 01:00:29 you can't tell whether it's a physical robot or Siri or Alexa is telling you the truth or not? Sounds very sci-fi. You know, Asimov would play with that. Guess what? Real world. And that cross between, okay, these are the situations of, when it will happen and how people are going to react to that and have to deal with it. Oh, by the way, with the research backing it up, but then putting it in a fun, engaging story,
Starting point is 01:01:04 you know, that, that's, that to me is, is the magic. That's the great thing of, like, it feels very sci-fi, but it's our world to come. Peter Singer, thank you for coming on the program and talking about your book. It is Burned In, a novel of the real robotic revolution. Thanks so much for having me. That's it for this week, War College listeners. War College is me, Matthew Galt, Jason Fields, and Kevin Nodell. It was created by myself and Jason Fields.
Starting point is 01:01:37 And this week's episode was edited by Jason Fields. So if you are upset with the way things sound, please drop him a line by going to our website, warcollegepodcast.com, where you'll find links to our social. We are on Twitter at war underscore college. We are on Facebook, facebook.com forward slash war college podcast.
Starting point is 01:01:58 We will be back next week with more conversations about a world in conflict. Stay safe until then.
