In The Arena by TechArena - Futurecasting Technology, Society and You with Brian David Johnson

Episode Date: January 10, 2023

TechArena host Allyson Klein talks with futurist Brian David Johnson about futurecasting and threatcasting, and how taking agency to envision our future places us in the driver's seat to shape it.

Transcript
Starting point is 00:00:00 Welcome to the Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Allyson Klein. Now, let's step into the arena. Welcome to the Tech Arena. My name is Allyson Klein, and today I'm joined by Brian David Johnson. I'm so excited to have you on the program. You are a futurist. You are a published author, a professor, a filmmaker. I don't even know where to start with you, Brian, but why don't you go ahead and just introduce yourself to the audience and the scope of topics that you focus on. Awesome. Well, it's great to be on the program today, Allyson. Great to be back chatting with you. Always a joy there. I think you got them all, actually, Allyson. I am a
Starting point is 00:00:56 futurist first and foremost. I'm an applied futurist. I work with organizations to look 10 years out into the future, and I model both positive and negative futures in a range of futures. So I don't make predictions. And then as an applied futurist, I work with those organizations to say, okay, what do you need to do today, tomorrow, five years from now to move towards that positive future and avoid that negative future? And also what are the indicators along the way? Probably the best example of that is I was the chief futurist at the Intel Corporation for over a decade. And I was there simply because it took them 10 years to design, develop, and deploy a chip.
Starting point is 00:01:30 And so they needed to know what people wanted to do with technology 10 years in advance. And that was my job. I'm an engineer and a designer by training. And so I would do this future casting modeling, and we'd look at the future, and then we would write that specification. And then that would go into the design of the chip and the technology. And then we could use a lot of that for other things like patents and mergers and acquisitions and HR and a lot of other things. To date myself, they brought me in at Intel to do smart TV.
Starting point is 00:01:57 So that idea of using an Internet to watch television back in the early 2000s was this crazy idea. And so that's what we did. And so we kind of built that technology and sort of built that market. For me, back in 2016, I stepped down as the chief futurist because I ought to do something called threat casting, which is the, is as it sounds, it's the other side of that, looking at some of the negative implications of technology. So now I'm a professor at Arizona State University. I also have a threat casting lab where I do national security work. And I also have a private practice where I do future casting with my clients.
Starting point is 00:02:29 And the range of work that I do is, you know, I am a technologist. As I mentioned, I'm an engineer and I come out of Silicon Valley. And so everything I do is kind of based, you know, kind of looks towards technology is driven by technology. But very fortunate over the last 25 years to really be able to look at all sectors, to be able to look at the future of agriculture, because I'm the farm futurist. I write for Successful Farming Magazine. I do a lot of work in finance as well as medical. I do a lot of work with governments and cities. So really anything that kind of has to do with the future. But as you say, I'm kind of firmly rooted in sort of technology is really kind of where
Starting point is 00:03:07 most of my work starts. As we head into 2023, we are looking at the future of innovation on the tech arena. And I absolutely wanted to have you on because I'm familiar with your work, not just in future casting of technology, but also on some of the more human related aspects of your work on applying future casting to yourself. And I want to talk to you about that. Why don't we just start, and we will get to threat casting because I want to ask some questions about that too. Why don't we just start with technology? And this is a very interesting time for me as somebody who's been in the industry for over two decades. The future seems to be accelerating towards us.
Starting point is 00:03:50 And we see lots of technologies coming together to paint a very different picture of how society will function in the next 10 years than it does today. Tell me about what you're excited about and what you see in terms of opportunity on the technology front when you look at that 10-year time horizon. And that 10-year part is the key, Allison. So as a futurist, and when I'm working either with the government or the military or large corporations, or when I'm working with just average people, just chatting with people over dinner about the future, you know, I'm always looking with that 10 years out. And it's really my job as a futurist, not to look at the next big thing, you know, when technology, everybody wants to talk about what's the next big thing. And certainly that's important. And we
Starting point is 00:04:37 should talk about that. But what I'm always pushing people to do is say, well, what comes after that? What's the thing after that? What's the thing after the thing? And in business and especially in your personal life, that's important because you can say, OK, let's say this technology is really successful. Awesome. Great. Then what do you more control. Because that's one of the things I'm always pushing, whether it be with my clients or with just average people, is that we have control about them. We have control over the future. You know, the future isn't fixed. And it's not this static place that we're all running towards helpless to do anything about. You know, I like to joke with people, people ask me about the future and they treat it like a place that we're all going to, like we're all going to Des Moines, Iowa together. And the fact of the matter is, it's not true. And again, I love Des Moines, Iowa. I do a lot of work there. As I mentioned, I'm the foreign futurist. But because people, you know, what will it be like in Des Moines? How do I prepare for Des Moines? What should I see in Des Moines?
Starting point is 00:05:39 And I can tell you all those things. But I tell a lot of people, you don't have to go to Des Moines. You can go to Seattle, Washington. It's also beautiful there. So that idea of where you're trying to give people a little bit more agency. But to get back to the crux of your question, I think what I'm really excited about is that you start looking at these host of technologies and everything from the chatbots like chat GPT and sort of AI and ML in general. You start looking at these sort of new devices and new platforms around the sort of metaverse and things general. You start looking at these sort of new devices and new platforms around the sort of metaverse and things that are coming out. There's also some things with sort of synthetic biology and that sort of line between biology and technology. And, you know, also as we come out of the pandemic, it was the great accelerator and it really pushed
Starting point is 00:06:18 in positive and negative ways. So for me, I get really excited because I can say, I see all of this opportunity. Then I push people to say, okay, well, what do you want to do with it? How do you keep it about people? How do you keep it about yourself, your organization, your industry to say, what are you going to do with all these things? Because that to me is really how you future cast. It's how you think about your life and take that back because you can prepare for it so much better if you're looking out and in that way and
Starting point is 00:06:45 with that agency they say what will you do with these things that are coming that's that's what gets me most excited certainly willing to to dive into alice in any of the technologies you want to talk about but that's what how what is always going to be the thing that drives me is that keeping the human beings at the center because they tell people if you don't keep humans at the center it's a very bad thing because it's's about people. Everything we do is about people. You know, technology is just a tool. And so we always have to remember it is about the people. And I do think the pandemic really, really brought that to the forefront. I'm glad you brought up the pandemic. You know, I look at this period and, you know, see all of the things that have changed and, you know, the use of technology
Starting point is 00:07:25 during this period of time, but also the shift in outlook and what we've learned from the pandemic and what we don't even know we've learned yet. I'm sure you and your team have spent a lot of time studying what has happened to technology, to society, to people. How do you look at the pandemic as a futurist in terms of the change in trajectory of what that next 10 year time horizon looks like? It's really, it is that accelerator. So I think that's one of the things, you know, it was an interesting time, you know, coming, being a futurist and coming into the pandemic and then actually doing work on the pandemic. So, you know, being a professor at a large public university, I actually then was doing a lot of work in the Western states. I was actually doing COVID and pandemic modeling and doing future
Starting point is 00:08:13 casting and threat casting on that, really sort of in the center of it with epidemiologists all up and down the West Coast, which is where I'm based. And then, but also working with the organizations that I had a long history with. So in one of those organizations was the Association of Supply Chain Managers. So, and supply chain trade associations had a really tough time. If you remember, it used to be, you know, supply chains were in the back room and then all of a sudden there's no paper, there's no toilet paper on the shelves. And all of a sudden it's at the kitchen table and people are freaking out because if you've ever watched any zombie movie, the sure way to know that the zombie apocalypse has happened is that there's no toilet paper. And so like people were freaking out. And so what I was working with them is kind of thinking about, all right, how do we then navigate through this?
Starting point is 00:08:56 But then a lot of things also went right in the pandemic. And I think we're getting close to the time to talk about that. And it was a terrible time, certainly. And a lot of things went wrong. A lot of things went right. A lot of our digital infrastructure held. A lot of our, with a lot of shifts for a lot of companies there, their IT worked. A lot of productivity for people and for workers, even though they were all shut up in our dining rooms doing, working, it worked. It worked. You know, things kept going. Now, a lot of things didn't work, you know, especially when you had to be in place. But I think we can take a lot of lessons learned when you could see, okay, what could we do to increase productivity? What could we do to change work culture and to change our personal culture around how we think about our lives and how we think about our futures?
Starting point is 00:09:42 I mean, that's one of the things about sharing a global destabilizing event that is historic. You realize that you're living history. And when you're living history, you really start having a different perception of what the future could be because you realize you are living a moment of history and there is a future. And so a lot of people, as I've come out of that, have a really different way of thinking about and talking about and valuing the future. And it's been a really fascinating shift coming out of that time. and how we shift to technologies as required to function. You can see the acceleration of cloud computing to keep society running. You can see the use of remote work tools. How do you see the human aspects of that in terms of the change in what we've learned
Starting point is 00:10:41 about what we want the future to be for us. And I know that your last book turned future casting on self. How do you think the pandemic has shaped this moment as the right moment for some self-reflection about where the future goes? I'm glad you bring that up because that's one of the things I was going to bring up when you talked about technologies and the pandemic. One of the things that I've seen, and I'm glad you bring that up because that's one of the things I was going to bring up when you talked about technologies and the pandemic. One of the things that I've seen, and I'm sure all of your listeners have seen, which I think is a beautiful and wonderful thing, is that I know we're doing this without cameras on and this is a podcast, but it is now completely natural for you to be on a video call and have your cat walk in front of the camera. That's not a career-ending problem. It's okay if you have children to have your child come running in the room, because by the way, children need things, and people have adjusted. Our work culture has really adjusted to allow us
Starting point is 00:11:39 to be more human. And part of that was, yes, we were working from home. Some of us were working from our cars, depending upon what was going on in our homes. Like it allowed people to actually give each other a little bit more slack, you know, and also allowed people sometimes to say, you know, I messed up the time zone on this call. I'm really not camera ready. So I'm just going to keep my camera off. So let's move forward. It really allowed people to do that. And, you know, quite honestly, and, you know, it's heartbreaking, but people have, a lot of people were touched by personal loss during the pandemic. And so we started cutting each other a little bit more slack when there could be family medical appointments, when there could be family tragedies, when there could be family vacations. I've seen a lot of that in our culture that we've just become, especially in our sort of work culture and technology has really enabled it, has allowed us to be just a lot more human to each other and has allowed more humanity, I think, into our work.
Starting point is 00:12:37 And that's one of the things I think is incredibly important and has really increased a lot of people's creativity and productivity. With The Future You, you're right. I mean, that was one of the things, the genesis of that book is I had a friend of mine, a director and an artist by the name of Leopoldo Boot. And Leopoldo said, you know, BDJ, you do this for militaries and corporations. And he goes, why don't you just have average people do it? And it really dawned on me that every time I would go to a dinner party or, you know, go over to a baseball game or whatever, and I'd be talking to people and they'd learn I was a futurist, they would start asking me questions as I do. And again, having done this for so long, I'm really used to it. And they would ask me about, you know, what they should do for
Starting point is 00:13:17 the future of their kids or what they should do for the future of their occupation change and their career change, or how do they, how do they buy a house? And then they would ask me crazy stuff like, what's the future of love and sex? And I'm a middle-aged white dude. I met my wife before the internet, right? So I didn't know what to say, but it was really important. And again, I value the humans. And so I said, look, I could never be so arrogant as to tell you your future, but let me do this. Let me tell you how to think like a futurist. As you said, how to think out into that future, how to think differently about the future, get rid of a lot of misconceptions and then think kind of differently. delivered it right before the pandemic. And as we started locking down, my publisher, who was then going through and getting the manuscript ready, called me and said, BDJ, you wrote a book about the pandemic before the pandemic. Chapter seven of the book is all
Starting point is 00:14:14 about existential crises. And it talks about, certainly wasn't about just a pandemic, but it was about these big world, about war and famine and pandemics and sort of how you deal with that. And so it became really a tool for a lot of people. They started on chapter seven and then went back to chapter one, but they sort of, how do you deal with the future? How do you take agency back? And that was the whole point was to say, you will build your future and don't let anybody take your future from you. Like if anybody comes to you and, you know, gives you a prediction or says that, you know, you're a failure, this can work. They're taking away your agency.
Starting point is 00:14:50 They're a toxic person. And so a lot of what I was doing with working with people, and especially as we were in the pandemic and when really with my students. So at ASU, working with my students, I was, you know, getting them copies of the book however I could to kind of go, let's use this as a way to kind of talk about it. Because for them, you know, you can imagine these, these young women and men as they were, you know, this was the formative years of their lives and trying to think about, you know, okay, how do you get to the other side of this? And what do you want that to look like? And so I think, again, for that, it's, it's, it's given people, hopefully,
Starting point is 00:15:23 a little bit more of a platform to kind of say, okay, we're going to get to the other side of this and getting out there and really being able to change that future, to take that agency and take that power. I think the agency is so critical at this time because we've never had a technology as powerful as artificial intelligence to yield. And I think that, you know, you've talked about this, you've talked about the future of AI and talked about sentient technology, and you paint a very rosy picture of it in your book. I love your viewpoint that this is going to be something that is very beneficial to technology and that we need to do some things like redefine what makes a life when technology has the opportunity to redefine what work looks like for us, for example. Can you talk about how you view that and going back to that 10-year lens, where you think we're going to be on that journey
Starting point is 00:16:26 within that 10-year time horizon? Yeah, and I love this concept because, well, and I should say, and I know you know this as well, Alison, so a lot of the work that I do on sort of AI and the future and a lot of the writing has been very positive because I'm an optimist, obviously, because I think the future is built by people,
Starting point is 00:16:43 so let's get together and build a future that's great. There also is that other whole work stream of my work, which is threat casting. And I've written some very negative things about the future of AI, because I want to make sure that we talk about those dark spaces so we can avoid them, which I also think is incredibly important. But for AI, you start thinking about this and what it's pushing us to do. Well, first, let's talk about where it's going. So in the maturation of AI, right, it moved from science fiction. And then over the last 10 years, it was kind of in labs and it was kind of experimented with in here. And really, over the last couple of years, we started seeing it move into kind of meaningful tools that we can use.
Starting point is 00:17:23 You know, AI has been around for a long time. It lands our planes. It helps us pick music. It helps us pick movies. There's things that it kind of does in the background. Don't worry about all the sensationalism and don't worry about all the kind of the predictions. I always tell people, beware of predictions and people who make them because they're generally trying to sell you something or scare you. I'm trying to see, okay, how are we using it as a tool? How are we using it? Because again, technology is just a tool. And then what we're really seeing, and you pegged it, Allison, is really over the last couple of months with
Starting point is 00:17:53 chat GPT and with AI and ML, and a lot of what we're seeing coming is you're starting to see it kind of make meaningful strides towards doing work for humans. And I think that's how I view it. When you hear stories about people using, you know, these technologies so that high schoolers are writing better, you know, term papers. Yes, that's a thing. And we need to deal with that as an educator. But for me, I'm like, oh, it's actually a meaningful usage of this technology. You're actually seeing it kind of put to use. So as we look 10 years out, what I'm starting to see is it starts to get riven through all of our software, because that's really what, you know, AI is, it's software. So as we look 10 years out with this technology, what I'm
Starting point is 00:18:36 beginning to see is it's, you know, he says, it's just software. So AI gets kind of riven through all of the software and software, it just does work, right? It's just does work on a hardware platform, whatever that hard work platform might be. And so you start to see it. And by the way, so really the future of AI is that it really starts to go away. We don't see it that much anymore. It's not this big sensational thing because it's just being put to use. It's like electricity. It's like the internet. As we're recording this, the Consumer Electronics Show is going on in Las Vegas and I'm not there. But a lot of the scouts and a lot of people who I do work with who are there are talking about is AI, it's having its day.
Starting point is 00:19:13 Everything is powered by AI, driven by AI. It's the new smart home. It's the new internet enabled. It's the new, and if, you know, Allison, you and I have been in technology for a long time. So there's a lot of that is cloud enabled. And it's just shows part of it has becomes a marketing term, but that's okay because it's starting to get kind of pulled into all of these technologies. So now, you know, you don't pick up your smartphone and doesn't say, Hey, it's internet enabled because you just don't see it anymore. It's just this thing, you know, you don't pick up a lamp and it goes, Whoa, enabled by electricity. You know, it's not an oil lamp anymore. So you're starting to see this sort of technology taken into it.
Starting point is 00:19:49 So that's where I sort of see AI going. But as you say, it is fundamentally changing our relationship with technologies. And it's fundamentally changing when we think about the future of work, how we think about work. And when I say that, what I mean is, how do we value human labor? This is really important because, you know, coming out of the industrial revolution, human beings were sort of treated as machines. They were on assembly lines, you know,
Starting point is 00:20:15 they were doing things that robots couldn't do. Well, now robots can do them. Same thing, which is happening with AI is we're being able to use these technologies to go and do things. And it doesn't make, doesn't mean we're obsolete. It means that we're not being turned into machines anymore. That's what I tell people, you know, if technology can take your job, you know, if a robot or AI can take your job, that means your job probably sucked because you were a robot and you were turned into
Starting point is 00:20:41 a machine. And so that, and it's, it's, it's a, at one point, it's really kind of existential. It really kind of makes us go, what, what's the value? And if so, if a machine can do it and that's cool, we need it done, but how does it change it? And so I think that's what we're seeing and what I'm really excited about. I don't have the answers to that because I think it's different depending upon different people on different sectors, on different industries, as we see that change. But I think that's, for me, the story behind the story is this fundamental shift on how we value humans. And for myself, again, the optimistic side of me who wants to build a better future is to say, how can we then take advantage of what it means to be human and
Starting point is 00:21:20 the flowering and complexity and diversity of human thought and human creativity and human communication. That I think to me is the thing that gets me really excited. And if we focus on that, then there's so much more that we can do and so much more that we can accomplish. I think that one thing that's interesting is when we come to these points, we are, as a human race, so of the moment. And it's not as if we haven't invented technologies before that have displaced entire functions in life. You know, there's no longer somebody that is responsible for tending the fire because if the fire goes out, the tribe might die. There are roles in history that have been eliminated based on technology innovation, time and again. And we are at that another one of those asymptotic moments.
Starting point is 00:22:14 What do you see? And I know that you are very good at looking back at the past to help with your future casting. What do you see in the past that might help guide us in this moment? And part of it is me, but also part of it is the amazing cultural historians that I work with, because that's a big input to the work that I do. And what they'll tell you is that history doesn't repeat itself. But what happens is that history is the language that we use to talk about the future. So you're right, I use so much a lot of the work that I do with Jamie Carrott, who's an amazing cultural historian is looking at those moments in time. And I think you really pegged a lot of them. I mean, you can pick big, wide swaths of those, like the looms, the weaving looms and where the Luddites
Starting point is 00:22:57 came from and the idea that people who were weaving by hand, it was mechanized and sort of taken out. You can look at things, but I like to go to the kind of sometimes the silly so that we also remember that there's a little humor. Because again, that's also why humans are awesome because humans are funny. So we have to remember that because, you know, generally technology doesn't crack jokes. The idea of saying, you know, think about, you know, the work that travel agents used to do, you know, travel agents, you know, before, you know, they were there and they were completely displaced by the internet. They were completely displaced by also deep changes in the travel industry and the idea
Starting point is 00:23:37 of being able to print your own ticket, which was a huge idea. So you start seeing those changes in a lot of those travel agents. There's still some travel agents around, but they've kind disgruntled, they actually move on and they'll do different things. And that's why I think that agency, again, bringing it back to that is so important to think about, well, what do you want to do? How do you kind of go there? And how do you kind of think about those things? And there's lots of examples when it comes to sort of huge infrastructure shifts as well. And, you know, as we moved from the horse and the horse and carriage to the automobile, that to me is a really interesting one. And one that we could do an entire show on is looking at all the different changes. You know,
Starting point is 00:24:35 what a lot of people will say is, you know, look what happened to the buggy whip industry as it completely went away when cars came out, which is true. But if you think about all the infrastructure changes that had to happen, not only when it came to the assembly line with Henry Ford, but think about the network of highways and how that changed and moved from deer paths and foot paths and horse paths that then were turned into roads. And of course, if you've ever been to New England and driven Boston, it's a nightmare because it was those paths. If you come out to the West, we've got kind of big superhighways, right? Because we didn't really build in that way. And so I think there's a lot of interesting ways to look at how real change happens. Think about,
Starting point is 00:25:17 as we're going through this move to electrification, which is another sort of shift in that moving to electric cars, think about how we move from gas stations, but not only just gas stations. Think about when we moved with fuel efficiency. There used to be gas stations everywhere, right? You think about back in the fifties and sixties, there were gas stations everywhere. That's because cars were terrible. They were these awful machines that broke down all the time and used all this gas. So you had to have a safe place to go and you had to have trained people to fix these awful machines. And then they got better and they got more fuel efficient. And the petrol stations and the gas stations have really changed. If you look at the landscape of the US, it's really changed. And now as we start to think about for charging stations and what that's going on,
Starting point is 00:25:58 it's very similar. It's following a very similar trajectory, but there's a lot of changes there. There's the changes to the technologies, there's the changes to the laws and the infrastructure. So I do think that for me, that shift for the car is really interesting because we're seeing it happen, especially when it comes to electrification and a lot of these other big, big shifts to how we live and how we work and how we move goods and people around, I think to me, that's a really interesting one. Now, I would not be a good podcast host if I didn't bring up the bad side. And you talked about how you also spend a lot of time on threat casting.
Starting point is 00:26:39 Because every time technology innovation happens, there are wonderful ways that it can be deployed, but there are also some bad actors out there that use it for ill intent. What do you see are the biggest risks to our society with technology right now? And describe to me how that applies to your work with threat casting. Yeah, thanks for bringing that up. And again, I have no problem talking about threat casting and talking about the darker futures. Again, I'm an optimist, though, because I think we need to look at those dark futures. And that's really what I do around national security, global security, resource security, climate change, to kind of go, okay, this is where it could go really bad. What do we need to do about it? What are the steps that we can take to disrupt, mitigate, and recover from it? You know, again, how are we having those conversations now so we're not caught off guard? And so I have no problem talking about this stuff as well. And so for me, in the threat casting work that I do, you're right. And especially when it comes
Starting point is 00:27:38 to technology, myself being a technologist, is that, you know, any technology that is sufficient to kind of delight people or make people more productivity, make them more productive and for people to make money out of it, it can also be turned into something that could be weaponized. It can do harm to people. And that's not, you know, a nightmare scenario. It's just the way it's always been, as you mentioned. So, you know, you can't build, you can't make a hammer sufficient
Starting point is 00:28:05 enough to build a house that's also not sufficient enough to bash somebody's head in. You just can't. And that's the way technology is. Now, the good news is, Alison, we're all not walking around with hammers in our heads. And there's a reason why is because we have laws and we have culture and we have norms to say that that's not cool. Like, don't do that. And so that's why it is important to look at these sort of darker areas and say, OK, what are the things that we do? And that's why, again, I do a lot of this national security and global security work. I do work with the with the Army and the Air Force. Certainly, I do work with NATO kind of thinking about these areas as well as the Secret Service and some other places. And so for me, as we look at these technologies, it's all the really positive technologies. So everything when it comes to the use of artificial intelligence, I've written several reports on the weaponization of AI and what that might look like and what that might do literally to the fabric of our nation and how it can be a threat to that. I've looked at things
Starting point is 00:28:59 around sort of synthetic biology, things like CRISPR and genetics and genomics and what that might look like. I think these are certainly kind of very large threats. But so much of the conversation that I then turn it to, and really working with these people, these very sort of serious organizations whose job it is to kind of keep us safe, is to talk about not only the technology, but to talk about power. Because if you're just talking about the technology, you're not having the right conversation. Because it's really not about the technology. You need to understand the capabilities of the technology, but you also need to understand the power dynamics of who is using it, who has the ability to use it,
Starting point is 00:29:38 what is fueling it, how is it applied, and who is it used upon? And to me, it's generally those very human-centric things, power structures, historic power structures, inequities, a lot of the geopolitical landscape. A lot of that, to me, is where the threat lies, not in the technology itself, but it is a conversation back and forth around the technologies and how they might be used, how we might detect that they are being used, and then what we might be able to do to prepare so that we can disrupt it or mitigate it once we've seen it coming. So all the kind of positive things, you can always just flip it. It's the other side of the coin and go, okay, here's all the negative things.
Starting point is 00:30:16 But when you're thinking about those negative things, that's what fuels a lot of the work that I do with a lot of the organizations is understanding those existing structures and understanding how that technology might be used to harm people. When you talk about this, Brian, I think about what if you cannot protect yourself from what you don't see. And I think that we do have a desire to be, you know, head in the sand on some of these things. What makes you an optimist as you look out at the future? And is it something that is a constant truism? Or is there something unique about this moment that is making you optimistic? Well, I think there's there's about three reasons for it, Allison, and it kind of spans time, if you will. So as we talked, I'm a futurist, I do 10 years out, and I So, and we all, and we all live that during the pandemic, right? So, and seeing that makes me
Starting point is 00:31:28 really humble. You know, the idea that we made billions of chips and we made a market and we made this really, this thing that is really quite incredible and brought about a new golden age of television. That makes me really humble because you realize, oh my gosh, we did that. Like, we actually did that. Like, we actually did that. Like, I remember when we were told, and I say that very large because it was a large group of people and not just at Intel, all over the world, but very early on, you could fit us in a room and they thought we were crazy. Like, literally, I had a room in the late 90s of like 600 people who laughed at me when
Starting point is 00:31:58 I said we would use the internet to watch television. And that there's no arrogance in that. There's the humility of that to go, wow, wait, we can actually do this. And so for me, it started there, very optimistic. Oh my gosh, look at what we can do. And it really then became a choice. I'm an optimist by choice. Optimism and pessimism is not naturally occurring in the wild. It's a choice that you make. It's a way you look at the world. Again, as we talked, I do threat casting, right? I do work on weapons of mass destruction for NATO.
Starting point is 00:32:33 I look at some really bad things, but I can be an optimist to go, okay, what do we do about it? And that's the, I think a lot of the difference with sort of pessimists is, and why I actually have a problem with dystopias, is that I think you need to be responsible for your dystopia. You know, if anybody comes and says, Oh, my gosh, the world is going to end, we're all going to die. Again, people are taking your agency, and they're saying you're all going to, it's terrible. Like you're the worst party guest ever. I think the thing about being an optimist is saying, hey, things could go really,
Starting point is 00:33:03 really wrong. And it could get really dark. But here's what we should do about it. And I think the thing about being an optimist is saying, hey, things could go really, really wrong and it could get really dark, but here's what we should do about it. And I think that, to me, brings a lot of light to that darkness and especially a lot of the dark work that I do and the stuff that can get really, really bad. And I have moments with people where I turn from being a futurist. So I do a lot of participatory design. I do a lot of work with these organizations. So again, being an applied futurist means I'm working with organizations to do this. So they're actually in the room a lot of times. And sometimes I actually have to have people step out because they do have that moment where it gets too dark for them. And I have to turn from a futurist into a therapist and be like, nope, come on, let's walk back. That's where the future you also was written, was kind of saying, no, no, no,
Starting point is 00:33:47 here's what we can do about it. That's why this is so dark is to kind of get back from it. And so that to me is really important, especially during the pandemic and during these time of war and during a lot of what we're doing is to kind of bring some more of that light and agency to it. And then the final bit that makes me an optimist is that I'm not alone. That I said, I do this with people and I do this with people all over the world. And there's a lot of really smart people who are thinking about these dark places and trying to make people safer and keeping people at the center and really trying to do this work. And so every time I do this threat casting, I get a little bit more optimistic because I meet more and more people who are doing it and have a passion for it, who really do want to make the future better.
Starting point is 00:34:33 And to me, that's as good as it gets. Brian, one final question for you. This has been a fantastic talk. So thank you for being on the program. Where can folks engage with you, find your books, learn more about what you're writing about? Probably the, I'm infinitely searchable. So again, Brian David Johnson, futurist, infinitely, infinitely searchable. You can always find me at, if you're interested in the threat casting work, my lab at ASU is threatcasting.asu.edu.
Starting point is 00:35:11 So is a great way to go there, or you can just search Threat Casting Arizona State University. And a lot of the reports that I do there, they're not classified. They're actually there to actually have this broader conversation, Allison, like you and I were just having. So you can do there. Oftentimes you can follow me on some different social media and stuff like that. If anybody's really interested, there's, you know, the Future You, I think is a good place to start just for anybody to kind of getting into it. And then as a, you know, really applicable to this conversation we have today, there's a book I'm just finishing right now called The New Dogs of War that really looks at this idea of weaponizing commercial technologies and what people can do about it, how you can think about it as a futurist for yourself, for your family, for your community, for your organization. So that should be coming out later in 2023 as well.
Starting point is 00:36:01 So that's where you can find me. Fantastic. Thank you so much for being on today. It was a pleasure. Allison, it's always great to talk to you. Thanks for joining the Tech Arena. Subscribe and engage at our website, thetecharena.net. All content is copyright by The Tech Arena.
