Ologies with Alie Ward - Futurology (THE FUTURE) with Rose Eveleth

Episode Date: January 7, 2020

"The future's not ours to see..." OR IS IT? Professional futurologist Rose Eveleth -- host of the podcast Flash Forward -- endures all kinds of breathless questions from Alie about shiny metal and imp...lanted microchips and biohacking and population density curves and flying cars and equality and utopias and the Jetsons and technology and nuclear fusion and whether or not our phones are spying on us and if we should have kids or dogs. Also: how long do we, as a species, have on Earth? Despite some grim conditions, find out why Rose remains an optimist and get fired the hell up about making the future better for everyone.Visit Rose Eveleth's websiteListen to her podcast Flash ForwardDonations went to Gender Reveal Grant and the Magnolia Mother's TrustMore episode sources & linksSponsors of OlogiesTranscripts & bleeped episodesBecome a patron of Ologies for as little as a buck a monthOlogiesMerch.com has hats, shirts, pins, totes!Follow @Ologies on Twitter and InstagramFollow @AlieWard on Twitter and InstagramSound editing by Jarrett Sleeper of MindJam Media & Steven Ray MorrisTheme song by Nick Thorburn

Transcript
Starting point is 00:00:00 Oh hey, it's just the podcast host that calls herself your internet dad, Alie Ward, back from the holidays and my first work break in many years. I have a cold right now, but my heart's burning, ready to deliver a year's worth of brand new episodes of Ologies. Hello again, y'all, it's 2020. We are no longer dabbling in the 2000s, like poking a gnarled toe in the waters of tomorrow. We're shin deep in the century. The world is burning, new wars are abloom.
Starting point is 00:00:32 Drones deliver our lunches, we're in this bitch. It's officially the future. So what better time to talk about futurology, the study of what happens next? It's a thing. But before we move forward, let's pause a quick second to thank every single patron who's ever supported the show and allowed me to do things like take the last two weeks off without sponsors. Thanks to everyone who's wearing fresh Ologies swag from OlogiesMerch.com.
Starting point is 00:00:58 Thanks to everyone who's telling friends about the show, keeping it in the top science podcasts on iTunes, rating and subscribing and reviewing. You know, I tenderly creep your reviews. It's true. So I can pull out a fresh one. Let's have at it. Okay, this week is from Kylan Kells, who said, Alie is my dad, my mom, my weird classmate who has really good points, and my best friend, all in one.
Starting point is 00:01:21 The thanatology episode with Cole Imperi changed my life, and now I'm trying to open my own business, helping people plan the last party of their life the way they want. Let's do end-of-life parties instead of funerals. Okay? Okay, cool. Anyways, give it a listen. Okay, goodbye. Dang, that's a good idea.
Starting point is 00:01:36 There is no better situation in which to be the life of the party than the end of your life. And I say, wear a tiara. We're all going to die. Okay, futurology. Very much a thing in a scholarly sense, and it's also called future studies. Sometimes futurism, although we're going to talk more with this week's guest about that. It's the study of possible, probable, and preferable futures. So it's a social science like history, but the gear shift goes forward rather than in
Starting point is 00:02:03 reverse. And this guest studied ecology, behavior, and evolution at UC San Diego, has a master's in journalism science, and has investigated topics like bionic, human exoskeletons, and sex robots, and tech progress, facial recognition, some uncanny valleys, and space travel for publications like the BBC, and Scientific American, Vice, Vox, The Atlantic, Motherboard, and more. And she's also produced, hosted, and edited over 100 episodes of the very highly lauded podcast Flash Forward, which is just pure auditory futurology.
Starting point is 00:02:43 And in each episode, she looks at possible future scenarios, and then she talks to these experts about the trajectory that we squishy humans may take just marching toward tomorrow. So she's also just a badass advocate and a creative soul. I am a fan lady. She was leading a seminar at Zycom Camp in November, and so I nabbed her for an hour to ask all about her full-time job studying and examining and forecasting the future. And we talked about everything from crotches to the Jetsons, what scares her the most, what gives her hope, the many types of technology that she has buried in her own body, why some
Starting point is 00:03:20 futurists don't want things to change, and if your phone is spying on you, and also why amid all of this technology and chaos she considers herself an optimist. So cuddle up in your space blanket and have your cyborg butler brew you a goblet of the good stuff, bolster yourself for the friendliness and the forecasting of Flash Forward host and professional literal futurologist, the Rose Evelyn. Now, you are a futurologist? I am a futurologist. What does that mean?
Starting point is 00:04:11 So futurologist is not like a super common term, but it is the one that I like to use for what I do because I kind of sit in between a bunch of different things that have other terms. So futurist is something that people probably have heard of. Futurists generally are people who are working in industry, so it's sort of like a vocation. There are degrees you can get in strategic foresight to become a futurist, which is very cool. Strategic foresight.
Starting point is 00:04:35 Yeah. There are like various programs that have these, or schools that have these programs, and those people tend to kind of work as consultants. They'll go into a company and be like, okay, let me help you project out 5, 10, 15 years and kind of think about the future of Coca-Cola or Nike or whatever it is. There's like professional groups for those futurists, and that's one bucket, and then there's this sort of like science fiction writers bucket, and then you have them sort of consider themselves futurists, and there's overlap between all of these.
Starting point is 00:05:02 There are sci-fi writers who are also futurists, but those folks are like imagining fictional futures and thinking about the future in that way, and then there are sort of like critical future studies people or like future studies people, and they're on the academic side of like, what do we talk about when we talk about futurism? And I kind of like straddle all of those. And so I think of myself as someone who like sort of studies futurists and someone who like studies science fiction writers and someone who like thinks about these like academic fields.
Starting point is 00:05:25 And so I like futurologists because it kind of like, is that nice umbrella term? And also people laugh when I say it, so that's all I can ask for. How long have you been using it? About two years now, I would say I've been using it, or I don't mind if people call me futurist, but I do like futurologist better. I like it better too. I should thought you would. Of course.
Starting point is 00:05:41 I mean, here we are. So yeah, if you were not a futurologist, would we be sitting on this couch today? I would have invented the word for you if I could, because I've been wanting to talk to you for so long about what you do and about how you got kind of captivated by it because you have a journalism background. Yeah. And so when did you start studying journalism? When did you know that you wanted to tell stories? Yeah. So I thought I would be a scientist for a really long time.
Starting point is 00:06:07 When I was a kid, I dressed up as Jane Goodall for like many years in a row. It's Halloween, Halloween, like many years in a row enough for people to be like, this is weird, this isn't cute anymore, you know, like and then and then I learned that like there's Jacuzzo and there was this underwater world. So I was like, great, I'm going to be the Django on the sea. This is great. I have my career plan. Like I'm ready. I made my dad get me scuba certification when I was 12, which is the youngest age allowed to do it.
Starting point is 00:06:30 I got certified through Naoi, which was run by this like X Marine. And I was like this scrawny 12 year old among like former Navy SEALs to getting this training. So it was like extremely brutal, but I was like committed to it. So I was like, I'm going to be a scuba diver. I'm going to like study the ocean. This is going to be my thing. I went to college in San Diego. I worked at Scripps Institute of Oceanography and I was like, this is great. I'm going to be the Jacuzzo of the ocean. Excellent.
Starting point is 00:06:53 Turns out I love science and actually doing science is not the thing I'm the best at. And so I had a PI who I worked for who was great. And was like, what about this thing called science journalism, which was like, like I never occurred to me that that was like a job someone could have. You can do that. And so that was really it. Like I had no journalism training. I had no journalism background. I applied to this program at NYU.
Starting point is 00:07:18 That's a master's program in science journalism specifically. Oh, wow. Which is like there's not that many programs like that. And I did not get in because my I was not a very good student in school. And like my application was like all over the place. I had written this like speculative fiction thing about like a researcher who got obsessed with this squid and then like was like traveling the world. They're like, put that in my application for journalism school, which is like makes no sense. I would it was like all over the place.
Starting point is 00:07:43 And so the program coordinator who I'm now close friends with, I didn't get in. I was on the wait list and he I tease him all the time about it because but he said he was like, yeah, your application was bizarre. I was like, who is this person? And by chance and it almost never happens at that program. Somebody decided not to go and so I got to go. So it was like, you know, luck of the universe. And so I got to go to this program and that's really where I learned everything.
Starting point is 00:08:04 I didn't know anything about journalism. And so I got thrown in with all these people who had like worked at their student newspapers and like had wanted to be journalists since they were kids. And I was like, what is a headline? Like, I don't know what this means. So I like really was a crash course and it was great. And it was I'm very lucky to have gone to that program. And that's kind of like how I got into journalism.
Starting point is 00:08:20 And I thought I would write about ocean. I thought I'd write about like environmental science. I just happened to start writing about prosthetics, actually, as like like the first couple of stories I wrote that were published were about the first big feature I wrote was about whether Oscar Pistorius has an unfair advantage as a person with like bionic legs. And this is like a huge scientific debate. And I wrote about for Scientific American and then sort of like from there
Starting point is 00:08:42 started writing more about like bionics and that kind of like led me into body hacking and like this sort of like weird world of futurism. And at the time, everyone was covering it so badly that I was like, they're like, someone has to do a better job of doing this. And so that's kind of like how I wound up starting to do it. What is a bad way of covering it? A lot of just like breathless coverage of technology companies being like, wow, look at this amazing thing that's going to like be on our shelves in a year
Starting point is 00:09:08 and solve all our problems. And like, that's not how anything works, you know, no critical coverage. Now it's much better. But a lot of technology coverage at the time was literally just like the latest iPhone. How does it work? You know, and there was nothing that was like analyzing like, should you be giving Apple your data? Like, you know, and none of that stuff was really happening.
Starting point is 00:09:24 And it was like early facial recognition. And I remember being like, are we, we like should probably, I don't know, talk about this and no one was talking about it. So it was really honestly like easy to kind of be a person who could make a name as someone who was like an interesting person in that field, because there were so few people talking about that stuff in the journalism world, in academics and other places people were, but in sort of like technology journalism about the future.
Starting point is 00:09:45 I mean, it was like pretty, pretty like slim pickings. How often do people come to you to ask you like, should I get an Alexa? Should I do it all the time, all the time, all the time? And I love it. Actually, I'm really glad that people ask because it's it is kind of one of my bug bears where like, I, I know it's convenient, but I'm just like, it's so it's actually really hard. I think one of my jobs, honestly, is to help people understand
Starting point is 00:10:08 like what the risks actually are. Because sort of like, I think this is less true of climate change now. I think people are more aware that climate change is important and a thing. I think a lot of people with privacy and surveillance, they're like, like Google already has all my information. And like what I didn't, I'm not a serial killer. Like, why do I care if they're scanning my face? It's really hard to kind of like conceptualize the risk because it
Starting point is 00:10:28 sort of feels nebulous. It sort of feels like, like, whatever. And it's convenient. I can ask Alexa, like what the temperature is or whatever. The most common question I get actually is, is my phone listening to me? Oh, yes, it's not. What? No, come on.
Starting point is 00:10:41 It's, it's mostly not. So there have been tons of studies that academics have done. And there are some really shady apps that you can install on your phone that might be doing this, but like for the vast majority of people, it's just that we're like really predictable as humans and like figuring out what you want to buy is actually like really easy based on all the other data that you've given Google already. But yeah.
Starting point is 00:11:00 And I think also it's a confirmation bias thing where it's like, you only notice the times when, you know, you were just talking to somebody about like some product and then all of a sudden it shows up. You don't notice all the times when that doesn't happen. So that's the most common question I get is my phone listening to me. Well, when did you decide to take your futurism journalism and futurology career and make it into podcast format? At what point?
Starting point is 00:11:24 Because you're killing the game. Thank you. Um, and how, when did you decide to do flash forward? Yeah. So I had worked in podcasts for a bit. I worked at radio lab. I helped the New York Times launch their now no longer science podcast. So I knew I was like really interested in audio and radio and these things.
Starting point is 00:11:40 And actually like an editor, Annali Nuitz, who's an amazing writer came to me because she was the founder of I09 and she came to me and was like, Hey, we really want to do a podcast. Do you have any ideas? And I was like, boy, do I have ideas for a podcast. And so we talked about it. And this was the one that we were both really excited about because flash word blends sort of like science fiction and these sort of like
Starting point is 00:11:59 audio dramas and then journalism. And that's what I09 did too. Okay. So if you haven't heard it yet, every episode of flash forward starts with this short radio play to set the scene, which kind of normalizes and humanizes what might be on the horizon for us all. But before we get ahead of ourselves, let's just take a step back in time. And see how our elders saw our todays.
Starting point is 00:12:21 Does that make sense? So when it comes to the history of futureology, what did futureology look like at different points? Did people even fathom or try to predict or draw the future before like the industrial revolution? That's a great question. And it's actually something that like academics debate a lot. So, I mean, obviously the prophecy has been around for a long time, right?
Starting point is 00:12:43 Like looking to the stars, trying to figure out what's going to happen. But, you know, this question of like, when did it become the case that we sort of assume that the future will be different, like really different than it is now? Because for a long time in human history, like the future was kind of you're like still did your thing, you were like in your cave or you were in your house or like you farmed or you did this thing. The idea that like your future, even within your lifetime, would be like radically different is not like that old.
Starting point is 00:13:08 And there are debates about when this happened. Some people point to the Victorian era when a lot of social norms actually started really changing. And you had people sort of questioning, you know, family structures, you had people questioning sort of like high society, you had all of that stuff. And that was sort of a gateway into being like, well, wait a minute, why couldn't things be totally different? Some people point to electricity as being like the thing where like the
Starting point is 00:13:29 literal libel moment where like, you know, things happen. Some people point to the industrial revolution where like all of a sudden all of industry changes, it sort of depends on who you ask. So, well, now it's pretty common place to point a tiny handheld computer at our face and use an image filter that changes us into a cat and then beam that to millions of people across the world, just for funsies. It wasn't until the mid 1800s that we even had flushing toilets or light switches life, man, it comes at you fast.
Starting point is 00:14:01 But yeah, it's not like people have had people talk about futurism and prophecy and religion and all that stuff. But this idea that, you know, we kind of have where like, you know, in a hundred years, like who even knows, you know, like there's that feeling. And that wasn't always the case. But like when exactly that started is actually up for debate. What do you think about Tomorrowland? When Disneyland opened in 1955, Tomorrowland, the world of the future,
Starting point is 00:14:27 see more science fiction than actual fact. In Disneyland. I love it. Oh, really? I do because like, I love nostalgic future stuff, right? Because you can kind of be like, oh, like remember when we thought this was going to happen or remember when we thought that. I think also the cool thing, one of the cool things about being a
Starting point is 00:14:41 futurologist is that like a lot of futurism, especially looking back on it, it says a lot more about the time that it was created than it says about the future, right? And this is anything with science fiction, right? Yes. In theory, we're talking about the future, but a lot of the time you're really kind of talking about now. So, you know, you could see this in the ways that the kitchens of the future
Starting point is 00:14:59 were presented in the 1950s, where you have all of these companies sort of realizing that like housewives are no longer complacent to be at home and be cooking all the time. They got jobs during the war, they don't really want to stop having jobs. They want to be able to kind of like have it all, quote unquote. And you see all of these different companies presenting these like kitchens of the future to sort of like, basically appease women and make it like, oh, no, no, no, no, you can still like do all the housework and stuff and like
Starting point is 00:15:26 not really, you don't need, you don't need to go in it. You could stay, you don't need to go anywhere, you know? And like, I think a lot of the time when I see those like sort of like retro future things, it's really fun because it's a reminder of like, oh, that's how we thought about the future. And that's how I thought about the present then. And we still do see that. I mean, in kitchen designs of today, you still see some of those ideas of like,
Starting point is 00:15:45 what, who is, who's depicted in those promotional images? It's always women still. And you're like, okay, we can like land a rover on Mars, but like, you can't imagine a man in the kitchen, you know? Oh my gosh, I get a real bug in my bonnet when I see so many house cleaning products just for women, just, just only women are cleaning toilets. It drives me a little crazy. There's a great book called More Work for Mother about how automation in homes
Starting point is 00:16:10 has always been marketed to women as like, oh, it's going to be a time saver. It's going to be a time saver in every single study shows that it just puts more work on women every time. Okay, quick aside. I look this up and there's a 1985 book by Ruth Schwartz Cohen called More Work for Mother, the ironies of household technology from the open hearth to the microwave. And it has a ton of data to support that the role of the homemaker has not in
Starting point is 00:16:34 fact gotten easier. And this was before people pinterested their kids lunch every day. And granted, it was written 35 years ago when people still smoked on airplanes. But every time nowadays you see a commercial for home janitorial products. Just keep an eye on what gender is usually doing the scrubbing. Also, side note, on Amazon, More Work for Mother, the book has mostly favorable responses, but there is one one star review, which I of course read. And this person who identifies themselves only as avid reader said,
Starting point is 00:17:07 More Work for Mother, give me a break. The book we need is Modern Marriage. What's in it for father? Apparently avid reader is often dissatisfied with their purchases because another Amazon reviewer posted the response. Every book you've reviewed received one star. Are you just really bad at picking books? I'll put the link up to this in case you want to read this bloodthirsty book
Starting point is 00:17:26 drama while you're like killing time on your phone, waiting for a robot to bring you a panini. Now, why do you think that at no point in history, people realize that we were all going to be staring at phones at some point? Oh, I think people did. I think people did. I think, you know, maybe not in the exact version of this, but people have been predicting sort of like video conferencing and video calling and sort of
Starting point is 00:17:49 like screen based interactions for a long time. I think that the exact form of these like little glass boxes in our pockets might be slightly different, but you can look at some of the World Fair stuff in like the early 1900s and you see people predicting video like basically FaceTime. And so like, you know, the exact details of what it looks like are different, but people have kind of thought about this a lot. I mean, as soon as you have things like yellow journalism, like the attention
Starting point is 00:18:15 economy that we talk about now, where it's like everyone is, you know, Facebook is competing for your eyeballs. They're not really competing for your money necessarily, because what they want is for you to look at stuff so that they can track what you're looking at and then sell that information to other people. Like that idea, I think, is actually relatively old and that kind of plays into the phone thing as well. So I think actually that kind of concept has been around for longer than we might expect.
Starting point is 00:18:37 OK, so quick aside, yellow journalism essentially means tabloid fodder, like big exaggerated headlines and sensationalized crime stories and just hot, hot celeb goss, a.k.a. pretty much the whole internet now. But the term yellow journalism arose during the late 1800s New York newspaper wars between Joseph Pulitzer and William Randolph Hearst. And these two rag magnets had a tug of war for the rights to publish this comic strip called Hogan's Alley, featuring a character called The Yellow Kid, a child from a rough part of the city, wearing a mustard colored nightgown
Starting point is 00:19:14 and depicted with two big front teeth and a shaved head from a lice infestation, according to its creator. He looks kind of like Dopey from The Seven Dwarfs. And The Yellow Kid was everywhere in the late 1800s. And these two rival newspapers competed to run Hogan's Alley in their dailies. And if you're like, wait, isn't Hogan's Alley the name of the FBI Tactical Training Center and also a Nintendo game? It is. And it's named after the same strip.
Starting point is 00:19:39 So now you can drop some ancient newspaper drama and comic trivia on your unsuspecting friends, or maybe you can break the ice with your FBI interrogator. You're welcome. Oh, speaking of cartoons, what about the Jetsons? What did they get right? The Jetsons. The Jetsons. I mean, there's it's so funny. The Jetsons is the one is the thing that people always talk about.
Starting point is 00:19:58 Like, oh, flying cars, robot butlers, blah, blah, blah. And like super regressive family dynamics, although you still have you have like you have your Jane Jetson goes to work, you know? And I was like, yay, that was a big deal at the time. You know, you have your robot butler who falls in love with another robot, which I love. It's one of my favorites. They have their food machine, which I think is really interesting because that's been a common thing in science fiction that we don't see.
Starting point is 00:20:21 Like, why don't they're obviously vending machines? But like, why aren't there like push a button and food comes out? Raw. And your eggs? Cold. I don't get it. When we first got married, you could punch out a breakfast like mother used to make. In the opening credits, a perky Jane Jetson is also shown taking George Jetson's wallet to go shopping in a floating Sky Mall. As so many of us do these days, gross.
Starting point is 00:20:46 Anyway, when you are looking at the future, how much do you think about yourself in those situations? I try to think about myself in those situations, but I also try to remember that like my experience is like singular and like I am a like cis white lady born into the United States and like all of those things. And so one of the big things I try to do in my work with Flash Forward and elsewhere is think about like, OK, but what if I was somebody else
Starting point is 00:21:10 and what if I like had less privilege than I have now? And how does this impact these people? So, for example, in the episode I did about CRISPR, sort of like gene editing of human babies and stuff like that, which I was working on and had basically done. And then the news broke about the Chinese scientist with the Chris rabies and I was like, no, I just finished this episode. I have to go back and redo stuff.
Starting point is 00:21:29 But that episode, most of the interviewees, most of the guests on that episode were disabled people and sort of about like, what is it like to hear all these scientists talk about eliminating you? Basically, like, like, how does that feel? Like, what is that like? What would it be like to be the last deaf person on earth? You know, and like, is that something we actually want? So I think I try really hard, actually, to think about how people who are not
Starting point is 00:21:48 like me might feel or find themselves in those futures. And I try to interview a lot of people on the show that are not like me. So I can kind of be like, how do you feel about this? Like, what do you think about this for the episode about body swapping? Everybody on that episode almost, except for one, is a trans person. And it's like, OK, how does this correlate with your experiences and your feelings about bodies and stuff like that? Because, like, I don't know what that's like.
Starting point is 00:22:10 So, yeah, like, thinking through, like, who are the people who are going to be most impacted by this? And like, it's probably not me. And like trying to find those people is really important, I think. How much anxiety do you feel about the future? I feel I feel like I feel like a healthy amount of anxiety about the future, which is to say I feel anxious about the future. But I also I think that you can't get buried in it.
Starting point is 00:22:33 Because otherwise, like, what do you do? There's really hard to do anything. A lot of the future is scary, but also the future isn't written yet. And I think that's like something that we need to remember. Like, yes, certain things are in motion. And yes, like, all of us are kind of like one person. But in fact, like, the future has not happened yet. Like, there are things that we can do and like things that you can try and also
Starting point is 00:22:52 ways to kind of mentally think about the future. I've been thinking a lot about there's a concept in psychology called mental time travel, which is basically that you can kind of imagine yourself in future situations and certain people with certain forms of amnesia actually cannot do that, which is like kind of hard to even fathom. They talk about being marooned in the moment, which is like really terrifying sounding, but also they know that people with depression actually have a really hard time with this where it's like, if you're depressed,
Starting point is 00:23:14 it's really hard for you to imagine yourself in the future. And but they also find in the studies that if you can and when you do imagine yourself in the future, specifically imagine what you're going to do, imagine your next steps, people are happier. People tend to be more productive. People tend to like just feel better because they can actually be like, OK, here's my plan. Here's what I'm going to do.
Starting point is 00:23:32 So I always say that when people are like, how do I cope with this like horrible anxiety that I'm feeling about the future, which is totally normal. Given like what's happening in the world is like actually thinking through like imagining yourself and what you can do can be really like calming because it's specific and it's you. It's not just like, oh, my God, like, I don't know, like climate changes in the hands of nine companies, you know, which is like really terrifying. Unless you're one of those companies, I guess, in which case, it's just
Starting point is 00:23:58 these really yes, money, money, money, hardy every day. Exactly. It's the world burns in literally what do you think based on all the episodes you've done, like 100, what do you think is going to change the most in like the next decade? I think that and this is like an unpopular answer, given that I live in Berkeley, which is like in tech central kind of area. But I think it's social stuff that's going to change the most. I think right now we're seeing a really awesome moment with like gender, for
Starting point is 00:24:27 example, where like people are sort of like finally realizing in the more popular culture that like gender is not binary and like that people have these different options and like it's more acceptable to kind of talk about those questions. I always say that like trans people are the original body hackers, because like they actually do all the stuff that the body hackers talk about doing and are doing like really amazing stuff. I mean, like you're seeing people question like the value of capitalism, which like was probably was unfathomable to people 20 years ago to even say,
Starting point is 00:24:56 like, I mean, people did say this, but to have like it be a more common thing you see on Twitter, being like capitalism, the worst. Like you did not see that for a long time. So I think a lot of it is going to be more social change than technological change. I think people assume when we talk about the future that we're always talking about like flying cars and jetpacks and like these tech things. But I think that a lot of this, especially coming out of these climate change conversations, it's going to be people asking like, why have we made
Starting point is 00:25:19 the decisions we've made culturally and sort of like sociopolitically? And like, are they actually the right decisions and can we change the way that this whole thing works? And like, should we have prisons? Like that's a question that I think is getting a lot more attention now that like you didn't see 20 years ago as much. So yeah, I think those are the questions that people are going to be asking more of as opposed to like, you know, which app should I get?
Starting point is 00:25:39 Or like, you know, is there going to be the next iPhone? Which is the thing people love to ask. And it's like, who cares? It's like, not important, you know, how do you reconcile some of the regressions that seems that are happening socially? Like we're in a moment that is so progressive in some areas and so not in others. What the fuck's going on? Yeah, I mean, I think that like, different people have said this where, you know,
Starting point is 00:26:02 this is common, right? You make progress and then you move back. I should say that I do not actually believe that like the arc of justice and the arc of history bends towards justice, which is a quote from Dr. Martin Luther King that I'm probably butchering. When our nights become darker than a thousand midnight, let us remember that is a creative force in this universe, working to pull down the gigantic mountains of evil, the power that is able to make a way out of no way and
Starting point is 00:26:30 transform dark yesterdays into bright tomorrow. Let us realize that the arc of the moral universe is long, but it bends towards justice. But I do think that like you're seeing, you know, you are seeing some huge regressions that you're seeing like the rise of populism and sort of the rise of fascism in countries like the United States. And and that's hugely problematic. You're also seeing people kind of like come show up to say like no to that. So I think that like, yes, we're seeing a lot of bad, bad things, but we're also,
Starting point is 00:27:01 I think, seeing a lot of people sort of like realize that they need to actually show up and like do something to stop those things from happening, whether that's about like populism or just like climate change, which is the thing that a lot of people ask me about and are like, well, why should I have kids if we're all going to be on fire underwater in the next like three years? Which is not true. But yeah, I think like that's the thing I'm sort of muddling this answer. But like to to remember that like you can do something and we can do something
Starting point is 00:27:25 and it's not like predetermined. Like they haven't won yet, right? It's like, you know, well, I have another question is why should I have kids if the world is going to be a fire on the water? I mean, I cannot tell anyone to or to not have kids. OK, at the same time, like I think that it's a really interesting question. And there is a whole movement of people, right? Who believe that like we just shouldn't have kids and we shouldn't put them
Starting point is 00:27:52 into the situation. OK, side note, as discussed in the ex-catology episode from last November about the apocalypse, which is really more relevant than ever this week, the choice to nope babies is called voluntary childlessness or a child free lifestyle. And a more scholastic term is anti-natalism, which includes the philosophy that you can't get the consent of a child to exist. Therefore, it's immoral to procreate. So do you have a child who, when told to do a chore, has screamed,
Starting point is 00:28:23 whatever, I didn't ask to be born. Well, congratulations. You have engaged in philosophical discourse about anti-natalism. Oh, and if your child free, but your relatives insist you should pop out some shorties, here's an idea. You can present them with the data that one American child has the same carbon impact per year as 75 round trip transatlantic flights or that it would take 150 meat eaters going vegan to offset a kiddo.
Starting point is 00:28:51 But on the flip side, babies are cute and their heads smell like powder and milk and they turn into adults that keep hospitals and government and the world running. So to all the smug child free folks who have dogs instead. And I'm talking to myself right now. Bad news or meat hungry, hairy children have substantial carbon paw prints, too. So what to do? I don't know. Go with your heart, man.
Starting point is 00:29:14 Rose says that having kids or dogs is just a deeply personal decision. So cut bangs, text your crush and decide on your own terms if you want to have babies with them and if the babies will have bangs. If you really want to have children and you are not having them because you think climate change is scary, then you should do something about climate change. You should like actually get off your butt and do something about climate change. And that's hard to do, right? Like as we've said, like, you know, climate change is largely in the hands
Starting point is 00:29:42 of like very huge companies, but like, I mean, there are protests you can go to. Like there are local politics you can get involved in. There are like local issues you can work on. I mean, I can't tell people to or to not have children. I do not have children myself. Or do or am I planning to myself? Right. I've been worried about overpopulation. Don't worry about it. OK.
Starting point is 00:30:03 Really, because I've been worried about it since like high school. Yeah. So this is a huge topic in the 1980s, which is maybe when you and I are both in high school. Yes. And I guess in the 90s. And it was largely a lot of it came out of, honestly, just racism of like, oh, all of these countries like India and China and places in Africa are suddenly have all these people and they're going to like want rights and they're not going to want us to just tell them what to do. And like, oh, suddenly we have an overpopulation issue.
Starting point is 00:30:33 And I think that like a lot of that there's a lot of really good work done by researchers and scholars that basically say that like that is like not the problem. The problem is racism, you know, or the problem is like, you know, distribution of resources among the people on earth. There are a lot of people on earth. But if you want to talk about like, what's the problem with climate change? Overpopulation is pretty low on the actual list. Like if we all stop having children right now, the climate will still warm, you know,
Starting point is 00:30:59 like that's not, you know, necessarily what's happening. I mean, so yeah, I think that like a lot of there's been a lot of really interesting stuff. Mother Jones had a big series about this, I think a couple of years ago about like trying to push back on this narrative of overpopulation and sort of reveal that a lot of people who first posited this as like the problem were basically just racist. Wow. Oh, man, because I remember looking at population density curves of like deer and mice in certain populations and looking at the human population density curve.
Starting point is 00:31:25 And I was like, well, something's going to give. And it's already leveling off too, right? Like we are not still in the like highest highest upswing. Like, you know, humans are having less children generally, like as I think a global trend, I hear millennials are having, having fewer. Yeah, I'm getting dogs instead. Is what I hear. That's what I got. So your dog, by the way. Thank you.
Starting point is 00:31:44 OK, quick aside. Population curves can make a shape like a J or really more like a backwards L, which is when the head count for a species kind of lobes along steady for a long time and then has a huge upswing and goes up sharply in a really short matter of time. So, for instance, in the year 1900, Earth's population was 1.6 billion, but we just hit 7.7 billion in 2019, partly oddly because of better agriculture and just figuring out how to make ammonia from airborne nitrogen. How weird is that?
Starting point is 00:32:17 Anyway, there's a bunch of us. And when a population hits what's called a carrying capacity, it tends to level off rather than continuing to shoot up. So think of that J kind of taking a right in forming an S shaped curve. It doesn't necessarily plunge downward into extinction. It's not like a good party that is so lit, so off the chain, as your children say, that it suddenly gets shut down by the cops. In the great mysterious kegger that is life,
Starting point is 00:32:46 we're so worried about the apocalypse of a squad car. We forget that some parties just kind of peter out because they suck. When it comes to topics to explore in the future, how do you decide which ones are worthy of exploring? Yeah, that's a great question. I feel like for flash forward, I always want to do something that's like interesting and surprising that like people haven't heard before. So there's certain topics that I just like part of me is like,
Starting point is 00:33:10 I just don't know what I'm going to say about this, like self driving cars. I feel like they've been just like, everyone writes about them. People know I've always struggled with like, what is the like version of this that I can do that like no one thought of before people aren't talking about? There was one that I did that people had asked people. One of the most requested episodes always is living forever, which is the future that I'm not particularly interested in because really just the preview of the rich, basically.
Starting point is 00:33:31 And it's like, oh, living forever is great. If you are making compound interest, you know, like wonderful. And so it's just like a thing that like really, really rich white guys are obsessed with. And for me, I'm like, I just feel like there's so many other things that are more interesting, but so it was requested so often. And finally, I sort of figured out like how to do it. And I did it as an episode about what would happen if we had that technology and applied it to the criminal justice system.
Starting point is 00:33:52 And prisoners could be sentenced to like 300 years in prison. And like, how does that, what does that look like? You know, we already have issues when people get out of prison and they like don't have never seen a cell phone before. Right. And like, what does that look like times, you know, 300. So that was kind of my way into that episode of like, OK, what's the interesting thing that I can say about this that people aren't talking about?
Starting point is 00:34:09 What about the way that we'll look at animal rights in the future? It's one of my favorite topics. I mean, it's something that is just something that to do. And there's a lot of some things that you do in the past that are egregious. Totally. What is going to happen with that? What's your stance? I'm so fascinated by this because I think, you know, the more you look into animals and their intelligence,
Starting point is 00:34:33 everything we ever thought about them and how smart they are has been wrong because they're smarter, right? They're just like, we've consistently underestimated non-human animals just over and over and over again. And so, yeah, this question of like, at what point do we start to decide? Like, actually, like we can't do it. We should not do this. I mean, there are people who believe we shouldn't have pets. There are people that believe that zoos are unethical.
Starting point is 00:34:54 And then there's this question of like, what is the point of some of these things? And like some pets, right? Like dogs and cats evolved with humans in a way that like, I don't. I mean, most people's dogs would not survive out in the wild. Some cats would, not all cats. But so like this question of like, what are what are our obligations to these animals and what, you know, what should we be doing? Like, should we be training dolphins to do tricks in zoos?
Starting point is 00:35:18 Like, you know, it's hard to say. I'm obsessed with this question. And I should say, like, I don't know what I think still about it. It's really hard. Like, you know, on the one hand, you know, it's not something, you know, I'm not totally comfortable coming in, particularly as like, you know, a Western white lady being like, you have to stop eating meat, you know, where it's like, OK, like that's like a hard proposition to make.
Starting point is 00:35:40 At the same time, like the more you learn about what these animals are capable of, the more it's like, oh, should we really be doing this? Like, is this really OK? You know, what do we owe to these creatures that we've like destroyed their habitats, like, you know, all of this stuff. So I'm I'm I'm like obsessed with this question of like the future of animal rights. And if we could actually communicate with them, how would that change things? Or like, if we could understand what they were saying or they could understand
Starting point is 00:36:03 what we were saying better, like how would that change things? And I've done a bunch of episodes about that and sort of like this question of like, what, where do we draw the line? I think it's really hard. Can we talk to dogs? Certain places have have passed laws. So last year, India passed a law, I think it was actually just a region of India passed a law that basically said that human animals had human rights,
Starting point is 00:36:23 basically. And then when you get into it, like, what does that actually mean? Because like, if that's the case, then like, you can't eat meat, right? Because that'd be murdering someone. But the details of the law are actually kind of confusing and it's unclear if it's actually going to change anything. But yeah, these questions of like, what should we do about these like other creatures that we share the planet with that like are actually like way smarter than we thought they were?
Starting point is 00:36:43 Okay, real quick, I look this up and recently the Indian High Court of Punjab and Haryana has determined that all animals are persons and that humans and Haryana are de facto parents to all the animals, giving animals rights in the courts and the eyes of the law. Also in India, the Ganges and the Yamada Rivers, plus all of their streams and all of their little tributaries are also considered persons. And side note, I'm the youngest of three daughters and by default, I was usually just a dirty faced rug rat tagging along with older siblings and their friends.
Starting point is 00:37:18 And my mom, your grandpa, Fancy Nancy, taught me to respond to any bullying by saying, I'm a person with rights and feelings. And I hope that when you're feeling down, whether you're a turtle or a stream or a hairless ape, and remember that you're a person with rights and feelings. So take that into the future. What about what are some other things that you are looking forward to in the future? What are you most stoked about? Oh, most stoked about I'm like, this makes me feel kind of old, but I'm
Starting point is 00:37:50 stoked about the youth and their like their obsession with climate change. Like, honestly, I think, like, I feel like the last couple of years, there's been such a huge change in the way that younger people are talking about climate change. I did a series recently on Flash Forward, where I had a bunch of teen actors come in and do some stuff. And then I was asking them, you know, how do you feel about privacy? And they were basically like, I don't care about privacy. I care about climate change.
Starting point is 00:38:15 Like, literally, they were like, privacy is not going to end the human race. And I was like, whoa, and they were like very intense about it. And just to see that like them be like, no, like this is the thing that we care about, like we are out in the streets, like we are doing this. I think like that's honestly quite exciting to me as like someone who has watched people sort of be apathetic about climate change for so long. So I think like that is really exciting. I'm excited by like the advances in like health services for trans people.
Starting point is 00:38:41 And like the way that like that has advanced the ways that we talk about gender is like cooler now, I think in general. That's great. I'm generally don't get excited about like technology stuff because like so much of it, it's like, is it really going to happen? First of all, and also like, is it only going to be available to like super rich people? Question? Probably, you know? I think those are the kids these days, kids are doing such good things. So great.
Starting point is 00:39:05 Sci-Fi movies. Oh, do you avoid them because it's your work? Or do you see every single sci-fi movie set in the future? I feel like I'm in between. I do not avoid them. I like them, but I also like, I just sometimes don't get to have fine time to watch them. It's not like an on purpose thing. But I watch a lot of sci-fi movies. I am not someone who like gets caught up in like, that's not realistic.
Starting point is 00:39:29 Or like warp drive doesn't work like that. Like I just like don't care. I'm like totally happy to suspend disbelief and like have a good time in a movie. I'm much more like get, I get much more annoyed if like the character development is bad, you know? And like the women just like are only there to suffer or whatever it is. Like that's what I care about. I don't care if like whatever technology thing doesn't make sense.
Starting point is 00:39:47 Like that doesn't bother me. Yeah. Do you think that there are any sci-fi movies that predicted this time in history, right? I mean, 2001 got a lot of stuff right about the way we were going to communicate with one another. Like the sort of way that space has no sound. Just kind of like, I mean, that movie is like really interesting.
Starting point is 00:40:06 The movie that I always talk about that like no one has seen is called Born in Flames. And it's this old movie from the, I want to say early 90s. And it has Catherine Bigelow in it, who was Zero Dark Thirty. Catherine Bigelow. She's an actress in it. It's very weird. It's this indie movie.
Starting point is 00:40:32 It's so interesting. And like some of it has not aged well, because it's about like these dueling feminist radio stations in New York City. So Rose explained that the plot involves a futuristic social estate, but there's a lot of police brutality, feminism that is not intersectional. And she says it's eerily prescient. Also, good news, you can stream Born in Flames on Vimeo for three bucks.
Starting point is 00:40:54 And I'll put a link on my site at alleyward.com, slash allergies, slash futurology. And I asked Rose if at the heart of futurology is just wanting to believe that things will be better than they are today. And she said that one reason she uses the term futurologist for herself as someone who studies the future is because a lot of professional futurists are people hired by big Fortune 500 companies who have an interest in maintaining the status quo, because it's how those corporations make their money.
Starting point is 00:41:24 And ethically, she doesn't feel aligned with that. So the future, not everyone has the shiniest, most gleaming intentions. Now, on that note, why do you think our visions of the future involve so much metal, so much shiny silver metal, shiny silver metal? Yeah, I mean, they're like the techno utopianism is like, it's so alluring. We love like the robots with their like beautiful, shiny, shiny. I mean, in 1909 ish, the Italian art movement called Futurism. The main guy there published a manifesto about Futurism.
Starting point is 00:41:58 It's not the same Futurism what we're talking about, but it I think is deeply connected because basically his manifesto was like, I mean, they were fascists. His manifesto was like, we don't care about what's happened in the past. Like we only care about the future. We like speed, we like youth, we like disruption. It's like kind of eerie how much it sounds like the way that people talk about Silicon Valley and the Futurist, the Italian Futurists art movement. They were really into like shiny, gold, smooth, like that kind of aesthetic.
Starting point is 00:42:25 And you see it still in some of this future imagining. So I think some of it has to do with that. I'm not an art historian, but like I feel like there's a shared aesthetic among some of this where it's like this beautiful, perfect, shiny chrome. And that is like not the future I'm interested in. Do you think that that in the future will just kind of lose control over our own lives more and more? Or do you think that we have more control of our voices
Starting point is 00:42:53 because social media has democratized things like, where does the control lie? Yeah, I think that we are ceding a lot of control to algorithms right now that kind of tell us what we want, right? And there's some really interesting work right now going on about the ways that suggestion algorithms kind of like create a monoculture. So when you go into Spotify and it recommends what you should listen to, we're all kind of getting recommended the same stuff. You're not you're not finding these new interesting things.
Starting point is 00:43:21 You're not kind of like stumbling upon a book in a bookstore because Amazon is telling you what you should read based on what you've read before. And those recommendations can be really useful, but I think a lot of people really worry that we're kind of like collapsing everything down into like one taste like that you wouldn't have. You know, you have your like people don't have their own aesthetic anymore. They don't have their taste. They have whatever tastes sort of fits within the Spotify algorithm
Starting point is 00:43:42 or fits within the Amazon algorithm. And I think, you know, a lot of people have written about like the attention economy, this idea that like we are all kind of our eyeballs are sort of like what all of these apps want. And we're all kind of looking at the same couple of apps all the time. You know, many of which are owned by Facebook, WhatsApp, Instagram, Facebook, all of that. And so I think like that's the thing that I worry about is that like
Starting point is 00:44:03 all of our communication and all the cool creative stuff that people do in group text, right? Like when you think about like the jokes you make and like the gifts and whatever and all that stuff that is constrained by the the technology that is constrained by what they allow you to like you only get a certain number of like reactions on the like I message thing, right? So it's sort of like collapsing down what we can do and what we can say in the ways we can communicate.
Starting point is 00:44:24 And that's why when people complain about there being too many emoji, I'm like, no, unlimited emoji, like give me all the emoji, right? Because like you want like this ability to be able to like pick and choose emoticons and emoji that are like not just the like six smiley faces or the like the four reactions that Facebook gives you and because which it kind of collapses to emotion and reaction and the way that we talk to each other, the ways that we are allowed to kind of react to each other and like interact with each other are constrained by these design choices
Starting point is 00:44:49 that I think like we aren't interrogating enough. That's such a good point. I mean, someone also once said to me, forget where this quote comes from, but if the app is free, then you are the product. You're the product. Yeah. Yeah, I don't know who said that. But it's like, yeah, yeah, yeah. Shilling and like so obvious.
Starting point is 00:45:07 PS, I needed to know who said that. And I did a little digging and nine years ago on the website Metafilter, the user Blue Beetle, a.k.a. a guy named Andrew Lewis, wrote this regarding the internet quote, if you're not paying for it, you're not the customer. You are the product being sold. And he later said that he wasn't quoting anyone directly, but merely restating a fairly common sentiment.
Starting point is 00:45:28 And for some reason, the internet just seems to have picked it up and ran with it. If you're like, damn, that should go on a t-shirt. Don't worry, Andrew Lewis has set up a cafe press store and you could get a t-shirt bearing that wisdom for 1495, which honestly seems pretty fair. And since you'd be paying for it, you're not the product. Totally. Yeah, right. So obvious. Yeah, your data, like all that stuff.
Starting point is 00:45:47 I mean, it's it's definitely something I think about. And even if the app isn't free, you still might be the product. I mean, like Alexa is not free, but you are absolutely still the product, right? So like it's definitely something that I think about a lot. And it's hard also, I was like, I want to talk to my friends, and we all use these same apps and whatever. And like, I'm not going to be able to convince all my friends to use an encrypted messaging app that I want to use, you know, like I'm that friend.
Starting point is 00:46:10 I was like, if we can use Signal, that would be great. And everyone's like, no, I love that about you. Like I want to be like on your plot of land during the apocalypse. Trust me, I literally look at land. I literally like I could build a compound. Yeah. What do you think about Doomsday Preppers? Oh, I mean, they're funny. Like it's silly.
Starting point is 00:46:29 I enjoy it because it's like, why not think about it? But yeah, I mean, like a lot of it is like really rich people buying land in New Zealand for for that. And they'll be fine. But like, I don't know. I think, I mean, you've talked about this on the Disasterology episode about like we're actually more prosocial during these kinds of things than we think we're going to be.
Starting point is 00:46:44 So like I'm ready to fight with everybody else. Oh, that's so sweet. Just get solar power, dehydrated potatoes. Yeah, yeah. Oh, that was so... you know what message just popped up on my computer? Off work in 10 minutes. Because I've tried to put a thing for myself and I'm off work at seven o'clock. There you go. So what happens is every night at seven o'clock, I say, shut the fuck up.
Starting point is 00:47:04 Funny joke. I was like, not today. I'm going to ask you questions from patrons. Yeah, I did. I did peek. I did peek. Good for you. This is because you're prepared for the future. And I, if I can't prepare... It's funny, I was talking to a psychologist recently about this mental time travel
Starting point is 00:47:18 thing. I'm really interested in it. And I think that it actually is like very meta, like what I want to talk about when I talk about the future, where I'm sort of like, no, no, like if we think collectively about the future, it actually makes us more prepared, makes us happier, makes us less freaked out. But there is a fine line, right? Where if you think too much about the future, you're like, oh, God. And you, like, imagine every possible terrible scenario, which I do sometimes.
Starting point is 00:47:38 So yeah, it's a fine line. Would you say that in general, you're a prepared person? Like, do you prepare for the future? Well, like, would you pack well for trips? Do you pay your your taxes well before they're due quarterly? Things like that? Yes and no, I find that I'm very prepared if I think it's important. But if I don't, then I'm not.
Starting point is 00:47:56 And so like things that fall into my like bucket, like are probably things that are in fact important, like paying taxes. No, I do pay my taxes. Do not audit me, please. But yeah, there are certain things I'm good at and certain things that I like at some point, I just have to be like, it's going to be fine. Like, but packing, I'm very particular about. Oh, yes.
Starting point is 00:48:12 When there is someone who is very smart and knowledgeable and prepared, I like to just parasite on to them and ask them advice about everything, which is what I do with you a lot. Like, Rose, what are we doing? What's happening? Yeah. OK, so patron questions. But before we dive into your questions, you patrons, a few words from sponsors who make it possible for us to donate to a charity of the ologist's choosing.
Starting point is 00:48:34 And this week, Rose chose the Gender Reveal Grant, which she says is a grant program run by the amazing Molly Woodstock through their podcast Gender Reveal, which is a super good, funny and informational podcast about gender. Gender Reveal describes itself as a podcast for non-binary folks, for people who don't know what non-binary means, and everyone in between. Rose is a huge proponent of direct giving and the Gender Reveal Grant goes directly to trans artists, activists and educators around the world doing rad shit. In her words.
Starting point is 00:49:03 So more info is up at genderpodcast.com slash grant. And now you may hear some words about sponsors of Ologies who make that possible. OK, on to your questions. Sabine Dushazo wants to know, how do we remain optimistic about the future when it feels like things are going badly on a global scale? So just how screwed are we? Yeah, that's a great question. OK. And I think, like, first of all,
Starting point is 00:49:28 that's a real... like, honestly, probably actually the most common question I get. I said earlier that it was about like phones or whatever. I think that's probably the most common question I get, which is, like... and I feel that, and that's totally normal. And if you don't feel that, then, like, you aren't paying attention, right? So, like, you're a normal, regular person. But I think a lot of it does come down to this thing that I've been thinking a lot about, which is if we can stop, like, pause for a second and actually think
Starting point is 00:49:52 about specific things we can do, like, get involved with community, local community stuff, and, you know, we joked about preppers earlier. But I do think actually that, like, in our climate change future, in our future where things might get kind of dicey, if that's where we're going, like, being a part of your local community is actually the best thing you can do because those are the people that you're going to want to rely on. And whether that's because there's like an extreme weather event in your area because of climate change, or whether that's because, like, the global economy
Starting point is 00:50:16 collapses, like, whatever it is that you're worried about, like, those are the people that you're going to rely on. So I think the first step I would say is, like, small local community stuff can make you feel, like, plugged in and connected to people who genuinely care about the future. And that makes a huge difference if you feel like you're not going it alone. We are a social species, as much as I joke about hating people. Like, we all, like, want to be with other people.
Starting point is 00:50:37 And so the biggest thing I say, and, like, you are not going to necessarily, like, fix climate change by getting involved with a local community group, but you will feel better about it. And also, you will be able to move us, you can move the needle. I think, like, local community politics is so overlooked and so important. And yeah, just, like, getting in touch with people and, like, talking about it and figuring out, like, what are specific things that we can do is really important.
Starting point is 00:51:01 The other thing I'll say is, like, I've sort of been going on and on about this mental time travel thing. But I think it's the studies I'm reading about it are really interesting because they do show students who mentally visualize doing well on an exam tend to actually do better on the exam. And obviously, like, to a point, right? Like, this is not, like, it's not magic, you know, like the secret. And, like, but you can kind of, like, if you think about it, like, OK,
Starting point is 00:51:23 what do I want the future to look like? And actually being specific about what that is, like, what do you want to see? And then thinking about, like, OK, if that's what I want, like, how do I get that? Like, where do I go? What do I do? Sort of, like, really visualizing what you want out of the future in specifics where it's like, if you have kids, like, I want my kid to have a world where blank happens and then you can kind of work back from there and figure out,
Starting point is 00:51:43 like, what are the organizations that are getting that are trying to work towards that? Who are the people that I can, even if you can't donate your time, donate your money or, like, do something where you kind of spread the word about it? Like, that little those little things, I think, actually make you feel a little bit like you're in more control and can actually kind of try to push towards the future. So that's what I would say to that. That's some great advice. OK, side note. Humans obviously mentally time travel and there's great debate among ethologists,
Starting point is 00:52:07 folks who study animal behavior, as to whether other apes and ravens and crows and jays can imagine their futures. And for more on bird brains, listen to the Halloween 2018 Corvid Thanatology episode about crow funerals with expert Dr. Kaeli Swift, Dead Birds Man. It's a wild world, I promise. Also, fun tidbit. Another word for mental time travel is chronesthesia.
Starting point is 00:52:34 So if your boss is not the Googling type and you envision a day off work eating corn dogs on a beach, just call in sick with chronesthesia. You won't even be a liar. You'll be a hero, my hero. JX Barnett wants to know, what do you think has been the biggest misstep in future technology? Examples might include the Ford Edsel, Crystal Pepsi or Friendster.
Starting point is 00:52:57 I would say, I would sort of say, like, broadly, surveillance culture, and, like, our willingness to cede so much of our personal space to devices and companies. And that's like facial recognition. That's the always-on listening devices. I can hear you. Shit. OK, also, this next part is really interesting from a futurological legal perspective.
Starting point is 00:53:25 And I think that's important, and, you know, because... Yes, you might think like, oh, well, Google already knows everything about me. So, like, why not just add a Google Home into my house? And my answer to that is that there is this idea in law that you are protected from unreasonable surveillance. Right. And so that means that like in certain states, you can't record somebody without them knowing or, you know, you're really not supposed to wiretap people like in general, you know, like.
Starting point is 00:53:52 And that's because we have a reasonable expectation of privacy. And that's sort of what the term is that they use. And so, you know, if I were to sneak into this room and put a recorder here and leave and record you, that would be a violation of your privacy because you have a reasonable expectation of being in your hotel room and not having a recording device. If we all accept that we are going to put these devices into our homes and let them listen all the time, we are basically telling a court in the future
Starting point is 00:54:16 that like, actually, I don't have a reasonable expectation of privacy in my house anymore. And that also means that anybody who walks through the door no longer has a reasonable expectation of privacy in your home. And that I think is scary because that means basically that law enforcement can subpoena those recordings. We know that Amazon and Google both have worked with law enforcement to give over certain records. We know that like law enforcement targets certain individuals more than they target other individuals.
Starting point is 00:54:40 We know like ICE, for example, can sometimes use facial recognition systems to try to find people they're looking for. And so those of us who like maybe don't have to worry about those things are making it much harder in the future for other folks who do have those concerns because they no longer can be protected in their own homes from surveillance. And so I think that's the big thing that I worry about. Obviously, like fossil fuels is probably the better answer to this question now that I'm thinking about it, like, you know, burning dinosaur bones
Starting point is 00:55:03 that have been liquefied. But but I think that's the other the other big one I think a lot about is this like sort of like we throw our hands up and say, like, well, like there's nothing we can do at this point. And I think actually we are still at a point before we have completely lost the battle for privacy, but like we're getting there. And so that's my like big treatise against the like always on listening devices. Like at some point there will be a case in the future in which a judge has to decide
Starting point is 00:55:28 did that person have a reasonable expectation of privacy. And if everybody has these devices, it's possible that judge will be like, nope, not anymore. And there is no such thing as, it's not listening to me unless I say its name. Correct. I mean, in theory, that is how it works. But there have been so many documented cases of employees at these companies listening in on conversations as part of either testing or by accident or whatever it is.
Starting point is 00:55:49 I would say that, correct, you should sort of expect that they can hear anything you say. Yeah. Jessica Jansen wants to know, what do you foresee for the future of health care, like using our own immune systems and genetics to defend us against the world? What do you think about health care? Yeah, I mean, there's such interesting work going on with genetics and with sort of the ways that we are using gene editing.
Starting point is 00:56:15 I think that is very exciting. My my worry always is like the way that the health care research system works is that like a lot of things don't get developed because they don't make sense financially for a pharmaceutical company, which means that like they're not going to make money on a population, even though like that people need that stuff. That said, there is some really cool stuff going on with like genetic advances. And yeah, like personalized genomics is something people have talked about for a really long time, but I think actually is like finally getting some like real progress.
Starting point is 00:56:42 That's a classic one. It's sort of like nuclear fusion, where it's like always 10 years away. You know, but like I think at this point, like there is some really interesting research on that. There's another bit of research that I find really interesting about biomedical tattoos, where, let's say you want to monitor your glucose, you could have a tattoo that actually, like, turns a color when your glucose is low.
Starting point is 00:57:01 So that kind of stuff is really interesting to me. Things that are just like personalized medicine, I think, is really interesting. And it's kind of a buzzword and it can get kind of, like, pseudoscience-y. Okay, side note. If you're like, what is the deal with nuclear fusion? Don't worry, I looked this up for us because I wasn't sure either. So the gist is: the nuclear power we're using now is fission, where we split uranium atoms and it generates a ton of power, but also some radioactive waste.
Starting point is 00:57:25 Whoops. Now, nuclear fusion would instead jam two hydrogen atoms together into a helium atom and would theoretically give off more power than fission without the waste. Now, there is one giant proof of concept already in use and it's the sun. Same shit happens in the sun. But when Rose said that everyone keeps saying it's 10 years off, she's not whistling Dixie. News articles from 2014 promise it'll be a reality by 2025.
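Side note on the arithmetic, for anyone who wants receipts: the energy comes from a sliver of mass that vanishes when the two hydrogen nuclei fuse, via E = mc². A quick back-of-the-envelope in Python, using standard published atomic masses for the usual fuel, deuterium and tritium, two heavy forms of hydrogen (the rounding is mine; the physics is textbook):

```python
# Back-of-the-envelope: energy released by one D-T fusion reaction,
# deuterium + tritium -> helium-4 + neutron, from the mass defect (E = mc^2).
# Atomic masses in unified atomic mass units (u), rounded to 6 decimals.
M_DEUTERIUM = 2.014102
M_TRITIUM   = 3.016049
M_HELIUM4   = 4.002602
M_NEUTRON   = 1.008665
U_TO_MEV    = 931.494            # energy equivalent of 1 u, in MeV

mass_in  = M_DEUTERIUM + M_TRITIUM
mass_out = M_HELIUM4 + M_NEUTRON
defect   = mass_in - mass_out            # ~0.0189 u "goes missing"
energy_mev = defect * U_TO_MEV           # ~17.6 MeV per reaction

print(f"mass defect: {defect:.6f} u")
print(f"energy released: {energy_mev:.1f} MeV per fusion")
# For scale: uranium fission releases ~200 MeV per nucleus, but that nucleus has
# ~236 nucleons, so fusion gives roughly 4x more energy per unit of fuel mass,
# and without the long-lived radioactive waste.
```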
Starting point is 00:57:53 And in a report that was out just last week, two labs in the UK are apparently neck and neck to get it figured out in, yep, the next decade. Using deuterium and tritium, which are two isotopes or forms of hydrogen. And I want to say that if I had a pair of hamsters, I would like to name them Dudes and Trit. I know you don't care, but this is my podcast and I'll say what I want to. Katie Coast wants to know if there was one thing that we currently don't know about the future that you could be magically gifted with the full understanding.
Starting point is 00:58:23 What would it be? What do you want to know? What do I want to know? What do I want to know? That's such a good question. I wish I had read that one in advance and thought of an answer. What do I want to know? I mean, what would I want to know about the future?
Starting point is 00:58:37 I mean, I think that like what I would love is to just like be able to teleport a hundred years from now, just to kind of like look around. And I think, like, it's not like one thing, because there'd be so many pieces of information to be gleaned from how the cities are laid out, like what people are wearing, like what people are using, what people are doing, if we exist at all, like if we're still here. I think that a hundred years is a cool time, not because it's just like a, you know, a nice round number, but because it does feel like
Starting point is 00:59:02 enough time that like it's really hard to know what's going to happen, but not so much time that it's like, you know, we might be primordial slime somewhere else on another planet. Like, you know, I picture, like, coming out of the teleporter and it's just smoldering ashes and you just like turn around and get back in. Yeah. And there's just like, oh, yikes, we did it. I mean, I always think, and maybe this is terrible, but I feel like
Starting point is 00:59:25 it's just logical that like, we're going to get wiped out and that's fine. Like you're ready for it. I'm ready for it. Like I don't feel like we need to put ourselves on any other planet. Like we had our shot. We had our shot like 99. It was a good run. Yeah.
Starting point is 00:59:39 It was a good run. Bye bye. Yeah. Even the Sopranos had to end. Right. Get the check. Let's get out of here. Like 99.9% of all species that have existed are extinct.
Starting point is 00:59:48 And like if we fuck ourselves up beyond the ability to survive, that is how the game works. Like, you can't be like, no, no, not there. Yeah. No, it's like you did this to yourself and you took a bunch of species with you and bye bye. We're going to take your atoms and make them into new, interesting animals. I mean, I think that like on a long enough time scale, that's absolutely true.
Starting point is 01:00:10 Like humans are not forever. Like nothing is forever. Right. Like not even James Bond. And so like you, yes, we will absolutely. But the question is like, is it in 100 years or is it in like 10,000 years? I feel like it's in 12, but 12 years. 12 years.
Starting point is 01:00:24 12 years. That's what, okay. Why 12? Because I was recently with some teens and they were like, is it true that climate change is going to kill us all in 12 years? And I was like, where did you get that number from? That seems a little conservative, but I just was thinking like enough time where like I'd still have a house payment, you know, like enough time where you definitely wouldn't be ready for it.
Starting point is 01:00:42 wouldn't definitely wouldn't be ready for it. People still have student loans. They have to pay off. Yeah, yeah. Like you still be like, fuck, you know, like I'm sure there are still boxes in the garage. I won't have unpacked. So, right.
Starting point is 01:00:52 Right. Yeah. Mail I have not opened since I like moved four houses ago. Yeah. I mean, I feel like I have had spices for over 12 years. You know, I'm like, how long have I had this garlic powder? Like, you've moved it between every house? Yes.
Starting point is 01:01:06 Totally. You can't throw it away. No, it's not even garlic powder anymore. So I feel like it's the same, where it just, like, catches you off guard. Yeah. 12 is good. Yeah.
Starting point is 01:01:14 Okay. Yeah. But is climate change going to kill us in 12 years? No, it will kill a lot of things in 12 years. Potentially all the insects. Oh no. Oh, no. Um, I don't, 12 years.
Starting point is 01:01:24 No, I don't think so. But I mean, like 12 years is long enough for us to kind of really, I mean, we're already seeing impacts of climate change, right? Like there are already things happening. I think 12 years, it will start to become like very clear to everybody that like this is happening and, you know, like things are changing. Is there anything that you do in your personal life differently because of climate change?
Starting point is 01:01:46 I don't eat meat. Um, I, I feel terrible. I fly a lot, which I don't like, but I do pay for carbon offset stuff. Nice. Um, I try to do as much reusable, non-plastic kind of things. And I just like, I, I donate a lot of money. Good for you. Yeah.
Starting point is 01:02:05 Gets one. I mean, a lot of money... I'm not Bill Gates, you know; I donate as much money as I can. Right now you have an open briefcase full of $100 bills. Yeah, listeners can't see this, but I'm just rolling around in cash right now. Your overalls are just brimming. Yeah, just nowhere else to put it. Yeah.
Starting point is 01:02:24 Um, Casey Wright says, my dream has finally come true. Flash forward was my first podcast love. Hi Rose. Um, anyway, onto the question, do you think true equality across race, gender, sexuality is actually possible in the future? Hi Casey, I love you. Um, possible, possible. Yes.
Starting point is 01:02:43 Very, very difficult. Okay. I mean, I don't want to say it's impossible. I think that's like too defeatist, but I think it's really hard. There will always be people who, I mean, power corrupts. Absolutely. Right. Absolutely.
Starting point is 01:02:56 Power corrupts. Absolutely. And anybody who's at, like, the top of a chain is always going to want to maintain that. And so I think it's really hard, but I want to believe that it's possible because otherwise I think it's hard to push forward. But yeah, I think it's possible, but very, very hard. And we all, like, literally every person, has to be working towards it all the time.
Starting point is 01:03:15 Every person. Yeah. Or put the ones that we don't like on an island. Yeah, they can go to Mars. Now this next great question was also asked by patron Ron Dagdeck. Great question, Lindsay Beasley. With the high frequency of robots replacing the human workforce, what changes do you predict for our society?
Starting point is 01:03:31 How will the vocational options available to us change? That's such a good question. I know. Yes. Okay. So two things I will say. Number one, and I'm just saying this because it's like a bugbear of mine: robots are not replacing us.
Starting point is 01:03:45 Managers are replacing us with robots, right? It's a specific choice that like people who are making money are making like the robots are not doing this on their own. That's a good point. Brian Merchant, who works for or used to work for Gizmodo, has written a bunch of really great pieces about this, basically being like, stop saying that the robots are taking our jobs. It's managers that are doing this.
Starting point is 01:04:02 It's CEOs. It's like specific people who are making the decision to hire or to fire employees and replace them with these sorts of machines. Honestly, we don't even want your jobs. Um, and often that's not necessarily because the machines are going to save them more money in the long run. It's because machines don't unionize. Machines don't complain.
Starting point is 01:04:20 Machines don't need, you know, labor protections. There's no OSHA protection for machines. So it is a specific choice that human beings are making. That said, like it is happening lots and lots of places. There are always reports about automation, um, depending on who you ask, automation is killing jobs and making a lot of people unemployed. Some people believe that those people are finding work elsewhere. There was recently a study that looked at this and found that people who had
Starting point is 01:04:46 been sort of like automated out of their jobs wound up with, I think, 11% lower income, wherever they moved on to. There's an interesting problem with automation, which is that a lot of the actual kind of, quote unquote, good jobs, the mid-level jobs, things like accounting and even car manufacturing assembly lines, which are, you know, blue collar, but, like, union jobs with good worker protections. Those are the things that are pretty easy to automate. And the jobs that are hard to automate are often the ones that are, um, we
Starting point is 01:05:14 consider quote unquote low skill jobs. I think there's no such thing as a low skill job. Working at McDonald's is very challenging. Um, but things like working at McDonald's, being a waiter, those sorts of jobs where it requires kind of being able to process a lot of information at once and kind of like do a lot of things at once. Machines are really bad at that. Machines are really bad at picking vegetables.
Starting point is 01:05:32 So like, you know, in a field, because, like, in a car plant, it's all on an assembly line, the environment is completely regulated. Everything looks the same. Every windshield is the same size. If you're out in the field, every pepper is not the same size. Every apple is not the same size. Every tree is slightly different. And so a lot of agriculture is really hard to automate.
Starting point is 01:05:49 And that's like a place where there is a labor shortage where people don't want to do those jobs because those jobs fucking suck. You know, there's a name for this. It's called Polanyi's paradox, which is basically that like the jobs that people kind of want are the ones that are getting automated. And the jobs that people don't want are the ones that can't be automated. It's really hard. And if you're like, who is Polanyi and why does he have a paradox?
Starting point is 01:06:10 Okay, quick CliffsNotes. So Michael Polanyi was a Hungarian-born and British-based chemist and a philosopher and a professor who essentially posited that we know way more than we can explain. So we can perform some duties so intuitively that they're just hard to describe, let alone map out for artificial intelligence to then replicate. So things like advanced facial recognition or driving a car on windy roads or picking out the most magical pumpkin on the farm. See the Cucurbitology episode about pumpkins to know what the hell I'm talking about.
Starting point is 01:06:47 Now Polanyi explained this whole theory in his book, The Tacit Dimension, which came out in 1966, but I did a little looking, and perhaps some credit should go to the US Supreme Court Justice Potter Stewart, because just two years prior, in a famous case about the First Amendment and obscenity in an art house film, Justice Stewart legendarily addressed hardcore pornography by saying, I shall not today attempt further to define the kind of material I understand to be embraced within that shorthand description. And perhaps I could never succeed in intelligibly doing so.
Starting point is 01:07:22 But I know it when I see it. We know more than we can tell. Sometimes robots just don't get it. Yeah. So neither of those things answers the question. But I think the answer is that we're going to see a lot of those sorts of jobs that are really hard for robots to do. So that's like food service, customer service, anything where you're kind of
Starting point is 01:07:41 like really having to think across a range of disciplines. Those we'll see more of. I mean, this is why like unions are really important because like it is it does help buffer against some of this, some of the automation. I think we'll see a lot of creative jobs. I mean, like robots can be creative in their own way, but they're not going to be able to do a lot of the stuff that we currently think of as sort of like uniquely human creative work.
Starting point is 01:08:03 But I think this is why like social safety nets are so important because like there will be a period of time where a lot of people are kind of like, I don't know what to do because again, like managers and CEOs have made a decision to automate a process. The future, the future of work, unfortunately, right now looks like a lot of freelancing, which I think is not good for stability and for a lot of people. But yeah, it'll be a lot of sort of like, what are the things that robots are really bad at?
Starting point is 01:08:24 Things where we have to kind of, like, generalize information across domains. Like those are the jobs that are going to take longer to automate. What's your take when you see really advanced robots that are like running and jumping that are so Black Mirror? Like how? Because I know that all of us are like, I have a fear deep in the core of my being that just turned, like, icy. How would... yeah, war robots are very scary.
Starting point is 01:08:49 And that's what those are, right? Boston Dynamics, like they are creating robots for DARPA, basically. Okay, heads up. If you're like, "DARPA? That's also a great hamster name," cool your jets. It stands for the Defense Advanced Research Projects Agency. And it's part of the Department of Defense. It makes new war toys like insect spies and submarine drones and computer
Starting point is 01:09:08 brain implants and robots that could probably tear our limbs off like rose petals. And the one thing I will say, and this doesn't necessarily make me feel better or worse, but the video you see, you don't see all the failures. You don't see all the times that a robot just fell on its face, which is like 99% of the time. It's really hard to get robots to run and jump. That said, like, yeah, it's terrifying, right? Like, especially given what we know about the ways that the military industrial
Starting point is 01:09:34 complex treats certain kinds of people. I'm like, imagine ICE having that robot, right? Like that's scary to me. I don't, I don't like that. And I think, you know, there's a lot of conversations right now among technologists about like what are the ethical questions that people who work at places like Amazon or, you know, whatever it is, like what should they be thinking about?
Starting point is 01:09:53 Like the bunch of workers at GitHub recently quit because GitHub was working with ICE, like where do you draw the line? And a lot of people who go into technology and especially go into like robotics and engineering, like they don't have any kind of training in ethics, in sort of, like, thinking through these questions. You know, when I ask them... I did an episode about deepfakes a couple of years ago. And my question was like, OK, like what about, you know, people who create these videos of their exes and it's like sort of a revenge porn situation.
Starting point is 01:10:25 And I asked the engineer, I was like, do you ever think about that? And it, like, had not occurred to them. So, like, this is just, I think there's a wall sometimes between people who are technical, people who think about, like, oh, how do I make this robot's legs work really well? And then this bigger, like, OK, but, like, should I? Right? They just didn't stop to ask if they should. They didn't stop to think if they should, you know, in the words of Goldblum.
Starting point is 01:10:45 So right, like I think there is that question. And I think more and more it's becoming common for technologists to start thinking about that. But many of them don't have the training. They don't have the background. They don't have like the sort of like frameworks to even ask, like, OK, but like, why are we doing this? And should we be doing it?
Starting point is 01:10:59 And like, what is the benefit here? And, you know, who wins and who loses? We can look forward to a future that might have war robots. Oh, we already have war robots. We do have them. Yeah. Well, I guess we have drones. Right. So. Right. OK, here's a really big, important question. Sarah, I knew she wants to know,
Starting point is 01:11:16 will Brad ask me out for the spring fling? I mean, if you want him to, I hope so. If not, fuck Brad. OK. Or don't fuck Brad. Or don't fuck Brad. If you do, please use protection. Always thinking about the future. You know what? I just let you know.
Starting point is 01:11:32 Family planning, very important. OK, speaking of kids and populations, these patrons, Anna Valerie, Jamie Pickles, Vanessa Frey and Tara McNeigh asked about family sizes in the future. Family planning, very important. Do you think anything's going to change, family-planning-wise, in the future? Yes, absolutely. I mean, like, male birth control has been like on the horizon for a long time; whether it will ever actually happen, I think, is like open to debate.
Starting point is 01:11:56 It sort of feels like nuclear fusion in that, like, it's theoretically possible. Will it happen? Don't know. I mean, there are people who believe that birth control is actually the most impactful technology that has ever kind of come up post industrial revolution. Some people argue that that's refrigeration. So sort of like those are the two camps. It's like birth control or fridges, you know, and both actually were like incredibly important. But birth control is huge in the way that it has changed how we think about the future
Starting point is 01:12:23 and how our social structures and like, you know, people who can have babies are now like free to not have babies if they don't want to, which is like a giant shift. I often say that that the IUD is the original sort of like body hacking. I write about how I have an IUD and RFID chip in my hand. Wait, what? Rose, futurologist, has a radio frequency identification chip in her body. Before we all lose our shit. It's pretty much the same thing my dog, Grammy, has.
Starting point is 01:12:51 But unlike Grammy, Rose also has an IUD or an intrauterine device, which although it's fully analog, she says is a way more powerful technology. I talk about how like anybody who has an IUD is basically a cyborg and you should like own it. Yeah, the RFID chip is just like a party trick and the IUD actually like makes my life better. So you have an IUD and you have an RFID chip. Correct. Implanted. OK, I'd like to know more. Yes.
Starting point is 01:13:19 OK, so if you have a dog or a cat that is microchipped, it's basically the same exact technology. It's like a small glass bead that is in my hand. If you would like to feel it, you can. I would like to feel it. OK, so if you touch like right there, do you? How big is the bead? Right here.
Starting point is 01:13:33 It's right there. It's like a little nugget. It's like smaller than a pill. It's kind of the size of a tic-tac. Yeah, or a grain of rice maybe. Yeah, like a grain of rice. Yeah. And inside of it is an RFID chip which cannot communicate with satellites.
Starting point is 01:13:45 There's no power source in here. It can only contain a very small amount of information. So if you like have ever used a fob to get into a door or like Apple Pay, it's like that. So you would swipe it and you sort of touch it to something. So I can either like unlock my car door or unlock my house door. It's really kind of a party trick. At one point I had it set to be a geocaching site. So if you found it, you could get a little GIF that would pop up when you would scan it and, like, dance.
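Side note for the technically curious: on the reader side, that party trick is about as simple as software gets. The chip only stores a short ID; a powered reader energizes it through a coil, reads the ID back, and whatever the reader is wired to decides what that ID unlocks. A minimal sketch, assuming a hypothetical hobbyist reader that prints one tag ID per line over a serial port; the port path, tag ID, and allow-list are made up for illustration, not Rose's actual setup:

```python
# Minimal sketch of an RFID door-unlock loop (hypothetical hardware).
# Assumes a hobbyist reader on a serial port that prints one tag ID per line.
import serial  # pyserial: pip install pyserial

ALLOWED_TAGS = {"04A1B2C3D4"}          # made-up ID; a real chip just stores a short UID like this
PORT = "/dev/ttyUSB0"                  # hypothetical port where the reader shows up

def unlock_door():
    # Stand-in for whatever the reader is wired to (a relay, a smart lock, etc.)
    print("click: door unlocked")

with serial.Serial(PORT, 9600, timeout=1) as reader:
    while True:
        tag_id = reader.readline().decode("ascii", errors="ignore").strip()
        if not tag_id:
            continue                   # nothing scanned this second
        if tag_id in ALLOWED_TAGS:
            unlock_door()
        else:
            print(f"unknown tag: {tag_id}")
```

Which is also why the "am I being tracked?" worry doesn't really hold up: there's no battery, no GPS, and no network on the chip side; all of the smarts live in the reader.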
Starting point is 01:14:07 But yeah, that's my little hand thing. When I wrote about it, and I got this years ago, when I wrote about it, I got a lot of emails from people who believe that I am now, like, cursed by the devil. So there's that, which might be true. Who knows? Do you have to have, like, some sort of receiving site to open your car door? Like what? I know that's such a dumb question.
Starting point is 01:14:24 Yeah, so it's like anything you would have at an office. If you have, like, a little black box at your office that you touch your card to, you have to install that on your car or on your house. But yeah, it's like, I mean, these are easy to buy. You can buy them at Best Buy. So you can buy the receiver at, like, Best Buy or online. A whole door handle with a chip reader will set you back around a hundred bucks, or a simple reader is three dollars on Wish.
Starting point is 01:14:46 But the chip itself, the part that lives all snugly in your flesh, is not at an electronics store. It's, like, sold in a kit with a syringe. You cannot buy an RFID chip to implant into yourself at Best Buy. You can buy it at a website called Dangerous Things. If you would like to do that, obviously, like... if you want to do it, you can purchase it there.
Starting point is 01:15:10 But yeah, I mean, you can you can buy them. So all of the talk about like, oh, one day we're just going to have microchips in our hand like you're living that dream. Yeah, people like me have them and there is a company that sort of as almost a stunt did this and they offered it to all their employees as actually a way to get in and out of the building. I would say like, don't do that. Okay. In part because like your employer should never have access to something
Starting point is 01:15:32 that is like physically inside of your body, in my opinion. But yeah, that's that's how I feel about it. Does that freak people out, do you think, more than it should? The RFID chip? Yeah, I think as soon as people understand what it actually is, they're less freaked out. Some people are. The first question I always get is like, so are you being tracked right now? And it's, no, like, it can't communicate with satellites.
Starting point is 01:15:51 It is not powerful enough to do that. It's really truly just like a silly thing that I have. And once they kind of understand what it can and can't do, they're like, oh, okay, that's cool. It is interesting when I am in body hacking spaces, like this is totally normal. But when I try to talk about my IUD as a body hacking piece, there are definitely dudes who are like, it makes me uncomfortable. Why do you think that like contraception isn't regarded the same way that like I have way more concentration when I put coconut oil in my coffee?
Starting point is 01:16:20 Yeah, I mean, I think it's because it's largely the domain of women. I mean, the same way that, like, Soylent is just SlimFast marketed to men, right? Like, when you repackage something as, like, a disruptive technology that is cool and that men do, then as soon as you start to try to say, like, actually, all this other stuff also counts, it makes it less cool to men. Please see Diet Coke versus Coke Zero or Pepsi Max, which actually used images of a Playboy bunny in its ad campaigns. Like why not just shape the bottle like a dong?
Starting point is 01:16:55 So I think that's mostly part of it. There are people in the body hacking space that are totally on board with talking about IUDs as, like, body hacking. It's not all of them, but definitely I have had moments. I mean, I go to these, like, body hacking conferences and I've had moments where men have been like, hmm, that's gross. You know, it's like talking about periods. Oh my God.
Starting point is 01:17:11 I'm like, you came out of one of those. I mean, what could be more disruptive, literally, than an IUD? Truly, literally: people who bear children no longer have to have just unlimited children until they die. Like whoa, that, like, changes a lot, you know. Yeah, yeah, yeah. My grandma had 11. Man.
Starting point is 01:17:33 30. It's like that's a lot. Oof. Oof. Pop it out. That's a lot. Oh, no. Okay.
Starting point is 01:17:39 The worst thing about being a futurologist, what sucks the most? Is it people trying to force gadgets on you? Is it depression? I would say the worst thing about being a futurologist is men on the internet telling me that I'm being too pessimistic. Oh, smile sweetie. Exactly. Or like, oh, you just don't understand the benefits that this is going to happen.
Starting point is 01:18:02 I'm like, hmm, I do actually. And I don't actually think of myself as particularly pessimistic. I think of myself as kind of like a skeptical optimist. I actually do think that like there are lots of people who are doing really amazing things and that are making the future better as we go. I think there's a lot to do. I'm not, like, naive about how much fucked up stuff there is out there, like, children are in cages, like, you know, there's a lot of work to do.
Starting point is 01:18:24 But I think that sometimes I get people who are like, oh, you're being overly dramatic or like, oh, you're overstating, you know, how important this is. You're like, oh, why should I care or whatever it is? I mean, you know, I have a lot of conversations with people about Alexa and Google and like the always on operating systems and people being like, well, you know, it's so convenient. Why should I care? You're just being a downer. Like why are you being such a downer about this like amazing technology?
Starting point is 01:18:48 And, you know, I have my own spiel that I give about like, well, if you allow these always-on listening devices into your house, you're setting a court precedent to, like, no longer have privacy. And like that is an important thing. And that's something I care a lot about. And I think that's like a thing that scares me, is this willingness to kind of just cede control to these companies that do not have our best interests in mind. Do you think that what was once paranoia is now just becoming a fact of life?
Starting point is 01:19:16 Well, I think this is the challenge, right? Like I joke that I am a tinfoil hat person, right? I'm always, you know, I'm also very, very careful about like my home address and my phone number. I was doxxed during Gamergate and like they came to my house and were like, we're going to kill you. So like I am very careful about those things. So I am definitely the person who's like constantly like, get your two factor authentication,
Starting point is 01:19:36 use encrypted services, like don't put your home address places, don't post where you are. And like my threat model is different, right, than other people's. But I do think that it's hard because like, yes, there are certain things to worry about, but then, because we know these companies are so powerful, we kind of almost over-index on what we think they can do. So like this idea that like, oh, our phones are always listening. I think most people I talk to believe that their phone is listening to them all the time. And like that's actually probably not true, right?
Starting point is 01:20:03 So it's like, it's hard to sometimes know, because like, yes, the Alexa is listening to you, but your phone isn't. And like, so you should be aware and kind of, like, you know, not paranoid, you shouldn't be paranoid, but it's hard sometimes to know what to actually worry about. Yeah. Because it's like, which one of these things is spying on me, you know, all that stuff. So many things to be paranoid and cautious about. But yeah, but it's also like, I understand people being exhausted by it,
Starting point is 01:20:26 where you're just like, I know, I know, I shouldn't do this and I shouldn't do that. And you know, like, I totally get that. And you just have to like make whatever reasonable decision for yourself that makes sense. Right. And we, I do see more and more people just giving up. Totally. And I get it. Like, I get it. Like, I mean, I have a PO box because I don't ever use my home address
Starting point is 01:20:42 and that costs money to have a PO box. Like, it is a lot of time and a lot of money and a lot of effort to make sure that you, like, have X, Y and Z. And I want to say, I don't think it should be the user's responsibility. I think it should be on these companies to not constantly land-grab, like, as much as they can. And not be like, oh, well, you didn't read the privacy policy. It's like, who can read every privacy policy?
Starting point is 01:21:02 They're also written in a way that there's no way a regular person would understand them. So like, it's not on you and me to fix this. So like the companies need to be held accountable for like all the shady shit they're doing. Oh, what about your favorite thing about being a futurologist? I get to talk to such cool people all the time. Yeah. I mean, like being a journalist, you have the excuse of literally calling anybody that you want to.
Starting point is 01:21:20 And most of the time they talk to you, which is like wild, right? And you're like, you don't have to do this. But like, it's so fun. I get to talk to... and I think this is why I often feel like I may be more optimistic about the future than a lot of people, is that I get to talk to people every day who are doing stuff that is going to make the future better, that are, like, committed and working really hard at making the future better. And that's like a balm.
Starting point is 01:21:42 It's not that I don't wake up some days and I'm just like, oh my God, like I can't, you know, like I definitely have those days where you're just like, it just feels totally hopeless and like there's nothing we can do. But then I get to go call people and listen to people and talk to people who are, like, in their small communities and their small ways, like, making a change. So I often like to point to this woman, Aisha Nyandoro, who is working on a universal basic income program, specifically for Black mothers in the South. And it's a small program, but like they're giving money away and they're like helping people in
Starting point is 01:22:11 this very specific way, in this very specific context. And like, they're working really hard and it's having these impacts and it's not going to solve like the nation's problem, but it is like a small thing that is making a difference locally. So this program is called the Magnolia Mother's Trust, through the Springboard to Opportunities program. And there's more info on that at springboard2.org. And an additional donation went their way as well for this episode. So it's a futureology twofer because damn it, let's turn this boat around,
Starting point is 01:22:40 make the world a little better if we can. And I think that's like why I can sometimes be a little bit more not optimistic, but like hopeful about the future because I get to talk to all those people every day. And it's such a joy. And I'm like so thankful that they give me their time and like make time to like have me talk to them. But it's so fun to talk to people who are smart and interesting and like working really hard on something and really care about it. Yeah. If you, if you're not a journalist or don't have a podcast, it's hard just to call
Starting point is 01:23:04 someone and be like, hi, I think you're cool. Talk to me. Yeah. I mean, I will say, like, Twitter is good for that, right? For true. Like, people, like, love that, like, especially people who are working really hard on making the future better. Like, they want to talk to people. I mean, they also feel like they are toiling away at, like, a hopeless problem a lot of the time. So if you find someone who's doing something cool, like, literally just being like, hey, I think you're doing something cool, it's like, that keeps people going.
Starting point is 01:23:26 So I would say like, if you admire somebody's work, like absolutely say something because it truly does make a difference. That's very good advice. Because probably people think that it would be weird to, you know, compliment someone. Compliments are never weird. I mean, that's not true. Compliments can absolutely be weird. But like in general, if you're like, hey, I just love what you're doing. And also like often if you are like, I don't know how to get involved, is there something I can do?
Starting point is 01:23:47 Just like reach out to people. I mean, people... and they won't always be able to reply because a lot of them are strapped for time and stuff. But yeah, people like community building and like making a difference. Like we all want to make the world better, I think. And especially Ologies listeners, ologites. So yeah, like reaching out to people and being like, how can I help? Like what can I do? Any kind of words of wisdom or anything stick with you like on the day to day that kind of keeps you going? Yeah, there's a quote by Octavia Butler that I come back to over and over again.
Starting point is 01:24:13 And I will just like read it to you, because I always forget it, and I'm worried... she's like such a pillar and important person that I don't want to, like, butcher her words, because she's so good at them. Okay. How prepared for the future is Rose? She had this quote readily available on her phone. So I'm going to read it to you. It says, There's no single answer that will solve all of our future problems.
Starting point is 01:24:31 There's no magic bullet. Instead, there are thousands of answers at least. You can be one of them if you choose to be. Goosebumps. Yeah, I know. It's like, that's the thing I think about a lot where it's like, you can be part of the solution. It doesn't have to be hopeless. So you just have to figure out what your priorities are. Figure out what kind of future you want, work backwards from there,
Starting point is 01:24:53 and then kind of get to work. Yeah, get on it. Get on it. So the future is ours. Whether we would like it or not. And so we might as well try to make it better. Yeah, the future is not set yet. It hasn't happened. It can be what you want it to be.
Starting point is 01:25:07 Oh, that's so exciting. Yeah, I know. Oh, we can fix this. Yeah. I feel so much better. Yeah. Okay. So sometimes I feel like people's therapists or they're like,
Starting point is 01:25:18 I need to talk to you about the future. It's going to be okay. Oh my gosh, thank you so much for doing this. Thank you for having me. I'm a huge fan. I love this. Oh, I'm a huge fan. No, you. No, you.
Starting point is 01:25:28 No, you. No, you. So if you are now a huge fan of professional futurologist Rose Eveleth, feel free to board your internet space car and just zoom over to roseeveleth.com. She's at twitter.com slash roseeveleth, instagram.com slash roseeveleth. And of course, listen to her wonderful podcast, Flash Forward.
Starting point is 01:25:49 That's at flashforwardpod.com. She's on Facebook at flashforwardpod. Twitter, same handle, flashforwardpod on Instagram too. We are at Ologies on Instagram and Twitter. I'm at AlieWard with one L on both. And more links about all the things we talked about will be up at alieward.com slash ologies slash futurology. There's a link to that and to the causes
Starting point is 01:26:11 and the sponsors for this episode in the show notes. And for merch, you can head to OlogiesMerch.com or alieward.com. Thank you, Shannon Feltis and Bonnie Dutch of the comedy podcast You Are That; they handle all the merch. Thanks to Aaron Talbert and Hannah Libow for adminning the Facebook group.
Starting point is 01:26:27 Thank you to Emily White and all the Facebook transcriptionists who make transcripts available at alieward.com slash ologies slash extras. We're getting through all the past episodes still. Thank you as always to Jarrett Sleeper, host of the podcast My Good Bad Brain, about mental health, for assistant editing, and to the man whose mustache resides in both the past and the future at once, Steven Ray Morris, host of the cat podcast The Purrcast
Starting point is 01:26:51 and the dinosaur pod See Jurassic Right, for piecing all these clips together for me each week. He is a hero. Nick Thorburn of the very good band Islands wrote and performed the theme music. And if you stick around to the end, you know, I tell you a secret. This week's secret, it's going to be threefold because I just took two weeks off.
Starting point is 01:27:10 I'm just like a chicken holding eggs over here. So, okay, one is that I've been trying to use mental time travel before things that stress me out, to imagine a good outcome. And it's helped me shake off the jitters a bunch. So thanks, chronesthesia. Also, since the chronobiology episode, I've been sleeping in a bed with the lights off way more and it's glorious.
Starting point is 01:27:30 But also, I think I'm getting sick more. And my doctor said that that could happen when you finally rest. So yes, we need an immunology episode stat. Also, I have a potato for a brain and I used way too harsh a cleaner on an engineered quartz countertop and now it's spotty and dull. And if anyone has done this or fixed this, please tell me your secret because, wow, it looks real bad.
Starting point is 01:27:55 And it's my fault. Okay, let's meet back here next week in the future. We have a whole year of brand new episodes now that I'm all rested up. So 2020, let's make this ding-dang future better together. Okay, bye-bye. Hackadermatology. Homiology.
Starting point is 01:28:10 Cryptozoology. Litology. Nanotechnology. Meteorology. Lonectology. Nephology. Cereology. Cellulogy.
Starting point is 01:28:27 Difficult and painful as it is, we must walk on in the days ahead with an audacious faith in the future.
