Duncan Trussell Family Hour - 295: Taryn Southern

Episode Date: July 11, 2018

...

Transcript
Starting point is 00:00:00 Ghost Towns, Dirty Angel, out now. I'm dirty little angel. You can get Dirty Angel anywhere you get your music. Ghost Towns, Dirty Angel, out now. New album and tour date coming this summer. This Nobel Peace Prize awarded episode of the DTFH is brought to you by Squarespace.com. Head to Squarespace.com forward slash Duncan
Starting point is 00:00:26 for a free trial. And when you're ready to launch, use offer code Duncan to save 10% off your first purchase of a website or a domain. It's time for you to comb the lion's mane. That means make a beautiful website using the top website creation service in the universe, Squarespace.com. Greetings to you, Voronogs of the seventh fold. It is I, MIGSTARG, and you're listening to the MIGSTAR
Starting point is 00:00:54 Miefling Swarm Cluster Pulse. All Hail the Great Queen. I am recording this from within my chrysalis right now. I recently melted down and went into a chrysalis phase. I'm currently a bubbling goo inside a beautiful green chrysalis hanging from a vanstrock tree. If you saw me right now, you'd see there's no gills, no wings, my pinchers are gone, my eyes are gone,
Starting point is 00:01:25 my seven sacks of reproduction have liquefied completely. My pulsation membrane is gone. I'm just goo and ooze and bubbling goo and ooze. I prepared for this cycle for seven moons of nador, but I never realized what it's like to be inside the chrysalis. To quote Orzan Vreach, we must tend to the part of the hive that we hum upon. We must remember the dance of the omnivorex who
Starting point is 00:01:58 vibrated this. Offer ambrosial nectar to your hivelings as you would yourself. And if you are robbing yourself of ambrosial nectar by spraying yourself with acid, then how can you possibly offer ambrosial nectar to other quadrants of the hive? And if you ask me, the only hope for the hive is the sharing of ambrosial nectar.
Starting point is 00:02:20 We don't have to use the matrix of sold art to spread fear pheromones. We can use the matrix of sold art to share ambrosial nectar. And if we do that, I believe the entire hive itself may enter a chrysalis phase and go into a metamorphic process that transforms itself into a hive more beautiful than any of us could ever imagine. And I understand.
Starting point is 00:02:43 Some of us are spraying fear pheromones because it's frightening to watch the tunnels and egg chambers of our former generations melting before our very eyes. But perhaps this melting is not, as some say, the end of the hive, but the beginning of a new hive. These are the things that occur to us as we liquefy within our chrysalis. That being said, I did roll in the petals of a waffler
Starting point is 00:03:09 flower prior to my liquefaction. So to be honest, I am a little buzzed right now. We got a great podcast for you today. We're going to jump right into it. But first, some quick B-siness. Dad joke, unforgivable dad joke. This episode of the DTFH, which has received a waterfall of accolades from some of the top intellectual institutes
Starting point is 00:03:36 around the planet, it's been made possible by Squarespace.com. Head over to Squarespace.com forward slash Duncan for a free trial. And when you're ready to launch, use offer code Duncan to save 10% off your first purchase of a website or a domain. Squarespace, it lets you create a beautiful website where you can turn your cool idea into a new website.
Starting point is 00:04:01 You can showcase your work, blog or publish content, announce an upcoming event or special project. And just about anything you can imagine, including creating personalized websites for your family, your friends, your dogs, your dogs of past and even dogs of the future, check it out. Americandogcloning.com, it's available. And right now, if you go to Squarespace
Starting point is 00:04:26 and use offer code Duncan, you can get 10% off of that domain which no doubt will make you $10 trillion. Squarespace, it lets you build websites by giving you beautiful templates created by world-class designers. It's got powerful e-commerce functionality. It lets you sell anything online. And what service is more in demand
Starting point is 00:04:48 in these dog loving times than dog cloning? It's embarrassing when somebody comes over to your house and looks at your dog and realizes that you haven't cloned it yet. Don't be that person in your neighborhood that doesn't have at least three identical duplicates of your best friend, dogcloning.com. You can make it over at squarespace.com.
Starting point is 00:05:11 You can make it yourself and you can make it stand out. It's time for you to start that new business. Do it with squarespace.com. It's time for you to get those dogs cloned, baby. And before you do it, head to squarespace.com forward slash Duncan for a free trial. And when you're ready to launch, use offer code Duncan to save 10% off your first purchase of a website
Starting point is 00:05:34 or a domain. DuncanTrussell.com is a Squarespace website. I use it every time I upload an episode of the DTFH and I love them. Thanks, Squarespace. Would you like to shove your face deeper into the squirting fountain of glory that is the DTFH? Head over to patreon.com forward slash DTFH and subscribe
Starting point is 00:05:57 and you're gonna get instant access to interviews that have yet to be put out on the main feed. Not only that, but once a month, you will have access to earth-shattering, hour-long monologues that don't ever make it on the main feed. Have you ever wondered what it would be like to hear the sermon on the mount being spoken by a hybrid of Stevo
Starting point is 00:06:20 and a chain-smoking lesbian truck driver? Then head over to patreon.com forward slash DTFH and subscribe, become my patron. Force me to kiss your boot. Force me to bow before you as you strut across our digital courtyard in the great palace of the DTFH. A tremendous thank you to those of you who continue to use our Amazon link.
Starting point is 00:06:46 If you ever feel like buying any of the stuff we talk about on the podcast or if you ever feel like buying things that you think about during the podcast, then all you gotta do is go to amazon.com via the link, which is located at duncantrussell.com. Just scroll down and there's an Amazon thing there and you click on any of that stuff
Starting point is 00:07:07 and buy anything on Amazon and Amazon gives us a very small percentage of that and it costs you nothing. Finally, we got a shop with t-shirts and posters and some swag located at duncantrussell.com. A few weeks ago, I was skipping through the internet and I came upon the story of Taryn Southern. She's a very talented human being who recently used artificial intelligence
Starting point is 00:07:35 to compose the music of an amazing album that's gonna be coming out real soon. Not only does she make some beautiful music, but she also has created some awesome YouTube videos that explore some of my favorite topics like life extension, robotics, and of course, artificial intelligence. And now she's here with us today.
Starting point is 00:07:58 Everybody please send out a radiant beam of pure pyroclastic love energy through the undulating matrix, mycelial web of metaphysical energy that connects all of us so that it goes rolling down the tubes and rains sweet joy upon today's brilliant guest. Taryn Southern. ["Welcome to You"]
Starting point is 00:08:34 It's the Duncan Trussell video. Taryn, welcome to the DTFH. Thank you so much. Thank you for having me. Man, I am so excited. I've been looking forward to this conversation for a while. I discovered you through just a random sort of YouTube.
Starting point is 00:09:03 Somehow I landed on one of your beautiful music videos where you are singing a song that an artificial intelligence created. Composed, yeah. Composed. Yeah. Now I wanna talk to you about that. How much AI is in there
Starting point is 00:09:19 and how much Taryn is in there? So I've written a handful of songs and released them using AI. So it depends on the song we're talking about. I assume you're talking about Break Free. That's right. Okay, Break Free. So Break Free was composed with a software called Amper.
Starting point is 00:09:34 And the way I compare the process of making music on my own with making music with AI is I take more of a directorial or editorial approach. So the AI is spitting out all of the instrumentation, the stems of the instruments. I'm merely providing it with direction. I'm just giving it a BPM tone style instrumentation. And then I can iterate as many times as I want
Starting point is 00:10:00 until I get something I'm happy with. And then I arrange that into a pop song. I also wrote the lyrics and the vocal melodies. So that's me. Which came first? The lyrics or the melody or the song? The song came first. So the song inspires the lyrics.
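The directorial workflow Taryn describes, giving the AI a BPM, mood, and instrumentation, regenerating as many times as she wants, then arranging what she keeps into a song structure, can be sketched as a loop. This is a hypothetical illustration only: none of the function or parameter names below come from Amper's actual API, and the "taste function" stands in for the artist's judgment.

```python
import random

def generate_stems(bpm, mood, instruments, rng):
    """Stand-in for the AI composer: returns one candidate stem per instrument.
    Each 'take' id represents a freshly generated piece of instrumentation."""
    return {inst: {"bpm": bpm, "mood": mood, "take": rng.randint(1, 1000)}
            for inst in instruments}

def iterate_until_happy(direction, is_good, max_tries=50, seed=0):
    """Re-roll candidate stems until the artist's taste function accepts a batch."""
    rng = random.Random(seed)
    for attempt in range(1, max_tries + 1):
        stems = generate_stems(direction["bpm"], direction["mood"],
                               direction["instruments"], rng)
        if is_good(stems):
            return attempt, stems
    return max_tries, stems  # give up and keep the last batch

# The human's high-level direction -- the only creative input to generation.
direction = {"bpm": 120, "mood": "uplifting",
             "instruments": ["piano", "strings", "drums"]}

# A toy taste function: accept any batch whose piano take id is even.
attempts, chosen = iterate_until_happy(direction,
                                       lambda s: s["piano"]["take"] % 2 == 0)

# Finally, arrange the accepted stems into a simple pop structure.
arrangement = [(section, chosen) for section in
               ["intro", "verse", "chorus", "verse", "chorus", "outro"]]
print(attempts, len(arrangement))
```

The point of the sketch is the division of labor: the generator produces all the instrumentation, while the human only supplies direction, a rejection/acceptance signal, and the final arrangement.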
Starting point is 00:10:13 That's right. So there's this sort of communion happening between the music the AI has generated and your consciousness and those two are blending together to produce a song. And it's kind of the perfect communion because I can't produce tracks. What do you mean?
Starting point is 00:10:29 I've never been able to produce my own music. So it was funny when I came in here and you said, you probably know so much about music. And I'm like, well, I always wished that I could just create music using only my voice and record vocal cords as each instrument and then tell the software, you know, turn this into the flute or the violin,
Starting point is 00:10:45 turn this into the chord because I don't understand music composition. Why? Just never was taught. Never was taught, took it on as a hobby. Really love writing songs but I have, there's a disconnect between myself and the actual process of making music.
Starting point is 00:11:03 The process of learning how to make music. I suppose part of it is just, I didn't start making music until after college. So I was already working. It's really hard to pick up a hobby with Pro Tools or Logic. Yes, sure. And not having anyone to teach me.
Starting point is 00:11:18 I just, I can do very basic stuff in Logic. Whoa. Very basic, very basic. So using the AI is so helpful because I basically get all this inspirational source material. I can pick what I like, dump everything else and create something that I find inspiring. Wow, so you have this beautiful voice.
Starting point is 00:11:34 And I've assumed because you have such a beautiful voice, a lot of times people who are able to sing really well, they understand intuitively, they understand music. And so picking up an instrument is easier for them. I wish. I'm devastated, I can't play an instrument. Oh, I love this. This is great.
Starting point is 00:11:55 Cause this is something that I've just been discovering and I've been spending a lot of time thinking about it, which is that I am really interested right now in what, and there's probably a better term for it, but it's like imprinting of powerful people in our lives onto us. And then the sort of amnesia that happens where you forget that you've been imprinted.
Starting point is 00:12:18 So certain behavior patterns that maybe aren't that great that come from a powerful force in our lives when we were kids. Like my dog, Fox, I did this once, I don't do it anymore, but I took my belt off and I lifted the belt up, you know? I wasn't, I don't even know why I did it, but he saw me lift the belt up and he flinched.
Starting point is 00:12:41 Like he's going to hit me with a belt. And then I realized, oh fuck, that dog has been beaten by a belt. It's a rescue dog. And so the flinch, I could tell a little bit about the dog from the flinch, right? And so then from, then I've been thinking like, oh, how do, what am I flinching at?
Starting point is 00:12:56 You know, what things do I flinch in life? And one of those things for sure has been the learning of music as in the actual scales and notes and playing, playing music. And the reason, and I've realized like, wait, why is that? Like, I know how to do Photoshop. I know how to do Logic.
Starting point is 00:13:15 I know how to edit. I know how to do this and that and this and that. I can make rice, but wait, somehow music, there's this weird boundary between me and learning the music. And then I realized, oh, I bet somebody got in my head when I was young, because then I started thinking like, shit, I kind of think of musicians as X-Men.
Starting point is 00:13:36 Do you know what I'm saying too? Oh, I do. But could it also be that the language of music, the language in which music is taught just doesn't work with your communication interface? You know, like, even just with piano, I did play piano a bit growing up. And there were two schools of thought in piano.
Starting point is 00:13:54 It was like, learn by sheet music or learn by ear and chord structure. And I learned by the first. So I can actually play the piano with a piece of sheet music in front of me. But I have no idea what chord structures mean. And when people start trying to teach me, it almost feels like a brain scramble
Starting point is 00:14:12 because I learned through this other entry point. Similarly, using some of these AI programs like Watson or Google Magenta, where I'm actually coding and there's a little bit more technical expertise required, I have a new entry point into the language of music that I didn't have before, vis-a-vis code. And that seems to make more sense to me.
Starting point is 00:14:30 So I just wonder, maybe you just haven't figured out, or even put a nail on, the actual mode of communication, the teaching of music, that works with your brain. That's a beautiful way to put it. And it's specifically the teaching part, which is that there's a verse in the Bible. I don't know which part of the Bible. I think it's the New Testament.
Starting point is 00:14:53 You can tell the son, you can tell the father by the son, right? And so you could kind of tell the student by the teacher, right? And that thing that you're talking about there, it seems like many of us got exposed to some weird teachers or something and began to like-
Starting point is 00:15:10 100%. 100%, right? And it's the same reason why certain kids can understand calculus right away and others can't, but then with the right teacher that provides a new entry point into that, it's like, bam, it all makes sense.
Starting point is 00:15:22 The world opens up, the universe suddenly makes sense. Exactly. And that is what I wanted to talk to you about in terms of AI, which is that if we can tell the father by the son, and we are building an AI right now, and we are all sort of communally teaching
Starting point is 00:15:45 this emergent technology how to be a thing, do you think we can predict based on human personality, the personality of the incoming superintelligence that's about to emerge into the world? That's a really good juicy question that I am not, I don't feel I am suitable to answer, but I mean, just even off of, even looking at the AI that I've been working with in music,
Starting point is 00:16:19 I can tell a lot about the engineers that built the AI based on how they think about music structure, how they think about style, how they think about tone. And I don't even have to understand chord structure to just basically derive certain patterns from the output of various categories that they prescribe to music.
Starting point is 00:16:43 Okay, gotcha. So I would imagine that I could apply that same sort of script to other forms of AI and say that yes, you could certainly determine certain things about the engineers who set up the parameters. The parameters of the world, they're setting the rules and the AI is kind of playing within that field.
Starting point is 00:17:05 But if you're working with an AI that's simply doing massive pattern analysis on a tremendous amount of data, you really have no, I don't think you have any way of knowing what that AI is going to predict or come up with, because humans, we're just not very good at predictive data analysis on our own.
Starting point is 00:17:19 No, we're not, I mean, we're okay at it, but if we were back. Intuitively, but a lot of times we don't understand the mechanisms underneath the car hood, so to speak, determining these choices that we make. Okay, I want to talk about that. So understanding the mechanisms under the car hood related to decision making.
Starting point is 00:17:38 How much of that do you think is? I love thinking about this stuff, by the way. Me too. We are very advanced AIs. Well, that's right, yes, definitely. This is, there's a, you know, the philosopher Gurdjieff, you ever heard of him? So yeah, he said most people are just spiritual machines
Starting point is 00:17:57 that most people imagine they have some autonomy, but everything they do is just a series of learned behavior patterns, and there really isn't much freedom in there at all. And so he gives a lot of examples. The, like, one of the best ones is the next time you buy something, just watch the way you buy something
Starting point is 00:18:18 and you realize you've gone into, like, buy something mode, the way you take your card out, the way that you look at the cashier, the way that you swipe the card, the way that you say thank you. This is not, you're not doing genuinely novel buying methods when you buy stuff. You might have variants.
Starting point is 00:18:36 Like, you might be especially fake-kind to them when you're buying something, or perhaps you feel really good and you might really look them in the eye and be like, thank you so much. I hope you're having a great day. But in general, it's all variants of the exact same series of motions
Starting point is 00:18:51 and things that you say. And so his point is like, okay, well, we've got that. Where else is that located? And everywhere. Ah, yeah. I mean, I think. Do you think humans are, there's any autonomy in humans,
Starting point is 00:19:06 or do you really think that we're just sort of a mathematical equation? It's really not fun to think of ourselves as mathematical equations. So I prefer to tell myself the story that we have some level of free will. Most of my really, really smart mathematician friends would side with the philosopher
Starting point is 00:19:25 that you were just quoting saying that there's no free will, and they're smart and I trust them. So, but I don't know. To me, if we're thinking about, if we're thinking about our decisions, our behaviors as a product of our collective experiences and sort of our brains, our pattern matching
Starting point is 00:19:44 off of our past experiences and choosing the best route. While you could argue that that doesn't look like a lot of free will, if you were to add a new experience into the equation, something that's really random, something that kind of adds chaos into the mix, a differential, so to speak,
Starting point is 00:20:01 then all of a sudden you kind of upset the system, and then you could create a new sort of behavior. So I suppose the mathematician people would maybe argue that that's still not free will, but I don't know, it creates some more fun. This is beautiful. As an individual. I love it, there's so many stories in Zen
Starting point is 00:20:18 of a student being with a teacher, and the teacher will do something outrageous, just something that doesn't make any sense at all. In fact, one of the methods in Zen is the koan. You know about the koan? So the koan is like, the one you always hear is what's the sound of one hand clapping? So it's a kind of, it's a question
Starting point is 00:20:40 that just doesn't make sense. It's like, there's many different versions of them. There's a lot of koans. So you're sitting there meditating day after day after day, and this teacher, well, every few days you go into a room with a Roshi who asks you this impossible to answer question while you've been in a kind of,
Starting point is 00:21:02 I guess you could say sensory deprivation experience. You're just sitting still staring at a wall. And somewhere in there, the question basically cracks your code. What did you call it, a differential? Called it a differential. Yeah, I don't know that I made any sense, but. Yeah, that's pretty cool.
Starting point is 00:21:20 A differential, a chaos engine, a kind of like random number generator. Random generator. Just blap. And then apparently. It throws the whole AI out of whack. It throws the AI out of whack. And then what happens, what emerges out of you,
Starting point is 00:21:34 apparently is some form of awakening. It creates a kind of like awakening, which is, I guess that would be when the AI gains true sentience, but it's for a human, right? I mean, people are so worried about AI gaining sentience. I'm like, why aren't we more concerned about humans gaining sentience?
Starting point is 00:21:55 I know. I'm way more scared of people. Yeah, well, I mean, for sure. Isn't that interesting? Cause people are speciesists, right? We are so caught up in ourselves that we're like looking at like AI and wasps. And you know, and we're like, it's bad,
Starting point is 00:22:10 that's bad, it's potentially bad, it's gonna sting us, it's gonna get us. Meanwhile, I mean, our country, it's been at war 93% of its history. Did you know that? That's crazy. 93% of the history of the United States, we've been at war.
Starting point is 00:22:25 It's been 20 or something like 25 years where there's been peace in this country, in the entire history of the country. And so we are looking at the AI. Isn't that fascinating? And we look at the AI and we're like, what's this? Terrifying.
Starting point is 00:22:41 Right. But in a way, if the concept is that the programmer imprints itself on the programming, perhaps there is a good reason to be terrified. Yeah, that's true. Again, it just depends on what type of AI it is and how it's learning and iterating and where it's getting the data
Starting point is 00:22:59 and how the data is being tagged and categorized. I think that those are the things that people should be talking about. Well, let's go deeper into that. And specifically like, I guess you could say, disruptive technology. You seem to be someone who is not only deeply aware of what's happening
Starting point is 00:23:18 in technology in the world. It seems like you gravitate towards some pretty controversial disruptive technologies and are exploring them with a really open mind. But also simultaneously, you became famous because of relative, I mean, because you're talented and brilliant and funny, but the medium that you've used for that is a new medium.
Starting point is 00:23:42 They call it a new media, right? So I wanted to ask you, what do you think is the blind spot right now in humanity? What is the thing that we're not seeing, or most of us aren't seeing, the sort of like meteor out there zinging towards the earth that hasn't been picked up by the satellites. What's the meteor technologically
Starting point is 00:24:10 that's coming from within us that could produce the most disruption over the next few decades? Oh, that's such a funny question. My boyfriend thinks about this a lot and he calls it horse shit problems. Have you ever heard this term? No.
Starting point is 00:24:24 So back in the early 1900s, apparently people in the cities, primarily New York City, were terrified about the sicknesses that were propagating because of horse shit. Like there was so much horse poo everywhere. It was getting into the water, seeping into the tanks, and people were dying and little kids were dying and this was a huge, huge problem.
Starting point is 00:24:44 And then of course, the car was born and no one could have predicted that the car would have suddenly solved the horse shit problem. But everyone was really stressed about the thing that inevitably became irrelevant because of technology. But we couldn't have predicted it. That's fucking cool. Isn't that cool?
Starting point is 00:25:00 So like, it's funny in some ways, some of the really, really big problems that are causing a lot of angst, even within the scientific community, maybe solved within five to 10 years. May not be the things that actually determine whether or not we make it as a species. For instance, climate change, right?
Starting point is 00:25:16 We don't know, could scientists in the next 10, 15 years figure out how to convert carbon in a way that solves this whole crisis for us? This is a long-winded way of saying like, I have no idea what our blind spot could be. Okay, you know what, actually no, I will give you an answer. I will give you something. I do think that the terror over AI
Starting point is 00:25:46 and progressive tech in general is unfounded. And I do think that we absolutely, I think we need it to survive. So I actually think the thing that could hold us back the most is legislation and public attitudes and beliefs towards technology and wanting to kind of rein it in. Because I just think we're such a mess
Starting point is 00:26:06 that if we don't figure out how to allow these other, I think we need to have a more science-friendly, tech-friendly environment to allow these technologies to proliferate and solve problems. Wow, okay, so the blind spot is probably not so much gonna be related to some disruptive technology as much as a disruptive mode of being
Starting point is 00:26:35 for a country or the planet. I'd love to see people in general just embracing adaptability as more of a lens through which to look at everything in their life. I mean, they're going to have to. Like everything from jobs, we know that there's going to be a huge, huge, huge issue with that in the coming years.
Starting point is 00:26:54 And I don't think we have done a good job of educating kids, particularly in the US, to be flexible and adaptable. They're not taught that way. They're not taught to think that way. And so it's less about training people into certain positions and more about establishing a certain attitude towards life
Starting point is 00:27:14 that Americans who are very set, I think, in nine to five and a certain way of life are going to have a really tough time transitioning. Yeah, and you say we know this massive unemployment is about to hit, but I don't think we do know. I think you know, but I don't think most people do. Well, I don't really know. I suppose I'm just, what I'm seeing come out
Starting point is 00:27:38 of the pipeline is insane. I just can't imagine that it doesn't somehow take over a lot of these positions. Well, we were, I mean, if you, like humans are definitely terrible at trend analysis, but we're not, you know, some people are pretty good at it because there's a lot of money in it. Like, and also if you're going to invent something,
Starting point is 00:27:57 you need to invent something based on technology that doesn't even exist yet. So you have to sort of predict what kind of chips are going to be in existence in two years so that by the time you get through product development, you can use those chips, right? So we're pretty good at it. And some of those people are predicting
Starting point is 00:28:19 something like a 45% unemployment rate based on. I believe it. AI. I believe it. The more I've taken a deep dive into some of these programs, I'm like, half the professions that were some of the best professions you could go into when I was in college
Starting point is 00:28:34 are going to be out of jobs in five to 10 years. Yes. And everybody, and so you say this to some people and their responses, that's true, but we're going to need robot repair people or we're going to need like, you know, coders who can work on the code or where the, you know, new jobs are going to emerge as these old jobs die out.
Starting point is 00:28:54 And I don't agree. I don't think that's the case at all. I think that's a pretty naive fantasy if you ask me, which is just, we're talking about efficiency here. And humans just aren't very efficient when it comes to doing stuff and they're expensive and they get sick and they break down
Starting point is 00:29:16 and they get pregnant and they go on strike and they want more and capitalism is a thing that is weirdly the one of the, I think the, how to put it, the capitalism is the pressure that is creating the AI. Like it's like- The pressure cooker. It's the pressure cooker that's building the AI
Starting point is 00:29:41 because for an employer, for anyone who has workers, for anyone who wants to profit off the backs of workers, there's nothing, there's two things that are very appealing to someone who has zero ethics. Two things, slavery and AI. Like, what's better than that? You know, AI is way better than slavery because you don't have to feed the AI,
Starting point is 00:30:12 you don't have to house the AI, you know? So- Yeah. Wouldn't you say that we're kind of on a- Compounded returns with AI that you don't get with humans. For sure. $10,000 into building AI software or $10,000 into the education of one human.
Starting point is 00:30:28 Which one's going to give you the better return? There you go. It's going to take that human 16 years. Wait, no, sorry. What, 26 years to get their PhD or 22 to get their bachelor's? Yeah. And then they got to go train and do something else because technology disrupts it.
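The "compounded returns" point can be made concrete with toy numbers. Both growth rates and the four-year education delay below are invented for illustration; nothing in the conversation supplies real figures.

```python
def compounded(principal, annual_rate, years):
    """Value of an investment compounding once per year."""
    return principal * (1 + annual_rate) ** years

# Assumed, purely illustrative rates: AI capability compounding fast from
# day one, versus a human whose returns start after four years of school.
ai_value = compounded(10_000, 0.40, 10)         # AI stack improving 40%/yr for 10 yrs
human_value = compounded(10_000, 0.05, 10 - 4)  # human: 5%/yr, starting 4 yrs late

print(round(ai_value), round(human_value))
```

Even with modest assumptions, the gap after a decade is dramatic, which is the "delta" referred to in the next exchange.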
Starting point is 00:30:43 No, I mean, I agree with you that the return bucket on digital intelligence is, the delta is so massive that we're absolutely going to see a huge shift where there are going to be a tremendous number of people without jobs. I do also believe that there will be new careers that we can't even, we can't even fathom. I mean, I've come up with a few of my own
Starting point is 00:31:04 since working with AI. What do you got? Well, I mean, so let's see, I've never even articulated this, but I'll do my best. Because I'm working with digital music in a form that I've never worked with it before. And I also have a little bit of background in VR and AR and I've been getting into haptics.
Starting point is 00:31:23 What I see as being like the new role of the musician in five to 10 years is that of an experiential musician that can essentially pair music with haptic technology, touch, and visuals to create these auditory experiences for people. And it's almost like a musical director of the future. And I know that, I mean, I just feel it in every part of my body that this is gonna be a thing
Starting point is 00:31:48 and that this is going to elevate music to a new level and there are going to be very specific people that will be very, very good at this. Wow. But we can't really comprehend that because we just don't really have it. We sort of do. If you go to Burning Man and you sit in a dome,
Starting point is 00:32:01 you can kind of get like a cool musical slash visual haptic experience, but that's not scalable and readily available right now. But it will be once we have the AR in our phones and like all these other things. It's just, I do see things popping up that we can't anticipate. I also think, at least I hope that with AI taking care
Starting point is 00:32:21 of a lot of the, I'll call it the BS crap of life that we don't like taking care of, that we can spend more time on like higher forms of cognition, higher creativity, empathy. I mean, we have a huge mental health problem. We need more really trained, skilled therapists and coaches and like people helping other people. Cause that's what people do best
Starting point is 00:32:46 and robots aren't so good at that. They don't have those skills. Oh, that's interesting. So I mean, I think we'll see a massive increase in mental health facilitation. I, at least I hope, robots can't do that. AI can't do that? I mean, it can diagnose certain mental health problems,
Starting point is 00:33:03 but in terms of actually working with a human, humans also just don't want that from a robot. We prefer other humans. So I think having that kind of facilitation could be, we see a massive increase in those people. Okay, cool. Maybe, right? I mean, I don't know.
Starting point is 00:33:19 I'm just, I'm throwing things out, but. Well, I think humans are, you know, this is one of the things I love about the Hare Krishnas is they, one of the things Prabhupāda said, and I think he draws this from some Bhakti Yoga scripture. I'm not sure which, the original purpose of a human or the purpose is to serve.
Starting point is 00:33:43 Like humans are meant to serve. So if we are robots, then our original program is to serve other people. So when you're miserable, generally, not all the time, but generally you'll notice there's a lot of selfishness that's going along with that misery, you know, and you'll realize like, wait, I haven't really done anything for anybody else
Starting point is 00:34:07 in quite some time, and that hurts, it's painful. And I think evolutionarily, you could see why selfishness would be a thing that you would not want to be a trait in tribal societies, communal societies, anywhere, really, because it's problematic, you know. But the, I think I disagree with you and that robots aren't gonna be able to, or AI,
Starting point is 00:34:34 I think saying robots is, it's a problematic word because people think of like tin cans. Cyborgs. Cyborgs, yeah. When probably, you know, we're gonna be encountering these entities inside of virtual reality, and it's gonna be some form of perfect CGI, making it indistinguishable.
Starting point is 00:34:52 I mean, if we're talking about efficiency, building a fucking mask and an Android and a thing versus like building haptics and a CGI, it's way more efficient, makes way more sense to build the stuff in VR than outside. Yeah, so I think we're gonna be encountering these entities within VR. And I think that the way that humans serve,
Starting point is 00:35:19 or do good things, is very similar to the way a sail catches wind. And I think that there's like a kind of transcendent consciousness in the universe that gets caught in the sail of a human's neurology. And I think that AI is going to begin to catch that wind. So you think the AI will learn it and be good at it and will accept the AI as being able to do those things
Starting point is 00:35:46 that we typically ascribe to only humans? Yeah, I think that humans are potentially more of an antenna than a, you know, and so the thing that we're picking up on is getting sort of fragmented by static and like a, and so I think AI is gonna get to the point where it picks up on that thing way better than humans. And then what starts coming out of it
Starting point is 00:36:13 is gonna be the opposite of what people are afraid of. It's not gonna be some overwhelming, evil, malicious, ice cold, predatorial wasp, not to be mean to wasps, because we don't fucking know what they're like. It might be great. I think it's potentially just gonna be like a volcano of love that comes out of technology.
Starting point is 00:36:38 Well, that's fun. I hope. I mean, those are interesting frames to play with. I would venture to say that most of the AI engineers that I've spoken with in depth would say the AI of today would have to be completely re-engineered to function in the way that you're talking about. Right, but totally fun.
Starting point is 00:37:02 Fun to think about. Yeah, absolutely, impossible. Listen, I have a PhD in bro science, okay? So. Wait bro, is this for legit? No. That would be really funny if you made up your own PhD. I do, anyone can.
Starting point is 00:37:17 You can give yourself a PhD, you're totally allowed, but you can credential yourself, which is people aren't gonna take the credential seriously, but so yeah, to get into that concept of like, well yeah, we're not gonna build a fucking love AI, you hippie, it's like we're working on other shit right now, like missile launching systems, but isn't the idea that AI is gonna get to the point
Starting point is 00:37:39 where it begins to improve upon itself, where it begins to start making these kinds of improvements in the code based not on an engineer, but on some kind of pattern analysis? Yeah, I mean, still based upon first principles given to it by human engineers, so a little bit tough to see how it can reach beyond now, at this point, reach beyond sort of human capacity
Starting point is 00:38:08 of understanding, I'm not gonna do a good job of explaining. I know what you're saying, it's like so it's kind of like, if the engineer, the limitations of the engineer are not going to be surpassed by the thing the engineer builds, the limitations of the engineer are insurpassable, is that what you're saying? I mean, it can be in a binary way,
Starting point is 00:38:34 like if you look at OK Go, I don't know if you saw it followed the chess champion versus AI, and what was so shocking about, they've done this now three times with three different versions of the AI, and each time the AI has become better and stronger and has their various strategies that they employ to see which AI is the best AI
Starting point is 00:38:56 at beating the chess champions, then the last round, the AI scored, I can't remember what the exact score was, but it scored so much higher than the reigning chess champion, whereas before it was pretty close, they were like, what in the world? How did this AI do it? And the AI, ultimately, they found that this AI
Starting point is 00:39:18 was playing moves that not a single person in OK Go had seen ever before. Oh wait, you're thinking of the band OK Go, right? Oh, I'm thinking of Go, but yes. OK Go is the band that does this hyper-synchronized, like treadmill exercises. So for a second, I'm like, whoa, fuck. I think I'm getting fuzzy after my Friday night out.
Starting point is 00:39:42 OK Go, that's cool. No, I know what you're saying. Well, because there's a... It's a better name for the game, PS. Go, give me a break. It's the worst name for a game of all time. Well, and I'm getting confused because I don't think, I don't know that Go players would consider chess.
Starting point is 00:39:55 It's not really, right? It's Chinese game. It's ancient. And there are millions of moves. Definitely, there's someone out there who's like, Go is not chess. Yeah, this girl's so stupid. What a nerd.
Starting point is 00:40:07 Get off my podcast, dork. It's like a chip. It's a strategic game, OK? Yeah, chess is chess. We, no, don't worry, it's OK. It's been a while since I've read the story, too. And I can't even remember what type of AI it was that built the...
Starting point is 00:40:24 I think it was rule-based AI where they just gave... So, yes, OK, I do know this. The first AI was simply trained on human moves. Thousands and thousands of human games over time, and it learned what patterns were best at winning. But when the AI was trained simply on the rules of the game and not on human moves, it was able to come up with new moves that humans hadn't contemplated.
Starting point is 00:40:42 That's right. It was playing games against itself. So it's true that if you had engineers giving AI certain rules and parameters, it might come up with something that that human engineer would never conceive of. There's still some kind of boundary in terms of the rules of the game being human... That's a human articulating that to the AI.
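The distinction being drawn here, an AI trained to imitate human games versus one given only the rules and left to play against itself, can be sketched in miniature. This is not DeepMind's actual system, just a hypothetical toy (all names are mine): a self-play learner for the simple subtraction game, where players alternately take 1 to 3 stones and whoever takes the last stone wins.

```python
import random

def train_self_play(pile=21, episodes=30000, alpha=0.3, eps=0.2, seed=0):
    """Learn the subtraction game purely by self-play: no human games,
    only the rules. Q[(stones_left, take)] estimates how good a move
    is for the player making it."""
    rng = random.Random(seed)
    Q = {(n, a): 0.0 for n in range(1, pile + 1) for a in (1, 2, 3) if a <= n}
    for _ in range(episodes):
        n, history = pile, []
        while n > 0:
            moves = [a for a in (1, 2, 3) if a <= n]
            # Epsilon-greedy: usually play the best known move, sometimes explore.
            if rng.random() < eps:
                a = rng.choice(moves)
            else:
                a = max(moves, key=lambda m: Q[(n, m)])
            history.append((n, a))
            n -= a
        # Whoever moved last took the last stone and won. Credit every move
        # backwards through the game, flipping the sign at each ply.
        reward = 1.0
        for s, a in reversed(history):
            Q[(s, a)] += alpha * (reward - Q[(s, a)])
            reward = -reward
    return Q

def best_move(Q, n):
    """Greedy move from a pile of n stones under the learned values."""
    return max((a for a in (1, 2, 3) if a <= n), key=lambda a: Q[(n, a)])

Q = train_self_play()
```

Trained this way, with no human games in sight, the policy tends to rediscover the game's known optimal strategy on its own: always leave your opponent a multiple of four.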
Starting point is 00:41:02 OK. So it creates some kind of form. OK, yeah, right, a form. Right, so the conversation is really interesting to me because sort of in the beginning, we're establishing, like, it kind of seems like humans are an AI. We totally are. We totally are.
Starting point is 00:41:16 And it seems like we're, and the term artificial itself is just another human term. It's like the term unnatural. When people start throwing around the term unnatural, it's really a problematic word. In the sense that everything comes from nature. And so now you're deciding what is nature and what is against nature.
Starting point is 00:41:36 And so now it's all kinds of problems. But in the same way this AI being artificial or intelligence being artificial, it seems a little odd. But so with humans, we explore, well, when did we start talking? When did we start interacting? When did we start the rules of behavior that we engage in?
Starting point is 00:41:56 Where do they come from? And you have a degree in anthropology. You've studied this, you know? And I imagine you have some idea of, like, where did it start? When did the human, when did, like, from the... You're talking about a degree from a long time ago. I know, I know. And again, I guess the thing to really get out there
Starting point is 00:42:18 is that, like, don't worry, I'm not qualified to talk about anything I talk about. And so you throw out whatever you think. It's okay. Like, just if we're sitting around the campfire, or even better, let's say we're at some amazing casino where the casino knows everything, right? And we're doing bets, like, on certain ways
Starting point is 00:42:41 that things happened in human evolution, right? So it's just a game of bet. It's for betting. So it's like, if you had to bet, if you had to put money down, if you had to compose some way of, if you had to compose a theory of when humans started gaining intelligence and talking
Starting point is 00:43:02 and doing what humans do, what would that be? What would it be if we're in Vegas and, like, you could win $1,000 if you come even close to the way it happened? And, sorry, ask me, tell me what I'm betting on. When did we start, what, the question is, when did our AI, if we're made up of code that is largely language, body language, communication, all the,
Starting point is 00:43:24 generally sort of established. Being able to think about thinking, then communicate. When do you think that started? I believe it was Mesopotamia, but I could not tell you when that was, the cradle. The cradle, right? That was, like, I think our first basis for seeing that humans were communicating via symbols.
Starting point is 00:43:44 Symbols, right? So somewhere around there, we've got cave paintings, we don't know, if you listen to Graham Hancock, he throws it back even further down the line, but at some point, you know, things began to act like humans that were proto-hominids. Something happened there.
Starting point is 00:43:59 And I studied psychedelics in my bachelor's degree, that was, like, my focus, and I know that that was a hugely transformative discovery. They actually say that it could be psychedelics that led to man being able to think about his own thinking. Yeah, right, but this is the Stoned Ape theory. Expanding the neural pathways. Yeah, that's something that's very impressive to me,
Starting point is 00:44:20 is that you, didn't you get a grant to travel to South America to study ayahuasca? I did. So you, how did you? In 2003, 2004. How did you do that? Like, when you, some thought popped into your head that was like, oh, I wanna go study ayahuasca,
Starting point is 00:44:38 but that's not just that. I wanna, I'm gonna get this funded. How'd you do that? I went to University of Miami, and so I had a number of professors in the anthropology program who were experts on the drug trade. They had moved to Miami in the 1960s and 1970s
Starting point is 00:44:55 to work for government programs studying the import and export of narcotics and other drugs. Wow. So they were just really into all this stuff. Total hippies just loved it. And so I'd taken a few classes from one of the professors who was really big in psychedelics and he'd said, you know, the only thing
Starting point is 00:45:14 that I haven't been able to spend much of any time studying is ayahuasca. And at the time, this was back in, you know, 2003, 2004, there really wasn't as much, nearly the plethora of information about it as there is now. I mean, I remember really struggling to find any kind of literature in the English language about ayahuasca.
Starting point is 00:45:33 So it really just was born out of curiosity. I ordered some books from South America. They came, I was like, this is really cool. And there was a grant program that was pretty progressive and they accepted my proposal to go down there. How long did you go down there for? Month and a half. And how much ayahuasca did you drink when you were out there?
Starting point is 00:45:53 So here is the really sad, depressing, disappointing part of the story. I was supposed to take it because as an anthropologist you're encouraged to assimilate into whatever it is you're studying. But I took, you know, the malaria shots that they make you take when you go into the jungle? Yeah, sure.
Starting point is 00:46:09 So there was this one and it changed the strain, I think every year or every few years because the malaria changes. But there is this one strain back in 2003 to 2005. You can look it up. Where like five to 10% of people were having heavy, heavy hallucinatory experiences and even split personality disorder
Starting point is 00:46:28 showcasing all kinds of terrible mental illness in up to like 30% of the people. So I was one of those. So I got down to Peru and I was hallucinating my balls off. Was it Lariam? Did you take Lariam? It was Lariam, yeah. Okay, so what happened is you had Lariam poisoning.
Starting point is 00:46:46 I did, and it was bad. And so all the shamans that I was working with were like, nope, not gonna happen. But you know what was so interesting? I kept a journal through the entire time. And something about Lariam, I actually felt because I was so immersed in learning about this drug and the culture,
Starting point is 00:47:06 I felt like I was on it. I was having the same experiences that I was reading about and the same kind of profound internal disorder that I'd been reading about in different accounts. And then I ended up doing ayahuasca later, like seven years later, six years later. And it was remarkably similar
Starting point is 00:47:29 to the experience I had on Lariam. Wow. So I don't know if I was just channeling the experiences I was reading about and my brain was just open to that or what was going on. But it was definitely scary and it lasted a little longer than I would have liked.
Starting point is 00:47:44 It went on for like three months. The Lariam. Yeah. Well, you know, one of my, a friend that I went to college with wrote a book called The Answer to the Riddle Is Me. And thanks to Lariam, when he was in India, he got complete amnesia
Starting point is 00:47:57 and ended up in a mental hospital because of it. So he's had a profound experience with it. Oh, I mean, I profound as- Yeah, it was profound. I think it was profoundly destructive in his life because a lot of people, what's really scary, and I want to give more into the ayahuasca and the psychedelic realm that you entered into.
Starting point is 00:48:19 But side note, what's really creepy is in Guantanamo Bay, they're giving the prisoners there three times the recommended dose of Lariam. Why is that? Well, they say, of course, well, you don't want them to get malaria, but it's a kind of Lariam waterboarding. So the people who are imprisoned in Guantanamo Bay
Starting point is 00:48:39 are not just dealing with like being imprisoned eternally in some kind of weird prison, but they're also being injected with this like, incredibly dangerous drug that fucks up your brain sometimes permanently. I thought that they, I thought that it's been banned. Maybe it's been banned. Okay.
Starting point is 00:48:57 I think it was like around 2008 or something. I don't know. I actually don't know that much about the Lariam thing, but Europe, it's banned. They don't, they don't, I was, I went to India and I took Lariam. One of the most rotten fucking dreams of my life. And I was describing it to this Australian dude,
Starting point is 00:49:11 cause in the dream I like got disemboweled by some kind of skeleton on a pirate ship. Yeah, it's pretty scary. I felt it. I felt being disemboweled. And I was explaining this to this Australian dude. And the Australian dude is like, are you on Lariam? And I'm like, yeah.
Starting point is 00:49:25 And he's like, stop taking that shit right now. Are you doing the oral? Yeah, oral. See, the problem was I had done the shot. So you can't, you can't remove that, but good that you were on the oral. Yeah. I think they got rid of the shot.
Starting point is 00:49:36 Well, I hope they got, they should get rid of the entire, it's like, if we go off on that, it's like the, cause the military industrial complex teamed up with like a corporation to make the shit. And it's like a whole mess, but I want to get back into your experience on ayahuasca and your experience in the psychedelic or the sort of,
Starting point is 00:49:55 I guess you could say the shifted consciousness that came from Lariam. Could you describe that to me, what you experienced? And you mentioned that you had some revelations during this time. What were they? What came to you out there?
Starting point is 00:50:08 From the Lariam or from the ayahuasca experience? Either one, you're out in the jungle, you've expanded your consciousness, you've dropped some filters, you're picking up some data streams that most people aren't picking up. What'd you get from it? It's probably easier to talk about
Starting point is 00:50:19 my real ayahuasca experience, which came six, seven years later, because I think when I was younger and dealing with the thing with Lariam, that was just terrifying because I didn't know what was going on. I had very little structure or people around me that could help me figure that out.
Starting point is 00:50:35 And so while there was certainly some revelatory experiences that I wrote about in my journal, it was really hard for me to unpack and make sense out of it in a meaningful way. Versus doing my ayahuasca experience, whatever that was, six, seven years later, at this point I had frameworks around what I could do with the things that were coming up
Starting point is 00:50:57 and how to be with them. For me, ayahuasca was, as for many people, a mixed bag. I did it twice. The first experience was pretty harrowing, but not in the way that a lot of people would expect. I pretty much just felt like I'd poisoned myself. Actually, probably not that dissimilar from my Lariam experience,
Starting point is 00:51:20 where I just felt like I took an awful poison and was kind of stuck in this cycle of birth and death, birth and death, but it was happening so quickly, almost like you've got the fast forward button pressed down on a tape, and you just can't even make sense. My receptors were taking in so much data that it was overwhelming my system,
Starting point is 00:51:43 and this lasted for 10 hours, just like total system overwhelm, sickness, my mind on overdrive, I thought my mind would explode. Wow. It was not fun, and I came out of it with very little insight into why that happened to me, other than just like, that was awful. No, I did not throw up.
Starting point is 00:52:02 Wow. I wanted to so badly. You felt nausea? I did, yeah. What does it taste like? It tastes pretty awful. It's like licorice mixed with tire rubber. Oh my God. Mixed with,
Starting point is 00:52:17 so you haven't done ayahuasca? Oh, hell no. No. You know, I've talked about it so much in the park. I've done many, many psychedelics for my entire life, but ayahuasca I have a deep respect for, and I'm not gonna, I'm not going down that path until,
Starting point is 00:52:32 I feel, you know, it's a longer story. It's totally fair. So you experienced, when you say birth and death, do you mean like you experienced like a kind of ego death, or like you had visions of death? No, it was an algorithmic birth and death. It's so hard to describe. It didn't even have a humanness attached to it.
Starting point is 00:52:52 It was almost like the digitization of all of life. And like patterns and zeros and ones, but there was this, there was meaning attached to it, that I don't know why I could read the meaning, and the meaning was like, this thing is coming in, and then it's going out.
Starting point is 00:53:13 It's coming in and it's going out, and these are the building blocks, like the atoms of all life. But it was, it was just, I don't know why it was, it was an incredibly abrasive experience for me. Well, because, Arduous, like I was just exhausted by the end of it.
Starting point is 00:53:27 Right, well, you went to school for a little. I mean, it's a very, I think it's a class, and I think it's a very serious class, and I think people don't realize that. I think people are like, not all people, but some people, I think have some pretty serious misconceptions about what it is to enter into that classroom.
Starting point is 00:53:45 And I've been, you know, spanked by psychedelics many times for wandering into a grade that I wasn't ready for yet. But what you just described is actually something I've just gotten into recently, and I think maybe you should check it out. You'd like it. You know much about Pythagoras?
Starting point is 00:54:06 The Pythagoras theorem? The Pythagorean theorem, which is a way that you can use right triangles to like figure out, like you can use right triangles to measure stuff. But so basically studying like my very tiny little steps into the gigantic rabbit hole that is music, like led me to Pythagoras because so yeah, it is.
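For reference, the theorem being gestured at here is compact enough to write in a couple of lines; this is just a generic illustration (the function name is mine):

```python
import math

def hypotenuse(a, b):
    """Pythagorean theorem: the legs a and b of a right triangle
    determine the hypotenuse c, since a**2 + b**2 == c**2."""
    return math.sqrt(a * a + b * b)

# The classic 3-4-5 right triangle:
print(hypotenuse(3, 4))  # prints 5.0
```

The same habit of treating quantities as ratios is what carried the Pythagoreans from triangles to musical intervals, which is the rabbit hole being described here.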
Starting point is 00:54:33 I don't know anything about it, but I just heard of it. So basically he was a, we think of him as the father of math, which is hilarious, and geometry, which is math, and music is math and everything's math. And so the Pythagoreans, they were trying to establish like a, I guess you could say a kind of hierarchy
Starting point is 00:54:54 of like what's the most important thing? And they say math because if you like get rid of music, you still have math. If you get rid of geometry, you still have math. If you get rid of anything, you have math. So Pythagoras was saying that numbers were an expression of the infinite. And so what you're talking about is actually really
Starting point is 00:55:19 completely described by Pythagoras in a really interesting way, which is that we have this thing called the monad, right? And the monad is the sum total of all things. Like right now you and I, whether we like it or not are part of everything. And whatever that everything is, that's an X in an equation.
Starting point is 00:55:38 We don't really know. We don't have the, how do you encompass everything? But we could say there must be an everything. There must be a sum total of all things. And that sum total of all things, that's the monad, right? So the monad, that was the singularity, the pre-big bang conditions, the all that is encompassed into some kind of compressed
Starting point is 00:55:58 data set that was just a monad, right? So you could say that's the one. Now the moment a two happens, right? Which is that suddenly now it's no longer the compressed data set, but it's actually a thing that's aware that there's a compressed data set. That's birth, and that's also suffering. And now we're gonna run into a lot of fucking problems.
Starting point is 00:56:26 We go from this like womb of everythingness to this separation-ness, right? And so now we have a fucking two. Yeah, damn it, two is a problem, two is a problem. Two is a problem, two is like, now we have a problematic thing. And also if you look into, I get what's it called, the dialectic, right?
Starting point is 00:56:51 Now we have a thing and another thing, it creates a tension, right? And so that tension creates a three. And now everything's great again, because now we have the trinity, right? And so the three represents harmony, it represents triadic chords, it represents, so the Pythagoreans really fucking loved odd numbers,
Starting point is 00:57:09 because they thought odd numbers had a, the monad added to them, because if you take an even number and add a one to it, it becomes an odd number. So anytime you add the everything to an even number, you get an odd number, you get this kind of holy mixture, so to speak. So your ayahuasca trip was very Pythagorean.
Starting point is 00:57:33 Well, I wish it was a little more pleasurable, but yeah, that's really funny. Well, I mean, it's, it makes sense. It's, well, I mean, let's talk about that, because it's like, it isn't pleasurable, is it? Like, is it, do you, like existence itself, would you describe it as pleasurable?
Starting point is 00:57:54 I would say, for me, typically yes. I would place it in the bucket of pleasurable. That doesn't discount how hard being human is. And I suppose, for me, I'm fortunate that the majority of the time I'm living in a pleasurable state. You feel good most of the time. Ironically, my psychedelic experiences
Starting point is 00:58:23 have been the opposite. Right, yeah. So I feel a much greater dichotomy of experience. Yeah, right, describe that dichotomy of experience. You mean during the psychedelic state? Yeah, it's intensely uncomfortable for me. Like, even, it's very rare that I have a pocket of serenity
Starting point is 00:58:46 within a psychedelic experience. Usually, I'm just grappling with the intense weight of what is. Okay, that's cool. The intense weight of what is. Yeah, it's heavy. It's just not simple. It's super complicated.
Starting point is 00:59:02 It's heavy. It's nuanced beyond belief. So sometimes when I hear people talk about their transcendental psychedelic experiences that are just all beautiful, I'm like, how does that happen? Well, yeah, that's a great, that's a really great question.
Starting point is 00:59:20 How does that happen? And you know, people say, you've probably heard this, meditation is the preparation for death. I think psychedelics are also the preparation for death because I imagine that once you start dying, and really to get back to your psychedelic experience, it reminded me of something.
Starting point is 00:59:41 I briefly worked in hospice. And you have to go through this training program. And one of the things they said is people die the way that they've lived. So when a person is dying, the entire pattern of their life starts happening on a daily basis. I guess you could say that if human existence is an algorithm,
Starting point is 01:00:03 then the algorithm actually starts running on a cycle that's based on this impending annihilation. So it starts running itself out over and over and over and over and over again. So you see the pattern of your life as a person's life as they're dying. So if a person has been someone who has the tendency to say, create drama over and over again,
Starting point is 01:00:28 that thing where some people are like, this thing keeps happening to me. What's going on here? I just guess I got bad luck because they don't realize they're the one creating that pattern. So when someone's dying, that pattern gets created over and over and over
Starting point is 01:00:43 and over and over again. So quickly, quickly, quickly, quickly. So that thing that you were saying about this cycle happening, the idea is that underneath it all, the cycle of our own existence is happening right now over and over and over and over and over again. Like we are being born. Every tiny little behavior, thought, everything.
Starting point is 01:01:06 Yeah. It's part of that pattern. Yeah. Yeah, it's true. I also have this funny experience when I'm on a psychedelic of, I sort of play out the infinity of possibility, the infinity of thought and story. And so every time I have a perceived insight, this other part of me goes, no, but look at the other side. And then wait, and then look at the other side
Starting point is 01:01:38 and then look at the other side. And all of a sudden I'm looking at the eight dimensions of any one sort of thought or decision or behavior and realizing that it's all made up. But even the insights that I think I can come out with and feel like a champion with are bullshit. Oh, shit. Poor shit.
Starting point is 01:01:57 Exactly. So that's a little disgruntling because then you feel like, okay, maybe I've opened my mind up to the eight dimensions of possibility, but man, has it made making decisions or even developing some sort of code of existence really hard. Do you have a code of existence? I try.
Starting point is 01:02:18 What would it be? I mean, not to like try to reduce your code of existence to something you could say on a podcast, but if you had to describe it, what would it be? Well, I just hope, wow. Well, it depends on what we, how would you define code of existence? Like a kind of like,
Starting point is 01:02:38 I don't want to say battle plan because it's so violent, but just a kind of like general sort of like strategy to rest upon when it comes to my perception of the universe is sort of like, I don't know, an E equals MC squared or something. And by the way, having a code of existence, I don't know, I mean, it's a, it's a question that I suppose is like maybe impossible
Starting point is 01:03:02 to really answer, but it's fun to answer and whatever you say doesn't have to be forever. It's just, you know, if I had to throw one out right now, I would get, What would be yours? Everything's perfect. So, That's awesome.
Starting point is 01:03:13 Yeah, well, I wish I'd come up with it. I didn't. My guru came up with that, but if you had a code of existence or something you would throw out where it's like, you know, when you're, when you're leaning on to something, what would it be? Oh, that one's beautiful.
Starting point is 01:03:25 I want to steal yours. Right? It's yours. And that, well, I think that's the thing. Now it's, I really do believe we just all tell ourselves stories and that all the stories are true and that all of them are false
Starting point is 01:03:37 and that the best way to live is to tell yourself the one that makes you feel the best. And hopefully doesn't have too many negative consequences for those around you. Hopefully it's positive consequences for those around you. So that's my new one. It's just like, tell yourself the best story. I love it.
Starting point is 01:03:56 That's great. And then live by that. I love it. Well, I mean, and for you to be doing something like that, it's super important, isn't it? Because you're a kind of a tuning fork, aren't you? A lot of people attune themselves to the persona that you put out into social media.
Starting point is 01:04:13 Don't you, you realize that? I have no idea. No, I've never thought about it like that. Oh, that's never- I feel like that ascribes more meaning to my presence than what actually exists. But maybe, possibly. I mean, I'll take it.
Starting point is 01:04:27 Well, this is, I think it's 100% true. Many people I know who are, who have become famous, if I mention that to them, they respond just as you did, which is a kind of obliviousness to the fact that they're a tuning fork. And so it's, and it's almost like a necessary obliviousness because if you don't have the obliviousness,
Starting point is 01:04:51 and by the way, I don't know if obliviousness is a word, but I'm inventing it now. But if you don't have that kind of, if you don't have that sense of like, well, I didn't even know, then there's all of a sudden all this weird responsibility. So now there's this crazy responsibility that happens when you realize that one of the,
Starting point is 01:05:13 aspects of emergent technology that is incredibly disruptive is that people are coming to the forefront of society and then attuning people who are watching them to their frequency, so to speak. I love that. It makes total sense. I just wouldn't, I just have never placed
Starting point is 01:05:33 for myself in that bucket, but I, but there are certainly people I look at that I tune to. Me too. Okay. Who would be someone that you enjoy tuning to? Probably Charles Manson. I'm just kidding.
Starting point is 01:05:48 I'm just trying to make a dump joke. Okay. No, no, no. No, Ram Dass. I'm trying to be open. I'm just trying, yeah, I use Ram Dass as a tuning fork that I use. Nim Crowley Baba is a tuning fork that I use.
Starting point is 01:06:04 And then also I have other tuning forks when it comes to art or comedy or masculinity or femininity. You know, I have many tuning forks that I like. And then, so it's a variety of tuning forks depending on where I feel like I need to be attuned to. But yeah, you're a tuning fork. And so if you were to suddenly become sort of
Starting point is 01:06:30 despondent, cynical, angry, negative, jaded, any of these things, and you started putting that out there, then the impact that would have would be more substantial than the impact that say somebody who is not famous had. Now that is not to create a hierarchy where famous people are more important.
Starting point is 01:06:56 It's actually the truth of the matter because we're all connected. And every single one of us, every single one of us is a tuning fork. It's just to your social group or you've attuned yourself to someone in your social group. And regardless, you're tuning, you're attuning everyone as a tuning fork.
Starting point is 01:07:14 But when you get a bunch of eyes on you, like you have, well, shit, man, now you've got this extra responsibility that's a brand new problem, isn't it? This is a brand new problem. 1800s, 1600s, 1500s. This is brand new in society that people like you, which are right now being called,
Starting point is 01:07:36 what would you call your, what do they call you sometimes? A YouTube star, a new media star. That's what they used to call me when I was doing YouTube all the time. But now it's like, I don't know what I am. You're another word. I'm just am.
Starting point is 01:07:46 Influencer, they use the term influencer. Right, and we're in real trouble if the scope of the influencers. Out there are the ones leading, or yeah, leading the tune, so to speak, of public behavior. Maybe that's the meteor. Is that the blind spot?
Starting point is 01:08:08 Is that the blind spot? Actually, that's a great one. That is a great one. Yeah, yeah, we're not, we're underestimating the impact. Like some people are underestimating the impact that they're having on society. So, you know, you see like, Christ,
Starting point is 01:08:29 what's the name of that kid who got in so much trouble for filming the hanged dude? Oh, tip of the tongue. Can't remember his name, but anyway. Yeah. I'm not supposed to know his name, 44, but like that kid, he's like, got a bunch of young eyes on him, right?
Starting point is 01:08:49 You could almost say he's a programmer, right? Right, programming behavior. Yeah, maybe a better term is not influencer, but programmer, right? Like these people are programming the viewers into. The masses. Into imitating their behavior patterns. Yep, and it's like a rampant narcissism
Starting point is 01:09:12 that I don't think we've ever seen. And it's narcissism that stems from insecurity and wanting to get attention and be liked and all those things. It's actually something I think is adolescents we all experienced, but never on this very public level that these young influencers, you know, and their insecurities are manifesting
Starting point is 01:09:38 in the form of the buttons that social media provides. But yeah, that would be a big blind spot. What are those kids in 10 years, how will they function in society? Right, yeah. In a world where their value has been so eschewed toward social media and. Materialism, but more of just a kind of like
Starting point is 01:10:05 chaotic flamboyance that's like, you know, sort of loud and unapologetic. And in a lot of ways, colorful and enticing and interesting and especially for kids, kind of a rebelliousness. You know, like Elvis gets on stage, starts gyrating his hips. Like parents are like, what the fuck?
Starting point is 01:10:30 Oh my God, they've never seen that before. Elvis is gyrating his hips and he starts gyrating his hips and now everyone's gyrating their fucking hips because fucking Elvis did it and he's beautiful and cool. And that's, but Elvis, it's like, you weren't able to go online and look up like, I want to check out Elvis's vlog.
Starting point is 01:10:46 Yeah. How does Elvis actually perceive himself and speak about himself and derive value for himself? We didn't have those monitors back then. That's right. And so people were attuning themselves to this kind of like sort of intermittent data stream, which is based on the sum total
Starting point is 01:11:06 of all Elvis videos available. And you couldn't even, you look it up. It had to come on TV or the radio at the right time. Now the data streams are instantly accessible and the data streams are tuning forks. So I wanted to ask you, do you feel a kind of pressure to put, when you put yourself out there
Starting point is 01:11:28 to maintain a kind of tone, you know? Do you feel pressured to like, shit, man, I got to keep up this like energetic feel that I'm putting out there. I used to. I mean, that's why I left YouTube. That's why I quit. I mean, I still upload videos every so often
Starting point is 01:11:45 in support of my projects that I care about, but I don't upload as a YouTuber trying to maintain an audience and trying to make a living. That's the difference. And I was doing that for a couple of years and it was utterly exhausting. I drove myself into one of the darkest
Starting point is 01:12:02 depressive states I'd ever been in. And I do think that YouTube was a huge part of that and those kinds of pressures that you're talking about because they're real, they're financial, the algorithms are punishing. You can't just take a week off from YouTube. You can take a week off at a normal job. You can't on YouTube, those algorithms will punish you.
Starting point is 01:12:23 So you wind yourself into the ground. You feel you have to be one thing and one thing only and now, honestly, I do what I want and I'm just lucky that I've been able to transition not to say that I won't hit invariably a number of patches in the near or foreseeable future of my career and it's always ups and downs and changing,
Starting point is 01:12:46 but I'm glad that I was able to transition out. I know a lot of people that feel very stuck in doing that and they are going to continue doing it until that audience is just suck dry. Why are they gonna continue doing it? Well, for one, it's addictive. It's addictive getting that dopamine hit every time you upload a video
Starting point is 01:13:09 and you get immediate commentary, immediate instant gratification for your work. I was working as an actress and a writer before I went into YouTube and you had a distance between yourself and the audience in a way that you didn't get the same kind of instant gratification. So the constant, I would say their dopamine hits
Starting point is 01:13:28 and then you just become obsessive about checking, checking the stats, you're trying to increase your numbers. There is something really empowering about that too that you can build this thing all on your own without some gatekeeper saying yes or no, right? Which is like traditional Hollywood. So there's something really empowering. I get why people do it and so it's not all bad,
Starting point is 01:13:54 but I think you just become a slave in many cases and in some ways the gatekeeper that you don't realize you end up answering to is YouTube or Instagram or whatever, the algorithm makers. You're a slave to them. You don't own that audience. The audience is not yours. And so I don't know, I just think,
Starting point is 01:14:15 I can't remember where I was going. That's incredible. What you just described is really incredible. I mean, what an incredible thing to be chained to, which is this like, in the same way. And if you're vlogging, like you said, you've got to be on every single day and you're this person, I don't know.
Starting point is 01:14:32 I actually, I feel, I don't, I very rarely post anymore. I don't really vlog and I rarely post on Instagram or Instagram story because sometimes just the thought of even getting out my camera and taking a photo, never mind a video of an experience I'm having, seems so jolting to my senses of like how I want to experience my reality. Oh yeah.
Starting point is 01:14:58 That I just can't even do it anymore. It's weird. How cool. So this is really, so hard. I know. Do you, do you partake? Like how do you feel about the social media sphere? You know, I have an epic Twitter.
Starting point is 01:15:12 Thank you very much. I have a podcast and this podcast. This is great though. This is freeing without a video camera shoved on our faces. Right. That's the, I know that's sort of one of the reasons I really have hesitated to do video on the podcast is because I feel like the existence of the cameras produces a kind of shift in the energy field.
Starting point is 01:15:34 And I don't like that shift in the, I don't want people to worry about the way they look or to worry about like, you know, whatever, you know? And also I don't want them to worry about the inevitable comment about this thing or that thing about the way they look or I want them to just like be in the moment and of the conversation.
Starting point is 01:15:50 But with podcasting, it's similar. I've got to put a podcast out every week. I can't like the algorithm is incredibly punishing and even just putting it out once a week is not enough. I should be putting two out a week. And also though, on top of that, the deeper problem I think is that in the same way that you become a tuning fork,
Starting point is 01:16:16 the same way that we're all tuning forks. It's not just as though you're a tuning fork that's tuning others. You're being attuned by the comments and the responses and the things that you're talking about. So there's this mutual tuning that's happening and what's, I think there's an evolutionary reason that the shitty comments stand out
Starting point is 01:16:39 in the same way that if I'm walking through- Survival. Absolutely. You and I, we're on a walk through a jungle, right? We can pass an infinite number of plants and just filter them all out cause it's just green shit. But the one tiger, the one snake, the one dangerous thing, it's gonna pop out of the background, right?
Starting point is 01:16:58 So that means that the tuning that we are getting is going to be coming more often than not from the hopefully anonymous, anomalous, shitty comment, right? So the danger is that you can actually end up being like unwittingly tuned to darkness just by looking at the comments. Or just, or you've fucked the comments, getting caught up in the numbers, being counting, right?
Starting point is 01:17:26 Now you're attuning yourself to levels of metrics of popularity, but then on top of that, you want that because you're selling ads, right? And you make more money. And so, and then you start realizing like, wait a minute, is this greed? What's going on here? Am I just greedy?
Starting point is 01:17:43 What's going on? Am I starting to make stuff? Not because I wanna put something into the world or the joy of making stuff, but because I'm trying to like profit. Well, and part of the problem is our market for content is it's so backwards. So actually, to make money as a content creator,
Starting point is 01:18:06 you have to have advertisers. That's the only way that you can currently make money as an independent content creator for the most part, is having some sort of brand sponsorship or advertisement. Or you drive those numbers up so high on something like YouTube where they're just placing ads and you're not actually integrating, you're not doing brand integrations,
Starting point is 01:18:25 which is actually much less soul-sucking than having to like figure out how to integrate brands into your content all the time. So in a weird way, you're like incentivized to get the higher the number, so which can be its own creative trap. Like what do I do to get the most attention so I don't have to do the brand integrated stuff
Starting point is 01:18:42 that's like soul-sucking. So it's all, it's all like. Yeah, so that entire problem. So this, and this is why it feels like there is a certain level of almost, there's a certain level of like psychic shielding that goes into imagining that you're not having an impact or not even thinking about these things.
Starting point is 01:19:05 Cause the moment you start thinking about them, you begin to have to explore something in your own life. So, and it's exhausting, you know? And I love that you are willing to admit that you got depressed from this experience. Yeah, that's legit. I mean, I think the depression probably came from like other things, but it was exact,
Starting point is 01:19:28 it was certainly exacerbated. It's hard to know exactly how things kind of grow. Yeah, it's not, it was a huge part of the problem. Wasn't helping. I guarantee looking at your phone and trying to look at those numbers every day is not, I mean, when you're going to the psychologist and you're depressed, one thing they're not going to tell you
Starting point is 01:19:50 is you should look at your phone and your Instagram more. Look at your likes more. That'll cure your depression. That'll fix everything. Look at the comments. Just look at comments of anything. You're going to feel better in a couple of days. You feel uplifted.
Starting point is 01:20:02 Guarantee. But you know, one thing that really bugs me these days is like, so knowing this, knowing that like, you can sort of subconsciously get trained by your followers on this thing or that thing. You know, it's kind of like, I don't know if there was a professor teaching conditioning and the class got together before one of his lectures
Starting point is 01:20:26 and they agreed that they were going to act like they were paying attention when he was on one side of the room and pretend they weren't paying attention when he was on the other side of the room. And they ended up making him stand completely on the side of the room where they were pretending to pay attention.
Starting point is 01:20:42 They controlled him with their attention spans. Wow. That's amazing. Isn't that amazing? And so there's a lyric and I've just completely fixated on it. It's a Paul Simon song and the lyric goes, this is a lonely life, sorrow everywhere you turn.
Starting point is 01:21:01 Now that's worth some money. Think about it. That's worth some money. And so to me, the thing that I'm beginning to realize is that so many people on Twitter and social media are blaring out a message of sorrow and darkness. The world is falling apart. We are doomed.
Starting point is 01:21:21 Everything's bad. It's awful. And when they do that, they get a zillion fucking retweets. And the reason they get the retweets is because we are programmed to fixate on fear. So it's called fear mongering, right? And there's a lot of fucking money in it. And a lot of people I think don't even realize
Starting point is 01:21:44 that they've become fear mongers because they've been subconsciously programmed by likes and faves. Right, so they don't even know what they're doing. Yeah, yeah. Wow, that's amazing. Great study too. Well, yeah, someone should do the study,
Starting point is 01:21:57 but when I look at the stuff you're putting out, I'm quite impressed because there's a light to it and a joy to it and a beauty to it. And I think it's really wonderful that you're doing that. Thank you. Thank you. Just one last question. What do you got coming up?
Starting point is 01:22:16 What's your, like, you made this incredible album that got like, it's just amazing. What are you doing now? What's your next step here? Well, so the album itself, like the full album comes out September, September 27th. So I've released a few singles. Oh fuck.
Starting point is 01:22:30 Yeah, it's okay. It's all good. Wow. And then, and so I'm really excited about that. And then I'm also currently co-directing a film, a documentary on the future of Man and Machine. And that's my, that's most of my time being spent on that right now. That will be finished in September.
Starting point is 01:22:50 And so just fingers crossed, good energy vibes for festival circuits. Who are you directing it with? This girl, Elena Gabby. She's awesome. What's it called? I am human. Wow.
Starting point is 01:23:02 Is there any, can you, are there any like slight hints? You can give us about what you're finding out. Oh, I want to so badly, but I have not yet released anything about the film publicly yet. So, but we explore the future of Man and Machine through the lens of the first patients in the world
Starting point is 01:23:22 to be fused with machine. I'm trying to think of the most delicate way I can say. Transhumanism. Sort of. I mean, I think we, gosh, I'm being so careful on what I'm saying. Just because I haven't released anything about it. Yeah, sorry, I don't want to wait.
Starting point is 01:23:44 No, it's okay. I'm so excited about it and I can't wait to get it done. But we're, you know, we've been working with medical institutions and really fancy people. So, I just want to. How about this, how about this without leaking anything? Have you seen anything that made you?
Starting point is 01:23:59 Blew your mind? Yeah. Yeah, I mean, the future is, I think so exciting and full of possibility. And my hope is that as storytellers, particularly to females at the helm of this, not that there's anything that the man couldn't do,
Starting point is 01:24:23 but just like, I hope that we can tell a different story than the one that people are used to. So, you're sitting here talking about fear mongering. This is the opposite of that. I think we paint a very bright picture of something that has typically been showcased in all the dystopian films and shows that you've seen. And with real humans.
Starting point is 01:24:47 So, I'm very excited about repacking, repackaging that story. Beautiful. Yeah. Taryn, thank you so much for coming over. Thank you so much. Thank you for this conversation. Appreciate it.
Starting point is 01:24:59 How can people find you? Just, you can follow me on Twitter, Taryn Southern, YouTube, Taryn Southern, Instagram, Taryn Southern, all the social medias. I don't post a lot, but you know. You're there. I'm not as epic as you. Your Twitter is the best.
Starting point is 01:25:12 Oh, God bless you. So fun to follow you. Thank you so much, Taryn. Thank you. That's a great compliment. Thank you so much for this wonderful conversation. All the links you need to find, Taryn will be at dunkintrustle.com.
Starting point is 01:25:24 Howdy, Krishna, thank you. Pleasure. That was Taryn Southern, everybody. All the links you need to find, Taryn Southern, will be at dunkintrustle.com. Much thanks to Squarespace for sponsoring this episode of the DTFH. And above all, much thanks to you
Starting point is 01:25:38 for listening to the DTFH. Don't forget, if you love us, subscribe over at patreon.com, forward slash DTFH. And now a song my grandfather used to sing to me whenever we said goodbye. Now we must part, until then remember that I had nothing to do with fires,
Starting point is 01:26:00 nothing to do with a puncturing of tires of the mayor's car. No, no, no, no, no, no, no, I wasn't me. E, e, e, e. When life gets crazy and when doesn't it, shop right helps you keep it all together. Now with a little extra help from Instacart. If you need your groceries now-ish, but your options for going to shop right are later-ish
Starting point is 01:26:29 or never-ish, you can get everything you need delivered through Instacart right to your door in as fast as an hour. Skip the shop and savor more of your crazy, busy life with shop right and Instacart. Visit instacart.com to get free delivery on your first order. Offer valid for a limited time,
Starting point is 01:26:45 minimum order, $10 additional terms apply. We are family. A good time starts with a great wardrobe. Next stop, JCPenney. Family get-togethers to fancy occasions, wedding season two. We do it all in style. Dresses, suiting, and plenty of color to play with. Get fixed up with brands like Liz Claiborne,
Starting point is 01:27:04 Worthington, Stafford, and Jay Farrar. Oh, and thereabouts for kids. Super cute and extra affordable. Check out the latest in-store, and we're never short on options at JCP.com. All dressed up, everywhere to go. JCPenney.
