Big Technology Podcast - Communal Living, Sex, And Silicon Valley's Groupthink Problem — With Ellen Huet

Episode Date: November 26, 2025

Ellen Huet is a features writer at Bloomberg and the author of Empire of Orgasm: Sex, Power, and the Downfall of a Wellness Cult. Ellen joins Big Technology to discuss how Silicon Valley, a place that... prides itself on independent thinking, keeps falling into powerful forms of groupthink. Tune in to hear how group houses, self-help programs, and “high agency” ideology create fertile ground for cult dynamics, and how that same psychology shows up in today’s AGI and AI-safety worlds. Hit play for a wild, revealing look at the stories and belief systems quietly shaping the tech industry’s biggest bets. --- Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice. Questions? Feedback? Write to: bigtechnologypodcast@gmail.com Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 How does Silicon Valley, a place so known for out-of-the-box and independent ideas, fall so often into groupthink? We'll get into it right after this. Capital One's tech team isn't just talking about multi-agentic AI. They already deployed one. It's called Chat Concierge, and it's simplifying car shopping. Using self-reflection and layered reasoning with live API checks, it doesn't just help buyers find a car they love. It helps schedule a test drive, get pre-approved for financing, and estimate trade-in value. Advanced, intuitive, and deployed. That's how they stack.
Starting point is 00:00:37 That's technology at Capital One. The truth is, AI security is identity security. An AI agent isn't just a piece of code. It's a first-class citizen in your digital ecosystem, and it needs to be treated like one. That's why Okta is taking the lead to secure these AI agents. The key to unlocking this new layer of protection is an identity security fabric. Organizations need a unified, comprehensive approach that protects every identity, human or machine, with consistent policies and oversight. Don't wait for a security incident to realize your AI agents are a massive blind spot.
Starting point is 00:01:09 Learn how ACTA's identity security fabric can help you secure the next generation of identities, including your AI agents. Visit ACTA.com. That's OKTA.com. Welcome to Big Technology Podcast, a show for cool-headed and nuanced conversation of the tech world and beyond. Boy, do we have a show for you today. We're joined today by Ellen Hewitt. She is a features writer at Bloomberg News and the author of a new book. And here it is. It's called Empire of Orgasm, Sex, Power, and the Downfall of a Wellness Cult.
Starting point is 00:01:41 Wild title, definitely a parental guidance episode. So if you have kids, I would definitely recommend to reverse to our more recent episodes. And if you don't, I think you're really going to enjoy this one. Because we are going to get into how Silicon Valley so often falls into Group Think, whether it is, you know, sort of the off-campus activities they do, or even when it comes to things like this AI bubble and the pet, the pursuit of AGI. And Ellen is the perfect person to speak with us about this.
Starting point is 00:02:10 So, Ellen, great to see you. Thanks for having me. Welcome to the show. Thanks for being here. All right. So you wrote this book. We're going to get into it in a moment. You managed to say the title without laughing. I did.
Starting point is 00:02:22 Let me, I've been reading. Yeah. I've been reading the book. Show's the cover. Well, here's... I should have brought my own copy. Here's what I did, because I've been reading it at my parents' house. And I just didn't want them to see what the title was. But this is the book.
Starting point is 00:02:41 I just didn't want to have that conversation, but we'll have it here. I've been trying to get my friends, now that the book is out, which, by the way, the cover is tasteful. It's like... It's a nice cover. It's just the words on it are the ones that... I like to say it's a book for your sexy intellectual friends, because although it's about sex, it's really like an investigative account, you know, and very intellectual. It is hilarious to me that you rip the cover off because I've been trying to get my friends
Starting point is 00:03:01 to take photos of them, like reading it on the subway and send it to me so I can, you know. So for any listeners who read the book, take a picture of you're reading it in public and, you know, I just think it's a great cover. It won't be me because we have no cover on mine. Fair enough. But, all right, and that is like a good intro into what we're going to talk about because Silicon Valley is sort of like a petri dish for a place, for movements like the one we're going to talk about. And I think it really begins with the group houses.
Starting point is 00:03:31 And this is something that's often mentioned, like tangentially in stories about artificial intelligence or Silicon Valley, that the founders lived in a group house with many other founders. And the stories never really go too deep into what those are, maybe because the writer doesn't want to or maybe because it is too complex of a topic in a sort of do more than a drive-by and a story about something else. But they are core to Silicon Valley's culture. Yeah. So tell us a little bit about what these group houses are within Silicon Valley. So I'll start by sharing a little bit of like my own personal perspective on this, which is for many
Starting point is 00:04:10 years, I lived in a, you know, communal, you would, I would call it either communal living or like a co-living house in San Francisco. It's something that I have a lot of familiarity with like a certain part of that scene, which tends to be less professionally focused. It's more about, like, we live together, but it's not because, like, we're all founders or because we're all working on AI. But in a, you know, that's very adjacent to a, you know, lightweight network of group houses in San Francisco, Berkeley, other places in the Bay Area, where they are a little bit more organized around what you do for a living.
Starting point is 00:04:50 So it's like, do you work in AI? Are you like an AI researcher? Are you a founder? There are also slightly, you know, if I were drawing a constellation, these would all be like clusters near each other. But it's like there's also, for example, HF0, which is a startup. They like to call it like a monastery for hackers. Is there their name?
Starting point is 00:05:15 And that's more of like a temporary co-living situation where founders come together and live together for 12 weeks and focus really hard. A lot of them work on AI projects, and it's more like a live-in startup incubator that I wrote a future about last year. There's like AGI House. There's actually been like a debate over there's two different houses that like to lay claim to the phrase. Okay. But they are professional houses. Yes.
Starting point is 00:05:41 But I think to me, the root of a lot of what we're seeing in AI today comes like the AI debates, the effective altruists, the rationalists. And then, of course, there are some that are just driven. even by pure profit. But it doesn't start right with the professional side of things. It starts with an ideology. And you've seen companies like Anthropic come out of houses like these. So it's to me, and you tell me if I'm wrong here, ideology first. You have a set of beliefs about the world. And then oftentimes people from these houses will go and start companies together. Totally. I mean, the bond of living with someone in a communal living situation like that can really create deep connection and trust. And I can imagine it feeling like, oh, we've already
Starting point is 00:06:26 lived together. We've already, you know, overcome certain hurdles. We know how we show up in this house. Like, I feel much more comfortable like building a company with you. Or you're right, that ideology will often bring people to want to live together and then they might like spin out a company from that. Right. Like take effective altruism. For example, the idea is in some of their, it's earned to give or in some of their ideology it's let's go start a really successful company to then give way the profits or influence the rest of the industry of the industry to sort of take on our values. Yeah. Well, I think you're right that ideology is what underpins these. Like Silicon Valley has always been a place where ideology feels like this very motivating factor.
Starting point is 00:07:11 I think often it is accurately so, but sometimes it might end up being like a common. cover for other motivations. But it is at least- Everybody does want to get rich. Like, let's put that on the table. That is true. And some people are more or less honest about it, or upfront about it. But the truth is, ideology matters a lot, too. Like, it's extremely important for founders to feel like what they're working on matters and is going to, like, have impact on the world. And I think having an ideology that drives you, it almost feels like that is what's going to bring people together. And that's going to, like, lead them forward into, like, wanting to make these really
Starting point is 00:07:45 ambitious companies. Right. So I would say not every Silicon Valley company starts in a house like this. Yeah. But for the broader picture of Silicon Valley, these houses are much more influential than I think we read about it. Totally. And I think the reason that journalists sometimes struggle to write about them and I've encountered the same is that it's just not that easily definable of a category. And so, yeah, that it can be, it can be easier to just mention it without needing to necessarily feel like you're writing the entire comprehensive thing about how these houses work, because it's hard to track every single one. Right. So that's step one, I think. We have to, like, sort of lay the foundation that that's a core part of what we see in the tech world.
Starting point is 00:08:27 Now, let's go to part two, which is who are the type of people that gravitate to Silicon Valley and fill these houses? I know I've spent a lot of time in San Francisco more than six years living there. You've lived there for a long time. Yeah, probably 12, 13. It attracts a certain type of person. Often somebody looking for adventure or didn't quite fit in at home, I think, or someone with a tremendous amount of ambition. And they go to Silicon Valley and they often try to sort out their underlying issues through work. And then often you see people who are either part of these houses or, you know, in their orbit, they attend certain programs. programs, right? Programs like Landmark, for instance, or programs like Hoffman, which you have also
Starting point is 00:09:19 spent some time reporting on. So talk a little bit about, you know, what these, their self-actualization programs are basically programs built for the type of people that gravitate to Silicon Valley. Am I right? Yeah. I mean, I want to, you know, be careful to caveat with like, you know, not everyone falls into all these categories, but I think if we're speaking broadly, like, you're right. People who gravitate to Silicon Valley, they're looking to make a difference. They tend to be very ambitious. They want to do things, you know, from first principles and or slightly differently than maybe how they perceive others in the past have done it. And I think that type of person is drawn to often these programs that I would call personal development or personal transformation.
Starting point is 00:10:02 There's kind of a whole industry of them. And some of them you might recognize, you know, Landmark actually has a decades-long history associated with this like predecessor group called Est. Tony Robbins is kind of like a classic example of these, but some of the ones that I know are popular among tech types right now include the Hoffman process, which is, yeah, something that, like, I am curious about and, like, definitely want to, like, potentially report more on, but that's, like, a week-long, intensive therapy-ish retreat that kind of helps you process some of the stories and narratives in your life. Oftentimes, these programs will help you, like, reassess the stories in your life with the idea that it can help you unlock like a new level of performance. I know a lot of people who have done like conscious leadership group, which again, these all might be like constellations on a map. Like they're not all necessarily exactly the same. And some of them are more popular with Silicon Valley people than others. But they are all getting at the same thing, which is like, oh, if you can immerse yourself
Starting point is 00:11:05 in this intensive experience, you can learn more about yourself and maybe be a more effective leader or more effective professional. Right. There's also this core, I think I read about this in your book, this core ideology to it. I think this is a quote directly from you. It's about making people believe that they have agency over everything, right? So, and also like, I mean, Silicon Valley loves agency right now. Right. Yeah. Oh, well, AI agents. Oh, yeah, no, no, no, not even that. like, yes, AI agents, but the concept of agency as a personal quality is really, for lack of a better word, popular among rationalists and rationalist adjacent people right now. There's even a couple people who are writing a book about agency. The idea of you can just do things,
Starting point is 00:11:55 have you heard that phrase? Yeah. I mean, I think Sam Altman has said that. Oh, certainly. I mean, this is all coming from the same soup. So the idea that you can just, you know, you can just do things as kind of a rallying cry that tries to get people to tap into their agency. And so, but yeah, agency is kind of seen, you know, being like an agentic person, having high agency. Like these are all words that, at least to me, read a little bit coded of like, they're a little coded for like rationalists or rationalist adjacent people, people who might work in AI, this idea that like the world is a place that you can have massive. impact on and that you yourself are like a highly powerful actor in the world rather than someone who is you know responding to the environment around you you can like affect your environment very deeply I mean what is it's interesting that that has to be said out loud like what would be
Starting point is 00:12:57 the alternative be that you sort of go with the flow and yeah or that you feel like you're kind of like a victim of your circumstances um what's interesting interesting to me about agency being popular, in my view, among a certain, like, circle of tech people is that agency is a different way of saying this idea of like, oh, you should consider yourself, like, radically responsible for your life experience and the things that are happening in your life. And that's very much an idea that comes from these personal transformation workshops and lineages. So that is something that you would find at Landmark. That's something you'd find at Tony Robbins for sure. That's something that comes up a lot at One Taste. in conscious leadership group and also this. And so it's sort of been reframed as agency, but it's all getting at the same thing. And I would say the opposite is, yeah, someone, you know, you can imagine someone who laments the circumstances of their life
Starting point is 00:13:49 without thinking about how they might change it, you know? And they're like, oh, I was just born, you know, without the ability to, like, charm people. It's like, well, guess what? If you were more agentic, you would think of yourself and your personality as more malleable. you would think that you could learn skills that you might otherwise like dismiss. So it's people, it's drawing a distinction between people who kind of are like throw up their hands and say like, wo is me, these are my circumstances, it's not possible to change.
Starting point is 00:14:16 And other people who would take a more quote unquote agentic approach, they would or high agency, they would say like, okay, well actually, I'm going to like think strategically about how I'm going to change this about myself. I want to be, you know, it's like maybe it's like you want to find, you're single and you want to find a partner. It's like, well, you could lament that the dating market in San Francisco has a gender ratio that is unfair to you and you could complain about it. Or you could construct like a strategic plan that exposes you to the kinds of people that you want to meet in settings where you're likely to converse with them and dedicate, you know, 15 hours a week to this project and maybe you would be more successful. And at least you would have taken a like high agency stance on this. I mean, I've definitely met people in Silicon. Val, we think that like that.
Starting point is 00:15:05 Yeah, for sure. And I think that kind of, like, what's the word I want to say? It's like kind of ultra-strategic or maybe like really laying out exactly the plan for doing something like that. It does read a little bit of the personality type that you might imagine in certain circles in San Francisco. Right. And so this, so it's so, okay, going back to our big picture, people come in, many live in these houses. as many participate in these programs, they believe they have agency. It's a place that's sort of, maybe that's part of the secret sauce of Silicon Valley because
Starting point is 00:15:42 go ahead and try to go meet your partner or if you really think that you can change things or you can just go do things, start a startup, join a startup. It's like the hotbed for place, for activity like that. Yeah. And it's not even like they're necessarily wrong. Like in some sense, obviously. I'm not arguing with it. Yeah, yeah.
Starting point is 00:15:58 Building a company can be a way that you very quickly have like, yes, significant effect on the world around you if you're lucky and if you're smart and so it is kind of a high agency place i'm not at all surprised that this is like a pervasive belief in that culture okay and so uh this is our we're going to start getting into oom so if you're again if you have kids listening this would be a good time to hit pause um and so but into this culture uh comes orgasmic meditation yeah what is that i'll give you the basic you know the basic explanation so we can on to discussing it. The book that I wrote, Empire of Orgasm, is the story of a company called One Taste, which was started in San Francisco in 2004 by a woman named Nicole DeDone and a co-founder,
Starting point is 00:16:48 but she's really like the leader, the visionary, like the creator of this company. And One Taste, the company sells, the way that it made money was it sold courses on orgasmic meditation. And orgasmic meditation is a practice in which a stroker, usually a man, puts on a glove and some lube on his left index finger and strokes the clitoris of a woman in a very prescribed manner for 15 minutes exactly. And the only goal of this practice is for both parties to meditate on the sensations in their body. And the arc of this company is basically that it grew pretty big in San Francisco and beyond. It was in New York, L.A., San Diego, Austin, London, Australia.
Starting point is 00:17:37 It grew pretty big in the 2010s. It was, like, endorsed by Tim Ferriss, Gwyneth Paltrow, Chloe Kardashian, like, kind of got pretty big mainstream success for an obviously fringe practice. And then in the 2020s, it was the leaders of this group, you know, were basically, there were a lot of cult allegations that emerged about life. inside the company and then they were indicted by federal prosecutors and then charged with the crime and were convicted this summer in a jury trial in New York. And so the leaders are currently in jail awaiting sentencing. That's kind of like the whole arc. But I think what's interesting and relevant to this conversation is the way that this company, you know, reflects some of that, like, it's like a different take on some of that same ambition and ideology and
Starting point is 00:18:34 like just the way that Silicon Valley and San Francisco can be this, yeah, petri dish for growing like ideas that maybe start off in a good place but then can quickly like warp into something overpowering or something that can like warp your thinking away from like rational thought into like stranger corners. Right. And I mean, the root of this was at 7th and Folsom in San Francisco, which is like around the corner from where I worked when I was there. Yeah. And eventually actually, you know, yeah, they for many years were based on one block on Folsom Street, but eventually moved their headquarters to Market Street right across from Uber, right across from Twitter. They were like, if you know, 10th and market, it was basically like around there. And
Starting point is 00:19:24 And, you know, they, you know, it was like the Nicole Daydone, the founder of this company, like, she kind of did some of the things that, you know, a tech CEO in the early 2010s might have done as well. She spoke at South by Southwest. She gave a TEDx talk. They held conferences at like the Regency Center in San Francisco where everyone had their little lanyard. Like, it really reminded me of the earlyish days of big tech developers conferences like Google I.O., Facebook F8. The idea that you would come and spend a few days like, immersing yourself in the world of this startup or this tech company in like a corporate conference they even had two of those in 2013 and 2014 for orgasmic meditation yeah that's okay so this the similar there are those are the similarities yeah yeah the difference is that people yeah come and practice totally so go ahead there's also a side of this that is yeah pretty different from like what you think of as a Silicon Valley company I mean at I owe and F8 people don't have their pants off no at least
Starting point is 00:20:24 by design. Yes, that is fair. I mean, the other thing, you know, in some sense it was like modeled, one taste was modeled like a startup, but in many senses it was also an expression of the rapidly growing wellness industry. So remember, like Goop started in 2008, this company was really taking off like kind of in the late 2000s, early 2010s and throughout the 2010s. And so it was more of like, they also had kind of the like more woo-woo, like wellness angle.
Starting point is 00:20:52 And then, of course, it was a sexual practice. I mean, they called it a spiritual practice, but it's obviously sexual. This is like genital touch. And so the pitch of orgasmic meditation to the public was essentially that if you did this practice regularly, if you did this practice regularly in the same way that you might do sitting meditation, you know, that calling it a practice is like, and calling it orgasmic meditation was very intentional. to kind of put it in the, like, wellness and mindfulness boom that we were seeing in the valley. But basically, if you did this orgasmic meditation practice every day, the pitch was that you would
Starting point is 00:21:32 have better sex, better relationships, but feel more intimacy and connection in your life, have a better connection to your vitality, your desire, your intuition, all sorts of things that truthfully people are often looking for. Like the practice appealed to people who say they were like in a long relationship and they were losing their romantic spark. Maybe they were one of the 10 to 15% of American women who struggled to have an orgasm. Maybe they were like someone who struggled with performance anxiety during sex or just anyone who felt like they had had a complicated sexual history or trauma in their past that they wanted to like use this practice to really address it, which is something that I think people are drawn to and often don't have that many places to talk. about. So that was another way that like one taste brought people in as they were selling something that is very vulnerable thing that a lot of people yearn for and and are often looking
Starting point is 00:22:28 for answers. And that's very common in the wellness industry. Like it it will offer a promise that really pulls at some pain point in your life. Right. So I don't want to get too deep into it, but this is not like goop, right? Goop, you maybe get a moisturizer and put it on your face. Yeah. This is, this, to do this, it sort of, it happens in public, right? So you're most, most of the time. They will sometimes do public demonstrations. Right. And then if you were a serious practitioner, you might gather for group orgasmic meditation sessions, which in some senses, yes, you're doing it in front of other people, but it's not like into, to the public. Not in the middle of market street, but it's still. Certainly not.
Starting point is 00:23:14 So as I'm reading it, I'm just like, how does somebody go from learning about this to saying, you know, yes, I'm going to take my pants off and, you know, have this orgasmic meditation experience in front of others or be the one that's, that's, you know, doing the stroking. Like it's, yeah, to me it was like, how do you get from point A to point B? I'll walk you through like a typical person's experience. And again, I, you know, I interviewed tons of people who were involved in this company. and this is like, you know, an amalgamation of like some of their experiences. So a typical person, let's say you're like a woman in your late 20s, maybe it's hard for you to have an orgasm and this has been okay in your life, but you've been like interested in trying to like, I don't know, figure out, is there a way that I can address this or like understand my sexuality better?
Starting point is 00:24:05 And you might, maybe you're taking like yoga teacher training courses and then someone at yoga teacher training mentions like, oh, if you're interested in exploring your sexuality, like try orgasmic meditation. So you're like, oh, orgasmic meditation, what's that? And you attend one of the like public events that this group tended to host once or twice a week. They had an interesting name for it, right? Yeah, they called them turn-ons at some points or also in-group, which was sort of this like winking name, referring to sort of like in-group out-group dynamics in some of these groups. And the people that came in, didn't they call them something like? Oh, Marx. Yes. Sometimes they would, jokingly, but also it's obviously,
Starting point is 00:24:43 somewhat serious, they would refer to these potential customers as Marx, which is a suggestion of, you know, one of the allegations leveled at this company by many of its former members is that its sales practices were very predatory. So, again, the way that one taste made money was by selling courses. And it wasn't just courses on orgasmic meditation. If you got deeper, it would be courses about, like, how to live your life in alignment with the philosophies of orgasmic meditation and these courses could cost upwards of $20,000 or $30,000 and they might be like two-week intensives or this kind of thing and in that way again those group those transformational in a way those transformational packages and courses are similar to like what you might find at
Starting point is 00:25:30 like you know intensives at like Tony Robbins or landmark or that kind of thing so again follow that path yeah so the typical person like comes in they go to these intro evenings where again, everyone's close day on. And you just play communication games where people like talk openly and vulnerably about their feelings. And maybe there's like a little bit of suggestive or sexual like undertones, but it's not like a sexual experience. But then like the people who work at one taste are so friendly and they like come up to you and make really strong eye contact. They invite you to come back again. And maybe you're like, oh, these people seem cool. Maybe we can be friends. And then you come back and maybe they invite you to come help with a weekend course or come take a
Starting point is 00:26:12 weekend course and that's where you might learn to the principles of doing orgasmic meditation and you might see a demo from like two of the more advanced students who like do a demo to the class and then you know for many people from that point on some people just take a few courses at one taste and they're like great I got what I needed I'm going to leave and for other people they're like no actually I really want to go deeper I want to understand myself in this way I want to like experience the kind of personal transformation that I see the other staff members appear to be like displaying. And for those people on that path, what they typically ended up doing would be moving out of their previous home and moving into a communal residence with other
Starting point is 00:26:52 one-taste people. So again, it's kind of like group house culture, leaving their previous job and starting to work for One-Taste, usually on the sales team, because again, that was the main way that the company made money. And gradually distancing themselves from their previous relationships or friends and family who no longer really can understand this like new orgasmic lifestyle that you are living that tends to just happen, you know, on its own. And so people who then get really immersed in this world, you know, one taste kind of has like it's public face, which is like we sell courses on orgasmic meditation. But for people who move into that deeper, that inner circle, the experience does change and it becomes more intense and it becomes
Starting point is 00:27:32 like, well, this company and this community is all of a sudden your whole life. It's where all you your friends are. It's your coworkers. It's your employment. It's your home. It's your spiritual community. The deeper you get, the more grandiose some of the claims about the power of orgasmic meditation become. They teach it as like a way to access capital O orgasm, which is redefined to no longer mean like the moment of climax, but rather a sort of broad erotic energy similar to the force from Star Wars. This is the actual comparison people have made to me. This idea that you can tap into capital o orgasm to like fuel your life or guide your intuition. So it gradually becomes more spiritual. And then, you know, within that group, there are like serious allegations
Starting point is 00:28:22 that the teaching philosophies, including this idea of high agency and like the, you know, the sort of radical responsibility for your life, that some of those philosophies are then, the allegation is that those philosophies are used to manipulate people, to pressure them into sex that they didn't want to have, to pressure them into having sex with or doing the own practice with an investor of the company, customers who might be willing to sign up for more courses if they have more access to sex. The, you know, people told me that that the teaching philosophies of one taste were used to, yeah, like kind of exploit people financially, sexually, all this stuff. So But that's, that's kind of what leads to all these criminal charges and the criminal conviction
Starting point is 00:29:08 in that. But we can, you know, we can get into it more if you want. Yeah, I actually have one more question about the practice itself. So in your book, and you talked about it here just now, you said this practice can help you focus on work, calm your mind, and unleash your potential. I don't, I don't fully understand how that, how you get, get there. Well, Or how they get. Well, basically, in some sense, again, the main practice is you're doing this partnered genital stroking practice. But while that's happening, you know, there's no goal.
Starting point is 00:29:50 The goal is not to, like, reach climax. It is to... You're actually not supposed to, right? Yeah. It depends. If you become a more advanced practitioner, it is a little bit looked down on. But I think it depends on how serious a practitioner you are. But yes.
Starting point is 00:30:04 These are details I'll never get into, but okay. So the idea is that you spend those 15 minutes meditating on the sensations in your body. And although it's an unusual setup... for anyone, you know, I've done a fair amount of sitting meditation, and obviously a lot of people in the Valley do this too. For anyone who has spent regular time meditating, that practice of noticing what's happening in your body, like, I do think there's a lot of value to it. It is something that can help you navigate outside of, like, your autopilot brain. It's something that can help you notice your knee-jerk reactions and have, like, a wiser response. So even though the orgasmic meditation is unusual...
Starting point is 00:30:54 Yeah, why does this have to be done through orgasms? Well, I mean, that's kind of their pitch: that this is not just the benefits of meditation, but also the benefits of, like, connection and intimacy with another person, that you're experiencing this sense of meditation while having this very intimate experience with another human being. So that, you know, that's their own pitch. And, like, any sort of wellness group is going to argue that their solution is going to, like, have holistic benefits, you know, beyond just the thing.
Starting point is 00:31:27 But I do think, like, when they argue it's going to help you, you know, improve your focus, like, this is legitimately what people said when they had done a lot of orgasmic meditation. Whether it was all groupthink or not, we can get into. But basically, like, yeah, part of the idea was that doing this practice would help sharpen your mind. And in the 2010s, you know, remember, we had, like, the Wisdom 2.0 conference. People were doing, like, mindfulness workshops at Google. Like, there was this idea that mindfulness in the workplace was going to improve productivity and improve focus and, like, corporate performance. I remember chewing on a raisin and just, like, thinking about all the sensations I was feeling as I did that.
Starting point is 00:32:12 I mean, call me a little woo. I like that stuff. There's something to... there is something to it. Of course there is. It's extremely Lindy, as some people in certain pockets of Silicon Valley would say. Meditation's been around for a long time. Okay. So then, just to stick to the details of the court case, where do things go wrong? Well, basically, the company was doing fairly well and had a lot of mainstream success. Again, Nicole, the founder, was speaking on stage at the Goop health conference in 2017. They had these, like, endorsements.
Starting point is 00:32:43 They were making money. And in 2018, I wrote this big investigation for Bloomberg Businessweek, where it was the first time that people in the company or former members of the group talked to me at length about these allegations that the group was a cult. They basically said that they'd been exploited financially by being pressured to take on debt in order to buy more of these expensive courses. They said that they had been exploited sexually by being pressured, and sort of taught these lessons that, you know, pressured them to have sex that they didn't want to have in order to, like, further the company's business. And so I wrote this big story, and the company kind of went into hibernation in response to that. And around
Starting point is 00:33:28 the same time, and in all likelihood spurred by the story, the FBI started investigating. And then that led to many years of, like, the FBI looking into whether a crime had happened here. And then in 2023, federal prosecutors charged Nicole, the founder, and Rachel Cherwitz, who was kind of her, like, second in command, the woman who had been head of sales for a long time, charged both of them with forced labor conspiracy, which is, like, I won't get into the details, but it's a specific federal crime that suggests that you conspired to obtain labor unlawfully from people, either through things like threats of serious harm or serious harm. And then there were, like, you know, two years of pretrial motions, which is pretty common.
Starting point is 00:34:12 And then it finally led to a criminal trial this summer in Brooklyn, which I came out and covered in person. And that was, like, five or six weeks. And at the end of that trial, you know, a jury unanimously decided to convict Nicole and Rachel of this crime. And so the two of them are in a jail in Brooklyn, awaiting sentencing, which could come later this year or early next, and they could face up to 20 years in federal prison. Okay, so now here's, like, the core question, right? Silicon Valley, of course, like we've talked about, has this philosophy of being a place of agency, of, you know, you go out and change your circumstances, and, you know, you don't let stuff be dictated to you. You could
Starting point is 00:34:55 just do things. How does a place like that fall into a situation like this, or how do the people who believe that fall into it? Well, it's interesting, because the idea of agency showed up in OneTaste in this particular form, which was that Nicole and other leaders of the group would share this philosophy that, basically, yeah, you were 100% responsible for your experience, your life. And the flip side of that is they would also say that to have a victim mentality was kind of, like, looked down upon. It was kind of a critique that you could give of someone: like, oh, you're having such a victim mentality about that, that's really small-minded of you. And what's interesting is that that philosophy of having radical
Starting point is 00:35:54 responsibility for your life is often, for many people, quite helpful. It's going to help you take charge of your life. I think there's a lot of benefit to that belief. And at the same time, in certain circumstances within OneTaste, many people told me that they thought that that was taken too far, right? And so in OneTaste, as in life more broadly, you can have a good idea that is good for a long time. And then if you take it to an extreme, it starts to become harmful. And so in many cases, there were people at OneTaste who were so taught this idea that you could never be a victim, and that to see yourself as a victim was so shameful, that when they experienced what they would now call serious exploitation, they had a hard time even calling
Starting point is 00:36:46 it that. Like, they would have a hard time recognizing that maybe they had been hurt. And so I think that the idea of radical agency or radical responsibility was used as a cover, you know, to make people think, like, oh, well, if I'm having a hard time, it must be, like, my fault, that I couldn't possibly blame someone else. I mean, it sort of sounds like... people, you know, I've lived in it. I mean, I'm using Silicon Valley as a broader term for San Francisco and the Valley itself. But when you're out there, it seems like a lot of people just want to believe in something, right? You're there to do something. You want to believe in something. And so once you get going and you believe you have agency, it's tough to stop. Totally. And also, once you've dedicated a lot of your life toward a belief, to change your mind and say, actually, that belief might have been misguided or it shouldn't have been applied in the circumstance, there is a lot of cognitive dissonance or sunk cost
Starting point is 00:37:48 that comes up in a situation like that, where, like, there were lots of people at OneTaste who had tough experiences, and they had a moment where they could have thought to themselves, actually, fuck this place. Like, I don't want... you know, this place is hurting me. But in order to say that, they would have had to grapple with the cognitive dissonance of, like, well, I also had invested five or six years of my life into this group. It's a really hard thing to admit to yourself. And also sunk cost. It's like, I've already dedicated so much time, money, energy, social connection to this group. Like, to leave it feels very
Starting point is 00:38:13 And also sunk cost. It's like, I've already dedicated so much time of time, money, energy, social connection in this group. Like, to leave it feels very. costly. It feels very painful. And so you're totally right that, like, ideology has this stickiness to it, which is the more that you orient your life around a belief, if it becomes part of your public or professional persona, if you know, if you are like, yeah, the startup founder who is fighting for, you know, X, Y, Z, the more you put yourself in that direction,
Starting point is 00:38:46 the costlier it is for you to change course. Right. And I'm just, like, jumping out of my seat here, because... well, first, caveats, because it's important to caveat here: not everybody in OneTaste or everyone who practices orgasmic meditation is in tech. Like, there are people in San Francisco who don't work in tech, believe it or not. But there are so many parallels to this pursuit of AGI, where, like, you want to believe in something. And for the record, and I've said this on the show a lot of times, AI is a real technology. Totally. But there is this sort of maybe parallel belief. Much of the tech industry has put a lot of money and effort and faith into this idea that within a couple years, the current AI technology will turn into artificial general intelligence or superintelligence.
Starting point is 00:39:40 And it's tough to say... you know, it's very tough for people involved in that to say, hmm, maybe not. Or, oops, maybe we were... Yeah. Yeah. And I wonder if there's the chance of a similar groupthink situation in that scenario. Well, I totally agree that there are strong psychological parallels here. And one of the places that I saw that come up in my own reporting was, like, when I was writing about the intersection between, like, AI safety, the rationalists, and effective altruism
Starting point is 00:40:16 in this sense that, you know, you might call it, like, doomerism, which I know has gone up and down in its popularity over the last couple years. But if you think about what AI doomerism felt like maybe a year or two ago, it is really obvious to me that there are some parallels. And in fact, like, when I wrote a big story about this a couple years ago, and when I spoke to some people who had spent time really deep in the, like, doomer mindset, they described it to me in their own words as a cult. They said, like, oh, it has a lot of parallels to a doomsday cult. And think about it this way.
Starting point is 00:40:52 It's like an ideology that tells you, in this case, that humanity as we know it is going to either be destroyed or, you know, significantly crushed within five to 10 years. And, like, our life as we think about it, the world as we know it, is going to change dramatically. And one of the hallmarks of a cult is, like, an overarching ideology that, once you've adopted it, tends to affect, like, many parts of your life rather than just one. And then this kind of natural isolation that ends up happening between you and your previous relationships, in part because of, like, adherence to this ideology or because of the social culture of the group. And so people described to
Starting point is 00:41:35 me, you know, if you really, truly, in your heart of hearts bought into the idea that AGI was going to, like, lead to the destruction of humanity in a short time frame, not only is this a belief system that affects every part of your life, like, this affected where they thought they should spend their time and energy, their professional work, who they should spend time with, what they should spend their time emotionally caring about, choices about, like, whether they should think about the future, raising a family, saving for retirement. They also described to me this natural sense of, well, I just couldn't talk to my old friends anymore who didn't worry about AI safety, because what kind of conversation are you going to have with someone who
Starting point is 00:42:16 doesn't agree that the world is going to end soon? And so it did create this, like, sense of isolation and insularity. And yeah, for people who are, like, in that world, it tends to just be, like, oh yeah, you would go and, like, work at MIRI, or you'd, like, work in AI safety. It was just, it was impressive to me how much of an overwhelming viewpoint it could be. Like, once you believe this, many, many things in your life change. And it's kind of interesting you're going to the doomer side, which totally makes sense on that front. And, you know, can the same be said for those who are true believers in AGI? I mean, if you think about it, like, I mean, Silicon Valley, of course, is a place where they make big bets on things that are uncertain.
Starting point is 00:42:55 But this belief that AGI is going to be reached has led to many hundreds of billions of dollars. I mean, of course, AI will be a useful technology, even if AGI isn't reached. But the money is there because there is a belief that it's going to get there. So do you see that parallel on, sort of, the non-doomer but pro side? And I think, keeping in mind, another really, really important and sometimes underestimated aspect of a cult is that being part of a cult gives you access to a sense of awe and wonder. And one of the characteristics is also the sense that you have discovered, or have access to, special knowledge that not everyone else has realized yet, that you're kind of, like, an early understander of something really big, and that it gives you access to...
Starting point is 00:43:43 that you're kind of like an early understander of something really big and that it gives you access to... Sounds familiar. And that it gives you access to this sense of awe and wonder. So again, to draw these sort of odd parallels within one taste, the access of awe and wonder in the sense of special knowledge, like these people truly believed that orgasmic meditation was going to heal the world, that teaching people this connection practice was going to, you know, fix our, you know, fix loneliness, make people's lives more like vitalizing, like improve people's sex lives and like really just generally like uplift humanity, which is not out of character for the 2010s. remember we works whole like elevate the world's consciousness thing like this was all yeah you're a real estate company come on yeah this was all happening in the same suit but but and then the sense of awe and wonder was like they really believed that this was a spiritual practice that
Starting point is 00:44:32 gave them access to, like, transcendence. So, setting aside that that's obviously, like, an unusual example, you can see some of the parallels, even in the doomers, but yes, also in these sort of, like, AI true believers. This idea of, like, AGI... I mean, people speak about it with this sense of quasi-divine wonder. And that's true whether you think of it as this, like, Roko's Basilisk, this kind of thing that's going to, like, destroy all of us, or this thing that's going to, like, radically transform every part of life, or, like, replace humanity, or even, like, supersede us as a new species.
Starting point is 00:45:09 Like, that sense of awe and wonder is there. And then, of course, you laughed in recognition when I said it, that feeling, again a hallmark of a cult, this sense of, like, I am among the few to realize something really special. It gives you a sense of, like, mission and purpose, which is, again, another key quality, this feeling of, like, I'm working on something that really, really matters. Like, even when I was talking to some of these AI doomers, they used phrases like, I felt cosmically significant. You know, this feeling that the next 10 years could radically alter the trajectory of humanity, which, again, depending on which corners of Silicon Valley you're poking around in, could be quite a commonly held belief, infuses people with a sense of mission and purpose, which, in our core human hearts, is something that we all want.
Starting point is 00:46:08 Right. All right. I got to take a break, but I want to come back and speak to you a little bit more about this. We'll be back right after this. Capital One's tech team isn't just talking about multi-agentic AI. They already deployed one. It's called chat concierge, and it's simplifying car shopping. Using self-reflection and layered reasoning with live API checks, it doesn't just help buyers find a car they love. It helps schedule a test drive, get pre-approved for financing, and estimate trade-in value. Advanced, intuitive, and deployed. That's how they stack. That's technology at Capital One. This holiday season, train smarter, not longer. The Hydrow rowing machine
Starting point is 00:46:49 delivers the best results in just 20 minutes a day. It works 86% of your muscles in one seamless motion, twice as effective as running or cycling. Hydrow is your go-to for the ultimate full-body workout. How ultimate? It works 86% of your muscles, arms, legs, and core, and it's twice as efficient as cycling or running. Just 20 minutes is all it takes to feel the results. All workouts are led by Olympians and world-class athletes, filmed in breathtaking locations around the world. With the largest library of rowing workouts, Hydrow keeps motivation high all season long, and it shows: 90% of customers are still active a year later. Head over to Hydrow.com and use code big tech to save up to $600 off a Hydrow rower during
Starting point is 00:47:35 this holiday season. That's H-Y-D-R-O-W.com, code big tech, to save up to $600. Hydrow.com, code big tech. At Capital One, we're more than just a credit card company. We're people just like you who believe in the power of yes. Yes to new opportunities. Yes to second chances. Yes to a fresh start. That's why we've helped over 4 million Canadians get access to a credit card.
Starting point is 00:48:04 Because at Capital One, we say yes, so you don't have to hear another no. What will you do with your yes? Get the yes you've been waiting for at CapitalOne.ca/yes. Terms and conditions apply. And we're back here on Big Technology Podcast with Ellen Huet, features writer at Bloomberg News. I'll show you the spine if you're watching on video. Author of Empire of Orgasm. Sorry for doing that, Ellen. It was not personal. It's totally okay. But again, didn't want to... just a little juicy for home life. Yeah, for the parents. Understood. If they're listening... guys, I hope you turned it off a half hour ago. Anyway, book is Empire of Orgasm: Sex, Power,
Starting point is 00:48:43 and the Downfall of a Wellness Cult. So let's just go back to the main question I asked at the beginning of this episode. How does Silicon Valley exist both as a place that is a fountain of original thinking and also a place that's sort of susceptible to groupthink? I mean, I feel like people have had this discussion a lot. Like, often groupthink is, like, a critique of, you know, the investing world, or of, sort of, you know, trends and stuff in Silicon Valley. I think the truth is both of these things can exist at the same time. But the groupthink, I do think, is an outcropping of this ideology.
Starting point is 00:49:26 Like, of course, humans everywhere want to have an ideology and something to believe in. But I do think in Silicon Valley, it's heightened. And so this idea that you might be driven by an ideology or a belief is socially rewarded here in a way that then makes people, like, be more ideology-minded. And when you have a strong belief, it is tempting to then cluster with other people who might share that, or to feel like, oh, if I adopt this belief, I become part of a community. Like, I definitely don't want to underestimate how much the sense of wanting to belong to something is part of this. Like, humans want to belong. Humans feel like they want to have mission and purpose. They want to feel agency over their lives. And, like, adopting an ideology that is
Starting point is 00:50:19 shared with other people in your immediate network is a great way to feel shared mission and purpose and to feel like you belong. So, like, if you're an AI researcher and you're, like, unsure of where exactly you want to, like, throw your ideological weight, you might end up believing something that, you know, you were open to in the beginning, but it also happens to put you then in a community with people that you feel connected to, that you feel like you belong to. Like, there is a social, there's a human social aspect of groupthink. Right. So you've done a lot of reporting on OpenAI. What's your, like, assessment of where OpenAI stands in terms of, like, thinking through this? Like, is it kind of a groupthink situation?
Starting point is 00:51:05 Normal company? Do they have the goods? Is it just, you know, them whipping us up into a frenzy about this technology because they made, you know, this really good chatbot? Yeah, very powerful chatbot, which we'll get to. But I think, I mean, I think as these companies have gotten bigger, and I'm sure you have your own sense of this as well, so I'd be curious what you think. You know, OpenAI 10 years ago, or I guess nine years ago... OpenAI in its very early years was explicitly formed around an ideology, right? And it was meant
Starting point is 00:51:39 to be this nonprofit with a mission, and the mission was what was going to, like, differentiate it from other groups. And it did have this very strong sense of, like, well, we're going to be open, we're going to be not-for-profit, and that is going to shape the type of work we do. Onto something early. Yes, onto something early. They were, they were. And, like, what's interesting to me has been watching that ideological vision change over time in response to market forces or other... you know, I'd be curious what you think are some of the things that have shaped OpenAI. But I think, indisputably, obviously, it has drifted in a very different direction than what you might have... you know, to go back and read those stories about the founding of OpenAI from, like, nine
Starting point is 00:52:30 years ago, um, is to look at the company and be like, whoa, is this really how it started? Like, I think a lot of people who are not familiar with the history would be... Right, but on the other hand, they have, they've brought everybody along on this vision. What do you mean by everybody? Like, people at their company, or people outside of the company, too? I mean, I would say the general public. Oh, totally. I mean, I think, like, in some sense, it's like, well, they've obviously been very effective at transforming into, like, a tech giant. Yeah. And in the process of doing so, like, you also need to grow your workforce enormously.
Starting point is 00:53:04 And so my guess is that the, like, average person who joins OpenAI now is, like, thinking about it much more like, oh, what was it like to join Google in 2011, or something like that? Like, it's a very different thing. And then what's interesting is, like, it became different enough that obviously Anthropic, or, like, the original founders of Anthropic, decided, like, this isn't going in a direction we like, we're going to spin off and do something that started off, again, with the real glue and motivation being this ideology that we're going to do things a certain way. And I think, you know, I know a little less about, like, the up-to-the-minute nuanced debate about, like, where Anthropic is now compared to, like, their starting ideology. But if you zoom out, what's obviously important here is that the ideology is the powerful driving motivator at the beginning of these companies, right? And then it becomes a
Starting point is 00:54:00 question of, like, how does that ideology, like, morph and live on or evolve as companies get bigger and have to deal with bigger problems. Yeah, and you've been, like, I'd say, one of the foremost chroniclers of this notion that companies, or individuals in their belief in this big idea, can sometimes be blind to, you know, some of the simplest errors or simplest vulnerabilities that might take down the whole house of cards. Here is one of my favorite stories that I've read, the first couple of paragraphs that you wrote along with a co-author at Bloomberg. I wonder if you know where this is going. One of the most lavishly funded gadget startups in Silicon Valley last
Starting point is 00:54:48 year was Juicero. It makes a juice machine. The product was an unlikely pick for top technology investors, but they were drawn to the idea of an internet-connected device that transforms single-serving packets of chopped fruits and vegetables into a refreshing and healthy beverage. Doug Evans, the company's founder, would compare himself with Steve Jobs in his pursuit of juicing perfection. He declared that his juice press wields four tons of force, enough to lift two Teslas. Okay, but after the product hit the market, some investors were surprised to discover a much cheaper alternative.
Starting point is 00:55:24 You can squeeze the Juicero bags with your bare hands and get the same juice out. I mean, isn't that characteristic of this? Like, the guy's calling himself Steve Jobs. Believe in this bigger idea, you're early to something, which you have to. And by the way, it's been an ideology that's led to many successful companies. But on the other hand, there's always the other side of this. It's like, if you're so caught up in that belief, you just, you can miss things. Well, in some sense, what Doug Evans is doing here is paying homage to the power of, like, a really strong, motivating idea, right? Like, on some level, at the time that he was pitching Juicero, he recognized that to make this idea of the juice press feel exciting, like you've discovered, you know, some special secret, that you have a mission and a purpose,
Starting point is 00:56:35 it to Tesla to Steve Jobs to make it part of this like tech mythology is a really powerful thing like stories are powerful like stories are essentially what you know what have spun off so many of these companies, this belief that, like, oh, we need, you know, the story of opening AI at the beginning is, like, we need to build a, you know, a research lab that's going to do this particular function with these particular lack of financial incentives in order to, like, save humanity, right? Like, what a story. You can just do things. You can just do things. And especially if you're going to do it to save humanity. Yeah. Through juice. I mean, yeah, the Jucero story is like, Look, rarely have I ever written a story that could be really summed up in one sentence
Starting point is 00:57:21 and completely, you know, it was like they built this juice machine to squeeze the juice packs and you could just do it with your hands. You had this video where you were squeezing them. That was me. Squeezing in the bag. We can, you know, you can throw up a little overlay if you want. But the, you know, so rarely is it that simple. Like part of what makes Juicerro an enduring fable of, of,
Starting point is 00:57:45 Silicon Valley is that it is so neatly encapsulated in the idea that you can just squeeze it with your hands. Usually it's a lot more complicated than that. But you're right that I do think it speaks to the potential pitfall of story. It's like, everyone recognizes how powerful narrative is in rallying investment, employees, like, the narrative of your company. And even more so if it can tap into these, again, these human desires of, like, I want to feel like I'm on a mission, purposeful, working together with people towards something of greater importance than myself, feeling like, yeah, part of a group that has discovered something new. Like, that is how powerful a story is, and sometimes it's so powerful it can mask
Starting point is 00:58:40 something really hollow at the center of a company. Yeah, no, so this is why I wanted to do this episode, A, because I think your work is fascinating, and B, because we are in this moment where a lot of this AI story is being driven by narrative, and there's nuance there. And I think folks should understand where these narratives come from, and where the roots of a lot of what we see come out of Silicon Valley really lie. It's not all in the areas we spoke about today, like the houses, but the power of belief and the type of person that gravitates to Silicon Valley... you've seen it, I've seen it, certainly worth paying attention to. And yet, amid it all, it works for the most part. Yeah, yeah, yeah. I mean, a lot of, I'm going to say this generally
Starting point is 00:59:27 about cults, but a lot of cults, I mean, every successful cult has at its center, like, a significant amount of wisdom, because otherwise the teachings would never, like, get off the ground. And, you know, a lot of the successful companies, even if they might lean heavily on mythologizing their story, they would never get that far if they didn't have something extremely valuable also to offer. So it's often that combination. It's like, you have to have something of value and to have a story around it that is deeply psychologically motivating to people. Like, that is the combination that can take you really far. Definitely. Well,
Starting point is 01:00:13 Ellen, you know, this is my third podcast. The first one was called On the Program. It lasted for, like, six episodes. The second one was called Delete Your Account, and you were actually a guest on Delete Your Account. Micropod. But it's lovely to have you here. Yeah, thanks so much for having me. The real feeling. And we hope to have you back. Thank you.
Starting point is 01:00:33 All right. Thank you, Ellen. Thank you, everybody, for listening and watching. We'll be back on Friday to break down the week's news. Until then, we'll see you next time on Big Technology Podcast.
