Ask Dr. Drew - Ryan Hartwig – Facebook Moderator Turned Whistleblower – Ask Dr. Drew – Episode 51

Episode Date: October 22, 2021

For almost 2 years, Ryan Hartwig was a moderator at Cognizant, which was contracted by Facebook to review posts and take action against content that violated its policies. With millions of new posts every day, the platform has struggled to find a balance between free speech and the regulation of misinformation and violence — claiming that it moderates content with fair and reasonable rules. But after getting an insider’s view of the platform’s enforcement practices, Ryan became alarmed by what he claims is a pattern of censorship: with rules being applied inconsistently to favor one end of the political spectrum over the other, and vast powers wielded by biased moderators who control what users are allowed to say. In 2020, Ryan secretly documented his findings for a Project Veritas exposé and his book “Behind The Mask Of Facebook.”

More from Ryan Hartwig: https://ryanhartwig.org

Ask Dr. Drew is produced by Kaleb Nation (https://kalebnation.com) and Susan Pinsky (https://twitter.com/FirstLadyOfLove).

SPONSORS

• BLUE MICS – After more than 30 years in broadcasting, Dr. Drew’s iconic voice has reached pristine clarity through Blue Microphones. But you don’t need a fancy studio to sound great with Blue’s lineup: ranging from high-quality USB mics like the Yeti, to studio-grade XLR mics like Dr. Drew’s Blueberry. Find your best sound at https://drdrew.com/blue

• HYDRALYTE – “In my opinion, the best oral rehydration product on the market.” Dr. Drew recommends Hydralyte’s easy-to-use packets of fast-absorbing electrolytes. Learn more about Hydralyte and use DRDREW25 at checkout for a special discount at https://drdrew.com/hydralyte

• ELGATO – Every week, Dr. Drew broadcasts live shows from his home studio under soft, clean lighting from Elgato’s Key Lights. From the control room, the producers manage Dr. Drew’s streams with a Stream Deck XL, and ingest HD video with a Camlink 4K. Add a professional touch to your streams or Zoom calls with Elgato.
See how Elgato’s lights transformed Dr. Drew’s set: https://drdrew.com/sponsors/elgato/

THE SHOW: For over 30 years, Dr. Drew Pinsky has taken calls from all corners of the globe, answering thousands of questions from teens and young adults. To millions, he is a beacon of truth, integrity, fairness, and common sense. Now, after decades of hosting Loveline and multiple hit TV shows – including Celebrity Rehab, Teen Mom OG, Lifechangers, and more – Dr. Drew is opening his phone lines to the world by streaming LIVE from his home studio in California. On Ask Dr. Drew, no question is too extreme or embarrassing because the Dr. has heard it all. Don’t hold in your deepest, darkest questions any longer. Ask Dr. Drew and get real answers today. This show is not a substitute for medical advice, diagnosis, or treatment. All information exchanged during participation in this program, including interactions with DrDrew.com and any affiliated websites, is intended for educational and/or entertainment purposes only. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 BetMGM, authorized gaming partner of the NBA, has your back all season long. From tip-off to the final buzzer, you're always taken care of with the sportsbook born in Vegas. That's a feeling you can only get with BetMGM. And no matter your team, your favorite player, or your style, there's something every NBA fan will love about BetMGM. Download the app today and discover why BetMGM is your basketball home for the season. Raise your game to the next level this year with BetMGM. A sportsbook
Starting point is 00:00:30 worth a slam dunk. An authorized gaming partner of the NBA. BetMGM.com for terms and conditions. Must be 19 years of age or older to wager. Ontario only. Please play responsibly. If you have any questions or concerns about your gambling or someone close to you, please contact Connex Ontario at 1-866-531-2600 And here we are, everybody.
Starting point is 00:00:58 Thank you so much for joining us. We are going to have a guest in just a moment. We are also taking calls out of Clubhouse and watching everyone on Restream, making sure we're all sort of piling in right now. My guest today is Ryan Hartwig. He was a moderator at Cognizant, and during that time, he became concerned by some of the censorship rules over at Facebook. He began sort of compiling and documenting what he was seeing for Project Veritas. And his new book is Behind the Mask of Facebook. We're going to talk to Ryan right now. Our laws as it pertained to substances are draconian and bizarre.
Starting point is 00:01:38 Psychopaths start this. He was an alcoholic because of social media and pornography, PTSD, love addiction, fentanyl and heroin. Ridiculous. I'm a doctor for. Where the hell do you think I learned that? I'm just saying, you go to treatment before you kill people. I am a clinician. I observe things about these chemicals.
Starting point is 00:01:55 Let's just deal with what's real. We used to get these calls on Loveline all the time. Educate adolescents and to prevent and to treat. You have trouble, you can't stop, and you want help stopping, I can help. I got a lot to say. I got a lot more to treat. If you have trouble, you can't stop and you want to help stop it, I can help. I got a lot to say. I got a lot more to say. I want to give a shout out to our good friends at Blue Mics.
Starting point is 00:02:16 If you've heard my voice on this show anytime over the past year, including right now, you've been listening to Blue Microphones. And let me tell you, after more than 30 years in broadcasting, I don't think I've ever sounded better. But you don't need to blue microphones. And let me tell you, after more than 30 years in broadcasting, I don't think I have ever sounded better. But you don't need to be a pro or have a fancy studio to benefit from a quality mic. You may not realize it, but if you've been working from home
Starting point is 00:02:33 or using Zoom to chat with friends, you probably spend a lot of time in front of a microphone. So why not sound your best? Whether you're doing video conferencing, podcasting, recording music, or hosting a talk show, Blue has you covered. From the USB series that plugs right into your computer to XLR professional mics like the mouse or the blueberry we use in the studio right now. Bottom line, there's a Blue microphone to fit your budget and need.
Starting point is 00:03:03 I can't say enough about Blue mics, and once you try one, you will never go back. Trust me. To take your audio to the next level, go to drdrew.com slash blue. That is drdrew.com slash B-L-U-E. Anyone who's watched me over the years knows that I'm obsessed with Hydrolyte. In my opinion, the best oral rehydration product on the market. I literally use it every day. My family uses it. When I had COVID, I'm telling you, Hydrolyte contributed to my recovery, kept me hydrated. Now, with things finally reopening back around the country, the potential exposure to the common cold is always around. And like always, Hydrolyte has got your back. Hydrolyte Plus Immunity, my new favorite, starts with their fast-absorbing electrolytes and adds a host of great ingredients. Plus, each single serving Easy Pour Drink Mix contains 1,000
Starting point is 00:03:44 milligrams of vitamin C, 300 milligrams of elderberry extract. Hydrolyte Plus Immunity comes in convenient, easy-to-pour sticks that rapidly dissolve in water, make a great tasting drink, has 75% less sugar than your typical sports drink, uses all natural flavors, gluten-free, dairy-free, caffeine-free, non-GMO, and even vegan. Hydrolyte Plus Immunity is also now available in ready-to-drink bottles at the Walmart next to the pharmacy, or as always, you can find it by visiting hydrolyte.com slash drdrew. Again, that is h-y-d-r-a-l-y-t-e dot com slash d-r-d-r-e-w. Be sure to use the code drdrew25 for a special discount. Welcome to the program. Website is ryanhartwig.org, H-A-R-T-W-I-G. So we just thought it was interesting on the heels of the recent so-called whistleblower, and this is where I want to start our conversation, testifying before Congress.
Starting point is 00:04:39 What did you think of her testimony, her story? Yeah,is haugen uh testified last week before congress and she she also uh did the 60 minutes interview on cbs but uh yeah i i really didn't think that she's really too authentic to be honest like this here's someone who's worked in silicon valley her most of her adult life and now she's saying facebook wasn't doing enough like the other companies were doing to censor more content. So that was the gist of her message. She talked about how she lost a friend to basically on Internet conspiracy theories. I know her friend didn't die.
Starting point is 00:05:15 They just stopped being friends. And so I don't know. I don't really see how it's damaging Facebook asking for Facebook to censor more content. So that's my takeaway on that. So in other words, how would your position juxtapose to hers, let's say? Yeah, so my position is Facebook is censoring too much content. And what I found, my evidence that I found in my book is that they're censoring conservatives and they're censoring political speech.
Starting point is 00:05:45 So I wouldn't want people on the left to be censored either for their political speech. So they're doing too much. They're going out of something out of bounds of Section 230. And so that's that's my it's my conclusion is the direct opposite of hers. She's saying the censor more. I'm asking that Facebook to censor less content isn't isn't her sort of thrust to try to undo 230 to to incur liability so people have to think about what they put in other words make things freer but to put a put a certain amount of liability on what's published i think uh some of her stance and what she requests is that Congress take action
Starting point is 00:06:27 and create some kind of a regulatory body that would oversee Facebook. So in general, I don't like the government being more involved in the internet in general. So I mean, one of the common principles of the founding principles of the internet is that it should be decentralized, that it shouldn't be controlled by a single regulatory body or by the government and so you know really section 230 if it was interpreted correctly by the supreme court the way it was written it would be good um but you know what does that mean yeah in your opinion what does that mean it means that they should be able to restrict some content, but they shouldn't be able to promote certain content. They shouldn't be able to produce or boost certain content over other pieces of content.
Starting point is 00:07:14 Yeah, I get you. It sounds like you're essentially a libertarian, right? I mean, that's sort of the libertarian attitude about the internet. Yeah. And, but isn't it the case that true libertarianism can exist only because of markets? Markets are sort of the, you know, markets and law are sort of the balance to, you know, libertarianism becoming frankly anarchy. And why not? Why not? See, the one thing that's always bothered me about the internet is
Starting point is 00:07:45 people can say anything feeling at their liberty to say it with no no um it's the only place on earth i know that in our reality that you can do anything you want to another person regardless of how it harms them with not incurring with no one nobody incurring any liability of any type. Yeah. And so those are some of the principles. So one principle that's related to that is from John Stuart Mills, who's a philosopher. So it's called Mills Harm Principle. And Facebook actually tried to incorporate some of this into their policies. I believe they removed it later, but I did see that as part of their policies. So John Stuart Mills, basically the mills harm principle is talking about if i say something to someone something mean will actually what's
Starting point is 00:08:32 the actual chance of of it causing real world harm you know what is that going to lead to violence so he's of the opinion that most speech should not be restricted so this whole idea of hate speech is kind of invented i don't agree with that term at all. I don't I don't think hate speech is doesn't even even exist. It's just a concocted definition that they use to censor people. But yes, there's John Stuart mills. And then there's that whole, you know, if I yelling, yelling, crowded enough, yelling fire in a crowded theater um so there's
Starting point is 00:09:07 which that case was disproven or was kind of was kind of moot um so those are some principles but but what i yeah what i found was more related to political speech so trump supporters receiving disparate treatment um the treatment of for example white trash the phrase white trash at one point we deleted it while i was there facebook changed their stance and said no white trash is okay it's not it's not hate speech so just kind of that double standard is what i analyzed in how facebook treated uh political supporters of trump don't trump is are you generally are you a liberal are you conservative are you a libertarian where do you sort of fall on the spectrum for yourself forget forget the internet you don't have to tell if you
Starting point is 00:09:50 don't want by the way i'm more conservative but i definitely have some like libertarian tendencies um i would say more conservative so like a okay so sort of libertarian right there's a libertarian left too right i think i think i'm sort of on that side of things because whenever I see problems, I want the government to fix them. So that's not a true libertarian kind of... I met some real libertarians and I realized, oh, I have a heart.
Starting point is 00:10:18 Oh, wait a minute. I just want to fix stuff. I worry about people. And those real libertarians let it fly. I am trying to remember stuff i worry about people and those real libertarians will let it let it fly i i am trying to remember the um the economic diathesis that mills this is a common economic construct and i cannot well in any event you know mills was utilitarian right uh and he was brilliant he taught himself greek and latin by the time he was four or six
Starting point is 00:10:46 or something crazy like that and he he was uh you know an intellectual leader of his time but you know his his alchemy is all about greatest good for the greatest number with least harm and you know the reality is that things people say do harm them. They harm their reputation. They harm their ability to function in the world. I mean, they make it, I mean, they really hurt people. I heard Jon Stewart the other day say that there's no such thing as the cancel culture. I thought, oh my God, there are people losing their jobs every day. They are harmed.
Starting point is 00:11:22 Even in Russia, back in the darkest hours of the Soviet Union, most of how they exerted their power was economic. If you didn't fall in line, you would lose your job. It was like old-fashioned ostracism from back in the Greek democracies. It's a really harmful, dangerous, profoundly impactful phenomenon. And now we're starting to see people kill themselves on the heel of it, too, which we've seen. But so to say that it doesn't have any impact on people in reality, it's I don't know if I can say worse than yelling fire in a theater. But it has certainly a more sustained effect if all you are is injured climbing out of a theater when somebody yells fire.
Starting point is 00:12:10 Right. Yeah. And so there are examples. And as a content moderator, I was there for almost two years and I deleted suicide statements, I would click a button and send resources to someone who was having suicidal statements, I would see imagery of cutting or slicing, really graphic imagery. And as a content moderator myself, I had to deal with the mental health aspect of being a content moderator and my coworkers as well. So we had a counselor on site, a psychiatrist on site 24-7, and we could call a number as well.
Starting point is 00:12:41 So I was looking out for my coworkers because we were affected by seeing all the violence and the gore, cartel videos, beheadings, terrorist content. But yeah, as users, obviously if a teenager is having suicidal thoughts or sharing something about weight loss, there was a hashtag like Thinspiration or Thinspo, we would delete that. But really what it comes down to and part of francis haugen's testimony last week was you know congress needs to do more to protect the children
Starting point is 00:13:11 facebook to protect the children but my question is like where are the parents of these children i know the the technology is new and it's it's hard to manage and regulate and to control to be safe for children but the internet's never going to be safe for children. So I guess that was my, one of my takeaways was, okay, yeah, the government can do certain things to an extent. Maybe Facebook can do certain things to an extent, but, but primarily the responsibility falls with, with the, uh, with the parents of these children, these teenagers. Um, so can there be harm? Yeah is there are harmful things so those are the more extreme examples of things that i agree that should you know there should be a way to for
Starting point is 00:13:50 facebook to have the that authority delegated to them from congress to be able to censor certain types of content definitely you know suicide is it authority or like any other publisher should they accept the liability the responsibility of being a publishing platform? Yeah, that's a good question. So if they're promoting content, and Frances Haugen talked about how there should be, what did she say, that there should be more quality content produced. Like Facebook should do a better job of creating quality content. And they're not a quality content creator.
Starting point is 00:14:24 So they shouldn't be creating, if they want to be an information, the law second to 30 talks about being an information content provider, um, versus a servicer. So in your interactive computer service and not an information content provider. So they are an interactive computer service. So, so they, if they want to be a publisher, they, they can, but they'd have to accept all that civil provider. So they are an interactive computer service. So if they wanted to be a publisher, they can, but they'd have to accept all that civil liability. So they probably wouldn't exist for very long. So what happened with Section 230 is the courts in California misinterpreted
Starting point is 00:14:58 or reinterpreted Section 230 and it gave Facebook more leeway. So the way they interpreted it means that the Good Samaritan provision doesn't apply to all parts of the law, which it was intended to. So that's what Jason Fick, who sued Facebook in the last couple of years, his case went to the Supreme Court in January. He was hoping that the Supreme Court could reinterpret it and make that Good Samaritan provision apply to every part of the law.
Starting point is 00:15:30 So that's how they're getting away with some of what they're doing. So yeah, the suicide and that kind of harmful content, I view that very separately from censoring political speech. So if I say, if I call you a Trump humper, or I call you a feminazi, Facebook has a double standard. So I view that very separately from things like suicide that clearly can be harmful. Well, I would argue I'm not particularly comfortable with them censoring mental health content. I mean, we need to be identifying mental health. We should be identifying people with mental health crises and referring them on to people, not censoring them. And why do we censor? They don't censor people with other opinions about medical conditions online. That could be very dangerous.
Starting point is 00:16:06 But why mental health? I would take your argument straight over there. I mean, you're not used to, you know, they put a non-mental health professional overseeing it. This must have been very scary for you. But for a professional, this is all day, every day. People are suicidal all the time. And you deal with it. You manage those ways to help them.
Starting point is 00:16:24 You don't silence them. That's the last thing you do. That increases you manage those ways to to help them you don't silence them that's the last thing you do that increases the probability they're going to go do something so how are their parents supposed to see it or maybe you can notify the parent there just be there'll be a million creative solutions to things like that you can get the national institute of mental health involved get the national institute of national alliance for the mentally ill involved get you know there's a lot of things they could have done other than just censoring so on one hand i i think yeah i think this and and of course when they censor things they're going to do things within bias because all cognitive processes are biased right and that
Starting point is 00:16:55 and the and on a scale like that the biases are going to become particularly evident when lots and lots of things are being adjudicated and these that's not their... I mean, you as a cognizant employee, what do you know about that? Why would they put you in that position? It's irresponsible. It's ridiculous. And so I agree with you that the way in which they... I'm sure he did a great job. No, I'm sure he did the best job you possibly could.
Starting point is 00:17:20 But they put you in a horrible situation. I mean, it's just not it's like being a calm in combat and not being trained to tolerate combat you can't you when you feel out of control like that you know and you're not or you either you either numb and you become numb to it and it just doesn't mean anything to you or you feel horrible and out of control and and and you know not able to manage it and it's both are bad um so all right so so i i on one hand i'm sympathetic to your doctrine here that uh less less uh uh what's the word i'm looking for guys censorship censorship less censorship is is is a good goal but i feel like if they had some skin in the game they would do a much better job of what they do do in other words if you could say
Starting point is 00:18:14 oh that kid killed himself i can hold facebook accountable for that they were really completely who advised them to do with the way they did why did they why did they do it that way if they're going to censor it who do they consult to determine what their policies were? And then again, the inequality or the lack of evenness with which it would be applied, I'm sure, was just all over the place. So why not put some of the liability of, maybe not all the liability, you called it the civil liability of being a publisher, but why not some liability or at least liable, some liability to the people that provide the content, maybe not directly to Facebook. Maybe the content providers themselves have some responsibility for the things they say and share.
Starting point is 00:18:59 Right. No, I think, I think I agree with you in some regards. So like, like thinking, thinking about section two 230, okay, so there's this law and Facebook is using it to be able to have this platform and not get sued, right? But when have we ever seen any consequences from misdeeds? When Facebook is screwed up, which they've screwed up a lot, have there been any consequences? So I think that's the whole idea of some kind of a regulatory board, kind of like an FCC, but for Facebook. And I may be able to get on board with that. Jason Fick, who is a Section 230 expert, he believes that that could be a solution, creating a regulatory board. Because Facebook right now, they're acting kind of like a de facto government agency anyways.
Starting point is 00:19:39 I mean, you're having, you know, Jen Psaki saying they're flagging content and giving it to Facebook for review. So maybe having some kind of oversight wouldn't be a bad idea. And let's be clear, they get money and tax breaks, too. So they are in collusion with the government and then taking direction from the government. So that's a concern. So if you had a magic wand, your friend I see has one solution. What would you do? What does your book tell us? If you had a magic wand that could solve the problem in Facebook, what would that look like? Yeah. So if I had a magic wand, I could
Starting point is 00:20:15 solve this problem. I think the biggest issue, and I mentioned this in my book, is antitrust. I mean, just looking at the sheer magnitude, the sheer volume and size of these companies, Facebook and Google together, it's not just a monopoly, it's a duopoly. And so I think antitrust would have been the best solution, just kind of cut these companies down to size, make them break up, split Facebook and Instagram. So that would be the ideal solution. I don't know how feasible that is, but of course, if I had a magic wand, that would be the first solution. Cut these companies down in size using antitrust. The next solution would be just having, you know, forcing the Supreme Court and we can't force them, but forcing them to interpret it the correct way. Clarence Thomas.
Starting point is 00:21:04 Hoping they interpret it the correct way. What would that look like? What is the correct way. Clarence Thomas. Hoping they interpret it the correct way. What would that look like? What is the correct way? As I mentioned earlier, so that the good Samaritan provisions apply to every part of the statute. I'm not sure. You said that. I'm not sure I understand what that means. Explain to people what you mean by that.
Starting point is 00:21:22 Yeah. So there's a, I don't have the law pulled up right now but there's section 230 c2 and c1 and um the way it was interpreted by the ninth circuit court in california basically interpreted so that they don't have to be a good samaritan for certain parts of the law um so that that's my that's my understanding of it. That basically it gave Facebook additional protect shielding or protection where they didn't have to act, be good, I guess, act in our best interest. Um, so, but at this point, honestly, so the section two 30 is the communications decency act, right?
Starting point is 00:22:01 From 1996. But here we are like almost 25 years 25 years later, and so much has changed with the digital revolution. I wouldn't be against crafting some new legislation, but my only concern is when you have people on the left in control who are actively censoring, who are already actively censoring content. We had Senator Cory Booker ask, request that Facebook do something about Trump, and then a month later he was banned from Facebook. That's my only concern right now is crafting legislations. When you have these legislators in the pocket of Facebook, that could be an issue.
Starting point is 00:22:37 Right. So as I sort of listened to your solutions, it's just essentially, it's a problem of excess concentrated power that's the bottom line here right isn't that the root cause all right yeah and and it's because there's so much power and it's so new it's not we don't know what to do with it we don't know how to regulate it we don't know what the best solution is we all have a sensitivity to free speech we all wish there could be a libertarian diathesis on the internet. That was the intent of it. Turns out people aren't so good all the time. Don't do what they're supposed to do or hurt other people or
Starting point is 00:23:14 don't adjudicate with great equity. Yeah, no surprise. Let me take a question here and see if anybody wants to ask you anything. If you're not asking Ryan a question, I'll address other questions later once I let Ryan go. But if this is specifically about Ryan and his experience over at Facebook, we're doing this because of having been on the heels of the congressional testimony. Let's see. I'm going to try to call chuck chuck up chuck all right as usual i have i have uh clubhouse doesn't come directly up oftentimes it takes a minute and i was i was actually a guest on somebody's clubhouse today and i realize
Starting point is 00:23:57 it is it's a little cumbersome if you're trying to come up and speak at the podium there's a lag but you know what they do is they get a bunch of people up. Hey, Chuck. Hey. Hey there. Hey there. Thanks. My really question is two pieces. I think you
Starting point is 00:24:16 reiterated a couple of these things. By the fact that they go ahead and make changes to people's posts, meaning flagging their posts, doesn't that make them a publisher? And doesn't that mean that then section 230 should not apply to them? And this, I think what Chuck is bringing up is sort of kind of, I was sort of tilting my hat at that direction as well. I'm sure you heard me kind of going that way. Because I feel like if they're not publishers,
Starting point is 00:24:52 they at least have some of the responsibility of publishers. And maybe we're too narrow in how we define a publisher. And Chuck is pointing out that they're editorializing, essentially, by directing our attention one way or another, moving things up the page, whatever it might be. Do you disagree? Yeah. That is my point. Yeah.
Starting point is 00:25:13 Go ahead. Oh, yeah. So they can't pick or as a – they're not in the law. There's another term called access software provider. So they're not allowed to pick choose analyzed or content or or prioritize content um so so yeah if they're if they're singling content out in it you could argue that they are promoting it in a way um how is that's a really good point how is and it's also point, which is how is removing content qualitatively different than moving content into a higher position?
Starting point is 00:25:52 It's a valence issue. It's the same function. One is a negative function, one is a positive function, but it's doing the same thing. It's changing the content and moving it about, right? Yeah, it's like a sin of omission versus commission yeah it's it's not it's even it's even more explicit than that it's it's going right versus going well it's going up versus going down we're still moving we're still in movement when we do this uh chuck anything else well i mean so the other question I have is internally to Facebook.
Starting point is 00:26:25 I mean, generally, I believe most people have good intentions. And increasing communications, I think, always helps bring people together. So, you know, the points that you made about are there any programs that they're planning or were they discussing to go out and help people? Right. You know, like you said, you know, I would think in my mind that there would be something to, there would be plans to make sure you identify people that are actually posting things. They're the publisher. So that way you have a valid publisher, someone to act upon.
Starting point is 00:27:03 And then, like you were saying, the suicidal piece or just not even suicidal, just the general piece to help the well-being of people. Right. And let's, Chuck, let's even shine a brighter light on your point, which is there are apps that do that. Why can't Facebook do that? There are lots of apps that do that same kind of thing. Why can't they partner with do the same thing whatever it might be why did that why do they go to censorship and run away as opposed to take the the responsibility not even responsibility just do do what's right yeah like there's apps like uh even for watching watching movies there's an app called vid angel
Starting point is 00:27:41 that edits out the obscene words, the curse words from a movie. So you can watch an entire movie without any curse words. And there were some lawsuits against them for a while. And in growing up, actually, I grew up in a very religious community. And there was a guy who I knew growing up with who had a video store who did that, who took videotapes, and he edited out the curse words to sell it to the market, the religious people growing up here. And he got sued by Hollywood. So finally, they won some lawsuits. But yeah, so something similar for Facebook where you can just install a filter or have some kind of...
Starting point is 00:28:17 I'm not even talking about that. I'm talking about interventions and referrals for the mental health stuff that you were silencing. There are apps for that kind of thing. I'm saying it's all over. That's what people are always trying to do is sort of not automate so much as make a seamless product where people can use the internet or the electronic media to get help and to be identified and referred on.
Starting point is 00:28:40 And as opposed to silenced, which you can't imagine how that sounds to me. It sounds wild that you take a suicidal person and you put them over there. That'll fix them. That'll work. We don't want anything to do with that. I mean, that's where I would start my lawsuit right there if I were taking on Facebook. It looks like Jason has some skin in the game on this topic too. I'm going to bring Jason up to talk to you. Let's see what he's... Jason, hi there. Yeah, a friend of mine is actually in this group and they called me and said my name was mentioned. Ryan, it's Jason. Good to talk to you, buddy. This is Jason Fyk.
Starting point is 00:29:24 Yes, this is Jason Fyk. Yeah, I heard you talking about it. I thought I might be able to expand a little bit on some of the things you were talking about. Go ahead, please. So Ryan and I have been in touch.
Starting point is 00:29:37 And Zach Vorhies, a lot of the other whistleblowers, for a long time now, we've looked deep into Facebook. And just for background with everyone, I have been in litigation against Facebook since 2018. We took it all the way to the Supreme Court. The Supreme Court kind of failed us and didn't hear the case. It was just kind of amazing. Although our case actually turned around, because a lot of people were talking about antitrust, and Ryan was mentioning the antitrust, and antitrust is already there, believe it or not. And I wanted to tie together what he was saying about the Good Samaritan, right?
Starting point is 00:30:16 A lot of people think that it's 26 words. It's not. Realistically, what we're being told by many, many people out there, including tech attorneys and everything else, is wrong. We've been in this for literally years. So there's some fundamentals here that people need to know. Section 230(c),
Starting point is 00:30:38 the very first thing it says: protection for "Good Samaritan" blocking and screening of offensive material. And anybody that thinks they know about it, I ask them a very simple question: why are there quotes on "Good Samaritan"? Nobody ever knows. There's a real reason for it. Good Samaritan is what's called an articulated and intelligible principle.
Starting point is 00:30:57 Effectively, what happens is, in administrative law, when Congress lays down any kind of law like Section 230, it has to come with an intelligible principle, meaning it's foundational, easily understood, and the reason it's in quotes is because it's articulated right there. Now, most people have no idea why Lemmon v. Snap or Enigma v. Malwarebytes beats Section 230. There's a reason. The fundamental thing has never been asked by these courts. And remember, everybody, California is the only court handling this. It's not being done anywhere else. In fact, the Domen case, which was a Second Circuit case, they just moved it away from 230. 230 is no longer involved in it, because they don't want to end up with a circuit court conflict,
Starting point is 00:31:45 which is what would push it to the Supreme Court. So the point of this is that Good Samaritan is your first question: did they act as a Good Samaritan? In the case of Snap, they were negligent, like blatantly negligent. They ended up killing somebody, basically. Is the Snap part about the drug distribution? No, Lemmon v. Snap, I believe, was the one with the app where the faster you get going, the more reward pictures you get, and so forth. I have less experience with that one because it's less pertinent to my case. But Enigma, and this is a direct quote out of Enigma, and this is very pivotal here: the Good Samaritan provision of the Communications Decency Act does not immunize blocking and filtering decisions
Starting point is 00:32:36 that are based upon an anti-competitive animus. If Congress and everybody else was right, and you can be a publisher and do whatever you want, well, that wouldn't stick, because that would be a publisher function. So there's a conflict going on. And the reason is because the courts have simply overlooked the basic principle. And Ryan and I have worked very hard on this. Matter of fact, he's one of the people I speak to a lot. And you've got to go read his book. Good stuff, right? We are about to challenge it the way that it should have been done in the first place. So I'm going to give all of your audience kind of an insight that nobody knows, because I was essentially denied a day in court. I never got one. I mean, I didn't get oral arguments.
Starting point is 00:33:25 It was pathetic. These courts don't want to give you process, right? Your due process. And a lot of people don't realize that administrative law is where they can restrict something of yours. And it's right in the law: Section 230(c)(2)(A) says that they can restrict your materials. Well, that's your liberty, your property. So
Starting point is 00:33:46 there is a definitive line between what a private entity can do under the First Amendment. Sure. But they still have the duty of the restriction. It's what's called agency authority. Who's "they"? Meaning Congress. When Congress wrote it in 1996, they said that these private entities can essentially have agency authority. And we dug and we dug and we dug. And actually, I have to read some of it to you. This is where it's going to come down.
Starting point is 00:34:15 This is a Fifth Amendment constitutional challenge that's going to be coming now. Because everybody's looking at Section 230 case law, but this is not a Section 230 case. This is a Fifth Amendment case. This is Carter v. Carter Coal, and this is coming from something where I substituted the proper names in the case for what it does with social media. In the Carter v. Carter Coal case, Justice Sutherland delivered the opinion, quote: the power conferred upon the provider, which would be the service provider, is in effect the power to regulate the affairs of the unwilling user. This is legislative delegation in its most obnoxious form.
Starting point is 00:34:56 Section 230, in this case, does not confer the power to an official or an official body, presumptively disinterested, but to private persons whose interests may be, and often are, adverse to the interests of others in the same business. For example, my case. The difference between operating an interactive computer service, in this circumstance an advertising service, and regulating by restricting its production materials is, of course, fundamental. The former is a private activity; the latter is necessarily a government function. So it's split, right? Since it is of the very nature of things, one person may not be entrusted with the power to regulate the business of another, and especially of a competitor. And a statute which attempts to confer such power undertakes an intolerable
Starting point is 00:35:49 and unconstitutional interference with personal liberty and private property. The delegation is so clearly arbitrary, and so clearly a denial of rights safeguarded by the due process clause of the Fifth Amendment. So what would you be hoping for in the outcome of that case? So where we sit right now is that, because they denied me due process, it grants me the standing to bring a due process claim against the United States, meaning I don't have to fight in California anymore.
Starting point is 00:36:21 I'm a resident in South Florida, so I can bring a case in the Southern District of Florida and take it all the way up as a Fifth Amendment case, and you'll end up
Starting point is 00:36:32 with what's called a tribunal. Essentially, three judges come in and we first have to notify the governor and the attorney general of the state
Starting point is 00:36:40 to see if they want to put up opposition. And what was your original case? Again, I've lost track of your original case. My original case was Fyk v. Facebook. And it was an anti-competitive case.
Starting point is 00:36:51 We asked the courts, can they commit anti-competitive behavior? They wiped my business out, you know, somehow claiming my content was offensive. And then four months later, I went to a competitor who did bigger business. And I said, hey, can you guys see if you can get my stuff back? And Facebook came back to them and said, no, we're not going to do it for him. But if you own it, we'll do it for you. So I sold it to them. And sure enough, it somehow no longer violated the community standard.
Starting point is 00:37:19 And all of the content was restored. Meaning, on its face, we proved it had nothing to do with content. It had to do with the value of the company. They wiped me out because I just wasn't valuable to them. Interesting. And so the problem, for many people, and this is what I would say, is it is not Section 230.
Starting point is 00:37:38 The problem is actually the courts. They don't know how to read. I mean, if you read the language, Ryan was actually talking about development. Most people don't know that it says that if they're responsible, in whole or in part, for the creation. We know that one. What about development? That's modification: when they identify content, send it over to a fact-checker to get it basically laundered, and send it back. That's development. But the courts turned "in part," which is insignificant, into material
Starting point is 00:38:11 contribution, which it doesn't say, which means significant involvement. It's the courts that are the problem. What if the legislators get ahead of this? I mean, it felt like during the testimony that there was unanimity of desire to do something. Can they get ahead of this? So what we've come to realize right now, because this is starting to seep out through lots of different channels, is that if you dig into what we're doing here, it's right. It is unconstitutional under the Fifth Amendment when you break it all down. And I have broken it all down,
Starting point is 00:38:45 because of course that's what we're doing. You know, the only people that are putting up opposition to me, which is just kind of interesting, is Congress, because they always want to make more bills, more regulations, more everything. And I'll be honest, I spoke with Matt Gaetz over the weekend, and he told me, and this was amazing: he said to me that I'm going to have a hard time proving that the section grants agency authority. Well, cases have literally said that, right? So it's not going to be hard to prove. And after he got through all this, I said, well, it doesn't really matter, because I don't need Congress.
Starting point is 00:39:27 We're challenging the law. And if Congress tries to get ahead of it and change bills, it won't matter. A private entity at its core cannot regulate another private entity. It's fundamentally wrong under the non-delegation doctrine of the Constitution. As I said, it's the most obnoxious form of delegation. And on top of that, you also have void debates. Ryan, go ahead. So what Frances Haugen is asking Congress to do is different than the outcome of what your case would be, right, Jason?
Starting point is 00:39:59 So she's asking Congress to take action. Yeah. She's asking for a regulatory board, and Jason is asking for constitutional solutions. Well, now, I'm sure somebody will think I'm a conspiracy theorist on this one, but consider that we hinted in a Rule 60(b) motion that we were coming the due process route. One of the fundamental problems with this, as a Fifth Amendment challenge, is that there is no congressional oversight. It doesn't exist. Right.
Starting point is 00:40:38 And because of that, that's why it's unconstitutional under the Fifth. Do you think it's any coincidence that they're trying to get it? And so what will happen as a result of this? That's what I'm saying: if they're trying to get that oversight, it's because without it, it can actually be revealed. They're desperate to get it. But if you win, Jason, what's the impact? Yeah, what's the impact of you winning a constitutional challenge like that? So on a constitutional challenge, you only have,
Starting point is 00:41:07 like you only challenge the pieces that are unconstitutional. What's the impact, though? You're getting into a lot of legal weeds that most people aren't going to be able to follow. And so they just want to know: how is this going to affect them? Fully understand. But what I was saying is,
Starting point is 00:41:25 that there's only going to be one piece that really is unconstitutional. And that's one piece of the law: the part where they restrict you. Okay. So the part where they're not responsible for what somebody else did is still going to remain there. That's what I'm saying.
Starting point is 00:41:38 That will stay, because that's not unconstitutional. That just says, well, you can't be held accountable for what somebody else did. Now they're going to have to fix the language, but that's a different issue. So it's essentially going to take away their ability to alter and censor content? Restrict content will be taken away. Yes and no. What will actually have to happen, to be what they call constitutionally sufficient, because the courts, I won't even get into legalities, but basically
Starting point is 00:42:06 what will have to happen is this, and Ryan touched on this: an independent or official regulatory body. You heard some of that in the Carter v. Carter Coal court. I gave you an official regulatory body, like the FCC, the SEC, right? They have to be elected officials, or if they're not elected officials, it has to be an official regulatory body, because they're given all sorts of guidelines, like the APA, like all of those things. So that would actually have to be set up. I mean, it would be effectively the FCC for the internet, and then the regulations are set up universally. I get it. And so you end up in sort of the same place that the, I'm blanking on her name every time we bring it up.
Starting point is 00:42:48 The woman that testified in front of Congress. Frances. I need to write it down. Frances what? Frances Haugen. What was her last name? H-A-U-G-E-N.
Starting point is 00:42:56 Haugen. Yeah. Yeah. So Frances, you end up in the same place that she seems to be sort of wanting things to go. Well, Jason, thank you for stopping by here. I'm going to move you back to the audience. We appreciate it. Good luck with this.
Starting point is 00:43:08 It feels like, you know, and my son who went to law school is always telling me the same thing, that it's the courts that are going to solve all these things, not legislation. Do you agree with that, Ryan? Yeah, I think the courts can be a stumbling block for our republic right now. I think it's funny when they always say, save our democracy. And even Frances Haugen said in her testimony that Facebook should save society and save our democracy. But we're a constitutional republic, which, as a libertarian, most people would point out to you, we're not a democracy per se, because it's not majority rule. But yeah, there's some different solutions there. And actually, Frances Haugen, this week, she's going to talk to the Facebook Oversight Board.
Starting point is 00:43:49 So I'm curious to see what she tells them. I'm actually sending copies of my book to members of the Oversight Board as well. She's going to talk to legislators in the European Union. I've talked to a legislator in Argentina, and he's basically like, hey, if I have an issue with Facebook, there's no local Argentina office of Facebook. They're, you know,
Starting point is 00:44:09 overseas, so it's kind of hard to get in touch with them. But yeah, the courts are definitely a problem. Are any other countries challenging things in an effective way, in a way that might lead to change? I think there are some countries that are kind of stepping up to Facebook. France is doing something, aren't they? Is it France? I believe I heard that. Yeah, France. I believe they're taking actions. I know Hungary and Poland had a lot of conflict because they were standing up against the European Union and against all the immigration, and there was a lot going on with those countries. But yeah, I think each country should take a stand against Facebook, because I was also moderating content in Latin America.
Starting point is 00:45:00 And just, this is a quick example. There was a post, this was when Juan Guaido was trying to lead the revolution against Maduro in Venezuela. And so I saw a post that said, you know, hey, we're going to delete this post about an armed revolution. So people were literally saying, let's take up arms against the government, and Facebook was saying, delete that post. So it just goes to show the impact, the amount of influence that Facebook has. I'm deleting posts about someone trying to have a revolution in their country. It's insane how much power Facebook has. And I'm imagining there's a similar phenomenon at Snapchat and Twitter and other places, at least in terms of the centralized power? Yeah, I would say so. And I feel like they copy from each other's playbooks. I'm very familiar with Facebook's internal policies. I studied it
Starting point is 00:45:55 for two years. It's very legalese. So I kind of felt like a paralegal, taking each specific situation and applying the law to it. But for example, Alex Jones was deleted the same day across the board, across all the social media platforms. And it was like an emergency update for Facebook. So definitely some coordination there. And, you say it's predominantly on the right, are there any voices on the left being silenced that are advocating dangerous things? I believe there are some.
Starting point is 00:46:29 So they have their hate figure list. And of course, there's people on there like Tommy Robinson and Gavin McInnes who were on the hate figure list. I believe they added Louis Farrakhan to that list as well for his viewpoints. So there's a few people on the left who have been censored. And I give credit to Facebook; I try to be as fair as possible. So Facebook at one point, if you said,
Starting point is 00:46:55 keep Canadians out of the United States, that would be considered hate speech. And Facebook changed that and allowed for more speech. So if you're talking about politics or immigration, they allow that phrase. Or the phrase Muslims did 9-11. You're not really generalizing. You're talking about a specific event. You're not saying all Muslims are terrorists, which would be deleted for hate speech. But at one point, Facebook deleted Muslims did 9-11, and they changed that
Starting point is 00:47:19 to allow that phrase, which I agree with. Because if you're just talking specifically about 9-11, factually, yes, there were Muslims from Saudi Arabia who committed that. I'm going to interrupt you. Let's take that particular example. How does that work? What is their AI looking at? Is there a room of, you know, of Facebook executives? Is it programmers? Is it political consultants? Is it international theorists? I mean, who's making these decisions? What's the process?
Starting point is 00:47:53 Yeah. So it came from up on high. So our client was Facebook, and basically they just decided. Who's deciding? Does Mark Zuckerberg look at the list every day and decide? Is there a committee at Facebook that's always looking at these things? I mean, it seems so bizarre. Yeah, there's a committee. There's a policy team at Facebook headquarters in San Francisco that oversees these decisions.
Starting point is 00:48:28 It's very nuanced. But I remember one example where they changed their bullying policy, I think it was for the phrase white trash, or attacks on cops, and the only rationale for changing it was because of how the term is used in the North American market. So they really don't have to give too much explanation or justification. But it's this small group of people at Facebook headquarters that are making these global policy decisions. Because, the policy, do they have some sort of policies and procedures manual, a bible that they follow, of philosophical sort of guidelines? Or is it just, again, four people in a conference room? Yeah, I think it's just four people in a conference room making decisions. We would have our supervisors communicate with them and do teleconferences, and we would give them feedback.
Starting point is 00:49:15 So they changed their global policy based on our feedback regarding credit card scams. People were doing carding, selling credit cards online, and we said, hey, they're using these code words in Spanish, and we know what they're talking about. So finally, after like eight months, they updated their policy so we could delete these code words for credit card fraud. Weird. What do you want people to learn from your book? What will I learn when I read it? I think, well, to begin with, since you're a mental health professional and you deal with that a lot, I think the main thing is just how divisive and how dangerous this content can be.
Starting point is 00:50:00 So as a content moderator, you know, I had to go to counseling on multiple occasions to deal with all the content. So it gives you a really great perspective into what we go through, and whether there's any way to automate that or make it a better experience. And then the second takeaway is just how many exceptions Facebook made, how much political content they censored and tracked, because they were tracking that constantly. I personally raised up many, many political trends, regarding Boogaloo or civil war, purely political themes, and Facebook spent a great deal of energy focusing on those political themes.
Starting point is 00:50:38 Well, we've sort of heard this before from people that have worked at Facebook. I mean, this is not your, you know, it's clearly not spurious what you're reporting or what you've seen. It seems to be a pervasive phenomenon in all aspects of what they're doing there. And I'm sure they, much as Francis Haugen said,
Starting point is 00:50:57 I'm sure they think they're doing the right thing. I don't believe they have evil in their heart. But there's a, I think it was a famous, is it the Stoics, some philosopher said, you know, I forget who said it, but: evil is almost always done by people believing they're doing good. Right. Yeah. And then the road to hell is paved with good intentions, right? So that's another one. Right. Well, I'm going to take
Starting point is 00:51:25 some general questions now. Yeah, I appreciate you stopping by here. We will get the book. We'll read the book. I recommend that you do so as well so we can all stay... It's behind Ryan there. And it's Behind the Mask of Facebook.
Starting point is 00:51:38 It is behind Ryan on the picture there. There it is. Behind the Mask of Facebook: A Whistleblower's Shocking Story of Big Tech Bias and Censorship. And we will certainly keep an eye on, was it Jason? Jason's progress. Jason Fyk, right? Am I getting that right? Yep, Jason. And his story. And I hope it's something we can follow in the press. But Ryan, thank you so much for joining us. And if people want to get more, it's Ryan, R-Y-A-N, Hartwig, H-A-R-T-W-I-G.org.
Starting point is 00:52:07 And we're going to take a little break. I'll be back with some more general calls and comments. Here with my daughter, Paulina, to share an exciting new project. Over the years, we've talked to a ton of young people about what they really want to know about relationships. It's difficult to know who you are and what you want, especially as a teenager. And not everyone has access to an expert in their house like I did. Of course, it wasn't like I was always that receptive to that advice. Right, no kidding. But now we have written the book on consent.
Starting point is 00:52:37 It is called It Doesn't Have to Be Awkward, and it explores relationships, romantic relationships, and sex. It's a great guide for teens, parents, and educators to go beyond the talk and have honest and meaningful conversations. It Doesn't Have to Be Awkward will be on sale September 21st. You can order your book anywhere books are sold: Amazon, Barnes & Noble, Target, and, of course, your independent local bookstore.
Starting point is 00:53:00 Links are available on drdrew.com. So pre-ordering the book will help people, will raise awareness, obviously, and it'll get that conversation going early so more people can notice this and spread the word of positivity about healthy relationships. So if you can, we would love your support by pre-ordering now. Totally. And as we said before, this is a book that both teenagers and their parents should read. Read the book, have the conversation. It doesn't have to be awkward. On sale September 21st.
Starting point is 00:53:28 And we are back. Thank you all for joining us. We're going back to getting calls on Clubhouse. And I'm calling up somebody whose name I can't pronounce, because I don't know, is it Nitty? Am I getting that right? Hey, Dr. Drew. Yes, it's Nitty, like nitty gritty. Fantastic.
Starting point is 00:53:43 When it's on your picture here, I can't tell where your first name ends and your last name begins. So, Nitty, thank you. What's going on? I have a question, actually. I'm a licensed clinical social worker, a therapist in the mental health field myself. And we know that Facebook has had a ton of disinformation from a medical standpoint, particularly in regards to the vaccine. And of course, Facebook has some culpability there and a responsibility to be monitoring the content that's being posted on their platform. But I was curious to hear from your perspective, as somebody who also falls under a governing body. Like, I'm under the American social work board; we all have boards that are governing us. Do you feel like there's a role for the boards that are governing the different medical providers that are putting out this disinformation that is actively causing harm? Just was curious to hear your thoughts.
Starting point is 00:54:45 The boards, the medical boards, have been stepping in when people have been way out of line. The problem with medicine is that people that have strong opinions one way or another usually end up being wrong. And so people that have equally strong opinions that everybody over the age of five should be mandated for vaccines, that's also an extreme position that may have some problems associated with it. We don't know. So for the board to step in, that's a tough putt for them. There has to be a complaint. And then, in medicine, you provide published literature
Starting point is 00:55:28 to substantiate your position. And so Scott Jensen, who's a family practitioner of 25 years, was twice in front of the board for really spurious stuff. And so the problem is it starts bleeding into people that aren't actually espousing outlandish opinions, and people can put anybody in front of the board for anything. And it becomes abusive very, very quickly. So for them to be doing this, I'm not sure. It's hard. I'm glad I'm not sitting on the board of medical quality assurance trying to make those sorts of decisions. What do you think? Yeah. I mean, I think you bring up some really salient points, because I do believe, you know, it can become a slippery slope really quickly, right? Where it becomes like censorship.
Starting point is 00:56:13 But it's not censorship. It's encumbering people's licenses. Their ability to practice their profession is aborted. They can't practice. And they have to spend hundreds of hours building a case in front of the medical board. And usually, physicians have a reason for the opinions they hold. It may be wrong, but if they can substantiate their opinion, then the medical board can't really do anything. It's that person's considered opinion based on 50 citations. And by the same token, if it's an abusive, nonsensical, spurious allegation, the doctor has to go through the same procedure. In the meantime, his or her license can be encumbered. It's a deeply troubling area. It's deeply problematic unless they come up with some way of monitoring something
Starting point is 00:57:08 different than their usual process, which is prone to abuse. That makes sense. Absolutely. Well, and so, you know, something that you had mentioned earlier, just as a follow-up: in regards to, you know, on Facebook, people that were suicidal were kind of pushed aside, right? Resources weren't provided. Yeah. You know, you'd mentioned some automation of that.
Starting point is 00:57:31 Is AI a potential solution, where certain keywords, phrases like "end my life"... People are doing that. Of course. The question is, what do we do with it? I don't have a strong opinion about how you manage it, but when he was saying that we just sidelined it, that did not seem like the right thing to me to do. At least,
Starting point is 00:57:48 at least provide some resources or try to connect. I think, you know, from your standpoint or my standpoint, if we were interacting and something like that came before us,
Starting point is 00:57:57 we would have an obligation to follow it through. Whenever I've been on a phone call or something and somebody is suicidal, I call the local authorities. I do a wellness check. I send the cops in. I keep them on the phone. This just comes with the territory. You have to do that.
Starting point is 00:58:12 And, um, but, uh, I was thinking again about this censorship thing, where I had a strong idea, and now it escaped me again. My stupid brain: my working memory as I've gotten older, to hold something in mind while I'm talking to you about something else, has become challenging. All right, I'm going to think of it after I let you off. It was something to do with the, oh, I know what it is. This is something you wouldn't know: most of what you see on social media that's viral about something somebody said is not what they said. So people will be held accountable for misinformation that is not misinformation. Or it may not be right, but it's not what social media virality claims they said. Things that are actually said and argued are rarely
Starting point is 00:59:06 made viral they rarely move around what what what made something viral is somebody will pull something out of the argument take that piece out of context say here's he is a hateful person or she's trying to look how stupid she is for saying this even though in context that's not at all what she was saying and that then becomes viral and that is what would generate the complaints so we have that problem also which is it's so much of what is misinformation is not even what the original provide original posting had to do with. Wow. I had no idea. That makes complete sense. Almost always viral content is false. It's at least in terms of from its original source. I've seen this happen so many times now, I can't even count. I had my son helping me doing
Starting point is 01:00:01 content for Twitter and he put some things out and he watched what happens in terms of what people said about what he wrote, which wasn't what he wrote. And then what they said about what he wrote became the viral content. He actually stopped helping me. He couldn't stand it. It was too disgusting. It was too scary to the mob behavior is so irrational and so effusive when it gets going. It just is very, very destructive. And it's not. So he quit. Yeah, he quit because it was too painful when he would just raise a question.
Starting point is 01:00:35 Hey, what do you think about this? Just trying to get some engagement. You're an idiot. Don't you know? And then that becomes you're an idiot that you didn't know. That was happening all the time. And then that is what somebody can take and go, hey, I'm going to complain because that doctor didn't
Starting point is 01:00:50 know X. That doctor's competent. That's how it works, everybody. It's a freaking mess out there, which is why I'd like to see. It seems to me a simple solution is if you actually make misinformation, you have to
Starting point is 01:01:05 have some accountability and some liability in having created that. There should be some way that the liability, you incur some liability when you say irresponsible things. What about that Chappelle story about the trans woman? Yeah. Have you seen the Dave Chappelle special? I haven't actually, but I've heard a ton about it. Right. Go watch the special, formulate your opinion about what you think
Starting point is 01:01:29 he's actually saying. He's actually, watch the whole thing because he's building a case. I watched a criticism of him today where he was on a major newspaper outlet and the editorial said, I stopped watching after 20 minutes.
Starting point is 01:01:41 I was disgusted. He builds a, he's making an argument across the entire 90 minutes or whatever it is, two hours. And at the end, he's saying something very, very specific. He's telling you why he went through all this painful material. And if you don't get that, you're not listening, number one. And you're not thinking, you're just reacting emotionally which is hysterical which is histrionic which is a fearful an idiot that's where we've gone these days is into
Starting point is 01:02:10 histrionic traits and what you will notice how far the commentary that's out there is from what he was actually saying just take take a look at that it's a good example of some of this stuff and maybe you'll agree with somebody you know, you'll agree with some of it because he goes to awful places. He just does. But he's a comedian pushing the envelope in order to make a point that he sort of builds towards at the end. And it's challenging. I'll grant you. I understand why people are upset.
Starting point is 01:02:39 But it's not worth it. It's also on brand for him, though. Yeah, yeah. That is Dave Chappelle? I mean, I thought this was particularly interesting because he was saying something. At least I heard something very, very specific and,
Starting point is 01:02:55 uh, not wrong by the way. He's a lot smarter than I thought he was. Oh my God. Well, he's a genius. And that's, that's that.
Starting point is 01:03:02 Okay. Thank you so much for raising some questions. Appreciate it. I didn't think it was smart, but I just, I re he really, he really dug deep and it was, it was intense. Yeah. Yeah. I think that's why I had nightmares last night. Kelly, go ahead there. You did have nightmares last night too. Hi Kelly.
Starting point is 01:03:24 Hi. Can you hear me? I do now. Okay. Awesome. Well, first of all, thank you for having me. My name's Kelly and, um, you were speaking with Ryan earlier and I, you know, I heard Ryan make a statement about what you guys were talking about suicide. And he said, uh, you know, where are the parents and all of this? Oh yeah. I, I sort of, I sort of, I sort of stepped aside from that comment, but go ahead. Hey, I think it's a valid question because I think it's one that I'm seeing repetitively, especially after the Facebook whistleblower or Francis, you know, when she spoke the harmful content, it really should come as no surprise to anybody in here that this was happening.
Starting point is 01:04:06 But anyhow, my son actually died by suicide last year. And what I ran into and what I have been dealing with for the last year and a half has been the biggest nightmare. You would never even believe it. Anyhow, I would like to let you know that I have spoken with Blackburn, Warner, Wicker, Cruz, Cornyn, Fletcher, Hawley. I mean, Fletcher's a state representative over here in Texas where I'm at. And he happened know, he happened to run into a website that, you know, was encouraging it very much so. And I've been for the last year and a half just gathering up victims so, you know, we can make a case somehow. And, you know, what I am seeing on Facebook, like in these suicide prevention groups, because I've been studying those as well,
Starting point is 01:05:04 like who's moderating these? Right. Right. Who's making the decision? What is going on? Yeah, that was my first interest. Are you, are you, have you gone to any patient-centered advocacy groups like the NAMI, National Alliance for Mental Illness? Oh, that was the one first thing I did. Eric Bauer, who helped shut down Backpage, which I'm not sure if you're familiar with that website, but that whole takedown ended up leading to FESTA-SASA, which is federal regulation against encouragement of suicide. Initially, I was very heavy into Section 230, especially if you go into Section 230C, Jason, I know Jason, I'm the one that called him and he's really helped me, you know, kind of along the way, understand like the legal, because I'm just a mom here. I've been swooped into a tech world that's crazy. You know, it's hard to kind of, you know, understand all of this, the way all of that works. You know, I'm noticing right now, and if you guys are following Michael Schallenberger, but he has been building advocacy for parents of children that have died fentanyl overdoses.
Starting point is 01:06:19 And that's the way he's going at homelessness. And it's working. And it seems to me that you could have a similar model here. Parents of children who were materially harmed by social media. Well, yeah. What is Facebook doing with these Facebook groups? I mean, they have all these mental health. If you go in there and you join those groups and you look at what's happening, that
Starting point is 01:06:45 is the worst. They go in there and these people are private messaging them. I mean, people like predatory people are messaging these people taking, I mean, it's, there's no tech. I sat in a Twitter space the other day for two hours and listened to the World Health Organization and two other, you know, very, two other very large organizations. And when the question was asked, what app can I use if I'm feeling depressed? Where can I go if I'm feeling depressed?
Starting point is 01:07:12 It was crickets. Nobody knew anything. And there is a huge gap in technology for people. I mean, here we are. We can see the direct correlation between social media, I mean, internet usage really as a whole, and depression and anxiety. And there's this huge neglect for the people that are becoming victims. By the way, this isn't just children. This is also adults.
Starting point is 01:07:36 Yeah, we will look upon all this the way we look at tobacco now. There's no doubt in my mind. This will be the tobacco of the future. But I don't know what we're going to have to go through to get this thing tiger by the tail. I just keep thinking that people need to be held accountable. Well, they do need, and I am actively searching for the people that are like, you know, behind this little shenanigan on my, on over here. But I mean, I'm just speaking more in general terms and not so
Starting point is 01:08:05 much specific to my situation because, you know, my son had just turned 18. I mean, he, you know, he, and, and, you know, people say, well, where's the parents? Well, have you tried taking the phone away from a teenager? I mean, there's, that's especially, yeah, no, listen, I would, that was, I know you're going there, Kelly. And then this was, first of all, I am so sorry you're going through this, but I'm so grateful that you're taking it as your challenge, as your crucible to carry, because your son will, if you keep going, this will not have happened in vain. It will give you a source of meaning and saving others. It is an amazing thing you are doing. As it pertains to taking the phone away
Starting point is 01:08:46 i have colleagues that work in this field that the they're literally if you if you want to know a good website it's uh dcakids.org digital citizen academy dcakids.org and she has lots of resources there for helping schools and people about dealing with, with the digital safety. And in her family, she has two teenage kids. Her kids are given the phone and have been since they were kids one hour a day, one hour a day. And that's it.
Starting point is 01:09:13 And that's highly monitored. And you have to start that. It's hard. That is a, it's only, you have to have seen catastrophes to, to have the motivation to, to fight those fights.
Starting point is 01:09:23 The average parent just doesn't have that. It is impossible to get the phone away from them. I understand that. And that's what I wanted to tell Ryan, but I thought intuitively, I'm not sure he has kids. I'm not sure there's a good discussion for right now, but, uh, well, I think it's, I'm glad you said that because I think that a lot of, I mean, it, because it needs to be talked about, you know?
Starting point is 01:09:43 And so I'm really, it has to start when they're younger though. I think that's right. And, and, you know, like, right. Um, and it, it, the, the lion is out of the cage. Like when our kids were little, like I, I was the monitor for all the parents because I could use a computer and I, I had, I was friends with all my kids' friends and I would keep an eye on everybody and, you know, they knew it. I will never forget all my kids' friends. And I would keep an eye on everybody. And, you know, they knew it.
Starting point is 01:10:07 I will never forget. I will never forget. I was seventh grade. And I raised my hand at a parent's meeting. I said, how long? Because our kids were on E-bombs World and Snapchat. Or not, MySpace and things. And I said, how long do you think is healthier for a child to be on a computer? And their response was,
Starting point is 01:10:25 that's a very personal decision. Every family needs to make that individually. And I thought, you're telling me how many hours to sleep, how many hours of homework, how many hours, how many minutes of television. And this is the one thing I don't know how long to monitor. And that one's a personal decision. I will never forget it. It was the worst. I hope I, and it was somebody I really respected that said that to me too. And I hope she feels to this day guilty about that because they should never have sidestepped that. We had an opportunity to really go at it when this is now,
Starting point is 01:10:53 that would have been like 2005. Half the parents didn't even know how to use a computer. Susan was the only one. So she was the hall monitor for the entire class. I was. Well, and another thing is it makes it really hard. Cause I've spoken with,
Starting point is 01:11:06 you know, this one guy that's running bark, which is a, you know, an organization that helps parents monitor their, their children's internet use to see, you know, what,
Starting point is 01:11:15 what they're using. And these, these kids, they know all about VPN. Oh yeah. I mean, it's a lot, it's a lot harder to understand.
Starting point is 01:11:24 They probably would be the onion router and they're on the phone. It's like the seventh grade math. Well, he told me that it's so hard because Google and everybody makes it so hard for them to even help the parents. Like if a child goes to create a Google account and they can see that a parent's trying to access it. He's had heat. There's such a way, like a lot of them where Google will go email the child and say, just to let you know, you know, you're in it.
Starting point is 01:11:51 Like, really? Oh my God. Oh my God. That's wild. Well, there was, there was a lot of monitor back then.
Starting point is 01:12:01 No, no, listen, I, if you, no, this is intense. Yeah.
Starting point is 01:12:04 I was going to say, send Susan an email, identify yourself at contact at drdrew.com so you guys can kind of link up. And if there's stuff we can do to help you, please call on me. Yeah, and also Lisa... Lisa Stroman, who I'm doing a little presentation with her in about a week. That's the woman I was talking about at Digital... What was it? DCAKids.org, Digital Citizens Academy. Okay. But do hook up with us. I see a potential in your cause because it's these narratives that people hook on to that makes the American public go, enough, enough already. I can't stand it. I can't hear about this.
Starting point is 01:12:39 It's too painful. We have to stop this. Well, I contacted Google after this happened, right? I contacted, and lo and behold, there's no way to even contact Google. You can't report the page or anything. So me being me, I'm on Twitter trying to find out, well, who's who and all of that. And, you know, Jason will tell you, I've definitely gotten myself involved with the higher ups. I mean, one of them being AFSP and I'm not going to say his name, but in DC, I told him what was going ups. I mean, one of them being AFSP, and I'm not going to say his
Starting point is 01:13:05 name, but in DC, I told him what was going on. I spoke with him for an hour and he would promise me that he was going to help me. He knows all the senators. He knows this, he knows that. I was begging him. I was like, please, I need a break here. I can't do all of this by myself because I was explaining to him what was going on. You never heard on. You need to build a coalition. And there are now also parents that have lost kids to drug dealing on Snapchat. So that's been going. I know. And that's the other way.
Starting point is 01:13:35 Prostitution and everything else. Oh, everything. Oh, yeah. Yeah, for sure. What? Yeah. Oh, my God. Oh, yeah.
Starting point is 01:13:42 I mean, it's rampant. I mean, all you got to do is go to Twitter, type in a certain hashtag. You'll see children advertising themselves on there to go and direct them. I mean, it's everywhere. This is everywhere. Even before this happened with my son, I was very concerned. I was very concerned with the social comparison that he was doing. I kept telling him,
Starting point is 01:14:05 Junior, this is not real. You're looking at things that aren't real. You cannot, that's not the way life is. Well, now look, now look at what we've created. And, you know, I'll just, I'll stop there. But I just feel like it's really important for mental health specialists. Like we need them to really step up and, and look at what's going on. And, and, and, and we need to have places,
Starting point is 01:14:32 you know, I was listening to the global safety head of safety for Facebook. I watch all of these hearings, by the way. Um, and I was listening to her tell, I forgot what Senator it was that when people are suicidal, that it will redirect them and they can talk to a live person. That is simply not true.
Starting point is 01:14:53 I've been doing this for the last year and a half, day and night. I've lost my job, you name it, doing this. And that's not what happens. I have never seen that. So it's, I don't know, but we need to get more involved as just in the whole advocacy and making sure that these people that need help, they get it.
Starting point is 01:15:14 I don't know what to do. I've listened, Dr. Drew. And there's, believe me, there's layers and layers to this. Where do we get the help? How do we provide the help? How do we fund the help? There's a lot of stuff, a lot of layers to this. Where do we get the help? How do we provide the help? How do we fund the help? There's a lot of stuff,
Starting point is 01:15:26 a lot of layers to this. There's a lot of red tape to even moderate this stuff. And it's like, so I do agree with you 100%, 1,000% that suicidal people do need a place to talk, but it needs to be safe.
Starting point is 01:15:42 They need to invest some technology and to like, when they go into these rooms, they have to be moderated by a professional and they need to automatically go into anonymous mode when they're in there. So they can't be contacted. I mean, I have a whole slew of ideas, but anyways, I do, I definitely will contact,
Starting point is 01:15:58 you said contact at drdrew.com. Yeah, contact. And then the website, I just looked it up. It's not DCA Kids anymore. Now it's digitalcitizensacadrew.com. And then the website, I just looked it up. It's not DCA kids anymore. Now it's digitalcitizensacademy.org. Digital citizen. Oh, no, singular, singular, digitalcitizenacademy.org.
Starting point is 01:16:13 All right. Okay. Thank you so much. All right. Listen, we'll stay in touch and good luck with all this. It's somewhere. I need a shorter name.
Starting point is 01:16:20 That's hard to remember. Digital Citizens Academy. Is it? No, it's not citizen. Is it singular? Oh, Digital Citizen Academy. All right, Kelly, I put you back in the audience. Thank you. And I'm going to end the Clubhouse room right now. Thank you all for having joined us here and for listening and being so attentive. Great questions. Good participation. Very interesting material today. Thank you so much. We'll be back tomorrow with
Starting point is 01:16:46 Mark Robert, who is a journalist, a longstanding with a lot of interesting ideas. We're going to pick his brain a little bit. And I believe we'll be back in the clubhouse a week from tomorrow. Susan, is that about right? We don't know the guest yet, but it'll probably be Wednesday. We'll be Tuesday next week. We'll do Ask Dr. Drew. Oh, Tuesday next week also. We moved all our asks to Tuesdays so that we could do it every week at the same time. Okay. Because we're traveling a bunch this coming up.
Starting point is 01:17:10 So that was the thing. Yeah, a lot. We're traveling a lot. A lot. So we're ending that room. Thank you so much. We love you, but we have a lot of work to do. Okay.
Starting point is 01:17:19 In terms of, but we will get the streams in. I talked to Ken today and he told me we are obliged to three a week. And it's very important we do that. No. Three. Two. Three. Three.
Starting point is 01:17:29 Three right now. We'll make it up. Don't worry. All right. Fair enough. So on the restream, I'm watching you guys. And thank you, Caleb, for producing today's show and Ask Doctor Who. We appreciate it very, very much.
Starting point is 01:17:39 Hey, Hitler Toms out there. I see you. And pay no attention to the Hitler references. We're going to go see Christina P and the Dove in Austin. There's going to be a reunion on Thursday. Thursday. We're leaving. Look out for that on Dr. After Dark,
Starting point is 01:17:53 your mom's house. We'll see where that ends up. More shenanigans. After I get over my COVID booster shot. I was so glad you did it on Monday because I was afraid you'd be dragging. I'm a little tired. I'm a little, I forgot to push record. So now Caleb has to do more work for the podcast.
Starting point is 01:18:14 But, and if you want to listen to this show on podcast form, you can find it on iTunes or, you know, wherever you listen to your podcast. Ask Dr. Drew. Fair enough, Caleb. Share if you care. We love you. How's the baby, Caleb? Oh, he's doing so well. He slept a full night. Can you show us a picture? Oh, he's doing so well. He slept a full night.
Starting point is 01:18:25 Can you show us a picture? Oh, snap. Do I have one ready? I don't know if I have one ready. Do you have anything to say about what we just heard, by the way? Because you're a digital wizard. And anything occurred to you about the conversation we just had? Now that you have a child.
Starting point is 01:18:41 It's such a struggle. Because it's like, I know my opinion is going to change based on the technology that is available whenever my child gets older and starts using it. I feel like it's going to be VR stuff and it's going to be 10 times as dangerous as what they have now. But I get stuck between where it's like,
Starting point is 01:18:58 I want everyone to have total freedom of speech, but also some of that speech turns out to be so dangerous and I don't know where the solution is i try to find a solution but i can't why not why not put the liability on the people who say it like we like everywhere else in the world you know everything we do you're you're you're held accountable for things you say that hurt other people why can't we do that on the internet it would seem that that would make sense that would seem like it would be the uh the thing that would work but it's uh yeah everybody has different opinions on what's good and what's not everybody's opinion is so different that it's like well that seems to be
Starting point is 01:19:33 what facebook is trying to do they're trying to moderate things based on i don't think they're trying to do anything harmful i think they're just trying to moderate stuff based on what they believe but they believe things differently than all the other half of people that are out there posting the stuff so it's it's it's it turns into a slippery slope but then do you want to just allow people to say things free for all any there's always platforms it's like i think for example it's like you know places like 4chan where it's it's these internet forums that they start out and it's complete free for all wild wild west and even in these dark corners of the internet some people go too far for them and then they have to start adding rules and the moment they add one rule it's no longer the wild west
Starting point is 01:20:15 it's like youtube used youtube used to be the wild west it was free free speech everybody you came from television and radio over to youtube because you could say things on youtube that you couldn't say anywhere else now youtube has become more more, more legitimate. They've tried to set all their rules and now you have to hop over to other platforms because YouTube is now, it's just going to, everybody keeps on having to hop further and further out to the wild west. But, but I would argue the wild west was not the wild west forever for a reason. Right. You know what i mean yeah i mean that's what happens uncivilized wild west is synonymous for uncivilized you know the
Starting point is 01:20:52 uncivilized behavior not a good thing not generally a good thing i'd say so right so it's as i get older the more i run youtube i'm fine yeah yeah his dog's barking he's in the cage but um he's been there for a while it's not like he was out here he comes he is not in the cage so i guess he got uh all right we'll wrap this thing up we thank you by the way everybody i had the third vaccine and i feel fine i was just a little sleepy at about two o'clock and i needed a nap yeah she's doing she's actually doing better with the third vaccine than with the second. It's the same. It's not different. The same, which I feel like shit like this all the time.
Starting point is 01:21:28 So it doesn't matter to me. I'm good. I'm just glad I got it over with. Fair enough. Me too. And I had today to just chill. Okay, guys,
Starting point is 01:21:35 we will see you tomorrow at two o'clock, three o'clock, two o'clock or three o'clock. Oh, tomorrow's three o'clock, three o'clock. See you tomorrow. Three.
Starting point is 01:21:44 Ask Dr. drew is produced by caleb nation and susan pinsky as a reminder the discussions here are not a substitute for medical care diagnosis or treatment this show is intended for educational and informational purposes only i am a licensed physician but i am not a replacement for your personal doctor and i am not practicing medicine here always remember that our understanding of medicine and science is constantly evolving. Though my opinion is based on the information that is available to me today, some of the contents of this show could be outdated in the future. Be sure to check with trusted resources in case any of the information has been updated since this was published. If you or someone you know is in immediate danger, don't call me. Call 911.
Starting point is 01:22:23 If you're feeling hopeless or suicidal, call the National Suicide Prevention Lifeline at 800-273-8255. You can find more of my recommended organizations and helpful resources at drdrew.com slash help.
