The Dr. Hyman Show - How Your Free Will And Data Are Being Hacked By Micro-Targeting Of Your Personality with Andy Russell

Episode Date: March 11, 2020

When I was growing up we had news sources we could trust. Under The Fairness Doctrine, news on the radio, TV, and in print was impartial and fair. But in 1987 during the Reagan administration, this was repealed, leading to alternative media and more misinformation than ever before. And now with the internet, we are truly in the wild west of sharing information, collecting data, and trying to maintain privacy. Big companies take advantage of that from all angles with their marketing, which is now ultra-targeted (and often in ways you wouldn’t even be able to recognize).  Today on The Doctor’s Farmacy, I sit down with Andy Russell to talk about what data really means, who has yours, how they got it, and how they use it. Andy is a digital media, ad-tech, marketing-tech, and data science innovator and pioneer as well as a self-taught Behavioral Economist. He has invested in, incubated, or run over 50 technology companies, including Daily Candy, Thrillist, Tasting Table, Idealbite, PureWow, Zynga, Betaworks, Business Insider, Sailthru, RapLeaf and LiveRamp, SpongeCell, AdRoll, and Bounce Exchange. He is the Founder and Chairman of Trigger Media, InsideHook, and Fevo.  This episode is brought to you by ButcherBox and Joovv. I recently discovered Joovv, a red light therapy device. Red light therapy is a super gentle non-invasive treatment where a device with medical-grade LEDs delivers concentrated light to your skin. It actually helps your cells produce collagen so it improves skin tone and complexion, diminishes signs of aging like wrinkles, and speeds the healing of wounds and scars. To check out the Joovv products for yourself head over to joovv.com/farmacy. Once you’re there, you’ll see a special bonus the Joovv team is giving away to my listeners. Use the code FARMACY at checkout.  Now through March 29, 2020, new subscribers to ButcherBox will receive ground beef for life. 
When you sign up today, ButcherBox will send you 2 lbs of 100% grass-fed, grass-finished beef free in every box for the life of your subscription. Plus listeners will get an additional $20 off their first box. All you have to do is head over to ButcherBox.com/farmacy. Here are more of the details from our interview:

How much do sites like Facebook and Google watch our online actions and what are they doing with the information? (8:30)
Why we have an unregulated internet (14:14)
What does data really mean, who has yours, and how did they get it? (18:26)
The repeal of The Fairness Doctrine and the regulation that alleviates digital platforms from responsibility for third-party content posted on their sites (24:42)
The sale of your information by “Big Data” to Facebook and Google (27:27)
How a computer or database can “know you better than you know yourself” (34:04)
The current state of our personal data and political digital advertising (39:50)
What we can learn from the Rwandan genocide (1:04:51)
What you can do to know whether you’re being falsely advertised to (1:09:51)
Should Facebook and Google get rid of political ads? (1:17:55)

Transcript
Discussion (0)
Starting point is 00:00:00 Coming up on this week's episode of The Doctor's Pharmacy. What we need to do is be a lot more discerning about the stuff we take as gospel that we read, that we see on the internet, and realize that for right now, for this moment in time, nobody's regulating it. Nobody's saying whether it's truthful or false. Hey guys, it's Dr. Hyman.
Starting point is 00:00:24 I'm willing to bet that you're like me and that one of the first signs of aging you noticed was on your skin. Now those fine lines and wrinkles and uneven spots start to creep in and we can see an overall decline in the texture and firmness we used to have in our skin. And you'll notice it takes longer for wounds to heal too and that scars might be more noticeable. It's actually more than just appearance. It's a sign of the aging processes happening below the surface of the skin in our cells. And one big reason for that is that as we age, we produce less collagen. Collagen is the most abundant protein in the body and it's really like the glue that holds everything together.
Starting point is 00:01:01 So supporting collagen production can make a huge difference for helping our skin and the rest of our body stay young. So I always love checking out the latest science for anti-aging and healthy aging. And I recently discovered a new device called Juve. It's a red light therapy device. Red light therapy is a super gentle, non-invasive treatment where a device with medical grade LED lights delivers concentrated light to your skin
Starting point is 00:01:25 it actually helps your cells produce collagen so it improves skin tone and complexion it diminishes the sign of aging like wrinkles and speeds the healing of wounds and scars and it couldn't be easier i just stand in front of this relaxing red light for a few minutes a day and that's it not only is juve great for your skin but it can also help you recover from injuries, aches and pains, and help you sleep better. To check out Juve products for yourself, head over to juve.com forward slash pharmacy. That's J-O-O-V-E dot com forward slash pharmacy. That's F-A-R-M-A-C-Y. And once you're there, you'll see a special bonus.
Starting point is 00:02:01 The Juve team is giving away to my listeners. Just use the code pharmacy, F-A-r-m-e-c-y to check out this is one of the simplest tools you can use to fight aging and to boost collagen production that's available to use right in your own home and i hope you check it out to see why i love using juve with all the factory farm animal products at the supermarket it's hard to find meat that actually supports your health goals instead of hurting them feedlot cows are pumped full of antibiotics and hormones and they're fed corn that makes the meat not so healthy and that is why i love grass-fed grass-finished ground beef from butcher box ground beef is probably the most dynamic protein out there i love how quickly it makes up a healthy meal without a lot of fuss
Starting point is 00:02:40 or muss grass-fed meat is also better for the environment so you can feel better eating it in fact grass-fed and grass-finished beef actually helps put carbon back into the soil reversing climate change and with butcher box i get 100 grass-fed and grass-finished beef delivered right to my doorstep they also offer wild-caught salmon which i love including in my weekly meal plans too and grass-fed ground beef is the first protein I recommend for people who are trying to get more comfortable in the kitchen because you can just throw it in a pan
Starting point is 00:03:10 with some salt, herbs, and spices, and it just makes a great meal. My favorite way to cook grass-fed ground beef is by sauteing it with lots of fresh garlic and onions and peppers and cumin and chili oil and oregano and tossing it over a bed of greens for a super easy homemade taco salad. Just talking about this makes me hungry. So if you've been looking for a way to
Starting point is 00:03:30 get higher quality protein in your own diet, be sure to check out the grass-fed beef from Butcher Box along with all their other humanely raised meats that are never given antibiotics or added hormones. They make eating well easy, delicious, and accessible. And now, through March 31st, 2020, new subscribers to ButcherBox will receive ground beef for life. That's right. Sign up today and ButcherBox will send you two pounds of 100% grass-fed, grass-finished beef free in every box for the life of your subscription. Plus, listeners will get an additional $20 off their first box. So to receive this limited time offer, go to ButcherBox.com forward slash pharmacy. F-A-R-M-A-C-Y.
Starting point is 00:04:13 That's ButcherBox.com forward slash pharmacy. Now let's dive into today's show, the next episode of The Doctor's Pharmacy. Hi, everyone. Just wanted to let you know that this episode contains some colorful language. So if you're listening with kids, you might want to save this episode for later. Welcome to the Doctor's Pharmacy. I'm Dr. Mark Hyman. That's pharmacy with an F-A-R-M-A-C-Y, a place for conversations that matter. And if you care about digital privacy and your data, this is a conversation you should listen to because though you may think you're making
Starting point is 00:04:42 choices on your own, you may not be, and your free will is being usurped in ways that are invisible, insidious, and nefarious. And the guest we have today is the guy to talk about it because he was involved at Ground Zero, Andy Russell. And Andy, I met at a party a number of weeks ago, and we had a brief conversation that just shocked me, and I wanted everybody to hear this. Even though it's not about health and medicine or food it is about our mental health and it is about our brain how it works and how it's being undermined by various activities that are happening across the technology world that
Starting point is 00:05:16 you may not be aware of and I certainly think of myself as well informed but I was not really aware of the depth of the use of our personal data against us to make us do things that we think are our choices but may not be. So Andy is a digital media ad tech, marketing tech, data science innovator, as well as a pioneer and self-taught behavioral economist. He went to Cornell, as I did, and he took the same psychology class. He went further with it and veered off into medicine. He's incubated and run over 50 technology companies, including many you might have heard of, including Daily Candy, Thrillist, Tasting Table, Ideal Bite, Pure Wow Zinga, I heard of that,
Starting point is 00:05:56 Betaworks, Business Insider, and on and on and on. He also was a founding partner with Bob Pittman, who is an incredible brand guy you might have heard about in the private equity firm Pilot Group. And he's co-founder of one of New York's best hotspots, a restaurant lounge, Moomba, which he did when he was 26, which is impressive. I don't know what I was doing when I was 26. I was in medical school. And he's the chairman of Treat House.
Starting point is 00:06:18 He also does a lot of philanthropic stuff, working on the board of Mount Sinai, Department of Children Analysts and Psychiatry. He's on the advisory council for the New York Stem Cell Research Foundation and founding chairman of BUILD New York and has been a keynote speaker at the Prentice School that specializes in teaching students with dyslexia. And he's a recipient of Network's 2020 Global Leadership Award, on and on and on. So the fact that you went to Cornell is awesome because I went there and you graduated in abnormal psychology. I thought that was a joke in your bio, but it's actually a thing. We'll talk about what that means. And you also went to Columbia Business School. So you're a smart dude. And the reason I invited you on this podcast was to talk about what's happening underneath the surface of our smartphones and our web browsers
Starting point is 00:07:04 that people just aren't aware of. Everybody hears about data privacy. We hear about our 50 million emails. I mean, 50 million Facebook profiles being stolen by Cambridge Analytica that's used against us. And we hear about all these hearings in the Senate and Congress from Mark Zuckerberg and Google. And yet most of us don't probably understand it very well, including me. So I invited you here to educate me and us about what's actually happening in this technology landscape. And the conversation that got me really intrigued was you said that this technology, which is sort of agnostic, it'd be used for good or bad, to target people using digital marketing and advertising was being used against us in ways
Starting point is 00:07:47 that we weren't aware of for political purposes and that were also being used in a one-sided way by Republicans and the Democrats you hinted weren't up to speed on what they should be up to speed on to have an even playing field. Whether you're Democrat or Republican doesn't matter. But if one political party has a technology, sort of like a smart bomb, that the other group doesn't, we're in trouble, right? Because then it's not a fair democracy, right? It's not a fair democracy. And that's what worries me. And I want to sort of get into to sort what first let's start off with sort of the the high level um you know what happens when
Starting point is 00:08:32 we go on our phones and our websites and we're browsing around and doing stuff like how much is google and facebook and all these apps watching and what are they doing with all this data they collect? Like, tell us about how this works at sort of start at 30,000 foot level because I don't think people understand. Sure. So basically imagine your worst nightmare of being watched.
Starting point is 00:08:59 Every action you do, every feeling you have, any emotion you might have, not just the things you purchase, but deep into your soul, your insecurities, are there ways of pulling those out of you? So using your worst traits against you? Well, so it's just technology, right? It's just tech. So there is a parallel universe where the internet knows so much about you that the internet knows the things that you want, should want, will make you happier, have a more joyous life and have less frustrations. There's a parallel universe.
Starting point is 00:09:50 That sounds good. Yeah, sounds amazing. And this is what we were trying to build. Like all of us who've been part of building the infrastructure of the internet and digital technologies. Imagine a world where companies create services and products that are very valuable to segments of the population, and segments of the population have needs, desires, and wants, and there's a connectivity or a curation where it's easy for those consumers to find those things that they want, making their life better and easier.
Starting point is 00:10:29 And it's easier for companies to find those consumers, making their businesses grow with less cost, less inefficiencies. That's what we are trying to build, like really trying to build. Yeah. like really trying to build yeah so if I create my broken brain docu-series and I want to share with people on you Facebook ads I can actually get it to people who might be suffering and benefit from it that's a good not bad we do that I do that I mean 100% like you actually provide real value to real people and the real people out there who maybe don't know that you exist but wow once they found out that you exist, their lives would be better.
Starting point is 00:11:06 Yeah, okay. That's fantastic. Now, the flip side of that is we do live in a capitalist society, and I'm fine with capitalism. As a venture capitalist, you'd imagine you would be. Yeah, and I believe in hard work, right?
Starting point is 00:11:26 And competition. And that hard work and competition and capitalism drives some innovation, right? But when greed and power overstep their bounds, the same technologies that could be used for very good purposes, satisfying needs of people, lowering their anxieties, making them overall happier, less angry, less frustrated, less hate. The same technologies can tear us apart and make us paranoid of those around us. And we've seen it. So this isn't a new thing with sociology or anthropology. You see it all throughout history.
Starting point is 00:12:15 Yeah, I mean, you have demagogues. I mean, you can activate people like Hitler did. They had propaganda films. They had more crude technologies. But now... This is just propaganda on steroids. Yeah. And opposed to a single message
Starting point is 00:12:30 going out to a greater population, it's going directly to you because in your news feed or the ads that you see or the content that you see, whether you put a search into Google or you're flipping through your Facebook feed, your Instagram feed, your Snapchat feed, anything, it's personalized to you. Now, why is it personalized to you? Well, obviously, those big tech companies know a lot about you. Now, in a perfect world, that's great. You have a personal curation service
Starting point is 00:13:05 of valuable information to you. But in an imperfect world, which is what we obviously live in, it turns into manipulation. And it can get so dangerous, and it already has, the way it turns into almost hypnotism. Right, so here's the thing. Advertising, when there's an ad on TV,
Starting point is 00:13:28 or there's a radio ad, or there's an ad on a podcast, you know you're being sold to. It's pretty obvious. 100%. This is news masquerading as, well, this is ads masquerading as news, right? This is insidious ways of communicating that you don't know you're being targeted.
Starting point is 00:13:46 But it's not only ads. It's content. It's content. Yeah. It's that news article that seems like it's an authentic news article. 100% because anybody can put up a website and can put up a Facebook page or can put up a Twitter handle or a LinkedIn page or an Instagram page and all of a sudden can put a name to it that sounds somewhat official
Starting point is 00:14:11 and can start making up their own news. And this has sort of become completely unregulated because we didn't really understand that this was going to be a problem. There's been zero time to regulate it. It's happened so fast that it hasn't been, people don't even know what's going on, right? Not a clue. And what's very upsetting, at least to me,
Starting point is 00:14:35 seeing I've been in this industry for over 20 years, specifically in this industry. Digital ads and marketing. Digital ads, targeting. Yeah. Persuasionasion but more around advertising to get people to buy certain things um uh the laws uh that were written um for television and for print and for radio um don't uh apply uh to the laws that are written for the internet. So we're completely in the wild, wild west. And to make matters worse, people who make those laws
Starting point is 00:15:14 and who have policy are not digital natives in the slightest. In other words, the 60-year-old lawmakers didn't grow up with technology and don't have a clue how the algorithms work right and uh what you know zuckerberg's uh senate hearing what was it like two years ago yeah um being somebody who like literally i go hands on keyboards i look at the dashboards um i know exactly where uh everybody is what they're doing, how to target, da-da-da. It was painful to me to watch the senators ask questions because it was so blatant that they were just completely clueless.
Starting point is 00:15:56 Now, I don't blame them for being clueless. That's not what they're experts at. It's kind of like having regulators looking at astrophysics or satellites or rockets around the technology behind it. Or the mechanics of a car. I can't build a car. I have no idea how to go in there and make a car work. And to make things worse, a lot of the journalists who cover the space aren't real experts in the tactical aspects of how information flows, how data flows. And so they get stuck using big words. I'm sure everybody listening to this has heard the term big data lots of times. Yeah.
Starting point is 00:16:54 What does that mean? Do they really know what it means? Yeah. Algorithm. Okay, cool. But what does it really mean? Machine learning. Interesting word.
Starting point is 00:17:02 But what does it mean? AI. Artificial intelligence. Ooh, sexy word. Cool. what does it mean ai artificial intelligence oh sexy word cool what does it mean yeah tell us what do these things mean because i you know i i i'm hearing things and i don't know if they're accurate or not but i mean i heard one woman give a talk and she said you know there's up to three billion data points on every person that that these companies collect all of our activities, where we shop, where we go. They have location tracking on our phone.
Starting point is 00:17:28 They know where we are. There's geo-targeting. They can literally ping people's phones if they're at a rally and be able to then target them later. You're on your phone almost and you're talking to a friend about something and then some ad for that thing you're talking about
Starting point is 00:17:40 pops up on your phone. Are they listening? Is the iPhone listening? Are Google and Facebook selling your data to Are they listening? Is the iPhone listening? You know, are Google and Facebook selling your data to third parties that are using it against us? Are third parties selling it back to Google and Facebook? Like, what's happening?
Starting point is 00:17:52 I read the other day that when you go to WebMD, WebMD sells the data on what you're searching on if you have heartburn or you have cancer or you have constipation or a headache. It sells that data back to Google and Facebook, which then can use it to target you. And that just seems like a real invasion of privacy. All right, so let's break some myths here.
Starting point is 00:18:12 Okay, go. Hardcore just breaks a myth. So number one... By the way, I'm a doctor. I don't know anything about technology, so I'm here with everybody listening, trying to figure this out. So that's why I brought Andy.
Starting point is 00:18:24 So let's have fun. What's data literally oh my god they have all my data oh we should have data rights and data privacy like what kind of data are you talking about well they search for a vacation in mongolia that i wanted to buy a you know new iphone that whatever you know like. They know that. Okay, so let me just break it down for you. All right. So you know how every morning or so you get a piece of maybe two or three or four pieces of direct mail, snail mail, the old-fashioned stuff
Starting point is 00:18:58 in your mailbox? All right. The ones I don't open and throw in recycling? Yeah. So there are 10,000 direct mail campaigns, different campaigns that happen every day in this country. I'm assuming you don't get 10,000 pieces of direct mail. Huh. You ever wonder why? Because they know shit about me.
Starting point is 00:19:23 Because they know a lot of shit about you. Yeah. Like a lot, right? So for, since the birth of the credit card and even before then, okay, they're big data companies. Companies such as Experian, Epsilon, Oracle, Alliantiant lots of them okay and what they've done for decades is against you as a persona profile right create a profile on you so how the hell does some company which is just like a company with a name like Epsilon. They buy. They buy what?
Starting point is 00:20:05 Your data. From who? From all the credit card companies. Yeah. So every credit card company sells companies like Axiom, Experian, Epsilon. Information on every purchase you've ever made with those credit cards. And this has been going on for decades. Decades. purchase you've ever made up with those credit cards and this has been going on for decades decades like my wife buys all these cat toys and cat things and cat exercise wheels and so she gets
Starting point is 00:20:32 targeted all the time with cat stuff no but let's come back to the old-fashioned stuff free internet yeah because this is not like a new phenomenon yeah it's just more so uh who else sells uh this information to the big data companies? The banks. Like you have your money in a bank, right? So all that information. Your tax records. So how much money you make.
Starting point is 00:20:56 How much taxes. The IRS sells your tax data? No. All of it is sold to the big data companies. All of it. How do the governments sell your tax return data? Most of it's put on public record. That stuff.
Starting point is 00:21:10 So how come we don't have Trump's tax returns then? Good question. That I can't answer. I can't answer that. But what airlines you fly, that you went to Cornell, that you're a doctor, every address you've ever lived at. And then what they do, because they've got hundreds of thousands of individuals that they have that level of detailed data on. So that's the kind of data we're talking about for offline stuff yeah right then you can append to it like oh what television shows are watching your home that's good right
Starting point is 00:21:53 what kind of car you drive yeah how long you've driven the car might you now be in the market for a new car yeah because we know that you have whatever kind of car and you've been driving releases up soon and you've you've only bought tires like kind of car and you've been driving for X amount of time. Your lease is up soon. And you've only bought tires like six years ago and you had your oil change. I mean, they know everything. Yeah. So then the data companies take all these personas, people, individuals, human beings, and all this, let's stop calling it data, information about these people. Yes, okay.
Starting point is 00:22:28 And run what's called predictive models against these people. And say that there's like 50,000 people who have like 60% of the same purchases that you've made for the past two years. And then of those people, the next purchase they made they went on to get a Amex Platinum card yeah but you haven't yet gotten your Amex Platinum card yeah so if you model all this purchase behavior off of
Starting point is 00:22:56 all these people who have similar purchase upbringing educational income levels as you huh now it's worth sending you a piece of direct mail offering you the opportunity to get a platinum card. And by the way that piece of mail cost 72 cents. Yeah, it's a lot. It's a fortune to get to you and the estimate of the industry is that only you're like oh they throw it away, I throw it away, throw it away. 2% of the industry is that only you're like, oh, they throw it away, I throw it away, I throw it away. 2% of the time, you open it. 2%.
Starting point is 00:23:29 So if it's 2% of the time and it costs 72 cents, it means to get you to open that is $35. Yeah. Holy shit, that's a lot of money. Yeah. Per person. Per person. So now you can extrapolate, huh,
Starting point is 00:23:46 if I have a lot of information on someone a lot and i compare it to a lot of information on lots of other people and therefore i can predict what your next purchase might be yeah therefore it's worth 35 for me to get that piece of mail in your hands and for you to open it up. Yeah. And the number of... And this is all analog. You're not even talking digital yet. I'm just talking analog because this is the birth of the whole thing. Yeah.
Starting point is 00:24:14 So they know how to do this. They've been doing it for years. And then the digital revolution comes along and all of a sudden... Everybody freaks. It's like a million X on that. Everybody freaks. Oh my God, my my data my personal data holy
Starting point is 00:24:25 cow people know stuff about me really well they did but it can't but but you know you weren't getting you know news feeds that were fiction correct right you weren't getting you know some fake newspaper in your box where it got really bad and i And I just want to back up a little bit because most people don't realize that there was sort of a pivotal moment in the media with the repeal of the Fairness Doctrine. And the Fairness Doctrine was a regulation law that basically mandated that all news was fair and was, you know, honest and equal. And that's when you had, you know, Walter Cronkite, Dan Rather, Tom Brokaw, Peter Jennings, you know, ABC, CBS, NBC. That was it. It wasn't CNN.
Starting point is 00:25:14 It wasn't anything. Right. And I remember those days because I'm that old. I remember those days. And, you know, you could trust it. You know, they were trustworthy. It was fair and partial. That got repealed under Ronald Reagan.
Starting point is 00:25:25 And that led to the birth of alternative media and Fox News and other channels of distribution information so that all of a sudden now when you read the news, it may not be news. I mean, I used to live in China and it was a totalitarian state where the People's Daily was fiction. Yeah.
Starting point is 00:25:46 You know, it was fiction. And what the people mostly did with it was wipe their butt because they didn't have toilet paper with it, which is probably the best use for the People's Daily. So I think that's actually happening now, but we don't know it. But that's also... People knew that the People's Daily was fiction. You know, Chinese citizens understood that it was propaganda but we don't know that yeah anymore so i'm forgetting the date it was either 1995 or 1996 there was an article put into i think the communications act
Starting point is 00:26:14 that said that uh any digital platform that digital platform uh that has content on it that is provided by a third party. The platform itself is not responsible for the content that is on the platform. So Facebook puts up a jihadist post and it's not responsible for that. Yes. And by the way, it's like a 16 word article in this document. That's it. And that's the loophole that protects Facebook, Google, or any digital publisher that takes third-party content. That's it. And if it weren't for that one little article in that document,
Starting point is 00:27:03 Facebook would be responsible for its own content. What's really concerning, though, it's not that third parties put stuff on Google and Facebook and other digital platforms. It's that then those platforms, Google and Facebook and others, sell our data to third parties. Oh, it's worse than that. To other groups that may use it for good or may use it for bad. Do not really. No? not really. No? Not really.
Starting point is 00:27:27 So what happened in 2014, okay? So think about it. You log on to Facebook with one email address, right? But you probably have like four or five or six different email addresses over your lifetime. Yeah. Right? And you log on to Google with an email address, but you probably had six or seven different email addresses. And same thing for YouTube and same thing for Instagram.
Starting point is 00:27:53 So in the years kind of 2013, 14 and 15, all these big data companies, the one I just told you about that were buying all your credit card information, all your financial information, your travel information. They started, it's called appending to your file, adding to your file, all of your email addresses. Yeah. All of them, right? So in 2014, all of the big data companies then went to- They were analog.
Starting point is 00:28:24 They then got digital. They were analog, went to Facebook, and went to Google. And because Facebook owns Instagram, same thing with Instagram. And because Google owns YouTube, same thing with YouTube. If the email address that you logged on to Facebook with, they had as one of their set the data companies had as one of their seven email addresses they were able to find you and they they sold all the data your offline data behind the scenes to Facebook and to Google so it was like a swap. It's a big data swap. No, it was a cash sale.
Starting point is 00:29:07 25 million bucks. So they sold it but it was basically Facebook and Google gave them their digital data. No they didn't. They didn't give them the emails? No because remember you logged on to Facebook with one email. Yeah. If the data companies had that email address as one of their seven, they were able to find you, and it was a financial transaction. The data companies sold to Facebook all of the offline data on you plus up to six additional email addresses on you. Okay, so this is just fine
Starting point is 00:29:48 if that information was being used to better curate what we think we might like, right? Because I don't mind getting ads about things that I might want to buy that I didn't know about. I can sort of sift through that. But what's really happened, and what I really want to get into now,
Starting point is 00:30:06 is the political piece of this. So there's a movie called The Great Hack, which I'm sure you saw and know about. I did a screening of it. It's on Netflix. For those of you who haven't watched it, please watch it.
Starting point is 00:30:17 It's a must watch. But essentially... I mean, it's like a must, must, must, have to watch for any human being in society who cares about how they make decisions and how we're influenced by good or bad actors. Yeah. So this is really important that you said. I agree.
Starting point is 00:30:36 Everybody should watch it. And the reason it was so disturbing to me, because I understand the food system really well, and I understand how they use their tactics, which I wrote about in my book, Food Fix, coming out in February 2020, how they use their tactics to drive our eating behavior and our food choices and the food that's grown and all that. And I think that's one of the biggest threats to humanity
Starting point is 00:30:58 and the planet today. I agree. But when I started learning about what's in The Great Hack, when I heard this woman, Brittany Kaiser, give a talk, I was like, in talking to you, I was like, wait a minute. There's another big threat that is not so obvious. Everybody understands it. Maybe even worse.
Starting point is 00:31:12 People drink Coca-Cola, we get that it's not great for you, right? But this is invisible for most people. Yes. We kind of know that we get curated ads and stuff, and we're sort of aware of it. But what this this movie highlighted was how this company cambridge analytica created a personality typing yeah based on all these data points whether it's three million or three billion i don't know how many it
Starting point is 00:31:36 is but there's a lot of information on us yeah from all these sources from the analog sources from the digital sources from every activity we do i mean i went on my iphone and like you can stop the tracking of of your phone because they track your phone and what you go on and then they target ads to you it's like and then they sell that i don't know if iphone is listening or not to us it seems like it is sometimes and they sell that data to google and facebook i don't know that's true but i heard that somewhere and and what's concerning me is and they use this for more nefarious purposes it wasn't selling you a new jacket, a new shoe, or something you might like. It was selling you political ideas. Yes.
Starting point is 00:32:10 And it was using the personality typing to target messaging. And it's not just political ideas. It is seeding a fear inside of your head. Yeah. And then it is watering that seed of an idea, fertilizing that seed of an idea, surround sounding you in an echo chamber, reverberating that fear back into your head
Starting point is 00:32:36 until you take an action. Exactly. Yeah, that's what was striking to me was that they know us better than we know ourselves. They know our weaknesses and fears and insecurities better than we know them. And they use that data to create customized messaging to manipulate our behavior so that free will becomes a fiction. And then they're doing this not just in the last election where they created campaigns
Starting point is 00:33:01 targeting, for example, certain voters with certain weaknesses to get them to shift their allegiance. But they're doing this globally. And so they're activating despots all around the world. And I don't understand their motivation or what their motivation was to start to put in, you know, right-wing despots and fascists across the globe. But that's what they've done.
Starting point is 00:33:20 And then they've also activated... Are you speaking just about Cambridge Analytica? Well, that's all I know about. I'm sure there's more going on that I don't know about. But what struck me really was that they were able to activate different groups. So in other words, if there's a riot of African-Americans and white supremacists that show up in a certain place at a certain time, it ain't spontaneous. It literally can be orchestrated and people can be driven to show up
Starting point is 00:33:45 to activate those riots or those conflicts that are manufactured that then create a whole downstream set of consequences that are not good so absolutely that's terrifying to me and people who show up at those rallies things oh i this is my idea this is my belief this is what i feel but it turns out it may not be so So nobody wants to think that they are influenceable or persuadable. We like to think that our brains are so magnificent and we have free will and we make our own decisions and we're rational human beings. So a really famous psychologist, Daniel Kahneman, who actually
Starting point is 00:34:29 won the Nobel Prize in economics based on something called cognitive biases, insecurities, and how we make decisions. He said something at a speech once that really freaked me out. He said the human being remembers about 0.05% of their memories. Of their memories. 0.05%. Of their memories. Yeah. And a memory's period of time, moments of time, are defined as three seconds intervals. Because that's how long it takes for the neurons to actually fire and create a memory.
Starting point is 00:35:01 Yeah. Three seconds. If we're only remembering 0.05% of all of our memories and all the actions we've done, a computer database that remembers all of it, including, yes, where you take your cell phone, right? As you go on an E-ZPass, as you use Google Maps, plus all your offline purchase data,
Starting point is 00:35:23 everywhere you've ever lived, everywhere you've ever lived everyone everywhere you've ever had a meal you go by yes yep vitamins you you take everything like that plus uh it's studying uh what content you read what videos you watch uh what magazines you subscribe to um holy shit so now it's like wait wait a second, what do you mean Google or Facebook or YouTube knows me better than I know myself? Well, excuse me, Mr. or Mrs. Human Being, you only remember 0.05%. And the computers, the databases remember all of it. And therefore, as they're studying that stuff uh it becomes
Starting point is 00:36:05 very easy scarily wickedly easy to tap into your insecurities and your fears by playing you information that they already know will rattle you and trigger you yeah and it's one thing if they're selling you some stuff to buy it's another thing if they're selling you some stuff to buy. It's another thing if they're selling you political ideology or hate. Or hate. Or fear. Or fear. And I'll just give you a really nice example. And by the way, I consider myself a centrist when it comes to politics. But I'll give you a really good example.
Starting point is 00:36:41 I'm a hardworking American. I live in Texas, right? And I have kids and I have to, you know, clothe them and medical expenses and all that. And I've worked my butt off for years, right? I'm trying to save a little bit of money for retirement, but I've got like another 12 years of work. And then I just have to say to you the immigrants are coming across the border. Right, they're going to take your job. And they're going to take your job.
Starting point is 00:37:12 Well, that's my greatest fear that I'm going to lose my job and not be able to take care of my family. Yeah. Like that's a real fear of mine, right? The aliens are coming. But not just that they're crossing the border, they're coming for your job.
Starting point is 00:37:29 And, hmm, now let's just say they're coming across the border, they're gonna have no money, and they're coming from violent places, now they're gonna do violent things to your family. Well that's another real fear of mine. And those are legitimate fears. Right? I want the safety for my family.
Starting point is 00:37:46 I want to have my job, to be able to take care of my family. So it's very, very easy to use fear tactics around things that aren't true, might be a little bit true, might be like a small chance of being true, but like, huh. Yeah, I remember the rally the white supremacist rally in virginia and it's like the white supremacists were
Starting point is 00:38:12 yelling you know jews will not replace us and i'm like wow well first of all there's not that many of us jews and i'm like what and i'm like we probably don't want those jobs they're not going to do those jobs. Whatever. Like, I just don't, I didn't understand. So here it's Jews will not replace us, right? And then it was immigrants will not replace us. And then in Europe, it's Muslims will not replace us. All right? Holy cow.
Starting point is 00:38:38 Others are coming to take away what's mine. And I've worked really hard for. The psychology. But that's just one example of a whole series of things it's not just people who you know yes might not be you know worried about their job but there'd be other ways of targeting fears and insecurities absolutely around political action so the defeat had crooked hillary whether you're for against hillary or not the methodology that was used was to target Hillary voters, people who were for Hillary, with information, misinformation, to actually sway those voters to not vote for Hillary or stay home or vote for Trump. Hillary's running a child sex slave operation out of a basement of a pizzeria. I remember that.
Starting point is 00:39:26 Yeah. That wasn't true? Actually, so I follow up on all this stuff, right? And people went to that pizzeria to free those poor children. Yes. And there was no basement and there were no children. Yeah. But damn, does that work?
Starting point is 00:39:46 It works, right. Holy, I mean, just completely false. Like completely false. So where are we at now? Because if this is happening. Yeah. You know, we're facing another election. Yes.
Starting point is 00:39:56 And one of the things you said to me was that, you know, one team, the Republican team, understands this technology, knows how to use it. And the other team, the Democrats, don't really know what's going on. So what we've discussed... And so we're facing an election coming up, which we think is about representative democracy. Yeah. But it's turned into a little bit of a totalitarian process. And by the way... Is that overstating it?
Starting point is 00:40:23 That's understating it okay well give us the real deal then all right so what we've discussed so far so far is we've now changed this term data to being just information your personal information and just like everything you've ever done and everything you read and everything you care about and just information about you it's not data it's just like everything about you but who owns that we thought we own it but we don't own it that's a whole we'll talk about that okay we've never owned that right we've never owned it but now we want to own it so that's a whole other conversation yeah um so we've talked about the big tech companies, okay, knowing a lot about you, like a ton. And then we talked about psychologically driven advertising messages, which are political. Advertising, right, trying to convince you to have an idea, do something, vote.
Starting point is 00:41:24 And then we spoke about the fact that it can be just totally made up bullshit stuff. We know that it will resonate, nauseate you. Slave trade and pizzeria. At the core. And by the way, if any of that were true, it should nauseate you. Of course. Right? And by the way, if people really were coming to take your job
Starting point is 00:41:46 and do damage to your family and Hillary really was running a child sex slave thing in a pizzeria, like, yeah, this is all real stuff that you should be concerned about. Yeah.
Starting point is 00:42:01 So now we've discussed like the fakeness of it, right? And by the way way who would i be targeting that hillary sex slave stuff to hillary voters well not just hillary voters how about people who have good morals and ethics yeah and like really care about like uh ending sex slave or who run um orphanages or who are teachers um or who care about real humanitarian stuff. Well, if I can actually convince these people that Hillary's really running a sex slave operation, then there's no way in hell they're voting for Hillary.
Starting point is 00:42:35 And it shows up not as an ad. It shows up as a news story. And that's the whole point. It shows up as like 30 different news stories across 30 different sites. Therefore, it validates one or the other who's writing this nonsense um people get hired um because people are greedy and uh people who want to win elections so they're hired by political parties yeah with all the donation or not not necessarily the political parties right right? So like Breitbart. Super PAC.
Starting point is 00:43:05 Yeah, well, super PAC stuff or people who just like want one candidate to win really, really badly. So they'll just put their own money up and just become journalists. But like journalists that publish fake stuff. I mean, just completely fake. Like the ALA Martians have landed in kansas and like the national choir yeah all right so um so we've discussed or the brad uh you know brad pitt is you know actually a martian he's not so we've discussed all all kind of that which is kind of like the basics underlining of it. So why are the Republicans so damn good at it?
Starting point is 00:43:51 And Democrats are so damn bad at it. Right. So now let's get away from the fake news part. Yeah. Let's just talk about like political advertising. Yes. All right. Every political year,
Starting point is 00:44:09 a bill over the history of politics, over a billion dollars has been spent on political advertising on television. It's a lot of money, like a lot of money. Right. Um, so seeing that the Democrats and Republicans, now literally forget about about the fake stuff.
Starting point is 00:44:28 And that was interesting. During the last election for president, the Democrat, Hillary, spent far more than Trump on winning, but lost. On television. Yes. And Trump spent like $50 million on Facebook ads. Right. So, Trump's got a secret weapon. His name is Brad Parscale.
Starting point is 00:44:50 Brad Parscale is a phenomenal, phenomenal digital media marketer. And it's not, he doesn't usually, I mean, yes, he ran some lies about Biden, right? I mean, we all read about that. We know that. The corruption scandal, you mean? Yeah.
Starting point is 00:45:09 Yeah. I mean, those were ads that he ran. For the most part, Brad Parscale does not run fake news or lies about Trump or about the opponents. For the most part, he doesn't. Yeah. But what he's damn good at is using the platforms.
Starting point is 00:45:28 And it's not just like, oh, there's Facebook, and then there's Google, and then there's Instagram, and there's Google Display, and there's YouTube, and there's email, and then there are different landing pages. All are separate things, right? So if you think of the internet as a whole, it's made up of a whole bunch of different puzzle pieces, lots of different puzzle pieces, only one of which is Facebook. And by the way, all the tools that are, got to be pretty sophisticated to know how to really use those
Starting point is 00:46:07 tools and nobody in the old school democrat or republican side uh had ever rolled up their sleeves got on hands-on keyboards and bought actual ads on facebook, Google, Instagram, nobody with power or money or in the old system had done that. What Donald Trump did is he hired a digital media native, Brad Parscale, who was designing websites. Who was 25. There is a real methodology, a very specific methodology, otherwise known as a way of putting the puzzle pieces together that turn this type of digital marketing persuasion into a weapon. Yeah. So you call it a weapon of mass destruction. Full on weapon of either mass destruction or mass good. Yeah.
Starting point is 00:47:11 All right. And the Democrats just missed it. Nice and simple. Because what Brad Parscale did for Trump only became possible technically, like with all the technology that exists, in 2014. Yeah. two years after Obama's election and exactly two years before the 2016 election but you're saying even now that Democrats still haven't clued into this
Starting point is 00:47:36 Democrats are trying and but it's only in the past first of all once again the movie the great hack everybody has to see it because however much you think But it's only in the past. First of all, once again, the movie The Great Hack, everybody has to see it. Because however much you think you know, nobody knows. Watch this movie. It's a really good insight. After that movie came out. Go have a bottle of tequila.
Starting point is 00:47:59 Yeah, you're going to want to like. Move to New Zealand. Yeah, you're not going to be very happy. But it's good to be aware. I mean, if we don't, if we are aware, that's the first step I think in 100% in protecting ourselves from those
Starting point is 00:48:10 influences, you know, if you, if you don't want to be manipulated, if you don't want to be persuaded to do stuff that you don't think is in your best interest, then actually learn. It's like,
Starting point is 00:48:19 you know, it's like when you get these emails now that from the King of Nigeria that left you $5 million and all you have to do is give me your bank account information. And you're like, you know, people kind of go, okay, I get that. It's like when you get these emails now from the king of Nigeria that left you $5 million and all you have to do is give them your bank account information. You're like, you know, people kind of go, okay, I get that's a scam. But they don't get that this is a scam. 100%. I mean, you just don't know. You really just don't know.
Starting point is 00:48:36 So first and foremost, people should see that movie. And second is four things happened in 2014 that made it possible for the first time ever to do the type of very sophisticated digital marketing that Brad Parscale does. Number one, all those big data companies, Axiom, Experian, Epsilon, Oracle, blah, blah, started appending up to six email addresses. That was late.
Starting point is 00:49:03 It seems late for that to happen. Yeah. Then they all sold all that offline data plus the emails behind the scenes to Facebook and Google, Instagram, and YouTube and everybody else. Once that happened, something called Facebook Custom Audiences
Starting point is 00:49:22 got introduced, which meant if you have email lists or email newsletters, which is what I had, Daily Candy, all these curated email newsletter lists, you could take segments of those people who react to certain content in a certain way, upload those, and find those people on Facebook. But why can you find them? Because you might not have the email address that somebody signed up to Facebook with, but now Facebook has up to seven email addresses on you. Ah, so whatever they sign up to your email newsletter with, as long as it's one of those seven that are on Facebook, you find those people on Facebook. Yeah. So now you can target those people.
Starting point is 00:50:07 Yeah. That happened in 2013, 14. And then the next thing that happened is just like I told you, the big data companies would study all of your behavior and say the next thing that you might purchase is a Amex Platinum card. Now, Facebook gives you the ability to use something called lookalikes against that custom audience. Yeah, so you get...
Starting point is 00:50:30 Because once you put in that custom audience... Sounds like, right. They look at all the data they have, offline purchase data, where you've lived, everything, behavior data, what you read, blah, blah, blah. See what these people, this cohort of people and custom audience have in common. Yeah.
Starting point is 00:50:43 And then Facebook will align you with other people on Facebook who have common data sets. Right, so you seem like, you know, you and I seem similar, so then we have the same values, look at the same stuff, so they're gonna target us. And now what's really fascinating, it's so much more powerful than just the offline purchases. Yes, they have that, but they have your behavioral data
Starting point is 00:51:06 on what type of content you watch, see, da-da-da, right? So now that's the psychological profiling. Yeah, and people don't realize this. When you click on a website or click on an area in a website article, all that data is being tracked. Yes. And so they know what you're interested in,
Starting point is 00:51:23 what you're looking at. If you click on the Tesla ad, they know what you're interested in, what you're looking at, if you click on the Tesla ad, they know if you're reading an article about climate change, you're reading an article about immigration, or you're watching a video. I mean, great, Netflix knows which movies I like, I don't mind that, that's okay. But this is a different purpose, right?
Starting point is 00:51:38 100%, so that was the next thing that Facebook did. It said, oh, if you have a small group, now you can plug it in and find everybody else on Facebook who will react to the same emotional storytelling, otherwise can be brainwashed around the same type of fear tactics. And then you remember in 2014, the ice bucket challenge? Yes. Ah, so that's actually the first time video went into the live feed of Facebook.
Starting point is 00:52:06 So yes, I am the guy who freaking noticed, seeing I used to own television stations and knew that there was a billion dollars of video political advertising being done on television. Well, holy cow, what if you get a bunch of these email newsletters and then you could study what people care about by what subject lines they open up on, what content they read about, and you study that psychological behavior for long enough. And now you can upload that into the Facebook custom audiences, find all those people are similar, and now you can play them political ads. Yeah. So I was building the first ever political advertising agency with video, with hyper-targeted psychological tactics to then grow a billion dollar business out of it for both Democrats and Republicans because it was just a new media channel. Yeah.
Starting point is 00:53:05 Because that was your job. You were a digital media, ad tech, marketing tech, data science guy, innovator, right? So you saw, okay, all these technologies are coming together. We can use this to enhance the political process and help people get to voters and help people activate the democratic process. But you didn't know it was going to be usurped. So unbeknownst to me,
Starting point is 00:53:23 the person who I was doing this with ended up being a very close ally of our now president. And he took the entire game plan tactically all the way down, execution, whole nine yards, handed over to our now president, and is the exact game plan that Brad Parscale, who was Trump's digital marketer for the 2016 election, and now he is his campaign manager across the board. And if you watch Brad Pars uh in any of his interviews because he loves press now um he just says oh i'm just treating the election like marketing like this is just direct marketing yeah and i can study the behavior of all these individuals and then collect their email addresses which you'll see just about every single Trump ad.
Starting point is 00:54:26 So it's sort of like a football game where, you know, one coach gets the game plan of the other team and then the other team doesn't know they have it and then they're using it against them. Yeah, and unfortunately, after this all happened, I didn't have any relationships up at, high up in the Democratic Party to be like, although I tried really, really hard.
Starting point is 00:54:50 Hey guys, here's the game plan. So you saw this happening in real time? Oh yeah. Before the election? Oh yeah. And I was screaming from the rooftops. From the rooftops. But no one was listening.
Starting point is 00:54:59 Well, think how silly it sounded. Yeah, I mean, business partners with this guy who's, you know, kind of has a reputation for being a little bit shady. There's this thing called Facebook. There's email newsletters and this thing called data. And then this real estate tycoon slash TV personality named Donald Trump. And he's going to use this weird methodology of psychological manipulation. And he's going to win the election.
Starting point is 00:55:34 I mean, I hate to say this, but Trump is not a digital native. So how did he come to understand this? Or was it just the people around him? So the person who I was doing this deal with is a very, very, very smart publisher. Yeah. Old school, long time, and became very smart at digital. And what does Trump do?
Starting point is 00:55:59 He hires really, really well. So we hired a guy named Brad Parscale, who it pisses me off because he's two inches taller than me and I'm six foot six. And Parscale has the exact game plan and I watch him every day using it. What it turns out is it's a lot more complicated than one would think, a lot more complicated than one would think to execute on to perfection. So currently, just about every one of the democratic organizations out there are now, after the Great Hack and then after the New York Times
Starting point is 00:56:34 ran a couple of large articles about the Democrats being far behind, now they're like, oh, we must spend more money on digital and on Facebook. But if you don't spend it the right way, you're throwing away money. So who's advising them? Yeah. So there is one group who's actually really, really good at this. And they don't do this in a nefarious way whatsoever. They just use the communication channel the best
Starting point is 00:57:06 way. So Courier Media? Well, it's Courier Media and a group called Acronym. And they're just very good at this. They're the best that the Democrats have. But listen, we're under a year out from
Starting point is 00:57:21 this election, and Brad Parscale has been doing this for five years. Yeah. And has email addresses, data, and has brainwashed a fair chunk of Americans. And by the way, I don't hold it against the Americans are good people. Like really good people. But it's ripping apart our country. It is.
Starting point is 00:57:44 Of us versus them. In college, I remember reading a novel, I mean a play called Rhinoceros by Ian Esco. And the play was about Nazi Germany and how could upstanding German citizens with good upbringing and good morals stand by while in their backyard there were crematoriums and concentration camps and do nothing and say nothing yeah and how could
Starting point is 00:58:15 they all fall in line behind a clearly crazy fanatical totalitarian leader like hitler and and it in it you know it was the banality of evil you know hannah And it was the banality of evil. You know, Hannah Arendt talked about the banality of evil. And I think we're in that moment where our democracy is being threatened in many ways, but particularly in this way, and the democracies around the world are being threatened. And it's this problem that isn't organic, it's manufactured. So we're manufacturing hate, we're manufacturing's manufactured so we're manufacturing hate we're manufacturing divisiveness we're manufacturing yes disconnection and i think that's what really worries me is that you know people are generally good i think i think people are generally wanting
Starting point is 00:58:56 good things for themselves and their families and their communities and their nation but when that gets shifted through this manipulation yes that invisible, that's coming through our technology, and that there are big players who understand this, who use it against us, what do we do? So I've spent a great deal of time not only seeing all the ads that that Trump runs on Facebook and if any of your listeners want to see the ads you can just go into Google and type in Facebook library and put in there Donald Trump and you can see all the ads mmm you can see all the ads for any of the politicians yeah to see what they're running Facebook has now made that transparent, which is a good thing to, people should see it. It's the Facebook library, right?
Starting point is 00:59:50 And something you just said just really resonates with me because I've spent a lot of time not just studying the ads, but watching a lot of the Trump rallies. And I actually want to go to a Trump rally because I think it would be really interesting. They don't get targeted because they ping your phone
Starting point is 01:00:07 if your location services are on and they ping your phone and they know where you're at. But you know what? Get over it. All of this stuff does. So does every single app that you have. So don't sign up for it. If you don't want it, don't sign up for it.
Starting point is 01:00:22 I'm on Google Drive right now and I have all these questions about all this. Is Google reading my documents and then using that somehow and selling it? Because that's terrifying to me. In Google documents, I'm actually not sure. I don't want to... Because when you have an email
Starting point is 01:00:39 and you have Google Gmail and then you type in something and then you get an ad for something that was in your email. So let me come back to what I was just saying. Okay, sorry, I got distracted. Americans. Whether you're wearing a Make America Great Again hat,
Starting point is 01:00:57 or you want somebody else to be president, if you put those people together, they have common interests, common concerns, common desires, common fears. And it kind of goes like this. I'd like the ability to have a job so I can make money
Starting point is 01:01:18 and take care of myself and my family. I'd like to be able to have healthcare so I don't have to worry about illnesses. I'd like my children to have a good education. And I'd like to have safety. And I don't really want to have to worry about food. Right. I don't think there are many Americans who would argue that they agree with those statements. For sure. Right. So I don't like this division of Republican versus Democrat.
Starting point is 01:01:54 Yeah. Us versus them. Right. No, we're all human beings. First, yeah. First. American second. American second. And just break down what's important to us right as individuals like
Starting point is 01:02:07 i want to go out there and start hugging uh all these people who have uh make america great hat on again and be like we're the same thing yeah we care about the same stuff for sure and where it gets really ugly really really ugly i'm glad you brought up... And I know that's true because I take care of everybody, right? I take care of, you know, all religions, all political persuasions, you know, in my office. And everybody's a human being first
Starting point is 01:02:35 and that's where I connect with them. I mean, that's the beauty of our country is, you know, we have freedom of choice and freedom of belief systems and freedom of all this stuff. But like... Not too much anymore. Well, let's just remember that we all care about each other, the future of our country,
Starting point is 01:02:54 the future of our children, the future of our own health, the future of education. And we're all in this together, right? What's horrible is when politics gets in the way, power and greed gets in the way, and people of power and greed don't mind using fear against people to get them to hate others, us versus them. It doesn't make sense. How can you hate other people that you've never met? Yeah. That doesn't make sense no it just how can you hate other people that you've never met yeah like that doesn't make sense yeah it just doesn't make sense but that is absolutely what's happening our country is absolutely being divided apart yeah um there's a great there's a great movie called the sacrifice by darren brown people should it. It's on Netflix. And it shows how this American who was very much against immigration, who was very racist against Hispanics and Mexicans, was put through a process of understanding the humanity of everyone and connecting with actual human beings.
Starting point is 01:04:08 He was a white supremacist, right? Yes. And the people he connected with were actual Mexicans, actual immigrants. And in the film, he gets him to have so much empathy and connection with these people, who he thought were evil and bad and taking what was rightfully his, that he was willing to take a bullet
Starting point is 01:04:25 for someone who he barely knew, who was a Mexican immigrant. Yeah. So I met that guy. Derren Brown was the guy who would take the bullet. So people should watch it, because it shows how much we are all just really a conversation or two away from actually...
Starting point is 01:04:44 And then let me just use one more example, because you used Nazi Germany, and that's like an obvious one to go towards. But if people have studied the genocide in Rwanda, where one million Tutsi were slaughtered by the Hutu in 100 days. One million, right? And it was their neighbor.
Starting point is 01:05:08 It was like the person who lived next door. It wasn't some remote community somewhere. Like really, not their neighbor in terms of, like, a border? No, like right next door. Right, the next-door neighbor. And you went to school with them and all that stuff. How did this happen? And the history of it is really fascinating.
Starting point is 01:05:22 Going back to the end of World War I, the Belgians went into the Congo, and they also went into Rwanda. And they said, huh, we're going to give you a card that says you're Tutsi, and you guys a card that says you're Hutu. Literally arbitrary. It wasn't by culture. It wasn't by race. It wasn't by anything. And they just made sure that there were a lot fewer of the ruling class that were the Tutsi.
Starting point is 01:05:51 And a lot more of the working class that were the Hutu. Because a smaller class of privileged people is easier to control, for the Belgians to suck money out of the labor of the Hutu. Back then, the communication channel was radio. Yeah. Right.
Starting point is 01:06:12 So what started this genocide? The Hutu got hold of the national radio and said, the Tutsi are about to slaughter us. Get to them first. That was it. That was it. Just a sound bite. These people have been oppressing us, and now they're about to slaughter us.
Starting point is 01:06:33 Grab your knives, your forks, your axes, your everything. Get to them before they kill us first. Yeah. Holy shit. A million people slaughtered their next-door neighbors. Yeah. So don't ever underestimate what's possible, right? What's also fascinating is the end of that story,
Starting point is 01:06:53 which is when Kagame, who you could debate whether he's a good ruler or not, as the president of Rwanda, came back as an autocrat, essentially, but said, look, at the end of this, you're all going to talk to each other, you're all going to say you're sorry, and you're all going to live together in harmony, and here's how you're going to do it. He also said, burn those cards,
Starting point is 01:07:12 and it's now illegal to refer to anybody as a Hutu or Tutsi. Yeah. And it was sort of like the Truth and Reconciliation Commission in South Africa. It was very similar. And when you see the movie about this, it's just extraordinary
Starting point is 01:07:22 how literally, you know, it would be like me going over and killing your parents and your kids. Absolutely. And then me having to sit down with you. Absolutely. And have a conversation. Absolutely. Like, if that can happen there... And by the way, that was on the local radio station. Everybody, now realize we're living in a world where hate groups or people of power or politicians can get so deep into your individual head, and the head of those who are in your immediate community, to drive so much hate, so much anger against this mysterious other.
Starting point is 01:07:58 So what do we do about this? We talk about data rights. Should we actually regulate data privacy differently? What are the ways that we can combat this? How does the individual protect themselves? How do we not get sucked into this manipulation? How do we actually get our democracy back? I mean, these are the questions that I'm thinking about and don't have the answers to, but maybe you do. So unfortunately, I don't think it's a data thing. And unfortunately, I don't think these big companies are going to self-regulate, because they make money doing it,
Starting point is 01:08:26 and they'll accept money from anybody doing it, and that's our capitalist society. And I don't think it's going to be regulated by government, because the regulators have no idea how it all works. So they don't know what to regulate. There are a bunch of really good people out there who care a ton about this stuff. A very good friend of mine, Tristan Harris, a guy named John Borthwick, a whole bunch of people who are really trying to figure this stuff out. Within the government? No, from the private sector.
Starting point is 01:08:53 A bunch of us that kind of, like, grew up building this stuff and then were like, oh shit, what did we build? So now it's kind of on us. I'll say I'm sorry. Like, I am really, really sorry, because I took part in building something for financial benefit, and I didn't realize the harm it was going to do. Mea culpa. But I'm not just going to say sorry, I'm going to do something about it, to try and fix it for the future of society. That's what I hope, what I really, really hope for. I mean, it's like you built a car,
Starting point is 01:09:28 but then you didn't realize people are going to be driving through crowds and killing people, right? Like, that's sort of it. Well, it's a little bit worse than that, to be honest, because it's advertising and convincing people to buy stuff, and then knowing how potent this form of communication is once you are able to do psychological profiling on people by what they listen to, what they watch, blah, blah, blah. So the most important thing right now,
Starting point is 01:09:54 like here and now, is this: people, whatever you read, whatever you see, whatever you view as video, don't take it as fact. Yeah. Full stop. Research the hell out of it. All right? We don't live in a world where just because you read something or see a video
Starting point is 01:10:20 or it's on your internet feed, you can assume it came from a scholar or from somebody who's an expert. I mean, holy shit, if my car breaks down, right, and I need to be able to fix my car, I'm not going to go online and listen to some phony telling me about how to go into my car and fix my carburetor. No, I'm going to go and find an expert mechanic who knows how to do that.
Starting point is 01:10:46 Otherwise, my children are going to get into the car and I'm going to drive off a cliff and die, right? So please, everybody who's... Buyer beware, reader beware. But it's reader: question. Question all of it, and realize that most of it out there is fiction. Yeah. Or most of it, if it's not fiction, is opinion.
Starting point is 01:11:11 And then before you trust someone's opinion, make sure that they're qualified. Yeah. To have that type of opinion. Is that somebody who you should be listening to? How do you do that? I mean, the average person, like, how do you vet whether this is true or not? I read articles all the time. Like, I look at who the author is.
Starting point is 01:11:29 I look at who they are. I look at who they work for. I mean, I try to do that, but it's tough. Like, even in science, we think science is this pristine field. But, you know, much of science is funded by industry. Much of the data is manipulated to shape it into the outcomes that the funder wants. How about this? Take your time.
Starting point is 01:11:50 Take your fucking time. We're all going from article to article to article. We all have to open up all of our email addresses. We have to get back to everybody's texts. And we have so much we have to do. So we just read the headline. No, no, no, no. Take your time. Be curious.
Starting point is 01:12:08 It's confusing. You watch CNN and then you flip to Fox and you feel like you're living in two different planets. 100%. It's like, wait a minute. I mean, they can't both be right. Maybe they're both wrong.
Starting point is 01:12:21 So at a minimum... By the way, this drives me crazy. It drives me fucking batshit crazy living in New York, right? So all the people I know, they watch CNN, they watch NBC,
Starting point is 01:12:36 they read the New York Times. I'm like, okay, well, how often do you watch Fox? Yeah. And how often do you... Oh, I can't stand them. What are you talking about? At least listen to both sides. Oh yeah. And the same thing goes for people who might read Breitbart or might watch Fox: at least listen to the other side. Yeah. If it's opinion-based stuff, right, at
Starting point is 01:12:59 least see both arguments of it. Where is Walter Cronkite when we need him? And then be proud of being... If you don't want to be manipulated, and you don't think you can be manipulated, and you don't think you can be brainwashed, then at least listen to all the arguments, and then you can make up your own mind. Yeah. Otherwise, if you're only listening to one of the arguments, by definition, you're being brainwashed by that argument. You're not making up your own decision. So basically, we need to be more astute, discerning readers of content and not trust it as fact. Number one. Number two, what about regulations on our digital data? Or should we actually be looking at our phones and saying, turn off all the tracking things, and turn off our Google, changing our privacy settings?
Starting point is 01:13:55 I mean, how do we protect ourselves besides just being smart about what we're reading? So listen, my hope, my big hope, and this is what should happen because we're an educated society. We really are an educated society. If people literally listen to both sides of an argument, then they can choose who they're going to trust as their curator or teacher of information. And therefore, less people will be reading false information. And remember, these are just businesses that all make money off of advertising. So if less people are reading stuff that's untruthful,
Starting point is 01:14:35 they'll go out of business. They literally will go out of business because they're not making money off of advertising. That's something we can do. What else can we do besides that? But that's a big one. Yeah. That's a really, really big one. Another thing you can do, and this is fun, really fun, and I forgot the name of the company. I wish I had it with me right now. It's an organization that, and it's now up to 400,000 people across the globe,
Starting point is 01:15:05 when they read hate news or see hate messages on the internet and then see recognizable brands like Audi, JetBlue, Walmart, whatever, next to those hate messages, they take a screenshot and they send it to the company. They send it to the JetBlues or the Walmarts or the Kmarts or whomever, right? So are those companies deliberately selling ads into those hate messages? Absolutely not. Absolutely not. Those ads get there because of something called...
Starting point is 01:15:48 It's all algorithms, right? No, no. Programmatic media buying, which are these big kind of conglomerates that represent all these small little publishers. So if like a JetBlue wants to reach all these small little publishers, they don't even know what the content is
Starting point is 01:16:15 that their brand is going to be sitting next to. And trust me, like a Walmart, a JCPenney, they don't want their ad next to that. They do not want their ad sitting next to, you know, neo-Nazi type stuff. So there are grassroots movements that are changing that. That's a big, big, big thing that people can do. Yeah.
Starting point is 01:16:35 Besides that, you know, yeah, you can go live in a cave. Nobody's forcing... The guy who mows my lawn, he doesn't have a cell phone. He doesn't have an email. He doesn't have a computer. And I thought he was a paranoid screwball, but it turns out he may not be. It's all about how much we value convenience. And what about regulation, legislation? I mean, how do we solve this? Because it terrifies me to think that our democracy is on the decline, and that freedom and freedom of choice and autonomy as a country is being usurped by
Starting point is 01:17:10 this kind of invisible force that I think some people are using for their particular ends, which may not be laudable, but, you know, in themselves aren't necessarily evil. But when you add it all together, it's a mess.
Starting point is 01:17:23 And I feel terrified for our future and for my children and grandchildren, because I don't understand. I mean, I know how to fix the food system. I mean, I think I do, anyway. I wrote a book about it. I have an idea of what the problems are and what the solutions are. But really, other than being a smart reader of your news feed and, you know, creating a grassroots movement to pressure companies to not do this, and turning off the tracking on my phone, what can we do? Well, so number one, and this is something I did want to say, there's a huge debate right now around whether Facebook and Google should get rid of political ads. Yeah. Right now. And a lot of people who are very smart but don't understand the
Starting point is 01:18:09 technology all that well, yeah, don't realize that if Facebook and Google shut down political ads right now, that is the destruction of democracy. Yeah. Because it's both sides? No, no. Because Brad Parscale and Donald Trump have already weaponized it. And even if they shut it all down, they already have relationships with something like 40 million Americans. Their email addresses can find them and all this stuff.
Starting point is 01:18:37 Because the Democrats have been slow to the game, if the big tech giants were to shut down political ads or the tools that are needed like custom audiences and lookalikes, if they were to shut it down now, this upcoming election 2020, it'd be like Trump has a nuclear weapon and the Democrats have a switchblade
Starting point is 01:19:00 if it gets shut down now. That's terrifying. And by the way, whatever else democracy is, at least all sides should have an equal voice, an equal ability to communicate with voters. And if the tech giants shut it down right now, only Trump will be able to communicate effectively. Because it's true, because what you're saying is
Starting point is 01:19:28 that up to now, the Democrats really haven't created their version of this news, fake or not, truth or not. And the Republicans have. So again, remember, it's not about fake news.
Starting point is 01:19:47 It's about knowing how to put the pieces of the technology puzzle together to effectively understand you as a human being through your own behavior of what you watch, read, care about, to then be able to message to you certain things around those issues and educate you on who the best candidate is, whether it's Trump or somebody else. That's what Trump has. And Trump is just, Trump's Trump. Brad Parscale is the brilliant digital marketer. Brilliant, okay? So it's just right now that the Democrats are catching up, and in particular, this organization called ACRONYM,
Starting point is 01:20:40 who's really good at it. And it's not fake stuff. It's real news. Well, it's real news. It's important stuff to voters. And the other thing is, we've seen the death of local media, the death of local newspapers.
Starting point is 01:20:54 Thousands have shut down, which is often the most trusted source of information. But there's a company that was formed called Courier Media to create local reporting and local news. It's legitimate, but it speaks to voters in their communities. And that's really what happened in the Virginia election, where the legislature and the governorship switched over for the first time in a generation, because they were very smart about using very
Starting point is 01:21:22 local, specific, targeted information that was a counterpoint to the other side. And I think that speaks volumes. And I think, you know, right now it seems like David and Goliath, and Courier Media seems like David. But maybe when these issues sort of come out, I think people will be able to think differently about how to go forward in the election. Because, you know, it doesn't matter if you're Republican or Democrat.
Starting point is 01:21:48 I think there really are a lot of commonalities and values that both parties share. Yes. And the overlaps are often more than the disagreements, sort of like paleo and vegan. They both agree on almost everything except where you get your protein. And they have far more in common than, for example, people who eat the standard American diet. Right. But there are these sort of opposing camps, and I think the same thing is true in our political process. But we have to start to take back our democracy. I think that's really the message. Listen, the genie's out of the bottle,
Starting point is 01:22:19 or Pandora's box is open. We're in a period of time with a brand new medium. I mean, I just told you, the first time it was ever possible was 2014. Compared to the history of our country, it's only five years ago that any of this was possible. And it's exponentially better than television, print, radio, direct mail, any of these other forms, if done correctly. Yeah. If done correctly, then all sides, I don't have to say, like, right or left, right, left, middle, top, bottom, whomever, everybody should be able to communicate using the platforms to their best ability, technically. So I'm trying to help share this methodology
Starting point is 01:23:17 that only Brad Parscale has been an expert at. Anybody who wants to... So you basically invented it. You gave it to him. I didn't give it to him. It was taken from me and handed to that organization. I see. All right. I was just building a business, yeah, to make money from Democrats and Republicans and centrists and independents, because there was a billion dollars of political advertising every election. It was going to be a really good business. What I didn't know is that the guy who I was doing it with was in bed with Trump. And yeah, so it was handed
Starting point is 01:23:57 over to Trump, and handed over to this, you know... Brad Parscale was not a very good digital marketer. He was a bankrupt web designer. Yeah. But a bankrupt web designer who had the blueprint on how to manufacture a car. Yeah. And, like, step by step, was able to manufacture a car. Wow. And now he does it really, really well.
Starting point is 01:24:20 Okay. What final words do you have for everybody, so they don't go slit their wrists or jump off a bridge? Because I'm like, what are we going to do? Sure. So we're living in a time of absolute freaking chaos. And, by the way, communication channels like this, right? I think this is the first time in human history that we're all connected. Yeah. Done correctly, as one species, communicating collectively, we can end climate change. Yes. We can change the food system.
Starting point is 01:24:55 Yes. We can educate everybody. We can finally have equality across the board. We can further science. And we can come together as a less competitive society, working together to collaborate for everybody's good. Sounds good. And therefore have better mental health and enjoyable lives for the first time ever in the history of the human species. And to do that, we need to?
Starting point is 01:25:24 To do that, what we need to do is be a lot more discerning about the stuff we take as gospel, that we read, that we see on the internet, and realize that for right now, for this moment in time, nobody's regulating it, nobody's saying whether it's truthful or false. So don't believe 99.9% of it until you can actually go and do the research. Use the internet to search for a store. Use the internet to search for a product.
Starting point is 01:25:57 Do not get your news off the internet. What's frightening to me is you hear Trump say, don't believe what you're reading. Don't believe what you see. Don't believe what you hear. Just believe what I say. And I'm like, well, he's also saying the same thing, right? Which is be discerning about what you're reading,
Starting point is 01:26:16 about the impeachment, about Russia, about whatever, because... But then just make people back it up with facts. Yeah. I mean, it's an old thing. Just because somebody tells you to jump off a bridge, are you going to go jump off a bridge? No. Like, literally, think before you believe.
Starting point is 01:26:31 Think before you do. That was the other movie I encouraged people to watch, called The Push by Derren Brown, where he literally, in 45 minutes, gets totally normal people to commit murder. Really? Yeah. Frightening.
Starting point is 01:26:45 That's how manipulatable we are. Yes. That's what we all should be aware of. Yes. Watch the movie The Push. Watch The Sacrifice. Watch The Great Hack. Definitely watch The Great Hack.
Starting point is 01:26:56 And definitely be careful of what you think is fact or fiction. Make sure that you're being... This is the generation. This is the society of the tipping point. So take your time, and please don't jump to anger and hate first. Give people the benefit of the doubt, and realize that we're all in this together. How about we lead with love? Absolutely. Lead with love, compassion, and a sense of respect for one another. And realize that it's very easy to have somebody make you angry, frustrated, infuriated, and go into battle. Don't be that easily fooled or that easily riled up.
Starting point is 01:27:44 Yeah. No, if you want to say that you have free choice and the ability to make up your own mind, then prove it. Don't be so easy to anger. I think that's true. I remember what the Dalai Lama said. Someone asked him, are you mad at China? He's like, no. He said, they took my country, but they didn't take my soul, my mind. You got it. Or my heart. You got it. And I think that is a
Starting point is 01:28:09 good message to leave with. You got it. So thank you, Andy, for being on The Doctor's Farmacy. If you love this conversation, please share it with your friends and family on social media. Leave a comment. We'd love to hear from you. Fact check all of it. Fact check all of it. You can read the book Targeted, which is also about this. And you'll hear us next week on The Doctor's Farmacy, talking about another conversation that matters. So see y'all later. And I hope this wasn't too depressing,
Starting point is 01:28:39 but I think it's good to be empowered. And Andy, thank you so much for being on this podcast. And we'll see you next time on The Doctor's Farmacy. Thank you so much. Hey, it's Dr. Hyman. Do you have FLC? Well, it's a problem that so many people suffer from and often have no idea that it's not normal or that you can fix it. So what's FLC? Well, it's when you feel like crap. And you know the feeling. It's when you're super sluggish and achy and tired, your digestion's off, you can't think clearly, you have brain fog,
Starting point is 01:29:14 or you just feel kind of run down. Can you relate? I know most people can. In my experience as a practicing physician over the last 30 years, I've identified four main causes that lead to FLC. The first cause is too much sugar in the diet. Surprise. Don't think you eat that much sugar? Think again. Processed carbs from bread, pasta, and cereal turn into sugar in the body. In fact, whole wheat bread spikes your blood sugar more than plain old table sugar. A diet that's high in processed carbs and sugars is the number one culprit for FLC. Okay, the second cause of FLC is not enough nutrient-dense whole foods.
Starting point is 01:29:53 It's not just about avoiding sugar and processed carbs. It's also about what you do eat. Most of us don't eat enough of the right kinds of foods. This means healthy fats, clean protein, and loads of colorful plant foods. If I look at your plate, I should be able to see a rainbow. The rainbow that comes from Mother Nature, not from candy. All right, the next cause of FLC is eating too late and at the wrong time. The research shows that eating too late disrupts the quality of sleep we get at night, which can make us sluggish the next day. It also makes us hungry and crave carbs and sugar.
Starting point is 01:30:25 Research also seems to show that eating too frequently and not giving your body a break from food for 12 to 14 hours negatively impacts the body's circadian rhythms and the repair processes in the body. That's why when we eat is just as important as what we eat. Now, the final cause of FLC is not prioritizing sleep. This is the number one mistake I see people make, even those of us who think we're healthy. You see, sleep is when our bodies naturally detoxify and reset and heal. Can you imagine what happens when you don't get enough sleep? You guessed it. You feel like crap. So now that we know what causes FLC, the real question is, what the heck can we do about it? Well, I hate to break the news, but there is no magic bullet solution.
Starting point is 01:31:06 FLC isn't caused by one single thing, so there's not one single solution. However, there is a systems-based approach, a way to tackle the multiple root factors that contribute to FLC. And that systems-based approach involves three pillars, eating the right food, incorporating two key lifestyle habits, and a few targeted supplements. I've combined all three of these key pillars into my new 10-Day Reset system. It's a protocol that I've used with thousands of community members over the last few years to help them break free of FLC and reclaim their health. The 10-Day Reset combines food, key lifestyle habits, and targeted
Starting point is 01:31:45 evidence-based supplements. Each of these areas supports our health, but when combined together, they can address the root causes that contribute to FLC. Together, they're a system, and that's why I call my 10-Day Reset a systems approach. Now, FLC is a diagnosis. It's not a medical condition. It's just something we fall into when life gets busy or when we indulge a little too much around the holidays or don't listen to our body's messages. It's our body out of balance. Now, everyone gets off track here and there, and the 10-day reset was designed to help you get back on track. Now, it's not a magic bullet. It's not a quick fix. It's a system that works. If you want to learn more and get your health back on track, just visit GetPharmacy.com. That's Get Pharmacy with an F, F-A-R-M-A-C-Y.com.
Starting point is 01:32:31 Hi, everyone. I hope you enjoyed this week's episode. Just a reminder that this podcast is for educational purposes only. This podcast is not a substitute for professional care by a doctor or other qualified medical professional. This podcast is provided on the understanding that it does not constitute medical or other professional advice or services. If you're looking for help in your journey, seek out a qualified medical practitioner. If you're looking for a functional medicine practitioner, you can visit ifm.org and search
Starting point is 01:32:58 their find a practitioner database. It's important that you have someone in your corner who's trained, who's a licensed healthcare practitioner, and can help you make changes, especially when it comes to your health.
