The Chris Voss Show - The Chris Voss Podcast – Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy by Tom Kemp

Episode Date: August 14, 2023

Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy by Tom Kemp https://amzn.to/3DSa9Z9 The path forward to rein in online surveillance, AI, and tech monopolies. Technology is a gift and a curse. The five Big Tech companies―Meta, Apple, Amazon, Microsoft, and Google―have built innovative products that improve many aspects of our lives. But their intrusiveness and our dependence on them have created pressing threats to our civil rights, economy, and democracy. Coming from an extensive background building Silicon Valley-based tech startups, Tom Kemp eloquently and precisely weaves together the threats posed by Big Tech: the overcollection and weaponization of our most sensitive data; the problematic ways Big Tech uses AI to process and act upon our data; and the stifling of competition and entrepreneurship due to Big Tech's dominant market position. This richly detailed book exposes the consequences of Big Tech's digital surveillance, exploitative use of AI, and monopolistic and anticompetitive practices. It offers actionable solutions to these problems and a clear path forward for individuals and policymakers to advocate for change. By containing the excesses of Big Tech, we will ensure our civil rights are respected and preserved, our economy is competitive, and our democracy is protected.

Transcript
Starting point is 00:00:00 You wanted the best. You've got the best podcast, the hottest podcast in the world. The Chris Voss Show, the preeminent podcast with guests so smart you may experience serious brain bleed. The CEOs, authors, thought leaders, visionaries, and motivators. Get ready, get ready, strap yourself in. Keep your hands, arms, and legs inside the vehicle at all times, because you're about to go on a monster education roller coaster with your brain. Now, here's your host, Chris Voss. Hi, folks. This is Voss here from thechrisvossshow.com, thechrisvossshow.com.
Starting point is 00:00:42 Ladies and gentlemen, boys and girls alike, welcome to the giant big top of podcasts here in the internet sky, as they like to call it, the intertubes in the sky, as some senators or, what was it, House members call it. Welcome to the big show. We certainly appreciate you guys coming by. Remember, and you might want to tattoo this to, I don't know, a part of your body. I can't offer it as some sort of service.
Starting point is 00:01:07 You know, what you can get for free, you'll have to pay for at your local tattoo artist. But you might want to just remember this: The Chris Voss Show is the family that loves you but doesn't judge you. What better kind of family can there be? What more do people want from me, for hell's sakes? I mean, not even your mother-in-law likes you that much.
Starting point is 00:01:23 So there you go. Anyway, guys, refer the show to your family, friends, and relatives. Go to goodreads.com, YouTube.com, and LinkedIn.com and look for Chris Voss. I have no idea why I give more accent to YouTube. It just sounds funnier and funner. We have an amazing guest on the show, and he's going to blow your mind, open the veins in your brain. I think I just invented a new shirt for the show: open the veins in your brain. He's going to expand the blood flow to a point that you're going to think more intelligent thoughts beyond what you've ever thought before. And I know most of you have listened to the show for almost 15 years, and you're like, Chris, how can I get any smarter? Well, we keep bringing in two to three new guests a day, and there's just much more room. What is it they say? We only use like 5% of our brains? I think most people who listen to The Chris Voss Show use about 10% or something. But I'm getting a notification from the attorneys.
Starting point is 00:02:18 I can't legally say that. Anyway, guys, he is an amazing author. He's got his newest book that comes out August 22nd, 2023. And he's been sitting here listening to me ask my audience to beg them for plugs and five-star reviews on iTunes for a while here. The name of the book is called... How many segues can I do on this damn show? The name of the book is called Containing Big Tech. How to Protect Our Civil Rights, Economy, and Democracy. Hey, these sound like cool things to protect.
Starting point is 00:02:48 The book is, the author is called Tom Kemp, and he joins us on the show today. And he's going to expand your mind. And if he doesn't, well, we'll just, you can call him and ask for your money back. Tom is a Silicon Valley-based CEO, entrepreneur, and investor. Tom was the founder and CEO of Centrify, renamed Delinea in 2022. He's a leading cybersecurity cloud provider. They were. I'm just having a midday, aren't I? Amassed over 2,000 enterprise customers, including over 60% of the fortune 50, uh, for his leadership,
Starting point is 00:03:27 Tom was named Ernst and young as a finalist for the entrepreneur of the year and Northern California. Uh, Tom is also an active Silicon Valley angel investor with seed investments in over a dozen tech startups. Welcome to the show, Tom. How are you doing? I'm doing great. That's an amazing introduction. I That's the best one yet.
Starting point is 00:03:48 I fumbled a little bit through your bio there, but we try to make it sound fun, and this is the second show of the day, so I can't feel my feet. But welcome to the show. Give us your.com so we can find you on the interwebs, please. Yeah, I'm at tomkemp.ai. There you go. Tom Kemp.ai. You're cornering the AI market. That's the hottest new thing. That's where it's at, baby. There you go.
Starting point is 00:04:13 So, Tom, let's start out. What motivated you to write this book? Well, I've been in Silicon Valley 30 years. I started a bunch of companies. I've been doing a bunch of policy work, including helping pass the California Proposition 24, which gave California the nation's best and first privacy law. And I've just seen a lot of bad stuff happening. And I kind of figured that I could put forth a book that hopefully gives readers a clear path forward to help them and help the country rein in online surveillance, AI, and tech monopolies.
Starting point is 00:04:50 There you go. Containing big tech. And so is there some going on with our civil rights, the economy, and democracy that maybe Silicon Valley isn't always having the best interest for? Look, we now have five large big tech companies. We've got Meta, Apple, Amazon, Microsoft, Google. And certainly they've built innovative products. They're innovative companies. It improves many aspects of our lives.
Starting point is 00:05:19 But what's happening now is the intrusiveness and our dependence on them have really created some pressing threats. And it first starts with the majority of these vendors' business model, which is to collect as much information about us as possible. Now, in the past, it was about serving advertising. But what we're seeing now is that data collection is being weaponized against us. And it's kind of scary, like the amount of data, the sensitivity of the data. And then now you throw in AI, which sucks up that data and really, you know, starts generating things and personalizing things and makes their products more addictive. And if anyone were to use TikTok, they would probably agree that it's pretty darn addicting. And then finally, look, the fact is, is that these companies have evolved
Starting point is 00:06:12 into monopolies and their monopoly positions in the market actually exasperate the problems of privacy, digital surveillance, AI bias and exploitation, et cetera. So I just wanted to put it all together, connect the dots, not only just, but not be a Debbie Downer and exploitation, et cetera. So I just wanted to put it all together, connect the dots, not only just, but not be a Debbie Downer and say, ah, the, you know, here's all the problems, but I also wanted to provide solutions both for individuals as well as policymakers. There you go. That's what we need more of is solutions and people, you know, you can bang on, you know, this is wrong or stuff isn't right, but having those solutions there is really important. So what, what, give us a little bit about your hero's journey, your story, and your background.
Starting point is 00:06:49 What led you to this point? What got you involved in Silicon Valley and interested in technology? Yeah, so I grew up in Michigan. I went to the University of Michigan, go blue, and was a history but also a computer science major, and I just wanted to go out to Silicon Valley and see what it's about. And then when I got here, it was just amazing, the entrepreneurial spirit. And so I started my career at a, it wasn't that big of a tech company at the time, Oracle. Yeah, they had just gone public a few years before I had joined. And then I started doing startups, right? And, you know,
Starting point is 00:07:25 and then my last company, I founded it, was CEO, raised about $90 million in venture capital money. And then from there, you know, Silicon Valley, actually, people, entrepreneurs support other entrepreneurs. And so I'm a pretty active angel investor, which means I give seed investments. I write checks out of my own bank account to other entrepreneurs and help them start their way as well. But with all the goodness and all the hype about Silicon Valley and me living and breathing this here, I got the thinking as you know it's a double-edged sword right that uh you know what we have now is especially with the large tech companies their products are free right but they're actually you become the product right because it's your data
Starting point is 00:08:18 and their actual customers you're not the customers, the customers are the advertisers. And then people are finding ways to exploit that. And then as our laws have changed, then it actually gets to the point where there could be discrimination and other things that occur. And I just wanted to just have my perspective on this, as opposed to some sort of academic talking about this. There you go. Do you discuss AI and some of the complications and maybe ways that we can resolve different... AI seems to be this quite emerging technology that's kind of the wild west a little bit right now. Yeah. Actually, it was kind of fortunate because a lot of my companies that I've invested in, plus the companies that I started, really started investing in AI a number of years ago. And I just kind of felt that when I was writing this book last year, that I could see that the large tech players were making very large investments.
Starting point is 00:09:17 It's clearly exploded. Everything's, you know, chat GPT, DALI, you know, this whole generative stuff. But AI has been around for a long time. And so I really wanted to explain what AI is to people. I wanted to tell people how it could be used, especially like in the medical space, incredibly positive things that are happening in areas such as that. But because humans write AI, bias can see seep in or it can be exploited and you probably are familiar with the concept of deep fakes right and there's a deep fake pornography etc so uh and there's things called dark patterns that trick people and fool people so there's good
Starting point is 00:10:00 but we do need to put guardrails to make sure that we don't become exploited by this technology that basically is getting closer and closer to thinking like humans. And when we have some of that stuff, we probably need to put a few guardrails in. Hi, folks. Chris Voss here with a little station break. Hope you're enjoying the show so far. We'll resume here in a second. I'd like to invite you to come to my coaching, speaking, and training courses website. You can also see our new podcast over there at chrisvossleadershipinstitute.com. Over there,
Starting point is 00:10:36 you can find all the different stuff that we do for speaking engagements, if you'd like to hire me, training courses that we offer offer and coaching for leadership, management, entrepreneur ism, uh, podcasting, corporate stuff, uh, with over 35 years of experience in business and running companies as CEO. And be sure to check out Chris Voss, leadership institute.com. Now back to the show. There you go. I, in fact, I just posted on my Facebook today, and it was from August 7th from Wired Magazine, criminals have created their own chat GPT clones, and they're
Starting point is 00:11:12 running around basically using them as fake scams and ways to collect their own data, probably for illicit uses. And so, yeah, there's a lot of stuff going on. You know, one of the books like yours that offer solutions are really important because you look at, uh, when people go to Congress and talk to these congressmen, they have no clue what's going on. And then they don't have no clue what's going on when people go talk to them in the Congress and, and, you know, about the problems, you know, you see the, it's almost like a carnival, you know, when the, when the Senator house members saying, if I move the phone over here, do you know exactly, you know, weird shit like that, where you're just like, you have no idea what's going on. And you're, you're the guy who's supposed to regulate these people on these
Starting point is 00:11:57 companies. Yeah, no, it was really funny. I remember, I mean, we all know like a Senator, the late Senator Orrin H you know i said well how do you make money mark zuckerberg and you know we we do advertising right but but just recently you know the uh ceo of google was being asked by a congressperson like okay tell me about this iphone thing right you know and i got this problem with this iPhone. And like his response was, Congressman, we don't do iPhone, right? So, yeah. And so, you know, part of it was, was that not only for, for individuals, but providing a road, just try to like, you know, like explain it to your uncle Larry or someone that cares,
Starting point is 00:12:44 et cetera. So I wanted to boil it down, connect the dots. And so an average politician could read this and say, aha, I get it. This is why we need a privacy law. This is why we need to guardrails. This is why we need to address some monopoly issues because the reality is, is that here in Silicon silicon valley we have no fly zones for startups like like people will just like if i come to say you're an investor and say hey i got this great idea for a search engine you just start laughing right because who's going to compete against google right or i got i got a great idea for a mobile operating system people just laugh because you can't compete with it right there. And so we now have major sectors of our economy that are off limits to people such as myself in Silicon Valley. And so having such concentration of power is also of significant concerns and exasperates
Starting point is 00:13:39 the problems of surveillance and AI when you have all the eggs in one basket hey i've got a i've got an idea for silicon valley uh no fly zone sort of stuff uh i i want to buy a uh texting social media company for uh for five times it's a true value uh and then run off the advertisers and then eventually run it into bankruptcy oh shit that. That's already been done. I, I, I, I forgot.
Starting point is 00:14:06 Okay. Yeah. All right. That's a no flies on their Twitter X, whatever the fuck. Here in Silicon Valley, there's a saying if, you know,
Starting point is 00:14:14 how can you make a small fortune start with a large fortune? Right. And so I think he has, he, he has enough fortune to be able to, to write that off. Sure. Yeah.
Starting point is 00:14:24 She probably stick to satellites able to write that off. He should probably stick to satellites and cars versus social media. There you go. You know, I mean, you got to have that write-off pet project, right? I mean, you got to have that deduction for the taxes. So there you go. But it's fun to watch. It's kind of like cops or cheaters. It's fun.
Starting point is 00:14:43 It's more like a slow motion crash. It's more like a car crash. Everyone likes watching that. You just, you just tune into that. I used to do a thing with cops where I would watch cops anytime I was depressed. And I just, I was like, my life is horrible. Everything is wrong. I'm just, I just should give up. And then you watch cops for two hours and you're like, my life is fucking awesome. Look at these people anyway. Uh, Hey, you know, whatever turns the wheels. So, uh, give know, whatever turns the wheels. So give us some different steps. You know, I think a lot of people talk about, they have this attitude that like, well, I'm sure they're big companies and they'll do the right thing. You know, is there a fallacy with people that believe that, you know, money and bigness means a company will do the right thing?
Starting point is 00:15:28 Well, the problem is, is that you're not their customer you're their product so it kind of starts from that right so they're they're very much focused on satisfying who their customers are which is advertisers which they say oh you want to advertise to women between the age of 20 and 30 that are pregnant, right? And they can slice and dice and they can give you that targeted advertising, just like we're so used to. We shop for red shoes and for the next five weeks, every webpage has a red shoe. They're really good at that, right? But the problem is that if you can target someone you know women 20 with young kids etc you can use that to exclude people from ads for rental units from loans etc so they don't see that that's that's where the weaponization can begin and then you know to be
Starting point is 00:16:21 not wanting to get overly political but we're in a post-abortion rights America. And now stuff that you used to be able to do in terms of your Google search, am I pregnant? You know, where's this, that, and the other thing. That's not illegal in some states, right? You know, and so that data can be used against you, right? And so then it becomes like, do we really need to collect is it really safe and sane to collect all this information and i don't in some cases i don't think they set out to be evil but they don't understand the unintended consequences especially the best example of this
Starting point is 00:17:02 is they design products for their fellow tech bros right they don't realize that kids are signing up to this and kids cannot get out of rabbit holes when all those ads and all those postings kind of push them in a certain direction towards drugs or alcohol or or now we have a lot of conspiracy theories etc because that's the stuff that people like and it gets amplified, et cetera. And so, yeah, it was cool that the algorithmic amplification was for the cat video, right? But now it's for other things and that's just not healthy for society. There you go. And the dopamine hits that social media has delivered, the algorithms have delivered, uh, sheer manipulation. I mean,
Starting point is 00:17:46 on one hand, I mean, it's entertaining, but I've been one of those people, especially during COVID when we were in lockdown, where I would, you know, start,
Starting point is 00:17:53 I go to bed about one or two or something and I start flipping through TikTok and then suddenly the sun's up and you're like, what the fuck just happened? Yeah. And, uh, so, you know,
Starting point is 00:18:02 it's, it's one of those things. And where you and I probably grew up in a world where we didn't have so you know it's it's one of those things and where you and i probably grew up in a world where we didn't have you know we probably were spin dialing phones um we didn't have these you know constant beep beep beep beep notification notification you know we had this semblance of human interaction looking at each other face to face talking meeting you know there was all of that but now these kids they're being raised on uh on a thing you know we've had people on the show that are um scientists and they said one of
Starting point is 00:18:31 the biggest problems that we have is we're really not genetically designed or we haven't adapted to looking at 2d screens we're used to looking at each other's faces and seeing you know the blink the the movement and and there's and also being close to each other and feeling each other's faces and seeing you know the blink the the movement and and there's and also being close to each other and feeling each other's energy or vibe or seeing it touching each other hold shaking hands that's why politicians shake hands because babies there's something to the touch that we have and so all these people are you know it it kind of uh i think desensitizes people. Look, here's the deal. The motivation of their business models are to serve ads and to better serve ads, it's to collect as much information and to keep you on as long as possible. So it becomes a hamster wheel, right? And you're right that people say
Starting point is 00:19:20 TikTok knows me better than I know myself, right? And so each video that you're watching, TikTok is actually collecting your swipes, your likes, et cetera, and they're feeding that into the algorithm. And so we as humans, it's very hard to adapt because, for example, if we see a flashing red light, that's danger and we're used to that. But now our phone is like lighting up like a Christmas tree with all these notifications, right? And then the problem is, is that kids, because they're, they're the, oh boy, you know, kids don't want to miss, you know, fear of missing out. Right. And so if they get that notification at 10 o'clock at night, of course they need to look at it, right? And so, you know, we, I think another fundamental thing that I bring up in the book, I have a whole chapter on this, is that we've, in the end, have designed, we have, in the physical world, we've designed things with kids in
Starting point is 00:20:15 mind, right? That's why we have, you know, kids' car seats, right? That's why lead paint, that's why, you know, look at the jungle gyms from when we were a kid all concrete and you know everyone had a broken arm uh by the time they graduated but now they're all wood chips and things but we don't do that in the online world right it's the same user interface for for for kids and they're the most susceptible and you know they give most into peer pressure etc so i think i'm just trying to raise awareness but i also trying to provide solutions like hey here's things that we can do um and it turns out tech really wants to keep it combo it's so complex and all that no actually
Starting point is 00:20:56 there's some really simple solutions there you go yeah you know and even worse what you mentioned with the hamster wheel i need to tape that to the top of my phone. Get off the hamster wheel, dummy. Maybe tattoo it to my thumb so that when I'm sitting there on the phone. Here's what's worse. They've trained me not only to do the hamster wheel, but to run the hamster wheel myself. So with TikTok, I've learned that what I watch for any sort of length of time beyond a few seconds they're going to keep feeding me more of that so if i see something stupid annoying or ignorant because i don't like everything i'm pretty particular uh that comes from my feed i know hey you need to
Starting point is 00:21:36 swipe that really quick because otherwise tiktok's gonna feed idiot more of that and so i've learned to train the algorithm onto myself. Is that like not like... You're hacking the algorithm. I mean, you're basically, you're trying to hack the algorithm. But it gets worse from a society perspective is that everyone is trying to hack the... Like take search, for example. The most, like everyone now tells you just, you got to write your web page or your blog
Starting point is 00:22:05 for seo perspectives because you want a higher rank and so everything is now becoming more and more artificial and then in terms of to feed the algorithm so the content gets there so clickbait floats to the top then google amazon are saying well if you really want to be on the top you have to pay us and so if you compare the google search result page from 10 or 15 years ago to now before it was so simple with a few little ads in the corner now it's just like ads all over the place okay so so we have a fundamental problem in which the tech companies um we're adjusting our lives we're adjusting our content to the tech companies and now we're in a situation is that people are using ai to create fake books in the names of authors on amazon right oh yeah people are doing like
Starting point is 00:23:02 fake travel books it was in the new y Times yesterday. I just saw the travel one. Yeah. Are they putting those in the name of the big thing? Sometimes in the name or a like name, a similar name, like best guidebook to France. And they use AI to scrape this together. It's all print on demand,'s it's being faked and so we have this downward spiral of inauthenticity and then ai can further exasperate it and again there needs to be more pressure put on the tech companies to to actually provide more filtering especially uh of fraudulent uh posts and uh and
Starting point is 00:23:42 and products you know i realized when i was discussing with telling you how TikTok has actually trained me to game myself, I, it gave me this, uh, this, this image in my head that started with, wow, I'm, I'm a, I'm a person who lives in a prison and, uh, I think it's special that I get to pick my jail cell and I go to my jail cell every day and then lock myself in it. Like I don't even have to have jailers. I'm just doing it to myself. And then when I want to, I go out of my TikTok jail cell and I go over to my Twitter jail cell. And I'm gaming all these. So I'm designing my jail cell so they look pretty and they're very comfortable. But really, I'm living in what you're talking about your book where you know i've got certain civil rights i need to be concerned about i've got my privacy and different
Starting point is 00:24:29 things but i'm literally wandering out a giant technological uh tube in the sky of jail cells where i'm going let's go to facebook jail cell today and sit in here for a while and figure out how we can make this look a little bit better but but I'm going to lock myself in here and, and I'm actually going to be the monkey who runs the jail cell, but puts myself in a cell. Yeah. That's insane. I think you're right. But the key thing is the key solution is,
Starting point is 00:24:55 is that they keep you in the jail cell by personalizing things and personalization is based on data. So the key thing is, and I talk about this in the book, is you need to reduce your digital exhaust, your data footprint. And there's ways that you can actually do that, that dramatically cut back by just flipping a few switches. Like on the iPhone, they introduced this app tracking transparency, ATT, not to be confused with AT&T, where you block third-party trackers. When Apple announced that, the next earnings call that Meta had, the owner of Facebook said, we're going to be short
Starting point is 00:25:31 $10 billion in revenue because of the data flow being cut off. Now, on Google, which is basically an advertising company on the Android devices, they don't have something similar to att to block third-party trackers but you can download a product called duck duck go it's an alternative browser you don't have to use it for browser but it also blocks the tracking that happens and so all of a sudden your data footprint of all this data going to the big tech companies and data brokers gets dramatically reduced and then another thing you can do is on your PC, there's a free plugin from a privacy nonprofit called EFF called Privacy Badger.
Starting point is 00:26:13 It blocks all the third-party cookies and tracking. Once you put those in place, oh my gosh, you actually no longer start seeing the red dress or the red shoes or the toilet that you shopped for three months ago you know showing up on every website etc and it dramatically reduces your footprint which means the personalization stops to keep you on the hamster wheel there you go uh you know uh one of the other things too is you and i grew up in an era where the stuff we did as kids uh that may have been a little nefarious.
Starting point is 00:26:47 You know, I think I stole a candy bar when I was five and my mom made me go back and take it back. And I was crying. And of course, I was like five or seven or something. I remember being just a child and I took a candy bar because I was, I don't know. I don't think I even completed to understand what the whole concept of money and stuff was. And, uh, but she took me back and taught me a good lesson. I learned, but you know, I mean, stuff like that isn't on the internet. And I've seen these employers now that have these tools where they can scrape like all of your
Starting point is 00:27:20 social media accounts. They can do a deep dive of seemingly every place you've been i've seen some pretty interesting ones and then hold it against you and and be like oh well you know in 1995 you i don't know you uh i don't know told someone to f off or something you know uh we saw a lot of that with the me too era where you know stuff that people were doing i mean there's there was some very egregious things that needed to be addressed there and and some really horror monsters but i mean you had people you're gonna like at 95 you use the n-word um do we how do we how do we square that how do we reconcile that is is is the person in 1995 still the same person today etc etc so um you know how do we resolve some of that how do we resolve some of that? How do we square
Starting point is 00:28:05 some of that paradigm where, you know, it's, I don't know about holding it against people's rights for jobs. It seems like there's some issues there. If you're, you know, somebody's, you know, I don't know, they don't like the flavor of ice cream that you post on your Facebook or something. I don't know. Well, that's a big issue because you're're you're right that uh an employer i mean yeah there's one thing about yeah you don't want a a crazy racist you know joining your company right um etc right you know we're a racist running your hr department that's making yeah or are you you know there certainly there's value right there but if you're're posting photos on social that, hey, great news, I'm pregnant, right? An employer looks at that and they can say, I don't know if I necessarily want companies also don't think about the the unintended consequences and so their motivation is to build what they call a network effect which is get more
Starting point is 00:29:11 users and then once more users on it it attracts other people as well and so their motivation is by default to have everyone's social profile set to public right which is terrible when a child signs up because you have creepoids you know purposely looking for public profiles of uh kids that are clearly underage um and then also you have the scraping going on where background check companies will add that to their arsenal of, we'll also scan social, right? And they'll dig up that photo when you got completely drunk and you threw up and they may say, I don't want this alcoholic or whatever, but that was the one time in college you drank too much, right? And so my recommendation further to reduce the data footprint is is that you actually have you cannot put your social stuff as public right that's with the exception of something like more
Starting point is 00:30:12 of a professional setting like linkedin um and uh and for god's sakes please if you send a resume make sure it syncs with whatever you have in LinkedIn. I've seen so many cases of people like having the resume and then something separate in LinkedIn as well. And people saying, wait a minute right there. So you just have to become more conscious, but it's about reducing the data footprint. And then just, if you do want to have a public persona, make it more of a professional one.
Starting point is 00:30:43 There you go. Whatever you lied about on your resume, make sure you lie about on linkedin uh i used to be i used to be really good at finding out bullshit in people's resumes uh and a lot of times there are dates there'll be dates and and different things uh let me ask you about this because i've dated all my life. Um, and I remember the first time that I had a girlfriend send me a sexting pick. And I remember thinking in my head, cause I own lots of companies and I don't know how that's pertinent, but I remember my head thinking this, this,
Starting point is 00:31:20 this is going to go all wrong. If people start doing this, like, because you've sent me something that's public if if i if i ever if we ever broke up and i want to be an asshole you know uh i could do this if i sent you something you could do you know i i there's i i don't want my body parts on the internet but i'm kind of old world you know like i said uh before but the thing i see in dating now because i I'm single, I've dated all
Starting point is 00:31:46 my life is this trend, especially with young people where they're sending body parts to each other, uh, over Snapchat. Like it goes out of style. In fact, it's really big with a lot of the guys that hit on young ladies that go, Hey, uh, send me your Snapchat. And of course the, the images start coming. And while these are a lot of young people, I've started talking about this for a while now, where I'm like, we're going to have a generation of people that are so used to sexting each other, and these images are stored somewhere on the internet, if not on phones and people's Google drives and my photos go up to Google Google photos for my dogs and stuff. Um, and we're going to have, you know, somebody running for house speaker of Congress or politician or president of the United States, man or woman.
Starting point is 00:32:33 And somebody is going to go, Hey, I got a really cool picture of, uh, something very private about them that they sent me in 19, 2001. And, you know, these people are gonna be 40 years old. They're going to have a wife and kids or a husband and kids. And, uh, they're trying to maybe do some serious, you know, CEO of IBMs, you know, whatever. And they're going to have these naked images floating around. And, and I'm astounded by how much scale there is to it. And so I don't know if that fits into your idea of privacy and civil rights and stuff.
Starting point is 00:33:04 It does. I mean, actually, you know, when you and I were a kid, we took a technology class and that was called typing. Right. And that was actually my favorite class in high school. And I was like, I'm going to like out type everyone else. I forget what my, you know, like this. I mean, that was my my you know my my competitive juices went out but i really do think we actually need to have a you know have kids take a course
Starting point is 00:33:34 in high school about you know modern day technology and explain to them that you know how these companies make money how you can be scammed. I mean, one of the big scams going around is interns that join a company over college. And all of a sudden they're getting texts from the CEOs of the company and say, Hey, I'm in a lunch meeting. Can you run out and get me an Apple gift card and then send it to me and I'll pay you back. Cause I'm going to, I'm busy in the meeting. And of course you're like the 19-year-old intern. You're like, oh, you drop everything, you run, you spend a couple hundred dollars and all of a sudden you've been scammed out of $500,000 of gift cards.
Starting point is 00:34:20 And so there's the whole fraud that's going on, the phishing, the fraud that's going on. But then there's also that people just don't understand the consequences of oversharing information and how it can resurface. Because once it's on the Internet, it's stored somewhere. There's no, I mean, even like you Snapchat, someone's got a phone and click, right? And they're taking a photo of your phone. And so, you know, people need to be aware, especially at the younger ages, that just like we were taught, you know, shop class, we're taught typing. We need a tech, we need a, you know, tech safety hygiene course. There you go.
Starting point is 00:35:09 The OnlyFans, the women of Dabble and the OnlyFans, and most OnlyFans don't make a lot of money, actually, if you do the stats. There's a lot of people complaining now. They're like, oh, I didn't understand that if I started that, it's now on the internet forever. And it's impacting my ability to get jobs. It's impacting... People, well, or people, you do it once, people record it, and then they offer it for free. And so you may have gotten that $20, $25, et cetera, but now it's out there, et cetera. And then the thing is, or they'll record it, and then they'll use AI to slightly change the voice, the image it's etc so look i mean we do need to better educate people about you know how data is collected and can be used against people and it starts with identity theft it starts with potential you know digital redlining and discrimination like to your point like people are not getting jobs um or not getting
Starting point is 00:36:06 a loan because of something that may have been posted or some data that uh implies that uh they can't pay it etc and most of the data that especially these data brokers have 50 of the data is incorrect but but decisions are being made based on it yeah i've even you know i've had friends and myself we've googled ourselves in in uh ai and you're like you have the wrong guy and you have the wrong data and you've mixed this guy with that guy and like some and and and you're just like wow i hope this gets better but they call it hallucinating it's called there's a term in silicon valley it's called ai hallucination and the and the funny thing is like the term sounds like it's actually a cool term like oh
Starting point is 00:36:50 he's hallucinating like he's on drug like it's a good trip he's having a good trip but but you're but actually it's not good right when you actually uh put something in chat gpt and like tell me more about this individual and it gets the information wrong. And AI can further be used not only for Denny theft, but further push out disinformation and misinformation in society. So we need personal guardrails, but we also need policy guardrails. And the scary thing is that in the United States, unlike all the other major countries, we do not have a national online privacy law. Okay. Now I'm in, I'm in California. I, I worked on the campaign to pass the California privacy rights act in 2020. So we're one of now
Starting point is 00:37:39 12 States, but the other 38 States in the Us do not have a comprehensive privacy law so i have the right to tell a business to delete my data but if you're in alabama a bit and you say i want you to delete my data that the business can actually say no i don't have to delete your data and it's like we need a basic online bill of rights, you know, for the digital age. And it's been frustrating that we don't have that yet, you know, at the federal level. Yeah. It's frustrating that the, you know, these big companies, they can just lean on the politicians and of course fund them and buy all this stuff. You know, we've had a lot of discussion that was Citizens United and SCOTUS and different things. So any suggestions you want to tease out, do you think we need to do to relegate stuff?
Starting point is 00:38:30 I mean, obviously, democracy is a big thing in the news right now. We're dealing with, you know, having our elections being leaned on in 2016 by, oh, I can't remember the name of the companies, but, you know, Brexit and us went through that with Facebook and stuff like that. And, of course, social media was used for January 6th. And so that's being litigated right now. But definitely, you know, protecting our voting, protecting our elections, protecting truth. That seems to be the hardest thing to do anymore. Well, I think the key thing is, actually, I do have one proposal.
Starting point is 00:39:11 And let me run it by you, which is, and this actually comes up even with like at colleges and high schools. A family friend had a daughter and she was told to in high school, and she was told to write an essay, like how to get into a good college. And she wrote a very funny essay, which is I'm going to move to Montana. I'm going to take up the bassoon, like trying to play the whole demographics thing, right? I mean, that was a funny, you're laughing. It was funny. You know what the teacher said? Oh, this is too funny. It was written by AI. And so she was accused of that, right? And then she had, then it got escalated. And so, but now what about political ads or things that that are happening um and so i fundamentally believe that we need the ability for us to take a document an image a url
Starting point is 00:39:54 and be able to ask the large uh generative ai providers did you generate this yes or no like take a url and go to you know open ai slash verify put the url in hit go and it has to come back and said no i didn't generate it or yes i generated this on august 3rd at 2 38 p.m that so what we need is with ai we need transparency right is it real or is it human? I also feel like when we're on an online chat or even talking to someone on the phone, sometimes you get the feeling like when you're talking to someone on the phone, I don't know if this is a human or not, right?
Starting point is 00:40:35 I do that with my mom. So we need like a star 411 or like a prompt that we can type in in the chat and say, you know, some sort of signal that says, is this real or not? And have the ability. This is the worst when you try to call someone and you get the whole recorded messages. There should be by default like some sort of, you know, star six or something like that or star 411 that you that they force them to actually have you talk to a human yeah right and so we're getting to the point where ai is is and it's going to get worse because of the generative aspect that they can actually on the fly
Starting point is 00:41:17 kind of try to solve your tech support problem etc but at some point, got to say, bang, I need to talk to a real person right here because I'm not getting anywhere. And same thing with text and images. We need the ability to know is this photo of XYZ politicians slipping on a banana, is it real or was it created by AI? There you go. And we're already already seeing that politician defakes and different things you know uh some of the things you've you've covered i actually had a friend who uh and he was just trying to be funny and interesting because i'm pretty funny on facebook i think i'm funny let's put it that way i'm a megalomaniac narcissist i guess i think he's funny and so i posted something that was i don't know whatever I was bitching about or rapping about or whatever. I usually try and do snarky stuff on our stuff I see as ironic or silly in human nature.
Starting point is 00:42:11 And so it was a private post. I've had to, I've had to start privatizing all my Facebook cause I get all these, uh, AI, I guess they're AI or they're, they're these bots that are clearly run out of probably Africa and Nigeria that, you know, they look like some sort of Asian young lady. And, uh, you know, there's, they're clearly, that's not who is writing me and they're always hijacking the comments. So I've, I've had to move to privacy. Um, and, uh, so I had a friend actually take one of my things that I said, which is kind of critical and i didn't want to put out in the public sphere and he put it into uh chat gpt the post was marked private he put he put the thing into chat gpt and told it to write kind of a funny spin on what chris foss was defining here and what chris foss is about and i'm like and when he reposted my comments i'm like are you fucking kidding me did you just take
Starting point is 00:43:07 something that's private to me and put it into chat gpt and now that shit is buying what's real and so uh you know we we have that probably coming more in the future where you know politicians will be seen you know uh maybe having heart attacks or faking health problems or whatever the case may be. It's probably going to get really weird by 2024 presidential elections. Absolutely. I mean, we do need kind of a bill of rights, and it starts with having nutrition labels, right? I mean, if you kind of look at the evolution of like food safety or car safety, this is all consumer protection. It first started with a label that says, you know, 80% of this was created by AI.
Starting point is 00:43:53 And then it gets into transparency. Is this AI or not? With my proposal to be able to ask the AI, with also the the they can't keep the information so if i have a text document that i want to say did you generate or not they can't turn around and start using it themselves so that's that's part of the proposal that i put forth in the book but but the whole thing is is that it's about guardrails and you know the same evolution that happened with like car safety like you know if you have a kid a young baby you wouldn't probably just have them in the lap you probably would have the whole baby
Starting point is 00:44:30 seat etc uh you know wipers airbags etc and we we need wipers airbags uh kids car seats um you know as with with ai uh and this technology as well and just of course yeah the automobile industry 50 years ago they didn't want to put a wiper that worked on their seat belt and all that stuff because that would add 30 cents of to the production or whatever but it's look we're not trying to i'm not proposing that we make the tech industry like the airline industry, but we need to make it more like the car industry, where there's rules to the road, there's some basic safety with the actual equipment, etc. And that's what we need to do, is we need to empower both individuals as well as tell policymakers that this AI wave is coming, and gotta do something because boy it's going to be really bad uh if we don't jump in here there you go uh is it section 230 the law that gives uh social media networks kind of some real leeway and we had some challenges that with the uh 2016
Starting point is 00:45:39 election and the uh january 6th and other different things. What do you think about that law? Does that law need to be updated or changed? Yeah. So the law is called the CDA, right? And Communications Decency Act, Section 230. And it basically is like the Magna Carta of the internet, which is that if you're a platform provider, you do not have to worry about what is actually posted.
Starting point is 00:46:07 So you're not technically like a publisher. Like if the New York Times posted something, or Fox News, like what recently happened, they could be sued for defamation. But basically, but on the Facebook platform, Google, YouTube, if the same type of things that were posted about Dominion and all that stuff was there, Google's like, hey, we're not a publisher, section 230 protects us, etc. Right. And so I understand. And so it hasn't been whacked away. And it is allowed these large tech platforms not to have to worry about content curation but we're now in a situation in which algorithms are amplifying the content and so i think the fundamental issue is is that when you have algorithms amplifying content does that not actually start leading them into becoming an actual publisher because they're putting their thumb on the scale of what content gets seen or not um and so that's the interesting debate that we're having i do think that there can be some
Starting point is 00:47:19 reforms to section 230 again which is basically the consumers need to have rights to if they see something on the platform they have to they should have the right to object and have the tech company respond in a timely manner as opposed to basically getting poop emojis coming back and saying we don't care right yeah we've seen the viral you know it they always claim well there's no way we can catch these facebook live videos and you know as soon as something starts going crazy but i don't know maybe they're getting better at but maybe it seems like you know tiktok knows within a few different adjustments that i make that it will change the algorithm for me i can manipulate the algorithm yeah and just by what i watch and if i watch about five videos of like i don't know like
Starting point is 00:48:05 i started watching some of the maui hawaii fires last night because i you know i was really heartfelt about the crisis that was going on there with everything burning and of course naturally my feed changed it's it's all you know hawaii stuff and things burning and uh so i was able to basically tune in if you will uh and what and what's interesting to me is, you know, we've seen these, uh, these terrorist shooters that have gone onto Facebook and shot up, uh, churches and different things streaming it live. And they're like, well, there's no way we can catch that. And you're like, I can change the honor of the visit as, uh, on my own recently.
Starting point is 00:48:42 Uh, just today they were talking today about, uh talking today about Biden, President Biden was coming to Utah. There was somebody online who started threatening to, you know, possibly go after him with weapons and do harm. And, of course, the social media people, I guess, picked up on it, alerted the FBI and all that good stuff. So they have ways of identifying this stuff it's just whether they really care enough or want to have that semblance enough to go uh you know is more harm bad than good i guess look we should have freedom of speech but there shouldn't be freedom of amplification and and and reach um and the the funny thing is is that within 10 seconds of you looking for a red shoe all of a sudden every website has the red shoe so if they're smart enough to do within 10
Starting point is 00:49:34 seconds figure that out i i agree with you they they just need there need to be needs to be more pushing um and and and one way you can do it is you can require some of the information to be made available to researchers for analysis that can feed back and they've been blocking that um and then the other thing is is that there should be kind of a customer complaint um where they do have to respond to that. And you could also limit the complaints to say, these are the hundred organizations that have been vetted, and these are the ones that we'll listen to, and they can do the filtering as well.
Starting point is 00:50:12 So there are solutions to this, right? Look, I don't want to, you know, the internet, people can spout out crazy ideas, et cetera, but those, you know, just like someone can walk around and swing a stick as they walk down the street, but it ends at the tip of my nose. Yeah.
Starting point is 00:50:30 Right. And so I think that there needs to be a mechanism for empowering consumers to be able to flag this and maybe to one of these 100 organizations that analyzes it and then officially gives the you know a feedback to to meta or google or whomever etc so there does need
Starting point is 00:50:52 to be amendments uh to the cda which has that section 230 that gives them kind of carte blanche to not be considered a publisher there you go uh you You know, I saw, like, I don't know where the, these are, they're fairly timely, fairly old. Not two, but maybe three or four years ago, somebody, some very big name news agencies had done the digging and they'd found that they did some experimenting too. And I think they work with some agencies that deal with child trafficking but they found that as as soon as young people join uh instagram that are underage uh they will start getting immediately start getting swamped by predators and and everything else and it's like it's like i think they timed it it was like within 15 minutes to a half an hour yeah and i talk about that the book containing big tech in the chapter on kids safety and it goes to the point where instagram
Starting point is 00:51:51 wants network effects and so by default the switch is set to public for new accounts right and so there and california actually passed a law last year called the Age-Appropriate Design Code, AADC, which is based on a UK regulation that actually mandates that for kids under 18 that the settings should, by default, be set to private versus public. Here's another example you should be you should these tech companies should be banned from sending notifications after 10 p.m at night for maybe kids up to 16 and maybe up to midnight for kids under 18 they can do that right because there's nothing worse like uh you know i mean you and i probably deal with this or like our phone is going off with notifications at two o'clock three o'clock in the morning but we But we're used to just like, oh, just forget about it. But a kid, it's like, oh my God, what if a friend posted and I don't respond with a smiley face within 30 seconds?
Starting point is 00:52:54 Am I going to lose the friendship? So there are ways that you can design tech products with kids in mind. Much like we've made that evolution with kids online safety in the physical world, we should do that as well. And those are a couple of examples that you did a great job of highlighting. It's like you need to make these accounts private by default. And it's up to the tech companies to do it. Yeah. We've had people on the show that talk about ways to protect your kids where you make sure that they don't post anything that shows a background that might give the location uh you know even x it was x fill tags
Starting point is 00:53:31 on photos that tell locations yes you know make sure that you know you have that blocked uh i've we've had some people on the show that talked about different ways to protect your children and it's interesting to me what i'm leaning up to is you know as as human beings we we assess and evaluate the wrong things you know we we think about you know like a terrorist is going to blow us up well the odds of that you know the odds sequence of just about anything we over amplify uh and we worry about the wrong things you know like like maybe um we we need to but what i'm trying to get to is we need to demand more of our politicians and demand more of them to look into this more of them to design rights
Starting point is 00:54:10 you know and we get led around by the nose by a lot of these you know buzz paul political things and maybe regulating some of the stuff that we've talked about today especially for kids etc etc protecting you know maybe helping regulate some of the dopamine hits that are, because I know people that are addicted to the dopamine hit and, and the likes and the attention and the validation they get. A lot of women are wrapped up in that. It's, it's, you know, they're just constantly in their phones and that attention, that dopamine hit their people are literally addicted to it.
Starting point is 00:54:44 They can't put their phones down and they can't live without them um but we need to demand more of our politicians you know maybe maybe having somebody from the lgbt community read to your kids at a school isn't quite as dangerous as or even dangerous as you know your kid being on social media and you don't know who's talking to him in their dms well look i mean the there was a lot of concern about tiktok because in the end the parent company is china based and like oh my god all the the people in china you know these this chinese government could get access to your your your swipes and your likes and all that stuff but but the funny thing is the same data collection is being done by meta google snapchat etc right and there's nothing stopping a third-party entity from using a credit card to advertise and push you in certain
Starting point is 00:55:34 directions um and so it's it's fundamentally you know we need data privacy rights and we as consumers need to reduce the the digital exhaust that we give out um and that's just so critical in today's digital age and yeah you're absolutely right just the the level of addictiveness of of technology uh it's just it's just out of control and i just i worry most about kids that don't have the natural, that, you know, if we see a rabbit hole or we see, as you said, some people that are from,
Starting point is 00:56:10 you know, that are posting things that, that you, you can probably immediately tell that that's not a real person. But, you know, a 13 year old kid that gets a high from someone, they don't know who it is. Right.
Starting point is 00:56:23 They don't, they don't have the guardrails, and that's scary. I remember like 10 or 15 years ago, my sister was complaining about her computer, and I was like, what's going on? Yeah, it's not working well. Can you help me with that? So I went over to her place, and she had like 20 of those button bars. Remember those button bars were like the thing?
Starting point is 00:56:42 And she had 20 of them actively working that covered half the screen. And I'm like, how do you use this thing? But she, you know, she signed up for everything. Yeah. No, it goes back. I think we do need just like, you know, you and I went to shop class to learn, you know, put the glasses on, learn how to cut a piece of wood. And of course, typewriting, my favorite course. Boom, boom, boom, boom, boom.
Starting point is 00:57:03 The only thing I learned in the school that had any value. Yeah. Well, typing, the typewriting course my favorite course. Boom, boom, boom, boom, boom. The only thing I learned in school that had any value. Yeah. Well, typing, the typewriting course was the best for me. But yeah, no, I think we do need to raise awareness of these issues and start it young. And I think also we need to have a class that teaches what this does to our brain. Like we've had neurologists on that have talked about this and the dopamine hit the the fear of missing out and you know the emotions it creates you know there's a lot of young girls that they've they've uh done articles on they've talked about the depression that they feel because uh you know you're able to go well i don't look like this girl and that girl
Starting point is 00:57:41 and you know and then the in fighting the competition, you know, there's, there's people that commit suicide over, you know, some sort of attack, or even there were suicides of young boys who, uh, had gotten fished by predators and then told that they were going to exploit their photos. And, you know, they were, these kids were, I think under 10 or teenagers and they off themselves because over the shame. Anyway, I highly recommend people look at your book and understand what's going on, read it, and all that good stuff. Give me your.com, Tom, so people can find out more in the interest. It's tomkemp.ai, T-O-M-K-E-M-P.A-I. And you can just click there to order the book. It's going to be out on August 22nd, but it's fully available for pre-order.
Starting point is 00:58:24 It's an e-book audio book and then good old paper there you go demand more from everybody our politicians you know uh help with the economy democracy and civil rights these are all important things and and understand how how bad you're being spied on order the book wherever fine books are sold it's available august 22nd 2023 containing big tech how to protect our civil rights economy and democracy something everyone should really be thinking about because ai is kind of wow it just opened the wild west and and uh you know every one of these technologies i remember when i wanted twitter came out and we're like hey we can talk to everyone in the world and kumbaya and it's John Lennon's Imagine,
Starting point is 00:59:06 which song I love, and we can all be one big happy family and, you know, between governments who went, what, you opened a Pandora's box? Let's see what we can do with that thing. Some evil. And then Mark Zuckerberg. Oh, I threw some shit there.
Starting point is 00:59:21 So anyway, order up the book. We're fine books are sold. Thanks for watching. It's for tuning in. Go to goodreads.com, for it says Christmas, order of the book, wherever fine books are sold. Thanks for honest for tuning in. Go to good reads.com. For instance, Chris was LinkedIn.com. For instance,
Starting point is 00:59:28 Chris was all those places we were on the internet. Thanks for tuning in. Be good to each other. Stay safe and we'll see you next time. Great.
