3 Takeaways - Former FCC Chair Tom Wheeler: Our Loss of Privacy Is Worse Than You Think, No Matter What You Think (#105)

Episode Date: August 9, 2022

Your entire life is an open book of information collected by tech companies. According to Tom Wheeler, former head of the Federal Communications Commission, the privacy problem is shockingly large, getting bigger, and has frightening consequences. What, if anything, can be done? Listen and find out.

Transcript
Starting point is 00:00:00 Welcome to the Three Takeaways podcast, which features short, memorable conversations with the world's best thinkers, business leaders, writers, politicians, scientists, and other newsmakers. Each episode ends with the three key takeaways that person has learned over their lives and their careers. And now your host and board member of schools at Harvard, Princeton, and Columbia, Lynn Thoman. Hi, everyone. It's Lynn Thoman. Welcome to another episode. Today, I'm excited to be with Tom Wheeler. Tom is the former chair of the FCC, the Federal Communications Commission, and I'm excited to learn about privacy and what information is out there on each of us. I think the amount of information is actually very surprising.
Starting point is 00:00:46 Welcome, Tom, and thanks so much for our conversation today. Hello, Lynn, it's great to be with you. Great to be with you as well, Tom. Let's start with privacy. What privacy do we have now? Increasingly less, but there's a broader question here, Lynn, and that's how do you define privacy? I mean, one of the challenges in the whole issue that we call privacy is what do we mean by it?
Starting point is 00:01:15 You could mean the gathering of my personal information. You could mean the securing of my information in data files so that others can't get it. Or you could mean the use of that otherwise obtained data for commercial purposes. And the difficulty in dealing with privacy is just that fact that the term is so broad. And what we really need to do is parse it out and say, really, what is it that we're concerned about? And my concern, at least, is the hoovering up of personal information and then turning around and treating it like a corporate asset and selling access or selling the use of that information. So, Tom, what is known about each one of us? Oh, golly, just about everything
Starting point is 00:02:14 and some of the things that you don't know. I mean, there's a great line, you know, it was Larry Page, the co-founder of Google, who said about Google, if we did have a category, it would be personal information. Your whole life will be searchable. He said that in 2001. But I think he's delivered on that. Can you give some examples? Sure. We all know the obvious ones. When we go to a website or click on a link or something like this, a record is made. Add to that the use of our mobile phones and how it is a tracking device, and you can correlate not only where we've been, but you can make a pretty good guess about where
Starting point is 00:03:05 we're going. Tom, if somebody goes to a doctor or a hospital, who can see or use their test results, and can they sell them? Well, that is HIPAA. It's interesting. Congress has acted over time to have protections put in place for privacy, but has yet to put in place protections for privacy in the digital era. I mean, early on, there was concern about just exactly what you're talking about, which is medical records. And as you know, anytime you go to a doctor, you have to sign over and say, this is who can get my information and this sort of thing. That's because of an act of Congress. It's interesting that Congress passed a law after Robert Bork's confirmation hearings for the Supreme Court featured information about his video rentals and his library withdrawals, saying that you can't have video information,
information about what you're watching from a video rental store (not from Netflix, but from an actual video rental store). And yet Congress has yet to step up to what our privacy rights are with regard to the new digital environment and the platforms that have built their business model on the capturing of your personal information and monetizing it. So how is data being collected? How and when is data being collected on us? Anytime you turn on your computer, anytime you hit a keystroke, anytime you walk out the door with your cell phone in your pocket, anytime you use a credit card, and increasingly, anytime you come under a video camera. What data do the telephone companies collect on us? What do they know about us? Well, that's a really good question, Lynn. And it is a benchmark, I think, for how we ought to think about our privacy and what information can be collected about us on the Internet.
Starting point is 00:05:12 For decades, there has been a rule of the Federal Communications Commission dealing with what's called consumer proprietary network information. And that means that absent a warrant or some other kind of lawful tool, a telephone company has to treat who you called, the number, content, and all of the information that you generate when you use a telephone network as confidential information that they cannot release without your permission or some kind of a lawful order. The interesting thing is that that doesn't apply when those same companies are acting in their capacity as an internet service provider. When I was chairman of the Federal Communications Commission, we passed a rule that said, yes, it should, that you have a right to expect that the kind of privacy that you're getting on the telephone network, you want to be getting from your network service
Starting point is 00:06:14 provider on the internet. Tom, there's so much information about each of us from our mobile devices, from our internet. The data's out there on who we spend time with, where we go, what we purchase. There's financial information. Every time I look at cookies on my computer or on a website, often there are a hundred plus companies that are tracking every website I go to. How detailed is the profile that companies are building on us and that they are buying and selling without our knowledge? Well, you just mentioned cookies. Cookies are so yesterday. Google is proposing to eliminate cookies because they have so much data about each of us already that they really don't need to know from cookies.
Starting point is 00:07:08 They can infer what they need to know. There was a great line that Eric Schmidt had when he was chairman of Google, something to the effect that you don't even have to touch your keys for us to know what you're going to do. And we can probably predict really what you're about to do. And the answer to your question, then, is the privacy cat is out of the bag, the horse is out of the barn, whatever the metaphor is. And the amount of data that is known about us today is beyond comprehension. Therefore, the policy challenge becomes, what are we going to do about it? You can't put the horse back in the barn. But we do need to come up with some rules that
Starting point is 00:07:53 recognize, first of all, it's my information. I ought to have sovereignty over that information. And we ought to establish expectations for the companies that collect it as to what they're going to do, that privacy is going to be a forethought, not an afterthought, that you're going to design products in such a way that the only data that they really need to collect is the data that is necessary to provide that particular function, which is not what happens now, which is I've got a connection to Lynn. How much can I siphon out even beyond what is needed to provide the service that she's looking for? And we need to move from a reality in which the privacy, quote, protections that all of these platforms say they have really aren't protections, they're permissions.
Starting point is 00:08:52 Absolutely agreed. Tom, what are three or four things that these companies know about people that people would not think that they know? There's the obvious stuff about who are you last talking to, but also then who are those people talking to? And what does that mean? And if you were talking to Lynn and Lynn was talking to, and by talking, I mean, not just telephone talking, but I mean, you're online with her and have some kind of back and forth. If you're talking to Lynn and Lynn was turning around and she had talked to somebody else who you don't even know, what are the inferences that can be drawn from the data about that person and the data to be applied to you? Simply because we
Starting point is 00:09:36 both had Lynn as an intermediary. Do they know what you had for breakfast this morning? Probably, but it's bigger than that. It is, how can I take what is known about you, what is known about your practices and who you relate to and who you interact with, and then who they interact with and what they do to infer something about you that may or may not be true, but that I can turn around and say, this is probably somebody who is interested in or believes in or leans this way politically or whatever the case may be, and then turn around and sell that inference. So they can tell essentially
Starting point is 00:10:14 what people's political beliefs are by who they spend time with, by what they read on the internet. That's the easy lifting. That's the easy lifting. And what is the hard lifting? What kind of data, for example, would a Facebook collect? The metaverse, which Mark Zuckerberg, the founder and CEO of Facebook, has been a great
Starting point is 00:10:36 champion of, is going to make the kind of privacy intrusions that we live with today seem like child's play. The metaverse in a nutshell is a 3D online interconnected world. Whereas today, you and I go to the internet to get something back. In the metaverse, you literally use the same network to go into a virtual world in which you participate. And it's 3D, and you're wearing glasses or goggles to give you to patent goggles that will read your eye movements and draw conclusions from them about what you're thinking. It will read your facial expression. studies in which they've shown that the ability to read someone's eye movements and facial expressions provides sufficient tools for you to be able to influence the way in which that person acts, including to get them to act in immoral ways. And so when we're talking about privacy, we really have to be
Starting point is 00:12:08 focusing on is what are the consequences of that capture of my information? And what are we going to be doing to get in front of it for the next generation? Because we sure have failed thus far in dealing with the early internet. Tom, how do we get back our privacy? The more data that companies have, the more of an advantage it is for them. How do we take back our privacy or how do we limit what people know about each of us? Well, first we have to think about, do I really want to be on this service? Do I really need to click the yes, I accept button? But there's a broader issue here. And I think the public policy has a huge role to play as we tried to do setting rules for the internet service providers and what they could know. The history
Starting point is 00:12:57 of technology is that innovators make the rules. And that's great because that's where advancements in business, science, the arts, everything comes. But ultimately, those new rules interfere with the rights of others and the public interest. And at that point in time, government has to step up and say, no, wait a minute, we're going to put some boundaries around this. And that's where we stand right now. We need to establish what are the rights of individuals with regard to their own privacy. And again, I go back to the fact that you need to have sovereignty over your own data. This is not a situation where you opt out, you know, where it's, well, we're going to take your data unless you tell us not to. No, I'm going to rob your car unless you tell me I can't. No, that's not the way this
Starting point is 00:13:51 works. And we need to redirect what the consequences of the use of that data is and say that first, when you design a product, you need to have privacy by design and you need to think about that privacy going in. I'll give you one quick example of how that's happening right now and being ignored. We all know about the Alexa in our home and the ring doorbell and the automated thermostat and all this sort of stuff. And all the companies that are providing that, those kinds of services are getting together to come up with a standard to assure that they're all interoperable. That makes a lot of sense because if they can all talk to each other, then the market will expand and it'll be great for everybody. Not once in the design of the standard are they asking the question, what kind of behavioral effects should we also be designing in protections for? No, no. This is just how
Starting point is 00:14:57 do we make things work better and to hell with the consequences. So we can easily be manipulated with all this data, Tom. You're telling us about biometric readings of our facial expressions and how long we glance at anything. People know our medical histories when people spit in cups or on swabs for those ancestry apps. They know how fast we drive when we're driving a car with our phone in the car. There's so much data. Is there any way that people can protect themselves short of our government passing regulation, which does not seem to be happening fast enough? Well, again, it's recognized that you do have some agency here and you don't have to be on Facebook or Instagram or TikTok. The Industrial Revolution brought great upheaval and great inequalities and all kinds of difficulties with workers' rights and consumers' rights and monopolies that we were forced to come to grips with and did. And that made the 20th century the successful century that it was. We need to have the same kind of discipline to come together to put guardrails around
Starting point is 00:16:22 what are the rights of the technologies in the 21st century. And we need to start expecting, demanding that our representatives in government step up and do that and quit defining tomorrow in terms of what we knew yesterday and understand that it was rules that made the 20th century successful and we need similar rules in the 21st century. Although there is an asymmetry, you have all of these enormous digital companies, which are the largest companies in the world, that are trading on our data and making billions of dollars versus individual consumers. Oh, but you had the same situation in the Industrial Revolution. You had J.P. Morgans and the Carnegies and the Vanderbilts and everybody. Let me give you one example.
Starting point is 00:17:26 In the 19th century, the big network, the revolutionary network, the equivalent of the internet today was the railroad. And as these steam locomotives would cross farmlands where they had been given a right of way under eminent domain, in other words, the land confiscated from them, much like our data is confiscated from us, they would spew off cinders, hot cinders, that would set hay ricks and barns and houses afire. And what happened was that the Congress and the courts said, wait a minute, there is this hundreds of years old concept, common law concept of a duty of care.
Starting point is 00:18:10 The duty of care is basically when you provide a product or service, you need to anticipate and mitigate the harms it might cause. Therefore, you railroad are liable for these houses burning down. Guess how fast the railroads put screens across the top of smokestacks to catch those cinders. What we need today is to reinstate that hundreds of years old concept of a duty of care and create, if you will, the digital screen over the ill effects of the new technology. We've done it before. We can do it again, but we're not going to do it by just repeating what we did before. We need the changes that were made in the 19th and 20th centuries, we need that same kind of innovative public leadership to step up and say, we're going to deal with the creation of rules about the consequences of the new technology.
Starting point is 00:19:20 Because there are no rules and regulations for the digital technology. You got it. Tom, what are the three takeaways you would like to leave the audience with today? The first takeaway is that we're not living through the fourth industrial revolution. There are a lot of people who try to, including the World Economic Forum, who try to say that what we're living through today is just the next iteration of the Industrial Revolution. No, it's not. The assets are different.
Starting point is 00:19:49 Those assets behave differently. The networks are different. The networks behave differently. We have to recognize this is a unique situation in history. The second thing that we need to understand is once we realize that this is not the fourth industrial revolution, and that secondly, the innovators make the rules as they always have until they reach a point where they interfere with the rights of others or the public interest, that thirdly, we need to step up and do something about it and recognize that doing things the way we did them yesterday is just an excuse for not thinking.
Starting point is 00:20:30 And that just as in the 19th and 20th century, entirely new governmental concepts were put in place that protected people and allowed the economy to grow. So do we need to do the same kind of creative thinking today. So it's not the fourth industrial revolution. We need to respond to the fact that innovators make the rules as we have in the past. And when we do that, doing things the way we did them yesterday is just an excuse for not thinking. Thank you, Tom. This has been great. I am very much looking forward to your book, Tech Clash. Thank you very much, Lynn.
Starting point is 00:21:13 If you enjoyed today's episode and would like to receive the show notes or get new fresh weekly episodes, be sure to sign up for our newsletter at 3takeaways.com or follow us on Instagram, Twitter, and Facebook. Note that 3takeaways.com is with the number three. Three is not spelled out. See you soon at 3takeaways.com.
