Front Burner - Should Tech Companies Pay Us For Our Data?

Episode Date: February 25, 2019

Our behaviour online creates a lot of data that's useful for tech companies - what we buy, what videos we watch on YouTube, what movies we see on Netflix. Author Glen Weyl says if tech companies make money off this information, we should get paid for it.

Transcript
Starting point is 00:00:00 Hey there, I'm David Common. If you're like me, there are things you love about living in the GTA and things that drive you absolutely crazy. Every day on This Is Toronto, we connect you to what matters most about life in the GTA, the news you gotta know, and the conversations your friends will be talking about. Whether you listen on a run through your neighbourhood, or while sitting in the parking lot that is the 401, check out This Is Toronto, wherever you get your podcasts. This is a CBC Podcast. With breaking news, the suspected leader of a capital region cult...
Starting point is 00:00:37 He's been charged with sex trafficking and forced labor... The most searing, awful pain was being dragged across my body. What happens when someone you know tries to take down a bizarre self-help group she's been a part of for 12 years? I'm thinking, how am I going to get out of here? Like, literally, where is the back door? How do I escape? Escaping NXIVM, from CBC Podcasts Uncover. Subscribe now at cbc.ca slash uncover. ...made lately. But you might want to add a couple new jobs to the list. Like the fact that you work for Facebook, and Google, and Instagram, and Amazon, and a whole bunch of other companies
Starting point is 00:01:31 that design artificial intelligence. The list, it really goes on. And while Facebook posted a record $6.88 billion in profit in the last quarter, I'm gonna guess you didn't get your paycheck. That, in a nutshell, is the argument being made by a growing number of people: that these tech companies, some of them the most profitable businesses on the planet, should be paying us. That's what we're going to be talking about today on Front Burner. So I'm talking to Glen Weyl today. He's a principal researcher at Microsoft and the co-author of a book called Radical Markets. His ideas have helped inspire a social movement in the U.S. called RadicalxChange.
Starting point is 00:02:16 Hi, Glen. Hey there. Thanks so much for joining us today. My pleasure. So can we start this conversation, which I am so interested in? Can we start it here? Why do you think people deserve money for providing their data? ...are going to pay people for all the amount that they're contributing to a creation. And all this stuff about how jobs are disappearing, about how the future is so grim for the working class, this is basically being
Starting point is 00:03:19 manufactured just as a result of people not being compensated for the ways that they're actually producing the technologies that everyone says are putting them out of a job. And I want to get to that question around AI. But first, can you try and help me understand how people like me help these companies make so much money? Well, so the first thing to understand is what modern artificial intelligence is about. So there was an approach in the 1980s to artificial intelligence that was based on nerdy programmers sitting in a room coding up exactly what a computer was going to do. Dragon NaturallySpeaking from Dragon Systems is the first general purpose computer dictation program that recognizes natural speech. And I don't know if you remember, you know, from the 90s or something like that,
Starting point is 00:04:12 all of those incredibly buggy voice-to-text engines and so forth that never did anything. Those were all based on those principles, and they never worked. And what actually ended up working, the reason we got this explosion of activity around artificial intelligence that's led us to all of our social discussion around it, was that researchers took a different approach. They said, no, we can't do it on our own. What we need to do is take a ton of work that people have done and turn that into something. We have to find a way of sort of composing all of that work that people do together and turning it into a program. And that's called machine learning, where basically all of us are teaching machines
Starting point is 00:04:47 how to do these things, rather than machines just doing it themselves based on something that someone programmed in. And so most people don't realize that that's what's going on, but that is actually how artificial intelligence works today. Those self-driving cars of yours, you know, that you keep hearing about. Self-driving car technology that sounded like science fiction just a couple of years ago is now a reality. Just this week, Uber started self-driving car pickups.
Starting point is 00:05:19 All of them are trained on things as trivial as some of those websites where you need to sign in and identify the images that have traffic lights. What do you think that's doing? And so I'm just trying to get a sense here of just the scope of what these companies are doing with our data and then how much money they can make from it. So we talked about self-driving cars, I mean, to a lesser extent, advertising. Are there other examples you can think of? Yeah, absolutely. So I'm sure many listeners have used Google Translate to translate things. Excuse me, can you please tell me how to get to the bus station?
Starting point is 00:06:07 Excusez-moi, pouvez-vous me dire comment vous rendre à la gare routière? Those translations are drawn from human translations. If you've ever seen these auto-recognizers of faces within images, those are done based on human recognition of faces. These automated hiring algorithms that are increasingly making employment decisions are based on human hiring decisions. Effectively, almost anything that you see technology doing today that it couldn't do 15 years ago is a result of training on human data. And can you give me a sense, and I know this is a hard question to answer, but just how much money these companies have been able to make off of our data?
Starting point is 00:07:01 Well, I think it's a lot, but not quite as much as you might think. So probably if you add up all the different companies, somewhere in the order of $100 to $200 billion a year right now. But the problem is that a lot of that's a result of them not really charging for these services and not paying people. So sort of all the money has been taken out of the economy. You can think of it a little bit the way that women's work at home was during the 1950s and 60s, where, you know, women didn't get paid for all the care work they were doing. And they didn't really have a very good opportunity to go out and work outside either. And once you started having women go and get jobs outside the home, there became all these opportunities for care work, you know, for babysitters, for nannies, and really a whole economy got created by taking that work seriously and valuing it. So I think while currently the
Starting point is 00:08:00 companies are maybe making $100 or $200 billion a year, if we really took this whole economy seriously, it would grow in a way and would offer all sorts of opportunities that could really change people's lives. So the idea here is just like how women's work was minimized, people do not fully understand the role that our own data is playing in the creation of artificial intelligence. And how much more of a role it could play if we gave an opportunity to people to be acknowledged for it. You know, acknowledging women's work was fair, it was honest, and it was respectful.
Starting point is 00:08:36 But maybe at least as importantly, it was also liberating and productive. Because women were opened up to the opportunity to find the place within society where they could contribute the most rather than being constrained to the home. And I think that acknowledging the work of data would have a similar effect. Can you give me a sense of how this would work logistically? Like if I wanted to get paid for my data, what would be something that I could do to get paid in return? In terms of the types of activities you could be rewarded for,
Starting point is 00:09:17 I'll give you a few examples. So imagine that you're a chef who maybe records YouTube videos of yourself making all sorts of recipes. And I'm going to put the fattest piece of steak in first because it will obviously take the longest to cook. Now all the time new recipes are coming out. And imagine that, sometime in the future, we have an artificially intelligent chef. So most restaurants are not actually staffed by chefs on site. They're staffed by robots. Now, how do those robots learn to make those new recipes? How do
Starting point is 00:09:51 they learn, for example, if there's a vegan, what things to substitute for what? They're not going to know that themselves. But imagine that you're a chef at home and you make those dishes, you make those substitutions, you figure out every time there's a new recipe what to do with it. And you train the computer so that it can continually provide new and exciting dishes to the people who go to the restaurants. So that's just one very concrete example. And it also really shows you why you won't run out of tasks. Because human culture is constantly producing new things like new recipes that computers have to learn how to make.
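To make the training loop Weyl keeps describing concrete, here is a minimal sketch of supervised learning from human-provided examples. It is not from the episode: the recipe data, the substitution labels, and the scikit-learn pipeline are illustrative assumptions, but the pattern is the one he points to, where a model can only generalize from what people have already demonstrated.

```python
# A toy illustration of "humans teach machines": a text classifier learns
# ingredient substitutions only from examples a person has provided.
# The data and the model choice are assumptions made for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Human-contributed training data: the swap a home cook actually made
# when adapting each ingredient for a vegan diner.
ingredients = ["butter", "ground beef", "honey", "eggs", "milk"]
substitutions = ["olive oil", "lentils", "maple syrup", "flax meal", "oat milk"]

# Character n-grams plus logistic regression: the model has no cooking
# knowledge of its own, only whatever structure the human examples contain.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(ingredients, substitutions)

print(model.predict(["butter"]))   # likely ['olive oil'], learned from the human example
print(model.predict(["saffron"]))  # an arbitrary guess -- nobody has taught it this one
```

The point of the sketch is the dependency: every new ingredient, recipe, or dietary trend needs fresh human examples before the model can handle it, which is why, on this argument, the supply of "data work" does not dry up.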
Starting point is 00:10:53 Right now, is the fear that I'm a chef, I have a YouTube channel, I post these videos about my recipes, and this information is being taken and used to train these robots without paying me? Yeah. And it doesn't train them as well, either. So it's not just that it's unfair to you, but that it won't work well. Because if you know that you're training a robot, your first reaction is going to be to be upset that you're not being compensated. But your second reaction is, well, if you were just honest with me, I could actually train you better. You could tell me what things are confusing to the robot. And I would do more of those or I would explain them more. And if you just spy on me, I don't get the opportunity to show you what a good collaborator I can be. Do you worry, though, that that will just hasten
Starting point is 00:11:32 this process of taking away people's jobs? I think that technological progress doesn't have to be the enemy of human dignity. In fact, the greatest period of technological progress, basically, in recorded history was from the 1920s to the 1970s, which was also the period that we associate with the best-paying jobs, because we had labor unions, because we had the right government protections, because we allowed social structures to ensure that people could be collaborators with the technology rather than victims of it. And that's what we're missing today. We'll be back in a second. We talked today about how you think this should be something that's negotiated.
Starting point is 00:13:11 But have you thought about a number here? Like how much people should get paid for their data? Well, I think at present, it's probably somewhere depending on, you know, where you are and who you are in a country like Canada, on the order of $100 to $300 a year. You know, $100 to $300 a year seems like a pretty modest estimate. It might be as high as $500 or $1,000 a year at this point already. But in the longer term, I think it'll be much greater than that because more and more of the economy will be driven by these types of technologies. If even 10% of the economy becomes driven by these technologies, we estimate that paying people for their data would be something like $20,000 a year for a family of four. So that
Starting point is 00:13:59 would make quite a difference in people's living standards. How do you think these tech companies, these guys in Silicon Valley would respond to your argument? Well, I think that, and I've had many, many conversations with them. Because you yourself have worked at Microsoft. I work at Microsoft Corporation right now. Yeah, absolutely. And, you know, the truth is that conspiracy theories aside, it's a diverse crowd in some ways. I mean, and there are different voices inside. And, you know, there are organizational internal conflicts and, you know, divisions and so forth. I think there's many people in the tech world who are very open to these ideas. And there are many people who created existing technologies who are very upset about where
Starting point is 00:14:43 things have gone. Tim Berners-Lee, the founder of the World Wide Web, is totally distressed about the present moment, and he's created a whole campaign to try to fix it. We love the fact that the web is open. It allows us to talk. Anybody can talk to anybody. It doesn't matter who we are. And then we join these big social networking companies, so in fact we're sort of sometimes limiting ourselves.
Starting point is 00:15:06 And that influences many people within these tech companies, who feel maybe they have gone in the wrong direction. What about the argument that these companies give us a service? So, I mean, I like Google Maps. It helps me. And that we have just traded our data for these services. Like that is the payment. You have Facebook and you can communicate with all of your friends and stay in touch with them. And that is, you know, the currency that's being traded there. Yeah. I think the problem with that is that it's a strange bargain. It's one very similar to what used to happen under feudalism. So under
Starting point is 00:15:45 feudalism, you know, you would live on a lord's manor, and you'd have a right to chop your, you know, wood from the forest and to build a house and to fish in the pond or, you know, these sorts of things. And in exchange for that, everything valuable that you produced was taken by the lord. Now, I guess you could say that's a, quote, fair trade, unquote. I don't know. I wouldn't call it particularly fair, but it's a trade of some sorts. But it's not one that acknowledges your agency. It's not one that says you grew these crops.
Starting point is 00:16:19 You deserve a value associated with what you grew so that you'll have an incentive to develop the land, to improve it. It fundamentally undermines our agency and our dignity if the contributions we're making are not acknowledged, acknowledged socially in the way that the technologies are described and acknowledged economically in the way that we're compensated. If we don't do this, right, if we don't decide as a society that our data is worth something and that we deserve to be compensated for it, what do you think the consequence of that is? Well, I think we'll likely end up in a world where the vast majority of the economy goes as payments to people who own the machines. And that will leave us with roughly two choices. Either we find some way to nationalize those machines and use the value coming from them
Starting point is 00:17:21 to pay universal basic income or something like that to citizens. Or we end up with a few people who dominate and control the whole society. And if we do start approaching data as labor, what do you think would change? I think that probably the most important thing is that people would have an increased sense of agency and dignity. That a lot of the people who feel that the tech economy is running all over them and that they are passively absorbing things will come to feel a pride every time they use digital equipment. They'll think, I contributed something to that, the same way that the worker at a GM factory thinks, I built that. Glen, thank you so much for this really interesting conversation. Paying people for their data could seem a little radical right now. Well, it might not be so far-fetched.
Starting point is 00:18:43 California's new Democratic governor, Gavin Newsom, proposed a data dividend that could call on companies like Facebook and Google to pay people for their data. That's all for today. I'm Jayme Poisson. Thanks for listening to Front Burner. For more CBC Podcasts, go to cbc.ca slash podcasts. It's 2011 and the Arab Spring is raging. A lesbian activist in Syria starts a blog. She names it A Gay Girl in Damascus. Am I crazy? Maybe. As her profile grows, so does the danger.
Starting point is 00:19:28 The subject of the email was, please read this while sitting down. It's like a genie came out of the bottle and you can't put it back. Gay Girl Gone. Available now.
