TED Talks Daily - Can big tech and privacy coexist? | Carole Cadwalladr and Chris Anderson

Episode Date: April 13, 2025

“If you can’t respect the basic fundamental underlying principles with which we order society — which is ‘Do not steal’ — then what are you left with?” asks investigative journalist Carole Cadwalladr. Following her TED2025 stage talk, Cadwalladr is in conversation with Chris Anderson, head of TED, to warn about surveillance fascism. What happens when big Silicon Valley companies take over communication platforms and intellectual property is weaponized against you? She suggests that when you feel powerless, it’s often actually because you are powerful — and explores why it’s so important to fight information chaos by supporting independent media and journalists. Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:00 Support for this show comes from Airbnb. Last summer my family and I had an amazing Airbnb stay while adventuring in Playa del Carmen. It was so much fun to bounce around in ATVs, explore cool caves, and snorkel in subterranean rivers. Vacations like these are never long enough, but perhaps I could take advantage of my empty home by hosting it on Airbnb while I'm away. And then I could use the extra income to stay a few more days on my next Mexico trip. It seems like a smart thing to do
Starting point is 00:00:31 since my house sits empty while I'm away. We could zip line into even more cenotes on our next visit to Mexico. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. This episode is sponsored by Oxio. Home isn't just a place. It's a feeling, a connection. And let's be real, in 2025, home is wherever your Wi-Fi works best. That's where Oxio comes in, an internet provider that actually feels like home. With Oxio, what you see is
Starting point is 00:01:03 what you get. Fair fixed prices, no surprise hikes, no exhausting negotiations. They've never raised a customer's price and they never will. That means more peace of mind for your movie nights, deep dive research sessions, or endless scrolls through your favorite feeds without lag getting in the way. Plus, with speeds up to one gigabit per second, you can stream, game, or work without interruption. And if Oxio doesn't make your internet feel like home, you've got 60 days to get all your money back. Visit oxio.ca if just like Oxio, you were born to be online.
Starting point is 00:01:37 Use promo code TEDtalks at checkout to get your first month free. This episode is sponsored by Edward Jones. You know, as I talk about these big ideas that shape our world, I sometimes think about the decisions that have impact on our daily lives, like financial decisions. That's where Edward Jones comes in. Earning money is great, but true fulfillment in life isn't just about growing your wealth.
Starting point is 00:02:02 It's about using your resources to achieve your personal goals. And Edward Jones gets this. Their advisors take time to understand you as an individual. They build trusted relationships to help you develop strategies that align with your unique goals. What's special about Edward Jones is their holistic approach. They see financial health as a key part of overall wellness, just as important as physical or mental well-being. It's not about chasing dollars, it's about finding balance and perspective in your financial life. That's something anyone should be able to achieve. Ready to approach your finances with a fresh perspective? Learn more at edwardjones.ca. Money's a thing, but it's
Starting point is 00:02:43 not everything. You're listening to TED Talks Daily, where we bring you new ideas and conversations to spark your curiosity every day. I'm your host, Elise Hugh. Earlier this week, we shared an explosive talk from investigative journalist Carol Cadwallader about a fast-moving technological coup threatening democracies around the world. Make sure to check that out in your feeds. Today we're bringing you an extended conversation between Carol and Chris Anderson, the head of TED. Chris and Carol sat down to dig into the rise of techno-oligarchies and the vast implications
Starting point is 00:03:23 of living in an era of data surveillance. They reflect on whether progress and privacy can coexist, and why Silicon Valley's favorite mantra, move fast and break things, could be more dangerous than we think. That conversation is coming up. Carol, Carol Walter, it's so nice to get to sit down with you. A few days ago, you opened the TED conference with an absolute blockbuster of a talk, got a huge reaction from people. In a one-sentence summary, and I'd like you to expand on this, what you argued was that we're in the middle
Starting point is 00:04:06 of what looks like a digital coup, that the combination of Trump and a collection of big tech leaders is in danger of creating a new kind of autocracy in America. Is that about the core of it? Yes. Yes, that's right. Well, I mean, you know, I put up the photo from the inauguration and one of the things that I found really resonated with people, which is so it's the photo of the tech leaders behind Trump. And, you know, I called it tech bros in hostage
Starting point is 00:04:39 situations. And it's this idea that Silicon Valley has been captured by the administration and the administration is acting in all sorts of unlawful ways and Silicon Valley is now part of that. And the main way in which Silicon Valley is helping advance this is what? Well, you know, I talked about, for example, for me, the big danger moment was when, you know, the first weekend that the administration took office, Elon Musk sent his, I call them cyber troops into the US Treasury, where they gained unlawful access. They got access to the nation's data,
Starting point is 00:05:29 its financial data, and he now has that. For me, as you know, as I know, I call it the crack cocaine of Silicon Valley is always data. You need data to feed the AI and you can never put it back. I mean, that's one of the whole things. Once you've got the data, when you've got the entire nation's data, you can't just put that genie back in the bottle. And that to me is that this is a power grab, which goes
Starting point is 00:06:01 beyond any of the guardrails of democracy. And that's not just about now. Silicon Valley, as we know, does not think in four-year cycles. This is absolutely about a land grab for the future. That's what I was trying to say, really. Goes beyond politics. So I think probably the purpose of this conversation now is for me to gently try and play devil's advocate. Tennis obviously, we're trying to be open tent. We want people of all political views and so forth to, we want to listen and treat with curiosity and respect and so forth. So, I'm going to frame what a different view of what's happening might be and see what you
Starting point is 00:06:44 make of it. One thing to say, first of all, is that Silicon Valley is not a thing. From inside Silicon Valley, they would probably all say, no, these are our competitors. Elon Musk, Mark Zuckerberg were both there, but they are competitive enemies, to say the least. And so, like you have, I think it was you who coined this powerful term broligarchy. I believe so. I started using it a year ago and I was because it was like, oh, hold on a minute. What we're seeing here is this elite, this business elite, like oligarchy, but it's tech bros. And I was like, of course it's broligarchy.
Starting point is 00:07:18 Right. But so the way I think of an oligarchy is of a group of powerful people kind of acting in unison. And I think they would say that they are not acting in unison. Largely, they are competitive with each other. And maybe there are some aligned interests like having legislation that makes it easier for companies to expand dynamically and so forth. But that's one piece. Like, if it was the case that they generally are competitors with each other, what's the sense in which you feel that they're acting as a group here? I don't think there's any conspiracy here and I don't necessarily think they're acting as a group at all. And this is where I think it's really helpful to look at America now from the frame
Starting point is 00:08:01 of understanding what has happened in other countries. I'm really sorry I've got such a sore throat from talking so much, Chris. I apologize to everybody. So, you know, I think Russia is a template here for what is happening, which is the breed of we call them oligarchs, right? They didn't agree with each other. They were suing each other. They were sometimes murdering each other, but it was they needed a relationship with Putin with power and In some cases it was about enriching themselves about creating opportunities But a lot of the time it was also just survival Which is and that's what I mean about them looking like hostages
Starting point is 00:08:39 There wasn't a choice it feels to me in terms of who was up there on the inauguration. Trump knew that he needs Silicon Valley because in a standard coup, when the military takes over, the junta takes over, the first thing you do is take over the radio station, right? You need to have the means of communication. And in this case, the means of communication are these big Silicon Valley companies. It's such a colossal thing that is happening right now. It's not just America. Of course, we can see that in many ways. But the fact is these are global communication platforms, and they are now in an alignment, captured, whatever you want to talk about
Starting point is 00:09:23 it, with what is a coming autocratic regime. Right, so they, even though they're in competition with each other, you're saying that they share a need to have the president's approval, so they're doing things to win that approval and thereby they're helping to construct and empower the creation of a kind of autocracy. I mean Trump was explicit in his threats right? I think he sort of threatened Zuckerberg with jail you know it's partly carrot but it is partly stick you know and that is understood I think that there is there is both the carrot which is is there are opportunities, it's going to tear up regulation, it's going to make it much easier to do the things that they want to do. But there is also the, we can see that if you're not obeying, then life is going to
Starting point is 00:10:16 be very difficult and we see that playing out in all sorts of ways. So for example, with media organizations, we're seeing lawsuits on a daily basis, we're seeing it against big legal firms, that's one of the most shocking things for people. So it's not just they're sucking up to Trump for the sake of it, they feel they don't have a choice. It's unquestionably true that there was a big swing in Silicon Valley that has traditionally been left of center toward Trump over the last six months of the last campaign. If you talk to people there, most of them, I think, would have explained it as follows.
Starting point is 00:10:50 They would have said two things. One, these companies are reacting against years of sort of progressive culture that they didn't like that got in the way of building the stuff that they wanted to build, and a belief that the deregulation commitments of Trump and the explicit effort he made to do things like embrace crypto and so forth showed that he was interested in having a sort of a positive environment in which technology could flourish. From that standpoint, like a defender of Elon would say, look, Elon is known for running businesses more efficiently than anyone on the planet. Cut 75, 80% of the workforce at Twitter and at least operationally, functionally added
Starting point is 00:11:41 features and so forth. What they would say is it's fantastic and amazing to have, for the first time really, a really powerful businessmen come into government and apply some of the tools to save what are crazy wasted costs in government, probably by common consent. And that the key to do that is that you have to start with the actual information systems. That's the pathway to do that. So what might look like, oh, he's going in to seize the data for his own use, is not that. It's delivering the means to figure out how to make the cuts. Do you see any rationale for that at all?
Starting point is 00:12:23 I could absolutely see from an engineer's brain that looks totally, completely rational and absolutely and why wouldn't you? And also if you're an engineer, as we've seen in Silicon Valley, laws, regulation, you know, sod that, we know how to do it better, we know how to do it faster. If we do it quick enough, then actually,
Starting point is 00:12:43 it'll take them ages to catch up with us. And we've already done it. Like that is the history of Silicon Valley. Move fast and break things. You know what though, we've got to stop using that phrase. It sounds so innocent and it's like, oh, it's like a baby breaking its toys. What that means is breaking the law and getting away with it. It's having absolute impunity and knowing that it takes ages for regulators to catch up. And that has created the situation that we're in. And this is exactly
Starting point is 00:13:14 how DOGE is working. So everything that is done, for example, the cuts to USAID. Devastating, devastating cuts. That is money which was allocated by Congress. This is not lawful and this is to use the Silicon Valley framework. It's because they've always gotten away with it. So you know if you do it fast enough, it's then too late. The damage has been done and the world moves on. And that is the mistake that we have made with Silicon Valley time and time again. And it's why now, whilst this is happening in real time, this is the moment that people have to act. Because if you want to take the lessons from authoritarian countries, then it's too late after the fact. The longer that this goes on, the longer that the breaking, the wrecking, the vandalism, the illegal and unlawful behaviour
Starting point is 00:14:13 goes on, it's more is consolidated, the harder it is to fight back. And so my talk in essence was about the fact that even though it's confusing, people are in denial, they feel powerless. There's this moment of paralysis, but actually people do have power. That was what I was trying to communicate in my talk really as somebody who has experienced powerlessness. I think, as I said, it was only coming to Ted that I had this revelation about, actually, when you're at your most powerless, it's often because you are powerful. That's why you have to be stopped. And the people of America are more
Starting point is 00:14:58 powerful than these guys, right? There's more of you. And you have values and morals on your ethics, a belief in the law on your side. So that's the thing that I'm trying to communicate. Support for this episode comes from Airbnb. Every year I travel to Vancouver for the TED conference, a week filled with big ideas, inspiring speakers and late night conversations. But while I'm away, my home just sits empty. I've been thinking, why not list it on Airbnb?
Starting point is 00:15:33 Hosting could help cover some of my travel costs and maybe even let me stay an extra day in Vancouver to soak in the city's beauty. Instead of rushing to the airport, I could take one more walk along the seawall, grab another amazing meal, or relax at the spa after a busy week filled with inspiration. Hosting on Airbnb feels like the practical thing to do and Airbnb makes it easy to get started. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. I used to say, I just feel stuck, but then I discovered lifelong learning. It gave me the skills to move up, gain an edge and prepare for what's next.
Starting point is 00:16:18 The University of Toronto school of continuing studies, lifelong learning to stay forever unstuck. Well, you really touched a nerve in the most powerful way by being eloquent, as you just heard, and by being vulnerable and coming at it from a very personal space. When it comes to the demolition of USAID, I personally know organizations of people who were wrecked by that. And I think history will show that that was a pretty brutal and reckless approach. So, can I turn the tables, Chris, and ask you about that? Because, you know, TED has done this amazing thing of bringing together innovators, plus also people who think about the really hard problems that the world faces. And it's always been about a sort of synthesis between those and finding new ways. And this spirit of optimism has always run through the place.
Starting point is 00:17:12 And for me personally, you know, it's been a big thing. I first came to TED in 2005. But this is something really different, isn't it? And do you find it hard to retain your your optimistic frame? I've always described myself as a determined optimist which means that no matter how dark things are you look for a pathway forward that has some hope and you try and shine a light on it and hopefully you know you can find your way there. I've always believed that the worlds of ideas, innovation, technology are actually ultimately
Starting point is 00:17:47 more powerful than politics. And I'm dismayed at the world of politics right now, dismayed at it because it seems pretty helpless. There seems to be an impossible divide between two tribes. I actually think that over the next three or four years, the even bigger story will be how technology plays out because I think AI is growing in power at such a speed that it, you know, it will be more important than the political decisions are made. So for me, my focus is can we think of a way of ensuring that we get the best of AI and not the worst. And that takes
Starting point is 00:18:27 us I think into one of the key conversations I want to have with you is around data. In your talk, you so powerfully talked about how these companies are extracting our data. It's surveillance capitalism, surveillance fascism, I think you called it. I cut that line. You know, can I just say that? I was really sad. When I woke up the next morning, I was really sad because I had that line in there of like, this is no longer surveillance, capitalism. We're on our pathway to surveillance, fascism. But because I was like mindful that everybody was like, you've got to cut it down a bit. So I lost that. And it was one of my, it was like, you know, you said-
Starting point is 00:19:01 Well, there we go. We've got it back. We've got it back now. Thank you. But this is, I think, it's something that I'm wrestling with myself, because it's true that data, your data, used by someone in power against your interest is a horrifying thing. It's also true that in a way, you know, everything works from data.
Starting point is 00:19:24 You need information to have any kind of useful knowledge. So, the sharpest way I can put this is this. Let's say I've got a very powerful AI companion, right, that I'm consulting and I'm getting wisdom from and so forth. And you ask the question, do I want it to know about me or not? I think most people will end up concluding that they do want that AI to know about them because it's only by knowing about you that they can actually give you wise advice that's tailored to what you need and who you know.
Starting point is 00:20:01 And it's almost like the classic sort of examples about the misuse of data around, you know, the advertiser knows before you do that you're, you know, you're a target for Viagra or for whatever, you know, ailment that they can foist on you and it feels very uncomfortable. But when it comes to an actual intelligence that you're working with, I wonder whether you're going to win the argument on data or whether actually most people are actually going to voluntarily say, no please, you know, literally I want you to read all my emails and help me be wiser. Is that horrifying or do you see some logic to that? I understand the beautiful vision that that is that there is a really helpful assistant who's going to know all your problems and help you reach the great solutions.
Starting point is 00:20:49 The thing is about it is that first off is ownership, right? Who are these companies owned by? What are their values? Who are they aligned with? Where might that data end up? And do you trust them? And are they transparent about it? Do you know what's going to happen to that data? Where it could end up? And the thing is, is that in the current environment we're in, none of those things are true, right? There are none of these companies where you could say, yes, this person is the Nelson Mandela of the tech industry, and I have complete trust and faith in them.
Starting point is 00:21:22 And even if there was, you know, the fact is, is that as we've seen with 23andMe, right, so people have done these genetic tests that this company now has, their genetic code, and it's now up for sale. So where is that going to end up? And the thing is to go back to it, which is you'd never get your data back. When it's gone, it's gone. And the ways that that can be weaponized against you, a lot of women in America are starting to understand what that means. These period tracking apps, which now you've understanding that's a surveillance device, that if there's some instance in which they might have
Starting point is 00:22:00 to seek healthcare access, that that could become evidence which could be used against them. This is really personal information and that's the thing. You know, a lot of people talk about data as property and it's really so much more than that. It's like your blood, your bones, your skin, your cells, right? You have to think about how that can and will be, like assume the worst. Just at this point in time, you have to assume the worst. The business models of the AI platforms are different from the business models of social
Starting point is 00:22:36 media. Social media was dependent on advertising and the core there is almost like give the advertisers data that you've extracted and let them use it how they will. For the AI platforms to earn people's subscription, where you're literally paying out an amount per month, I think they're going to conclude that it's in their interest to demonstrate that they are trustable. I mean, if they're not, people won't subscribe. But they're not trustworthy. Who's trustworthy in the AI space then, out of these companies? Who has been transparent, ethical, and legitimate in their approach to data use and the models
Starting point is 00:23:11 they're building? Obviously, the stated policies of all of the companies is that they want to honor users' interests. I mean, so I've spent time talking with Demis Hesabis, who's head of DeepMind and basically drives Google's most important AI efforts. I think he's an honorable person. I think he's trying really hard to do the right thing and to develop Google's AI products on fair principles. It's not to say it's an easy thing to do.
Starting point is 00:23:38 But I also think even if you, like let's say that you don't trust Sam Altman. If OpenAI is exposed as abusing data, they have literally already billions of dollars that will go out the door from people who won't continue to subscribe to them. So you can say that maybe some individual at the top is not trustworthy. What I'm saying is that the actual system here doesn't obviously pull towards mistrust. It actually pulls towards like it's key to win people's trust for them to succeed. But I think one of the best measures of people's behavior in the future is their behavior in the past and this is actually how a lot of these systems work right and if you look at the behavior of open AI in the past which is it's
Starting point is 00:24:22 illegally scraped data from numerous sources without respecting property rights or any other laws in different jurisdictions. Well, you had that beautiful point in your talk where you said, so I asked Chad GPT to write a TED Talk in the style of Carol Cadwallader and you showed what was... Yeah, it was basically the outline of my talk. It was compelling, could have saved you a lot of time, Carol. I know, except as I said, it's like the opposite of human creativity. Well, so let me ask this though. You said, you know, I did not consent to this, and I do not consent,
Starting point is 00:24:58 and it feels like, you know, you've had... It just feels outrageous that they've been reading all your stuff and are now doing this, and anyone else could write a talk in the style of Carol Kedwaleda. Would you feel differently about it if there was an improved business model here where the platform's committed to respecting individual talent so that, for example, when a request is made to specifically embody the style of a musician or a writer or an artist, that actually that there would be some compensation back to that person so that you could say, actually, this is a way in which I could amplify my impact on the planet and I will actually be compensated
Starting point is 00:25:35 for it. Does that change the... So I think that's fundamental to it. But however, it's just like you go back to the point is that this was done without any of our permission, and so we can see that there are big players who are being able to make deals Right, so and I use the Guardian as an example of that, right? Which is the Guardian has done this syndication deal after the fact because the damage has already been done They've already scraped the entirety of the Guardians website. So I understand the logic. It's like, well, you might as well try and make some money out of it. But of course, that's not respecting of the IP and of individual contributors there. And individual contributors are not going to be in a position to do these deals with
Starting point is 00:26:20 the platforms because they've got no, there's no collective ability to force a proper negotiation. In a theoretical world with an ethical AI company which asked your permission before it scraped your data and then paid you whenever it used that in some ways, but as we know, it's so hard to make that assessment because it's taken in such vast amounts of data and then it's mixed it all up into some weird sausage which it's now putting back out there. I mean, they would argue, and I'm not saying I agree with this, but they would argue that every time technology changes that the rules need to be worked out again, that you've got a situation where, you know, your words were published, put out freely for anyone on the internet to read.
Starting point is 00:27:07 No matter how many more people read your past words, you don't get any more payment. And so the data is out there. They would argue that it's out there as a sort of public resource for fair use. And I think it's right that people are challenging that because the fact is that, say, a given artist could easily be displaced by AI able to do much more. It's actually no, no, no, it's actually much, much deeper than that, right? Which is that every nation state in the world has some form of property law, right? Which is you can't
Starting point is 00:27:39 walk into somebody's house and just steal the silver. That's what fundamentally like is the basis of law and order in our country. But when it's intellectual. And this is, no, it's just, it's property. These are property laws. You know, in Britain, we've had this law since 1783. And if you can't respect the basic fundamental underlying principles with
Starting point is 00:27:59 which we order society, which is do not steal, then what are you left with? It's like, it's fine, we're going to take your silver and then if we sell it on eBay, we might give you like 5% of it. Yeah, so I get the anger. There is a difference between a physical object where if you steal it, that person doesn't have it versus a digital property where if you, quote, steal it,
Starting point is 00:28:21 you still have access to it. Not under the law, there's no difference. Well, you know, I think there's traditionally a difference in like when an idea is out there, it can be built on an amplified. And like, for example, in the music business, there's constant building on one person's work by the next artist. The kindest way of viewing what they're doing is for them is to say, we're not stealing, we're amplifying. I think we are absolutely lost if we do not respect the law and that's what we're seeing. This is what is happening.
Starting point is 00:28:49 But the law isn't defined yet properly in A.O. And it's in the process of being defined. It's property. These are just property laws. There's no difference. And it's the point to go back to the case, right? The underlying basis of what Google did with it, where it digitized, it stole every single written book in the world, didn't it? That was one of its first acts. And as I sort of
Starting point is 00:29:10 said, it's just this acting with impunity has led us to a place where that ideology is now embedded in the government of the biggest superpower in the world. And that is what's playing out now in real time. And if you don't like those laws, well, then you don't respect these ones either. And this is where we're in a sort of cascading situation. Right, right. Well, okay. I want two things simultaneously. I want a world in which creators are respected and fairly compensated for what they do. I also want a world where I can search for the collective wisdom of humanity and find it. I want to be able to read from all these books and discover them.
Starting point is 00:29:52 And so because... But we're not going to be able to have any further wisdom because there's going to be no economic model for anybody to write another book. Where I agree with you is that artists absolutely, and writers should be compensated. And if we could do that, it's just about possible to imagine a world where AI data can actually amplify the best thinkers. The fact that in principle, someone's kid
Starting point is 00:30:15 could have a conversation with Einstein based on his wisdom, that's not something possible before now. Arguably, that makes the world better. I think it's a reasonable conversation. But I think most people here would agree that the law is not yet in a good place and that writers and artists are in severe danger of being ... It's not in severe danger, it's happened. And also I think going back to it, which is it's power. I think we just keep on having to come back with it's power. This is power
Starting point is 00:30:41 being concentrated in the hands of a very few companies which are now aligned with a rogue state. That is what America is now in the world. It is a rogue state. You know, one of the key things I want to get across in the talk is that technology is politics now and politics is technology. There is no separation between them. With the Fizz loyalty program, you get rewarded just for having a mobile plan. no separation between them. Small business owners gone at the days of waiting for your bank to respond. With Journey Capital's innovative online platform, securing funding has never been easier. You can even do it from your phone. Our cutting-edge technology enables you to apply online.
Starting point is 00:31:33 Get approved for $10,000 up to $300,000 and access the funds you need in as little as 24 hours. No endless paperwork, no long waits. Just fast, flexible financing designed for your business. Get started now. Visit journeycapital.ca today. If you're anything like us, you love attention. And my favorite way to get all eyes on me
Starting point is 00:31:54 is with next level shiny glossy hair. Which is why we're so excited to tell y'all about the new LaMelaure gloss collection from the girlies at Tresemme. And gigglers, we've got you too, because Tresemme partnered with us to bring you 1-800-GLOSS, a special bonus episode of Giggly Squad, where Hannah and I give advice on all things hair
Starting point is 00:32:14 and giving gloss. Check out the episode and grab the LaMeller gloss collection today, because I'm officially declaring this spring gloss season. I really appreciate Chris. It was so punchy of you to, and we should talk about why you decided to put me first as the opening talk of the conference. Tell me, why did you decide that? I put you first because there are a huge number of people, probably the large majority of certainly the tech community, is in a bit of a state of shell shock right now. I mean, the pace of change has not been seen before, either politically or technologically, and people don't know what to make of it.
Starting point is 00:32:53 And you are unbelievably eloquent at naming it and helping people feel it. You expressed emotions and feelings that so many people in the room feel. They were just so moved to hear that come from someone so powerfully. And, you know, you don't hold back. Most people are frightened to make bold accusations against named individuals. You're fearless. I would actually love you, just speaking of the journey that you've been on, to just explain a bit more your own story here, because you mentioned in the talk briefly that the last time you spoke, you know, you'd end up being sued and that it turned your life upside down. In that original talk, you described someone as a liar based on prior reporting
Starting point is 00:33:47 and he sued you for that. What was the court ruling there? At some point, the court ruled that you would have to pay his legal costs. What it is, is that I said words which we published in The Guardian, which were perfectly defensible, which was that he had lied about his relationship with the Russian government. And that was based upon this series of non-disclosed meetings, let's just say, that the Brexit donor had with Russian embassy officials in the lead up to the Brexit vote. Now that is, that's just fact and it was in our reporting. But the thing which I got tripped up on Chris, which is that in the very arcane meshes
Starting point is 00:34:36 of British libel law, a single judge decides on the meaning of your words and they take into context the entire talk and then they formulate their view of that meaning for all time. So the judge came up with this formulation which is that he had accepted money in contravention of the law on such and so therefore I had libeled because I'd made an accusation that he'd accepted foreign funding.
Starting point is 00:35:09 What it meant is that I had never said these words at any point. Those words were never said in the talk. I certainly never meant to say those words, but that's what I had to go into court to defend. And that is why it was the Kafkaesque quality of it, which was so confounding because I was having to defend something which I'd never said. And that was where it turned the case on its head because it meant I couldn't defend that judge's meaning.
Starting point is 00:35:39 So I then had to defend on the public interest of why I gave the talk. And it put all of the onus on me and my reporting. And that's why instead of me getting to do discovery on the man, he got to do discovery on me. And that was when I talked about because this is the thing which is really relevant. There's various things which are really relevant to what's going on in the US right now. The court case was called a slap, which means that it's a way of trying to shut down critical reporting or critical voices and that's what we're seeing happening in the US, these weaponised lawsuits. Organisations across America are now preparing for this to happen to them. They know that they are
Starting point is 00:36:22 going to be on the end of highly politicized lawsuits in which they're going to have to open up their computers, their laptops. And they also know that this is going to be accompanied as it was in my case by a sort of massive online hate campaign. And so that's the analogy which I was trying to make, which is what happened to me is a warning for what is coming for other people in America. Well, needless to say, for everyone at TED, it was horrifying to see what you went through. And I won. So just to be clear on that, I won the case.
Starting point is 00:36:52 The public interest was found to be, my talk was absolutely lawful at the time that I gave it. And then on appeal, what happened is, in the one year after I gave that talk a police investigation into the Brexit donor was voided and at that point the Court of Appeal decided that the defense fell away. So it was the continued publication by TED which is a foreign media organization in a foreign jurisdiction, I was held responsible for and that's why he got damages awarded against him. And that's the thing which we're now appealing at the European Court of Human Rights. The thing is, it was complicated,
Starting point is 00:37:35 nobody really understood it, it was the pandemic. And you know, I think when you realise the gravity of what was happening, you rang me up after the trial had ended and made that very generous gesture, which I do really appreciate, which is you said, you know, we will see you right. Yes. So thank you for that. I don't want that to go unremarked. I mean, you're an amazing fighter.
Starting point is 00:37:59 So Carol, during your talk, you referred to the terrible personal experience you had the last few years after your last talk. For someone who wants to understand more about what happened there, where can they go? So I'm in one week's time leaving my job, not through choice, but because 100 journalists from The Guardian were being terminated because The Guardian has sold our corner of it. So I have set up a sub stack and I will write a full account where I would love to be able to explain to people the bigger picture behind that. I really, really believe in independent journalism.
Starting point is 00:38:38 I really believe in independent media and independent film and that I think is so vital at this time. And so as I was saying my newspaper has been bought by unknown unclear investors and I don't feel it's possible to do the same kind of independent journalism there. But you know there is an explosion, there's a thirst and demand from people for I think these like clear independent voices. And I think out of the total crisis of media and what is happening, social media, you know, information chaos, I call it, I do also think there is an opportunity there to grow properly sustainable media from the ground up, supported by readers who value that without being dependent
Starting point is 00:39:22 upon advertising, which we've seen has been a terrible media game, and algorithms, which we've seen is another terrible game. So I sort of, you know, I really hope that people understand that trusted sources of information are vital and we need to pay for them. And yeah, that's my message, I suppose, there. Carol, thank you so much for coming today. Took lots of courage. You really touch people. Really wish you well as you continue your journey.
Starting point is 00:39:49 Thank you, Chris. Thank you for having me. I really appreciate it. That was Carol Cadwallader in conversation with head of TED, Chris Anderson at TED 2025. You can check out Carol's talk on the TED Talks Daily feed or on TED.com. And that's it for today. TED Talks Daily is part of the TED Audio Collective. This episode was produced by Lucy Little, edited by Alejandra Salazar, and fact-checked by Julia Dickerson. This episode was recorded by Rich Ames and Dave Palmer of Field Trip and Mixed by Lucy Little. Production support from Daniela Balarezzo and Shu Han Hu.
Starting point is 00:40:25 The TED Talks daily team includes Martha Estefanos, Oliver Friedman, Brian Green, and Tansika Sangmarnivong. Additional support from Emma Tobner. I'm Elise Hu. I'll be back tomorrow with a fresh idea for your feed. Thanks for listening. This episode is sponsored by Edward Jones. You know, as I talk about these big ideas that shape our world, I sometimes think about the decisions that have impact on our daily lives, like financial decisions.
Starting point is 00:40:53 That's where Edward Jones comes in. Earning money is great, but true fulfillment in life isn't just about growing your wealth. It's about using your resources to achieve your personal goals. And Edward Jones gets this. Their advisors take time to understand you as an individual. They build trusted relationships to help you develop strategies that align with your unique goals. What's special about Edward Jones is their holistic approach. They see financial health as a key part of overall wellness, just as important as physical or mental well-being. It's not about chasing dollars, it's about finding balance and perspective in your financial life. That's something anyone should be able to achieve. Ready to approach your finances with a fresh perspective? Learn more at edwardjones.ca. Money's a thing, but it's not everything.
Starting point is 00:41:45 If you're at a point in life when you're ready to lead with purpose, we can get you there. The University of Victoria's MBA in Sustainable Innovation is not like other MBA programs. It's for true changemakers who want to think differently and solve the world's most pressing challenges. From healthcare and the environment to energy, government and technology, it's your path to meaningful leadership in all sectors. For details, visit uvic.ca slash future MBA.
Starting point is 00:42:12 That's u-v-i-c dot c-a slash future MBA. With the FIZ loyalty program, you get rewarded just for having a mobile plan. You know, for texting and stuff. And if you're not getting rewards like extra data and dollars off with your mobile plan, you're not with Fizz. Switch today. Conditions apply. Details at fizz.ca.
