Big Technology Podcast - The Rise And Fall Of Facebook's Big Transparency Acquisition — With Brandon Silverman

Episode Date: September 21, 2022

Brandon Silverman is the founder of CrowdTangle, which Meta (then Facebook) acquired in 2016. CrowdTangle was once the most useful tool for marketers and publishers looking to find out what was trending on Facebook. Then, it became a favorite resource for reporters looking into how Facebook treats political content, leading to some headlines Facebook didn't like. CrowdTangle subsequently lost support within the company. Silverman left Facebook late last year and is now working with governments around the world on legislation that could mandate the type of transparency that he tried to push forward inside Meta. On this episode of Big Technology Podcast, Silverman tells the CrowdTangle story from start to finish. Sign up for my dedicated podcast email newsletter on LinkedIn.

Transcript
Starting point is 00:00:00 LinkedIn Presents. Welcome to the Big Technology Podcast, a show for cool-headed, nuanced conversation of the tech world and beyond. Today we're going to do a show that I've been hoping to do for a long time, and our guest, Brandon Silverman, can attest that I've been in his inbox pretty much every month since he's left Facebook. He is the founder of CrowdTangle, a really interesting company that brought and still brings, I guess, to some extent, transparency to Facebook, although it's been through a number of evolutions. Brandon founded it independently, sold it to Facebook. Now, he's left Facebook and is advocating for more transparency for social media companies. It's a fascinating story. I think over the course of this episode, you're going to hear a lot about what it's like to sell a company
Starting point is 00:00:59 to Facebook, what it's like to work inside Facebook, and then, you know, I think what we're really going to get into the heart of it is why we need more transparency inside these companies and why they're so resistant to bring it out. So with that, I want to welcome Brandon to the show. Welcome, Brandon. Hey, how's it going? It's going great. So we've actually known each other since before you sold your company to Facebook, which is now six years since you sold it, but, you know, we've probably known each other seven, eight years. And we've been talking. And, you know, I think that your story is definitely one I've wanted to bring out to the public.
Starting point is 00:01:35 And I'm thrilled that we're going to be able to do it today. Yeah, it's great to be here. Thank you for inviting me. And, you know, I mentioned this earlier. But, you know, I'm a big fan of your podcast and you're writing over the years. And there have been a lot of requests invitations I've gotten since leaving. And I haven't been able to get back to all of them, including the ones I wanted to. And partly that's just because of own personal reasons.
Starting point is 00:01:55 And we have a, we have a, you may hear this at some point during the podcast, but we have a nine week old new member of our family, our third kid. And so that's also, you know, can take up a lot of time. Yeah, no, no hard feelings. Although I have seen you showing up at a couple of places, Casey Newton's newsletter, Ben Smith. So I've been totally out of pocket. But anyway, I appreciate you giving the time. So why don't we start with this? What was crowd tangle?
Starting point is 00:02:17 What led you to found it? Yeah. So our origin story is a slightly, I think, unique one in the startup space. So I was in the nonprofit world. I was kind of a community organizer out of college, and you can let me know how deep to go into our origin story. But one of the things I ran into in a lot of our work and was just true across a lot of the nonprofit and organizing space is that the internet had shown up as this powerful, amazing thing with all of these promises in part to help transform organizing and political power and who has voices, et cetera. but for a bunch of years, it was really hard to figure out exactly how to take advantage of it. And so if you were trying to bring together people around issues they cared about, around topics, around political advocacy, for a while, the literally the best way you could take advantage of the internet was email.
Starting point is 00:03:11 And for a lot of us in this space, it was like, email is still pretty good. Say that as a newsletter writer. It works. Email is still really good. Yes. In some ways, that is still the most powerful way to do it. But there was this, I think for a lot of people doing organizing, there was a sense of like, there's got to be a more powerful thing you could do online in this space. And there were a bunch of attempts to build kind of community organizing tools online, everything from things like Nation Builder to, you know, one of the most successful parts of Barack Obama's campaign was this thing called My Barack Obama.com, where a lot of his volunteers could get together and self-organize. And there were out of the box kind of SaaS attempts out of things called like Blue State Digital. But as I was kind of wrapping up the initial version of the organizing I was doing, I was an internet nerd from growing up as a kid as I was really kind of intrigued by trying to
Starting point is 00:04:03 tackle that problem. And so the original idea of the crowd tangle was going to be to try and help build an online organizing tool. And the unique approach we were going to take to it was we were going to take advantage of this massive new platform and data that called the Facebook API, where they had released all these different APIs and tools. And at the time, we were like, hey, everybody we're trying to organize is on this platform. Why do we go build a whole new set of tools to take advantage of that?
Starting point is 00:04:31 And so in the beginning, we were one of those apps that was playing around with all of the data and APIs in order to build a community organizing tool. And like a lot of startups, our initial idea was not as great as we thought it was. And those for a number of reasons. but that initial idea didn't really work. And we found ourselves that we had kind of inadvertently built a thing that actually really made it easy
Starting point is 00:04:54 to see what was trending on Facebook. And that was the one feature that actually all of our initial kind of partners and clients were using. And so, you know, we had this very existential moment that I think a lot of startups go through. We're like, well, this thing we're super passionate about and have been gone around pitching for years
Starting point is 00:05:12 and all this stuff is not panning out. And instead, this random, thing over here seems to have like really people seem to like really value and love and um you know for a while we tried to hold on to both ideas but we eventually like you know eventually had to like pull ourselves out and just and we went and we decided to fully commit to that second one um and that was kind of the beginning of what kind of most people now know of is crowd tangle right and so using facebook's API you could see what was getting the most likes comments and shares and then sort of use that as sort of a shorthand to be like, okay, this has the most of those, so it must be trending across
Starting point is 00:05:51 the platform, all public links, right? They had made this move towards public links, and you were able to build a software interface that allowed people to see how, you know, what was, what was doing, what was spreading across Facebook. That's right. Yeah, I mean, the two unique things we did, I think, maybe three, is one, we built a very specific metric called overperforming, which is we found content that was getting more engagement than normal. And if you had just built a feed of what was getting the most engagement, and we actually saw this when we first built it, at the time,
Starting point is 00:06:25 it was literally just Vin Diesel and Justin Bieber and Taylor Swift. It was all of the huge accounts. And so you had to find some way to pull something more interesting out of like the full corpus of all of the public accounts on the system. And so he built this thing called overperforming. And on almost like a minute by minute basis, it let you see content that for some reason was doing. better than average. And that was a really powerful, like an insightful feed, you know,
Starting point is 00:06:49 based on everything we hear from our partners. And then secondly, we really focused on making it super usable and accessible and a super simple interface that was intuitive. And partly this was by necessity. We were originally building for nonprofits and they just don't have time to get trained on anything. And so we had to build an interface that was super easy to use, fast, you know, intuitive. And also, you know, we have a background. I was a graph, you know, I was an art major, some of our other early employers, or employees were all like design nerds. And so I think we also brought a design element to it that I think some of the other. At the time, if you wanted any of this data, you would sign up for tools like Sissimos or other things.
Starting point is 00:07:30 And they're great, but their interfaces were like these huge dashboards of pie charts and stack graphs and all these different things. And we were just, we were just much more opinionated and design oriented on what we wanted to show people. Right. And then I say the very, the third last thing I think what you alluded to is the context in which all of this is happening, which is in 2013, 2014, Facebook really began leaning into, you know, you call it kind of called third party content, which is instead of just friends and family in the feed, they began showing you way more New York Times and BuzzFeed and other links. And so if you were a publisher and you, you know, you can attest this as much as anybody, you know, in 2013, on 14 and 15, you were suddenly getting. these like massive amounts of traffic from Facebook. And like business changing traffic. And so suddenly the entire news and publishing industry was becoming just desperate to figure out where is this traffic coming from? How can we do better at it? You know, how do we stack up
Starting point is 00:08:33 against our competitors, yada, yada. And so, and you know, that was mostly dumb luck on our end, but we ended up also building it right around the time that Facebook was starting to really prioritize this stuff. And so it took our potential market and like, I exploded it. And this is when I became a crowd-tangled user because I was working at BuzzFeed as a reporter and our entire model was trying to figure out what was trending online and trying to, you know, embody or write stories that followed those best practices. So our stuff would spread online and BuzzFeed ended up becoming a huge publication because of that.
Starting point is 00:09:05 And I'll note, I've asked Jonah Prerty to come on the show. So hopefully we'll get them on. I know, Brandon, you heard me say that and here you are. so maybe it will work with Jonah. And it really was helpful to see what was spreading on Facebook and then say, okay, maybe we can write these type of stories or write these type of headlines, model our content, you know, this way. And it worked.
Starting point is 00:09:27 And then at a certain point, it starts growing for you. And you hear from Facebook, they're interested in acquiring you. How did that end up going from, you know, building this tool to, you know, having the company whose data you were actually sifting through. say, you know, maybe you'd be better inside. Yeah. Well, I mean, one of the funny stories, I'm not sure if I've told publicly, but my family's definitely heard this is, you know, we were at the time a tiny team.
Starting point is 00:09:54 You know, we had built this one metric. We kind of built a nice interface, but we were not a team of 15 PhDs and, you know, yada, yada. And so we were pretty terrified that we had just built a feature. You know, we hadn't built a company. And it could be just really easily ripped off by almost anybody. So we were in, you know, we were in stealth mode for a long time. And what would happen is we were just going entirely the word of mouth.
Starting point is 00:10:16 And thankfully, that was growing really quickly. But our website just said, you know, put your email in hand, email here to request a demo. And that was all that it had. And one day we were sitting, it was around this time, 2014 or something, 2014, we started to get a bunch of, you know, traction in the publishing industry when I suddenly get a request for a demo from Facebook. Oh. And I went, my first. response was, I think, oh, shit. And I called my co-founder, Matt, and I go, oh, man, we got a request from
Starting point is 00:10:53 Facebook. And I was like, what do we do? And we're like, you know, on the one hand, we probably can't just ignore them. But the other hand, if they call us in, we're going to have to show them what we built. And then they're probably just going to look at it and go, you know, so we talked to her, I don't know, we talked for like an hour, couldn't decide he had to go on a call. I had to do something. So we're like, all right, let's just, let's talk about this again tomorrow. And we did this for a week and couldn't decide what to do to the point where we ended up just kind of, we ended up forgetting about it and never getting back to them. And so like, so it eventually escaped our mind. You know, when you, when you find product market fit as
Starting point is 00:11:27 like a startup company, everything, everything just explodes. And it was, you know, it was just chaos. And so we lost track of it. And maybe like three weeks later, it comes in again. But at the time, there's like three people. There's like three people. at Facebook who all requests at the same time. And went, oh, my God, I don't think I ever get back to them. So I call up my co-founder. And we literally just have the exact same cycle. We can't decide what to do, what the right approach is.
Starting point is 00:11:49 And then finally, I think they had an intermediary reach out through like one of our funders, like the Knight Foundation or something. It was like, Brandon, can you get back to Facebook, please? And so he got back to him. They invited us to come up to New York. And I think I literally called my parents as I was heading into the building. And I was just like, listen, like, thanks for how supportive you've been in this whole journey, but this is probably the end of it.
Starting point is 00:12:15 You know, I'm going in to meet with them. But we went in and actually had like a great meeting and they signed up to be clients. And, you know, they were pretty, they were very forthright and honest. They said, you know, this is really cool. You're building something that's really valuable to all the partners. And it's not on our roadmap at the moment. And actually, and this is one of the light bulb moments happened for us, is they go, And actually, we would, our team would just like to have access to this data because we don't know.
Starting point is 00:12:44 Wow. And so one of our biggest user bases literally just became the partnerships team at Facebook. And then the same thing happened with YouTube, Vine, and eventually Reddit where it was just a simpler, easier way to get access to a bunch of data that they otherwise, you know, was too hard for them to access. So that was, that was the first time we met Facebook. Now, it's the partnerships team, so is people working with advertisers or people trying to, partnering publishers? Yep. So Facebook has this whole team and they're similar, they're analogous teams at pretty much most of the platforms where, yeah, they have people who work with their kind of organic content creators. And so it's everyone from, you know, the Logan Pauls of the world to the New York Times, et cetera. And it, but it is specifically not for their advertising, but for all of their other questions. and partnerships. Right. And so, you know, we did look at this stuff in the BuzzFeed as publishers as, you know, trying to, as trying to figure out what was circulating on Facebook, which was
Starting point is 00:13:51 the traffic cannon at the time. But at the same time, this was, you know, you get approached by Facebook at a moment where political content, you know, starts to become much more important on the platform. And we're getting close to the 2016 election. And this is a time where, you you know, a lot of misinformation is, is, you know, being tossed out on the platform. You know, it's not, it is controversial to say whether it's swung the election or not, you know, that's always going to remain an open question. Reasonable people can disagree on that. But it's not controversial to say that there were, you know, people in places like Macedonia
Starting point is 00:14:25 setting up content farms and having that blast across the platform. And it was doing gangbusters traffic and they were making, you know, big businesses for that. So it's kind of this interesting. moment where Facebook comes. And yes, they didn't have this data, but also, you know, they're saying, you know, we want to be a partner with you. Eventually that partnership turns into acquisition interest, you know, at this very interesting time. So, so take us to that point of the story, because that's sort of where the tension of this, you know, marriage begins even before it starts. Yeah. And I mean, you know, just to be fully transparent, I mean, the idea of potentially joining
Starting point is 00:15:03 them actually came up from the first moment we met them. Okay. They said, you're building cool. stuff. It's not on our product team's short-term roadmap. And so it was, you know, it was an idea that was brought up from almost the very first moment we met them. And there were versions of that conversation that actually happened over the course of years, over several years. So it was not as if suddenly in 2016 they came to us. It was, it had been a conversation we'd have for a while. And honestly, another, you know, one big part of that is we had this question of whether we had built a feature or, you know, a product or a company. And in the beginning, we weren't 100% sure because I think we hadn't got into it with the idea of building a social
Starting point is 00:15:42 animal leagues tool and it was entirely new to us. But one of the things that happened is we, as we sat down with people like you and BuzzFeed and others, there were all these different ways in which they wanted to better understand data coming from Facebook. And so it wasn't just about like what's trending and therefore how should that influence our strategy. But it was a who are the big accounts reposting our content was like a huge one. And that was probably the second feature we built, which people loved it, and we immediately had product market fit with it, and it was solving a real data problem that almost everybody had. Then as we started to sign bigger contracts and meeting with more C-level people at different publishers, one of their things was like,
Starting point is 00:16:20 well, we want to know how we stack up a competition. And so we built this leaderboard tool. And suddenly executives had a much better sense of like how they were stacking up. And it gives them more ability to like allocate resources and yada. So what happened also between that first meeting in 2016 was we actually, I think, both proved to ourselves but also, I think, to a bunch of other people, including Facebook, that we were good at building this stuff writ large, that we had built a full suite, that our core thing was not to actually see what's trending,
Starting point is 00:16:50 but actually we can organize and structure social data in really meaningful, accessible ways, and there's actually a ton of different use cases for it. And one of them, I think, to your point, was also being able to report on Facebook and social media platforms themselves. Yeah. It's kind of this weird circular. Okay. Yeah. Here it's column A and B now. Yeah. Yeah. But it had also been like C and D&E. And so there was less, you know, when people, you know, and honestly this gets down on one of the challenges we had internally was there's just a lot of different use cases. You know, like when you at BuzzFeed, if you'd go and talk to who are the BuzzFeed power users, it would be both reporters. It would also be like audience and strategists. It would also be sometimes the marketing team. There, you know, there would be senior leaders looking at it to see how they compare. to others. And so we ended up, it was both a blessing and a curse for us that we were,
Starting point is 00:17:40 we had built a bunch of different ways to organize social data and make it kind of easy to access. Right. So you're going to, so we get to 2016, but, and, you know, I think the other big thing that happened and in some ways was probably the most important catalyst for how we ended up there was, um, Facebook had decided, sometime in the 2012, 2013 period, that they were going to really go after news. And so they began kind of courting a lot more publishers in the news industry. They began putting a lot more news content into the feed. They started to staff out a news partnership organization.
Starting point is 00:18:17 But in 2000, I was probably to say 2015 or 2014 or 15, it started to realize that like it's actually a fairly complicated relationship and simply juicing the traffic and putting more that content in the feed was not going to like it wasn't as simple as that and so you know i'll give one concrete example is at some point so for a while i'd say for there were several years when the news industry like essentially loved facebook because they were just they were literally like five 10 xing their traffic on a regular basis but then what happened is the amount of news content you could put into the feed started to plateau and so suddenly there became a lot more comp so instead of all of Facebook traffic going up and to the right for publishers, it started a plateau.
Starting point is 00:19:04 And then instead of plateauing, it started to be like an EKG, where it would be like this like, not even EKG. It would be like, you know, just this very, like, unreliable, but like dramatic swings where one month you'd get five million visitors from Facebook. The next month, you'd get 500,000. And that got really frustrating because having an unreliable, huge traffic source is hard to like, you know, design business strategy. round. And so what happened is, and I'll, you know, I could go on too long around and stuff, but like the relationship got complicated and at times began to get like, uh, show seeds of like genuine frustration. And so in 2016, they internally, they were like, hey, how can we like kind of like create a better relationship with the news industry? And, um, they did three
Starting point is 00:19:52 things. Um, one is they kind of doubled down on their news partnership. And in fact, where they, in part by bringing in a bunch more people, including some senior people, so I hired Campbell Brown, who's like a very respected senior leader in the news space for a long time and came with a lot of expertise of knowledge. They gave her a team and a whole project called the Facebook Journalism Project. There was an entirely dedicated to this. For the very first time, they hired a Facebook, a dedicated product team for news, which literally had never existed.
Starting point is 00:20:23 And then third is they approached us about us joining and saying, why don't you come and you can provide them all of, you know, not only the data on what's trending, but you can give them data that helps them do their jobs, helps them, you know, all the different needs they have as publishers, and then we'll make it free, and it'll be one of the ways that we help support their use industry. So that was the biggest, you know, that was one of the other really big kind of capitalic things for, you know, our discussions with them. Yeah. Okay. And then eventually you go to, to, you come in house, you become, they acquire you. And kind of interesting timing. I think it was announced the day after the 20,
Starting point is 00:20:57 election result two days after it was a very surreal experience i mean i i'm a i was not a i did not for trump um most of my social network did not and uh it was a very funereal atmosphere on my facebook feed and i remember having to go in and be like hey uh we also just got acquired by facebook um yeah so yeah it was um yeah i mean but you know the way these But did it, you know, whether you liked it or not, did it come into your mind now that, oh, Facebook was like a pretty important political beast and you are going to, you know, as the company that provides like a lot of transparency to people in terms of like how it's steering the conversation, you are now inside. Like did that all, did that all like sort of come to your mind? And how did you think about that that day? Well, I mean, for, I mean, just to pull the curtain a little bit back. I mean, usually the way these things happened is so that announcement had been in the works for like three months. So the actual decision around acquisition, usually what happens is like it usually takes somewhere between three and six months before any announcements actually made. So we had been that summer kind of like solidifying our discussions with them and getting to the point where you, you know, you sign a merger agreement, then you go through due diligence and yada, yada. So we had, you know, that all of that calculus for me had been going on for both a couple of years, but then really peaked in the summer.
Starting point is 00:22:21 And yeah, I mean, listen, the big motivation. for us were that, one, they came to us with a really compelling and convincing pitch around supporting the news industry and journalism around the world. It's like one of the really concrete things was like, we want to make crowd tangle free for every news, you know, entity on the planet. And that we're also taking all these other initiatives to like redouble down on what we want to do. And, you know, we had begun to see that accountability for the platform itself was one of our use cases, but just to be totally honest, you know, it hadn't, it hadn't percolated to like the level of like, you know, weekly articles using the data. So it was one of the use
Starting point is 00:23:04 cases. And I think one that we, we also believed in and where like thought was really important. But at the time, it was also this really broad suite of things that we did to help the industry. And Facebook was like, we want all of those. We want to make free. And we want you to explain globally. And that was a really compelling pitch and one, you know, it was exciting. Right. And so you go into the company right away. I imagine things started pretty good. They did, except we had one just catastrophic screw-up, like three months in, which was one of the most stressful things I've ever experienced.
Starting point is 00:23:35 So, you know, we get there and we're obviously like very, you know, part of what I want to do is, you know, usually the way acquisitions happen is there's some leader or somebody internally at Facebook who's like shepherds the process and the champion advocate. And so once we get there, I also want to make sure that person looks good. and, you know, their champion of us was worth it. But so there was, once, yeah, after Trump's election, one of the things people obviously began looking into was what was the role of misinformation and what the role of the Internet Research Agency out of Russia played.
Starting point is 00:24:09 And there was an article that eventually came out in the Washington Post that was entirely based on crowd-tangle data. And it was about the organic reach. of all of the content that came for the IRA. Previously to this, most of the focus had been on the ads. Right. And just for context, the IRA, the Internet Research Agency was a troll farm in St. Petersburg, I believe, that was aimed entirely at disrupting, really causing chaos in the U.S.
Starting point is 00:24:37 as the election went on. Yeah. I tend to think of troll farms is like either more spammy or like just trying to, you know, cause chaos. This was a very deliberative, like, state-sponsored attempt to interrupt in, you know, the political atmosphere of the US. Right. Like they had like they organized opposing protests on both sides of an issue across the street from each other and people actually showed up. That's some crazy stuff. Okay. Anyway, sorry, go ahead. So most of the focus had been on the way in which they had run ads to do this. And eventually people began to get to look into the role of the organic content
Starting point is 00:25:12 from the IRA. And there was a there was a researcher I can't remember where NYU somewhere who used Crowtangle to try and figure that out. What was the scope of how much, you know, the reach that the organic content got? And he used our system to estimate the number of Americans who saw the organic content. And it was wildly, it was not, it was wildly off base. And it was entirely because we basically had like just a, a part of our system that was not particularly well designed. And it was easy to understand how they thought that was the implication of like one feature.
Starting point is 00:25:55 But it was just it was not true. And we were adding up numbers. We shouldn't have added. Yeah. Like you had the reach of, I didn't remember this. You had like the reach like the total number of likes for all the fans for every page. And you could say like you reached an audience of five million even though maybe like 5,000 people saw something. Yeah.
Starting point is 00:26:13 I mean, there were two things. I think I think I can do it quickly. So there are two things. If you're a your page, what happens is there. all these large pages that repost links. And we can show you all the pages reposted link. And, but there's two things that happens. One, if a, if a page has a million fans post a link, not all millions see that.
Starting point is 00:26:29 Yeah. You know, and if anything, veterans of, you know, social media will tell you it's like 1% at best see it. And then secondly, what happens is, so one is that person was taking the total audience every time and assuming that's how many people saw it. But then secondly, is if a page reposted it, we are, our interface just added that. So if the same million person page posted a link three times and they each had a million,
Starting point is 00:26:54 we said three million people might have been reached. And it was just a, you know, it was just a shitty interface that we had that was easy to understand. Unfortunately, this researcher was not super amenable to us explaining that and being like, hey, that's not actually what it means. So anyway, the short version is Washington Post article, you know, leading with this number, about how many people this organic content reached. And it was on the day that I think Cheryl Sandberg was like doing around the horn meetings on the hill about this stuff.
Starting point is 00:27:29 So I felt, you know, I felt it was super stressful. I felt horrible. I felt embarrassed, you know. And, you know, to Facebook's credit, you know, I also expected to go into a room at some point and just like had my head taken off. And it didn't, it actually never happened. And they were super understanding. And basically what I was told was like, we understand everyone makes mistakes.
Starting point is 00:27:47 you're going to get in trouble if you make the same mistake twice. And I was like, I really appreciate that. And, you know, we worked our asses off to get to fix as fast as possible. But so that was how it started, which is unfortunate. But otherwise, besides that, basically what we did is spent the first two years. You know, one of the really big concerns we got acquired was that we were going to go away. That's oftentimes what happens with tools like ours. There was this famous one called Topsy.
Starting point is 00:28:12 The journalists used to love and Apple acquired it. Yeah, I love that. That was great. So everyone, there was this whole. topsy experience where Apple bought it and then just killed it and people hated it and everyone so we are big thing and this is what I talked to Facebook about as well is like we don't want to get topsy like you're buying CrowdTangle you want CrowdTangle to be a thing we're going to grow and do more and so we spent first year or two really trying to to like prove out that that was why we joined
Starting point is 00:28:37 and that was the case and so we launched a bunch of products you know we launched a partnership with Reddit after we joined we launched an entire version of CrowdTangle just dedicated to local news. There were a bunch of work we did to kind of, like, show this to folks. You know, I internally, like my title internally was the CEO of CrowdTangle. Our whole team was still known as CrowdTangle. Like, so that was a really, you know, that was a big focus for us for the first two years. And honestly, it was, it was great. And I think, you know, there were a bunch of stuff I'd do differently.
Starting point is 00:29:07 But for the most part, we were able to accomplish a lot of what we hope to. Right. But, you know, at the same time, Facebook was, you know, thrust under the microscope, microscope by both parties for its role in either elevating or quashing, you know, political speech. And, you know, CrowdTingle was the number one tool for transparency inside the company. And it does seem like there is this, for Facebook period, now it was inside the company. And it does seem like there was this tension between, you know, Facebook now had a decision to make, do we want to keep CrowdTangle live and allow, you know, the transparency.
Starting point is 00:29:46 And just sort of, you know, people to run with, like the Washington Post story reported wrong. I'm actually, maybe it's the perfect example, you know, where there are stories that were reported and, you know, Facebook lost control of it. And there's a sure is a temptation to lock down the transparency and the sake of public relations. You know, I think what we began to start to see a shift was as things got closer to 2020. And I think there were a number of things happening. I think one is their, you know, the relationship, all of their plans and aspirations around kind of building a stronger and deeper relationship with the news industry, like, that strategy didn't work. And if anything, the relationship between the news industry and the platforms and Silicon Valley writ large got like more and more adversarial and like toxic and not between, that's generalizing a bit, but like, there was a real loss of trust between, news industry and platforms. And both would say, you know, both would talk about why, you know,
Starting point is 00:30:49 they have every right to have lost trust in the other. And I think they're both right at some our level. But like, the interest in working with the news industry began to like become less of a like obvious consensus thing inside the company. Secondly, I think there was a real concern about potentially being branded as having helped reelect Donald Trump a second time. And then third, I think in general overall scrutiny of the platform. was increasing globally. And so, and, you know, we obviously became front and center for that by, you know, by 2019, 2020. And, you know, I think one of the things, I mean, this is a long conversation, but I think, you know, I,
Starting point is 00:31:32 there were a bunch of dynamics that, like, I think made that topic hard internally. One is that there were a bunch of platforms that weren't doing any transparency at all. And so one of the questions always, like, why are we putting ourselves out on a limb when YouTube isn't doing anything or, you know, other platforms aren't doing anything? So one, there was always this like, why are we the only ones doing this? Secondly, you know, and I take some accountability on this. Like, you know, our team really believed in what we did. And like, you know, we pushed hard on it. And so one of the things that happened between like over the course of being there is instead of just working with the news industry and publishers, you know, I spent a bunch of years.
Starting point is 00:32:13 is trying to get approval to add a bunch of other verticals and industries to like the approved partner set. So we began and I was successful and our team was. And so we added academics, researchers, human rights activists, election protection groups. So a lot of like the civil society that I think has also helped drive some of the public scrutiny and accountability, like more and more of them getting onboarded and using tool. And, you know, that made a difference. So I think one, we were also pushing hard on it because we believed in it. And I think lastly, is there had not been like a fully realized internal and all the way up to like senior leadership strategy on what to do about transparency. Right.
Starting point is 00:32:55 And so by the time 2020 gets around, it comes around, you have all these dynamics are bubbling up. You have more and more at, you know, what some executive see is like frustrating coverage that, and by the way, it's not like frustrating coverage, but it was the frustrating coverage they felt was unfair was just like bubbling up more and more. to the point where they were finally like, and then meanwhile, we just keep launching more and more transparency stuff. And every time we sit down to show the stuff we have coming down the pipe, they're like, okay, you know. Yeah. And at this point, the discussion also morphed from is there misinformation on Facebook to does Facebook amplify conservative voices or does it suppress conservative voices? And people were using crowd tangle to draw their own conclusions. But sorry, go ahead. You were saying it finally. Yeah. And by the way,
Starting point is 00:33:43 That example, I think one important thing to say, one important thing is in all this is, like, that was the U.S. conversation. Like, Crowdsingle was being used globally and around the world. In some ways, like, the work I think I was, like, most proud of was work happening in Myanmar and Sri Lanka and Ethiopia and other places. It was becoming important enough to the company. There was, like, regulation being starting to be talked about, like, all these different things. And the initial idea around how to solve the rights of the news industry had clearly, like, not manifested in the way they hoped, that there was finally a, like, fully actualized. like the right people in the room conversation about what to do about transparency. And that was in the moment when there was just obviously they weren't, there was not as
Starting point is 00:34:23 much alignment as we would have needed to do all the stuff we wanted to do. There was a line in a New York Times article about, about you or about crowd tangle that said Facebook executives are more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it was amplifying harmful content. Fair? I don't think that's fair. I think it's, you know, first of all, I think it's tough to, like, describe any of these companies as total monolithic, yeah. You know, we had various senior leaders who were, like, 100% bought in on everything we were doing and wanted to get it right and wanted to make it more available.
Starting point is 00:34:57 We also, there were some senior leaders who didn't believe in transparency. And there were some who were, like, who that describes perfectly, but it was not monolithic. And there was, like, you know, what you're trying to do is navigate between six or seven, you know, senior people who are going to decide. the outcome of this stuff and it was fairly divided and if anything the vast majority of those people really believed in the work and thought it was important um so yeah i don't think it's fair to like there's this article with an email chain of like some of the top facebook executives nick clag i think fiji simo was running news feed and all of them seemed to be like oh like this isn't good for us yeah i mean well first of all that that was like you were on that email chain so yeah
Starting point is 00:35:35 and to be clear like that was one email chain i was on like there were many iterations of that email chain over the time. And I think that was about a very specific version of it, which at the time, you know, this was about, they felt like Kevin Ruse has a, you know, a Twitter bot or I think that for a while was manually doing it that had the top 10, you know, link posts on the platform. And there was a time in which that was getting a lot of attention as the 2020 election was ramping up. And so at the time, it was a very specific conversation around what can we do to potentially combat this particular narrative that, like, Facebook is a right-wing eco-chamber, which at the time, not only was a Twitter bot, like, Mark did an interview with Axios on HBO,
Starting point is 00:36:19 and it was like one of the questions they got asked, the economist did a whole piece about it. It just, there was a two-month period or maybe a month-long period where, like, that narrative was just really took off. And so, you know, we attempted to solve that problem, and I'm getting in some of the details on it. But this particular email chain was a like, how can we combat that narrative we don't think it's accurate and versus the entire crowd tangle situation yeah yeah yeah but nonetheless you do eventually in in april 21 get called into a room and say gent told your team is is being disbanded and the reports are that you're basically you know what are you like kind of like a minister without a portfolio you're just kind of hovering around
Starting point is 00:37:04 inside facebook so it does seem like at the end of the day you know that idea that maybe the company needs less transparency or if this stuff is happening, it shouldn't be coming from inside Facebook one out. So what happened there? Yeah. I mean, so there are a few things that happened. So I think, you know, that email chain was part of a number of different things that bubbled up to a like, hey, we need a coherent company-wide strategy on this stuff. And, you know, there were, there are parts of that I feel comfortable talking about. There are parts I haven't quite figured out yet. But at a high level, like the version we were interested in,
Starting point is 00:37:39 there was not alignment on. And that was kind of the beginning of the end for what, you know, what had been like our four-year experience so far of like being independent crowd tangle, pushing hard on this stuff. It became clear that that was, that era was coming to an end. I think there's a couple really important things is, first of all, I don't know if you've ever worked at a big company. I mean, I don't know if you call BuzzFeed a big company.
Starting point is 00:38:03 But like, there is a universe in which like a lot of Facebook's transparency efforts over the years were really them shipping their org chart and not having a coherent strategy. And one of the things we ran into when that larger conversation happened was where should this sit and who should own it? And there are not a lot of incentives. There were not a lot of incentives for anybody to raise their hand. Yeah, because you're going to be responsible for all the headaches. Yeah.
Starting point is 00:38:31 And that was a real challenge when this like deeper strategy was being kind of like figured out. Secondly, you know, one of the very specific disagreements, and this is one like I could talk about a lot and I really believe in deeply, is if we're going to share data with the outside world, who do we share it with? And one of the things I had come to believe after like, you know, eight years of doing this at the time, or 10, close to 10, was if you have built a public space that houses like a saving amount of like civic and political discourse. journalists should also be able to see what that looks like. And so whatever the transparency vision is, it should include some version that allows journalists to see what's happening on the platform. And by the way, I think that is even more true
Starting point is 00:39:23 if you think about a public platform that essentially has like the governance model that Facebook has, where a single person is ultimately responsible for like designing and shaping those spaces for the plans. And Mark Zuckerberg. Yeah, because, you know, he owns a vast motor to shores. he's also very involved in product stuff.
Starting point is 00:39:39 And it's not that this isn't a, you know, we could talk about its decisions. But like, if you just imagine a fictional world in which one person is dictating the design structures for like the public spaces of every country in the world, I think, you know, well, then you need even more accountability and scrutiny for that. And so a very specific thing is we thought, we think journalists should be able allowed to see what's happening on the platform. And that was an area of disagreement with a couple leaders that was like ultimately one where I was like, you know, wherever this is going to go forward if that's not
Starting point is 00:40:08 part of it and it's not like vision we believe in look so okay and then you end up leaving the company um i want to take you back before we break here to the uh the day of the acquisition was announced and i wrote this inside uh at bus feed because i was like oh this is interesting that they're going inside facebook but there is an obvious downside so here's what it is um crowd tangles existence was of course always dependent on facebook since facebook could have cut off crowd tangles access to its API if it felt the company was causing more grief than good. But now questions like, I'm seeing something weird on Facebook, can you help me look into it, will be much harder for Crown Tegles team to answer.
Starting point is 00:40:47 How they respond will be a critical test of Facebook's willingness to be transparent with reporters and by extension to the public. You know, that's the end of the acquisition story. So, you, I mean, let me know if I have to take this out, but you and I spoke about it and you said, don't worry about it. This is going to go in the right direction. Wasn't this all foreseeable in some way? I think it's entirely possible as foreseeable, but still totally worth it.
Starting point is 00:41:11 First is we did an enormous amount of work that I'm like deeply proud of in like the four years we were there. A lot of which is like never never got a lot of press or coverage. You know, we worked with. Yeah. Yeah. I mean like, you know, we had, you know, there were like human rights activists in Myanmar who like would send us emails about ways in which like the work. they were doing using CrowdTangle was like helping save lives there. There was similar ones we'd have in Ethiopia and Sri Lanka and other parts of the global south. There were
Starting point is 00:41:41 election, you know, we ran a ton of different election protection, you know, war rooms that it was just our team. And they're everywhere from Nigeria to Brazil to the U.S. I mean, we did an enormous amount of work around both federal elections when we were at CrowdTangle. You know, we provided a ton, We were the first data source post-Camberge Analytica to made available academics and researchers. And it was, you know, there was a lot of data that didn't provide them. But there was a, you know, I think we had hundreds of peer-reviewed articles in academic journals that were like looking at really important questions about the platform. And so, like, you know, we launched offices in Singapore, South Paulo, London, India. You know, there was, you know, it might be the most impactful work I do in my whole career.
Starting point is 00:42:31 It was a really amazing experience opportunity, and we did a lot of work, and most of it didn't generate a lot of coverage. But, like, if you talk to our partners in those spaces, they will say it was like a lifeline for their work. And so one, I think that's true. Secondly, is when we got there in 2017, following, and then when Cambridge Analytica blew up, Facebook shut off a ton of their APIs. So if we had not. So you would have been toast. We would have been close to. toast. They completely shut off their Instagram API. So our entire Instagram, any insights in
Starting point is 00:43:09 Instagram would have disappeared. And then, you know, they massively curtailed their, you know, the blue app API to the point where we might not have been an approved for access. And if we were approved, we certainly couldn't have like called it as often as frequently as we did from, you know, internally. There were also products where we use data that otherwise there were no APIs for even pre-camera general at it goes. So we began working on this thing that's now public, but it's an archive of content that's been removed for the platform for coordinated and authentic behavior. If you look, when you talk about things like the IRA or the Macedonian troll farms, anything that is taken down off the platform for being some sort of state-sponsored
Starting point is 00:43:51 disinformation campaign, it mostly just disappears completely. And we spent a bunch of years talking to lawyers, others internally, to get the permission to build an archive for it, where, researchers in the external world and the outside world could start studying that content even after it was moved. And that, there's zero chance we could have done that from the outside. And then, you know, the third, and I think this gets to what I've been spending the last few months on, is I think the scale at which we eventually were able to grow and everything we were able to do, I think elevated how important transparency is. And so if you look at a lot of the regulation being considered around the world, more and more what you hear is like transparency
Starting point is 00:44:27 is one of the key things that regulators are trying to help solve. And I've gotten, you know, I've, like, lost track of the calls I've gotten from lawmakers who are, like, designing these transparency regimes. And one of the things they want to do is have a crowd tangle-esque component. Crowd tangle today still exists, but it's kind of like a maintenance mode. Is that right? Yeah, although, can we pause for a quick second? So, so you wrote that article.
Starting point is 00:44:51 Yeah, go ahead. Yeah. Is that, was it, do you find that a convincing argument? Well, I do. Yeah, I do. I mean, I think that I've started the paragraph the way that I did for a reason. And the start of the paragraph is crowd tangle exists basically because of Facebook's, you know, API. If that API is gone, then crowd tangle's gone. So actually that both of those, both sides of those paragraphs turned out to be, you know, good predictor of what might happen. They did end up, you know, shutting off APIs around the world like you mentioned. And the fact that you were in there probably allowed you to do the work that you were doing more. And I think that also like when it comes to, you know, how much transparency Facebook was actually interested in, it was always clear to me that there was limited and it looks like you found a limit. So, you know, it was the acquisition, it was, there's nowhere in the story where I say like crowd technology shouldn't have sold and Facebook, you know, shouldn't have bought. I think that when that happened to me, it was clear that it was going to go down a road that it eventually did.
Starting point is 00:45:55 but when you're in the position that you are with crowd tangle, like I think you made the right choice. Yeah. I mean, I also think there's a few other. Yeah, I mentioned this earlier. It was like,
Starting point is 00:46:08 you know, we push pretty hard on it. And I think if our goal had been to keep crowd tangle around, you know, forever, there were probably some things I could, my team could have done a little differently. You know,
Starting point is 00:46:19 there were opportunities to help advertisers and marketers that we, you know, we didn't lean into as much as some other things. and that probably could have, you know, guaranteed our longevity a little bit more. Yeah, got to be near in the money. Yeah. And, you know, I mean, the other reality, too,
Starting point is 00:46:34 and I think maybe this frames your, a little bit of your thought process too is, you know, there's not a lot of founders who sell companies to Facebook and to some degree of most companies that end up staying. You know, if you're an entrepreneur and founder, I think one of the things,
Starting point is 00:46:53 you have a lot of attachment to the thing you built, and especially when they came there and they kind of let our team still own it, at some point they're going to make a decision to do some things that are match whatever their priorities are at a given moment that are a little different from why you got into it. And I think over and over, like you see that happen
Starting point is 00:47:09 with almost every acquisition. And if anything, I think the, you know, I sometimes joke that like, you know, my biggest accomplishment was actually lasting four years while we were there, you know, and like having the runway we did get and the independence we did get. But there were also, you know,
Starting point is 00:47:24 there were a lot of champions of us internally, and they deserve mostly credit. But partly it's like, you know, this is a little bit what happens in these stories, too, which I think is probably your point as well. Yeah. And so, and it's good. I mean, I feel like you're, you're, every time you speak about this, you're realistic about it. You're not like what the WhatsApp founder never sell your company. You know, after he made billions of dollars.
Starting point is 00:47:45 He did that tweet. Like, you're being realistic about it, which I appreciate. And just give us a quick update on like crowd tangle today. Maintenance mode, does it exist anymore? I haven't heard anything about it for a while. I know some I don't have like, you know, I'm not getting like daily updates on it from Facebook, but like it is operational.
Starting point is 00:48:02 People can still use it. There are definitely more complaints from partners who use it, that it's, you know, the bugs take a little longer to get fixed. It's not as fast. I think, you know, something I mentioned when I testified in the Senate a few months ago is, you know,
Starting point is 00:48:14 tools like CrowdTangle are only, they have to change because the platforms are changing constantly. And so you have to be launching new features to keep up with all of the product changes in the platform itself. And they've been very, to my knowledge, like no new product launches, I think essentially since the Reorg process started with our team.
Starting point is 00:48:34 And so, you know, unfortunately thing about that is it just starts to get less and less relevant. So like as Reels explodes, you know, I'm sure if we were there right now, we would have already launched a Reels integration or be working on one because, you know, of how important that product is coming.
Starting point is 00:48:48 So I think, you know, there hasn't been a lot, but it is still available, you know, I think there have been a few moments when they stopped onboarding people and then began onboarding again. And the last I've heard is they've publicly committed to keeping it around for the midterms. But I think after that, they're very non-committal about what happens to it after that. Okay. I had to get to a break. Brandon Silverman is with us.
Starting point is 00:49:10 He's the founder of CrowdTangle, which you've heard a lot about in this first half. Hopefully we can go a little bit longer and talk a little bit about the transparency work that he's doing. And what Facebook might do to news. So we'll be back right after this. Hey everyone. Let me tell you about The Hustle Daily Show, a podcast filled with business, tech news, and original stories to keep you in the loop on what's trending. More than 2 million professionals read The Hustle's daily email for its irreverent and informative takes on business and tech news. Now, they have a daily podcast called The Hustle Daily Show,
Starting point is 00:49:41 where their team of writers breaks down the biggest business headlines in 15 minutes or less and explains why you should care about them. So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now. And we're back here on the second half of Big Technology Podcast with Brandon Silverman. He is the founder of CrowdTangle, sold to Facebook in 2016. You heard a lot about that in the first half. Now, I want to talk a little bit about some of the assumptions underlying that first-half conversation, beginning with news on Facebook, because every one of these conversations
Starting point is 00:50:18 tends to, you know, come with this established given: that news is very important to Facebook and Facebook is very important to news. But that no longer seems to be the case. And that doesn't get talked about very much. So, Brandon, can you just talk about it from, like, your vantage point? What's Facebook's view on news now? Because I don't think it was just transparency that they decided they wanted to pull back on. And maybe it's a good thing that news is no longer a priority for them.
Starting point is 00:50:45 You mentioned a little bit about the news partnerships team not working out. What's the state of Facebook's approach to news right now? Yeah. And to be clear, it wasn't quite that news partnerships didn't work out. Like, they did a lot of great work over the years. And I think they're still there. So, how I think about this is, you know, I think in some ways what we're seeing is the end of a 10-year experiment of Facebook
Starting point is 00:51:10 going after news as a market opportunity. We're seeing, like, kind of the end of it now. And, you know, when Facebook really started, in the beginning, Facebook was just friends and family. And it was in 2011 and '12 and going into '13 when they started ramping up how much of what they called third-party content, and a lot of that was news, went into the feed. I think the honest answer is that probably mostly came from a sense of, like, competition with Twitter. Twitter was exploding around the same time and was seen as, you know, a potential existential competitor. And Twitter was all news.
Starting point is 00:51:44 And so was there a way in which they could, you know, go after a similar market, if not, like, you know, pull the rug out from under them before they got too big? But I think one of the things that happened, and this is also, you know, how CrowdTangle was wrapped up in this whole story, is all of the different attempts to play a meaningful role in news
Starting point is 00:52:03 have turned out to be way harder, way more complicated than I think they ever imagined. And at this point, I think what you're seeing, and this has been leaked out, and in some cases is just kind of publicly reported on, is that a bunch of these kind of signature efforts in that space are either being deprecated or, you know, in some cases, not being renewed. But I'll just kind of go through some examples. You know, there are product things ranging from Bulletin, which was designed to be kind of
Starting point is 00:52:34 a Substack competitor, but again, with, I think, news as a heavy focus. There was a podcast thing. Yeah. Yeah. When that came out, I was like, I'll never do it. And people were like, you're an idiot. They have a massive audience. I was like, this thing isn't going to last five minutes.
Starting point is 00:52:46 And here we go. Sorry, continue. Yeah. And, you know, I think at the time, again, my sense internally was there were a lot of, like, very earnest, good-faith attempts to do some of this stuff. But, like, it's just way more complicated. It delivers way less value. And again, there were these external things happening in the world that were making all of them way more complicated, including just this overall lack of trust and loss of trust between the two industries, as well as just changing user behavior. So there was, you know, Bulletin, among others. I think the News Tab was the single biggest product and partnership announcement that ever happened on Facebook, which is: we're going to have a dedicated section just for news, and we are going to give the news industry a lot of money to put content there. And there was a period a year and a half, two, two and a half years ago when that was a huge deal.
Starting point is 00:53:37 And the plans were to roll that News Tab out in places all over the world. But I think it's now been publicly reported, in the Wall Street Journal, that they're no longer renewing those deals, and they're certainly not announcing new ones. Yeah. Big Technology, unfortunately, missed out on that cash. Just kidding. We don't do partnerships with Facebook. Sorry, go ahead. Yeah. So I think what you're seeing is the end of this kind of, like, how can we go after that space. And I think there's a bunch of reasons for it. But I think, you know, part of the calculus was that certainly in the U.S., it is really, really, really hard to be like a neutral
Starting point is 00:54:23 entity in between, like, the right and the left. And if you're in news, you're inserting yourself into the middle of, like, the entire maelstrom of political polarization. Now, there are certainly really deep, meaningful critiques that have nothing to do with that, and ones that I'm very, like, sympathetic to and believe in. But I think one of the core things was just that, like, trying to be neutral inside the American political and media ecosystem right now is really, really, really, really hard. Now, again, there are other critiques, which I'm happy to, like, go into, that I really believe in. But I think that was one of the core nubs from their perspective. Well, it's also, like, I'm curious if this is a critique that you believe in or not.
Starting point is 00:55:01 News and Facebook, they don't really belong together, because a lot of what's happening on Facebook is just very emotional. And, you know, it does seem to me like your information, your news diet, should come from, you know, a more thoughtful place, and less from, you know, emotional and confirmation-bias types of reactions. You don't agree with that? No, I'm shaking my head because of how much I do agree with it. Like, no, I think the premise that we should filter somebody's news by what their weak social ties are sharing with them, combined with, like, an engagement-based algorithm,
Starting point is 00:55:36 is just not a great way to deliver news. That provokes angry sharing. You know, it's like, well, how can I make people angrier? Yeah, and I spent a lot of time deep in the feed, and so I push back a little bit against this, like, monolithic, it's-all-angry-stuff view. Like, the biggest news content every year. Yeah, but anytime we did the biggest news posts of the year, it was littered with, like, goats on cop cars and bears in hammocks and the deaths of celebrities and stuff like that. So there is certainly, my big thing around anger is it was always, like, cheap anger. And news is where you get into, like, really, you can just develop a shitty
Starting point is 00:56:19 feed. Um, but that's one of them. But I think, yeah, this entire premise that we should deliver news based on, like, what your weak social ties are sharing with you may just not have ever been a good one. And then, secondly, I think, you know, there's a case to be made, um, for some class of Harvard MBAs somewhere, that from a business perspective, deciding to insert yourself into, like, the center of the American political system by being the primary distribution channel for news might have been, like, one of the worst business decisions ever made by a large company, and that it was never going to be worth it from, like, an engagement and revenue perspective. And then, meanwhile, so much of the loss of
Starting point is 00:57:04 trust in their brand came from that decision, as well as from how insanely complicated it is to do that, not just in the U.S. in the moment we're in as a country, but then to take on all the responsibility of doing that in every country in the world. Like, it just was a horrible idea from the get-go. And in the meantime, they lost sight of all of their focus on, like, leaning into friends and family and deep social connection and meaningful relationships. And yeah.
Starting point is 00:57:39 And now that's in this awkward place in the middle where they have some of everything. So, okay, I want to ask you, you said you spent a lot of time in the feed. I want to ask you, we had Brendan Nyhan on who was talking about, and we had Jonathan Haidt on, talking about the impact of social media on society. Oh, yeah, by the way, you just reminded me. You said one thing I was going to hit on. You also said, by the way, in that last question, maybe it's a good thing. Yeah.
Starting point is 00:58:05 on the divorce from news. I think there's like an entire hour-long podcast you could have around that. And I, my instinct is I think it might be a good thing. Now, like all things in the world, yeah, it's not going to be without costs.
Starting point is 00:58:18 And I think the biggest cost is in places where Facebook is the de facto internet, in places where there are authoritarian governments and very little free press. Like, taking independent news voices off of Facebook is going to have a real cost to, like, you know, burgeoning liberal democracies in places that are struggling. So there's going to be a cost, but I think, writ large, without putting in the resources that are necessary to do it right, the net is probably harmful at, like, a global level. And that's a very, very complicated topic. I think smart people disagree. But my instinct is, like, in the
Starting point is 00:58:56 end, it's probably a good thing. If you want to do that hour-long podcast, I'll have you back whenever you want. So just letting you know, open invitation, if you're willing to come on after this. Going back to what I was saying before, the Brendan Nyhan comment, we have all these researchers trying to figure out, you know, is Facebook good for society? Is it bad for society? Is it a place where harmful, you know, content is the main form of content, or spreads to the point of, like, a serious detriment to society, or is that overblown? Those guys, you know, they're smart, they're academics, they have access to a lot of data. But they were never inside. And there are few people inside Facebook who were in a position like you were, where you had
Starting point is 00:59:34 access to effectively the fire hose. You spent a lot of time in the feed. You know, I want to ask your opinion on this after having seen some of this stuff. Um, is the misinformation problem on Facebook as bad as a lot of people talk about? Kind of, but not in the way people talk about it. So, yeah, I'm going to revise that: I'm going to say not in the way people talk about it. I think the vast majority of quote-unquote problematic misinformation is always going to be, by almost anybody's definition, legally allowed on any platform. That the single most, like, the single most important graph ever
Starting point is 01:00:16 I think produced about social media was one Mark actually put out publicly, which is a graph about the amount of engagement a piece of content gets and how close it gets to being violating. And as you get closer to being violating, you get more engagement. And that's a fundamental, incredibly difficult problem to solve, because wherever you put the line, the vast majority of content is going to find a way to not violate the rules but be right up close to it. And the example I give for this is, you know, I could start an outlet that all it does is focus on the crimes that legal immigrants are committing
Starting point is 01:00:53 in the country. And every day I tell three different stories. And anybody who watched that channel would come away with the impression that legal immigrants are, like, you know, a massive source of crime and disproportionately breaking laws, all this different stuff. And that is entirely incorrect and false. But each of those individual stories doesn't violate any fact-checking rule. And I'm not even sure each of those individually is misinformation, but collectively, they very easily are misleading people.
Starting point is 01:01:23 And I think that is the much harder and much more important question for us to solve when it comes to building healthy information ecosystems: totally legal, but, like, deeply misleading content. And the closest I can get to having the answer to that is as much transparency as possible. That when people see those ideas, other sources of news and, you know, influence can, you know, counter them and disagree with them and argue with them. And it's not perfect. But I think, um, for me, that's where the biggest challenge is with all this stuff. Now, that being said, I think the other big challenge is these platforms are so big that there are
Starting point is 01:02:00 just, to be totally blunt, also a lot of, like, execution errors. And, you know, I think one of the really big divides you have, and I'm not sure people will kind of frame it this way, is that you have a lot of people who work inside a platform who are like, this thing is so huge, we're actually doing as good a job as you could possibly do. And then you have people on the outside looking at it and going, but you're still missing all of this stuff, and all this stuff is really important. And those two things just, like, fly by each other, because the only real answer to that is that the thing shouldn't exist, you know. And so I think one of the underlying questions on all this stuff is just a debate about responsibility, with people internally saying we're managing this as responsibly as you possibly can, and people
Starting point is 01:02:52 externally going, you're not, and that being the core of a lot of the arguments. But I think, you know, when you get into the misinformation space, like, I'm particularly sympathetic to a lot of the research I've seen, which is that frequently, like, the call is coming from inside the house. Like, if you look at Network Propaganda from Yochai Benkler out of Harvard, like, there's over and over a lot of research that an enormous amount of misinformation is coming from elite stakeholders in society. And so I don't want, like, a... A president, maybe. Yeah, a president, like the most-watched news source in a country. And I don't want a single entity, a private company, deciding those people can't, you know. So I think that's why, again, I come back to as much transparency as possible around information ecosystems as how I think you get at a lot of the challenge. Yeah. Question number two. We'll try to move through these quickly, although I'd appreciate the thoughtful answers.
Starting point is 01:03:52 So go as long as you want, I guess. Yeah, I'm not being a pithy podcast guest. I apologize. Yeah, no, this is, by the way, this is what we're here for. We do nuance here. Like, this is the selling point of the show compared to what we usually have in the news ecosystem. So, okay, number two is, let's talk about whether you think, after having watched this, does Facebook favor conservative or liberal voices in the U.S.? Does it censor conservative or liberal voices disproportionately? One of the things that maybe got CrowdTangle in trouble was this
Starting point is 01:04:27 Twitter account that you mentioned from Kevin Roose, where it kept saying that Ben Shapiro and Dan Bongino, who are conservative voices, you know, kept crushing it on Facebook. So having been inside the belly of the beast, so to speak, what's the answer on that? Yeah, I mean, this is a long topic as well, but I'll attempt to go a little faster. And maybe this also answers your previous question: there are a lot of questions about the role that Facebook plays in, like, civil society that Facebook just actually doesn't track that well. And so sometimes there's a question of, when these debates come up externally, you go, oh, well, what's the source of truth internally? And it was not uncommon for there to be, like, well, we don't know.
Starting point is 01:05:15 Right. And that's again why, having been inside for four years, I continue to just come around to: we need more of the outside world able to look at these questions. And by the way, when it came to U.S. political content, it was not being tracked every day by anybody internally who was constantly measuring, like, you know, the degree of partisanship in the news ecosystem and where it is. And by the way, that's to say nothing of doing that exact same thing in every country in the world. So I would say, for the most part, there's just a massive lack of a muscle around so many questions I think the outside world wants answered, and you need a much bigger infrastructure
Starting point is 01:05:55 to be able to help answer and monitor them on a regular basis. So I think that's what it is. That being said, yeah, what's your feel on it, though? Like, what's your gut feel? I know it's sort of, let's caveat it a little bit. But, like, you have to answer the question. So, like, what would you say if you had to, like, you know, make your best guess? I think in any algorithm where engagement is, like, a significant factor in the ranking of the content, there's a degree to which right-wing content
Starting point is 01:06:23 will do better in the feed, period, in the U.S. I think it's for a handful of reasons, so I'm going to go through a few. One is, I think right now, there's a lot of the right in the U.S. that is very anti-institutionalist, and there is a lot of engagement to be had in scapegoating existing power in a way that resonates with people, especially people going through pain. Secondly, I also think there's a degree to which part of this is a reflection of the demographics of each party and, like, where a lot of the, like, political first principles are for each of the parties. So, like, right now, the right tends to be mostly white and Christian. The left is a much more heterogeneous party.
Starting point is 01:07:09 And so one of the things you see is, I think it's much easier to find a scapegoat that the entire right will find, like, compelling and emotionally engaging, versus on the left, where you just have a much more broad, diverse coalition of interests and beliefs. And by the way, we might be going through a reshaping of that also, though, in the U.S. Yeah, yeah. And I think it's an important thing for all of this stuff: like, whatever moment you're looking at it, it's the moment you're looking at it. Five minutes later, yeah, it could be different. Yeah. Um, but I'd also say, by the way, I also think that what that list also reflects is the media ecosystems on each side. Um, and I think one of the under-discussed things about that list is that it represents much more of
Starting point is 01:07:49 like, a consolidation and a much, like, fatter head of the media ecosystem on the right versus on the left. And so I think there's much more of a fracturing of attention and engagement among, like, left-leaning news readers on the platform than there is on the right. And so that shows up. I think there's a degree to which Ben Shapiro and the Daily Wire spend a lot of money on Facebook. And I also think that, what was I going to say? Yeah, I mean, there's also a bunch of research that shows, right, there are a lot of right-leaning accounts that skirt the rules a lot more than ones on the left. And so, again, if you're thinking about how to bump up against what's allowed and what's not,
Starting point is 01:08:28 there tends to be more of that. Or we've seen research that there's more of that on the right than on the left. Does Facebook disproportionately take down right-leaning content? Maybe there's just more of it on the platform, so that's why it feels like it? Yeah. And by the way, yeah, that was the last thing I was going to say, is, like, there are demographic changes happening on the platform itself. So I think, like, there's not enough meta-analysis of that list, like, not meta in the Meta sense, which is Facebook. I think it's fairly public at this point that Facebook's been struggling with younger users. So you're having an increasingly aging demographic on there, and older news readers, you know, I think tend to rely more on right-wing content. So some of that just tells you more about the audience of Facebook these days. But I will say also, just one very last note about that list, is I was always a supporter of that list. I'm a fan of Kevin's because I think public...
Starting point is 01:09:19 Yeah, because I just, as a first principle, I think scrutiny of these large civic information ecosystems is healthy and a good thing. Are there parts of those lists that are misleading? Like, absolutely. Are there parts of them that should be interrogated more deeply? Absolutely. Like, you know, I also think top 10 lists are not a great way to look at anything. And I think, you know, especially, and I think Kevin would admit this,
Starting point is 01:09:47 Like, I think people were taking away an impression of Facebook writ large based on a top 10 list. And I think, like, yeah, that's not fair. But, like, in the end, I'm going to always go to bat for people who are, like, looking at these systems and trying to interrogate and understand them. But I also think that, like, is it a fair way? I mean, it's fair, but how useful is it as a way to understand the system writ large? And I would say not that useful. Okay, Brandon, I got two more questions for you. I'm just going to ask them both.
Starting point is 01:10:14 I know you want to go camping. So let me just throw them out there and you can see what you can do on both. First of all, we're talking a lot about news on Facebook. Obviously, a lot of news is going to move to TikTok, which is a company operated, you know, basically effectively with oversight in China. All the transparency, you know, efforts that we're talking about in terms of Facebook coming up short, you know, it's a whole different ballgame when you talk about a TikTok where, you know, ByteDance, which owns TikTok, does not have the U.S.'s political interests in mind, doesn't have U.S. society's interests in mind the way that a Facebook might, or at least a Facebook might attempt to. So I'm curious what you think about that. And then lastly, just give a quick shout-out about, like, you mentioned the legislation that you keep hearing about.
Starting point is 01:10:58 You've been on Capitol Hill. You're working on transparency legislation to make it not the platforms' decision whether or not to share. So why don't you just hit those two, and then we can wrap it up and get you out of here. Great. And again, this is my baby brain talking, but one very last point on that list stuff. One really important thing on all that is, if you build those exact same lists in any other country,
Starting point is 01:11:21 they're not all right-wing sources. So there is also a degree to which, like, there is something unique happening in America. Is Facebook making it worse? They could be. But, like, there's also something unique about the moment we're in as a country in the U.S. Okay, so we get to your questions. So I'll answer the second one, and then remind me the first one. So, oh, yeah,
Starting point is 01:11:40 TikTok. So, you know, yeah, I mean, I'll be honest. It is something I think warrants some degree of concern. And this is mostly, well, for a handful of reasons. But I think one is, you know, there have been a bunch of reporters, I think, um, that have been doing some reporting where they haven't found fire, but I think they've, like, found smoke. So Emily Baker-White, I think, you know, at BuzzFeed News. At BuzzFeed. Yeah, has been doing some great reporting. And, you know, there are examples where, you know, ByteDance used to have a news app in the U.S., and she's found, like, multiple, you know, moderators for that who said they were specifically told to tamp down anti-Chinese news and promote pro-China news.
Starting point is 01:12:18 So, you know, I think there are enough moments where concerning stuff has come to light that I think it does warrant more scrutiny, especially when you think about the weird asymmetry that, like, Facebook and American social media platforms can't be in China. Right. So, you know, I'm not trying to be conspiratorial about it, but, like, there's been enough quality reporting where I think scrutiny is deserved in this case. And I think, in the end, transparency is a big part of whatever solution they can come up with. Yeah. And then, you know, I think for me, your second question, you know, I've spent a lot of the last nine months kind of unexpectedly helping and talking to regulators and lawmakers around the world around how to regulate more transparency. And, you know, I was totally, you know, blown away by the amount of interest there is. I think, in the end, a lot of it came from the Facebook Papers, which I think kind of dramatically
Starting point is 01:13:13 raised the temperature on regulators trying to do something. But there have been, you know, I'm happy to go through all the different examples, but the most important one is there's an act in Europe called the Digital Services Act. And it has some, you know, it has some, like, generationally significant new requirements around mandating data sharing from platforms. And, you know, it has now been passed. I think the Council has to do, like, a final passage. But for the most part, it is officially in place now.
Starting point is 01:13:44 And now there are going to be several years of figuring out exactly what it means, how well platforms are going to comply with it, yada, yada. But there are really, really important and, like, industry-impacting new transparency requirements that are going to be required in the EU that are a huge deal. And it was, like, a fascinating process to be a part of. And there's a lot of hard work to go from here. And they're not going to backfire like the GDPR stuff, where you have to basically just become this annoying pop-up.
Starting point is 01:14:11 Like, this stuff will actually make a difference. I mean, there's going to be some backfire for sure. Like, I think you can't pass this sort of regulation without having some unintended consequences. You know, it remains to be seen what they're going to be. I think a lot of GDPR backfired, and I hope it is not that much. But, you know, there will absolutely be some.
Starting point is 01:14:31 there will be some unintended consequences, and the goal of the next few years, for the people in the weeds on it, is to figure out really quickly where they are, how to mitigate them, et cetera. But, you know, it's a risk. And listen, when you are requiring data from private companies, that's also a fraught thing to do. And so I think you have to approach it really humbly and cautiously. So I think there will be some. I think the question is hopefully not that many, and they can be addressed as quickly as possible. Great. And for people trying to follow this transparency push, what's the best way to do it? Oh, that's a great question. I mean, if you're trying to follow Europe, there's a reporter at Politico Europe, Mark Scott, who tracks this stuff really well. And then, you know, part of the problem around the transparency space entirely is, like, there's actually not a lot of infrastructure in civil society when it comes to social data. And I think one of the things you're seeing is the beginning of, like, building up
Starting point is 01:15:35 they're being more university labs to focus on this work, more nonprofits, NGOs, and think tanks. And they're starting to already be spun up and funded in Europe. And I think we're going to start to see the same thing over the next few years in the U.S. You're going to start to see new federal and publicly funded agencies. So I think part of it is like we're at the beginning of the birth of like a whole new kind of like industry and thing. but like unfortunately right now it's fairly like scattershot yeah you mentioned the facebook papers
Starting point is 01:16:05 Are you friends with Frances Haugen? Um, I'm not, I'm not friends with her, no. Okay, well, she'll be on the show next week, so there's my pitch. Brandon, if folks want to, um, follow you, where's the best way to do it? And that wasn't a derogatory thing, I just, you know. Right, it's an honest answer. Um, if people want to follow you, where's the best way? You have a Twitter account that you tweet from every now and again. Yeah, I have a Twitter account, Brandon Silver M, and I think that's probably the best place. Okay. Awesome. Well, Brandon, thank you so much. This has been a long time coming. And, you know, this conversation is one I've been waiting for for a while. And this has all, it's been awesome. I feel like we could go for, you know, three, four more hours. So I hope to have you back sometime in the future and want to say thanks again for joining. Cool. Yeah, thank you. I appreciate it. And, yeah, great talking to you.
Starting point is 01:16:56 You too. All right, everybody. Thanks again for listening. Please make sure to rate the podcast and subscribe if this is your first time here. Every five-star rating goes a long way, especially if you're here. You're here, we're at, I mean, one hour and 20 minutes. You know, five stars, that would be helpful. Okay.
Starting point is 01:17:11 Thanks again to Brandon for joining. Thank you, Nick Guatney, for doing the audio. Appreciate you handling these longer episodes the past couple weeks. Thanks to LinkedIn for having me as part of your podcast network. And thanks to all of you once again. Frances Haugen, back on the show next Wednesday, not one that you're going to want to miss. We're going to talk about the Facebook Papers, which are all those documents that she leaked from Facebook, and whether anything actually happened after those came out. So it'll be fun.
Starting point is 01:17:38 And I hope to see you back here on the feed next Wednesday. So until then, I want to say thank you. And we'll see you next time on Big Technology Podcast.
