Big Technology Podcast - Facebook Oversight Board Member Julie Owono Takes Us Inside The Trump Decision

Episode Date: May 5, 2021

Julie Owono is a member of the Facebook Oversight Board, which today announced it's leaving Donald Trump's indefinite suspension in place. Owono joins Big Technology Podcast to discuss the decision, the board's deliberations, and what comes next for Facebook and Trump.

Transcript
Starting point is 00:00:00 Hello and welcome to the Big Technology Podcast, a show for cool-headed, nuanced conversations of the tech world and beyond. Well, former president Donald Trump is staying off Facebook. The Facebook Oversight Board, a group of independent folks the company assembled to have final say on some content decisions, announced today they are upholding Facebook's indefinite suspension of Trump, with some caveats. Joining us today in an emergency podcast, the day of the decision, is Julie Owono, a member of the Oversight Board and the executive director of Internet Without Borders. Julie, welcome to the show. Thank you so much.
Starting point is 00:00:48 Thanks for having me. It is a pleasure to have you here. Monumental day, of course, for the organization that you're involved with. How has the feedback to the board members been today, now that you've decided to uphold the ban? Have you been concerned about getting hate mail? Is your inbox overflowing, or has it been fairly quiet? Well, thankfully, so far, at least in my case (I haven't spoken to my other colleagues yet), I haven't received hate mail yet, which I'm happy about, because it probably means that the decision that was reached was quite fair and quite clear to everyone, no matter what board, I mean,
Starting point is 00:01:27 what position you have on the matter. What I'm really happy is that the conversation, of course, is about the former president of the United States, but there's also great interest in how this can impact other world leaders, other influential voices on social media platforms, in particular on Facebook and Instagram. And yes, also what it means for regular users, basically. You know, What should they expect from a company like Facebook and Instagram? And what we think is that they should expect transparency. They should expect clarity in the rules. They should expect that the rules are applied the same way to everyone and not in an arbitrary way.
Starting point is 00:02:14 Yeah. And so just to start off with the decision, instead of just saying, okay, this is what the board rules and down to that, you sort of punted it back to Facebook. There was a pretty remarkable line in the initial summary where the board said, in applying a vague standardless penalty and then referring this case to the board, I guess the vague standardless penalty is the indefinite suspension of Trump, and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook's requests and insists Facebook apply and justify a defined penalty.
Starting point is 00:02:50 So essentially what that means is indefinite suspension. is upheld for the time being, but the board wants some more action from Facebook, whether it's able to compel Facebook to take some more action is, I think, still a question mark. We can get into that. But how did you arrive on this decision? Because the whole point was folks were uncomfortable with the power that Facebook held. So now the board is saying it's, anyway, I'm curious what your thought of is on this all this stuff. Yes. Well, our mandate is to you know, make binding decisions on cases as they are presented to us. What was presented to us was an indefinite suspension of the, those two pages. But we have to decide the two pages being
Starting point is 00:03:43 what we believe. It's right. The two pages of the former president, sorry, the two pages on Facebook and on Instagram. What we decide is based on Facebook community standards, which are basically Facebook's rules, those rules did not provide such penalty because such penalty, in our opinion, in our analysis, was open door to arbitrary, which we obviously don't want. And in addition to making rules on, I mean, making decisions on, based on Facebook's rule, we also make those decisions based on international human rights principles and standards. And one of the principles when an entity wants to censor, because that's what we're talking about,
Starting point is 00:04:29 you know, when you limit freedom of expression, you censor. Well, to do that, you need to do so based on clear and existing rules. This was not the case here. So there was no way we could endorse and say you make the right call to indefinitely suspend. Because this is not only about the President Trump. This could be someone else tomorrow, you know, who has, who sees themselves applied a rule out of nowhere. And we obviously don't want that. So that's why we told Facebook, you should make a reexamination, the right call, based on existing rules.
Starting point is 00:05:02 And we also provide guidance on what those existing rules could look like. So that's why we also gave six months of Facebook to do this reexamination in order for the company to have time to take into account. recommendations should the company see them, well, feed for the case and for its policies in general. Before we jump into these, I just want to ask, like, it's pretty remarkable, again, going back to the decision. Like, we'll get to the recommendations. But I want to talk particularly about the way that you came to this choice.
Starting point is 00:05:38 The fact that, like, because for a long time, Facebook was like, you know, Trump is held as the president and there's a newsworthy standard and he's held a. above, you know, what most other users are, would be judged on because he is the president. This is newsworthy and we don't want to remove that content based off of our, content from our platform. Then, of course, the January 6th riots happen. Facebook says, you know, Donald Trump encouraged these things. And then suspends him indefinitely, which I think you're pointing out here, isn't exactly
Starting point is 00:06:06 in the rulebook. Did it strike you that the decision to bent Trump indefinitely was essentially, you know, a Mark Zuckerberg decision was that what you guys were pointing at? It was part of the discussion. I mean, we all, you know, read that post on January the 7th, on the company CEO. And, yes, we felt that it was not, if it wasn't in the community standards, if it wasn't in the terms of service of the company, then it shouldn't be, I mean, it shouldn't even be a debate about that, in my opinion.
Starting point is 00:06:43 So, yes, what we're telling. Mark Zuckerberg and other, you know, leadership at the company is that, yes, you're asking us to help you, basically, restore. I think we could almost call it rule of law, basically, which is, yes, people have to be aware what they can and cannot do, and you have to be able to explain yourself and you have to be open to be held accountable when you're doing, you're not doing it right. Because it has implications beyond just, you know, you, Mark Zuckerberg, or whoever else at the company, it has implications for potentially 2.8 billion people, I think, everywhere in the world. So, yes, there's no way this can rest only on your shoulders.
Starting point is 00:07:31 And there are ways for it not to rest only on your shoulders. And these ways are clearly and deeply rooted in principles that the whole humanity has agreed to uphold. How did it feel when the board ended up getting this opportunity to decide whether the ban should be upheld or not? Because from my understanding, your purview was, you know, allowing people or allowing cases to be appealed to you when Facebook takes stuff down and should you put it back up like a post or a piece of content. But now it was an account and not just any account, but the account of the former president in the United States to be put on. So just emotionally first, like, you know, how did it feel to get handed this case that like, oh, wait a second, I just signed up to this board. You know, maybe we're going to be doing some content decisions here or there. And now essentially you're handed the biggest, maybe I'm exaggerating a little bit, but what seems like the biggest decision, content moderation decision in social media history.
Starting point is 00:08:36 So how did that feel to you? Wow. It felt like a huge responsibility. And that's why I think we need it this time to, yes, make sure we're not missing anything in this case. And also that we're reading all the 9,000 plus public comments that have been submitted. So it was, it's a huge responsibility. but I really hope that in this quite unknown territory of how should companies like Facebook and Instagram how should they handle speech by powerful figures in the world that includes elected public officials in countries, in powerful countries such as the United States, Well, how should they, or others, how should they treat their speech and the right of users, other users on the platform? And yes, I'm very personally very happy that what guided most of our discussion, all of them actually, was really, yes, how does this protect freedom of expression and the rights of others? and it wasn't it wasn't a very easy question but I personally have learned a lot in this in this cycle of deliberations and yes hopefully we've come up with something that's deeply rooted in in those principles yes yeah and so how did you and the fellow members did you do you have like a WhatsApp group or like when you first got word that it would be the Trump decision
Starting point is 00:10:32 that you're going to handle. I guess you get a decision whether to accept it or not. Yes. What was the chatter like among you in the beginning, like whether we should or we shouldn't take it, take us back there? So there is a, how do you call it, a case committee, case selection committee that's composed, that meets on a rolling basis, and that's composed of, I think, five board members.
Starting point is 00:10:58 And, well, I think that the committee just just met, and we received an email, an urgent email saying that, yes, we have a... Are you on that committee? I can't answer that for reasons related to impartiality and protecting members from potential lobbying. But I will at some point. I mean, every member sits on that committee at some point. since it's on a rolling basis. And, yeah, we received an email saying that we have an emergency meeting the whole board to discuss a case. And yes, that's how it happened.
Starting point is 00:11:43 And do you know why they decided to take it? Our rationale for taking cases is usually taking cases that have implications beyond the individual user that's concerned, who's concerned, sorry. And we also took the case because it had policy implications, including related to political leaders, how their speech should be treated on the platform. It had policy implications that were, yes, extremely important to deal with. So I think these are the two main reasons for us accepting this case. And then what do the deliberations look like once you decide that you want to take the case? Like, there aren't lawyers or, you know, a presentation of arguments. So is it just like everybody gets together on a Zoom call?
Starting point is 00:12:34 Like, what does it actually look like when you make a decision like this? Yes, given the current circumstances, everything is virtual. What happens is there are several deliberations meeting, depending on the complexity of the case. It could go from three to more if the case is complicated. it. And usually the first meeting is to familiarize ourselves with the case and the documents that we have forward. So in this case, it was the referral by Facebook, reading its rationale for referring the case. And after that meeting, as board members, we feel like we need additional information, which we obviously did in this case. We can request some political background
Starting point is 00:13:22 big briefing, policy background briefing, and also, yeah, we have a public comment period that allows people to tell us what we should do, what I think we should do on this case. Yes, and basically, and we also have the possibility to ask questions to Facebook, which we did, we asked 46 questions, and Facebook did not answer all of them, as you've seen. And yes, after that, we just deliberate based on all the elements that we have requested and that we have forward. And yes, we reach a decision based on that. And what did those deliberations look like? So I know you got a statement from Trump or from the Trump camp saying, hey, I should be let back on.
Starting point is 00:14:14 Now the deliberations are they like, you know, Zoom meeting. where, you know, people argue for different sides? What were the, and I'm curious, like, what were the options that you were considering before you landed on this one? So I can give, you know, very detailed of what was discussed, but what I can say... Sorry, but why not give some detailed decisions, detailed information about what was discussed? Because at the end of the day, like the whole point, I was going to say, you know, why not give some of you because at the end of the day like the whole i mean there's no real rule that's going to
Starting point is 00:14:52 limit the board being transparent about what happened in there and i think that like as the public decides tries to decide this you know i think that like it would be a good thing if you shared as much as you could about what actually happened inside these rooms and what options were considered of course of of course uh the board is is willing and has always been uh very transparent on how it operates, we will have, I believe, a report out on, you know, how this case was possible, what happened. And we also have, as you've seen in the decision, detailed information on the deliberations and particularly the minority opinions that were expressed by some board members. But I mean, what I can, what I can say is that, of course, the discussions were,
Starting point is 00:15:50 there are different opinions on the boards, right? There are people with different backgrounds. But what I, again, what I found extremely encouraging is that we are able, and we've shown that in the past, that we're able to set aside whatever we think on the, uh, the, the persons, the issues involved and look at all this, taking a bit of perspective and looking at all this with the lens of, yeah, is it grounded in principles? Is it principled? But yes, we did have strong arguments.
Starting point is 00:16:32 We did have discussions on whether or not before letting the former president back on, should the company decide to do so following its reassessment. Well, should the company ask the form president to refrain from claiming, yeah, again, that the election, the 2020 election was not fair. There were discussions on should we talk about the shooting and looting publication and how, and does that even plain to our decision? Yeah, should we look at the history? And what I find fascinating, what I found fascinating with this case is that I think for the first time we really had a very thorough history of an account, of what happened on an account. And yes, it was important. It's important because it gives context.
Starting point is 00:17:30 And that we always tell in our decisions is that these companies, and particularly Facebook and Instagram, cannot make those important decisions without taking a new. into account the context. Yeah. And now what were the arguments? You know, I know we're reticent to compare this board to the Supreme Court, but at least with the Supreme Court, we get to watch the arguments. That's what your deliberations weren't open to the public. So, Julie, like, you're our window into this whole situation.
Starting point is 00:18:00 So can you share a little bit more about what the arguments were back and forth? So in the statement from the user, so from the legal team that represented former president, Mr. Trump, they argued that the board should look at this case through the lens of U.S. law and U.S. constitutional law, particularly in the First Amendment, more specifically. So we did have internally a discussion not on applying the law itself because our mandate completely prevents us, I mean, does not allow us to do so. And I think that's great because not all laws around the fold are, you know, like I said, like in the U.S. So we did have discussions on to what extent we should reference the First Amendment in our decision. mainly the concern was to make sure that the American audience that would read it would, you know, be able to make parallels with the First Amendment and how it is applied in the U.S. We chose not to focus too much. I mean, we did mention at some point that they're resemblances.
Starting point is 00:19:15 And I think my colleague Michael McConnell has in interviews explained that they're consensus. that are quite similar between freedom of expression as it is understood in the international law and First Amendment. But we chose not to focus too much on that because, again, this decision is not only about President Trump, it's about other influential users, too, who could be concerned if they have a similar case. And if we focus too much on a specific law, then tomorrow, I don't know, I don't know, whoever else, whoever other world leaders could ask us, I want you to base your decision on and to reference my law into your decision, which would have made things very complicated.
Starting point is 00:20:05 What were the arguments like inside the board for people who were interested in restoring Trump's account? What did they say? Well, most of the arguments for those who thought a reinstatement, would be fair, given the fact that, again, indefinite is not a penalty that's provided by the community standards. Well, for them, mostly it was about protecting political speech and particularly protecting the ability of citizens to access speech by their politicians. Yeah. But I would say most of the conversation was not focused per se on.
Starting point is 00:20:51 should he be back or not? I think it was mostly focused on was the penalty applied here provided by the community standards? And once we quickly saw that no, well, we had to discuss then what should be done. And that's why we came up with, you know, the current solution that we offered,
Starting point is 00:21:18 which is, yeah, reassessing, but reassessing it with in mind the severity of the violation that happened and also with in mind the potential for future harms. This should really guide this reassessment by Facebook. And I understand that there was an agreement that it wasn't as simple as saying should we uphold or turn away this ban. but like there must have been a decision you know we're going to kick it back to Facebook in the meantime should we uphold or should we remove this ban so why uphold versus
Starting point is 00:22:00 remove that that's a very a very great and complicated tough question I think it's before making a decision on whether the account should be put back up I mean the page should be sorry, the access to the page should be restored because that's what it's about. Like we say, a lot of things rely on context, and there were a lot of elements here that we feel we're lacking to allow a fair decision and clear decision to be made. What is this context? Well, for instance, the potential for future harms, which we referenced in our, you know, in our decision. But also beyond that, we've made a recommendation asking the company to assess whether or not its platform had been used properly prior to January 6 and, you know, had contributed to what happened to January 6th. And also, of course, there's the history of the page, which we also asked the company to take into account when it makes this reassessment and particularly focusing on the severity of the violations.
Starting point is 00:23:24 So, yes, I would say there was a lot of context and there is still a lot of context that's missing. And that's what we're telling the company. You cannot make such an important decision without having the necessary and, you know, accurate context available to decide. And so were there people in the oversight board who were like, let's put Trump back on until we can figure this stuff out? Or was it somewhat unanimous that he should be kept off until Facebook makes this decision? I mean, we were unanimous that if a law, a regulation, a rule, a standard, no matter how you call it, is not clear, then it, it should, it shouldn't, you know, it shouldn't be used to justify a decision. that's, I guess, what we all agreed on. And that's why we're making recommendations on how to make those rules more clear to the user.
Starting point is 00:24:41 And I really insist on this because the issue, yes, of course, reinstating, not restating is, of course, of prime importance. But I think we should not lose sight of the fairness of the decision we come up with, you know. Some people in France attribute a quote to the writer Voltaire and said, I don't agree with you, but I will defend your freedom of expression. I think we should also apply this to ourselves here. We might not agree with everything that this page has been writing and publishing, but there are principles that go beyond our personal beliefs, and we should uphold this principle.
Starting point is 00:25:28 no matter what. Yeah. And now I'm, you know, in asking these questions, I'm not saying, I'm not being prescriptive here. I'm not saying the board should have put Trump back on. I'm just kind of curious, you know, again, if the rule, I'm just kind of, I want to get into your thinking a little bit more because if the rule was, you know, unclear and he was banned and the ban was upheld. And it seems like a lot of people on the board are interested in.
Starting point is 00:25:58 free expression then why still and it did seem like at least in your language there was some agreement that trump should have been banned then why still keep that ban in place uh was there a feeling that it you know that it was fair and also you you mentioned that everybody was unanimous and were on your outcome there was where well anyway let's get into that too like was that were there any like you guys voted on it and were there any no votes to the resolution or anyway, tackle those in whatever order you like. I'm curious. So the decision is adopted as soon as there is a majority. So there was a majority in favor of this decision. I personally do not have, as we're speaking, the detail of the vote. I suspect this should be available to us very soon. And I also hope it will also be explained later on to the public. but what what what what's what I find very interesting here is that they were harsh this
Starting point is 00:27:09 honestly there were discussions on the the yes on what it made whether or not we should actually substitute ourselves that was talked about I mean substitute ourselves to Facebook I mean, to tell what should have been the right call in this case. I mean, put ourselves in the shoes of the company, basically, and say, okay, we're going to place ourselves back to January the 7th and say what should have been done. There were discussions on that. There were also. But ultimately, what I think is interesting is that we're building a process, really.
Starting point is 00:27:49 And to build that process, it's important to have. these moments where, yes, you set the principles and, yes, you apply them. And then, yeah, that's how the, that's how clear, transparent, consistent rules are, are created, I think, in this dialogue, I hope, really. Yeah. And now, were there, was there any discussion among board members about, like, what the consequences would be, like, personally, if you reinstated Trump, like, or sort of, I'm curious, like, you know, if people were concerned about, like, Blow Black in their communities,
Starting point is 00:28:31 if they voted to reinstate him on the service. There were concerns for personal safety. Yes, of course. Actually, no matter the outcome, because, you know, not everybody... I mean, there will always be something to say about this decision, right, no matter where you stand. So, yes, they have been concerns about the safety of board members. And I really would like to take this opportunity to really thank our administration who has been relentlessly working to make sure that we're, you know, in a position to make this decision without fearing for our safety. So, but I wouldn't say this has prevented us. from having the hard discussions and, you know, making the decisions.
Starting point is 00:29:27 Yes, I don't think it has really prevented us from doing our work. I mean, it hasn't prevented me. Because, you know, Zucker. No, it's brave of you to, I mean, look, there's a lot of, we've talked about it on this show in the past. Like, there's a lot of, you know, elements that goes into joining a board like this. It's not a straightforward thing. But it's also somewhat brave. I mean, Mark Zuckerberg has lots of personal security.
Starting point is 00:29:53 I don't know if Facebook is paying for that for you and the board. So it is a brave position to be in to say, okay, let's go ahead and be part of this. Yes, no, there is a budget for our security that's provided in the seat funding, can we call it that, for the operations of the board that was given through it. a trust by Facebook. So concern over safety are, you know, a priority. But it's not like you have like a secret service set of body guards that are standing outside your door after a day like today.
Starting point is 00:30:36 No, we have a, we have a very responsive security team. And yes, the threats are being taken care of as soon as they arise. Okay. Yeah. I mean, like the reason, and I know I've asked you about this a couple of times in our discussion, but it is sort of, it gets to the heart of this, which is, you know, which the board's explicitly told Facebook. Like, you can't, you know, make a decision outside the rules and then use the board to shield yourself from accountability. And, you know, that acute, that goes to the fact that, you know, you are, again, putting your safety on the line in some way in order to be. I mean, Zuckerberg, so he spends millions of dollars a year on secure. I believe. He does. Anyway, the last thing we should talk about is the fact that you've asked, the board has asked Facebook to come up with a better decision and a better policy and get back to the board.
Starting point is 00:31:36 I think within six weeks. Is that right? Six months. Oh, sorry. Six months. Okay. Three things, actually. So they have seven days to implement the decision.
Starting point is 00:31:47 So in this case, they have seven days to say when they're going to. re-examine the case of a meeting's decision. The second thing is they have 30 days to respond to the policy recommendations which are separate from the binding decision. So we have made policy recommendations and Facebook has to respond within 30 days. And within six months, Facebook must come up with its new decision on this case based on the decision that we've shared with it. Yeah. And so from my understanding, the board can make its decision on content choices and then policy recommendations and everything else after that is up to Facebook. So can you compel Facebook to make a new decision within six months? Or can Facebook just say, well, that's nice. You know, the board made
Starting point is 00:32:35 its decision. You know, we're not compelled by the charter that we set up to actually go ahead and do anything else. And that's the end of the discussion. No, Facebook is bound by the decision on the case. So if we say you have to reassess within six months, they have to reassess within six months because that's part of the binding decision. That's a binding part of the decision. Now, of course, like you rightly said,
Starting point is 00:33:06 we don't have a force to have Facebook, to make sure that Facebook enforces our decision. That is true. But I think from a moral perspective and also to show its commitment to human rights like the company has, you know, announced very recently and has repeated in the past years, well, to show that it has really committed, it is really committed to protecting freedom of expression, I think, yes, that the ball is on their side now and to show that, yes, this matters to them. I do think it does. Otherwise, they could have not sent this case in the first place because, like you rightly said, at the beginning, this is not the type of cases we were initially created to look at. But I'm really confident that the decision that we came up with offers enough guidance, sorry, to the company and to help it make the most principal decision in this very complex. case. I didn't realize that it could be a decision could be binding where you can punt the
Starting point is 00:34:20 decision back to Facebook, but I guess that's part of the board's remit. No, it's part of our remit. We were asked about the indefinite suspension and we said it's not, I mean, the suspension was right, but the indefinite is not right. You have to reassess this. So this is absolutely binding. What I can say also is what we see with the board is also it's agility. It's agility. Like I said, we were not supposed to look at accounts as pensions. Now we're able to do so. And it really speaks to how much I think the company does need this type of guidance. And that's why they were so eager to send this case, for instance, because they need the guidance and we're here to provide it.
Starting point is 00:35:07 Right. And I think I'm generally in favor of this idea of the board. But yeah, it's going to take a long time. I think, to get it completely right. There has been discussion that the board can't really get to, like, the core issues of what's wrong with Facebook, and it's mostly just, like, making content decisions. And that's why I think the questions that you asked Facebook that Facebook didn't answer are so interesting.
Starting point is 00:35:32 And, okay, so I'm just going to read. Okay, in this case, the board asked Facebook 46 questions, and Facebook declined to answer seven entirely and two partially. The questions that Facebook did not answer included questions. about how Facebook's newsfeed and other features impacted the visibility of Mr. Trump's content. And whether Facebook had researched or plans to research those designs, those design decisions in relations to the event of January 6, 2021, and information about violating content from followers of Mr. Trump's account. The board also asked questions related to the suspension
Starting point is 00:36:06 of other political figures and removal of other content, whether Facebook had been contacted by political office holders or their staff about the suspension of Mr. Trump's account. and whether account suspension or deletion impacts the ability of advertisers to target the accounts of followers. Facebook stated that this information was not reasonably required for decision making in accordance with the intent of the charter was not technically feasible to provide, was covered by attorney, client privilege, and or could not be shared or provided because of legal privacy, safety, or data protection concerns. That's fascinating because it shows essentially that, you know, if I'm Facebook and I'm seeing, these questions, I'm starting to feel, oh, God, we created a monster because now the board actually is going beyond its, you know, actual remit to assess the outputs and actually go to look at the machine itself and asking about the way that our algorithms work and our newsfeed works. So I'm just kind of curious, like, where your perspective is there, because it seems like the board is now trying to get into the actual mechanics of the way that Facebook, the platform works. And is this something we can expect more of? what I can say on this is it's it's just another proof for us that in order to make the best decision possible we need all the context and if that context includes looking at what happened
Starting point is 00:37:36 before January the 6th and how it happened um it yes I think uh it should be it should be discussed. And that's what we are trying to do. And that's why we also did not shy away from recommending the company, despite its unwillingness to respond to us. We recommended to the company to conduct the assessment on what happened before January the 6th from a technical perspective, but also from a policy perspective. And it's really this dialogue that I find absolutely fascinating in helping the company evolve
Starting point is 00:38:14 with its policies and how it deals with content and while protecting our freedom of expression. This open dialogue between the board and the company will, yes, bring, I hope, some answers and a way forward. I'm confident on that. Great. All right, Julie, I know we're over time. It's been a pleasure having you on the show. How can people get in touch with you or the oversight board? whatever you prefer? Well, I'm on Twitter, Julia Wendow, and our organization is as well, Oversight Board.
Starting point is 00:38:54 If you're at Oversight Board. We have a website, oversightboard.com, if I'm not mistaken. And yes, we also have a lot of cases coming up and you are able to provide public comments we have. I'm not sure we've announced new cases yet, but we do have some very interesting cases. So please do check those.
Starting point is 00:39:14 them out. Come on, break some news. No, I'm kidding. I pressed you hard enough over this conversation. So I really want to say thank you, Julie, for the really great answers and giving us some insight into this new process that I think is going to become more and more important as social media evolves. So I do appreciate that. Thank you to everybody for listening. We will actually be back tomorrow with our regularly scheduled episode. That's with Sridar Ramoswamy, who is the CEO of Neva, a subscription search company. He used to run ads for Google, and now he is creating a new search engine. No ads, but you pay to use it.
Starting point is 00:39:52 It's a really interesting concept. So stand by for that tomorrow. Thank you again, as always, for listening. If you're at this point and want to give us a review on Apple's podcast or the podcast app that you use, that would be much appreciated. It's your first time here and you want to subscribe. Hit the subscribe button. We come out with these conversations every Wednesday.
Starting point is 00:40:11 and sometimes we do a bonus episode like today's and we'll just push the normally scheduled episode until tomorrow. Thanks again. We'll see you then. And once again, it's been great having Julie on the show. We look forward to more of these type of discussions.
