Advisory Opinions - What's Next for TikTok?

Episode Date: January 10, 2025

In a special emergency recording, Sarah Isgur and David French react to today’s oral argument in TikTok v. Garland. The Agenda: —Tiers of scrutiny —What is TikTok? —Content creators’ First Amendment rights —The data sharing argument —Can SCOTUS punt to Trump? —Cases of the new Cold War Show Notes: —How TikTok Reads Your Mind Advisory Opinions is a production of The Dispatch, a digital media company covering politics, policy, and culture from a non-partisan, conservative perspective. To access all of The Dispatch’s offerings—including Sarah’s Collision newsletter, weekly livestreams, and other members-only content—click here. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 You ready? I was born ready. Welcome to a special emergency episode of Advisory Opinions. I'm Sarah Isgur, that's David French. This is the TikTok oral argument case. So David, right off the bat, how are you feeling after listening to that? A lot less confident that the Supreme Court is gonna uphold the D.C. Circuit than I thought going in.
Starting point is 00:00:39 No question about that. The questioning of Solicitor General Prelogar was pretty intense. I thought the advocates for TikTok and the creators did a really, really good job because what they did not do was really contest China's control or China's malignancy, shall we say. They contested the means, the way in which Congress did this. So I'm much less confident this is coming out against TikTok than I was before. What about you, Sarah? Way less confident, but I want to break the potential decision tree into four different areas because the problem with any oral argument is you're trying to count to five. And oftentimes,
by the time a case gets to the Supreme Court, you know, maybe we're like, well, standing, but if they find standing, then we're moving on to the merits. You know, there are sometimes these decision trees, but oftentimes it's like a yes or no, and we're just trying to count to five on a single question, and that's usually not that hard coming out of an oral argument. The problem with this case is, again, I count four different decision points.
I don't mean four different potential outcomes. I just mean the decisions themselves. So first, is the First Amendment even implicated? Because remember, ByteDance, which is the one that we think the PRC, China, controls, ByteDance owns TikTok. TikTok is a subsidiary of ByteDance. ByteDance has a lot of subsidiaries that run the TikTok equivalent in a bunch of different countries. This law only touches ByteDance. So does this even implicate the First Amendment?
Because ByteDance doesn't have First Amendment rights and the law doesn't actually say anything about TikTok. It's all about ByteDance divesting. Okay, so that's number one. Are we deciding this on just whether it's First Amendment grounds? Number two, what level of scrutiny applies if there is a First Amendment problem for TikTok?
Starting point is 00:02:44 Is it strict scrutiny or is it intermediate scrutiny? If you remember in the past, I talked about that Austin case about the billboards, right? But about the digital billboards and whether that's actually content or it's content neutral. And it's going to matter a lot for how narrowly tailored, how compelling the interest has to be if we're in number two. Number three, is it narrowly tailored enough? Was it a compelling interest enough? A lot of the argument turned on that, David, a lot of the questions. Or at least that's where Noel Francisco,
Starting point is 00:03:12 the guy arguing for TikTok, wanted to spend a lot of his time giving examples of the areas that Congress didn't consider and how not narrowly tailored he felt that it was. But David, there was a number four in this whole thing that I guess you and I dismissed too quickly. And that was the injunction or the administrative stay. And this is the idea for the injunction
just to enjoin the law from taking effect till some time. I'm not sure how the injunction would actually work. But the administrative stay is the interesting one. It's the idea that the court itself needs some more time to decide this case and to write the opinion. So an administrative stay is, we take no position on anything dealing with this case. It has nothing to do with the case. We just need more time. So we're not going to allow the law to take effect. The injunction would be something more like what Trump was asking for. Hey, enjoin this until I get into office, because I think I can negotiate a resolution to this somehow legally.
Starting point is 00:04:12 But that's this number four bucket that was sort of picking up steam there at the very end of the argument. So. Let's do it. Where do you want to start on one, two and three, four buckets? That's a great question. I want to start with, why don't we just take them in order? I mean, isn't that the easiest thing? I suppose it is. So for number one, is the First Amendment even implicated? I could count to like two and a half. Maybe three and a half, maybe three and a half actually. Okay. So I thought that the chief
and Barrett were a hard like, I don't even know why we're here. This is about ByteDance. The fact that TikTok is affected by a law that affects ByteDance is interesting, but is not a First Amendment problem. And at one point the chief asked, can you name another case where a corporate structure law that has incidental effect on speech has ever been said to implicate the First Amendment? And Noel Francisco, the advocate for TikTok, was like, well, I don't have one off the top of my head. And the chief responds,
Starting point is 00:05:20 I don't have one on any part of my body. The chief was feeling feisty today. Lots of chiefie jokes. I don't know, I was feeling the chief and he was feeling his oats for sure. Barrett was asking similar questions. I'll tell you, CT didn't talk a lot in this argument. But the two times that he made pretty pointed points,
one, for the advocate on TikTok and two, for the advocate for the content creators on TikTok, both times his questions were, why does this implicate you at all? How does this implicate TikTok's First Amendment rights? And then second, how does this have anything to do with the creators' First Amendment rights? This is a law that affects ByteDance.
Starting point is 00:06:00 And at any point, TikTok can continue having the same content, the same users, the same videos, everything. It just can't be owned by ByteDance and giving information back to the PRC to do it. And this makes sense to me from a CT, Justice Thomas perspective, if you go back to the angry cheerleader case, David, because he wasn't feeling it for the angry cheerleader. And that's me. I was on the other side of him on that one. So, you know, CTs, I think, I'm going to count him as number three on that one. Kagan, I'm counting as like a quarter or
a half vote. And then Alito was all over the place. He may be up in the number one category. He definitely seemed number-four curious. He was very active in this argument and I don't know where to pin him down. You know, bucket number one really does raise the question, in a sense, of what is TikTok? Because Justice Kagan had this exchange where she was like, wait a minute, can TikTok just offer a product without that specific algorithm? Which does raise the question, what is this? And TikTok pushes back on that and says, no, no, no, no, no, that's not possible. What is TikTok, right? And so right before the oral argument, TikTok,
Starting point is 00:07:36 basically ByteDance said, they're not selling this algorithm, they're gonna shut down. They're gonna shut down on the 19th. And I do wonder if that's ultimately going to hurt them because what is it that they're protecting here? And if it's a normal business, if it's a normal company engaged in business for money,
then you have 170 million domestic users that, even if you don't have the exact same algorithm, are worth an immense amount of money with a different algorithm. And it's not like American companies don't know how to create algorithms, right? But ByteDance is saying- But David, can I just take a moment to say, as a former TikTok user, Instagram isn't even close. It's like the difference between dark chocolate
Starting point is 00:08:26 and milk chocolate. I'm sure if I did drugs, I could give you a really good drug analogy. But like the Instagram algorithm is nothing compared to the potency and addictiveness of the TikTok algorithm. I was convinced to delete it off my phone because my husband thought I was fully addicted
and also because of the national security concerns. I don't even like bother with Instagram. Like, what's the point? Well, and that is in many ways the key to TikTok's success. I mean, the algorithm, it has been said, there's this great Times analysis a couple of years ago where they said, well, how does TikTok read your mind?
Starting point is 00:09:05 And there are interesting TikToks that my kids have talked about where once the algorithm is trained up on you, it gets just downright spooky. So interestingly enough, I mean, so the question then is, is TikTok TikTok without that specific algorithm? And that specific algorithm is what Chinese law bars the export of.
Starting point is 00:09:27 And so... Can I just share, by the way, a line that Justice Kagan said at one point, which makes me... So none of the justices actually ever disclosed whether they have or use TikTok now. I'm gonna guess no, just because of the national security concerns. But at one point, this sounds more like an Instagram user
Starting point is 00:09:47 than a TikTok user. Justice Kagan said, you get what you get and think that's puzzling. That's funny. That is funny. Justice Kagan always has great lines when it comes to the internet cases, I swear. But here's the interesting thing though.
Just as a conceptual matter, if the sticking point is the algorithm, and the algorithm is ByteDance's, then how is that implicating the First Amendment? Because that puts us back into bucket one. If this is ByteDance's speech, and you know, from the NetChoice case, we know that algorithms are speech, and this is ByteDance's speech, not TikTok America's speech, this is ByteDance's algorithm.
ByteDance's algorithm doesn't implicate the First Amendment. And so you could still take TikTok USA and have TikTok USA exist with a non-Chinese algorithm. And that would comply with the law. So that bucket one, that question about the specific ByteDance algorithm, to me really exposed some of the vulnerabilities in the TikTok position. I thought it was the worst bucket for TikTok. Because remember, we actually had three different people arguing today.
You had Noel Francisco, Trump's former solicitor general, arguing on behalf of TikTok. You also had Jeffrey Fisher from Stanford's Supreme Court Clinic arguing on behalf of the content creators. And by the way, Jeffrey Fisher, just being at Stanford's Supreme Court Clinic, has gotten some really cool arguments. He argued in the Fulton case, David. He argued in Our Lady of Guadalupe.
He argued in Ramos v. Louisiana. Lange. I mean, some of the coolest cases that we're into are Jeffrey Fisher arguments. And then you had Elizabeth Prelogar arguing on behalf of the law itself. So on the content creator side, how does this implicate their First Amendment rights? I mean, this was Justice Thomas's question, or as someone else summarized from the angry cheerleader case, hit the children with sticks. Like, they can still create this content. They can still put this content on Instagram. Nothing about this law comes even close to touching them.
Starting point is 00:12:12 So what's the First Amendment interest? And the answer is, well, in the same way that if the government shut down the New York Times, it would not be enough to say, Sarah, you were publishing your op-eds with the New York Times. Now you can publish them with the Wall Street Journal that I would have some First Amendment right to associate, if you will, with the New York Times. Justice Jackson tried to hand them this associational argument, and for a lot of reasons they rejected it because it was a trap that Justice Jackson was setting and she was like, yeah, but let's just say it were associational, don't all these cases foreclose it.
Starting point is 00:12:50 And they're like, yeah, that's why we said it wasn't associational. But number one, I think is a huge problem for both sides. More the content creators, even than TikTok. As I said, I can't quite count to five, but there's a chance I can count to eight, right? Like basically I'm somewhere between three and eight on just bucket number one.
Starting point is 00:13:11 Justice Gorsuch, by the way, if you're wondering who the ninth justice is, Justice Gorsuch, definitely not going along with any of this. No, he was, the tone of his voice, Sarah, it's interesting, the more you listen to these oral arguments, the more you recognize some things. And Justice Gorsuch has this very polite way of speaking
Starting point is 00:13:34 that in an interesting way gets more polite, the more emphatic he is. Yes, it's professorial and conversational, and then it starts getting tensely professorial. But in a weird way, it's almost like, do you know how a parent, sometimes when they're really upset... I do this. This is me. You're describing me.
Starting point is 00:13:58 ...might actually slow down and moderate their voice tone a bit more? That's how I feel with Justice Gorsuch. I want you to think about what you're doing right now and recognize the consequences. Yeah, this is me. It's, by the way, a total pendulum swing just because, you know, my mom raised her voice at me a lot, so therefore, like,
Starting point is 00:14:19 I try never to do that. But the alternative, my four-year-old will say, Mommy, why are you so angry? Because I'm talking really calmly. Yeah. So it doesn't matter is the point. OK, so bucket number one, very interesting. Does the First Amendment even apply to this? Bucket number two.
OK, the First Amendment applies. But is it strict scrutiny? Is this a content-based law or not? If not, it might just be intermediate scrutiny. Sotomayor seemed really interested in this piece of it. She seemed the most interested in making maybe this finer distinction between intermediate and strict scrutiny. Noel Francisco was basically saying, we win under either one, because Congress so thoroughly failed to consider any of the alternatives that Noel wanted them to consider, even though I'm not sure that they aren't worse in some ways. So for instance, Noel argued that they would win under intermediate scrutiny because they
didn't try barring ByteDance from sharing, sorry, barring TikTok from sharing information with ByteDance. But I mean, from what I can read in the record, they were like, yeah, we don't trust that they won't do it. And Noel was like, you can make huge criminal penalties so that the downside of doing it would be so huge that, yeah, there's a low likelihood you get caught. But, you know, these servers are in Virginia, they're run by Oracle. If you make the punishment on Oracle, they'll make sure nothing gets shared. So that would be like an example of that. But I felt like eight to one the other way
with maybe Sotomayor hanging out over there in intermediate scrutiny land. Pretty much everyone else was like, fine, if it's content-based, it's strict scrutiny. But the issue between strict scrutiny and intermediate scrutiny, I felt like, was this law may implicate content, but it is not directed at content. And this was that Austin billboard case, right? Like, yeah, you have to look at the content to determine what billboards are about. But the law wasn't about content. It wasn't trying to reach content. Now, Noel and the content creators' advocate, Jeffrey Fisher, certainly tried to say,
Starting point is 00:16:31 they don't want critical stuff about American officials. They are very interested in the content. But the pushback to that, that Justice Barrett kept offering was, but they can keep up all of the same content. So clearly it's not about content. It's about who has control of that content. And of course, the data sharing part, which was, I mean,
nobody was disputing this data sharing part, which we can get to in a little bit. Well, you know, I think there was a bit of a weak point in the argument when they were saying, well, Congress should have considered X or Y. Because the answer to that was, wait, wait, wait. Congress is saying that,
and this is a point the government made in the briefs, you can't think of Chinese corporations the way you think about American corporations. You just can't do that. It's a category error. And whereas certain kinds of corporate forms really matter in the United States and the separation between corporations and government really matters in the United States, it's
Starting point is 00:17:27 a matter of constitutional law, for example, just not the case with China. And in fact, you know, they talked about in the government talked about in its brief that there's a communist Chinese Communist Party committee at bite dance. So this sort of idea that, you know, the Chinese government is going to say, hey, ByteDance, we need all the secret data of these Americans. And ByteDance is going to say, well, my communist overlords, American corporate law is prohibiting us from sharing that with you. At some point, judges don't have to be gullible.
I thought that was a weaker argument. And the data argument, by the way, was the one where I felt like TikTok was really flailing. I thought Justice Kavanaugh raised a really good point. He was like, wait a minute, a lot of these young users
are future soldiers, future judges, future legislators, future FBI agents. And aren't we just handing the PRC giant amounts of blackmail material on future government officials? That they won't even know exists, that won't be used for decades. Right. And doesn't Congress have an interest
Starting point is 00:18:44 in preventing that. This gets by the way to an evidentiary question. So let's just take a little tangent here because what evidence was available to everyone in this case? For instance, on the data question and the data sharing, there was evidence in the record that's public that we know that the Chinese government told ByteDance to share information so that they could track the location data of certain journalists because they wanted to find leaks of ByteDance information.
And so they used TikTok data to do that. That would be just one very, very small example of the way that they could use TikTok data. But there were other pieces of the record, David, that were sealed. And as Solicitor General Prelogar pointed out, everyone in the case had access to the sealed data. It was sealed to protect the intellectual property of TikTok and ByteDance. So the justices and all of the sides have access to the sealed data. We, David and Sarah, do not. But there was also classified data. And Solicitor General Prelogar had access
Starting point is 00:19:54 to the classified data. The justices, in theory, have access to the classified data. But Noel Francisco and Jeffrey Fisher do not have access to the classified information in this case, which you saw Justice Gorsuch get pretty upset about, but you even saw the chief, the chief, my heart throb for this argument, was like, doesn't that seem ridiculous? Yeah. Yeah.
Starting point is 00:20:20 And he seemed to be implying that the justices were not going to look at classified data that wasn't available to all parties. Yeah. Well, you know, speaking of classified data, there were classified briefings for Congress before this law was passed. And so, you know, one of the questions that I had was how much is that classified data going to bear on the outcome of this case? Yeah, I don't think the justices were too pleased with that being like a trust us argument.
You also heard at one point, and this falls more into bucket number four, about what President-elect Trump may do. And Solicitor General Prelogar goes, well, he may want to get an updated national security briefing, a classified briefing about this, because some things have changed in the last four years. Yeah. I was like, oh, sick burn. Yeah. Okay. So that was bucket number two, is it strict scrutiny or intermediate scrutiny? Bucket number three again is now, okay, we're in strict scrutiny. Noel did very little to push
Starting point is 00:21:20 back against the compelling interest. To your point about the data sharing, David, and what Justice Kavanaugh raised about future blackmail, even to the point about the Chinese government wanting to sow chaos and discord. Although the chief at one point then interjected to say, if that was their goal, they're doing a great job. Yeah. But his point was on the tailoring aspect.
It was like, they should have considered the data sharing, making sharing illegal, criminal penalties. But what he really pushed was the idea that a disclosure would have done it. And he pointed to the Foreign Agents Registration Act and said, see, look, when you were concerned about foreign influence from these foreign governments back, I believe it was in the run-up to World War II when FARA was passed, and that there'd be all these Americans being paid and that no one
would know that they were actually mouthpieces for these foreign governments. What we said was, you have to disclose that. So why wouldn't that work here? If you really wanted a narrowly tailored law that implicated the First Amendment, you could just have something when you open TikTok that says, the Chinese government has access to this algorithm, or whatever else. And I mean, David, I'm curious what you would think about that argument with FARA. But for me, I thought Solicitor General Prelogar made a pretty good distinction here, which is, one, yes, FARA disclosure is done by the person, as in, you know, the piece of information that they're saying was paid for by that foreign government. Here, she compared it to walking into a store
Starting point is 00:22:53 and someone saying, there's a million products in the store. One of them will kill you. Yeah. Disclosure. Check. Yeah. And Justice Alito at one point said, you know, I never heard Noel Francisco, the advocate for TikTok, say that he would have been okay with every video
Starting point is 00:23:09 having a disclaimer before the video plays that says, this video may have been manipulated by the Chinese government for the purpose of tricking you and manipulating you into, you know, X, Y, or Z. Do you still want to watch it? So here, this gets to a point that I think is very interesting. How much is TikTok legitimately,
how much is this a real business and how much of this is a Chinese op? And because- Well, they're not letting them sell it. A real business would definitely wanna keep the money flowing. A real business would pocket billions of dollars. Another one is, how much does a real business say, I'm okay with mandatory disclosure?
Starting point is 00:23:53 How is that less restrictive? How is it mandated speech? Right, that's what I think is crazy about that argument. Like that every video would have that and you think that's less implicating of the first amendment? Yeah. They're just banking on the fact that TikTok users
Starting point is 00:24:07 are addicted enough to sit there and look at whatever while it says on the bottom on a running scroll, potentially manipulated by the Chinese government. I mean, what are we- What percentage of TikTok users know that it is owned by a Chinese company that is, you know, as you say has PRC members in it. Well, that was one of the weakest points, I think, from when the justices were like,
who doesn't know that China controls... They all know now, said, was it Justice Kagan, maybe? I think it was Justice Kagan. Yeah. Like, they all know now. What? No, they don't. You think? We're the only people, the people listening to this. Yeah. I mean, the idea that your median 17-year-old knows that the People's Republic of China has a proprietary interest in this algorithm, it's maybe what, 1% of them? I mean, I-
Starting point is 00:24:59 I would love a poll. I know it's under 20%. Oh my gosh. I think it could be single digits. And I think if you ask more sophisticated questions, the number drops maybe even below 1%. One of my pet peeves with people who do what we do, whether you're on the side of reporting about it like we are, or you're in the middle of deciding it like the judges,
or you're a pundit, you're a legislator, you're a judge, you're a public official. So many people sort of in our world who are in, you know, the game, so to speak, sort of in the industry, they, whatever, presume that everyone else is operating with their information set. And they are not.
This is one of the things, just going back to the election, that was a constant pet peeve of mine: seeing on social media all the time, how are people voting for Trump in spite of the fact that he did A, B, C, D, E, F, and G? They only know about A and B, maybe. They don't know C through G. Stop presuming that people have civic knowledge here. They do not. They do not. So, you know, the other issue that I thought really hindered that bucket three narrow tailoring conversation is, of course, Congress doesn't have to consider every alternative.
No, this isn't the Administrative Procedure Act, where you kind of do have to consider everything, and it just leaves the courts with this huge out under the Administrative Procedure Act. They're like, well, you didn't consider this random thing I just thought of, you didn't read my mind, so I'm sending it back to the agency. That's not true for Congress. It's not true under strict scrutiny. It needs to be narrowly tailored, the least restrictive means. But it doesn't mean you had to consider everything else and explain why it didn't work. If it just doesn't work, then it's still the least restrictive means.
Starting point is 00:26:54 You don't have to announce it to everyone or explain it to everyone because, again, this isn't an agency action. It's Congress, y'all. And the concept of least restrictive means does not mean, as you were saying, that Congress in its act says, we considered this and rejected it for this reason and this and rejected it for this reason. No, no, no, no, that is not the way that works. All right. Last up is this idea of an injunction or an administrative stay, punting this until Trump gets into office,
Starting point is 00:27:25 what would happen if Trump either used his power under the law to say that a sale was underway so they get this extra time? That to me is a non-starter that's just not contemplated under the law. But what Trump could do is just say, we're not gonna enforce it. So what the law actually does,
Starting point is 00:27:44 it prevents TikTok from being available in the app store. So no new person could get the app. And then all of the American-based companies that provide the grist, if you will, for the mill, they would be barred from helping TikTok function. So if the Trump team comes in, a day, an hour, whatever, in and says, just so you know, if you're one of the companies affected by this TikTok law, we're just not going to enforce it.
So you keep helping TikTok and we're fine with that. There's a few issues that were also pointed out in the argument. One, of course, each day that you did it would be a new act, and the statute of limitations is five years. Yeah. So a non-enforcement agreement may not help you as much as you think. There's stuff called estoppel though, David. If this administration says we're not going to enforce it,
Starting point is 00:28:41 and you rely on that to continue doing the thing, and the next administration swoops in and slaps handcuffs on you or whatever, you can say, look, that's ridiculous. We were told that this wouldn't be enforced, but I wouldn't do it. If you're Apple's general counsel or Google's and you're talking about access to the App Store
Starting point is 00:29:01 or Google Play, and you're going to say, well, the Trump administration isn't enforcing this, then one of the first things I'd say is, wait a minute, you're going to rely on that? The Trump administration, the first Trump administration had this idea to ban it to begin with. Like he could change his mind at any moment, much less he's not going to be president forever. What are you thinking? This has to go off the App Store now. So that's the injunctive option, if you will. Meaning, I think even under the injunction, all of the companies in question still stop providing necessary stuff to TikTok and it would cease to
Starting point is 00:29:39 do anything on your phone. The administrative stay option though, very different. This is just the court saying, we need more time. This has nothing to do with the merits. So we're just hitting pause and the law doesn't take effect on the 19th. It doesn't take effect until we release our opinion. Then TikTok would continue to work on your phone
Starting point is 00:29:55 for as long as it takes the court to write their opinion. Right. And I think that's possible. It doesn't even offend me that much, honestly, because, like, we're talking a week or two, fine. Like, if they really need that to write what is going to be a pretty complicated opinion, like, okay, I don't think that's a huge deal.
Starting point is 00:30:14 It would be a bigger deal to me if I thought they did it because it would allow the president, like, if it was a back door to allow the president-elect to do something or other, that would annoy me because again, then do the injunction, but then you have to find a likelihood of success on the merits, which as I said, 8-1 doesn't sound very likely to me, but I could see there being some votes for an administrative stay. Certainly, Alito and Kavanaugh seemed very interested in the administrative stay aspect,
Starting point is 00:30:43 maybe even the injunction aspect. And then once you've got those two, of course Gorsuch, the dissenter is gonna join with that, certainly. So now I'm at three, but I need two more. I don't know where they'd be coming from. No one else chimed in, but it was at the very end of the argument. I don't know.
You know, it's funny, Sarah, as we talk about this, as we've walked through this, I now have more confidence that the case is gonna turn out like we expected, but I'm still remembering that the first instinct I had after it was over, which was, whoa, wait, this isn't as clear as I thought.
But as we're walking through it and talking through it, I feel like maybe what we're doing is actually acting as if we were counsel for the case and trying to answer some of the court's objections. But I don't know. I don't know. It's that first bucket. It is that first bucket of who's speaking here, whose speech is this, that really is, I feel like, the crux of the issue. Well, let me give you then some of the reverse-side hypothetical problems they're trying to prevent, right? Because it's really easy to be like, this one-off,
yeah, TikTok bad, ban TikTok. But it's all the knock-on effects that the justices have to think about, that frankly the people reading headlines don't. I mean, this was the Trump immunity case, right? Everyone wanted you to throw Trump in jail, but they're thinking about the next president. Okay, Politico is owned by a German company. What is Congress allowed to do to prevent Politico from operating in the United States? Could Congress pass a law saying Politico must divest from its German
owned company and that company must sell Politico to an American company? And interestingly, I mean, I'll tell you, my instinctual answer was like, yep, they can do that. That was my answer as well. It was not Solicitor General Prelogar's answer. She was like, I think that would be different, because it would be clear, because Politico has a certain editorial perspective, that that would be targeting
Starting point is 00:32:47 the perspective, the content of Politico's message in a way that's not true for TikTok because TikTok's content is all over the place and the content in theory could stay the same under a different algorithm. Actually, I wasn't particularly persuaded by that answer. To the extent, Politico has a perspective, so does TikTok. That's the Uyghur problem where 80% of Uyghur content on YouTube is negative toward China, 11% of content on TikTok is negative toward China. But I'm not sure I see a problem with saying Politico has to be sold to an American company. It doesn't mean that I think an American... I mean, work this out with me, David.
Starting point is 00:33:33 Politico stays owned by Germans, and Congress says it has to be sold to continue to operate here in the United States. But that's not quite how it would work. It would be more like the distribution centers couldn't help distribute Politico or something like that. The Politico app, I guess. It does start getting weird. Why wouldn't Americans be able to read that content? Because we have said that they have a right to access information. Yeah. It's a really interesting issue.
Al Jazeera is maybe a better example for a lot of people. Al Jazeera is exactly the same as Politico. We just think of Al Jazeera as being a little bit more nefarious than Politico. Right, right. My instinct was also yours in response to that hypo, that is to say, yeah, yeah, Germany
doesn't have a right to own an American newspaper. And I'm still there. Like, I'm still there. Germany doesn't have a right. Because what we're dealing with here is not prudence, prudential judgments or congressional judgments. We're talking about the Constitution. Now, it might not be prudent to ban foreign ownership of American newspapers in the sense
that we live in a globalized economy, and that could hamper the financial prospects of American papers. It could harm the marketplace in some downstream ways, but all that's prudential. And it certainly doesn't stop an American-owned company from publishing any pro-German or pro-Chinese or pro-Gaza content that they want, and that is the most First Amendment protected. Exactly. And so that strikes me as a prudential judgment. Do we want foreign companies to be able to own, whether it's a hostile foreign company or a foreign entity like China, or a friendly foreign entity, for now, like Germany.
Starting point is 00:35:26 Germany has not always been, but the idea there, to me, that's a prudential judgment. The Constitution does not protect the right of a German company to own an American newspaper. Now, the question then would be, do Americans have a free association right to voluntarily place themselves under the corporate control of a German entity, but to me the instant you're under the corporate control of the German entity the relevant speaker becomes the controlling entity and that
controlling entity doesn't have a First Amendment right. And so that's why I was actually kind of surprised by Prelogar's answer there. Yeah, just some other notes that I took from the argument, right, Justice Barrett making the point that those content providers can actually still post their content to ByteDance. Like, ByteDance can't work with TikTok anymore
under this law, but the content providers who currently work with TikTok can work directly with ByteDance. They can also work directly with TikTok if TikTok gets a new algorithm. So like, how have the content providers been changed at all, except in the way that, you know, anytime there's some sort of antitrust issue, providers are affected? Like one of the justices, I think it was the chief, pointed out, like, could the AT&T users have sued with the breakup of Ma Bell?
Starting point is 00:36:50 Like, no, it may indirectly implicate them and their desired whatever, but we don't say that's a First Amendment issue. Yeah. You know, Barrett also making the point that the algorithm is the speech in question, which you mentioned, David, but like that is different than the Politico example or the Al Jazeera example, right?
It's the algorithm that's the speech, not the content of it. I mean, that just gets really messy. It gets messy in all of these cases. Yeah, no, because when you realize that the actual speech, and this is again, in an interesting way, ByteDance reaffirmed this when they said we're just gonna shut this whole thing down on the 19th, the operative speech at issue here is the ByteDance algorithm, not the speech of the individual content creators. And as Kagan pointed out, she said we've never said that speaker-based restrictions trigger strict scrutiny.
So this is a restriction on ByteDance, not even the algorithm itself. They're happy to have the algorithm sold to TikTok. ByteDance says they won't do that. But it's about ByteDance as a speaker, not anything about their content, the algorithm, like all of that can stay the same. That's up to ByteDance. And that was another theme that I heard throughout, was this all depends on third parties' actions, but we don't control what they decide to do.
So ByteDance's decision not to sell the algorithm, not to give TikTok any of the source code, that's their choice. But why should that choice trigger First Amendment strict scrutiny? Because they could choose something else, and then what, we'd say that it didn't trigger strict scrutiny? Oh, and David, lastly, I just wanted to talk about sort of what standard was applying. You heard Jeffrey Fisher, the content creator advocate, talk about text, history, and precedent.
First of all, I've said, I don't understand why any advocate, especially those coming in from the quasi-legal left, doesn't hit hard on this text, history, and tradition stuff. And he tried to, but he sounded like it was his, you know, second or third language, to be honest. It was like speaking in broken English, yes. Yeah, but then,
Starting point is 00:39:03 like is the court more like the French or more like the German? Like the French find it annoying when you try to speak their language and the Germans find it charming. I think some justices felt different ways about that. But we didn't hear a ton about precedent. The cases that we heard about,
as I mentioned, that Austin billboard case did come up, but really only from Solicitor General Prelogar. We heard about Arcara. This was a 1986 case about shutting down a bookstore because of illicit activities that were occurring there. Basically, there was prostitution going on at the bookstore. They shut down the bookstore and the bookstore argued, you violated our First Amendment rights. It was, what, a 6-3 decision that said it didn't implicate the First Amendment even though it had incidental effects on speech. The First Amendment comes in when the law enforcement action is aimed at significantly expressive conduct, which prostitution is not, or when the law inevitably affects speech disproportionately. Not here. They were shutting down lots of businesses where prostitution was occurring.
This one just happened to be a bookstore. You heard Noel Francisco try to argue that if this were about data security and it just applied to all types of companies, that maybe that would be similar to Arcara. But in this case, it's only social media companies fitting all of these foreign requirements, such that it really only applies to ByteDance and TikTok currently. That made it more like a rule, a law that said no prostitution in bookstores. I took his point on that.
Starting point is 00:40:41 Nevertheless, Alito pushed back and said, yeah, but they're the worst offender of this data security stuff. And look, they're pointing to them using it against journalists, they're pointing to the potential use. There's the classified information that we don't have access to the potential blackmail that Justice Kavanaugh mentioned, aren't they allowed to simply take out the worst offender, they don't have to treat everyone equally or take on the whole problem all at once, which there is precedent around Congress not having to solve everything all
at once as well. But David, overall, this case, it seemed like everyone acknowledged, was not going to be decided on precedent. You may say it's closer to this precedent or that precedent. Mount Healthy was mentioned, for those lawyers listening. It was mentioned quite a few times, but by and large, everyone was like, this feels really sui generis. The algorithm part of it, ByteDance having the subsidiary TikTok, then the Chinese government, it being found to be an adversary,
Starting point is 00:41:41 the idea that it could stay totally the same with the same algorithm, the same content, the same content creators. There's just no other case that even really gets you in the ballpark of all those things. You know, from a nerd perspective, this is a super fun case on two counts. One is, as you were noting, it's kind of sui generis.
You know, from a nerd perspective, this is a super fun case on two counts. One is, as you were noting, it's kind of sui generis. This is one of the first big cases of the new Cold War, is the way I would describe it. Oh, that's so true. That makes this very important, very interesting. And the other thing that I love about this from a nerd perspective is that the ideological valence of this is all screwed up. Like there's no clear red outcome and blue outcome here. You know, and this is the case with a lot of these tech cases. You know, we've talked about the TikTok blackout challenge case where you had Obama and Trump appointees combining. In this case, you had a- Trump himself. Trump himself is now on both sides of this case. Gorsuch is our YOLO justice. Splitting up Gorsuch and Thomas is always interesting to me. Oh, I know. I know. And then, you know, when it's all scrambled, I mean, the DC Circuit had a Reagan appointee, a Trump appointee, and an Obama appointee all on the same side.
Starting point is 00:42:51 So that's just, it just makes it very, very, very interesting. Last note, David, did you catch Justice Alito pronouncing the parent company of Facebook? No. So I think generally we pronounce it meta, right? Meta. Like metaphysical. Yeah, yeah. Metadata.
Starting point is 00:43:09 Yeah. He pronounces it meta. I did not notice that. Meta, OK. And I was like, I don't think he's on that platform a lot. No, I don't think so. I don't think so. That's hilarious.
Starting point is 00:43:21 All right, David. That will conclude. This is a TikTok only emergency podcast, but there has been a lot of legal news in the last 24 hours that we are going to cover extensively at our Monday live show at George Washington. We're also doing one at Catholic. That'll be a little more big picture
Starting point is 00:43:40 on the state of legal academia. But David, do you wanna give a little preview of Monday's George Washington episode? We got a 5-4 Supreme Court ruling allowing sentencing to go forward in the Trump case. We have the sentencing, if you want to call it that, in the Trump case. We have an oral argument preview of the Texas age verification law, which is what I'm writing about actually for my column this weekend. There's a bunch of fun stuff to talk about, Sarah. I can't wait for Monday. With that, thank you for joining. Also, David, I did, you know, I've experimented with a lot of different ways to listen to Supreme Court oral arguments. My most
famous way, I think, for podcast listeners is in the shower with my phone in the little Ziploc bag. That is basically impossible to do when I know it's gonna be a really meaty argument like this, because I have to take notes while I'm listening, because we're not going to get a transcript in time, so it's not going to be searchable. I can't do that in the shower. So generally, I just like, you know, hole up at home. But David, I tried something new today. I don't know if you've tried other places to listen to arguments. I thought it would make a lot of sense. I haven't gotten my nails done since like September, October, bad stuff, right? It's been a really long time, multiple months for sure.
Starting point is 00:44:50 So I was like, this is great. Nobody will be at the nail salon today. I'll just go, I'll have my laptop there, my toes in the water, and I'll just be, have my AirPods, my phone, and my laptop, all three different devices open and working. It was a terrible idea, David. Why?
Starting point is 00:45:04 Because they're like, they have to ask you questions. You have to move around. It just, no, it did not work. The AirPods are falling out. You can't focus. I mean, I was about to say welcome to the party, Sarah. I exclusively listened to Supreme Court oral arguments while getting manicures.
Starting point is 00:45:21 The only way to do it, right, frankly, I mean. So you can see, I don't know if you can see through the video, but I then ended up with this like very metallic tone that I didn't really aim for. I don't know, it's, it was all a disaster. This technical struggle is real. I had to drop off something at the post office
because yesterday the post office was closed because of the funeral for President Jimmy Carter. Then there's ice and there's snow. School was on a delay today, but it's our first day back in three weeks. I'm just saying, if today wasn't our best emergency podcast, it's my fault. No, I mean, I think what you're actually saying in kind of a passive-aggressive way is you went to heroic lengths to do this podcast and everyone should be grateful. I didn't have to get my nails done. I didn't have to go to the post office. No, I mean, the heroism on display.
Starting point is 00:46:10 It was all terrible. All right, well, with that, we hope to see you Monday, but we'll have lots of great episodes coming out next week. Oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh, oh,
