Texas vs. Platforms … vs. The First Amendment

Last week the Fifth Circuit upheld a Texas social media law that, among other things, prevents platforms from discriminating against users based on their viewpoint. The leading opinion declared that a bunch of things we thought we knew about how the First Amendment and content moderation work are wrong. Next stop: the Supreme Court.

Evelyn talks with Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center, and Genevieve Lakier, Professor of Law and the Herbert and Marjorie Fried Teaching Scholar at the University of Chicago, about what the ruling said and what it means—to the extent that’s decipherable.

Transcript

Daphne Keller:

The Texas law requires the platforms to be viewpoint neutral in their enforcement of most of their content moderation rules. There’s some important exceptions to that that I think we should get to later. But many people read that to mean, if you take down pro-Nazi speech, you must also take down the anti-Nazi speech. If you take down speech discouraging anorexia, you should also take down speech encouraging anorexia. I got that backwards. But the consequences of the viewpoint neutrality requirements seem quite dire.

Evelyn Douek:

Welcome to Moderated Content from Stanford Law School, podcast content about content moderation, moderated by me, evelyn douek. The community standards of this podcast prohibit anything except the wonkiest conversations about the regulation, both public and private, of what you see, hear and do online.

We are starting with a cracker. I’m not a sports person, but whatever the game before the Super Bowl final is called, that’s where we are. The Fifth Circuit handed down a ruling last week in the NetChoice litigation around Texas’s HB 20 social media law. We’re going to get into the specifics of the legislation in this episode. But for now it’s enough to say that the law wants to radically transform what you see when you log onto the internet by, amongst other things, preventing platforms from taking things down in a way that is viewpoint discriminatory.

The Fifth Circuit had stayed the district court’s injunction of the law from back in December. But the Supreme Court vacated the stay in May. And also in May, the 11th Circuit struck down most of a similar law passed in Florida. So we’re at this moment where everything is perfectly teed up for the Supreme Court to take the case. And that, my friends, will be the First Amendment Super Bowl.

So I’ve got together two of the people who I knew I wanted to talk to and hear from the moment the decision came down. Daphne Keller is the director of the Program on Platform Regulation at Stanford’s Cyber Policy Center and was formerly Associate General Counsel at Google. When anything to do with platform regulation happens, it’s like a bat signal to Daphne, and the rest of us start refreshing her Twitter feed. Genevieve Lakier is a professor and the Herbert and Marjorie Fried Teaching Scholar at the University of Chicago Law School. She writes and thinks about the past, present and future of the First Amendment and, well, was cited by the Fifth Circuit a bunch of times. We’ll come back to that. But for now, congratulations.

So thank you both for coming on. No pressure, but you are my first guests. And so the entire success or failure of this new project is in your hands. We have three to five minutes as people decide whether to keep listening. So please front load your spiciest takes. Just to give listeners an idea of where we all stand and what the tone of this episode is going to be, I’m going to give you each a second to tell me on a scale of one to 10, 10 being completely off the planet, this is not even law territory, how bonkers do you think this ruling is? How surprising was it to you? So Daphne, where are you on the scale?

Daphne Keller:

Well, in terms of the actual outcome, surprise level three. In terms of the logic or illogic and tone of how we got there, this is Dobbs-like. I would put it at an eight or nine in terms of its sort of smirking disregard of arguments and facts advanced in the briefs.

Evelyn Douek:

And Genevieve?

Genevieve Lakier:

Better be interesting. It’ll be interesting. So to Daphne’s point on the surprise of the outcome, I guess I’d be even less surprised. I thought that this court in its injunction ruling really signaled this. So this is like a one or a two. It’s very predictable in that way. In terms of the content, I suppose I would say the discussion of the cases I’d put at a seven, I think a little lower than Daphne, because I think that the court does do interesting work trying to reread the cases. It’s doing something law-like in making sense of all the different cases. Its discussion of history, the common carriage parts of the opinion, wow! That’s an eight or a nine.

Evelyn Douek:

Let’s dig in a little bit more specifically to make sure that we comply with the community standards for the wonkiest possible conversation. It’s a little hard to know where to begin. Judge Oldham’s leading opinion comes in at a cool 90 pages. And it comes out of the gate strong with the declaration that the platforms argue that buried somewhere in the person’s enumerated right to free speech lies a corporation’s unenumerated right to muzzle speech, and that today we reject the idea that corporations have a freewheeling First Amendment right to censor what people say.

Daphne, let’s start with you. And let’s start with the must carry obligations that Judge Oldham is really talking about here. Can you give a brief description of what section seven of HB 20 would require, and the gist of the court’s handling of it?

Daphne Keller:

So the Texas law requires the platforms to be viewpoint neutral in their enforcement of most of their content moderation rules. There are some important exceptions to that that I think we should get to later. But many people read that to mean if you take down pro-Nazi speech, you must also take down the anti-Nazi speech. If you take down speech discouraging anorexia, you should also take down speech encouraging anorexia. I got that backwards. But the consequences of a viewpoint neutrality requirement seem quite dire, although it is debatable whether it might have some more limited meaning. And the court, strangely, seems to think it has some more limited meaning.

Evelyn Douek:

Genevieve, did you want to weigh in on that?

Genevieve Lakier:

Well, I’m actually working on a piece trying to think through what a viewpoint discrimination ban applied to social media platforms would mean. And I agree with Daphne. We’re not entirely sure because there are a lot of different definitions evident in the First Amendment cases. And of course this is a statutory regime. So courts, state courts, interpreting it in Texas would not have to rely on the First Amendment case law, although I think they probably would.

So I think some advocates have argued that it would disable or eviscerate content moderation in general. I think that’s very unlikely to be true. I think spam regulation would still be fine. I think a lot of privacy regulations would still be fine. I think there’d be a lot of fake account kind of stuff that would still be fine. But Daphne’s correct. It would require the platforms to keep up, at the minimum, a significant amount of speech, and on a more maximalist interpretation, a huge amount of speech that they don’t want to.

Evelyn Douek:

I mean, Daphne, you flagged this, the exceptions to the must carry obligations that the court sort of suggests would mean that platforms wouldn’t become entirely unusable because they could keep taking down all of this really bad stuff that no one wants to see. Can you talk a little bit about that and what you think of Judge Oldham’s view there?

Daphne Keller:

Sure. So there’s sort of two levels to this. There’s the court’s implicit assumption that some legal speech is more legal than other legal speech. And then there’s the division that’s actually in the statute. So the court is quite mocking about the platforms’ “obsession” with Nazis and terrorists, which anybody who has dealt with content moderation at all knows is an obsession grounded in the truth that there is Nazi and terrorist content being uploaded constantly to platforms, and taking it down is something they work really hard to do.

The court seems to think that, “Oh! They can probably still take that stuff down. But they do have to leave up all of the legitimate political debate and dissent.” And that implicitly depends on an idea that the state of Texas can mandate different treatment for some legal speech than for other legal speech. The legal speech it likes gets this carriage obligation. So Twitter has to leave it up. And the legal speech it doesn’t like, whatever that is, platforms actually can take it down, if it’s really, really bad but barely legal, lawful-but-awful speech. So that’s sort of implicit in the court’s opinion, although I’m not sure if the court recognized that implication.

And then in the statute itself, there are some carve outs where you can take content down regardless of its viewpoint if it falls in a couple of really ugly categories like, roughly speaking, incitement to racially motivated violence or harassment of victims of sexual abuse, both of which … Those are categories where there’s some illegal speech in there. And it’s not surprising that Texas wants to let platforms take down illegal speech. But there’s also barely legal, lawful-but-awful speech in there. And it’s like the Texas legislators wanted a common carriage standard, but they couldn’t quite make themselves require carriage of that stuff. So they also are making a division between the legal speech that has to be carried and the legal speech they don’t like [inaudible 00:09:16].

Genevieve Lakier:

Can I jump in, because I do think the exceptions are really interesting, because they clearly are meant to map onto First Amendment exceptions, speech that isn’t considered constitutionally protected. But I don’t know why, I suppose for precisely the reasons Daphne suggested, the Texas legislature didn’t just use the First Amendment categories. It uses its own language. And it is true that the language it uses suggests that these categories are configured differently than the First Amendment cases, meaning that it sweeps outside of the category of unprotected speech. There’s speech that’s been referred by child advocacy organizations. They’re weirdly fashioned.

Now you could imagine on appeal, when it goes up to the Supreme Court, the Supreme Court could reread them or narrow these exceptions so that what they cover is only unprotected speech. I think there’s still a question there because under contemporary First Amendment doctrine, under a case named R.A.V. v. St. Paul, a Scalia opinion, the legislature still cannot pick and choose within unprotected categories the kinds of speech it doesn’t like and leave other kinds of speech unregulated, sometimes. It’s actually a very complicated holding. So there’s still questions there. But I took the exceptions to be trying to get at unprotected categories of speech. And I think that that is relatively simple for a court to clean up on appeal. Do you disagree?

Daphne Keller:

Well, I mean, there is another enumerated exception saying platforms can take down “unlawful expression”. So it seemed like that would capture the thing you just described and leave it very questionable why they use this language that’s almost like the legal standard, but is not the legal standard, to describe these sort of violence and abuse related categories of-

Genevieve Lakier:

And just picking up on Evelyn’s commandment that we be as wonky as possible, I’ll just say in response to that, that raises this really interesting question. So let’s assume that those exceptions are not meant just to capture unlawful expression, that they’re going to include lawful speech. Then there’s this question from the Barr case, Barr v. American Association of Political Consultants, or whatever it’s called, the robocall case, about whether the remedy, if these exceptions are a problem, is just to read those exceptions out of the law or to strike down the law entirely. You could imagine.

Evelyn Douek:

It wasn’t my commandment. It was the community standards of this podcast. There’s nothing I can do. They mandate what we have to say. So let’s talk about editorial judgment. This concept looms large in Judge Oldham’s opinion. He says the platforms can’t just shout, “Editorial discretion,” and declare victory. He is kind of flippant and snide about the idea. The main argument here being that content moderation is the platforms’ editorial discretion, it’s their version of speaking, and that to interfere with that is to interfere with their own free speech rights. So Genevieve, can you tell us a little about the concept of editorial discretion, how it’s been treated in First Amendment doctrine, and what role it’s playing here?

Genevieve Lakier:

I mean, this is a really fascinating part of the opinion. So the term editorial discretion I think dates back to a really important 1970s Burger Court case called Miami Herald Publishing Co. v. Tornillo, in which the court strikes down a Florida right of reply statute that required a newspaper, the Miami Herald, to publish the reply of a candidate if it criticized that candidate in the run up to an election.

And the court says in Tornillo … it’s a really amazing opinion because the side defending the statute is represented by Jerome Barron, at the time a really important First Amendment litigator and scholar, who really believed that the First Amendment required these kinds of right of reply laws, that a healthy public sphere requires newspapers to give up space when necessary to those they criticize, given their control of the market and the inequitable access to the public sphere.

Jerome Barron was all about rights of access. And so his brief includes this, I think to listeners, really familiar argument about corporate power, monopoly control of the public sphere, but in this case targeted at newspapers rather than at platforms.

And in Tornillo, the court resuscitates all of that argument in the opinion. It says, “Oh yeah, the state of the industry is really bad. Newspapers really do control a lot. There’s not a lot of competition. There are barriers to entry. This is very bad for all of the kind of democracy and public facing concerns that we associate with the First Amendment. But too bad.”

In the end, it does a sharp U-turn. And the court says, “There is this thing called editorial discretion. Newspapers in our system just have the power to decide what gets published in their paper.” And the language in the opinion is pretty broad. It’s not just the political viewpoint of the paper, but what gets published and not published, the size, the shape, everything having to do with the composition of the paper. And the court says, and this is important here, that’s because newspapers have limited space. And so if they have to publish the right of reply, they don’t get to publish what they want to publish otherwise.

But then there’s this kicker of a last paragraph in which the opinion says, “And even if that’s not the case, even if,” as if they were predicting the internet, “even if there’s infinite space, it still exceeds the power of government to tell newspaper editors what it is that they can publish. This isn’t just a practical concern about space. This is a fundamental principle of First Amendment law.”

Now in saying this, the Tornillo court was picking up on decades of precedent dealing with the rights of newspapers and other media producers to decide what gets put in their broadcasts or their newspapers and what doesn’t. But it fleshed it out in this really expansive way. And since then, the court has really run with this idea of editorial discretion as a core principle of how it is that the First Amendment protects speech. The idea being that private media companies, and I think often just private actors, private property owners in general, have just a lot of discretion, not only to say what it is they want to say, but to make choices about what speech doesn’t appear in their newspapers or doesn’t appear on their property. And this is how we safeguard the independence of the public sphere from the government.

Evelyn Douek:

So you just spent a lot of time talking about newspapers, because that’s what the precedents are about. And there is this argument that platforms are just like newspapers. They’re exercising editorial discretion just like newspapers. But I have to say, I also find that somewhat unsatisfying, because to me they don’t really seem like newspapers. They do seem different. And so I have some sympathy for that argument. I guess, Daphne, if you could talk a little bit about how editorial discretion does play a role in content moderation, whether it is or isn’t like what newspapers do.

Daphne Keller:

So I mean, I don’t think platforms are just like newspapers. There are a lot of fairly obvious differences. But the court has taken this newspaper-like editorial power, and found it in the hands of a lot of other entities that are not very much like newspapers. And so at the extreme, the parade organizer in a case called Hurley said that it had a right to exclude a gay rights organization from the St. Patrick’s Day parade because it didn’t want to be compelled to carry the message that a gay rights float, or whatever it was going to be, would’ve conveyed.

And the court said, “Well, even though this organizer barely paid any attention to what was going to be in the parade most of the time, and had maybe never kicked anyone out before, there wasn’t really a history of editing, and the organizer hadn’t been doing it for very long. Still, the choice to exclude something based on its message is an editorial decision. And we’re going to uphold a First Amendment right to do that.”

And so if that’s the standard, what platforms are doing is clearly way more editorial than that. They spend all kinds of money coming up with posted policies and justifying those policies in interviews and hearings and blog posts and hiring and training people to enforce those policies, which are speech policies about what user posts they do and don’t want to carry. They’re doing this thing that I think is clearly editorial. And whether or not it’s identical to the editorial actions of newspapers, I think could matter for a case like this. But under the Supreme Court precedent, a lot of things get to assert First Amendment rights, even if they don’t look very much like newspapers.

The other major one being cable companies, who we know from the Turner cases, even though cable companies don’t do a whole lot of selection of channels and aren’t paying that much attention to what flows over the channels after they make this selection, nonetheless have a First Amendment right, but a First Amendment right that can be overridden if there is sufficient state interest and due process.

Genevieve Lakier:

And I mean, I guess what’s so amazing about the Fifth Circuit opinion, this goes to why it’s a seven or an eight, I suppose, is because it didn’t make the argument that, Evelyn, you were suggesting, which is to say, yes, we protect this thing called editorial discretion, but newspapers exercise it differently than platforms. And so the rules are different. And so let’s think about how that applies in this case. That’s along the lines of what the 11th Circuit did when it reviewed a different, but similar, social media law that applied there in Florida.

What the court said here, and this is why it was, I think, really shocking, it said, “Despite all of these cases,” and I agree completely with Daphne that Turner and Hurley both recognized, although they didn’t always use the code word editorial discretion, Turner did, Hurley didn’t, they recognized that parade organizers also had this power and cable companies also had this power.

The Fifth Circuit just says, “Nope. The First Amendment protects speech. But what’s this thing editorial discretion? It’s just words that appear in some cases. It doesn’t really have any special significance.” And by doing so, I think they effectively write out any possibility that the platforms have a First Amendment interest in content moderation, or that the First Amendment protects that process. The court says it protects the speech, but it doesn’t protect the process.

And then this is why I think it is doing law. It’s just a pretty surprising interpretation of the precedents. It rereads Hurley, it rereads Tornillo, it rereads Turner, all of these cases, to say, “No. The only cases in which something like editorial discretion is protected is when the host of that speech is intimately connected with the speech.” And it’s using this language it picks out from Hurley, which is I think a case that squarely reaches a conclusion that is contrary to the kind of conclusion the Fifth Circuit reads here. It’s pretty [inaudible 00:20:07], as my mom would say. It’s ballsy.

They’re taking Hurley, which really stands for the opposite proposition. They’re taking these two words out of it and saying, “No, no, these are the new magic words. Only if you’re so intimately connected that in effect this is your own, we’re going to identify the speech you’re hosting as your speech. Only in those cases is it protected.”

And so therefore, because the platforms have so much speech, and they do, the court claims, mostly, although I think this is a contested claim, ex post rather than ex ante content moderation, the platforms are just not intimately connected to the speech. So they have no First Amendment rights at all. And this is a very surprising and sweeping claim.

Evelyn Douek:

Judge Oldham just recasts editorial judgment as censorship. Literally kind of just says, “No, no, what you’re doing is not editorial judgment. It’s censorship.” And then repeats literally four times in the ruling, “Section seven does not chill speech. If anything, it chills censorship,” as if saying it enough times just makes it true. But of course the decision not to publish something is as much an editorial decision as the decision to publish something. I am choosing to not say many things right now. And that is an expression of my judgment about what might be good for listeners to hear.

What’s missing from the ruling? So in some ways it’s really easy to criticize all of the things that the court affirmatively got wrong. It’s right there; it’s sort of asking for it. In some ways, I think it literally was asking to get people all riled up. But what is it, as experts who know other areas of the law and other precedents, what did the court forget about? And Daphne, I’m thinking here about how you tweeted about Halleck, and I’m wondering if you could talk a little bit about that.

Daphne Keller:

Sure. So Halleck is part of a line of cases about cable television, and to what extent the government can compel carriage of content on cable television. And one thread of this is about public access channels, which the youth who are listening may remember from Wayne’s World. This is part of cable TV, where all kinds of kooky things happen. It was kind of like the local free speech channel.

And so the question in Halleck was when the … Let me back up. New York, the municipality, or maybe it was Manhattan, had hired an independent vendor to sort of administer the public access channel. And they kicked off some activists who were more or less in a fight with the administrators of the channel. And the activists sued, saying they had a First Amendment right to be on this public access channel, which had been preserved for public use by statute in order to serve the constitutional purposes of the First Amendment.

And in an opinion from Justice Kavanaugh, the court said, “Nope, you don’t have a First Amendment right to be on this public access channel. This is a private actor you’re suing. They don’t owe you any First Amendment protections, and you lose.” And just to digress a little bit, what troubled me most about that is that the state was supposed to be protecting speech on the channel, and they hired a vendor to do that for them. And when these plaintiffs sued initially, they sued the state and the private vendor, or rather the city and the private vendor, and the lower court said, “Well, it wasn’t the city that kicked you off. So they’re no longer in the case. Continue with suing the private vendor.”

And then eventually, the Supreme Court said, “Well, the private vendor doesn’t owe you any First Amendment protection. So you can’t win there either.” And that seems like a very troubling case of the government delegating its responsibilities to a private vendor, and therefore getting rid of the protections that should have existed for users under the First Amendment.

But to bring it back to this case, that is a very recent case from the Supreme Court pretty solidly supporting the platform side of this fight. And the Fifth Circuit kind of dismisses it in a footnote, and doesn’t really engage with it.

Genevieve Lakier:

I mean, I guess to say something controversial, which is to say something slightly positive about the Fifth Circuit opinion. So I’ve long thought that cases like Halleck provide too much protection for this thing we might call editorial discretion. Halleck is I think the best example of how far the court was willing to take it. And at the time, I wrote a short piece about Halleck because it was Justice Kavanaugh’s first First Amendment opinion for the court. And so it seemed like a useful tea leaf for what the future of the court would hold. And I read it as, “Oh! My gosh, right? The court is doubling down on this highly private-property-protective, editorial-discretion-to-the-max interpretation of the First Amendment.” Oh! How times change.

We’re just a few years later. Although Halleck does make me wonder how this part of the opinion is going to be treated if, as I think is quite likely, it goes up before the Supreme Court. I think Justice Kavanaugh is not going to like this whole dismissal of this idea of editorial discretion.

So there is something to be said for thinking that editorial discretion is less like an end-of-discussion trump card, as it has been used in a lot of the cases. You might think, for example, in the Hurley case, the LGBT parade float, the LGBT group who wanted to be part of the St. Patrick’s Day parade, they also had First Amendment rights at stake, and the court absolutely rejects that.

There’s a case called Boy Scouts v. Dale, which also involves the right of a gay scoutmaster to be a scoutmaster. And in that case too, the court absolutely only focuses on the rights of the Boy Scouts rather than the individual speaker. We might think that all of that is really problematic. And so there could be an approach that balances the rights of a host to engage in editorial discretion and other kinds of speakers’ rights.

So that’s my word of praise for the Fifth Circuit. But then what the Fifth Circuit does is it blows completely past that idea that maybe we should think about reconciling these rights, or have a more rich conception of the rights, to just say, “There are no rights on the host side whatsoever.” And that was dramatic.

Daphne Keller:

I mean, I think there is an intellectually honest case to be made for some kind of First Amendment middle ground where platforms are kind of like newspapers, but kind of not, and so they have weaker editorial power. But that’s not the case … well, that’s not the case that the Fifth Circuit made. I think it’s also hard to reconcile with the case law, but that almost doesn’t matter. I think if there’s a plausible theory that doesn’t quite match the case law and it gets to the Supreme Court, they could easily adopt that theory.

One case that I think maybe deserves more attention in that vein is Pruneyard, which is the case saying that a shopping mall owner didn’t have a First Amendment right to exclude leafleters when California law, under California’s constitution, gave them a right to be there. And in that case, the court talks about how, well, of course, if the leafleters were being disruptive and, this is not a direct quote, interfering with the normal business at the shopping mall, that would be different. And then the shopping mall probably would have a First Amendment right to kick them out, or some kind of right to kick them out.

You could draw an analogy to platforms there, and say like, “Well, if it’s mobs harassing other users, that is disruptive and you have a right to kick them out. If it is civil debate that’s not bothering anyone, then it’s in that Pruneyard category.” But I haven’t really seen much work trying to identify that.

Evelyn Douek:

And we’re going to come back to the realpolitik of this, and whether the law and the cases even matter, after spending another half an hour or so. Let’s keep discussing the law and the cases. And then we’ll end with the notion that none of it mattered. Spoilers. But let’s talk about the common carrier laws, the common carriage section. Genevieve, this is the part where you are cited by the court a bunch of times. So let’s describe in a nutshell the debate that’s happening here, and what the court cited you for, and what you thought of that.

Genevieve Lakier:

One of the ways in which Texas tried to justify the law was to say this is just a common carrier law. We have a long history of imposing common carriage obligations on media companies like telephone companies and telegraph companies. And under Obama, the FCC instituted a rule making ISPs common carriers. And then, this is a sign of how quickly the politics have shifted, the Republicans opposed this as a tremendous intrusion on the autonomy interests of the ISPs. But I think the FCC made a strong argument that common carrier laws were a good idea when it came to net neutrality. So Texas says, “This is just like that. We’ve done this, we’ve upheld these kinds of laws many times over our history. This is common carrier regulation.”

And then the platforms responded by saying, “This isn’t a common carrier law. First of all, platforms are unlike telephone companies and telegraph companies and ISPs in that they’re constantly moderating content. They’re constantly curating content. And so that takes them outside of the category of common carriage. And second, a legislature cannot just declare that something is a common carrier law and make it so. The First Amendment imposes constraints on what legislatures can do. And so the First Amendment recognizes a very narrow category of laws that we can think of as common carrier laws that apply only to certain kinds of entities. And this is not an entity of that sort.”

And there is case law, not terribly good Supreme Court case law. The most relevant case law is a DC Circuit opinion, one where then-Judge Kavanaugh, before he joined the Supreme Court, happened to be on the panel that decided the case, that seemed to say more or less that a common carrier is an entity that holds itself out to the public as open to all carriage, and it doesn’t engage in selection about what speech flows through its pipes. It’s basically just a dumb pipe.

Kavanaugh resists this. He wants more than that. And I have to say, I think this is a very unsatisfying definition because, as the Fifth Circuit says in its discussion, it allows game playing. You can just say, “We are not open to all the public. We’re open to 99.5% of the public. But we really don’t like brown-haired men who are six foot two.” And then common carrier law cannot apply to you, whatever it may be.

So it’s true that the cases don’t have a terribly satisfying definition or analysis of common carriage. But there is this effort to use common carrier law as an exit ramp from the First Amendment. And I think there’s a lot of history and case law that suggests that that shouldn’t be successful in this case, because, as I argue in the paper that the Fifth Circuit cites, but they only cited certain portions, not the rest, come on, it’s a very long paper, so I understand, I’m sympathetic. But they could have kept reading.

In the early 20th century, when these common carrier laws that applied to the telephone and telegraph companies were proposed, there was a proposal to extend them to this new important medium of the radio. And it came to Congress, and there was an effort to embed, in federal legislation, an obligation to impose common carriage on radio.

It was soundly rejected because of the understanding that that would destroy the value of the radio, because the whole point, people listened to radios because they wanted to have programs. They wanted the radio operators to select and curate the content in a useful way. They didn’t want Wayne’s World, to pick up on Daphne’s analogy. I guess we’re both dating ourselves, Daphne.

Evelyn Douek:

I know what Wayne’s World is, just saying. That’s okay.

Genevieve Lakier:

Okay, sure you do. And so there was this whole development of another body of law that in my paper I call quasi common carriage that doesn’t impose strict common carrier obligations because Congress and scholars and broadcasters, the industry recognized they weren’t appropriate.

And the Fifth Circuit in its ruling just blows right past that and says … I mean, this is also a pretty amazing part of the opinion. It says, “Well, let’s go all the way back to the late 19th century jurisprudence that said we can impose pretty onerous non-discrimination duties on businesses when they’re affected with a public interest. And that’s how we’re going to define common carriage for purposes of First Amendment adjudication. We’re not going to worry about the fact that this was a famously terrible standard, because judges poured into that affected with a public interest whatever the hell they wanted to. And it was very malleable. And who knew what was and what wasn’t affected with a public interest.”

And the Fifth Circuit says the only reason this was rejected was because the Lochner court made a terrible mistake and was doing all these bad things. But no. Affected with a public interest was rejected because it’s a terrible legal standard. And here, the Fifth Circuit seems to be trying to resuscitate it.

Evelyn Douek:

So in another amazing part of the opinion, we can’t avoid the famous or infamous section 230 here. Judge Oldham spends quite a bit of time talking about this in a section that really reads like an aha, gotcha kind of section. So the basic suggestion is that platforms are being hypocritical by relying on section 230 in other contexts to protect them against liability for other people’s speech, saying that they shouldn’t be considered the publisher of it.

And then they argue in the context of this litigation that content moderation is editorial judgment. Judge Oldham calls it a radical switcheroo and a stark about-face, which at least made getting through the 90 pages of his opinion slightly more entertaining. In some ways, this is a superficially attractive argument. It makes a very good tweet. But Daphne, could you talk about it a little bit and what’s wrong with it?

Daphne Keller:

Sure. So what’s wrong with it? One part is it disregards the history and purpose of section 230, which was explicitly to encourage platforms to curate and editorialize, to encourage them to do exactly the thing that the court here is saying they have no right to do. And the mechanism for doing that was sort of twofold.

One was to say you’re immunized from must carry claims, from the very kinds of claims that Texas enables here, from people saying you took me down, and you have to reinstate me. But the other prong, which was equally important, was that the platforms had to be immunized from claims that content they were hosting was illegal, so that they’d be immunized for, for example, defamatory posts by users, because Congress was looking at the time at a pair of cases where the platform that tried to find and take down bad content was deemed to be an editor, and held potentially liable for defamation posted by users. So the reasoning behind that part of 230 was that they need immunity for claims like defamation also, or they won’t go out and do this kind of curation and editorializing that we want them to do.

And so the court’s idea that there is some conflict between asserting immunity for defamatory posts put up by users under 230, but also asserting a right to moderate content, ignores the entire logic of 230, and I think kind of ignores this larger backdrop problem about the internet, which is you can’t just classify everyone, all the intermediaries and platforms, as if they were either newspapers, purely legally responsible for whatever they published, or common carriers who have to let everything through, because the platforms that are actually useful to us are the ones that are somewhere in between those two things, that do allow us to post immediately without pausing for legal review in case we said something defamatory, but that also simultaneously do try to enforce some content standards so it doesn’t all become a free speech mosh pit hellhole. And so we need a set of laws that kind of encourage both of those things at once and try to strike a middle ground. And the court is really disregarding that.

Evelyn Douek:

That’s my new band name, Free Speech Mosh Pit Hellhole. I mean, I think we’ve been talking a lot about the law and how the court gets it wrong. But it’s equally amazing how wrong they get the facts in many parts of this in the description of how social media platforms work.

Just to pick a few examples, Oldham says that the platforms’ talk about the harmful effects of forcing them to carry certain kinds of toxic speech is an obsession with terrorists and Nazis and extreme or fanciful hypotheticals, which again, as Daphne said, two seconds of looking at a content moderation review feed would make very concrete. Oldham also really emphasizes the fact that content moderation decisions are made after content is posted, rather than before, like a newspaper might.

And of course, platforms do sometimes make content moderation decisions after content is posted. But a significant amount, like 90-plus percent, or even higher than that, of content moderation decisions happen before content appears on their sites. It’s what happens between that moment when you click post or tweet and the content going live, where there’s like a loading bar while all the AI filters are looking at it. The concurrence even used the words algorithmic magic to describe what the platforms do.

So just to put a fine point on it, Daphne, is this a highly technically accurate understanding of platforms’ operations? And I guess, given that I think I know the answer to that, how much do you think that affects where the court came out on this, this wrong understanding?

Daphne Keller:

I agree that it’s completely inaccurate for the reasons that you describe. And I am confident that the briefs the court had in front of it clarified a lot of these things, including the large amount of actual Nazi and terrorist content and the proactive moderation, some of which happens before content is posted, so it is in this sort of ex ante category that the court cares about.

I think the distinction between ex ante moderation and ex post moderation shouldn’t be very meaningful. Both of them are exercises of editorial control. The timing of moderation just depends on the facts of the situation. Is this a kind of prohibited content that platforms have seen before and have some automated tool to find? Or is it something that a user had to report? It kind of doesn’t matter. It’s all an exercise of editorial control. So I found that section not particularly useful, other than the entertainment value of seeing it get a lot of facts wrong.

Genevieve Lakier:

Well, the fact that the court is wrong on both the law and the facts, I think, raises really interesting questions about strategies for appeal, and what the Amici are going to do, because you could imagine someone saying the easiest route is just to challenge the facts, to use the legal framework that the Fifth Circuit is suggesting and just say, “Oh yeah, but you are intimately connected to the speech, because it turns out 95% of the content moderation is happening before.” And so it’s not the basis on which the Fifth Circuit ruled.

That seems like a mistake to me, because I think the legal framework is problematic. But it does offer up a lot of opportunity for challenge. And I presume that there’s going to be a lot of fighting about the facts when this goes up before the Supreme Court, which I think it will.

Evelyn Douek:

And I mean to Daphne’s point, I’m sure the briefs covered it extensively, and it might be a kind of willful blindness to the facts or molding the facts to suit the argument. And even if it is just an honest misunderstanding, there’s also a question of whether the Supreme Court justices are going to be any better in their technical understanding of platforms. Yeah, Daphne?

Daphne Keller:

I think another question for the petition and for the Amici is which version of conservatism do you play to? Do you go with the line of thinking that says conservative justices have historically been supportive of property owners against claims of the rabble wanting to come onto their property? If you go back to cable cases like Denver Area, which is a fascinating, but extremely difficult to read, ruling from the nineties, you see Justice Clarence Thomas authoring a partial concurrence saying that cable companies basically are property owners, and this is a reason why you shouldn’t have any kind of First Amendment right to come on and speak on public access channels.

That is similar, I think, to the reasoning that we see from Justice Kavanaugh in cases like the DC Circuit case you mentioned earlier, Genevieve, and the Halleck cable case. So you could write a brief aimed at conservative justices that emphasizes that.

But as we see from the Fifth Circuit opinion here, there’s a whole new generation of conservative judges in town who don’t seem to care about that property rights line of argument and are much more interested in originalism. In fact, the court sort of contemptuously says that the platforms advanced arguments based just on First Amendment cases, just on Supreme Court precedent, and didn’t even try to engage with originalism.

So is there this other line of argument that is somehow about the originalist approach to internet platforms, or that just plays to the purely, as I see it, political stance that we see Justice Thomas taking now in things like what he wrote in the Knight case and the Malwarebytes case, saying platforms are censoring conservatives, therefore now there should be a carriage law?

Evelyn Douek:

You two keep wanting to jump ahead and talk about the politics and the realpolitik here, which I think really speaks to the fact that that is what is looming large here, that it is sort of at the center of this debate. But I do want to spend a little bit longer on the decision itself, because I want to talk about the transparency obligations, where I think we might have some fun disagreement, because this is I think a really important part of the decision too.

So this is about requiring platforms to report, to disclose certain things like their community standards and the number of takedowns and things like that. So maybe let’s just start. Genevieve, can you describe the transparency obligations here, and what standard the court used, and what Judge Oldham said about their constitutionality?

Genevieve Lakier:

Well, Judge Oldham thinks everything’s fine in the law. Totally cool. He says, with the must carry provisions, the anti-discrimination provisions, he’s going to cover all the bases. He says no First Amendment rights apply. But even if First Amendment rights do apply, this is content neutral, and we’re just going to apply intermediate scrutiny. And there’s basically no other way that the government could have achieved its interest of ensuring broad access to the public sphere. And this isn’t unduly burdensome.

And basically he runs through the same analysis here. I mean, what is interesting to me … actually, Daphne, you can correct me if I’m wrong. I mean, I think this was an interesting part of the opinion, because in the 11th Circuit opinion, the case involving the Florida law, the court there struck down the must carry parts, but it upheld a lot of the transparency provisions under a pretty elaborate analysis of the Zauderer standard. And here the court is much less interested in developing new law when it comes to transparency. It just thinks it’s all fine because it’s all serving a legitimate government purpose.

Daphne Keller:

I didn’t think that the 11th Circuit’s analysis was particularly elaborate. So let me kind of give a background here. For the transparency provisions, which are numerous and quite different from each other, the platforms advance a pretty simplistic argument saying, “We’re just like newspapers. And so making us say anything about our editorial process is a grave burden on speech rights. Therefore, all these transparency rules should get strict scrutiny, and therefore likely fail.” That’s their side.

And then the states and many of the Amici say, “No, no. Platforms are just like every other business. And so the standard should be the one from kind of standard consumer protection law for normal vendors of goods and services. And it should be the Zauderer standard,” which is a standard that’s been applied in a lot of cases, many of them about health warnings on packaging or disclosures in advertising, kind of short, consumer-protective, mandated disclosures.

And to my mind, neither of those is right. Those metaphors both really miss something. The newspaper metaphor misses the consumer protection interest, and also misses this really important sort of self-governance interest in understanding how speech and public discourse are being shaped by platforms and shaped by people who are using or manipulating platforms. All of that is missing if you look at platforms as just newspapers.

And then conversely, this Zauderer standard does a good job of illuminating the interests of users as consumers, as if they were buying milk or soda pop, but doesn’t do a good job of illuminating either the user speech and information values and needs or the legitimate editorial concerns raised by the platforms. And I think there is a deep strain of potential actual burden on speech rights of the platforms and on their editorial policies kind of buried in the transparency provisions. And we can get into that more if you want to.

But what also just troubled me about both of these rulings is that they sort of blithely apply a single standard to half a dozen or more different legal obligations that each really deserve their own different First Amendment analysis. Because, for example, an obligation to publish your speech policies, which is one of the things that the Florida case said was okay, and that I assume is in the Texas law also, that’s one thing. And it’s quite another to have the obligation, as in a different provision upheld in the Florida law, to update users about any changes and not make changes more often than every 30 days. That means that if a brand new bad thing comes along, the Tide Pod Challenge or a new kind of abuse or harassment that the platform didn’t anticipate and didn’t have a rule against in their policy yet, they have to tolerate it for another month. They can’t exercise editorial discretion for another month because of this “transparency law”.*

So I just think there’s a real need to dive deeper on what these laws actually are, regardless of what First Amendment standard is applied in reviewing them. And in defense of both of those courts, none of the parties or Amici actually briefed these details. And so particularly on a preliminary injunction standard, asking a court to strike down a law when you didn’t really brief the problems with the law maybe is inappropriate and maybe it’s, in that way, reasonable that the court didn’t accept those claims.

Genevieve Lakier:

In defense of the 11th Circuit ruling, I thought it was interesting that the court did in that case strike down, I think, one of the provisions, one of the disclosure provisions, the one that required information about, I think, each ruling, as unduly burdensome, because the sheer amount of labor and effort required to satisfy that requirement, the court recognized, would really make it very difficult for the platform to keep going about its ordinary business. And so it would chill protected expression, whether that’s the speech of the user or the editorial discretion process. And so it struck that down.

And I think that is interesting. I think this idea that some … So I guess, stepping back, I think the really interesting theoretical question raised by the analysis in both the Fifth Circuit and the 11th Circuit cases, and this goes to Daphne’s point, is to distinguish what are those transparency obligations that are so bad that under no circumstance, in no world, can they be upheld? And what are those transparency obligations where maybe there’ll be some cases, like say a Tide Pod Challenge comes along, or it’s like a Nazi challenge, whatever it may be, that are really, really problematic? And in those cases, for whatever particular reasons, the obligation is very difficult to enforce or poses all kinds of problems to speech.

I think courts have to think about whether … I suppose I’m sympathetic to the idea, and the Fifth Circuit articulated this idea, as well as the 11th Circuit, that on the whole, we want to leave these kinds of challenges to the workability of these disclosure rules to as-applied challenges. Let’s kick it down the line. Let’s see what actually happens on the ground, and in particular circumstances. And I think the idea is that that will allow more nuanced analysis of how these different kinds of obligations apply in practice.

We’re going to be pretty wary of striking down disclosure obligations that don’t look on their face to be ridiculous. And it’s only when they’re so burdensome that it seems like there’s no way the platforms could comply with them that we’re going to strike them down. I think that was the approach taken by the 11th Circuit. And then the Fifth Circuit has this strong argument about how this should all be handled in as-applied challenges. And I’m quite sympathetic to that.

Evelyn Douek:

I think that the transparency obligations here are the dark horse. People aren’t talking about them as much because they’re not as sexy as the must carry, controversial-political-content stuff. But we have a remarkable amount of unanimity about the constitutionality of these in the lower courts now. They were upheld by the 11th Circuit. And I think the Fifth Circuit was unanimous on these provisions too. And there was some appetite for them in Alito’s judgment in May.

And again, I myself am somewhat sympathetic to the idea that maybe there should be discretion for platforms in what they want to moderate, but they should at least tell us what they’re doing and how they’re doing it. But I completely agree with Daphne that this needs to be really nuanced and context specific. And there are problems with a lot of this. And it’s kind of going unrecognized, and could result in some really, really bad law if we’re all sort of focusing on these high profile should-Trump-have-been-deplatformed arguments, while we have these really radical potential changes happening in transparency law on the side.

So what are platforms going to do? We now have this law upheld in Texas. Daphne, you’ve been sort of speculating about this. They have a couple of options. Obviously they’re going to appeal. They’re not going to just sit down and take it like, “Oh, okay then. Judge Oldham says we have to comply.” But what do they do in the meantime? Do they exit Texas? Do they purposely flood Texas with spam and say, “You wanted unmoderated content, here you go”? What are their different options, Daphne? What do you think they’re going to do?

Daphne Keller:

So I think that maybe there are like six categories.

Evelyn Douek:

Oh boy.

Daphne Keller:

Flood Texas, comply with the law, don’t comply with the law. Those are the main ones. But then I think there are a couple of other more nuanced ones that I’ll get to in a minute. So blocking Texas would violate Texas law, probably. So that’s interesting. But maybe there’s a world where they block Texas and Texas sues them, and then somehow they change the posture of the case or the forum for the case in a way that’s useful. I don’t think they’re going to do that. They have employees in Texas. They have long term commercial relationships with advertisers in Texas. Maybe they have data centers, I don’t know. Texas is hot. I wouldn’t put a data center there. But it would be hard to block Texas and disruptive to business.

They could flood Texas with all the terrible content that they seem to have asked for. That would be truly awful for the people in Texas who are the victims of hate speech and abuse and harassment, and who are currently at least somewhat protected from that by platform efforts and now would be unprotected. And then if YouTube allowed all the hate speech in, kind of like Parler used to, would they then be kicked out of app stores? Would advertisers stop running ads? There are all kinds of weird collateral consequences if you let the hate speech flow.

Complying, I think nobody knows what complying would even mean, including with the transparency obligations. And so I don’t see them bending over backwards to try to do that. They could just not comply and wait to be sued. That seems like a kind of viable option. The Texas law is set up to authorize sort of legal DDoS attacks, in the sense that any individual can sue over alleged violations of the must carry provisions. And whatever the outcome of that suit is, it’s not precedential if somebody else sues somewhere else in Texas, until, I think it must be the Supreme Court of Texas or some sort of higher authority, has resolved it.

And so it opens the gates to who knows what wacky lawsuits. But maybe wacky lawsuits are good for the platforms, to start building an as-applied challenge or building their case in the court of public opinion, et cetera. So I think that’s a pretty viable option.

The options that are more nuanced that I’m kind of interested in, one is semi compliance, meaning for example, keep taking down the worst of the worst speech and leave up more speech than they do now. That’s weird strategically, because then it means … Well, it means that then-

Evelyn Douek:

The Goldilocks approach.

Daphne Keller:

Right, about the worst of the worst speech. Well, but it tees it up. It makes people less mad at them, including Apple and their advertisers. And it makes it so that whoever does sue them in Texas are actually really bad speakers, and the cases are sympathetic for the platform. So maybe there’s a middle ground there.

And then the middle ground I’m most interested in, just because of my separate interest in interoperability and middleware and so forth, is to flood Texas with garbage by default, but give users an opt out to go back to the moderated experience. And there’s some language in the statute that kind of arguably makes that okay. And it sort of illustrates the problem with the Texas law by flooding everyone with garbage by default, while avoiding a bunch of the bad consequences of actually flooding everybody with garbage permanently.

Evelyn Douek:

So in the remaining time that we have left, I want to talk about what’s going to happen next, and obviously ask you both to give me very detailed predictions. I think it’s fair to say that anyone who can tell you what they think will happen next is probably trying to sell you a bridge. It seems pretty clear. The consensus is it’s going to the Supreme Court sometime soon.

Just a reminder for listeners, what happened at the Supreme Court last time was all a little bit surprising as well, when the court vacated the Fifth Circuit’s stay of the preliminary injunction. So Alito, joined by Thomas and Gorsuch, dissented. And Justice Kagan, without explanation, also would’ve denied the application to vacate the stay. So that’s a surprising coalition of justices. We don’t really know what’s going on.

Maybe I’ll just get you both to talk a little about this realpolitik question: what we might expect to see, what the court might be signaling, what’s at play here? How much does doctrine matter? What do we think the court is going to do in terms of trying to… Is this just politics all the way down, as people are increasingly thinking much of the Supreme Court is? Or do you think this is an area where there might be more restraint? Maybe, Genevieve, can we start with you?

Genevieve Lakier:

Well, this would be one way to … An optimistic take is that this is a great natural experiment for all the lawyers and legal scholars out there who really want to know how much doctrine matters. Well, we’re raising that question, because I think the cases … the Fifth Circuit does its best to say, “Oh no, no, no. The cases support the conclusion we reach.” But that’s not very persuasive.

And there are the conservative justices on the Supreme Court. I think most of them, if not all of them, pride themselves on being very strongly speech protective. They think of themselves, and I think maybe the school of conservative jurisprudence that they emerged out of thinks of itself, as being very strongly speech protective, in a way that is different from how the Fifth Circuit understands this term. And Kavanaugh, for example, has been strongly committed to this kind of property rights, free market conception of the First Amendment.

And so if doctrine matters, and I’m thinking of doctrine not as separate from politics, but as a means by which we make sense of our political commitments, I think Justice Kavanaugh, Chief Justice Roberts, they think that these values, the values promoted by the First Amendment, are really important, but the way you do that is the way we’ve traditionally done it over the last few decades, and so they should not be sympathetic to what the Fifth Circuit did here. Justice Thomas is clearly in a different position.

Now, conservatives don’t like the social media platforms, though. And so the question is whether or not, in this case, there’s going to be a willingness to deviate from established precedent. And the Fifth Circuit has done its very, very best. I think we should read the Fifth Circuit opinion as an attempt to provide the best possible argument for the Supreme Court to take up as a means to justify what it’s doing without saying, “We’re just turning our back on First Amendment principles.” No, no, no, we’re just applying the principles, the precedents, but to reach a conclusion that the conservatives want.

So I think there’s just this question. I don’t know. I cannot predict what’s going to happen. I guess my instinct is that Kavanaugh, Roberts, maybe Barrett, although I know less about her, are going to be not nearly as willing as the Fifth Circuit to reject the free market, property-based conception of the First Amendment that we saw in cases like Halleck and Hurley and Tornillo. And I also think that this opinion is not going to win any love from the liberals. So it’s very hard for me to see Justice Kagan, for example, endorsing the kind of reasoning in the Fifth Circuit.

An earlier view I had was that you might get an interesting ruling in Supreme Court if there’s a portion of the conservatives, Daphne talked about the two schools of conservative justices, and I agree. I think there’s a division between say Thomas and Roberts. So you could imagine cobbling together a majority from a few conservative justices, and then Kagan and some other sort of more regulatory liberal justices. But I think that no liberals are going to go along with what the Fifth Circuit is doing here. It’s not going to be satisfying to them. It’s not the kind of regulatory interpretation of the First Amendment they’re going to like.

And so I think it’s probably likely, although we live in very strange times, that the Fifth Circuit’s approach to the must carry provisions and the non-discrimination provisions is not going to win favor with the court. I think this is why I agree with you, Evelyn, that the transparency portion is maybe the sleeper part, because I think that’s much more complicated. I could see a Kagan or a Jackson maybe being much more willing to go along with upholding the transparency and disclosure mandates, but not the non-discrimination parts.

Daphne Keller:

I think another sleeper, realpolitik issue here is, when the Chamber of Commerce really notices this case, what are they going to say, and what will the conservative justices think about that? Because I think for both the must carry issues and the transparency issues, the implications of this ruling go well beyond platforms to other brick and mortar businesses. And I’m not sure how much that has been teed up in the Amici briefs so far. Certainly on the transparency part, I don’t think anybody has flagged that.

If these obligations, this sort of pervasive, ongoing tracking of complex systems and constant reporting of the details of those systems, if that obligation for platforms is permissible under Zauderer, and if the platforms, who are speech and editing companies, don’t have a First Amendment objection, then what is the standard for new pervasive reporting obligations for, I don’t know, Walmart to explain the labor conditions behind every gadget that they sell to customers? It just seems like it opens the door to a lot of mandates that we don’t have now and that conservatives don’t like.

Evelyn Douek:

Well, thank you very much. We’re going to have to have both of you back on for the Super Bowl. We all will get our beer caps and things ready. Thanks very much for your time. This is great.

Daphne Keller:

Thank you.

Evelyn Douek:

The slogan around these parts, which we’re really hoping to make happen, is everything is content moderation. And that means that the next episode could be about anything. Click, like, subscribe, follow, bookmark, obsessively refresh the feed every day to make sure that you don’t miss it. We are available on Apple Podcasts, Spotify and all of the usual places.

And now I’m going to say some words that you will probably block out in your head because at this point you tune them out like I do at the end of every podcast episode. But it really would mean a lot to me if you could rate and review this little show wherever you listen to podcasts to help get the word out there. And if you do have ideas for guests or feedback or topics that you want to hear on the show, I’m all ears. Please just shoot me an email. Thanks very much for listening. This has been Moderated Content from Stanford Law School.

*Correction: The 11th Circuit upheld most of the provision Daphne is referring to here (section 501.2041(2)(c))—including a requirement to inform each user of changes to its rules—but actually struck down the 30 day requirement.