MC Weekly Update 10/30: Warning, This Podcast Might Be Highly Addictive

Alex and Evelyn discuss the new Executive Order on AI, content moderation in the stack putting pressure on Telegram, the one year anniversary of Elon Musk buying Twitter, and a multi-state lawsuit against Meta for misleading young users about the addictive and harmful properties of its platform.

Show Notes

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

  • App store rules are restricting access to some Hamas-affiliated channels on Telegram, where content moderation action is rare, allowing terrorist organizations to share messaging. The restrictions are inconsistent, with some channels blocked only in the Google Play version of the app. – Clare Duffy, Brian Fung/ CNN, Kevin Collier/ NBC News, Wes Davis/ The Verge
    • It’s another reminder of the power of content moderation rules in the stack — at the infrastructure or distributor level, like app stores.

X-Twitter Corner

  • It’s been one year since Elon Musk flipped the bird (and struggled to carry a sink into Twitter’s San Francisco headquarters). Our original episode on this, “Musk Flips the Bird,” held up pretty well — especially the prediction that this would be very good news for Mark Zuckerberg.

Legal Corner

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Transcript

Alex Stamos:

Do you want to murder people in the deep future through your criticism of AI? Do you believe that capitalism is the only way to have any forward movement and progress in the human race? Then join us in Galt’s Gulch and help fight against the enemies of progress.

 

Evelyn Douek:

And welcome to Moderated Content’s weekly, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos, head enemy of progress over here.

Kicking off this morning: I think in a couple of hours, Biden is going to sign an Executive Order on AI that the White House has been rolling out this morning, and there have been a bunch of press releases about it. It’s a pretty big deal. It’s a sweeping order covering areas from cybersecurity to health, competition, privacy, immigration, microchips, education, housing, copyright, and labor. There’s a lot in there. And it’s been interesting to see the White House moving so quickly and aggressively on this.

Alex, you have been flicking through it today. What are your impressions?

 

Alex Stamos:

So one, it looks like Joe Biden is now the biggest enemy of progress.

 

Evelyn Douek:

Right.

 

Alex Stamos:

Overall, I think it’s actually a very thoughtful framework here. It goes further than I expected. It will be interesting to see, with everything going on with the Supreme Court and administrative law and Chevron deference and such, how much of this stuff will actually be doable by the executive branch without action by Congress. That’s an interesting open question.

It’s very national security-focused, which is what you expect because it came out of people who are attached to the National Security Council. I believe Ben Buchanan was one of the drivers of this, generally a professor, doing a stint in the White House, and he’s been having a lot of meetings with folks and previewing this and getting feedback.

A couple of things. There is a focus on the national security competitive aspects. There’s a focus on making sure that the United States knows about models that are being trained, and about access to large amounts of hardware from cloud providers by foreign governments (they really just mean China, but they’re framing it more broadly). There’s even stuff about microchips and such. But then there’s also a focus on some reasonable steps that you might want to take to encourage risk management.

It’s overall a very positive order about the benefits of AI. And so it is not, as you might expect, a screed against AI. It is not saying that the world is ending. It is all about real, practical risks. So none of this is about the kind of crazy existential risks, Skynet theories that have been promulgated by some people. All of the things they’re talking about are real things.

And so I’ve got to spend some more time going through it. The draft I have is like 111 pages, and it is pretty extensive. But overall, I think it’s very positive. I also think it’s going to… They’ve also kicked the ball over to Congress, because there’s a bunch of things in here. They’re saying that the Patent and Trademark Office should make some rules and such, but clearly we’re going to have to have laws passed to change intellectual property treatment and such.

 

Evelyn Douek:

Yeah, right. I think that’s totally right. The two things that you opened with, the fact that there are going to be limits on what the executive can achieve without legislative action (and that may be curbed even further by what the Supreme Court’s doing now), and the fact that a lot of the focus is on national security: those two things are related. The reason a lot of the focus is on national security is that that’s where executive power is at its highest.

And so yeah, I think you’re right. A lot of it’s going to be, the devil will be in the details and how they manage to deliver on this. A lot of it is urging Congress to pass privacy legislation and bring that forward. And then a lot of it is leading from the front, but it’s unclear how much that will change things. So I noticed that the order directs the Department of Commerce to develop guidance for content authentication and watermarking to clearly label AI-generated content, and instructs federal agencies to use those tools to make it easy for Americans to know when communications are coming from the government and when those communications are authentic. And the idea there is to set an example for the private sector and other governments and things like that, and that’s great, but there are going to be obvious, real limits.

That’s a very different proposition from requiring everyone to use authentication or watermarked content, which of course is well beyond the power here and would run into all sorts of First Amendment problems and things like that. So it’s walking that line between the limits of authority and what’s able to be done here.
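The authentication idea can be illustrated in miniature: a government sender attaches a tamper-evident provenance label to content, and anyone holding the verification key can check it. This is a toy sketch with invented names, using a shared-secret HMAC rather than whatever scheme the eventual Commerce guidance will actually specify:

```python
import hmac
import hashlib
import json

# Hypothetical shared secret; a real deployment would use public-key
# signatures so verifiers never hold a signing secret.
SECRET_KEY = b"demo-key"

def label_content(content: bytes, source: str, ai_generated: bool) -> dict:
    """Attach a provenance label: metadata plus an HMAC tag over both."""
    meta = {"source": source, "ai_generated": ai_generated}
    payload = content + json.dumps(meta, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"meta": meta, "tag": tag}

def verify_label(content: bytes, label: dict) -> bool:
    """Recompute the tag; any change to content or metadata breaks it."""
    payload = content + json.dumps(label["meta"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, label["tag"])
```

Under this scheme a verifier can confirm both that a message came from the claimed source and whether it was labeled as AI-generated; editing the content or the label is detectable, though nothing stops someone from distributing unlabeled content entirely.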

 

Alex Stamos:

Yeah. I think the part of it that is the weakest, and I think that’s partially just a limit of executive authority, is the labor part. He’s a Democratic president; as of, I think, today you have the strikes ending in Detroit, but this is against a backdrop of labor unrest across the entire country and Biden being attacked from the left. And so they had to say something, but it’s very weak, like, “The Department of Labor should do a report on the labor market effects of AI.” I think they’re going to be bad. It’s good for people who are really good at linear algebra and who have PhDs in AI. It’s going to be bad for people who have white-collar jobs that have resisted automation.

I spoke at a conference a couple of months ago. It was being held by the Westlaw people, by Thomson Reuters, and it was all general counsels, and I watched the CEO of Westlaw give this demo of their version of Copilot. So it’s built into Microsoft Word, but it’s branded both… It uses OpenAI core technology, and it’s co-branded Microsoft and Thomson Reuters and Westlaw. And he just typed into this live demo, “I would like an asset purchase agreement under the laws of the state of Utah,” and it just blew out a 10-page agreement. And he said, “Now add a 30-day negotiation period. Take out the non-compete, do this, do that,” and it added it in real time.

And unlike these other situations where lawyers have gotten in trouble for using GPT to write stuff, you could mouse over each of the paragraphs and it would tell you, “This is based upon this part of the Uniform Commercial Code, and this is based upon this appellate court,” and this and that, because it’s trained on the hundreds of years of material that they’ve been recording. And the general counsels lost their minds. It was like a Taylor Swift concert for these GCs, because you could see them racking up in their heads how many hours of outside counsel they pay $600 an hour for basic doc review and basic doc drafting and such.

But also, the thing that immediately went to my mind is, man, should my kids not go to law school? Because the entire model of training young lawyers in big law based upon document review for litigation, based upon looking through these kinds of contracts, drafting these contracts and stuff, that’s going to all go away. It’s all going to be AI, and the partners on top are going to do great because they can still charge $2,000 an hour for their human knowledge of how the human beings are going to act, but then they don’t need to have an associate draw something up if they can just tell Copilot, “I want this.”

And so there are going to be huge labor market impacts from this, and it’s going to hit white-collar workers: politically engaged, richer people who have not been hurt by globalization, who have in fact benefited from globalization and the automation away of blue-collar labor. And the downstream societal effects of that are yet to be seen.

So I think actually it’s a really good document. I commend the White House team that put this EO together, but there’s just so much more we have to do. I also like it because it is a stake in the ground: the United States is maybe saying we finally woke up and decided not to let Europe be the sole regulator of American companies. This is a very pro-company, very pro-tech strategy that also reasonably sees the downside risk.

Now, some people would say that only an enemy of progress would like this EO, but I don’t think that would be the right… The argument I would make against that is it is much better to have the US government, which is democratically accountable to its own citizens, regulate American companies than have Europe do it.

 

Evelyn Douek:

Yeah. Although hearing you just tell that story, Alex, has radicalized me into a real enemy of progress. We need to put the kibosh on this technology straight away because it is coming for my job.

Yeah, no, I mean, it was always clear. There were all these stories when ChatGPT was first let loose that it was hallucinating cases and lawyers were relying on it, and it was always pretty clear that that was a pretty temporary problem, that this is a kind of thing that you can just have the models check against actual databases of cases and authorities. And so I’m not at all surprised to hear that these demonstrations are happening, and it’s absolutely something that law schools are going to have to be thinking about, about how do we educate students in this world where being a lawyer is going to be a completely different proposition?

I always thought that the stressful part of tenure would be getting it, but it turns out it’s getting it before they get rid of all of the law professors in the world because they are totally replaceable.

 

Alex Stamos:

Oh, you still need law professors. It’s just you’re going to have a bunch of robots in the audience learning. You’re going to have to be training a variety of different large language models to be good lawyers.

 

Evelyn Douek:

Well, at least then they’ll actually be paying attention to me.

 

Alex Stamos:

Right, right. The robot lawyers don’t browse Instagram the entire time because they’re addicted.

 

Evelyn Douek:

That’s right. That is an excellent plan that we will come back to later in the episode.

But for now, content moderation in the stack and the ongoing stories that we’ve been covering on this show about the war in Israel and Gaza. Last week we talked about how Telegram was the home of a lot of Hamas content. Alex, you were talking about how that was where you were spending a bunch of time researching, because that is a place where Hamas is spreading a lot of its content, explicitly because of Telegram’s unabashed and pretty absolutist lack of content moderation policies.

Well, this week, Telegram experienced the pressure from a lever that we have talked about on this podcast many times, which is app stores. And the platform actually blocked access to the official channel of Hamas and the military wing of Hamas for Android users only. And it said, to a bunch of reporters, that this was because of Google’s app store guidelines and the pressure that the Google Play Store was putting on it to do content moderation.

And we’ve talked about this. This is an underexplored and underused lever of content moderation: if it’s not happening at the app level, we have seen app stores getting involved. But it runs into all of these sorts of problems about the lack of transparency. It’s not clear exactly what caused this. Why is it these channels and not other channels? What’s happening with Apple? All of these sorts of things where there’s not a lot of transparency into what’s going on. But given that you’re spending time on the app, Alex, I’m wondering what you are seeing and whether this surprised you.

 

Alex Stamos:

It is interesting, for a number of reasons. One, Google’s ahead of Apple here. We have generally always thought that the more aggressive company about using app store rules was Apple, but it turned out to be Google that was more aggressive here. It looks like the channels are generally official channels of sanctioned entities, so it falls, effectively, under material support to terrorism. And to clarify, it applies to Android users where the Android app was installed from the Google app store, not if it was sideloaded or came from a different app store.

And so one of the things Telegram’s done here is they’ve clearly instrumented their app to be able to inspect itself and to report up to the server side, “I am signed in the way that comes down from the app store,” or, “I am signed for sideloading,” and such. And this is actually a big deal in this situation, because a huge number of Android phones in the developing world, where you have people who support Hamas, and actually people in Gaza and such, are going to be non-app store, non-Google Play phones. A huge chunk of Android devices out there do not have Google’s software.
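The store-specific gating this describes could look roughly like the following on the server side. This is invented illustrative logic (the channel IDs and install-source strings are made up), not Telegram’s actual implementation:

```python
# Channels restricted under app-store rules (IDs are hypothetical).
STORE_RESTRICTED_CHANNELS = {"sanctioned_channel_example"}

def channel_visible(channel_id: str, install_source: str) -> bool:
    """Hide store-restricted channels only for Play-store builds.

    install_source is what the client self-reports after inspecting
    its own signing/installation: "google_play", "sideloaded",
    "third_party_store", and so on.
    """
    if channel_id not in STORE_RESTRICTED_CHANNELS:
        return True
    # Builds not distributed through Google Play aren't bound by its rules.
    return install_source != "google_play"
```

The key design point is that the restriction keys off the distribution channel of the client, not the user, which is exactly why sideloaded installs are unaffected.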

I had to deal with this at Facebook. Even years ago, this was true. It creates all these interesting challenges, because you sometimes have these $50 phones, which is incredible, that you have this Chinese OEM that can make a phone affordable to somebody who only makes a couple hundred dollars a year, and that’s great for them, but it’s usually four versions of Android behind. It never gets patched. It creates all these interesting security problems. And so as an app vendor, one of the things you have to do (one thing we did at Facebook) is ship your own TLS code and such, because you could not trust the cryptographic code that was on those devices. Also, because sometimes it’s backdoored by the local government.

In any case, Telegram’s obviously thought through a bunch of this stuff. And so the number of people who are going to be affected by this, I think, is actually quite small. It affects me, because my burner phone that I used to be on Telegram is a real Pixel, so I now have to go buy a crappy phone, or I have to jailbreak it and do some things and install a different Telegram so that I can do my work. But the number of people who are actually taking Hamas orders or communicating on Telegram in operational support of Hezbollah or Hamas is going to be minimal. Almost all those people are going to be sideloaded.

One thing: I heard this interesting story from user research. Back in the day in Egypt, one of the common things people would do is take their Android phone to the booth of the phone guy, and he would plug it in, update your Android, and then update all your apps. He keeps, on his local PC, a collection of WhatsApp and Telegram and all of the popular apps, and that’s the only way anything gets updated. And so this was actually an interesting challenge for a number of platforms. WhatsApp, if your app is more than 30 days old, I believe, will basically message you and say you can’t participate anymore, because stale clients create all these interesting security problems.
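The expiry mechanism described here (the 30-day figure is the speaker’s recollection, not a confirmed spec) amounts to a simple build-age check, sketched hypothetically:

```python
from datetime import date, timedelta

# Assumed cutoff, per the anecdote above; not a documented WhatsApp value.
MAX_CLIENT_AGE = timedelta(days=30)

def client_allowed(build_date: date, today: date) -> bool:
    """Refuse service to clients whose build is past the cutoff,
    forcing users on stale, potentially insecure versions to update."""
    return today - build_date <= MAX_CLIENT_AGE
```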

 

Evelyn Douek:

Yeah. Oh, that’s super fascinating. I didn’t know any of that background. And also, it just underlines again though the lack of transparency in this situation because we actually have no idea how many users and things like that are being affected by this decision. It’s all completely opaque.

 

Alex Stamos:

Right. It’ll be a much bigger deal if Apple does it, because it is much, much, much harder to jailbreak an Apple device. It is extremely rare for people to run non-official builds. In fact, what I would expect to happen is, if Telegram follows suit for their Apple versions, they’ll make no distinction between jailbroken or not, because it won’t make sense for them.

 

Evelyn Douek:

Yeah. And it’ll be interesting to see. Often we have seen in the past that the app stores move in lockstep, and it’s pretty rare for them to come to different decisions; certainly most of the high-profile cases that we’ve talked about in the past have been everyone taking shelter behind everyone else and making the same kinds of calls. But so far, no news from Apple, even though this has been a couple of days now.

Okay, moving on to, we haven’t done this in a while actually, but let’s go to our Twitter corner.

All right. It’s nice to be back here, I guess, sort of. So the reason why we’re reprising the sad trombone today is that this week marked one year since Elon Musk flipped the bird and took ownership of Twitter, or then-Twitter now X, and there-

 

Alex Stamos:

Does it feel like a year to you, Evelyn, or does it feel like a lifetime? I feel like we’ve lived… This is the Star Trek: The Next Generation episode where Picard lives an entire lifetime while he’s unconscious. That’s what I feel like has happened since Elon Musk took over. I’m awakening, but I can still play the flute, and I miss the family that I lived with in my imaginary world.

 

Evelyn Douek:

It feels like a long time ago, a portal to another world back in the good old days where there were blue birds flying in the sky and tweeting around. And here we sit, one year later, haggard and run down.

It was a big week for us. I mean, a big week for the world, obviously, with Twitter being acquired, but it was also a big week for us because it was our first emergency edition of this very young podcast that we had just started. And so I went back and read the transcript of that episode this week, and actually, I’ve got to say, Alex, it feels like it held up pretty well. We talked about how Musk was coming in saying that he was going to liberate this app and run it as a free speech absolutist, but that inevitably he was going to run into regulatory problems and find out, like everyone else, that he’s not above the law.

And that is exactly what we’ve seen. We’ve seen the FTC take action. We’ve seen the DSA and Europe taking action. We’ve seen all sorts of regulatory problems, not to mention overseas, which we’ll come to as well.

You predicted, in a statement that I think has held up especially well, “They have a real fear that the contracts go away and then eventually the APIs will go away.” And of course, that has happened: a massive change in the past year has been the shutdown of the Twitter API, and therefore the lack of visibility into what’s going on at Twitter.

And then we both predicted that this was going to be very good news for Mark Zuckerberg, and I think we were mainly thinking about the reputational cover that Musk was going to give him: the fact that he was going to be the person in the headlines all the time, that all the content moderation controversies were going to be Musk controversies from then on. And it has been even better than we could possibly have imagined, of course, because now we sit here one year later and Threads, which didn’t even exist back then, is doing extremely well as a competitor to Twitter. So some good predictions and most-

 

Alex Stamos:

Right. And nobody complains anymore. Even if Mark Zuckerberg makes a mistake, people say things like, “Wow. Well, this is hard.” It’s unbelievable. Things have completely, totally changed. Mark Zuckerberg must wake up every morning and pray to a little shrine of Elon, right? Maybe it’s like a solar panel off of a reentered Starlink satellite or something, where he lights incense and says, “Please God, let Elon Musk continue to be the CEO of Twitter for as long as I’m alive, because it has made my life so much easier.”

 

Evelyn Douek:

Exactly. The attitude is pretty much, “Oh, thank you, Mark, so much for trying to spin up this app for us so that we have a new place to recongregate. I know it’s really hard work, and if you have time, no hurry, but if you do have a little bit of time, if you could introduce this feature, it would make it so much better for me. We would really appreciate it,” which is a very big tone shift.

 

Alex Stamos:

It’s like he’s become the CEO of YouTube. That’s the feeling he gets every day.

 

Evelyn Douek:

Exactly. This is what Susan Wojcicki felt like. It is so much better this way. Yeah.

 

Alex Stamos:

Right.

 

Evelyn Douek:

I mean, what’s your big takeaway from the year?

 

Alex Stamos:

Yeah. I am not shocked that we were right. It would’ve been nicer for the world if we had been incorrect, but I think it has reinforced something that we’ve talked about multiple times, which is that when you invest your time in a social network, one of the things you’re effectively buying with your time is entry into a community. And the community is multiple things. It is the people who are on it. It’s also the people who are not on it. So community is both a positive and a negative thing, and Musk has gotten rid of that second part. He decided, “Oh, we’re not going to eliminate people from this community and that will… You could block them if you want.”

Now, he talked about getting rid of blocking; he backed off of that. I think he probably figured out how many people he was blocking himself, as were a bunch of his supporters. But his theory is you can let everybody in and you can block later. And that has a real effect on the community, when you let all of these known abusive accounts back in, when you loosen up hate speech rules, and especially when you get rid of the teams that look at organized manipulation.

That is the other thing that is being massively… Yes, there are the actual white supremacists and such that are on the platform, but there’s also a ton of, if you look at what’s going on right now with Israel and Hamas, a ton of blue-check-mark accounts that are clearly being driven by politically motivated actors to spread disinformation. And that is because they got rid of the entire team whose job it was to enforce against it; they haven’t really gotten rid of the rules against that kind of stuff, but they got rid of the team that enforced them. And through all the blue-check-mark decisions they’ve made, they’ve made it incredibly easy.

And so by bringing back the known abusers, and then also effectively, if not officially, opening the door for troll farms to come back, the quality of the community has gone into the toilet, and it has become a very bad place to be.

I remarked on this. I made a stupid little joke on Threads about looking out over the Atlantic Ocean, a reference to Cal and Stanford being part of the Atlantic Coast Conference. If I did that on X, I’d literally get death threats. People would say… And a bunch of it would be automated, because anything I post gets a bunch of automated responses. I was actually tagged in a congressman’s post on X, and within seconds, all of a sudden, he’s getting the same crap. There’s no way it’s humans. It’s got to be automated. I see it all just because I was tagged in this tweet. And then on Threads, I just get really earnest people correcting me and telling me that’s the Pacific Ocean. “Oh, I didn’t realize that when I was last doing the SF to Farallones race, but yes, thank you.”

But that’s fine. I’ll deal with super earnest middle-aged boomers who think that it’s their job to make sure that everybody gets the Atlantic and Pacific corrected. That’s a community I don’t mind being part of because I can just block those people and ignore them, whereas X has moved beyond the part of where curating your own experience there is possible at all.

 

Evelyn Douek:

Yeah. No, I’m never going out on a boat again with you captaining, Alex, if your ability to read a map is that poor. That’s pretty shocking. Even with my terrible US geography, I knew that one. Yes.

And I mean, it’ll be interesting to see. All of the reporting this week in the major outlets that had stories about the one-year anniversary of Musk acquiring Twitter talked about the declining quality and also usage; the Washington Post had statistics showing that the number of people actively tweeting has dropped by more than 30% over the past year, that the platform is hemorrhaging advertisers and revenue, and that all of that is down. And so it’ll be interesting to see what happens in a year’s time, whether it just continues its slow, steady decline as the platform basically crumbles.

There were these moments over the past year where people expected it to break completely overnight as a result of laying off so much of the engineering team. And of course, that never happened. But it’ll be interesting to see whether Musk tries to reverse this trend. He did tweet this week that any posts that are corrected by Community Notes will become ineligible for the revenue share, because, it turns out, the idea is to maximize the incentive for accuracy over sensationalism. So it seems like he had this light-bulb moment where he might have realized that the incentives on his platform were not incentives for accuracy over sensationalism. I mean, who knew?
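The announced rule reduces to a simple eligibility filter over posts; here is a toy sketch with an invented schema (X’s actual payout pipeline is not public):

```python
def revenue_eligible(posts: list) -> list:
    """Return IDs of posts still eligible for the revenue share:
    anything flagged as corrected by a Community Note is excluded.

    posts: list of dicts with an "id" and an optional
    "community_noted" flag (hypothetical field names)."""
    return [p["id"] for p in posts if not p.get("community_noted", False)]
```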

 

Alex Stamos:

He’s so close, so close. He always gets really close to the truth, and then all of a sudden he’s tweeting a picture of Iran surrounded by US bases that’s effectively just straight-up Russian propaganda and completely untrue. Yeah, it’s kind of amazing. He’ll say something like that and you’re like, “Oh my God, he’s about to get it,” and then he’s like, “Russia Today told me this thing, and here I am retweeting it.”

 

Evelyn Douek:

Yeah, listeners wouldn’t have seen, but I literally put my head in my hands over that one. I mean, this guy still manages to tweet the most reprehensible terrible things. And one of the pleasures of this year has been his increasing irrelevance as the story of the platform has basically become monotonous. Oh, there he is, doing something stupid again, and we don’t need to keep covering it, but it’ll be interesting to see what happens over the next year as well.

Okay, moving on to our legal corner. Thank you. So Colorado and California led a joint lawsuit, filed this week by 33 states in the US District Court for the Northern District of California, alleging that Meta violated consumer protection laws by luring in youth with addictive features and making misleading statements about whether those features were manipulative and whether the platforms were safe, and also alleging that Meta had breached its obligations under the Children’s Online Privacy Protection Act, COPPA, by unlawfully collecting the personal data of its youngest users without their parents’ permission.

Now, this is a 233-page complaint. It is also extremely heavily redacted, so it’s hard to get a sense of exactly how much evidence the attorneys general have in this case. So, for example, there’s this statement in the complaint that these harms are pervasive and often measurable, and one of the key questions in this lawsuit is exactly that: are the harms measurable and are they pervasive? And then there are six pages of redaction straight after that statement, which is presumably where the evidence establishing that claim is laid out.

And so it’s hard to get a sense of exactly how strong this case is, and maybe litigating this issue is a good way to air some of the questions that have been the topic of a lot of public debate. But it is fair to say that it would take a pretty radically different First Amendment landscape for this to succeed, and a win would also be a pretty radical finding given the current state of public research, which doesn’t support these extremely blanket findings about the effect of these features on youth. And the features that they’re talking about are pretty standard across a lot of different kinds of media and platforms: things like algorithmic feeds, push notifications, and ephemeral content, which instills in users a sense of FOMO, and things like that.

So it’s hard to get a sense of how strong it is, but this is the kind of lawsuit that does reek more of political posturing, of trying to get all of the headlines that it did in fact get this week, than of really being driven by the evidence.

 

Alex Stamos:

Yeah. So I can’t really speak as to the likelihood of its winning. First off, a legal question: why is it so heavily redacted? What is driving the redactions, and will they be lifted during this process for the public?

 

Evelyn Douek:

Yeah. Ultimately, you would think so. I actually had exactly the same question, and I’m not sure of the answer, because it’s so heavily redacted that, in certain cases, it’s redacting things that seem to be public statements. Yeah.

 

Alex Stamos:

I mean, this feels more redacted than the Trump indictment that references war plans with Iran, right? Of actual classified documents. These leaked docs are not classified. It just seems a little odd.

 

Evelyn Douek:

Yeah, no, I don’t actually have the answer for you. It’s a great question.

 

Alex Stamos:

Okay. So I can’t speak as to what’s going to happen here. I mean, clearly a lot of people are addicted to their phones. What the responsibility for that is, for society overall building these things versus individual products, and how much did they know, and such… I mean, I think it’s going to be a really tough legal fight.

I think in this case, there are two things that are problematic for me. The first is that if this were just about addiction overall, it’d be one thing, but it’s specifically about children. And whenever you talk about children and safety online, the biggest problem that companies always face is that they do not have verified identity for individuals. And we’ve dealt with this over and over again in dealing with the laws, dealing with pressure put on them by FOSTA-SESTA and such, and COPPA especially. Companies have a really crappy system where they say, “How old are you?” And you say, “I’m 117 years old,” and they’re like, “Wow, you look fantastic for 117. Welcome to Instagram.”

And yes, everybody knows it’s a joke, but the problem is that the alternatives are pretty bad and often very authoritarian. If you think through the actual alternatives for knowing people’s ages to a level of accuracy that you would be willing to accept if you’re facing this level of legal risk, you eventually end up in what the British attempted, which is that you have to show ID at a local store to get a card that allows you to log in (which all fell apart), or the Chinese model, where every internet connection and every SIM card is tied to a government ID. And I don’t think that’s very compatible with the ideals of the US or the EU, honestly, or most of the Western world. People would reject that kind of idea and that level of verification of age.

So whenever you talk about any of this kind of stuff and you talk about fixing it, the problem is when you ask people, they’ve never really thought through, what do you have to do to have the level of age verification necessary to truly, truly treat children differently?

And then the second thing on all of this is what’s clearly driving it: why are they suing Meta? If you’re talking about young people, as somebody with two teens and a pre-teen in the house, Meta’s products are, I’m not going to say irrelevant, but not the most important for them. Out of all the kinds of electronic media they consume, the one that worries me the most right now is TikTok. The reason they’re suing Meta is that there’s a bunch of leaked documents from Facebook that talk about things like addiction.

Those documents exist because, after the era when I was there and we found some bad stuff, and people started really caring about what Facebook calls integrity but everybody else calls trust and safety, the company hired hundreds and hundreds of quantitative social science researchers, data analysts, and experts in different kinds of abuse, and created a huge corpus of work studying the effects of the platform and how it could be better.

People can now see this. It was very quietly announced, but your old colleagues at Harvard’s Shorenstein Center have finally released lightly redacted versions of the leaked Facebook documents at fbarchive.org. You have to register for an account, but it’s not a big deal, and I strongly recommend that anybody who listens to this podcast and is interested in this area go look through the actual documents themselves. What you’ll find is that, taken as a whole, they tell a very different story than what you read in the media: they show hundreds of people who care about this stuff and are trying to do their best. And the real effect of both that leak and maybe this lawsuit is that no company will ever again do this kind of research.

Facebook has already laid off a bunch of those people in these last rounds of layoffs. They used the economic situation as an excuse, but clearly they also wanted to clear the decks of anybody writing things that could be dangerous in these lawsuits. And I just don’t see that as a positive way forward, because realistically, doing really good research on the effects of social media, and on how you design it in ways that are better for children, is almost impossible without the platforms.

And if Congress is not going to act on PATA or any law that would require transparency and cooperation, and the only regulation we have is through lawsuits like this, it’s going to end up in a negative place. I think we’re going to look back at this period and say things actually got worse, because TikTok, and every company that comes after TikTok, will never, ever, ever, ever, ever hire somebody who’s going to write down in an internal memo, “Our product is addictive.” They’ll never do it. And that doesn’t make the world better. It makes it worse.

 

Evelyn Douek:

Right, yeah. I think there are a couple of separate questions here: the strength of the lawsuit, but then also whether this lawsuit is going to make the world better and actually achieve its aims. And of course, the obvious response, like you said, is, well, if you’re concerned about youth and social media or addictive properties, why are you suing Meta rather than the other sites? Which, like you said, TikTok, but I’m not going to let YouTube off the hook here as another place where users rack up a lot of screen time.

 

Alex Stamos:

You’ve never talked about YouTube.

 

Evelyn Douek:

Yeah, I know. It’s just-

 

Alex Stamos:

No, but YouTube is number one in screen time. It’s number one overall, and probably for teenagers too; I’m not sure whether TikTok or YouTube is bigger there. We’ll actually try to find out which one is bigger in hours spent. It’s definitely not a Meta product.

 

Evelyn Douek:

Right, and not to mention-

 

Alex Stamos:

It’s definitely not VR, right?

 

Evelyn Douek:

Not yet. But let’s see.

 

Alex Stamos:

Yeah, sure. Oculus 3’s coming out. No problem. They’re on it.

 

Evelyn Douek:

But video games more generally, also a massive place where youth spend a lot of time and are incentivized to spend a lot of time.

But of course, on the question of why they’re going after Meta, or why they’re starting with Meta, I don’t think there’s anything intrinsically wrong with lawsuits proceeding one case at a time. Again, this goes to what’s behind the redactions and the kind of evidence the AGs might have that makes this a stronger case than others. But I think you’re absolutely right that, more broadly, we need evidence that doesn’t depend on particular leaks, so that we’re not just going after whichever company we happen to have evidence against rather than solving the problems where they arise most acutely. And that’s going to require legislative action. I think you’re right that that’s the most important thing: getting transparency so these questions can actually be researched in a comprehensive, consistent way rather than this ad hoc approach.

 

Alex Stamos:

Right. If you’re going to outlaw some kind of growth-hacking stuff, I think that’s great. But one, it should be based on real evidence that doing it is bad for people. And two, it should be fairly applied. Saying, “We don’t like this one company, so we’re going to destroy all of their economic prospects among young people while we ignore these other ones for four or five years and get around to them eventually” is a pretty dumb way of doing it, if your goal is actually making things better and not just getting reelected.

That’s the other thing you always see here: it’s both Democrats and Republicans. And it comes right back to this, especially for the Dems who are part of it: the Republicans suing alongside you don’t really care about your issues. They are angry because these companies did what you wanted and finally cracked down on disinformation around elections. That is why Republicans are angry, and they’re looking for any way to punish it. So the Democrats who are looking to make their Senate run off of suing Meta, and doing so alongside folks from Texas, need to realize that they are being used for a political project meant to punish these companies because the companies did the right thing around January 6th.

 

Evelyn Douek:

So this is going to take a long time. Don’t expect any answers anytime soon, and I’m sure we’ll keep checking in on this as it proceeds, but I think we are in for many years of stories about this lawsuit, by which time I’m sure Meta will still be the most important platform for youth, as it is today.

Okay. Speaking of online safety, terrible news this week: the Online Safety Act has finally become law in the UK. Terrible because it means I can’t continue to completely ignore it, and I’ll probably have to try to get my head around this 255-page piece of legislation. It’s been in the works for years, and I tried to follow it quite diligently when it was first proposed, because the UK is an extremely important jurisdiction and it’s an interesting piece of legislation. But then it kept getting changed, with so many different amendments; it seemed to have momentum, and then it didn’t. So at some point I checked out and said, “All right, I will only check back in once this thing actually becomes real.”

Well, sadly, that day has arrived, and I think this is something we should all get our heads around and start paying attention to more seriously. It’s a pretty comprehensive piece of legislation in a very important jurisdiction, and it’s going to have some pretty marked effects, I think, not just around transparency but also around the operating models of these platforms. So it’ll be one to watch.

 

Alex Stamos:

Yeah. And earlier versions of it had some really problematic stuff around encryption, around age verification. So it’ll be interesting to see.

I’ll just throw out a call here. If you are a listener and you’re a UK legal expert, if you’re a law professor in the UK, if you’re a solicitor, if you’re somebody who studies this kind of stuff, we’d love to have you on as a guest to talk about it, because it’s got u’s in “color” and it’s very hard for me to read, so it’d be great if we could find an expert.

 

Evelyn Douek:

Ah, my eyes! It makes me feel like home. It’s great.

 

Alex Stamos:

Right, right. If somebody wants to explain to the colonists how the mothership was able to actually solve this problem, we’d love to hear from you.

 

Evelyn Douek:

Excellent. Sounds good. Yes. Second, that call.

Okay. And some further reading for people this week, Alex, in the past week or so, you published with a co-author, “A Guide to Running Your Own Mastodon Instance and the Trust and Safety Issues That Arise.” So tell us a bit about that.

 

Alex Stamos:

Yeah, so all the credit here goes to Sara Shaw, one of our fantastic master’s in international policy students at Stanford. She did a great job giving an overview of the critical trust and safety issue areas these days, CSAM, terrorism, and the like, and then asking how each of these is affected by Mastodon.

Some of it builds on the work that David Thiel and others have already done around child safety, and she brought in that lens and, I think, set the stage for the fact that there are great things about distributed social media, but one of the downsides is that some problems that are much easier to deal with when you have a big, centralized, for-profit company are going to become real challenges, including legal ones.

So under 18 U.S.C. § 2258A, there are requirements for providers of Mastodon services around child safety. From our earlier work, we think pretty much every major Mastodon instance is in violation of US law right now, because they’re hosting CSAM. They’re taking it down, but when they take it down, they do not report it to the CyberTipline, and there’s no de minimis standard anywhere in the law for what counts as an electronic service provider. So she talks a little bit about that, and about the material support for terrorism laws and the complexity admins face there.
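For a Mastodon admin, the gap Alex is describing is the step between taking content down and reporting it. Here is a minimal sketch of that logic; the `HypotheticalNCMECClient` class and its `file_report` method are invented stand-ins for NCMEC's actual CyberTipline reporting service, and this is illustration, not legal advice:

```python
from dataclasses import dataclass


@dataclass
class Removal:
    """A moderation takedown event on an instance."""
    content_id: str
    category: str  # e.g. "csam", "spam", "harassment"


class HypotheticalNCMECClient:
    """Hypothetical stand-in for a real CyberTipline API client."""

    def __init__(self):
        self.reports = []

    def file_report(self, content_id):
        # A real client would submit a structured report to NCMEC here.
        self.reports.append(content_id)


def handle_removal(removal, ncmec):
    """Take content down; if it was CSAM, also file a CyberTipline report.

    Section 2258A has no de minimis threshold, so every instance of CSAM
    is reported, not just repeat or large-scale cases. Returns True if a
    report was filed alongside the removal.
    """
    if removal.category == "csam":
        ncmec.file_report(removal.content_id)
        return True
    return False
```

The point of the sketch is that removal alone does not satisfy the statute; the reporting call has to be wired into the same takedown path, which most volunteer-run instances have never built.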

Not a lot of solutions, but if you’re a Mastodon server admin, I do recommend reading it. It’s at io.stanford.edu, one of the top things there.

 

Evelyn Douek:

Excellent. Okay. And then we haven’t had a sports update in a few weeks now because it hasn’t felt totally appropriate, but the New York Times had some reporting this week that even as Twitter usage declines, sports Twitter is surviving because it just hasn’t made the jump yet to any of the other platforms. So I think our sports update is actually a public service: we are providing a place for people hungering after their sports updates to get that hit without having to go to the Twitter platform.

So over to you, Alex.

 

Alex Stamos:

Right. Big college football weekend, a fascinating weekend. I was at the Cal game, Cal versus USC, perhaps the last time USC plays in California Memorial Stadium for years, because next year, unfortunately, the two universities split up, even though they’ve played every year since 1912, except during the World Wars, and back at the start they played rugby, not football. And I believe they weren’t the Trojans back then. They were the USC Methodists, which is a little less intimidating than the Trojans.

I do like to point out to people as a Greek that the Trojans lost the Trojan War. This is, in fact, a question on my final for… I’ll give you a little hint, any of our students who are listening, it is the Greeks who won the Trojan War, not the Trojans.

But unfortunately, the Trojans did win on Saturday after Cal led almost the entire game but had these crazy turnovers, and USC won by one point in one of the craziest college football games I’ve ever seen in my life. There was also almost an upset for Stanford against Washington, which was ranked number five coming into the weekend. Stanford got very close; I don’t think they ever led, but they tied it at a couple of points. Stanford and Cal have both had very hard years, and it would’ve been really cool to see them get revenge on these two teams that are going to the Big Ten and have relegated Stanford and Cal to the Atlantic Coast Conference. Like I said, totally the coast I see when I look out over the rim of California Memorial Stadium, that beautiful Atlantic Ocean. It would’ve been nice, but it didn’t happen.

And so, because those victories did not happen, I have to be happy about the Sacramento Kings defeating the Los Angeles Lakers. Professional basketball season has started, so we’ll be swinging our sports coverage away from college football toward basketball. This is it. This is the year for the Sacramento Kings, who finally broke their streak of missing the playoffs last year. Young team. They brought back all the key folks. We beat the Lakers. I hate the Lakers. If the Staples Center fell into a crevasse during the next big earthquake in Los Angeles, that would be fine with me. So the defeat of the hated Lakers was a big victory, and I expect a lot out of the beautiful Sacramento Kings this year.

 

Evelyn Douek:

Excellent, okay. Something to look forward to, although I’m still hung up on the fact that these teams used to play rugby and then switched, which means they voluntarily chose not to keep playing rugby and to play this other sport instead.

 

Alex Stamos:

Oh, they still play rugby. It’s just not the thing that gets the attention. In fact, Cal has one of the best rugby teams in the world; they’re national champions over and over again. You look at the rugby team and they recruit Aussies and Kiwis and guys from Tonga. They go out and pull people from all these big rugby countries and bring them to Berkeley, which is crazy. But they still play. It just isn’t on TV as much.

 

Evelyn Douek:

There you go. Great piece of trivia.

All right, and so with that, this has been your Moderated Content Weekly Update. The show is available in all the usual places, including Apple Podcasts and Spotify. And show notes are available at law.stanford.edu/moderatedcontent.

This episode wouldn’t be possible without the research and editorial assistance of John Perrino, Policy Analyst at the Stanford Internet Observatory, and it is produced by the wonderful Brian Pelletier. Special thanks to Justin Fu and Rob Huffman. Talk to you next week.