Show Notes
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
- How else would Elon Musk decide to reinstate former President Donald Trump’s account than a Twitter poll? Okay, well maybe the content moderation council he proposed to deal with reinstatement decisions. – Faiz Siddiqui, Drew Harwell, Isaac Arnsdorf / The Washington Post
- Musk’s mind is also made up on conspiracy theorist Alex Jones, whose account will not be reinstated on the platform. – Brian Fung / CNN
- Former Twitter trust and safety lead Yoel Roth penned a New York Times opinion piece on why he left Twitter and the influence that app store operators have on content moderation. – Yoel Roth / The New York Times (commentary)
- The EU might just scare Musk straight. After the Financial Times reported the headline “Elon Musk’s Twitter on ‘collision course’ with EU regulators,” European Commission Executive Vice President Margrethe Vestager responded that “We are never on a collision course with anyone because we consider ourselves a mountain.” – Javier Espinoza / Financial Times, Silvia Amaro / CNBC
- Mastodon might not be the paradise we hoped we could toot freely and safely in. Content moderation is hard, and there’s less control or quality assurance in a federated model, as Block Party CEO Tracy Chou knew all too well even before she had a post blocked and began facing torrents of harassment. – @triketora, @mmasnick
- A Mastodon server administrator is deciding who is a journalist, while other server operators block those verified journalists from being seen on their “instances.” – Mathew Ingram / Columbia Journalism Review
- Meta “has fired or disciplined more than two dozen employees and contractors over the last year whom it accused of improperly taking over user accounts, in some cases allegedly for bribes.” – Kirsten Grind, Robert McMillan / The Wall Street Journal
- FBI Director Chris Wray testified that TikTok poses a national security challenge for the United States because the Chinese government may be able to access extensive data collected by the app or even use recommendation algorithms to push the country’s influence operations on users. – Chris Strohm, Daniel Flatley / Bloomberg News, David Shepardson / Reuters, Suzanne Smalley / CyberScoop
- Sport ball is happening in Qatar “without controversy,” and Meta is using the moment to highlight its recently introduced anti-harassment features on Instagram to block or limit offensive messages aimed at players and encourage fans to think twice before sending potentially abusive content. – Jess Weatherbed / The Verge, Meta
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Transcript
Alex Stamos:
I’m trying to find the right plug so I can plug into the thing.
Evelyn Douek:
The high tech Moderated Content. We just need to find the right plug to plug it into the thing.
Alex Stamos:
Here. How’s this sound?
Evelyn Douek:
Oh, that’s so good. That’s the best.
Welcome to the weekly news hit episode of Moderated Content, with myself, Evelyn Douek, and Alex Stamos. Alex, we are basically an Elon Musk podcast right now. I hate it, but we have to give it to the guy. He is successfully managing to keep himself in the spotlight and give us plenty of opportunity to talk about a wide array of trust and safety issues. Full employment program for lawyers and podcasters, it seems.
Alex Stamos:
Yeah. And I mean, it only costs $44 billion to become the main character of the day forever.
Evelyn Douek:
Yeah, right. I mean, in part I was surprised that he reinstated Donald Trump’s account because now he’s going to have… If Trump comes back, he will have to share the spotlight, which will be very hard-
Alex Stamos:
Right. But before we talk about Elon, we now have to play our official Elon theme. Right?
Evelyn Douek:
Right. Exactly. Here it is. The Elon Musk segment.
Okay. So this week on Elon Musk’s content moderation dumpster fire, where to start? I mean, after months of complaining about how Twitter is full of bots, Musk reinstated Trump’s account on the basis of a 24-hour Twitter poll that he ran, where the yeses edged out the nos. I mean, what to make of this?
Alex Stamos:
Right. So, one, Musk spent all summer making legal arguments, maybe you can explain the [inaudible 00:01:34] to folks at some point, that Twitter has a huge bot problem. He has some legitimate complaints, in that he was probably the person most followed by bots, and everything that he tweets, continuously before and after his takeover, is flooded with spam.
Now what’s happened is he has effectively fired or caused the resignations of a huge portion of the security team and the integrity teams that work on anti-bot stuff, so the people who used to do this work are not there anymore. From my perspective, things have gotten a lot worse in the last couple of weeks. Especially if you discuss Musk, the number of accounts that come after you that are only days or weeks old is pretty incredible.
And in that context of a huge bot problem that he has actually just made worse, he decides that the way he’s going to make this decision is a poll that anybody on Twitter could vote in, not just verified users or even Twitter Blue subscribers. Which is kind of surprising, because this seems to be the kind of thing that, if I were him, I would’ve used to juice my $8-a-month subscription: wait a couple of weeks until they could have implemented an $8 requirement to vote in the poll, which would probably have made him a decent amount of money. Instead, everybody on the planet could vote on whether Trump comes back or not, whether their account is minutes old or 10 years old. So clearly not a scientific poll.
Evelyn Douek:
Right, for just $8 you could participate marginally in the future of our public sphere.
Alex Stamos:
I mean, that would’ve been one option, but in this case you didn’t even need the eight bucks.
Evelyn Douek:
Right. And he kept tweeting Vox Populi, Vox Dei, the voice of the people is the voice of God, but in actual fact it’s the voice of the bots that is the voice of God, based on how this poll probably played out.
I mean, there is something to be said here to all of these people who think, and we can come back to this when we talk about Mastodon in a minute, let’s just democratize content moderation. Give the power back to the people. Let’s just put all of the power in the hands of the users. That’s not as easy as it sounds, and it also may not come up with the solutions that you necessarily want.
So I do think content moderation by Twitter poll, probably not the answer that we’ve been searching for. I wonder if this is how he’s going to handle it if Trump does come back, if he’s just going to do a referendum, a plebiscite on every single Trump tweet to work out does this cross the line or not? I don’t know.
Alex Stamos:
I mean, he promised a content moderation council. He didn’t say that anybody with 10,000 bots was going to be a significant voting member of that council.
Evelyn Douek:
He did. He promised a careful process, he promised deliberation, that no significant reinstatements would be made without a meeting of the council. Instead, over the weekend, Kanye came back, and a number of other people’s accounts were reinstated.
Alex Stamos:
And the tweet he comes back with.
Evelyn Douek:
I know. Do we even… “Shalom,” smiley face. Which just, I mean, it’s pretty repulsive.
Alex Stamos:
It’s pretty repulsive, yeah.
Evelyn Douek:
And Elon liked the tweet. So it kind of says where we are on this platform right now. He does have lines, though. Alex Jones appears not to have been reinstated, despite people pleading. Which is interesting, because Alex Jones was the first big deplatforming instance several years ago, where there was this, I think you called it at the time, a trust fall, Alex, where Apple moved first by removing Alex Jones and then all of the platforms really quickly followed in succession, and he lost all of his accounts, Facebook, Twitter. And that appears to be where Musk is drawing the line. Comments about that?
Alex Stamos:
Yeah. Well, it was interesting, because in his response to it he had a Bible quote about children, which was somewhat mysterious. People didn’t understand if that meant Jones was coming back or not. And then he was prodded by, of all people, Kim Dotcom, of Mega and New Zealand fame, who was asking for Musk to allow Alex Jones back. And Musk effectively said that Jones would never come back, because Musk, I had not known this about him, had a personal tragedy. His first child died at 10 weeks old of sudden infant death syndrome. Musk’s experience with that meant that he was not going to re-platform somebody who benefited off of the death of children.
Which I think is the right decision, not to let Alex Jones back on, one of the most odious figures ever to be on Twitter, and also a figure who really demonstrated how online platforms can cause real-world harm with his effective cyberstalking of the families of children who died in school shootings. But it is also not a great, scalable way to set Twitter’s policies if the only tragedies that matter, the only ones he is going to have empathy for, are the ones that have been personally experienced by the CEO.
So, a very empathetic human moment for Musk, demonstrating why trust and safety work is important. It’s just unfortunate that it’s only based upon things that he has experienced, and that he hasn’t been able to extend that empathy to, say, Jewish people and what they’re going through right now with a really significant rise of antisemitism in the United States.
Evelyn Douek:
In the middle of this descending trash fire that is Twitter at the moment, Yoel Roth, who we’ve talked about a number of times on this podcast, wrote an op-ed in The New York Times over the weekend. Now, when we first discussed the Musk purchase, I talked about my fantasy being that it could be the app stores that provide some sort of constraint on Musk in this situation, because they have done so in the past with places like Parler and Truth Social, where the Apple App Store and the Google Play Store have refused to list those apps without sufficient proof of content moderation systems and adequate hate speech rules, for example. I like the idea that this could play out as some big Silicon Valley rivalry between a bunch of CEOs.
And Yoel says in this op-ed that when you work in trust and safety, you get lots and lots of calls from app stores when they see problems on your service, because they don’t want to be seen as hosting these vile platforms. I’m glad that he drew attention to it, because I do think this is a choke point that not many people pay attention to. And it is going to be one that there’s increasing pressure on: if the major platforms pick up their game and, as we’ve talked about before, a lot of the misinformation, disinformation and hate speech moves onto other platforms and coalesces in certain spaces, there’ll be increased pressure on app stores to do more about those platforms as a whole. Curious for your thoughts on this one.
Alex Stamos:
Yeah, so I think Yoel is right to point that out, but I don’t think it’s a good thing. From my perspective, the worst place to do content moderation is the app stores. There are effectively only two big ones globally, and certainly only two big ones in the United States. Outside of the US you have more sideloading and more use of alternative app stores on Android; there is no alternative app store on iPhone anywhere. And the app stores are an oligopoly that is supported by hardware-rooted DRM.
Built deep into the hardware of an iPhone are cryptographic systems that do not allow code from any other app store to run. This is the fantasy people had in the 1990s of where Microsoft wanted to go with trusted computing and such. The worst possible outcome of what people thought Microsoft wanted in the 90s and 2000s to control PCs is exactly the kind of control that Apple, and to a lesser extent Google, now has.
I think that power needs to be used very judiciously. It seems to be used, especially in the Apple case, mostly to make Apple revenue; they use the App Store rules to effectively steal money, from my perspective, from developers. I think that’s a problem. I think it’s also a problem if they believe that they’re the arbiters of speech on those devices. So yes, it is true that this is a constraint, but it’s not one that I would like to see used, because I think it’s a very dangerous model for the manufacturers of hardware to decide what kind of speech is allowed on them.
Evelyn Douek:
Completely. And in many cases it is also a very disproportionate response to kick an entire platform off because of a portion of content that may be absolutely reprehensible. I mean, I said this when Amazon Web Services kicked Parler off for pockets of hate speech: has Amazon looked at Amazon lately? There’s all sorts of trash that gets sold on Amazon. It’ll be interesting to see how this develops. And I think, at the very least, drawing more attention to it and forcing this out into the open, so the app stores have to own these decisions more, is good.
Alex Stamos:
Yeah. I mean, the other thing about the app stores is… I think the Amazon one is a much more supportable decision by Amazon. One, because Amazon Web Services is very far from a monopoly. Anybody can buy a computer, hook it up to the internet, and host a website, and there are lots and lots of VPS and cloud providers out there, while there are only two major app stores.
And the second is, if Amazon is hosting Parler, they are actively a participant in providing that service, and so I think they have a higher level of responsibility. Whereas in the Apple case with the App Store, it is just them allowing code to execute. Yes, they host the binary for the client, but they don’t actually host the speech. And the same kind of justification that allows them to, again, take a huge amount of revenue from hardworking developers and slice it off the top, and also to do very anti-competitive things against companies like Spotify, is the justification they use here. And I think it falls down, both in the speech regulation case and in the antitrust case.
Evelyn Douek:
Good reminder that as we move content moderation further into the stack, we should be attentive to exactly what the provider and the service are at each layer of the stack.
Before we leave our Elon Musk segment, I just can’t leave without drawing attention to this quote from European Commission Executive Vice President, Margrethe Vestager, who is the biggest badass in tech regulation. Whatever you think of her substantive rules, in her comments about what Musk was doing this week she said, “We are never on a collision course with anyone because we consider ourselves a mountain.” I just really desperately want to play her in the movie when this ultimately comes out, because she just has the best sound bites. So yeah, good for her.
Alex Stamos:
You’ve been practicing your accents?
Evelyn Douek:
Yeah, exactly. Just I’m going to dye my hair blonde.
Okay, so on content moderation problems, let’s turn to Mastodon, where it turns out decentralizing everything doesn’t solve all our content moderation problems. Alex, catch us up to date on what we saw this week.
Alex Stamos:
There’s been a couple of “scandals” in the Mastodon Fediverse. As our listeners probably know, Mastodon is not one platform. It is an open source product that implements an open standard that allows many, many servers to interact with one another. And so when you “move” to Mastodon, you pick a server that you want to operate on, and that server can talk to many other ones, but the rules about what you’re allowed to say and what kind of trust and safety rules are applied depend on your server. If a server allows a really huge amount of, say, hate speech or spam, then the entire server often gets banned from the rest of the Fediverse, and that has happened multiple times. And so as a result, people are kind of shopping for what is the trust and safety regime in which I want to operate.
And in doing that, we’ve had a couple of interesting things happen. First, a gentleman named Adam Davidson started up journa.host, a journalist-focused Mastodon instance that verifies that everybody who signs up is a “journalist.” So there’s a little mini controversy about deciding who is a journalist in an unlicensed profession that anybody can practice these days, and that was a little bit controversial, but there are lots of Mastodon servers that only let in friends or something, so not crazy.
More importantly, a number of servers blocked it because they did not want journalists looking at their Mastodon output and writing it up. So you had a little bit of a rebellion around the idea of, hey, we want to have privacy in the Fediverse. And I understand people’s need for privacy, but this is going to be a huge issue, in that the Mastodon community has lots of ideas about privacy that are not backed up by the architecture. A lot of them are really backed up by the fact that nobody was on Mastodon, so nobody was paying attention. People were like, oh, I have private conversations because there’s nobody in the room. Now that everybody’s in the room, there’s nothing in Mastodon that makes anything private. In fact, Mastodon has much, much worse privacy implications and design than Facebook or Twitter or YouTube or basically any commercial platform, and so it is much more public than any of those things.
And so we’re seeing that kind of brewing issue. And then we had a little mini controversy over an issue that a lot of companies have dealt with, which is: is hate speech something that you enforce in a truly symmetric way, where, say, you have a rule that you can never mention somebody’s race in a negative way, never mention somebody’s gender? This is the kind of thing that people have fought over for a long period of time.
There’s a great Vanity Fair article titled “Men Are Scum,” in which Simon van Zuylen-Wood, a reporter for Vanity Fair, embedded with a team at Facebook as they discussed, at the height of the Me Too movement, whether women could say “men are scum” on Facebook. Is that hate speech? And all of the implications of trying to figure out how you decide what hate speech is. Pretty much the exact same thing just happened on the largest of the Mastodon servers, mastodon.social, because Tracy Chou posted something about effectively not reading white male authors, and some things that were not super nice about white guys, which is her opinion, and got censored for it. Eventually they reversed after some feedback and such, but as people choose all these servers that are run by community moderators, the exact same problems that Facebook and Twitter and such have faced are now being faced by a thousand different servers.
Evelyn Douek:
And this was totally predictable. There’s been this push for decentralization, and the argument has been, oh, we have these massive, all-powerful gatekeepers. But it just turns out that some of these decisions are really hard. It is hard to work out where to draw the line on hate speech. And whether it’s Monika Bickert at Meta making those decisions or the admins on mastodon.social, you’re going to upset some people.
And then we’re recreating the content moderation debates, which is, well, you need to take on feedback, you need to explain yourself. You need to show exactly where the line is so that people know it. There needs to be room for appeals, et cetera. And so in the end, a lot of it is just going to be the same kind of debates in a different kind of context. It turns out that decentralization just moves the issues around; I don’t think it really solves them.
Alex Stamos:
The same debates, with much worse tooling. Because the truth is that none of the Mastodon open source servers I’ve seen so far have anything close to the tooling you have at a big company for trust and safety work. People can report stuff with pretty much no context, it ends up in a single queue, there’s no weighting of the queue, there’s no preemptive scanning, there’s no ability to train machine learning. So if Facebook makes a decision that “men are scum” is okay, they can push a button and that is then enforced in the rules and the scanners, and then that gets pushed down. There are no economies of scale in Mastodon trust and safety work right now. And so all of these server administrators and their volunteers are going to be inundated with this stuff.
Evelyn Douek:
My anecdote about this is I tweeted about Tracy Chou’s experience on Twitter, and at the moment my Mastodon mirrors my Twitter, and the feedback that I got on the two different platforms was completely different. What I said was, “Oh, it turns out that decentralization doesn’t solve all our content moderation problems.” On Twitter, everyone liked it and was like, “Yeah. Damn straight. We always knew this. Stay on Twitter.” And on Mastodon it was like, “No, you don’t understand. It just depends on the instance. This is one of the benefits of it. You can go from instance to instance, and this is a problem in that particular instance.”
I mean, I think that argument also falls down. If you look at mastodon.social now, it’s got over a million monthly users. It is not the small community that people have in mind when they talk about how you can just choose the kind of rules you want to opt into. Ultimately, people are going to converge because of network effects.
Alex Stamos:
And it’s going to be fascinating to see the legal issues here. People are tweeting about how almost no Mastodon instances, only a couple, have registered DMCA agents, which is a huge problem if you end up hosting copyrighted material. None of them are compliant with the DSA. Which I think opens up fascinating questions for the Europeans, and I have pointed this out over and over again to European politicians: they say we want to have competition in the social media space, we want to have smaller providers, and then they write rules that can only be followed if you have a billion dollars.
Evelyn Douek:
Right. Yeah, there are real questions around how something like the DSA will treat Mastodon, but you could see many, many servers being required to have an appeal system open for six months for every single decision. You have to provide reasons, you have to provide recourse to a third-party outside arbiter appeal system. There are all of these really onerous things, which you don’t really expect when you’re setting up a Mastodon instance. And then there are going to be these questions of where responsibility falls. I mean, we were talking about the app stores previously: do we think about the app as a whole, or do we think about instances? All that’s listed in the app stores is Mastodon the app, and so it’s just going to get completely unruly.
Alex Stamos:
It’s going to be interesting. The Oversight Board, Facebook’s Oversight Board, is looking for other platforms. Maybe they end up with an oversight-board-as-a-service that you just sign up for, a hundred bucks a month, and you can get some oversight.
Evelyn Douek:
The problem with that is that they released three decisions last quarter. So get in queue everyone, you’ll get your decision in 2056.
Okay, a story this week about Meta employees hijacking accounts. Meta fired or disciplined over two dozen employees and contractors after finding that they were improperly taking over user accounts, in some cases allegedly for bribes. Alex, is this something to be surprised about? What’s going on here?
Alex Stamos:
I mean, this has been a constant problem for any kind of large consumer internet provider: people constantly get locked out of their accounts. People reuse passwords, those passwords get stolen and used by bad guys, and the account gets locked. People forget their passwords, people lose their devices, or they lose the little note that they wrote their password on and get stuck the next time they have to log in. And so you have to have the ability for people to restore their access to accounts, and that ability has to take into account a whole ridiculous space of things that happen: I lost my computer and my phone and my password at the same time, and I’m traveling, and I lost my wallet, and I’m in a rowboat. I mean, when you run a consumer internet product, crazy things happen to people and they want to get access back to their life.
And the big companies are usually really bad at this. If you lose access to your account, it’s often really hard to get back into it. So to do a better job, they have to have queues where people can say, I want access back to my account. And then a human being has to figure out what to do, looking at the evidence, are they logging in from the same device or the same browser or the same IP address, and weighing all these things. That means you end up with hundreds or thousands of people who have an interface that allows them to reset passwords, reset two-factor tokens, all that kind of stuff. That gives them a lot of power, and it creates a situation where they can be bribed. And there has been a history, going back years, of customer service folks at a variety of companies being bribed, first for access to data, and now for things like buying check marks and account access.
Meta caught these people; how it was done, I’m not totally sure. When I was at Facebook, we would regularly do stings where we’d go on the dark web and do things like buy Instagram accounts or buy check marks for Instagram or Facebook. And then we’d go find the reps who did that, look into their accounts, look into their links, and round up anybody who had the same kind of patterns. So it’s possible that this was a sting, or it’s possible that some kind of internal detection software picked up the anomalies.
Evelyn Douek:
I’m just imagining you sitting there with your partner, both of you in sunglasses, sipping a coffee and eating a donut, waiting outside, watching the screen to see who takes the honey pot as part of the sting. Is that how it was?
Alex Stamos:
No, no. It was a bunch of people in the conference room. We were sipping coffee from the mini kitchen.
Evelyn Douek:
Way to burst my bubble on that one.
Alex Stamos:
Well, now, apparently, nobody from Meta ever shows up in the office anymore. Mark Zuckerberg has spent billions of dollars on useless real estate over the last five years, and so now it’s probably people in front of their laptops at their own kitchen tables.
Evelyn Douek:
In their pajamas, yeah.
Alex Stamos:
In their pajamas. Being an internet cop is not as sexy as being a real cop. Nobody’s made the Beverly Hills Cop of internet copping yet.
Evelyn Douek:
It’s coming. It’s coming. I’m going to pitch it next week. So, a good example of how these stories are sometimes framed as Meta having a massive security problem, when it could actually be an example of Meta keeping an eye on its security problems, because people are always a security weakness: they can be bribed.
This week, FBI Director Chris Wray testified that TikTok poses a national security challenge for the United States because the Chinese government may be able to access extensive data collected by the app, or use its recommendation algorithms to push influence operations on its millions of users. We haven’t really talked about TikTok yet on this show, Alex. It’s obviously one of the most popular platforms, and one that we don’t talk about all that much in terms of content moderation, but there is this roiling debate over its national security threat, so what’s your take on that?
Alex Stamos:
Yeah, it’s a problem. I mean, TikTok is the first Chinese company to legitimately win in the marketplace. The success of WeChat and some other companies was based in large part on the Great Firewall; WeChat is really popular among the Chinese diaspora partially because, if you have family in the PRC, it’s the only way you can regularly communicate with them. But TikTok just built a better product and kicked a bunch of American companies’ butts. Those American companies don’t like it and they complain about it, but their complaints also have some legs.
The truth is, the country in which one of these platforms is domiciled, and which has the most influence over it, is really important. When the Europeans complain, through Privacy Shield and such, that the American government can get access, those are legitimate complaints. Now, that kind of access is constrained by law in the US in a way it is not in the PRC. For all of the concerns that we’ve had about cross-border data transfer, TikTok is now the best example of a company that has a huge amount of PII and some amount of private communication. TikTok’s DM service is not extremely popular, but it might become so in the future. And so there is a bunch of data that is going to be accessible to the engineers in Beijing.
It does not matter that the data is stored in AWS or stored in Singapore; those engineers have access to the data warehouse and they have the ability to run queries and pull stuff out. So it’s a real problem, and there’s no good, easy solution. There’s been talk of CFIUS basically forcing TikTok to store data in the US. Unless there are significant controls around the code and access to backend systems, that’s going to be useless. Again, it doesn’t matter where the spinning hard drives are. That’s not how people access data in 2022, by grabbing the hard drive out of a data center.
So that’s one issue. And the other issue that Wray talked about is totally true, which is that Chinese influence operations are a big deal. We wrote up the shutdown of five influence operations right before the US midterms. Several of them were attributed to China and were directly trying to influence US elections, including the creation of an anti-Marco Rubio group. And those things were shut down because Twitter and Facebook, at the time, had teams that did this work. Twitter no longer has that team. TikTok has never proactively done any of that work. They have responded in some cases; in this case, they were quite slow to take down some of the Chinese stuff, which is interesting. I’m not going to use one data point to draw a conspiracy theory here. But TikTok has a real problem with Chinese influence operations, and I think it’s a totally reasonable concern and one for which we don’t have a lot of good solutions.
Evelyn Douek:
Right. I mean, it’s very fair to be concerned about national security issues, but shutting down an entire platform is a pretty drastic remedy, and there is a lot of good and important public discourse and collaboration and democratic vibrancy that happens on this platform, so it’s a story to watch. Just to be specific about this, TikTok is talking about Project Texas, which is a plan to allay US security concerns by having more of the data stored in the US. And your opinion is that it can’t really be materially effective.
Alex Stamos:
From what I have seen, Project Texas would not address the issues here. What you’d really need is for TikTok US to have all of its operations run from trustworthy nations. It doesn’t necessarily have to be the US; it could be the US and the EU and Singapore and other places where people already have operations that haven’t been influenced by the government, for all the backend operations, all the DevOps. And then on the engineering side, you could use code from ByteDance and Douyin, but it would have to be reviewed and carefully dealt with.
Companies do this, but the companies that do that kind of thing are mostly the big US cloud providers, who do it for Europe: they have built European subsidiaries where all the operations are done by Europeans and such, and code is looked at and carefully pulled over. And they have a very different business model than a consumer platform like TikTok. So yeah, I’m not totally sure there’s a practical solution here.
Evelyn Douek:
Okay. In what is now becoming our final sports segment of Moderated Content, everything is a content moderation issue. There has been lots of attention on the 2022 FIFA World Cup, the world’s game, and as always there is a content moderation aspect: Meta released a blog post this week outlining all of the tools that players have, because of course there’s been a lot of attention to the racist abuse that many football players experience on its platform. So, just a good reminder that there is no big public event these days without a content moderation angle. What happened in the Big Game last week, Alex? Should I be celebrating? I actually, literally, do not know.
Alex Stamos:
It was a classic Big Game; they are often very chaotic. Stanford led for almost the entire game, and then Cal came back in the fourth quarter and ended up winning. So Go Bears, the University of California.
Evelyn Douek:
I experienced all the ups and downs in five seconds there that you had to watch for hours to get.
Alex Stamos:
So I’m trying to make a retroactive bet with Mike McFaul to make him wear a Cal tie. I don’t think that’s going to work, but yes, my Golden Bears won. That being said, I saw a lot of Stanford students showed up in Berkeley, so that’s great. I’m glad to see them supporting their fellow students on the field.
Evelyn Douek:
And that concludes the sports segment of Moderated Content, and the rest of the episode as well. This show is available in all the usual places, including Apple Podcasts and Spotify, and show notes are available at law.stanford.edu/Moderated Content. This episode of Moderated Content wouldn’t be possible without the research and editorial assistance of John Perrino, Policy Analyst at the Stanford Internet Observatory. It is produced by Brian Pelletier. Special thanks also to Alyssa Ashdown, Justin Fu and Rob Hub. See you next week.