Alex and Evelyn are joined by Shelby Grossman of the Stanford Internet Observatory to discuss their just-released study on the online child safety ecosystem, what’s working and what’s not, and why fixing it is urgent.
Show Notes
Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory’s Shelby Grossman to discuss SIO’s just-released report on the Strengths and Weaknesses of the Online Child Safety Ecosystem. Read the report here.
SIO is also calling for presentation proposals for its annual Trust and Safety Research Conference. Proposals are due April 30. Details are here: https://io.stanford.edu/conference
Join the conversation and connect with Evelyn and Alex on your favorite social media platform that doesn’t start with “X.”
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Transcript
Alex Stamos: Yeah, let’s talk about those legal issues since we have a law professor here. Although it’s the wrong amendment for you.
Evelyn Douek: Right, I know. I can’t count that high. I get stuck at the First Amendment. This is all the way in the Fourth.
Alex Stamos: At the first. Do we have any Third Amendment scholars at Stanford?
Evelyn Douek: Oh, yeah.
Alex Stamos: Is there anybody who-
Evelyn Douek: A whole pile of them. Yeah, lots and lots in studying that important issue.
Welcome to Moderated Content’s stochastically released, slightly random and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. And today we’re joined by Shelby Grossman, a research scholar at the Stanford Internet Observatory and lead author of a report just released today that we’re going to spend the episode talking about. So Alex, why don’t you intro the report, what it’s about and what you were looking at.
Alex Stamos: Yeah, so this report that folks can go get at io.stanford.edu is the culmination of almost a year of work by our team. So Shelby, why don’t you just talk a little bit, introduce the listeners to what we think is special about this report and what’s unique about our contribution here before we get into some of the details.
Shelby Grossman: Yeah, this is a report about the online child safety ecosystem. In the US, federal law requires that if platforms find child sexual abuse material, they report it to the cyber tip line, which is run by a nonprofit called the National Center for Missing and Exploited Children, or NCMEC. And so the motivation for this project was really that there’s a sense from a lot of stakeholders, who hold a wide array of beliefs about law enforcement, that the cyber tip line, though extremely important, is not always living up to its potential. And so we basically spent the past nine months interviewing people across industry, law enforcement, NCMEC, and other civil society groups to try to figure out: why is that? What’s going on here?
Alex Stamos: So it was cops who deal with this, federal agents, prosecutors, defense attorneys, people who work for nonprofits and people who work for platforms. What was our breakdown of US versus international, Shelby?
Shelby Grossman: Yeah, our report really focused on challenges related to the US. So of the 66 people we interviewed, probably like 58 of them focus on the US.
Alex Stamos: Right. So mostly US focused and clearly bound by US law, but there’s a lot more for us to discuss internationally going forward. I think the report’s still relevant because upstream of almost every task force around the world is NCMEC because the platforms are American. And so for the most part, if abuse happens, even if it happens in India or France or Germany, it is reported by American companies to this American pipeline and then it ends up downstream. So I think, Shelby, we all agreed that the cyber tip line was both incredibly important, but also not really keeping up with the current issues and definitely not prepared for the future. So talk about the current issues a little bit. What were some of the findings on why this tip line and the ecosystem around it is kind of falling short?
Shelby Grossman: Among people who work in this space, it is very well known that law enforcement are simply overwhelmed with the volume of cyber tip line reports that they have on their desk to investigate. And so we show that in the report, but that’s not really our contribution. Our contribution here is to highlight a related and core issue, which is that law enforcement feel like they are not able to accurately prioritize reports for investigation. So they might have two cyber tip line reports each showing an individual uploading a piece of CSAM. And if you investigate one of these reports, it won’t lead to the discovery of any other illegal activity. And if you investigate the other report, you’ll figure out that this person is actively abusing children, maybe generating original new CSAM, and they generally have very good operational security, but they slipped up once and got one cyber tip line report.
And the problem is that nothing in these two reports would’ve indicated which one should have been prioritized. And so the high-level findings, which we can go into in more depth, are: first, that platforms are providing incomplete reports; second, that there are things NCMEC could do to improve the technical infrastructure of the cyber tip line that would enhance law enforcement’s ability to triage; and third, that there are legal issues constraining both NCMEC and law enforcement.
Alex Stamos: Yeah, let’s talk about those legal issues since we have a law professor here, although it’s the wrong amendment for you.
Evelyn Douek: Right? I know. I can’t count that high. I get stuck at the First Amendment. This is all the way in the Fourth.
Alex Stamos: At the first, yeah. Do we have any Third Amendment scholars at Stanford?
Evelyn Douek: Oh yeah.
Alex Stamos: Is there anybody who-
Evelyn Douek: A whole pile of them. Yeah, lots and lots in studying that important issue.
Alex Stamos: Yeah, so we were skipping over the Second and Third to get to the Fourth here. So the legal issues in the document are interesting. Like Shelby said, this work is technically, theoretically, voluntary. Companies are required under 18 USC 2258A to report CSAM if they find it, but they don’t have to look. And that’s a key component here: if they were forced to look, they would be agents of the government and this would be an illegal search, since obviously Facebook, for example, does not have a warrant every time it runs PhotoDNA on people’s images.
So I think this was one of my … I knew about these court cases and I had been given some lectures from the lawyers at Facebook back in the day, but it was interesting to me in doing this work how much two cases, Wilson and Ackerman, really control the entire world here, one of which was written by this totally unknown guy named Neil Gorsuch that nobody’s ever heard of, a judge who will go nowhere in life. And so as a result, a lot of people are really, really worried about the future of this area. But Evelyn, as we were talking about, this is fascinating. This is actually really equivalent to stuff we talk about all the time on this podcast.
Evelyn Douek: Yeah, totally. It was fascinating reading the report, which is just … I really recommend it to listeners. There’s so much detail and so much great information in there. But reading the legal part of the report, it was all so reminiscent of conversations that we’ve been having constantly over the last few months, because the report really makes clear that you’ve got this very precarious relationship between the platforms, law enforcement officials, and NCMEC around communicating information and essentially cooperating to pursue these issues and try to bring people to justice, but also doing so in the shadow of these rulings that make it clear that if there’s too much cooperation, if there’s too much law enforcement involvement in this, all of this evidence can get thrown out.
And of course that’s very similar to what we’ve been talking about in the First Amendment context with jawboning, which is that you have the platforms cooperating with the Biden administration in the case of Murthy v. Missouri, the Fifth Circuit case. And the argument that’s being run now is that if there is too much coordination, if the government officials are telling the platforms too much, this is what you need to take down, take down this piece of content, that becomes a First Amendment violation. And we’ve talked on the podcast too about why there are very good reasons we don’t want the government to be able to do a complete end run around the Constitution and, just by telling private actors what to do, exceed the limits on its power and ignore constitutional limits. But it is also clear that, especially in cases of uncertainty, when people aren’t sure exactly where the line is and what the legal rules are, that really hampers effective cooperation.
Alex Stamos: And the uncertainty, Shelby, this is a big finding, was the fact that nobody really knows what the rules are here and that all of this is being determined by judges years later. One of the examples here was a friend of mine, Sean Zadig, I was lucky enough to hire him as the head of the safety investigations team when I was the CISO at Yahoo. He is now the CISO of Yahoo, which shout out to my brother there, I’m sorry, Sean. Few of us understand the pain that you’re going through, but Sean worked a case while I was his boss in 2014 and he was still testifying about it in 2021. So you’ve got possibly judges triple guessing decisions you made seven years prior. And so as a result, since nobody knows really what the lines are, everybody acts in a hyper conservative manner and then that really gunks up things, right, Shelby? Can you give some examples of the ways people act incredibly conservative because of this legal uncertainty that really messes up the whole system?
Shelby Grossman: So for example, law enforcement find it quite frustrating when they perceive that platforms are submitting reports that are of, quote, unquote, “obviously adults,” but the system is incentivizing platforms to act very conservatively and submit anything that they think could possibly be perceived by a jury down the line as being CSAM, because otherwise they could be fined. And if it’s a viral image, they could be fined many times for making the wrong choice on this one piece of content.
So there was a really fascinating example we heard about a bunch, which is this one meme. Unfortunately, it is not uncommon for people to share memes that technically meet the legal definition of CSAM, and those need to be reported. Usually people are sharing these memes out of (poor) comedic intent. So there’s this one meme that one platform submits a lot, but law enforcement generally does not think it meets the definition of CSAM, while the platform does. And so it just drives law enforcement nuts when they get all of these reports of this meme. But the platform is making its own assessment that it is CSAM. And as Alex mentioned, the whole system just incentivizes conservative behavior.
Alex Stamos: There’s no penalty for over-reporting and there are really bad penalties for under-reporting. And there are other conservatisms there too. In one of the interviews you did, Shelby, you’ve got a great quote from a law enforcement agency that has actually considered removing itself from the ICAC system. So can you talk first about the ICAC system and then why a cop might not want to be part of it?
Shelby Grossman: Yeah, so there are 61 Internet Crimes Against Children task forces in the US. And if NCMEC can geolocate a report to a certain part of the US, they generally send those reports to these task forces, though frequently these reports go to federal law enforcement. And so these task forces do the first look at the report, and they might decide to investigate the report in-house. They might decide not to investigate the report at all, for example, if it’s a meme. Or they might decide to delegate the report to a local law enforcement agency.
And so they delegate these reports to local law enforcement agencies that are what are called ICAC task force affiliates. And these police departments have essentially been trained by the task force on how to investigate cyber tip line reports. In general, the trend is that more police departments are affiliating with these task forces over time, which is great. But we also heard about some instances where police departments are intentionally unaffiliating with the task force, because getting these cyber tip line reports just adds more work to their plates, even though the reports they’re getting are probably going to be quite high-priority reports about child abuse happening in their jurisdiction. There’s also sometimes a fear that if they don’t investigate a report quickly enough, there could be some liability. For these and other reasons, there are a bunch of instances of local police departments unfortunately unaffiliating from these task forces.
Alex Stamos: I think we had an example that somebody mentioned where a lawsuit was filed against a police department because they had gotten a tip that they didn’t properly prioritize. And so we keep on running into these situations where everybody’s trying to act in the best interest of children, they’re trying to act in good faith, but the law only penalizes failure. It does not reward success. And so if it’s optional, and even if you’re 90% successful and 10% failure, which would be an incredible success rate here, if only failure is punished, then the balance tips against participating at all.
We did an interview with one platform where they basically asked, why would we do this? We’re basically doing this out of the goodness of our heart. And they felt unfairly targeted by a number of activist groups claiming they weren’t doing enough. And it’s like, well, everything we’re doing is optional; the more we do, the more we get yelled at. And I think that’s an unfortunate part of this entire ecosystem, and something I hope folks who read our report, especially perhaps staffers in Congress, will take to heart: first, train your members not to blame CEOs for reporting stuff. That would be a good first step. It is the people who aren’t reporting things that you maybe need to talk to. Although, like you said, Evelyn, maybe that’s jawboning if it comes from Congress.
Evelyn Douek: Right. Yeah, I was just looking at the figures. It was quite amazing how clear that was. The headline figures: in 2023, there were something around 36 million reports, and basically all of those were from online platforms. Meta was submitting 82% of all platform tips, combining both Facebook and Instagram, and the next platform down in terms of reports was Google at 4%. So there’s this big, big gap between Meta’s reporting and the next best reporter. And I think something that you also pointed out in your report is that you are focusing on the platforms that are actually participating in this mechanism, and one of the platforms you called out as not participating is Telegram. So the report doesn’t even capture the platforms that aren’t participating, that aren’t showing up to do their best even voluntarily.
Alex Stamos: But fortunately, we’ve seen this multiple times, Evelyn: Pavel Durov up there with his hand up in the air, and then a senator making him cry, making him turn around and apologize to families. You see that a lot with Telegram. They’re held responsible for the fact that they’ve become the platform of choice of child molesters.
Evelyn Douek: Famously. He’s been there as many times as Elon Musk.
Alex Stamos: And whoever the CEO of YouTube is at any moment.
Evelyn Douek: That’s exactly right. Ouch, correct.
Alex Stamos: Oh yeah. So that’s all real unfortunate. Like you point out, and as we try to point out in the report, we keep on finding this in our investigations: in situations where people are in a conspiracy, and so they can choose their platform, they’re choosing Telegram. That doesn’t mean there aren’t problems elsewhere, but obviously Instagram with all of its kids, TikTok with all the kids, Snap, are the kind of platforms where it is not a conspiracy, where the victim is in the conversation. That’s where you go to find teenagers to try to extort them or groom them. But if it’s two adults, you’d be crazy to use a platform other than Telegram. That’s just the unfortunate fact, and the bad guys know it.
Evelyn Douek: Right. Okay. So this report got quite a lot of press coverage today, which was fantastic to see. It’s the product of really, really hard work, and I’m so glad that I saw it in the Washington Post and the New York Times and, I’m sure, a bunch of other places. One thing that was fascinating, of course, is that when the media is reporting on something like this, they really focus on the topic du jour. And so basically all of the headlines that I saw were about the impact of generative AI on this ecosystem and this reporting mechanism: the fact that we are on the cusp of this revolution with generative AI, that there’s going to be this influx of artificially generated CSAM, and how that’s going to flood an already broken and overloaded system.
And so I found that a little bemusing, having read this 80-page report and knowing how much hard work had gone into it and how much of it is really about the systemic issues that are already there, not necessarily just about generative AI. But I’m curious for your reaction: how did you feel about that being the focus, and also how do you think generative AI is going to change the game here?
Shelby Grossman: Yeah, so I was fine with that being the framing just because I think it brought more attention to various parts of the report. I think specifically the fear about generative AI is that it is going to lead to the creation of many new, unique pieces of CSAM and platforms will possibly not know that these images or videos were created with AI. And so they will report them to the cyber tip line and it will make it even harder for law enforcement to sort through their stack of reports to identify the report that’s most likely to lead to the rescue of a child.
That being said, when we talked with NCMEC about this, they brought up very egregious examples of AI-related reports that they’ve received thus far that actually needed to be escalated to law enforcement. For example, people making CSAM images based on the likeness of a child to whom the individual had access, and egregious prompts being entered by someone who held a position of trust within society. But if I can just pivot a little bit from the AI framing to some of the other findings in the report: because I think this podcast has a lot of trust and safety listeners, I just want to highlight some of the findings related to platforms.
I think our key finding related to platforms, and to what platforms can do to increase the likelihood that their reports are going to be investigated by law enforcement, is just that many platforms, for various reasons, are submitting incomplete reports. And I think this is in part due to the fact that some platforms are simply not committing sufficient engineering resources to the cyber tip line API process. But one of the other interesting findings we have is that many platforms do not seem to know which cyber tip line form fields are most important to make a report actionable. There are just many, many fields in the cyber tip line form, and most of them are not required, so it’s not always obvious which are the ones you need to fill out. So I just wanted to highlight that as well.
Alex Stamos: On the AI side, I can take some blame for all this AI framing, because we did press before these things were done, and Shelby is trying to totally encapsulate this very carefully put together report, and I’m just the, quote, unquote, “Tron 3000.” They just pull the slot machine thing and I go, ding, ding, ding: AI, AI, AI. And so Shelby’s learning, I’m trying to teach her how to play the game here. This is how you get your report into the New York Times. So it is partially my fault, but partially I think it is also what is necessary to drive … People have been complaining about the cyber tip line for a decade. Some of the people we talked to, Shelby, had a good point, which was that they are tired of talking about it. A bunch of people said, we just complain and complain and nothing gets fixed.
If we want this thing to be fixed, we have to demonstrate that there is a real urgent need, and there really is because of generative AI. The wave is coming. One of the things we talked about, which I think was news in this report, was that NCMEC had their first million-report day, where a million images came in. It still took them a couple of weeks to dig themselves out, and the only reason they were able to was that almost all of the reports were this one meme that was very, very similar. So they were able to write some code, do some hash matching, and they were able to cluster all this stuff and kind of flush it out of their database.
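To make the hash-matching idea concrete, here is a minimal sketch in Python of how reports could be clustered by image hash so a viral meme can be flushed in bulk. This is not NCMEC’s actual code; the record structure and threshold are assumptions for illustration (in practice the hash might be an MD5 or a perceptual hash such as PhotoDNA).

```python
from collections import defaultdict

# Hypothetical incoming tips, each carrying the hash of the reported image.
reports = [
    {"report_id": 1, "image_hash": "meme01"},
    {"report_id": 2, "image_hash": "meme01"},
    {"report_id": 3, "image_hash": "9bc77f"},
]

def cluster_by_hash(reports, bulk_threshold=1000):
    """Group reports that share an image hash; flag huge clusters for bulk handling."""
    clusters = defaultdict(list)
    for r in reports:
        clusters[r["image_hash"]].append(r["report_id"])
    bulk, individual = [], []
    for image_hash, ids in clusters.items():
        if len(ids) >= bulk_threshold:
            bulk.append((image_hash, ids))        # e.g. the viral meme: handle once
        else:
            individual.append((image_hash, ids))  # still needs per-report review
    return bulk, individual

bulk_clusters, remaining = cluster_by_hash(reports, bulk_threshold=2)
print(bulk_clusters)  # [('meme01', [1, 2])]
print(remaining)      # [('9bc77f', [3])]
```

The point of the sketch is simply that deduplication only works when the same image recurs; if every image is unique, as Alex describes next, there is nothing to cluster.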
If a million AI-generated images came in and every single one of them was totally different, which is completely plausible, that would destroy the entire cyber tip line. That would literally be the end of their ability to operate. They would have to completely … I don’t know. If the stuff truly did not have a really good technical hook that you could figure out what was fake, it would be a huge disaster. And I think that day is coming, the million unique report day is coming thanks to generative AI.
And so I don’t think people understand the whole system here is based upon the idea that you’ve got the same several hundred thousand images circulating over and over and over again. So if you have a hundred million reports, it’s only a couple hundred thousand unique images and a couple hundred thousand things that somebody has to look at, a human being has to look at. If every single one of those is unique, the entire thing breaks apart.
I think it’s a good pivot to one of the other findings of this report, which is that NCMEC’s technology is, shockingly, I’m not going to say antiquated, but not up to modern spec, and it’s not NCMEC’s fault. They’re really restricted here by the way the law is written: they’re the only people in the country who are allowed to hold CSAM legally, and they have to hold it themselves. The law massively predates cloud computing. It predates AI, it predates NVIDIA being one of the most valuable companies in the world because they sell GPUs. It predates all of that.
And so as a result, if you asked any trust and safety engineering manager at one of these companies to rebuild the cyber tip line, they would do it in the cloud. They would do it on AWS or GCP, maybe Azure, because they could utilize modern data structures, modern storage techniques, modern AI training sets, could do things in Kubernetes, could have on-demand H100s and TPUs. There’s all kinds of stuff they could do that NCMEC can’t, because NCMEC has to buy hardware and rack and stack it in a data center that is approved by the Department of Justice.
And so that’s, I think, one of the other big things here: NCMEC has to modernize if they’re going to survive what’s coming in the next five years, but they are legally not allowed to. They’re going to need Congress to authorize them, and then they’re going to need a ton of money and a lot of help. We’ve got a couple of things in the report about the money, but one of the options we proposed to them, if they do a massive cyber tip line uplift, is to create a secondment program so that a Facebook, Google, Amazon, or Microsoft can give them engineers for six months to a year, kind of like 18F or the USDS that we saw in the Obama administration, so you can loan them support. They just don’t have an engineering team with the ability to live up to their current responsibilities while also rebuilding this whole thing on modern technologies.
Shelby Grossman: And to give one other example of the implications of them not being able to make quick technical fixes: NCMEC themselves commissioned an API that would allow them to compare IP addresses that come in on cyber tip line reports with IP addresses that are known to be active on peer-to-peer file sharing sites for CSAM. And the idea here is that that type of data could help enrich reports. So if you have two cyber tip line reports and one of them has an IP address that’s also very active on a peer-to-peer file sharing site, that would indicate to law enforcement that maybe that report should be prioritized. But they actually haven’t been able to integrate this API, which they commissioned themselves and which was completed a couple of years ago.
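To illustrate the kind of enrichment Shelby is describing, here is a rough sketch. The field names and the IP list are purely hypothetical (documentation-range addresses); this is not the commissioned API, just the shape of the set-membership check it would enable.

```python
# Hypothetical set of IP addresses observed as active on peer-to-peer
# file sharing networks known to circulate CSAM.
P2P_ACTIVE_IPS = {"203.0.113.7", "198.51.100.42"}

def enrich_report(report: dict) -> dict:
    """Flag the report and bump its priority if its upload IP is P2P-active."""
    ip = report.get("upload_ip")
    report["p2p_active"] = ip in P2P_ACTIVE_IPS
    if report["p2p_active"]:
        report["priority_score"] = report.get("priority_score", 0) + 10
    return report

print(enrich_report({"report_id": 42, "upload_ip": "203.0.113.7"}))
# {'report_id': 42, 'upload_ip': '203.0.113.7', 'p2p_active': True, 'priority_score': 10}
```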
Alex Stamos: And when we talked to them, when we had them walk through their investigative process, there’s a bunch of steps that are still really simple. We see this in our own investigations. Shelby and I worked with David Thiel and Renée DiResta on looking into a number of different child abuse scenarios, including people selling CSAM, all of whom were claiming to be children. Some of them actually are teenagers, and the other ones are probably adults pretending, and abusing a child to create the content. In those cases, if you find the name of the seller or the buyer, they often use the same name on multiple platforms. That’s known.
Sometimes you’re dealing with criminal masterminds here, but actually very few of these offenders are Lex Luthor. A lot of them are not really covering their tracks that well; they’re doing stuff under their real name. We found plenty of situations where people are using their own Facebook accounts to purchase CSAM, where you can just go and see that they’re a Little League baseball coach, for example. And so a basic thing you do in these investigations, which we automated at SIO, is you take their screen name and their other information and you search for it on other platforms, and then you generally have to do a human match, or maybe fuzzy matching on headshots and such, to see whether or not it’s a possibility those are the same person.
NCMEC does that, but they do it manually. They have a checklist that an analyst has to go through: open up Instagram and search for a name, then open up TikTok, then open up Snap. It’s just nuts. That’s the kind of thing where they really need engineering talent, so that before the analyst even sees a report, that data has been extracted from what the platform submitted and the searches have already been run. And then that goes into, effectively, a classifier with a bunch of other metadata to help do prioritization.
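As a sketch of the automation Alex is describing, the core loop is just: take the screen name from the report, search each platform, and fuzzy-match candidates for a human to confirm. The per-platform search function below is a hypothetical placeholder; real lookups would go through each platform’s own API or tooling.

```python
from difflib import SequenceMatcher

def search_platform(platform: str, screen_name: str) -> list[dict]:
    """Hypothetical placeholder: a real system would query each platform's
    user-search API here and return candidate profiles."""
    return []  # no-op in this sketch

def cross_platform_candidates(screen_name: str,
                              platforms=("instagram", "tiktok", "snap")) -> list[dict]:
    """Collect likely same-person matches across platforms for human review."""
    candidates = []
    for platform in platforms:
        for profile in search_platform(platform, screen_name):
            similarity = SequenceMatcher(None, screen_name.lower(),
                                         profile["username"].lower()).ratio()
            if similarity > 0.8:  # fuzzy match only; an analyst still confirms
                candidates.append({"platform": platform,
                                   "username": profile["username"],
                                   "similarity": round(similarity, 2)})
    return candidates

print(cross_platform_candidates("example_seller_handle"))  # [] in this stub
```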
That’s the other issue: they don’t really have a smart algorithm for raising up the stuff that’s really harmful. Inside of Facebook or Google or one of the big companies, AI is used for detection, but AI is also used for prioritization of what comes to the top of the funnel, and there’s not a lot of that at NCMEC. Those are the kinds of things that NCMEC is going to have to build if they’re going to get through the future that we’re probably looking at here.
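A toy version of what such prioritization could look at is sketched below; the features and weights are invented for illustration, and a production system would use a trained classifier rather than hand-set rules.

```python
def priority_score(report: dict) -> int:
    """Toy rule-based score over report metadata; higher means look sooner."""
    score = 0
    if report.get("hash_never_seen_before"):    # possibly newly produced material
        score += 5
    if report.get("offender_contacted_minor"):  # e.g. chat logs included in the tip
        score += 8
    if report.get("p2p_active"):                # from an enrichment step like the one above
        score += 4
    if report.get("known_viral_meme"):          # widely reshared image, low investigative value
        score -= 6
    return score

incoming = [
    {"report_id": 1, "known_viral_meme": True},
    {"report_id": 2, "hash_never_seen_before": True, "offender_contacted_minor": True},
]
triage_queue = sorted(incoming, key=priority_score, reverse=True)
print([r["report_id"] for r in triage_queue])  # [2, 1]
```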
Evelyn Douek: Yeah, there was a quote from someone in the report that stuck out to me: “NCMEC has stood in time a little bit,” I think was the quote from one of your interviewees. And it certainly sounds like that. Reading this, the picture that’s painted is, I think, generally, and I’m curious if this is an accurate impression, one of lots of people with really good intentions trying to work really hard within a broken and deeply frustrating system. Every stakeholder in this project is really trying to do their best, but also somehow deeply annoyed and frustrated by the ways in which this system is failing them and failing kids.
And it just sounds like there’s roadblock after roadblock along the way: difficulties for platforms in reporting, difficulties in reading the reports, difficulties for law enforcement in finding people. And then you make this point that, with this huge influx of reports and material, even if law enforcement could prosecute all of these cases perfectly, the justice system isn’t set up for that at the end of the day anyway; the courts are backlogged as it is. And so it’s this sort of deeply depressing problem where it’s not clear how exactly we’re going to solve it.
Alex Stamos: Yeah, the standing in time: one of the basic things here is that the NCMEC API is only available via XML-RPC. So it’s like the way that you would’ve built an API for a bank in 2004. In fact, I just found it: David Thiel, our coauthor, and I gave a talk at Black Hat about breaking into XML apps in 2008. Most of our Stanford students were alive then, I think. I don’t think we have any Doogie Howsers right now in SIO, but they were babies when David and I gave that talk.
Fortunately David and I look exactly the same. We have not aged at all; it is kind of shocking that we look exactly the same. I’ve got some great David Thiel stories, and one of them is that this was a talk where we had like 800 people in the audience and we couldn’t project, because he was recompiling his OpenBSD kernel since the VGA port was not working on his ThinkPad. Anyway, if you love high-stakes talks, try doing a presentation on OpenBSD using LaTeX with David Thiel, because that is a real pucker moment in front of your entire industry.
But yes, nobody uses XML anymore, except banks and governments. So you’ve got engineers who have literally … you’ll hear from some of these platforms, especially the new ones with a bunch of 25-year-olds working there, who are literally opening up old books or going to the library to learn how to do XML-RPC requests, because they’ve never seen anything but JSON. And it would not be that hard to publish a JSON API that ends up in the same database; it’s been a request for years and it hasn’t happened.
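For readers who have never touched XML-RPC, here is roughly the difference in what goes over the wire. The method name and fields are made up; this is not the actual CyberTipline API, just the two serialization styles side by side.

```python
import json
import xmlrpc.client

payload = {"report_type": "csam_upload", "upload_ip": "203.0.113.7"}

# XML-RPC: the request body a 2004-era API expects (hypothetical method name).
xml_body = xmlrpc.client.dumps((payload,), methodname="submitReport")
print(xml_body)
# <?xml version='1.0'?>
# <methodCall>
# <methodName>submitReport</methodName>
# <params><param><value><struct>...</struct></value></param></params>
# </methodCall>

# JSON: the same data as most engineers today would expect to POST it.
print(json.dumps(payload))  # {"report_type": "csam_upload", "upload_ip": "203.0.113.7"}
```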
And so there’s a ton of work that has to happen here. And part of our point is that Congress needs to act, NCMEC needs to act, the platforms have to act, and we all have to do it together and coordinate it. You can’t just do these things back and forth, back and forth for the next five years, because in five years we’re going to wake up and AI-generated CSAM is going to be the norm. To use one of the words Shelby’s been using in every press conference, the modal piece of CSAM will be AI-generated, and that is going to be a world in which real children are still being abused, but that abuse is completely buried in the AI noise.
Evelyn Douek: One of the things that I was really interested in in this report, that I hadn’t realized before, and Alex, you highlighted this at the start of the conversation, was the global nature of it. And it totally makes sense once you explain it: because a lot of the platforms are American and they are bound by American law and American reporting obligations, the bulk of the reports are coming from these American companies. But they’re global companies and they’re responsible for global content, and so many of the reports that they’re making are not to do with American offenses or content originating in America.
But of course that then raises all of these other issues. We’re talking about the difficulties of coordination between the various actors within America and NCMEC, American law enforcement, but times that by scores when you’re talking about coordination with global actors as well. And I’m curious what you found from that perspective, how to think about these global problems as well.
Shelby Grossman: Yeah, we interviewed a couple of people who were familiar with how cyber tip line reports are processed abroad and it gave us a taste of some of the really interesting dynamics that are happening. So on the one hand, you might think law enforcement abroad have it easier because they don’t need to go through any special legal process to open a file attached to a cyber tip line report, but that benefit kind of pales in comparison to all of the other issues that they face.
So obviously, in general, it’s more difficult for law enforcement abroad to get additional information from platforms about a cyber tip line report. They’re kind of just stuck with what they got in the report. And in some European countries, stricter privacy laws mean that by the time a cyber tip line report gets to the appropriate law enforcement officer, they may no longer be able to connect an IP address to an individual.
And then the challenges in developing countries were particularly fascinating. First, most obviously, we heard about translation issues. NCMEC translates the form field names on their reports into a bunch of different languages, but they aren’t translating the content of the report, for fear that doing this in an automated way would introduce translation errors that would cause issues down the line. But what we hear is that law enforcement abroad are literally copying and pasting content from these forms into Google Translate, which just adds friction to triage.
A really interesting issue we heard about is that judges in other countries often aren’t aware that US federal law requires platforms to report CSAM. And so they sometimes will see a cyber tip line report and think that the information was obtained through some kind of extrajudicial violation of user privacy. And judges will sometimes make law enforcement officers re-request the exact same information from platforms, which just baffles the platforms, who are like, we sent you this information a year ago in the cyber tip line report. Similarly, NCMEC has told us that when they do trainings abroad and they tell people about US federal law, they see jaws drop, because people just really don’t understand that platforms are required to report this content. But we think more research is needed into some of these issues abroad.
Alex Stamos: And on the abroad issues: they can get the content of the cyber tip line report, but that is usually very specific to the image. So platforms do not attach, say it was in a conversation, the whole conversation. They don’t always include IP addresses or other basic subscriber information. And so usually law enforcement has to ask via lawful process. Well, as you well know, ECPA and the SCA only help US law enforcement, and only two countries have completed CLOUD Act negotiations, Australia and the UK. Canada is still negotiating, the EU is still negotiating, and there’s not a single developing country that anybody’s even mentioned as having CLOUD Act access. And so we’ve heard this from law enforcement overseas all the time: the tip’s great, but they can’t get anything else past that tip, everything else you need to actually prosecute somebody.
Evelyn Douek: The most important thing that you highlighted as the recommendation was the need to fund NCMEC and modernize NCMEC and have Congress address that issue. Obviously that’s totally going to be high priority, very realistic, definitely going to happen any second now. But just in case that isn’t on the congressional agenda in the next few months, you have a whole bunch of recommendations for actors that don’t require lawmakers to act as well. You’ve got a whole bunch of recommendations for platforms and for NCMEC and things like that. And I’m just curious, what’s the most important thing that you hope to come out of this?
Shelby Grossman: I’ll just highlight what I think is the most important recommendation for platforms, which is that platforms should make sure that when they’re submitting cyber tip line reports, the reports include the who, what, where, and when. The who being offender and/or victim information; the what being the actual file, not just the hash; the where being, ideally, an upload IP address in addition to any other location information; and the when being the time of the incident, along with, critically, how the platform defines the time of the incident, including the time zone. Low-volume reporters that potentially aren’t already including those pieces of information should think about including them. High-volume reporters should be auditing their reports to make sure that they’re reporting those pieces of information accurately and consistently, and that they’re not making some sort of systematic errors that could be easily addressed with some engineering fixes.
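To make the who/what/where/when concrete, here is a sketch of what a minimally complete report payload might contain. The field names are illustrative only, not the real CyberTipline schema.

```python
from datetime import datetime, timezone

example_report = {
    # Who: offender and/or victim information
    "offender": {"username": "example_user", "email": "user@example.com"},
    # What: the actual file (not just its hash), plus the hash for matching
    "file": {
        "filename": "upload.jpg",
        "content_b64": "<base64-encoded file bytes>",
        "md5": "d41d8cd98f00b204e9800998ecf8427e",
    },
    # Where: the upload IP address and any other location information available
    "upload_ip": "203.0.113.7",
    # When: the incident time, with an explicit definition and time zone
    "incident_time": datetime(2024, 4, 1, 14, 30, tzinfo=timezone.utc).isoformat(),
    "incident_time_definition": "time the file was uploaded to the platform (UTC)",
}
print(example_report["incident_time"])  # 2024-04-01T14:30:00+00:00
```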
Alex Stamos: I would like to see a concerted effort to uplift the cyber tip line: to build AI for image detection, for prioritization, for much better clustering. I think it’d be great if NCMEC had a bunch of investigative systems that make things easier for their investigators. I hope that this happens, but it has to happen now. So hopefully, we’ve distributed this to members of Congress. We have a number of members, especially in the Senate, who are interested in these issues. We’re hoping that there might be hearings; we would be happy to send folks to testify. And they are looking at a couple of different laws. The truth is, none of the laws that have been proposed will solve all the issues we’re talking about, but some of them are a good start.
Evelyn Douek: Well, congratulations on a really thorough piece of work. It’s incredible to see all of that effort culminating in such an impressive piece of work, and I look forward to seeing what comes out of it and the next steps because as you say, it is only just the beginning and there’s plenty more to do.
One of the places in which some of this work can be done, especially when it comes to talking to platforms and cooperation, is the trust and safety research conference, which happens at Stanford every year. And I want to insert a plug for that here because we are currently, well, they are currently soliciting presentation proposals for this year’s conference, which is happening in September on the 26th and 27th at Stanford University. And proposals are due by the end of the month, April 30, 2024. Details for how to submit and how to get involved in that are on the website, which is io.stanford.edu/conference. And this is one of the highlights of the year for me every year coming along to this. It’s such a great community. Basically our entire listener base to this podcast turns up at the conference, and so it makes for a really educational and fun couple of days.
Alex Stamos: Are we going to promise? Are we doing another live show, Evelyn? It was ridiculous how many people showed up for that.
Evelyn Douek: If we can get the security that we need to make that happen given the mobs and throngs of people, that would be great. That would be really fun. It’s tough.
Alex Stamos: I do think it was cool that you jumped out and surfed the crowd. That was a great way to defuse what could have been a dangerous situation. It was almost our Altamont, and I’m glad you were able to save it through your crowd work, Evelyn. But yes, we’ll do it this year. There’ll just be guys in yellow vests with pepper spray in the five-foot moat between us and the crowd. Like a Beyoncé concert. Yeah.
Evelyn Douek: Yeah, yeah, it’s exactly right. It’s a real insight into how hard it is for those people that have to do that. Taylor Swift and Beyoncé dealing with this all the time. Fortunately, just once a year for me-
Alex Stamos: It’s only once a year for us.
Evelyn Douek: … at Trust and Safety Research Conference, but it’s always a good time. We would love to see you there. Please send in your proposals and thank you so much, Shelby, for joining us today to talk through the report, and congratulations again on the great piece of work.
Shelby Grossman: Thanks for having me.
Alex Stamos: Thanks, Shelby. Shelby worked very, very hard on this. I’m really proud that she got it out and pulled together the whole team. It was great.
Evelyn Douek: And with that, this has been your Moderated Content weekly update. This show is available in all the usual places, and show notes and transcripts are available at law.stanford.edu/moderatedcontent. This episode wouldn’t be possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory, and it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman. Talk to you next week.