“Elon puts rockets into space, he’s not afraid of the FTC”

Come for the discussion of whether Musk is going to find himself in hot water with the FTC, stay for the discussion of privacy and data security regulation more generally. Evelyn discusses Twitter's data security problems, and what they say about privacy regulation, with Whitney Merrill, the Data Protection Officer and Privacy Counsel at Asana and a long-time privacy lawyer whose experience includes a stint as an attorney at the FTC, and Riana Pfefferkorn, a Research Scholar at the Stanford Internet Observatory.

Show Notes

Transcript

Riana Pfefferkorn:

I like to say that you're nobody until the FTC sues you. And everybody in the Valley basically has a 20-year consent decree, so all these little egg timers are just ticking away at the FTC offices.

Evelyn Douek:

Well, I’m feeling very left out now. I’ll have to put that on my to-do list if I want to be somebody in this world.

Hello, and welcome to Moderated Content, podcast content about content moderation moderated by me, Evelyn Douek. Let's say you take over a company, and within weeks your chief privacy officer, chief information security officer, and chief compliance officer all resign. Oh, and they do so on the same day that your company has a compliance notice due to the Federal Trade Commission under a consent order your company has with the regulator. I would be pretty nervous. But if you're Elon Musk and the company is Twitter, apparently not. In the words of Musk's lawyer, Alex Spiro, "Elon puts rockets into space. He's not afraid of the FTC." So we are here today to discuss whether Musk should be more worried about the FTC than he is. But also, consistent with my thesis that you can get a whole JD just by working through all of the issues that the Musk saga raises, I want to take the opportunity to talk about privacy law and the FTC more generally.

And I have two fantastic people to do that with. Whitney Merrill is currently the Data Protection Officer and Privacy Counsel at Asana and is a longtime privacy lawyer, having done a stint as an attorney at the FTC as well. Welcome, Whitney. Thank you very much for joining us.

Whitney Merrill:

Thank you for having me.

Evelyn Douek:

And Riana Pfefferkorn is one of Stanford’s own as a research scholar at the Stanford Internet Observatory and was previously outside counsel for Twitter. Thanks very much for making the big trek across the large campus, Riana.

Riana Pfefferkorn:

Thanks for having me. And as a former outside counsel to Twitter, I want to make it very clear that none of my comments here are based on any information that I gained in the course of that representation. This is just me reading documents like anybody else can.

Evelyn Douek:

Excellent. That is fantastic legal practice. You're already one up on Twitter's other lawyers — the new lawyers, now. Okay. So before we jump into the events of the past few weeks, I want to set out the relevant context and go way back to the beginning, which is the 2011 consent order that Twitter signed with the FTC. So Riana, maybe we can start with you. Can you tell us a bit about that? What is the 2011 consent order, and what led to it?

Riana Pfefferkorn:

Sure. So as a little bit of background, the FTC Act allows the FTC to have oversight over so-called unfair or deceptive acts or practices in or affecting commerce. So they can police deceptive representations by companies. They can police unfair practices by companies. And the 2011 consent order that the FTC obtained with Twitter way back in the day was something that they obtained under the deceptive prong of their unfair or deceptive acts or practices authority.

And the underlying allegations that the FTC asserted were that Twitter had made public-facing statements about how it was doing a great job protecting user privacy and security, but allegedly, it had security practices that the FTC said were contrary to those statements. These included letting too many staff have access to non-public user information — so IP address, phone number, email address, DMs, and protected Tweets, as opposed to public Tweets that you mean to have out there on the service; not having strong enough controls for administrative passwords; and not having strong enough restrictions for other administrative access to Twitter systems.

And so, the FTC alleged that these were security failings that consequently enabled hackers in early 2009 to get into Twitter’s systems and take over high-profile Twitter accounts such as Barack Obama’s. And so, that consent order prohibited Twitter from misrepresenting the company’s security and privacy practices going forward and required Twitter to implement a comprehensive information security program that was reasonably designed to protect the security, privacy, confidentiality, and integrity of that non-public consumer information that we just talked about.

Evelyn Douek:

Great. And we'll talk a little bit more in a second about exactly what that looked like — what obligations the FTC put on Twitter to ensure that it upheld its end of the bargain. But Whitney, I have a question for you first, which is: can you talk a little bit about how common this is across the industry? Is this an extremely unusual thing — "Whoa, Twitter is subject to a consent order" — or is this something that we see relatively often and is not so outrageous?

Whitney Merrill:

Yeah. So one of the interesting things about the FTC is generally, their investigations are kept private until it comes out that there’s a consent order. So they seem to be pretty rare in comparison to the number of cases that are actually investigated internally. I can’t remember the stats that Maureen Ohlhausen, Commissioner Ohlhausen at the time, gave out of how many cases actually became public versus the ones that were being investigated internally. But I would say cases go through a couple of different paths.

One, you open a case at the FTC. You ask them to send you some documents. They come back and they say, “Here are the documents.” You look at them and you go, “Hmm. They did something wrong. But it’s not so bad and it looks like they fixed it. Let’s just close out the case because going any further is probably not going to be worth anyone’s time. Lesson learned here.”

The second is, you open the case. You look at it. You go, "Actually, no one did anything wrong. We were wrong here." You close the case. Everyone moves on their merry way. Another option is, you open the case. The company did something wrong. You want to move forward with some sort of action. And a consent order is like a settlement. Right? It's not saying that there's any particular wrongdoing, but it is saying, "Hey, given these current facts, you need to do the following things in the future." And the companies are entering into a consent order willingly. They could choose not to go into a consent order. And what ultimately happens there is usually litigation. And that is the other path that can be a popular one, which is, if you want to challenge the FTC on either the facts of the case or their underlying authority, often it goes to litigation.

Evelyn Douek:

Okay. So what did Twitter agree to do under the 2011 consent order, and what kinds of things were they supposed to be doing on an ongoing basis? This was a 20-year commitment, so it is due to expire in 2031. So what are the kinds of things that they should be doing and keeping on top of?

Riana Pfefferkorn:

The 2011 consent order sets forth a number of things that Twitter agreed to do. As I mentioned, they had to design and implement a comprehensive information security program. It was supposed to be appropriate to the size and complexity of the company, which I think is interesting in that that has changed significantly just in recent weeks. And as part of that, they were supposed to designate one or more employees to be responsible for that information security program. They had to identify what kinds of risks could come up, and develop something to remediate them and keep them from happening. They had to have reasonable safeguards to control the risks that they had identified, do testing and monitoring of all of their controls, and they had to pick third-party providers — all of these outside parties that they have to contract with in order to make the service run — that would also do a decent job of protecting and safeguarding user information.

And on top of that, every so often, they would have to submit periodic reports and undergo assessments from outside third-party auditors. And all of this, as you said, is supposed to be ongoing periodically over the course of 20 years. I like to say that you're nobody until the FTC sues you, and everybody in the Valley basically has a 20-year consent decree, so all of these little egg timers are just ticking away at the FTC offices.

Evelyn Douek:

Well, I'm feeling very left out now. I'll have to put that on my to-do list if I want to be somebody in this world. Okay. So 2011 sounds like a long time ago. Surely this consent decree is completely irrelevant now? No — we know from May this year that the FTC is very much keeping on top of it, and that it has serious consequences. So Riana, can you talk a little bit about what we know from May as well?

Riana Pfefferkorn:

Yeah. So the May order was a modification of the 2011 consent order. It had come out a few years ago that Twitter had been collecting people's phone numbers for security purposes — for example, to use as a multifactor authentication mechanism for further securing your accounts. Again, they were supposed to do better at security, as told by the FTC originally. So one of the things that they were doing was helping to secure user accounts through multifactor authentication relying on phone numbers. And it came out a few years ago that Twitter had also been using those phone numbers for targeted advertising purposes.

And so the FTC said, "Look, we had told you. In the 2011 order, you agreed that you would not misrepresent your practices with regard to people's non-public information, such as their phone number. You made these public statements that you were collecting people's phone numbers in order to help them secure their accounts. In fact, you were actually doing targeted advertising using those phone numbers. So knock it off. And by the way, here's an extra modified consent order on top of the one that we had before that is even more detailed — of the 'did I stutter' variety."

Evelyn Douek:

Right. Good point. And that's a fresh, what, six-month-old order, so very much not stale. And I believe there was a $150… $150? Yeah, that would be… Okay. Although maybe for Musk right now, even that would be a little bit of a stretch. But a $150-million fine accompanied that. Did you want to add anything to that, Whitney?

Whitney Merrill:

Yeah. What I thought was particularly interesting about this order — the FTC went after Instagram for something very similar — but in this case, the privacy policy of Twitter actually said that they may use contact information for marketing purposes. And the FTC said, "Given that the disclosure given right at the time of collection was only for security, it actually narrowed the rights under the privacy policy." Which goes to a question for those listening who are interested: can you just bury something in a privacy policy and get away with it? They're showing that, no, you can't. And I thought that was a particularly interesting thing about the order, because I can anticipate that just-in-time notices and notices at the time of collection are going to be a lot more popular as a result.

Evelyn Douek:

That is really, really interesting, especially because, as Riana laid out, the way that the FTC enforces these is not that there are substantive laws or requirements; it's deception — telling consumers what you will do and then not doing that. So the idea that you can't be two-faced about it and fall back on a privacy policy — that's really interesting. I didn't know that part.

Riana Pfefferkorn:

Yeah. And my view here — and Whitney, let me know if you think this is wrong — is that the FTC has an easier time making assertions under the deception prong than under the unfairness prong, at least when it comes to privacy and data security. Whitney mentioned that there has sometimes been litigation where a company might not agree to settle charges with the FTC and enter into a consent order, and will push back against the FTC. And that has happened a few times in recent years. The Eleventh Circuit back in 2018 had told the FTC, "Yes, okay, we'll affirm that you do in fact have authority to police data security as part of your unfair acts or practices authority." But the way that the FTC was going about it, their consent orders were not spelling out exactly what it was that the FTC considered to be unfair, or telling companies exactly what they would need to do going forward. They were just saying, "What you did before was unreasonable and what you have to do in the future has to be reasonable." And so what the heck does reasonable mean?

And so after that decision, the FTC went back to the drawing board, sort of took to heart that they had their wrist slapped by the Eleventh Circuit, and really set to work on revising how they spell out, in more granular detail, what obligations they want to place on companies in those consent orders — so that it's not just this vague "do you even have notice of what you are not supposed to do" situation that a couple of companies had pushed back on in the past.

Whitney Merrill:

Yeah. One of the other interesting procedural things about the 2022 consent order: there are three ways that the FTC can generally move forward. They can do an administrative proceeding, going through their administrative court to push for settlement. They can go through the federal courts. And then they can also refer things to the DOJ to handle with full force and authority.

And unique to this particular situation — I believe they used the DOJ against Facebook for the $5 billion settlement — they actually referred this consent order to the DOJ for enforcement. And so, in 2022, it's not just Twitter dealing with the FTC, it's Twitter also dealing with the DOJ. I have theories on this; there are a couple of things that could be happening here, but the DOJ may simply have more resources to help facilitate enforcement of Section 5. And actually, under this order, they also found that Twitter violated Privacy Shield, which, while dead as a transfer mechanism, still does technically apply.

Evelyn Douek:

You mentioned the $5 billion Facebook fine, I wonder if you could unpack that a little bit and just remind listeners what that was about.

Whitney Merrill:

Yeah. So, oh gosh. Cambridge Analytica — a bunch of data from Facebook was basically downloaded and used by a "research group" called Cambridge Analytica, which allowed them to do micro-targeted advertisements. As a result of this breach, where Cambridge Analytica was basically downloading all of this user information, the FTC entered into the $5 billion consent order. And it was such a high number, in fact, because the FTC was really trying to hold Mark Zuckerberg personally liable for the failings of the company. They were unable to do so at the time because they didn't have the votes — and you could see that in the dissents that were written; I believe that's where that information comes from. And so, the company was more willing to settle at the much higher number to keep Mark Zuckerberg's liability out. But we're seeing now that the FTC is starting to think about personal liability for CEOs much more, given some recent decisions as well.

Evelyn Douek:

Okay, excellent. So we’re going to come back to that.

Whitney Merrill:

Yeah.

Evelyn Douek:

And why maybe Musk should be sweating a little bit more than he is. But all right, so let's set the stage for that. Enter Elon Musk — it feels like 6 billion years ago, but it was just about a month — and chaos reigned. Riana, can you fill us in on the tick-tock of what has happened in the past few weeks that is making the FTC prick up its ears and pay attention here?

Riana Pfefferkorn:

The ability to actually substantively carry out the demands of the order depends upon there being people at Twitter who are working on securing people's private data and implementing security controls within the company. And the first thing that happened as soon as Musk took over was just a massive layoff that I think slashed the company from around 7,500 employees to about half that. For perspective, even 7,500 employees is smaller than just the number of people that Meta laid off shortly thereafter — the whole company was smaller than the fraction of Meta that got laid off recently. So Twitter's always punched above its weight in terms of its influence around the world compared to the number of people who were there. Now there are fewer people there. And there were also a bunch of departures of key people from the C-suite.

He got rid of the general counsel who had been navigating this ship through troubled waters for so many years, Vijaya Gadde, who's terrific. And shortly thereafter — God, every day feels like a month right now; you are correct — came the eve of a key date. Under the '22 consent order, 14 days after any change of control, Twitter would have to file a compliance report about its compliance with its obligations under the order. And so that 14-day date came late last week, and literally around midnight the night before, as you mentioned, the chief compliance officer, the chief privacy officer, and the chief information security officer all tendered their resignations. So there are fewer people there in the rank and file to make the site run, protect users, and comply with the order. And there are fewer executives and officers there who would be in charge of overseeing the company's compliance.

Evelyn Douek:

Okay. So, obviously, entirely speculative, but is it a coincidence that they all resigned at midnight, the day before a compliance notice was due?

Riana Pfefferkorn:

I think it's reasonable to infer that there is a connection between those two things. And the reason for that is that the May order says that these compliance notices have to be executed under penalty of perjury. And so, I think it's pretty interesting that the people who would probably have been on the hook under the order for overseeing the company's privacy and data security program all just left the night before this had to get filed under penalty of perjury. And I think that gets us into talking about some differences that we can draw in terms of the legal perils facing various people within Twitter here. I think there's a distinction to draw between whether you are actually doing the things to protect users that the FTC wants you to do, versus what you are saying. And I think the latter is the easier way for people to get tripped up.

Evelyn Douek:

Okay. So the next thing we know that happened: there's a leaked Slack message saying that legal is shifting the burden to engineers to self-certify compliance with FTC requirements and other laws. And I am just curious, Whitney, as someone that is… Is that a thing? Can you self-certify compliance with an FTC consent order?

Whitney Merrill:

No. It's such a weird thing. I guess someone who's not familiar with how these things work might think, "Oh yeah, we can just self-certify and everything will be okay and they'll go away." And maybe that's an antiquated view of how you comply with the FTC. I think a lot of companies think, "Oh yeah, we can self-certify, or we can have an external third-party auditor certify for us that we're in compliance with the FTC order, and that gets delivered to the FTC and we move on."

But an engineer without knowledge of what that order requires just doesn't have the capability. And this is not an insult to the engineer — they just wouldn't have all of the knowledge and information to be able to make that kind of statement in the first place, and they shouldn't have to make that type of statement. And that's why you usually have someone at a higher level, an officer of some sort, collecting all the necessary information to be able to stand behind that and say, "We are certifying that we are fulfilling the requirements set out in the order" — assuming that that person actually understands what that order says and requires of them.

Yeah, it’s such a bizarre situation. And I also think, given what is also happening at the same time this summer with Joe Sullivan, you cannot possibly want to be in a position to certify to the FTC that something is true if you don’t 100% know it’s true.

Evelyn Douek:

Okay. So explain that then. What’s happening with Joe Sullivan?

Whitney Merrill:

Yeah. So Joe Sullivan was at Uber, and Uber was under investigation by the FTC — again. I've lost track of how many FTC and Uber-related actions there are, but there are at least two. And at some point during one of the investigations — I believe in 2015 or 2016, I don't have the dates in front of me — Uber basically got a ransom request from somebody who had pulled down and accessed a lot of data and said, "If you don't give me a certain amount of money, I am going to release this data." And so basically, Uber paid the ransom, put it under a bug bounty in order to facilitate it — mostly because they didn't otherwise have the ability to pay something in Bitcoin — and then did not tell the FTC about it during that investigation.

I would say that that information is generally something that is by default responsive to a CID, which is a subpoena that the FTC uses to request documents. He hid this information from the FTC — and the details are a little bit murky, and there's a he-said-she-said situation going on, obviously, because it was just recently litigated — but ultimately, he was found guilty for lying to the FTC, for not disclosing to the FTC that they had this ransom demand. And I think that shows that the FTC really cares that individuals within a company are telling them the truth and not misrepresenting in any way their current compliance or ability to share responsive information if they're under investigation.

Riana Pfefferkorn:

To add to what Whitney said — going back to my comment about the obligations that the company may have, which is what Musk's lawyer, who's apparently the head of Twitter legal now, I guess, was trying to invoke to reassure employees they'd be fine: yes, the FTC order that was filed with the federal court is the company's obligation. That's true, but it's not the whole story. You don't get to lie to the federal government. And there are several different statutes that can come into play there. There's the law against perjuring yourself — any certification under penalty of perjury is subject to that law; it is a crime to perjure yourself. Independently, you cannot just freestyle lie to the federal government. Making false statements to the federal government is a separate crime. It doesn't have to be something that you make under oath.

That's actually the statute that Michael Sussmann got prosecuted under, and ultimately acquitted of, in the special counsel's investigation. And then, the law that Joe Sullivan was convicted under just a few weeks ago was obstruction of justice, for covering up information from the FTC. So there are all of these different ways that you could get tripped up as an individual — as a low-level employee, as an executive or an officer, a CEO for example — that are not dependent upon, "Oh, is this an obligation that runs only to the company," as opposed to the people within it. Nobody gets to lie to the government.

Evelyn Douek:

Right. And Joe Sullivan is still awaiting sentencing, but he faces up to eight years in prison. So these are very serious crimes. And so, the FTC, following all of this, issued a statement saying that it was tracking the developments at Twitter with deep concern, and its spokesperson said, "No CEO or company is above the law and companies must follow our consent decrees." Surprisingly. "Our revised consent order gives us new tools to ensure compliance and we are prepared to use them." Okay, so what happens now? Whitney, how likely do you think it is that the FTC will investigate Twitter over this? I mean, that statement suggests something. And what does that kind of investigation look like? What's the timeline? What will we know about it? What happens next?

Whitney Merrill:

Yeah. I mean, okay, there's one other thing we haven't mentioned that's also playing into the Twitter scandal, which is the Mudge whistleblower complaint — it came out after the May 2022 consent order but before Elon took over — that I also think is probably playing into what the FTC's thinking right now. I imagine they already opened another case as it relates to the Mudge whistleblower complaint. And Mudge, for those listening, was the chief information security officer. Well, he headed up the security and trust functions — privacy and the CISO role and a few others rolled up into him — but he left Twitter, or I guess he was fired from Twitter, let go from Twitter, and as a result filed a bunch of complaints about their privacy and security practices after he left.

But in particular, with the FTC, they've definitely opened another case and are looking into whether Twitter violated the May 2022 consent order. They might also be looking at whether or not there are separate issues, not related to that underlying consent order, that they need to look into, and open an investigation with an additional, larger scope, which could request additional documents and additional information from individuals involved with that program.

In particular, in November of last year, 2021, the FTC came out and said that they were going to expand their criminal referral program to stop and deter corporate crime. And I think this is particularly interesting here because they made this policy statement that they really want to start holding corporations and their executives accountable for crimes that the agency uncovers. They are aware that they have civil authority, but they're going to try to really increase the number of times that they refer things to the DOJ for corporate crimes — specifically mentioning Joe Sullivan, as well as some antitrust matters. But given that kind of policy decision from last year, what we're seeing with Joe Sullivan this year, the whistleblower complaint, and Mudge's statements, I have to imagine that the FTC is really looking at all of their options of what might be possible and who else can help them achieve those goals.

Because the FTC has a relatively small budget in comparison to other agencies, and that has drastically limited their ability to go after large players. And so, you look at somebody like Elon Musk, who seems to have $44 billion to spare on a social media company, and to just set on fire…

Riana Pfefferkorn:

And just set on fire, yeah.

Whitney Merrill:

I mean, it's insane to me. You can imagine that he might spend a lot of time in legal battles. So how can you use your resources accordingly? And I think this is one particular reason why the FTC probably leveraged the DOJ in that May order. That was after Elon said that he wanted to purchase the company, and they knew that major social media companies like Facebook, et cetera, tend to push back on these things and have a lot more legal resources than other companies do. So there's a case going on. I don't know what that looks like or when we're going to hear more about it, but I guarantee they're engaging the FTC pretty… or engaging Twitter pretty regularly.

Evelyn Douek:

If there’s anyone there too.

Whitney Merrill:

If there’s anyone there to respond.

Evelyn Douek:

To their emails. Hello, just following up on this.

Whitney Merrill:

They don't have a comms department anymore, apparently, either.

Evelyn Douek:

Right. Did you want to weigh in on any of that?

Riana Pfefferkorn:

Yeah, I mean, I think the FTC is in an interesting position here. This is a very, very fast-moving development. I know when you and Alex were talking on Monday, you were like, "Let's just keep tabs on whether the site is still up or not while we're recording this podcast." It's now Wednesday afternoon and new things are still-

Evelyn Douek:

It’s still up. Yeah.

Riana Pfefferkorn:

… happening. It’s still up.

Evelyn Douek:

Two-factor authentication is down, but the site is still up.

Riana Pfefferkorn:

You know that's required by the order too. But it's an interesting position, where I think that this is a unique situation against a unique individual who has taken over this company. Whitney had mentioned that a lot of the time, companies will agree to settle, and then if they get accused of violating their original consent order, they'll often agree to settle that as well. But sometimes companies push back. And I think the paragon of somebody who is not going to just roll over and settle if there is another accusation of violating the renewed order from May is going to be Elon Musk. The man has flouted and just laughed in the face of the Securities and Exchange Commission before, and just dissed on them publicly even after he and Tesla had both gotten…

Evelyn Douek:

The scariest of the regulators.

Riana Pfefferkorn:

The scariest of regulators. After they had both gotten penalized because he tweeted — never tweet — that he was going to take Tesla private, and then that turned out not to be true. And so he does whatever he wants. He doesn't seem to think very highly of the government. And so, if there's anybody who's going to refuse to just say, "Okay, you guys, we're not going to admit or deny liability, we'll just put this behind us and move on with our lives," it's going to be him. So I think if the FTC is going to come at Twitter, they need to come correct. And so that means, I think, that they're not necessarily going to be looking to enforce for little picayune, Mickey Mouse violations of paperwork-type requirements of the order — like, "Oh, this filing was due after 180 days and you filed it after 181," or something like that.

I think it's going to be that, as Whitney said, they may be trying to get documentation or do interviews or whatever in order to try and find out the actual privacy and security failings that actually affect and harm users. The FTC doesn't have to show actual consumer harm, but that's what they're there to prevent from happening. And I think if they rest on their laurels for too long, there is a risk potentially that there might be another incident, based on the fact that the staff there have just been decimated — their ability to keep things going and to adequately protect user information may be suffering. And we don't want another incident to happen. But I think the FTC needs to proceed very carefully, because they know they're up against this very volatile, anti-government CEO, which is, I think, a very unusual situation for them.

Evelyn Douek:

That's great. Could we dig in on that? Because I want to pick up on that and make it concrete: what are the actual harms? What's at stake here? We're talking about the legal battle and, ooh, Musk might be in trouble with the FTC, but I think it's important to make it concrete about why this matters, not just to Musk and for the headlines. What is actually at risk here? If Twitter isn't complying with this consent order, what's the harm that people could suffer?

Whitney Merrill:

I mean, if they don't have a comprehensive security and privacy program, we can talk about misuse of data that should not be used for a particular purpose. We could talk about… One misuse that I think a lot of people have been talking about openly is that they're afraid Elon will look at their DMs and ban them from Twitter based on that information. It could be a breach. I know that in Mudge's whistleblower complaint against Twitter, he mentioned that Twitter is not actually deleting user data because they don't know where all the data is. And so, if you thought you deleted previously existing data and it's still there, that could be breached. Lots of people have private conversations in DMs — what could be leaked there? To me, the biggest fear is that, yeah, my email address, my name, my phone number — I care about all those things, but I'm actually really worried for the content data that's been living in more private spaces on the platform.

Evelyn Douek:

Yeah. And I guess, it’s also worth noting that that’s not just in the US as well, that this is a global thing. The data issues and the privacy issues and the security issues are things that people, I guess globally, should be concerned about.

Whitney Merrill:

Yeah. I mean, obviously, the FTC is scary, and there are lots of reasons why I think the FTC is starting to become a "scarier regulator" than European regulators. But the 4% of gross annual turnover was the big boogeyman for GDPR, because of the fines that could come down. And we have multiple things that are easy to violate under GDPR that could be going wrong within Twitter: anything from the inability to delete data, to not fulfilling user requests for access or deletion of their data, to misuse of that data to begin with. If Twitter employees are accessing data in a way that they shouldn't, or data is being given or transferred to parties that shouldn't have access to it to begin with, all of those things could trigger GDPR. Breach is a very, very broad term under GDPR. I think people don't really appreciate how much bigger a breach is under GDPR than it is here in the US, where it's generally associated with a security breach. But in GDPR land, it could be just a misuse of data.

Evelyn Douek:

Great. So GDPR is the general data…

Whitney Merrill:

Protection regulation.

Evelyn Douek:

Protection regulation, which is Europe’s massive-

Whitney Merrill:

Europe’s massive… Yeah.

Evelyn Douek:

Privacy-

Whitney Merrill:

Exactly.

Evelyn Douek:

… regulation. And so you are saying that it’s much broader and potentially high liability under that?

Whitney Merrill:

Yeah, it's mostly going to be civil. We're talking civil penalties. We've seen some large penalties come out against major players — Facebook, Amazon, Google, et cetera. In the EU, there's this idea of having a single authority that can actually go after you for violations of GDPR. That's under challenge, so there's this idea that potentially every regulator of every European country could come to you for 4% of gross annual turnover. We haven't seen that happen in practice. Generally, it's one regulator, like the French CNIL, which is basically their FTC, or Ireland's DPC, which again is their data protection authority. So we'll see. I mean, we'll see what kind of teeth they come out against Twitter with. They tend to act on things that are very clear-cut problems, or they try to push a particular… I don't know, agenda's probably not the right word. But the popular thing happening right now is they don't like Google Ads and Google Analytics. And so there have been a lot of actions under GDPR as they relate to the transfer of data to the US and Google Analytics data collection. So we'll see what happens. I don't know.

Evelyn Douek:

Excellent. The regulators are-

Whitney Merrill:

Chomping at the bit.

Evelyn Douek:

Exactly. Rubbing their hands. Chomping at the bit. All of those are metaphors. I wanted to ask you if you had anything to add on substantive concerns that you might have about security breaches at Twitter — things that just might make this more real for listeners, about why we should care about this beyond the fun or the potential liability.

Riana Pfefferkorn:

I mean, it seems like Elon's plan for Twitter is, "We're just going to pivot to being some sort of payments-focused app like WeChat in China." And it's like, "Okay, so your plan is, Phase 1: get rid of all of your privacy and security people by firing them or just telling them you have until 5:00 PM today to decide whether you're with me or not" — which is the latest development.

"Phase 2: add a bunch of payment systems where we're going to collect financial information from all of these users." And then where's the Phase 3, where those two things connect in a safe and secure way? As if you're not getting under fire from enough regulators already, you want to bring in everybody who gets involved when it comes to financial payments — which I know Whitney has had to think about in some of her past roles as well.

And this is all operating against a background of risks from insider threats, like Whitney mentioned — people are wondering, "What if Elon just goes trolling through people's DMs?" But there was already a prosecution — an indictment against a couple of Twitter employees who were accused of secretly spying on Twitter users for Saudi Arabia and misusing their access as internal employees within the company. And that's an example of the kind of insider threat that malicious employees can potentially pose to users.

And we've seen these kinds of things happen at various companies. There was a guy who worked at Yahoo who got convicted for hacking into people's Yahoo accounts in order to look for nude photos of women. These sorts of examples are legion. And so it's not even just about whether the system is going to be secure against external threats, but what are the worries about internal threats?

I'd sort of advocated that the first thing Elon should have done was end-to-end encrypt DMs. A very, very difficult thing to do, but the Saudi Arabia incident pointed up what is at stake in those situations. And as you said, we're not just thinking about users in countries that really care about the rule of law and democratic practices; we're talking about people in all the countries of the Arab Spring uprisings, where Twitter was an important part of that as well. And for people there, if their data is released, it's not just a matter of, "Oh, I need free credit monitoring for a year." It's a matter of, "Am I going to go to jail and get killed?"

Evelyn Douek:

Right. For sure. And I think a really important thing not to lose sight of here is that, in some ways, yes, this is an American thing about the FTC as an American regulator, but these systems, these security processes, are important for global security, and for very real reasons. So, thank you.

I want to talk about the broader architecture here of this as privacy regulation — again, Elon gives us an opportunity to talk about privacy more generally. So not just focusing on this issue, but the architecture of privacy and security regulation in the US. As we've discussed, in this case Twitter was making representations in its policies about how it was going to protect data, and breached them; it entered into this consent order with the regulator around deceptive practices; and then it was breaches of that consent order that led to enforcement actions — rather than some statute out there that says, "You should not do X, Y, Z with people's data because that would be bad." So Whitney, is this a good example of an excellent regulatory scheme to protect users' privacy? It seems like there's something happening here where, if Twitter does bad things, it may face bad consequences. Is this the system working?

Whitney Merrill:

Yes and no. I mean, I think the FTC's authority has been broad enough to adapt with change over time. The FTC Act is from 1919 or something, 19… I should know this and I apologize for not knowing this off the top of my head, but it's been around a while. And if you think about how it's adapting to our technology for enforcement, I'm impressed.

Evelyn Douek:

1914.

Whitney Merrill:

Oh, '14. I was close. Okay. So in that way, I think there are some things working. On the other hand, this deceptive prong has led us to a notice-and-consent model: you can bury things in your privacy statement, and therefore you're arguably not deceptive if you then do something else with the data. But we're seeing that shift and change. The FTC is adapting and saying, "Well, you have to look at the totality of the circumstances. You just can't rely…" I mean, it would be nice — I think a lot of us are clamoring for a federal privacy law. We need some general rights and guidance. One reason: just to handle data breaches alone, we're living in a 53-state-and-territory regime for data breach notification, and it's vastly narrower than what the GDPR requires of a data breach notification. The vast majority of leaks of data that could be harmful — my private DMs, say — might not trigger data breach notification laws in the United States, because a lot of those laws are tied to usernames and passwords.

They cover sensitive information like my driver's license number, my social security number, financial data, credit card information, et cetera — they're not thinking about things like your private messages. So we need to change the laws there, because people aren't going to do anything about what they don't know about. And as a result, the FTC is stuck only finding out about things from a privacy or security standpoint when it either makes the news or somebody tells them about it. And that can be a researcher — there have been researchers at Stanford, I know, that have reached out in the past about particular things they've discovered as they relate to consumer protection laws. So I think we need more. We need a better framework. And in particular with the FTC, some people want there to be another authority.

I'm a big fan of: the FTC's been working for a really long time, give them more authority, give them more budget, and let them do their thing. And I think they're going to land in the right direction. That can be pretty controversial depending on who you're talking to. But I think if they had more clarity, they wouldn't have to take the kinds of creative measures for enforcement that you're starting to see come about. One in particular, I think, is them leveraging the DOJ; the Joe Sullivan case as a statement is another. And then most recently, a month and a half ago or so, there was a decision against Drizly, the alcohol delivery app. They had a data breach in, I don't know, 2016 or something. The FTC held the CEO personally liable for the failures of the security program.

And so, they're showing that they're going to start holding individuals liable — that CEO, the Drizly CEO, his name's Cory, will have to implement a data security program at every company he leads or has a C-level title at in the future. And so you now have a target on your back. I don't know what that means for D&O insurance. And now, you're starting to see… If I were Elon, I would definitely be worried that the FTC is looking and saying, "You've made statements that you don't care about us. You've made statements that you don't believe this order has any enforcement, or that we're not as scary as going to space, or whatever it was." I sure might think maybe we need to bring Elon personally into the discussion, since he's taken the company private.

Evelyn Douek:

It seems like there's a lot of pressure on the FTC here to do something, given these statements that "No CEO is above the law." I mean, they're going to have to back that up somehow in this situation, where it seems like the entire security team quit. If the FTC doesn't take action in this case, I think it would raise some questions, like, "If not here, then when?"

Whitney Merrill:

And you see these letters from Congress, congressmen, “FTC, do something. Investigate this thing,” and you’re like, “Give them a budget, give them people…”

Evelyn Douek:

Give them monetary penalty authority from the get-go.

Whitney Merrill:

Right. Monetary penalty. People… Okay. For those listening, the FTC cannot fine on the first offense. For the vast majority of the situations that they're in — if it's just an unfair or deceptive practice, and that's what you're going for, deception being the popular one — you cannot just go and issue fines. You have a consent order. I actually think the consent order's scarier, because the more you leverage that to enjoin and create certain behavioral requirements on the company, I think the better off you are.

In comparison, with the GDPR, they're just putting out fines. They may be issuing guidance and saying, "Hey, don't do that again." But they're not entering into some injunction with the company, going forward, to comply, where they have auditing ability. And I think the FTC is really leaning into that, knowing they can't fine, and saying, "Okay, let's really make sure you do things like implement a 2FA program that does not require somebody to hand over their phone number." That's a pretty specific requirement that was in the May 2022 consent order from the FTC. So I think they have a lot of creative abilities here, but I think they're also going to need some help to do it.

Evelyn Douek:

Riana, do you think this is an example of the system working? In a perfect world, is this how we would be doing this, or do you have a wishlist of how we might do this somewhat more effectively or efficiently?

Riana Pfefferkorn:

I mean, I agree with Whitney that it's been impressive to see how the FTC can do so much with what little they've got. Kind of like Twitter, actually — the FTC is really small. There aren't that many people that actually work there, for an entity that is the consumer protection watchdog for 330 million people. And so they have had to get creative.

I think also, in part because, like I mentioned, they were told by a federal appeals court, "You've got to spell out what you mean when you're talking about proper, reasonable data security practices," the FTC has been induced to get a lot more specific. And I think that level of flexibility is actually helpful in the particular context of privacy and data security, because what counts as best practice for those things changes all the time. This is a really difficult tightrope to walk in regulating those areas, because if you are too vague, nobody knows: "Well, what am I supposed to do? What am I not supposed to do? What's the buffer zone between reasonable and unreasonable data security practices?" — where "reasonable" is the watchword under the FTC Act and under various state laws. That's usually the standard, and it doesn't really say what it means.

If you get too specific in a statute that spells out, "Well, you have to do this, this, this, and this" — well, time moves on, and what yesterday was the best practice is today deprecated. And then you're trying to enforce laws that have not kept up with technology, as is usually the case. And so being able to have some amount of flexibility, I think, is helpful. With that said, it's been frustrating to see how we keep moving the football down the field a little bit each Congress to get some sort of national, federal-level privacy law or data security law, and it just never happens.

It's also a compliance burden on companies to have to comply with so many different regulators in the room, where not only do you have more than 50 state and territorial entities, you have various federal regulators — not just the FTC, but the Department of Financial Services in New York if you're a financial company. And California has our own little mini-GDPRs that we've been trying to pass — or at least that the billionaires who made their money off of real estate have tried to get passed — on top of European regulations, et cetera. And so to the extent that those could be streamlined and not be in conflict or in tension with each other, I think that would be beneficial for companies as well. But this is just such a weird area where, unlike a statute that says, "Don't murder somebody. Don't lie," saying "do data security good and do privacy good" is actually a lot harder.

Whitney Merrill:

Yeah. I'll also add: unlike other places in the world — the EU comes to mind — we don't really have a "don't do this with data" rule. It's up to companies to define what the governance policies around the data are. And so, as a result, it's not that you can't sell data; it's more that we're putting restrictions on it: "Well, if you sell data, then these are the rights being tacked on top of it."

And I think what a lot of people would like to ask is, "What are certain types of data processing, of certain types of data, that should not happen?" We need to think through what things are just never okay. And I think that would also help from a federal legislation perspective: what types of data processing are not okay? A popular one in the EU is that, as an employer, you can't ask your employees for certain types of sensitive information. Are there things like that in the US? Can you not do certain things with health data? We haven't really even gotten that far. We're still in the "Well, how do we put restrictions on companies for things they're already doing" stage, instead of thinking about what the best practices are moving forward.

Evelyn Douek:

Okay. Is there anything else that I should have asked you about, or that you want to add or subtract, or comment on at all, before we close out?

Riana Pfefferkorn:

Trying to keep up with not just Twitter, but anything that happens in our respective lines of work, is so hard. Trying to teach or comment on the law having anything to do with the internet is like watching a big, wet, muddy dog come bounding into the kitchen and shake itself off vigorously, like 15 minutes after you had just mopped the floor. As soon as you've gotten on top of one recent development, something else happens and you have to go back to the drawing board. And so, this is such a rapidly evolving area that I think gives us great job security to keep pontificating-

Evelyn Douek:

[Inaudible 00:49:10] program. Yeah, definitely.

Riana Pfefferkorn:

… on the podcast forever. And I want to plug my friend's bingo card while we're here. My friend Sumana Harihareswara put out a bingo card yesterday, and I think 24 hours later already had to cross off three squares for predictions of what Elon Musk is going to do at Twitter going forward — like instituting a real-names policy, making Rudy Giuliani the CTO or the new general counsel or the head of PR, getting people to dogpile onto the FTC's Twitter account, et cetera. And if you want to have a little fun following along with this ongoing saga at home, the Twitter predictions bingo card is a fun one.

Evelyn Douek:

I like crowdsourced moderation with gamification. I think that's it. We haven't solved content moderation, but it's because we haven't tried that. I think that's definitely going to work.

Riana Pfefferkorn:

That will do it.

Whitney Merrill:

I guess the final thing I'll add is that the consent order is just one piece of where everything is going. Often, with a consent order or some sort of action, the commissioners put out statements. It's worth reading those, digging into those — they give a lot of indication about where they're thinking about taking a program. So if you're listening to this and you're in-house at a company going, "What do I tell people about what I should be worried about," or you're a professor teaching it, look at the statements from the chair or the commissioners on these particular cases. They kind of give the context about what they're thinking about, what they're hoping to see, and where they're going. And I find these to be really helpful and really interesting, because it shows that they're thinking about this beyond just unfair and deceptive practices.

Evelyn Douek:

Well, as Riana said, this is moving very quickly, so we need to push this podcast episode into your feed before it gets out of date. And then we will wait and see which aspect of platform governance and trust and safety Musk throws into the spotlight for our next episode. In the meantime, this podcast is available wherever good podcasts are hiding, including Apple Podcasts and Spotify, and transcripts are available at law.stanford.edu/moderatedcontent. This episode of Moderated Content was produced by the wonderful Brian Pelletier. Special thanks to Alyssa Ashdown, Justin Fu, and Rob Hoffman.