Big Brother and Big Data
Constitutional Conversation with Barry Friedman
In a way George Orwell, the author of the dystopian surveillance novel 1984, tried to capture but never could actually have imagined, policing authorities are collecting vast amounts of information on all of us. Artificial intelligence is being used to make sense of this massive data grab, capable of presenting intimate pictures of our lives. Proponents argue this will make us safer. A remarkable, recently declassified report from the Office of the Director of National Intelligence paints a different picture: "in a way that far fewer Americans seem to understand, and even fewer of them can avoid," data is being collected "on nearly everyone that is of a type and level of sensitivity . . . that could be used to cause harm to an individual's reputation, emotional well-being, or physical safety." ODNI neglects one harm that preoccupied Orwell and should be on all our minds: the risk of totalitarian government. Should the data be collected? Should these AI analytics of our lives be permitted? All of this is of crucial importance, and will be discussed, but there is a prior question that courts and policing agencies are ducking and that needs to be addressed: Is any of this constitutional? And does (or can) our very old Constitution provide any safeguards?
Barry Friedman is the Jacob D. Fuchsberg Professor of Law and Affiliated Professor of Politics at New York University. For several decades he has written, taught, and litigated about constitutional law, civil rights, judicial procedure, policing, and public safety. He is the author of The Will of the People: How Public Opinion Has Influenced the Supreme Court and Shaped the Meaning of the Constitution (2009), which won the American Bar Association’s Silver Gavel Award, and Unwarranted: Policing without Permission (2017), as well as numerous academic publications on topics of constitutional law, the courts, policing, and public safety. He writes frequently in popular media, including The New York Times, Slate, the Los Angeles Times, Politico, and the New Republic. He, along with co-authors from law and the social sciences, published an interdisciplinary set of teaching material, Judicial Decision-Making: A Coursebook (2020). He also is the co-author of Open Book: The Inside Track to Law School Success. He was awarded the Podell Distinguished Teaching Award for his classroom teaching.
Friedman is the Founder and Faculty Director of the Policing Project at NYU School of Law. The mission of the Policing Project is to bring democratic accountability to policing and public safety, and to ensure that it is equitable and effective. He also served as the Reporter for the American Law Institute’s Principles of the Law: Policing.
So welcome, everyone, to tonight's Constitutional Conversation, the first one of winter quarter 2025, and it is my pleasure to welcome Barry Friedman from NYU Law School. Barry has written about so many different things it's hard to describe him. Probably his best-known book is The Will of the People. He has a major field of research on how the Supreme Court responds to popular opinion, and he is one of the famous debunkers of the notion of the counter-majoritarian difficulty.
He says that's not really a thing. He's written in criminal procedure, constitutional law and theory, almost everything, and in the last however many years he's turned his attention to policing and has done quite a lot of work on possible controls on police misconduct.
He's also written about progressive constitutionalism. And tonight he is going to be speaking to us about a problem that almost everybody thinks about at least some; some of us worry about it more than others. It is big data and Big Brother. So, welcome, please, Barry Friedman from NYU.
Thank you, Michael. I need to thank Michael for more than the introduction: not only the invitation, but the truth is that Michael chose the title of my talk. And my guess is that if we had used the title that I had chosen, there'd be half as many people here. So I'm not going to tell you what my title was, but Michael did better.
Thank you all for coming. It's great to see you here. I was trying to figure out whether we call it evening at five o'clock, but whatever hour this is, let's start with a story. We could call this the story of two Davids. So, on March 10th, 2020, David Zayas was driving his gray Chevrolet Equinox down the parkways of Westchester County in New York, which is just north of New York City, when he was stopped by Westchester police officer David DiRenzo.
The ostensible basis for the stop was that Zayas had crossed the lane line without signaling and was exceeding the speed limit. But that's not why Zayas was stopped. Zayas was stopped because an AI algorithm indicated that he should be stopped. This is how that works. Westchester has a real time crime center, which, if you've seen one, is a room with big screens, kind of like this one, and officers sitting at desks looking at data.
And as part of that real time crime center, Westchester has one of the largest collections of automated license plate reads in the country. Automated license plate readers, as the name suggests, capture the license plates of cars as they go by. They're used for two reasons. One reason is to compare the plates to a hot list and see if there's some reason that the car is wanted, whether it was stolen or there are unpaid fines. But the more important reason for our purposes is to put them into a database and store them. All of these license plate reads are geolocated and time-stamped, so they can be used to trace the movements of the individuals who drive the car. In fact, I don't even call them license plate readers anymore.
We call them vehicle surveillance systems, and if you have a better name that's neutral but captures what they're about, I'm happy to hear it. AI has transformed even automated license plate readers completely. As dedicated digital license plate readers, the devices were quite expensive to purchase, and they captured only the license plates.
But with AI, every camera can be an automated license plate reader, much more information can be gathered, and cars can be searched for by all of these other traits; in fact, I've seen these systems in operation. So, what happens is, Zayas is driving through Westchester, and in 2021 Westchester had partnered with a company called Rekor, which used AI and traffic information to do things like figure out traffic patterns and optimize the way that roads are used, but also to help law enforcement with their programs involving license plate readers, and in particular to do something they call interdiction analysis.
So what Rekor claims to be able to do, and what happened with Westchester, is this: because Zayas had made a number of quick trips from his home in Massachusetts to New York City and back, combined with other information, including the fact that Zayas had a criminal record, it was determined that he was trafficking in drugs.
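How the vendor's interdiction model actually works is proprietary, and that opacity is part of the point of this talk. Purely to make the flagging logic concrete, here is a toy sketch in which every feature name, field, and threshold is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class DriverHistory:
    # Hypothetical features; the real system's inputs are not public.
    short_round_trips_30d: int    # quick out-of-state round trips, past 30 days
    has_criminal_record: bool

def interdiction_flag(history: DriverHistory, trip_threshold: int = 5) -> bool:
    """Toy 'interdiction analysis': flag a driver whose travel pattern,
    combined with a prior record, crosses an arbitrary threshold.
    Invented logic, for illustration only."""
    return (history.short_round_trips_30d >= trip_threshold
            and history.has_criminal_record)

# A driver with many quick trips and a record gets flagged...
print(interdiction_flag(DriverHistory(9, True)))    # True
# ...while the same travel pattern without a record does not.
print(interdiction_flag(DriverHistory(9, False)))   # False
```

Even this toy version makes visible what the talk emphasizes: the flag is generated from bulk movement data about a person suspected of nothing in particular.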
And so, when he came onto the parkways of Westchester, an alert went to the real time crime center. That alert was sent to Officer DiRenzo, who was not on regular traffic duty (that's not why Zayas was stopped) but on proactive enforcement, with his drug-sniffing dog, named, either ironically or not, Liberty.
I could argue that both ways. They came up to the car, Zayas seemed nervous, Liberty signaled that there were drugs in the car, and they found 112 grams of crack cocaine, $34,000 in cash, two automatic weapons, and a lot of ammunition. Wow, somebody said. So, great, right? We're using modern technology to identify criminal behavior and do something about it.
Of course, like many stories, this one has a dark side as well, which is that the algorithms don't always work perfectly, the data sometimes is bad, and the officers make mistakes. It turns out, and this still mystifies me, that different states issue the same license plate numbers, and the license plate readers can't always discern what state a car is from, so mistakes happen because of that.
And when these mistakes happen, the consequences are often quite tragic, like a situation in Aurora, Colorado, where an entire Black family was held at gunpoint on the pavement, a case that was settled for $1.9 million. Or when the police stopped a car in Oakland that happened to be driven by the head of the Oakland Privacy Commission and his brother. Brian is here with us tonight; they were also held at gunpoint.
And so the question arises, and what I want to talk to you about tonight in one way, is whether we should use these tools. Now there are some people who say, great, we’ve got the technology, let’s make the most of it that we can. Let’s get out of the way of innovation. And there are other people who say, no, let’s ban it all.
And I'm neither of those people. My perspective is also the perspective of the Policing Project, which I started about ten years ago; our executive director is here, right there. Max has just arrived from New Zealand, so forgive him if he seems groggy, but he's a graduate of the college here at Stanford and of NYU Law School.
We take a position that says: look at the benefits these new technologies afford society, but also have a clear-eyed view of the harms. And then not just weigh them, but actually figure out whether, through regulation, we can maximize the benefits while minimizing or eliminating the harms.
And I've written a great deal about the regulation of these technologies, including a new article in the Virginia Law Review with Danielle Citron. But that's not what I want to talk to you about tonight. What I want to talk about tonight is a question that I would have thought I'd have stopped and asked at the beginning of this project, rather than perhaps the end.
Is any of this constitutional? But I hadn’t, and when I did it turned out to be remarkably difficult. So in the first part of the talk tonight, I want to talk to you about indiscriminate data surveillance, which is my topic, and explain what it is and how it works. In the second part of the talk, I’m going to stop and ask the question, is indiscriminate data surveillance constitutional?
And my answer is no, it’s not. And then in the third part of the talk, I’m going to try to revive it. And argue that if indiscriminate data surveillance is properly regulated, then there is perhaps a route to finding that it’s constitutional. So let’s start with the question of what is indiscriminate data surveillance.
So when I talk about indiscriminate data surveillance, what I mean is that the police are capturing data on all of you, all of us, whether we’re suspected of something or not, and storing it away in bulk. Zayas’ lawyers said, this is indiscriminate by design, they’re searching without any reason to suspect any individual, without any connection to any particular criminal investigation.
In fact, they, in effect, searched the hundreds of millions of records on people in the Westchester database. Westchester brings in 16 million license plate reads a week. It holds them for two years. The database holds 1.5 billion license plate reads, and in fact Westchester shares those reads with other agencies that are part of its network. This is the kind of searching that Justice Kennedy called searching of a citizen who is accused of no wrong. This is from Zayas's brief: searching that occurred without a suspect, unconnected to any particular crime. It was a digital dragnet. Now, back in the day, when I started teaching criminal procedure, there was a way that criminal investigations worked.
It's a point that Carrie Leonetti made in an article: we started with the suspect, somebody about whom there was a tip or a lead that they had done something wrong, and then the police engaged in surveillance to figure out whether there was in fact evidence that they'd violated the law. But in the day of indiscriminate data surveillance, we've flipped that.
Instead, what we have is surveillance of all of us, looking for suspects. Now, to understand how we should feel about this, we should ask: how do the police get this data, and what data are they getting? They get the data in two general ways. One thing they do is collect it, like I just explained, using visual surveillance, vehicle surveillance systems, or mobile forensic data terminals, which are machines that can, in a matter of seconds, suck in the entire contents of your cell phone; and then there are wonderful analytics that map out all of your contacts, who you communicate with, and in what ways.
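The analytics inside commercial forensic tools are proprietary, but the kind of contact mapping described here is easy to sketch. Assuming a simplified call-log format, invented for illustration, a few lines suffice to tally whom a phone's owner talks to and over which channels:

```python
from collections import Counter

# Hypothetical extracted communications log: (contact, channel) pairs.
call_log = [
    ("alice", "sms"), ("alice", "call"), ("alice", "sms"),
    ("bob", "call"),
    ("carol", "app"), ("carol", "app"),
]

# Tally how often, and over which channels, the owner communicates
# with each contact: the core of a "contact map".
contact_map = Counter(call_log)                         # per (contact, channel)
frequency = Counter(contact for contact, _ in call_log)  # per contact overall

print(frequency.most_common(1))   # [('alice', 3)]
```

The point is that once the raw contents are extracted, ranking a person's closest associates is trivial; the hard questions are legal, not technical.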
There are local DNA databases. You may well know that the federal government maintains a system called CODIS that records the DNA of people who have been convicted of crimes or arrested and compares it to cold cases. But many local police departments and prosecutors have gotten frustrated with the limited amount of DNA in that database, so they're collecting it on their own: trailing somebody into a McDonald's, grabbing a straw, getting it tested, putting it in their own database.
But more important than the ways that the police collect the data is that, increasingly, they just buy it. They buy it from vendors, or work with vendors like Rekor to develop these databases. What do they have on you? Well, start with data brokers: companies with names that will be familiar to the law students in the audience today, because they publish our books.
West and Thomson Reuters claim that they've got 10,000 data points on each and every one of us. There's location data, including visits to mosques or to abortion clinics; the FTC just disciplined a company named Venntel, which was collecting this kind of information. A company called SafeGraph claimed on its website (the claim is no longer there) that it could track where we go, how long we stay, and what we do when we get there.
There are companies like ShadowDragon that collect dating app information, menstrual cycle tracking information, health information. Companies like Dataminr and Giant Oak (you have to love all of these names) collect all of our social media data and analyze it. And there's even a company called SpyCloud that acquires data that's been hacked and sells it to policing agencies.
Of course, all this data was less valuable before AI managed to help police departments make sense of it. It’s AI that can chew through all of this data and analytically figure out who they think might be a suspect. Now, this situation is virtually unregulated. One of the most regulated forms of indiscriminate data surveillance is license plate readers.
Some 16 to 18 states, depending upon how you count, regulate automated license plate reader usage. Most of the things I just described to you are not regulated at all; there is more regulation here in California than pretty much anywhere else. And so, the question is, how should we feel about it? Well, let's start with the risks, and then we'll move to the benefits.
So, I've already talked to you about the data errors, about the ways in which things can go wrong. There's also misuse. It turns out that when a police department has a big database with all of this information, sooner or later there's some officer who thinks it would be useful to use it to figure out what's going on with a romantic partner. That happens quite often. There's hacking: I was surprised to find, once when I went looking on the web, that policing agencies get their data hacked all the time and held in ransomware attacks, where they have to pay to get their own databases back. All of this, I think, is nothing compared to the privacy interests at stake in collecting and holding all of this data on all of us.
There's racial bias, because aspects of these databases were built in racially discriminatory ways, because of policing in vulnerable communities; gang databases, for example. There's chilling of rights: you might think twice about going to the mosque or to the abortion clinic if you knew that that was being recorded, or to a protest, a Black Lives Matter protest or a pro-whoever-you-want protest.
And I also think a lot about the authoritarian risks. Some people tell me that, you know, I'm being a little bit histrionic, that there isn't that much of a risk. But I was in France recently teaching about this same thing, and I came across this memorial, which is extraordinary and extraordinarily haunting.
If you're in Paris, I cannot recommend highly enough going to see it. It's the Memorial to the Martyrs of the Deportation, erected to recognize the rounding up of Jews and of resistance members and their shipment out of Paris to Nazi death camps. And France is not the only country that's done things like that.
We, in the early part of the twentieth century, rounded up people who were accused of being communists. We, of course, rounded up Japanese Americans and put them into camps. And one of the things that you learn about authoritarian and totalitarian governments, governments dedicated to rounding people up and shipping them anywhere and doing anything to them, is that it's really useful to have a bunch of data about all of us.
So I worry, in part, about having all of this data stored away and accessible to people in government who might make bad use of it. Now, that's only half the story, as I said. You might think, well, those are some serious risks, and we shouldn't do this, and some people believe that. But there are benefits to these technologies as well, or at least there are potential benefits.
So one of them, and perhaps the most common, is retrospective investigations. So if somebody’s been shot somewhere, or there was a robbery of a convenience store, and you want to find the person that did it, one way to do that is to go to the automated license plate reader database and find out which vehicles were proximate at the time, or any one of the other databases.
Another is deterrence, which is the idea that if you know the government's got all of this information on you, you're going to keep your nose clean. That, in fact, is part of the logic of the DNA databases: people who are in them know that if they commit crimes and leave their DNA behind, they're going to get fingered.
There’s prevention, which is that analyzing all this data taken together might help prevent a terrorist incident or some other crime. And I want to stress, by the way, that I’m sort of cabining terrorism and national security off from the rest of this and talking mostly about just regular old police investigation.
But we could talk about the other in question and answer. And finally, none of that's what happened to David Zayas. What happened to David Zayas was that an algorithm, based on his driving patterns, predicted that he was engaging in drug trafficking. Now, I'd like to stand here and tell you that we could weigh the benefits against the costs and then talk about regulation, but we can't really do that.
And the reason we can't really do that is because the policing agencies are incredibly non-transparent about what they're doing. Most of what we know, we know because investigative journalists figured things out and published stories, or because freedom of information requests were granted that taught us things.
And not only do we not know often what’s going on, but we certainly don’t know how well it works. So, for example, we know they got David Zayas, but what we don’t know, and what we virtually always don’t know, is how many other people were stopped who didn’t have drugs, but were targeted by the same algorithm.
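To see why that unknown number matters, consider a purely hypothetical base-rate calculation; every rate below is invented, since the real ones are undisclosed. Even an algorithm that sounds accurate, applied to a huge pool of mostly innocent drivers, flags far more innocent people than guilty ones:

```python
# All numbers are hypothetical; the real rates are exactly what
# agencies do not disclose.
drivers_scanned = 1_000_000
traffickers = 100                  # assume 0.01% of drivers are trafficking
true_positive_rate = 0.90          # algorithm catches 90% of traffickers
false_positive_rate = 0.01         # and wrongly flags 1% of innocent drivers

true_hits = traffickers * true_positive_rate                        # 90
false_hits = (drivers_scanned - traffickers) * false_positive_rate  # 9,999

# Of everyone flagged and potentially stopped, only a tiny
# fraction is actually guilty.
precision = true_hits / (true_hits + false_hits)
print(f"{precision:.1%}")   # 0.9%
```

Under these invented assumptions, over 99 percent of the stops triggered by the algorithm would hit people carrying nothing, which is precisely the figure we never get to see.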
So, that's the question of, you know, benefits and risks and whether we should regulate. Sorry, but I promised you something else, which is that we're going to talk about the Constitution. In a minute, I'm going to show you the Fourth Amendment. The Fourth Amendment has two clauses, and I'm going to show them one at a time; I'll pause and give you a chance to read each clause, and then we can talk about them.
You’ll see in a moment why I’ve separated them.
That’s the first clause. There’s the second.
Now, the second one's easy enough. It basically is an instruction manual for what it takes to get a warrant, which is that you have to have probable cause, a certain amount of suspicion supported by information, and a clear, particular description of what it is that you're going to search or seize. But I want to focus on the first clause for a moment.
The first clause says: no unreasonable searches or seizures. And this clause suggests that there are two questions we need to answer. The first concerns indiscriminate data surveillance, and particularly the collection (I'm going to focus on the collection, though we can also talk later about the querying of the data):
Is the collection of all of this data on all of us a search or a seizure? And the second question is: if so, is it unreasonable? That's typically the order in which we talk about them. But I'm going to reverse the order and talk about the search-or-seizure question last. There's a reason I'm going to do that, and I just want to explain why.
It's because of what I call the all-or-nothing problem. The question of whether something is a search or a seizure is incredibly complicated. If I were giving you a legal lecture about the search-or-seizure problem, I could probably talk to you for three times as long as this lecture will be. But my intuition is that it's complicated not because figuring out whether something is a search or a seizure is particularly hard, but because of the consequences, in the law, of calling something a search or a seizure.
So let me explain that. In a case called Katz v. United States, which was about the wiretapping of a phone booth used by somebody engaging in interstate gambling, the Supreme Court decided that searches and seizures that don't have warrants are per se unreasonable: they can't be done, or they can't be done constitutionally.
And because of that tying together of those two separate clauses of the Fourth Amendment, we have the all or nothing problem. If something is a search or a seizure, then the police can’t get the evidence unless they have probable cause to go get it. But one of the things about indiscriminate data surveillance is that it’s used typically to get probable cause.
It’s a tool for the police to get enough suspicion to then go get a warrant. So if you can’t get the information without a warrant, the police just can’t get the information at all. On the other hand, if it’s not a search or a seizure, then under the Constitution now, there’s no constitutional protection at all.
The police can gather all of this information and do as they will with it. And there’s no constitutional supervision. And so it’s an all or nothing problem. And I’m going to suggest we can solve the all or nothing problem by simply requiring regulation before there’s indiscriminate data surveillance.
And if I lower the consequences, then I'm hoping that the search and seizure question is going to get easier in everybody's mind. Okay, let's move to the second question. Is indiscriminate data surveillance constitutional? Is it reasonable or unreasonable? Well, here, what I want you to understand is that there are two different ways through the Fourth Amendment.
You’ve only seen one of them so far. The one you’ve seen and the one that I’ve been talking about is what we can call suspicion based policing. We think one of you did something and being good officers, we’re going to go out and try to get more information to figure out whether that’s true. And that’s the kind of searching that we talk about when we talk about warrants and probable cause and particularity.
It’s the kind of thing you see very often on procedurals on television. But if you stop and think about this model of policing, which is kind of the traditional model, there’s a problem. I’m guessing many of you have gone through airport security. I’m guessing you didn’t stop and think about whether it was constitutional or not.
I’m here to assure you that it was a seizure, and it was a search, and there was no warrant or probable cause or particularity. And so you might scratch your head and say, how can they do that? And some people do scratch their heads and say, how can they do that? But there’s an answer which takes us to the second route through the Fourth Amendment.
And so, remember there are these two clauses, and it's the second clause that's the problem. But some people, including Justice Scalia, say that the Fourth Amendment doesn't require warrants. It just requires that searches and seizures be reasonable. And you may be wondering, what's the second clause doing there?
Give me a second and I'll tell you. But because of that claim that all the Fourth Amendment requires is that searches and seizures be reasonable, there's a second category of searches and seizures called special needs searches. They're what's used to uphold suspicionless policing, and that's what we're talking about today:
policing people without any suspicion that they've done anything wrong. In that situation, under the Supreme Court's special needs test, if it's determined that there's a special need (I'll get to that in a minute), then they balance. The justices, or the courts, just sit back and say: look, the government's doing this thing.
Is the need to do it greater than the intrusion on the individual? And as you might guess, the answer is almost always yes. So, two routes through the Fourth Amendment: suspicion-based and suspicionless. With those two routes in mind, the question that we come to is... oh, I should say, special needs is used for things like airport security and sobriety checkpoints; you might have wondered about that one.
Also administrative searches and random drug testing; that's what's used to uphold all of these things. Okay, with these two routes through the Fourth Amendment, the question that we want to ask is: is any of this constitutional? Well, we know that indiscriminate data surveillance isn't constitutional under the first route.
That's easy, because there isn't any probable cause or particularity. That's the whole meaning of indiscriminate: it's not happening based on cause. It turns out it's not constitutional under the second either, because there's a catch that I didn't tell you about, which is that the Supreme Court's doctrine about special needs searches says they have to be for something that is not ordinary law enforcement.
They can't be used for regular old law enforcement. But that's exactly why the police are doing this. So we have two ways through the Fourth Amendment, and under neither of them is indiscriminate data surveillance constitutional. Now, depending on your proclivities, you may be sitting there thinking: wow, there are these huge databases full of all this data, and you're sitting here telling us that all of it's unconstitutional.
I am. And in fact, I'm telling you, it shouldn't surprise you one bit. That's why the Fourth Amendment was written: to prohibit indiscriminate searching and seizing. That was the point. It started in Great Britain. That's John Wilkes, if any of you know the history, and general warrants, which were warrants given to government officials to go out and search for whatever was the object of the search.
John Wilkes was one of many people accused of seditious libel against the crown. And with a general warrant, two things were true. You could search anyone and everyone for evidence of the thing you were off looking for. And the officer was completely immunized from any liability for anything they did.
Destroying people's property, invading their privacy: it didn't matter. If you had a general warrant, you were golden. On our side of the Atlantic, the similar tool was the writ of assistance, which was used by crown officials against our framers, who were notorious smugglers and people who refused to pay the taxes that Great Britain was imposing, as many of you know.
And this is James Otis arguing the Writs of Assistance case in Massachusetts. He's the guy who famously said in that speech, a man's home is his castle. And John Adams was sitting in the audience watching the argument and years later said: this was the first act of opposition to the arbitrary (by the way, watch that word, arbitrary) claims of Great Britain.
Then and there the child Independence was born. And in case after case after case, the Supreme Court says: no indiscriminate searching. That was the evil that the Fourth Amendment was aimed at. Now, you may wonder why. What was it about general warrants? Though it's probably occurred to you already: the problem with general warrants was that they allowed government officials to target people arbitrarily (that's the word; I'll show you a lot more quotes in a minute), in an unjustified way.
This was the age of reason, and just picking and choosing people on a whim to search them did not sit well with the framers. The Supreme Court has also said, over and over, that the point of the Fourth Amendment is to avoid unconstrained discretion in government officials. Arbitrary decision-making.
So the Fourth Amendment was written precisely to prohibit the thing that police are doing now. I could stop here. If and when the Supreme Court has to resolve this issue, they may stop there. You might ask, why am I going to go on? We could all get to the reception faster. Two reasons. The first reason is that I'm not certain that none of this is valuable, or, to say that more affirmatively:
it's possible that these tools can be used to keep all of us safer. I don't know. Like I said, there's a serious lack of transparency and a serious lack of analysis; I wish I did know. But my other reason is that I actually lack confidence that the Supreme Court, when it has to resolve this question with entities like the NSA and the DEA on the other side saying yes, is going to be willing to say that the whole kit and caboodle is unconstitutional.
And if they don’t and they have to figure out a way to uphold it, I’m going to worry that they’re going to decide that none of it is a search or a seizure. And that all of it can happen with no regulation whatsoever. I don’t have faith. And so I’ve tried to figure out a way to understand the Constitution in a way that would require that all of this be regulated.
And as I'm going to argue in this last part of the talk, I think there's a way to read the amendment in just that way. Now, to answer that, we've got to go back to the question: what makes a search or a seizure reasonable? We know we have two routes, suspicion-based and suspicionless, and we know that under the suspicion-based route, as I've argued, this is unconstitutional. But more importantly, I want you to think back to the evil the Fourth Amendment was aimed at: arbitrary searches and seizures.
Well, warrants, probable cause, and particularity solve that problem. That's exactly what they do, right? They say: this is going to tell us how to pick and choose the people who get searched and seized. We're not going to do it arbitrarily; we've got probable cause about you, and so we can get a warrant.
That's why that second clause is in the Constitution. They wrote it to avoid the problem of general warrants. The puzzle is: what is it about the other kind of searching, and the law governing it (the airport test, the special needs test), that protects against arbitrariness? I want you to think about that for just a minute, because my answer is that there's nothing in the Supreme Court's special needs test that does anything to avoid arbitrariness.
It’s a complete misfit for the problem it’s supposed to solve. It asks first, is there something that law enforcement is doing other than ordinary criminal law enforcement? And if not, it’s out. Now that has nothing to do with arbitrariness. I don’t have a problem with the Supreme Court deciding they want stricter rules for criminal cases.
That’s fine, though it’s a problem in this space. And the balancing has nothing to do with arbitrariness. It’s just the whim of judges about whether they think it’s valuable or not. Right? And in fact, it amuses me a little bit that the current Supreme Court, which is originalist and hates balancing tests and tells us that all the time, thinks that the right test replaces the whim of police officers with the whim of judges.
So we’re kind of at a dead end, because the doctrinal test that has the most hope of saving indiscriminate data surveillance doesn’t. But, and here’s the big but: if you read the special needs cases, and I have read too many of them and my research assistants even more, what you find is that if you get past the balancing and the special need and all the talk about purposes other than law enforcement, there’s a bunch of factors in those cases that courts identify over and over and over again as part of what legitimates that kind of policing.
There’s usually a written policy, at least a law enforcement policy but often a statute, that says this is the program we’re going to engage in. A program is good; it’s a way of letting us know what it is you’re doing. In virtually all the cases, the data is collected from everyone. Remember: avoiding arbitrariness.
Think about airport security: we put everyone through it. Or if it’s aimed at a subgroup, there’s a basis in fact for gathering the information from that group. There’s a stated fit or relationship between why we’re doing it and what we’re hoping to accomplish. They can tell a story about how the collection will help solve a social problem.
There often are rules about who can search, often it’s not the police, because remember it’s not to be for ordinary criminal law enforcement. And finally, and this is critical, there’s a way for judicial review, for judges to review it and figure out whether it’s consistent with the Constitution, which in part makes the written policy critical.
Judges can look at the policy and say, does this thing hold up under the Fourth Amendment? I was the principal reporter, that’s what it’s called, and author for the American Law Institute’s Principles of Policing, and we have a whole chapter on this, which goes through all the Supreme Court cases and says, these are exactly the kinds of rules that you have to have if you want to engage in suspicionless policing.
But I want to add just a few more, because I think these are all critical to regulating indiscriminate law enforcement. And you find these in a huge number of the special needs cases. I actually don’t think it’s enough that there’s a law enforcement policy. I think there should be a statute. If we’re going to gather the data on all of us, we ought to have a say in whether that’s going to happen.
There’s actual evidence to support the fact that the thing works, which is what we almost never have with indiscriminate data surveillance. This is important if we talk about searching the databases in the Q&A: there’s a predicate, a rule for when you can go in and look at the data. You can’t just go in and look at it any time you want.
And there are guardrails, safeguards, like minimizing the scope of the data searches, and, importantly, limits on how long the information can be retained. And I think that the Supreme Court could uphold indiscriminate data surveillance as reasonable under the Fourth Amendment if it said there has to be this kind of regulatory regime.
And in fact, I’ll tell you one story as I come to the end here, which is that this is exactly what the court did in another really critical moment in history. It’s a case called Berger v. New York. So remember I told you that in Katz they said searches without warrants are per se unreasonable.
It was a guy talking on a payphone, gambling. Well, New York had a wiretapping statute, and Berger challenged it as unconstitutional. And the state said, we can’t write a statute that, you know, meets the requirements of the Fourth Amendment. Sorry. Like, it’s hard to particularly describe a conversation that has not yet happened.
And the Supreme Court said, hey, too bad. Like, if you can’t write the statute, then it’s not constitutional. But, if you read our cases, we’ve given you enough signals as to what a statute would have to look like to be constitutional. And that’s exactly how we came to have Title III, which is the federal wiretapping statute that does exactly what the Supreme Court said in Berger.
And that’s what I want the Supreme Court to do with indiscriminate data surveillance. I want it to say it’s a search or a seizure (I’m going to spend a second on that in a moment), and because of that, because you don’t have a warrant and probable cause, it’s just flat out unconstitutional unless or until there’s a regulatory scheme in place.
And in fact, I don’t want the court to say any more until it’s got those statutes and can look at them and decide whether it thinks those statutes have enough safeguards to protect all of our rights if the police are engaging in indiscriminate data surveillance. Now, I promised you that I’d say something about the search and seizure question.
I’m going to do it really quickly so we can get to the Q&A; happy to talk more about it once we get there. But you might think, well, of course. Actually, I was going to ask you to vote; I won’t. But I was curious to ask you: do you think collecting all this data on all of us is a search and seizure? 10,000 data points and social media information and, you know, where you go and what you do.
And I’m guessing most people say, yeah, it seems like it’s one of those things. But it turns out that under Fourth Amendment doctrine, anything you do in public, so anywhere you drive in your car, isn’t a search or seizure. Now, those were cases where somebody was looking at a car. I think there’s a big difference between that and 1.5 billion reads of all of our locations. The courts also said anything you give to a third party, if the police get it, it’s not a search or a seizure, which is something in a world in which all of our personal information is sitting on third-party servers. The Supreme Court doesn’t even believe this anymore.
In two of their more recent cases, though they’re not that recent, the court has said, eh, there are limits. You can collect too much information and then it’s a search or a seizure. But the court can’t say how much is too much. It’s driving them crazy. And so, they kind of have ducked the issue. They haven’t taken one of these cases in a long time.
But it’s because of the all or nothing problem that I told you about. They don’t know which way to go. Is one day of data okay, but two days of data isn’t okay? They don’t know. And in fact, they keep saying in their opinions, different justices, we sure wish the legislature would do something about this.
But there are ways to find that this is a search or seizure, and I think it’s easy if you just lower the cost. So one is, the amendment says searches or seizures, and all the Supreme Court cases are always asking if it’s a search. But why isn’t it just a seizure of your data? Others have suggested that. In fact, when you read the cases, it’s interesting, because the cases all talk about seizing data.
Then they go on to ask whether it’s a search. Seems like a misfit. The Fourth Amendment says that you’ve got a right to be secure in your persons, houses, papers, and effects. And the court spends all of its attention on houses, papers, and effects. But why not persons? I don’t feel secure if the government’s got all this data on me sitting in a database somewhere. I don’t feel secure at all, without any protections, any regulation. And papers: why isn’t a lot of this electronic data the, you know, 2025 version of papers? Certainly your emails and your texts. But what about the information that’s in your health app? Or your social media? Or a menstrual cycle tracking app?
This is the kind of thing that, absent the technology, people would be writing down. And in fact, they’re getting all the information off of your mobile devices, effectively. I don’t think the search or seizure question is remotely difficult, if you don’t have the all-or-nothing problem and you’re just asking the question: does this need to be regulated in a particular way?
And in fact, I think it’s essential to do what Justice Scalia promised us we were going to do at the dawn of this sort of technological searching. Thank you. I am done.
You can queue here or on the other side to ask questions. I’m going to go and turn that one on. But just come over and stand over here and we’ll go back and forth.
Come on, Orin. I don’t know whether I want you to go to the microphone or not go to the microphone. Orin knows far more about this than I do. He is one of the country’s leading experts on the Fourth Amendment, and he just joined the Stanford faculty. Thank you. So, Barry, my question for you is, where would you draw the line on searches and seizures?
So we have this doctrine the Supreme Court has announced in many, many cases; we’ve got maybe 70 or 80 cases total on what is a search. And I want to know what you would say is allowed, and therefore what is subject to regulation in your view. So, an officer walking the beat, looking around at people walking by: is that indiscriminate surveillance?
And the question is, when you’re talking about a search or a seizure: an officer follows someone walking down the street who seems suspicious. Is that a search or a seizure? The officer does that for a minute, a day? You know, all these questions: where would you draw the line? Because it sounds like once you change what is a search or a seizure, then you’re subjecting things to Fourth Amendment regulation that have not previously been thought to be part of the Fourth Amendment.
So I’m curious what gets covered under that category in your view. What’s fun about teaching this class, as you’ve just seen, is that any position anybody takes gets walked to a point of total impossibility. So I’m only going to answer part of your question, which is: I am prepared to say something that I think is a dramatic enough change, and then that’s as far as I’m prepared to go.
I’m prepared to say that the collection and storage and querying in bulk of data that’s obtained indiscriminately, like we’re seeing here, that is a search or seizure, and that isn’t okay without a warrant. And the one thing I’ll point out about all of that is it’s programmatic. It’s not the officer that’s out on the beat and just happens to see somebody and wonders what’s up.
These are programs. And under the special needs doctrine, they’re supposed to be written policies. And that’s what I want, which is when you’re programmatically sweeping up all of our data, then yeah, that’s a search or a seizure. And I want the program spelled out in enough detail that a court can say, we think that’s adequate protection or not.
Thank you so much for the talk, very interesting. So my question is, yeah, I don’t know if it’s a question or a comment. Are you familiar with a book called No Place to Hide by Glenn Greenwald? He wrote it like 10 years ago. No, I’m very familiar with the expression but not the book. So, Glenn Greenwald wrote this book, which actually touches exactly the same themes, in the aftermath of the Snowden revelations.
He warned about exactly the same issues that you are. I know the book, I just haven’t read it. So, nothing happened. And then, now you are seeing this activity. There was also, a couple of years ago I believe, Matt Gaetz, who actually denounced that the metadata database was being abused, and blah, blah, blah.
Nothing is going to happen, because Tulsi Gabbard is now supporting it. So why does nothing happen, essentially? Is there any reason, when this has been known for a long time? There was a professor at Stanford, I think now he’s with Princeton, I don’t know. He did his PhD in computer science and a law school degree.
His name is Jonathan, I forgot his last name. Yes, yes, that guy. He did a study of what you can get from metadata: essentially he started with an app on cell phones, and he showed that metadata could be very invasive. That was also 10 years ago, and nothing has happened; nothing seems to be happening even with the new administration. So can you comment on that?
So, I’m going to answer the question you asked and its flip side. First, I’m going to tell you that something has happened, and then I’m going to tell you why more hasn’t happened. So first, something did happen, and it’s really interesting to watch what happened, which is that after the Snowden revelations, Congress finally had its feet to the fire and had to do something, whatever it was going to do.
That was indiscriminate data surveillance: the government was collecting all of our metadata, and the NSA was using that metadata to search for patterns of terrorism, and Congress had to figure out what to do. And despite the arguments from the NSA and the FBI that they really needed to have this database, in the USA Freedom Act Congress actually decided that they couldn’t keep the data, that it had to sit with the cell phone carriers, and that government officials had to get court orders to have a look.
And Danielle Citron and I, in the article that I mentioned at the beginning, argued that whenever Congress has been forced to do something (and by the way, they spend a lot of their time avoiding doing anything), they’ve never upheld indiscriminate data surveillance. So it’s telling what they’ve done when they’re forced to do it.
Now, the better question is, why aren’t they doing something about this? I mean, there’s something called the Fourth Amendment Is Not For Sale Act, which is not moving anywhere through Congress. There are state legislatures that have thought about this. The reason is that the police and prosecutors are, you know, really strong lobbying forces in these legislatures.
And they don’t want to be regulated. Who can blame them? I don’t want to be regulated either, even though on my own I put on a tie tonight. So, you know, nobody wants regulation. That’s the whole point of my argument about what should happen. So long as the police can do anything and stave off regulation, because the Constitution doesn’t require anything of them, they like that state of affairs.
But what if the Supreme Court did what I said? What if the Supreme Court said, no indiscriminate data surveillance without a regulatory scheme? The police and prosecutors would be rushing into the legislatures, begging them for a regulatory scheme. Maybe we’d have a conversation about it. It’s called a penalty default: you switch the default, and all of a sudden you get a different outcome.
And that’s actually exactly what I think should happen. Thanks. Brian. Good to see you, my friend. I feel like we need a bottle of wine to handle this. I remember when you invited us back in 2019, after the San Francisco facial recognition ban. We were on opposing sides there, ban versus regulation, and you’ve helped me evolve some of my positions. But I’m curious how you square this proposal, and maybe you sort of answered it with the statute bullet point, with the Tenth Amendment: you’ve never expressly consented to being tracked.
So it seems like that also means that IDS is unconstitutional on its face. And the consequence, my real question or concern, is that the consequence of going this route seems to be that we would be putting surveillance on steroids: geofence warrants everywhere, when you’ve already got the private and public surveillance networks rapidly expanding as they are, and with more smart city applications to come.
I remember the NAACP, the, you know, legacy Supreme Court case where they were trying to force them to disclose their members. I mean, I don’t even need to worry about that anymore, because I can just drive around your parking lot with a license plate reader; I can use a stingray to capture the cell phones of the members inside. Modern technology has already destroyed those privacy guardrails, and it seems like that would be the consequence of your proposal for those privacy protections we hold near and dear. If we create such a statute, surveillance tech is everywhere, data collection is everywhere. I’m wondering how you square that tension. So, two responses. And again, Brian’s right that the Policing Project that I founded, which Max and I work on, focuses a lot on regulatory schemes, and so we have not been the folks who call for bans on all of this.
We’ve tried to figure out if there’s a way forward. And it’s interesting, by the way, because Justice Rehnquist wrote a law review article years ago in which he talks about, you know, what if the police are just doing what they do, and they record the license plates of people who go to a particular bar or whatnot.
And he comes down against it except under a set of circumstances. So there are three choices, I think, right? One of them is, we don’t use the tools that we have that are being used now. And I ended part two at a place that probably made you happy, which is that it’s all unconstitutional. And I think there’s a really good argument.
I mean, I just kept, you know, saying: wow, that seems to be the situation, it’s unconstitutional. And like I said, you could stop there. Congress stopped there with the USA Freedom Act. Like I said, I’m not sure about the value. There may be value. I mean, certainly law enforcement officials are going to tell you.
National Security Agency officials are going to tell you, the FBI agents, and we’ve been talking to some of the FBI folks, they’re going to tell you: the bad guys are using this stuff, we’ve got to use this stuff. The second state of affairs, the one we’re living in now, is the one I think is completely and totally unacceptable, and I am unequivocal on this: this is just going on like crazy, everywhere, and there’s just no regulation of it at all. I mean, I remember the conversations about whether to ban facial recognition or not, and I understand the point of view of everybody who wanted to ban it, but one of the points I tried to make to them was: it’s happening everywhere.
Like, you’re not succeeding in the ban, and so what you’ve done by insisting on a ban is you’ve just left it open to happen all the time. And this is the third way, which is: you either do or don’t believe that it is possible to write a statute that has sufficient protections in it to get you where you want to go.
One last thing I’ll say, and I’ll just put in a little plug because I’ve been working with all the people in this row here over the last couple of days, including especially Jennifer Eberhardt and Dan Sutton, who are both faculty at Stanford, and our folks from the Microsoft Justice Initiative who’ve been helping us with all this, is that there’s something called a data trust, which I’ve become completely fascinated with.
And a data trust is a way to collect all of this data and hold it, but not let the police hold it. It’s held by some public-private entity, we could talk about that, and there is a set of rules, set out by statute, that governs access by the police: what kind of court orders they need, what they have to show.
And the cool thing about a data trust is that it’s not just the police who could get the data; so could defense counsel, so could researchers, like folks who are in the audience here, with different permissions to access the data. And this is something that’s just starting to happen in the world, often with commercial data.
So that’s kind of another option. I’ve got a paper about that and a paper about this, because that’s what I do: write too many papers. But that’s another option. And there are only so many options. I just think it’s problematic not to step back and be candid about what they are. And I think it’s super problematic to let all of this go on in a completely unregulated fashion.
And that is the world in which we are living today. It’s true that probably the stuff I’m talking about is happening most with federal law enforcement agencies. Most of your state and local agencies don’t have all of this data yet. Some of them do. I’m sure the NYPD does. I’m sure the LAPD does. Maybe the SFPD.
That is the future that we are coming to. It is our brave new world. And the question is how we manage it. Yes, sir. I’m interested in a point that was made by the previous gentleman who asked the question. And that is, Tim Weiner wrote a book called Enemies, which had a lot to do with the FBI and their practices and utilization of technology.
And his conclusion basically was that throughout history, whatever technology was available, government would utilize to surveil. And so we’re going to see that, I would say, continually, because it’s the nature of the beast. And I’m wondering, in terms of regulation: any regulation that you could set up while still allowing the practice is, let’s just say, rather malleable. So I’m going to give you the same answer I gave the previous gentleman, which is that there are only so many choices.
And the world in which we’re living now, which has been the world in which we live most of the time, is that there’s no regulation. And then something really bad happens and everybody gets really upset, and then we maybe try to do something. And I would rather step back and look at the reality of the situation and say, what can we do that’s sensible?
So one of the things we’re doing at the Policing Project is we’re writing a legal framework to govern the use of AI by law enforcement. And, you know, we’ve looked at different models from having a statute and then an agency set up. Because one thing you’re certainly right about is that the technology moves so quickly that it’s very hard by statute to keep up with what’s going on.
You almost need an administrative agency. Another thing we’re looking at is procurement requirements, requiring disclosure of the tools that are being used and certain safeguards in place. But I think the answer is that at some point you know, before something really, really, really terrible happens, we just try to put the safeguards in place to make the tools available.
Or, by the way, and don’t get me wrong, I think there should be statutes. If the legislative body of Palo Alto, or all the towns that you’re from, or the state of California looks at the question, and people are outraged at what’s going on and they decide not to allow it, then they decided not to allow it.
Great. That’s democracy. That’s what happened with the USA Freedom Act. After the Snowden revelations, I assure you, all the national security agencies were begging Congress to let them keep doing what they were doing. But you know, to use a technical legal term, the American people were pissed. And Congress said no.
So that’s an option. What’s not an option for me, like emphatically not an option, is that without any transparency whatsoever, it just happens. That’s what I think we should be most concerned about. All of my friends are just stepping up to the mics. Hi, Gary. So, you used the example of being searched when you’re getting on an airplane.
And I think it’s in the terms of service; one opts in contractually to that. Arguably, when you choose to ride the New York City subway, you’re opting to be surveilled, and so forth. There are various levels of this. Many commercial and government establishments have a sign saying video surveillance is in use, Ring doorbells, you know, et cetera, et cetera, et cetera.
So to what extent is some of this obviated by the opt-in or contractual nature of how we interact with the world? So consent is the word that you want to talk about. The question is, to what extent is consent real? Or, at least in the law, there’s, you know, a fiction of consent, because we don’t really believe that people are thinking, oh, sure, it’s fine to surveil me.
I’m getting on, you know, the airplane. I always, you know, step up to the camera, not at the TSA, but at Delta and just say no thanks and drive them crazy because I just, you know, want to make a point. But that consent is fictitious. And everybody knows it’s fictitious. There are limits to how much that would work if, in fact, it’s pushing into your constitutional rights.
There’s a doctrine called the unconstitutional conditions doctrine that says you can’t be made to consent to something that would violate your rights. So there are limits even there. But, you know, the whole data trust idea actually came not out of government collecting this data, but private companies collecting all this data and claiming that there was consent.
And if you ask most people, are you consenting when you use Waze and Uber and all that? They’re like, I mean, what choice do I really have? Like I, that’s how people get from here to there now. And that’s where the data trust idea came from, which is since we aren’t sure who owns the data and we’re not really consenting, let’s put it in a trust.
And then it’s kind of cool what some of the models do, which is they suggest you then could opt into different trusts. You know, you have a Fitbit; one of the trusts uses it for medical research, and you agree to let your data be used for medical research, and others as well. If you’re monetizing it, I want some of the money.
And so that’s where the idea came from. I’m thinking about it in a more regulatory way, but because the consent doesn’t work so well, we’re trying to think of an alternative. Thanks. Hi. Hi. So my question is about, like, you talked a lot about the word arbitrary. And I’m not a lawyer, but I do a lot of ML coding, and I fine-tune a lot of models, and it’s one of those things where there are two sources of major arbitrariness in any model.
The first one being the black box: I can put in a bunch of inputs, they’ll give me an answer, and I will not know why they gave me that answer. So in terms of, like, review, you never know the why. And the second one is what’s called the consistency problem: you could input every observable about me twice, and I can get different answers.
You could do it a hundred times and, well, depending on the problem, get a bunch of different answers. I guess, like, how do those inherent, kind of, insecurities of AI affect all of this? Because you never know why, and you never know what at the end of the day. Thank you. It’s a great question.
So, I went back to the slide just because I think that for Fourth Amendment purposes, any time the government intrudes into your life in a way that constitutes a search or a seizure, the question that’s got to be asked is: why me? The government’s got to be able to tell you why me. So, you know, one answer is, because I have a warrant.
There was probable cause. Or, do you notice that everybody’s going through airport security? You’re doing it to everyone. So the predictive models, what happened to David Zayas, are really problematic. First of all, we don’t know. Well, there are two different things. Some of the models are algorithmic in ways where we’ve established the variables, and we kind of do know.
We maybe don’t know how everything’s being weighted. Then there are the large language models, where we just don’t even have a clue how we’re getting to the answers. And that’s a huge problem. And the EU’s answer to that problem is to say no: under the new EU AI Act, predictive policing is just out now, whether that’s the right answer or not.
And I’ve done a little bit of learning about ways that people are trying to work backwards from the algorithms and figure out what the computer’s figuring out itself. But that’s, like, a huge question. Now you could ban predictive policing, as the EU has. And somebody, I’m trying to remember the name, it’s coming to me, wrote an article about how prediction is bunk.
Not believing that any of it’s actually ever accurate, or, if you think about it, accurate enough. Because eventually the standard still has to be, once you get to querying, that you’ve got some reason to query that model and get that person, and it’s just never going to measure up. But that’s the best I can do.
Those other uses like retrospective investigations still seem obvious enough to me. You know, we want to know which phone was in the third national bank at the time that it was robbed. Hi.
So, I’m just wondering if the regulation that you’re proposing also comes in tandem with a federal data protection law. Because I just see a lot of loopholes that can be exploited if we don’t also apply the principles that you mentioned, like consent, to private companies, right? So should that also be considered in tandem, or is it just, like, a
Yeah, it’s a fair question about how we think about data protection laws generally. I mean, I was teaching in Europe, and they have a very different perspective: we kind of live with the private companies but are skeptical of the government, while they don’t seem to worry about the government but are very skeptical of the private companies.
And it’s complicated in this space, because actually the companies and the government are doing all of this together. The only thing I can note in answer to you is, you know, we don’t have much in the way of data privacy laws in this country. I mean, we do in California, and we do in the federal system.
But, you know, in all of these laws, even in Europe, it’s just inevitable that what happens is the law gets passed and it exempts law enforcement. And that makes me a little crazy because, you know, I’m a little crazy. Be angry at Meta or Google or whoever it is, but they’re not going to come get me in the middle of the night.
The government might. So I think we have that when we think about data protection. It seems like you want me to stop. Well, our ti I don’t want you to stop. But the our time is up. In fact, I was so gripped by this that Morgan tapped me on the shoulder to say our time is up. I said, oh, wow, I can’t believe it.
So, I thought that was fascinating. This is such an important topic, such a difficult topic, and it really matters a lot. Thank you, Barry. And I’d just like to ask the audience to join me in applauding and thanking Professor Friedman for that talk.
