Privacy is, once again, under fire after the horrific terrorist attacks in Paris and Beirut—with calls to strengthen the government’s surveillance capabilities growing louder. In this interview with the Stanford Lawyer editor, Jennifer Granick, Director of Civil Liberties at the Stanford Center for Internet and Society and an expert in computer crime and security, counters those calls and advocates for stronger protections of personal electronic data.
Can you talk about the tension between privacy advocates and law enforcement and intelligence personnel—particularly in the wake of recent terrorist attacks? Don’t we all value the same Constitutional guarantees?
Granick: This isn’t a U.S. conversation, this is a global one. Most countries in the world don’t have a First or a Fourth Amendment. And very few countries—the U.S. isn’t one of them—respect the privacy rights of foreigners. So no—privacy and human rights advocates who want to protect the rights of people globally are not necessarily starting from the same values as government law enforcement and intelligence officials. Second, law enforcement and intelligence are focused on a particular definition of security, ensuring they have tools that they hope will help them prevent and investigate crime. This is a laudable, essential goal. But the public has a broader set of goals, which includes justice, liberty, human rights, economic well-being, and more. The tools that some U.S. officials are asking for are not the best means to reach our security goals. Massive surveillance continues to fail at preventing attacks. Encryption backdoors are an opportunity to commit, not solve, crimes. But worse, what some officials are asking for can actively undermine the broader public interest.
Some companies have taken calls for better privacy seriously. iOS 8 not only encrypts iPhone and iPad data by default, but gives Apple no access to the encryption keys, meaning that the company can’t produce someone’s data even when served with a warrant or pressured by intelligence agencies. But these efforts have been coming under fire. The secure messaging service Telegram announced that it was closing down channels used by suspected ISIS members. Do you think Apple needs to review this security feature in the wake of recent terrorist attacks?
Granick: No. There’s no evidence that encryption was the reason that the Paris attacks were not detected. Rather, the evidence so far shows that these men were on the intelligence radar, but were nevertheless not selected for surveillance. This was also true of September 11th, the Boston Marathon bombing, the attacks in Mumbai, Charlie Hebdo, etc.
Can you tell us about the new provisions that limit the National Security Agency’s authority to collect the phone records of Americans—and whether they go far enough?
Granick: Section 215 of the Patriot Act was modified in an attempt to end the NSA’s domestic phone dragnet. Those provisions have not yet gone into effect. Meanwhile, dragnet collection of Americans’ international calls continues, as does broad surveillance of international communications and massive overseas spying. The Section 215 changes were a small start in reining in surveillance of innocent people.
Right now we’re focused on physical attacks by terrorists. What about cyber security risks?
Granick: Paris shows that a few people with old school technology—guns and homemade bombs—can do a fantastic job terrorizing people. Meanwhile, most network insecurity today can be remedied with a few basic security practices, including implementing strong encryption. We need to do better protecting critical infrastructure, but given that these are generally highly regulated industries that do not handle the public’s data, I believe there are opportunities for major security improvements that won’t negatively impact the civil liberties and human rights concerns that I study.