Patent trolls, companies that enforce patents on products they don't actually make, are the subject of one of the most heated debates in intellectual property law. Critics say their very existence is a problem, throwing a wrench into the works of productive businesses, while others argue that their suits are a legitimate part of patent law. But for all the debate, nobody knows how big the issue really is, because no one knows how many such cases have been filed.
“Some say half of patent lawsuits are patent trolls; others say it’s 2 percent,” explains Mark A. Lemley (BA ’88), William H. Neukom Professor of Law. “Whenever I see that kind of disagreement, I think it’s an interesting research question because they can’t both be right.”
Making this sort of data-driven research possible was the impetus for the Stanford Intellectual Property Litigation Clearinghouse (IPLC), a searchable database launched last December (http://lexmachina.stanford.edu/), which Lemley helped build with the project’s Executive Director Joshua Walker, Director of Project Engineering George Grigoryev, and colleagues in the computer science department. Questions that can be answered with hard numbers aren’t unique to IP law. Across the law school—in corporate governance, class actions, worker safety, and media consolidation, to name a few—legal scholars are engaging in complex number-crunching like never before and changing the very nature of legal argument. This fact-based approach to law, which is emerging as an important new field within empirical legal studies, builds upon traditional legal tools of doctrinal analysis and normative argument—but also provides actual data for lawyers and policymakers, resulting, scholars hope, in better legal decisions.
“There are many sophisticated theories about the effects of media consolidation, and while we can debate in the abstract, we won’t know about the actual effect until we examine the evidence,” says Daniel E. Ho, assistant professor of law and Robert E. Paradise Faculty Fellow for Excellence in Teaching and Research. He tackled the issue by developing a measure of viewpoint diversity based on newspaper editorials about Supreme Court cases and examining what happened after newspaper mergers and acquisitions. The evidence revealed a more complex picture than the prevailing assumption in federal regulation that consolidation reduces viewpoint diversity. With co-author Kevin Quinn of Harvard University, he found, for example, that the merger of the Atlanta Journal and Atlanta Constitution may have resulted in more exposure to diverse viewpoints.
Another instance of facts challenging accepted theory comes from research by Michael Klausner, Nancy and Charles Munger Professor of Business and Professor of Law. The Supreme Court has taken the position that securities class actions are socially beneficial because they supplement the work of the U.S. Securities and Exchange Commission. But theoretically, Klausner says, there’s good reason to suspect that rather than penalizing derelict corporate officers, shareholder lawsuits simply circulate funds from shareholder to shareholder, while both plaintiff and defense lawyers charge for the service. Whether that’s true is an empirical question, one Klausner was able to answer by collecting data on class actions and tracing the sources of payment. As he suspected, in the period he examined, individual corporate officers paid anything in only 4 percent of cases.
Private sector efforts to protect shareholder interests are also a focus for Pritzker Professor of Law and Business Robert M. Daines, most notably in his research on corporate rating agencies. Using proprietary methods, these firms rate corporations on how well their boards protect and advance the interests of shareholders. Daines, who calls the firms “2,000-pound good governance gorillas” for their enormous power to swing shareholders’ votes, was inclined at the outset to think the rating agencies provide a valuable service, like Consumer Reports. But then objections from venture capitalists made him wonder. “These guys are making claims that they separate the wheat from the chaff, so let’s see if they do what they say they do.” (Short answer: They don’t.)
Getting the raw data to research this question was a difficult hurdle, since the rating agencies had no interest in releasing it; to varying degrees, gathering data is a challenge for all empirical research.
For one of her projects, Alison D. Morantz, associate professor of law and John A. Wilson Distinguished Faculty Scholar, is trying to determine whether a private system provides more cost-efficient coverage for workers injured on the job than does workers’ compensation (whose costs have skyrocketed over the past several decades). As with most empirical legal questions, it’s impossible to conduct a controlled experiment to answer this one, but a quasi-experiment occurred in Texas, the only state that lets employers opt out of workers’ comp. Because companies’ occupational-injury claims and other data are proprietary and confidential, however, getting in-house counsel to participate in the research was a hard sell. “They have little to gain and a lot to lose,” says Morantz, who recently was awarded a $200,000 grant from the National Science Foundation to fund this study. Nonetheless, she has enlisted 15 large firms as collaborators, and her preliminary analysis already suggests that companies choosing the non-subscription option make full use of the flexibility it affords them. She needs more data to answer two tougher questions: Is non-subscription also better for employees, and does it truly save money or merely shift costs to group health plans?
Unlike Morantz, most scholars doing empirical legal studies use data that’s in the public domain. And thanks to the Internet, data is more abundant than ever, from judicial opinions and regulatory actions to crime statistics and voting records. But having loads of data doesn’t mean it’s easy to find—not when it’s distributed across hundreds of databases and in unstructured form. “The raw stuff is totally chaotic,” says Walker about the data he sifts through for the IPLC. His team must collect information from 94 district courts around the country, each with its own website. Then, because different judges use different styles for referring to the same things, the team must, as Walker puts it, “find the oranges and identify them as oranges.” What’s more, because the raw information is in English, it must be parsed into structured form before it goes into the database, a task that requires the expertise of Chris Manning (PhD ’95), a Stanford computer science professor specializing in natural language processing. “It’s legal blood, sweat, and tears,” says Walker, who hopes his work will save judges and lawyers untold hours of research and yield more accurate results and, ultimately, more effective intellectual property law.
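The entity-resolution step Walker describes—finding the oranges and identifying them as oranges—can be sketched in miniature. Everything below is hypothetical (the names, the cleanup rules, and the `CANONICAL` lookup table are invented for illustration); a real system like the IPLC would rely on far more sophisticated statistical natural language processing of the kind Manning works on.

```python
# Purely illustrative sketch: collapsing the many ways court filings
# refer to the same party onto one canonical name. This is NOT the
# IPLC's actual pipeline; the table and rules are made up.

import re

# Hypothetical canonical-name table, keyed by a normalized string.
CANONICAL = {
    "intl business machines": "International Business Machines Corp.",
    "ibm": "International Business Machines Corp.",
}

def normalize_party(raw):
    """Strip punctuation, case, and corporate suffixes before lookup."""
    s = raw.lower()
    s = re.sub(r"[.,]", "", s)                            # drop punctuation
    s = re.sub(r"\b(corp|inc|co|corporation)\b", "", s)   # drop suffixes
    s = re.sub(r"\s+", " ", s).strip()                    # squeeze whitespace
    return CANONICAL.get(s, raw)  # fall back to the raw name if unknown

print(normalize_party("I.B.M."))                        # International Business Machines Corp.
print(normalize_party("Intl. Business Machines Corp.")) # International Business Machines Corp.
```

Even this toy version shows why the work is labor-intensive: every new spelling variant either matches an existing rule or requires a human to extend the table.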
Recent years have also seen an explosion in computing power. Clusters of personal computers working in parallel have made possible the use of more powerful analytical methods, especially computationally intensive Bayesian approaches, which enable scientists to evaluate and compare complex models of all sorts of phenomena. That’s important in cases where strongly held prior beliefs keep even careful empirical scholars from agreeing on what the evidence means, explains Jeff Strnad, Charles A. Beardsley Professor of Law. For example, in the long, bitter debate between empirical scholars John R. Lott Jr. and John Donohue about the effects on crime of concealed-carry gun laws, the challenge has been to separate the effect of gun laws from confounding factors like poverty and unemployment; but the variables that should represent these factors were themselves up for grabs. As a result, Strnad says, “Through all this debate and changes in models, nobody ever changed his mind: Researchers who saw a deterrent effect kept finding one, and those who didn’t, didn’t.” When Strnad applied Bayesian techniques to the same data, he found that, as he puts it, “All the models they were looking at were totally useless.” Bayesian techniques, which update probabilities about an unknown result based on accumulating evidence, have been around for decades, but their use remained impracticable until computers became powerful enough to rapidly crunch all these numbers.
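The core idea of Bayesian updating—revising a prior belief as evidence accumulates—can be shown with a toy conjugate Beta-Binomial model. This is only an illustration of the general technique, not the models Strnad, Lott, or Donohue actually used, and all the numbers are invented. The point it demonstrates is the one Strnad makes: with enough shared evidence, analysts who start from opposite priors should converge.

```python
# Toy Beta-Binomial updating: two analysts with opposite priors about
# how often some effect appears, revised against the same evidence.
# Hypothetical numbers throughout; for illustration only.

def update_beta(prior_a, prior_b, successes, failures):
    """Conjugate update of a Beta(a, b) prior given binomial evidence."""
    return prior_a + successes, prior_b + failures

def beta_mean(a, b):
    """Posterior mean of a Beta(a, b) distribution."""
    return a / (a + b)

skeptic = (2, 8)    # prior mean 0.2: doubts the effect
believer = (8, 2)   # prior mean 0.8: expects the effect

# Shared evidence: the effect appears in 60 of 100 observed cases.
evidence = (60, 40)
skeptic_post = update_beta(*skeptic, *evidence)
believer_post = update_beta(*believer, *evidence)

print(round(beta_mean(*skeptic_post), 3))   # 0.564
print(round(beta_mean(*believer_post), 3))  # 0.618
```

Both posteriors are pulled toward the evidence (an observed rate of 0.6) and away from the priors, and with more data they would converge further; the computational burden Strnad describes arises because realistic models have many interacting parameters rather than one.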
While the Internet and personal computing have fueled interest in empirical legal studies in recent years, the movement’s roots go back to the early 20th century. Vice Dean Mark G. Kelman, James C. Gaither Professor of Law, traces its history to three sources: the law and society movement, which was interested in how law interacted with the communities that it was meant to govern; the law and economics movement, which introduced econometrics as a way to model causal relationships; and a move in economics away from the abstraction of rational choice theory toward testing how people actually behave. Kelman himself worked with the late Stanford psychologist Amos Tversky, a pioneer in the study of human rationality, on research that tested lab subjects’ legal decision making, examining, for example, how offering jurors a third possible verdict sways their choice toward the intermediate option.
Today the influence of empirical analysis is ubiquitous. “I can’t imagine any issue that gets seriously debated now where the research hasn’t influenced judgment,” says Kelman. He points to the newest addition to the faculty, Joan Petersilia, a criminologist with a background in empirical research and a special advisor to Governor Arnold Schwarzenegger on California’s corrections system, who will take up her appointment as professor at Stanford Law in September. “How can you debate the serious issues we face today without the benefit of this kind of research? I don’t think you can.”
That wasn’t the case just 30 years ago, says Deborah R. Hensler, Judge John W. Ford Professor of Dispute Resolution and associate dean for graduate studies, who before coming to Stanford led the RAND Corporation’s Institute for Civil Justice, a center dedicated to empirical research. When she began her law career in the 1970s, she says, “The idea that you would bring empirical data to bear on questions having to do with legal doctrine was mind-boggling.” Hensler’s recent work represents the qualitative strand of empirical legal studies; rather than running experiments or statistically analyzing large data sets, she uses interviews to understand why class actions have become increasingly popular outside the United States during a period when they’ve come into disrepute here. To share information about class-action developments in different countries, Hensler directs the Global Class Actions Exchange (http://globalclassactions.stanford.edu/).
For the most part, empirical legal scholars see themselves as neutral providers of fact. “Too many times we see arguments based on supposition,” says Lemley. And although he’s referring to the IPLC, he echoes sentiments shared by most empiricists when he says, “Our goal is not to push an agenda, but to give people the data to make up their own minds.”