The California Consumer Privacy Act (CCPA) went into effect on January 1, 2020, giving residents key rights over some of their private data—data that is currently collected and commoditized without express authorization. In the discussion that follows, Dr. Jennifer King, Director of Privacy at Stanford’s CIS, explains the new law and how far it may go to protect consumer personal data.
What kinds of data are companies such as Facebook, Google, and Waze collecting? What’s the most surprising way that they are using that data?
The New York Times ran a very interesting series about location data and how our apps track us and sell that personal data on to advertisers and political organizations. The data is supposed to be anonymous, but it paints a remarkably clear picture of our daily habits—the doctors we visit, the schools we attend, the places we work, the people we see, the stores and restaurants we frequent. The picture was so clear that the reporters were able to identify individuals from the data; they then contacted and interviewed those people. It’s creepy—and outrageous—that these images of us are built from deeply personal data about our daily habits, and it’s unregulated. There are no restrictions on this data. Verizon and AT&T were selling data to repossession companies to collect on loans, until it was made public and they were shamed into stopping. It’s very shady. This is an entire grey area—not illegal, but very questionable.
Another article obtained information from a company that gives other companies reputation scores. It described one example in which Airbnb had handed over all of a customer’s emails—all the communication she had engaged in while using the app. The reporters also obtained details about her use of DoorDash. The third party had no direct interaction with the customer herself, yet it held this very personal information about her, down to full emails about the apartments she rented and the food she ate. So the data collected about us through our mobile apps, put together, reveals a lot about us. And it’s collected and used largely without our knowledge or express approval.
Will California’s new law, the California Consumer Privacy Act, CCPA, help to address this? What are the elements of the act that you are most excited about?
We’ll now have the right to access this information that is being gathered about us—this detailed data. And that type of transparency is useful. Importantly, we can now deny companies the right to sell the data about us. So getting a sense of what is being collected will be useful, and having the power to say no to the commoditization of our data is good. Companies are now also supposed to let us know about any inferences they draw about us—not just all the photos or locations, but the ways they categorize us. For instance, with Facebook: how do they sell our data—by race, gender, income bracket? That might be useful for you to know. I’m curious to see how this will unfold.
Academics like me may well be the biggest consumers of this information. It’s a great tool for advocates and researchers. But whether the average consumer will get anything useful—particularly if it’s delivered as a data field that’s hard to read—I’m not sure.
So businesses must now notify consumers what personal information they collect about them and why. But won’t that be in the same format we’re already ignoring—the tiny print at the end of a page, or hidden somewhere? Will consumers be aware of these new rights—and how easy will it be to exercise them?
This new law doesn’t do anything to challenge what was already unsatisfactory. This is one of the concerns I voiced to the Attorney General during the rulemaking process. Notice is supposed to be “clear and conspicuous,” but judging from current privacy law, you could argue that “clear and conspicuous” has never been enforced. No one has really tested whether a privacy notification in tiny print, linked at the bottom of a webpage that we know consumers aren’t reading, meets that standard. I don’t think the way notices are currently posted does. And I suspect privacy notifications will be unchanged under this new law—that companies will put them in the exact same place.
Another issue I see is the language that companies have been using in their notifications since the new law took effect. I was on eBay’s site recently, and I stumbled on their version of the “do not sell my data” link—so I clicked. I found it interesting how aggressive they were in describing their interpretation of “do not sell.” They said they added this function only because the new law forced them to, and the language, in my reading, tried to discourage me from opting out by highlighting the benefits I would forfeit if I did. Even I stopped and thought twice about opting out. So companies may be overtly hostile towards this—more charged than you’d expect to see in a mandated legal notice.
Will the law work? The onus is on the consumer to make herself more aware of her rights and to take action of some sort.
The new law doesn’t change what businesses can collect. So it’s business as usual—they can keep gathering and selling data unless the consumer takes the time to say no. And I suspect most consumers will not, because of the very subtle ways that privacy notifications are posted. There may be some element of public shaming in that they have to justify why they are collecting certain information. We’ll see. For the most part it doesn’t stop the practices we’ve already seen.
I would have advocated for restrictions and for changes to the opt-in and opt-out processes. For example: should a company serving third-party ads be able to collect data about you without some say from you?
The whole ecosystem of third-party advertisers and ad-supported mobile apps—where the ads could be served by still other ad networks—involves exactly the kind of data collection that consumers don’t know about. The big missing link is that you don’t know who these companies are, so how would you know to go to them to request your data?
Any other significant downsides or potential negatives? Some commentators have noted the cost to companies—and limited benefit to consumers.
There have been some good questions raised about the cost of compliance, but companies are not allowed to price discriminate against you if you elect “do not sell my data.” The service has to remain free. But if a significant number of consumers actually do take the time to find the “do not sell my data” link and click through, that may call the underlying business model into question.
Another issue is that the law is largely written with larger companies in mind, so smaller companies that fall under it may face significant compliance challenges.
I understand that this is the first such law in the U.S. Do you expect other states to follow—or for big companies to simply adopt the new California law as their standard, so that the rest of the country benefits from California’s lead?
Yes. Some large companies have already said they will extend many of the law’s provisions to consumers in other states. I’m not sure they’ll offer the “do not sell” option, but they’re likely to give non-California consumers access to their data.
What are the next steps? Do you expect California legislators to press further?
The groups that brought us this law through the ballot initiative are already working on part II. The main group is called Californians for Consumer Privacy. It’s financed by a single individual—a real estate developer. I think his heart is in the right place, but it’s indicative of California’s ballot initiative process that one person can have such impact. The second initiative is even stronger than the CCPA, largely in response to pushback received about the first law.
Dr. Jennifer King is the Director of Privacy at CIS. An information scientist by training, Dr. King is a recognized expert and scholar in information privacy. She examines the public’s understanding and expectations of online privacy and the policy implications of emerging technologies. Her research sits at the intersection of human-computer interaction, law, and the social sciences, focusing on social media, genetic privacy, mobile platforms, the Internet of Things (IoT), and digital surveillance. Her scholarship has been recognized for its impact on policymaking by the Future of Privacy Forum, and she has been an invited speaker before the Federal Trade Commission at several Commission workshops. She was a member of the California State Advisory Board on Mobile Privacy Policies and the California State RFID Advisory Board.