What Are Neural Data? An Invitation to Flexible Regulatory Implementation

At its annual developer conference in September, Meta revealed its work toward a new smart glasses product it is now calling Orion. In a demonstration featuring Mark Zuckerberg in a casual T-shirt and a pair of glasses with inordinately thick rims, the firm claimed to have made notable technical strides toward making augmented reality displays on eyewear an actual reality, though it announced no release date.[1]

Buried in much of the coverage of the smart glasses, though, was the detail that consumers would apparently operate the forthcoming product in part through a “neural interface.” To deliver an experience without cords or buttons, the Orion glasses would ship with both a separate wristband and an external computing device. The neural interface would be built into the wristband, which appears to use electromyography (EMG) techniques that Meta picked up when it bought a startup called CTRL-Labs in 2019. During the reveal, Zuckerberg described “our wrist-based neural interface” as a “device that allows you to just send a signal from your brain to” the smart glasses.[2] The neural input from the wristband would likely complement other techniques, such as voice recognition, allowing consumers to use the smart glasses without touching them or their phone.

Just three days after Meta’s conference, California enacted legislation that classifies data from the “peripheral nervous system” as protected under the California Consumer Privacy Act.[3] It does so by declaring “neural data” a new type of sensitive information that privacy regulators should focus on, among their other priorities, starting in January 2025. Colorado took similar steps in April 2024, enacting neural data protection legislation that went into effect this summer. Regulators in both states, including the California Privacy Protection Agency and the Colorado Attorney General, will now turn to implementing the new legislation and may need to develop new types of expertise to do so. Over time, this may lead to new enforcement activity that could affect the neurotechnology industry and the products it develops.

But what are neural data? And if they cover information taken from the human peripheral nervous system, as well as from the brain, what does that mean for products like Meta’s Orion? Neither the technical scope nor the legal construction of neural data is clear at this point, and stakeholders appear to disagree on the policy rationale for whether and why data protection regulation is needed here. Yet, despite concern among some experts about the adequacy of these lawmaking efforts, including Patrick Magee, Marcello Ienca, and Nita Farahany’s recent argument that “neural data” is too narrow a regulatory target,[4] new law has been set in these two states and regulators must begin to respond. The uncertainty created by these new state legislative moves may leave privacy regulators in difficult positions, unsure about what actions they can take without prompting legal challenges. Implementing the legislation in both states will likely require involved and contentious processes of testing the bounds of these new rules.

This blog post retraces and compares how the California and Colorado legislation came about to shed some light on how lawmakers are thinking about this new and somewhat nebulous target of governance. While these debates are going on in several jurisdictions around the world, and at international bodies, this blog post focuses on the United States as a case where different but related definitions may create confusion for both regulators and stakeholders.

What Are Neural Data? What Are Neurotechnologies?

The new legislation in Colorado and California approaches these questions in slightly different ways that offer privacy regulators differing and, ultimately, limited guidance on how to proceed. The resulting uncertainty taps into longstanding debates in law and technology scholarship about whether regulatory systems should treat technology as their primary target, rather than aiming at something else, such as the actors using technology or the harms that can result from its use.[5] Here, it still isn’t clear whether the target of regulation should be neural data, neurotechnology, those who use the devices, their developers, privacy risks or harms more broadly, or something else. The unclear technical scope and legal construction of “neural data” and “neurotechnology” add further uncertainty.

The Californian approach classifies neural data as “sensitive personal information” under the California Consumer Privacy Act, defining them as “information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.”[3] This definition casts neural data as an object that should be regulated, but in doing so it potentially creates at least three new categories of data: (i) data from the central nervous system (the brain and spine), (ii) data from the peripheral nervous system (nerves extending from the brain and spine), and (iii) “nonneural” data.

The legislation does not provide subsidiary definitions for any of these three types of data or the legal, policy, or technical boundaries between them. Instead, these efforts to define neural data may actually generate more uncertainty and confusion about what the law covers and how different subsidiary categories may be treated, contrasted, or prioritized. Regulators and other stakeholders, then, will have to work out those distinctions in practice and over time.

For instance, it remains unclear whether the data from a product like Meta’s Orion, which appears to collect data from the peripheral nervous system, should be treated the same as central nervous system data, and on what grounds. Notably, industry groups including the California Chamber of Commerce opposed the bill on the grounds that “information about activity of the PNS [peripheral nervous system] simply is not capable of revealing someone’s inner thoughts and mental processes, which this bill seeks to protect.”[6] Regulators and stakeholders will now have to engage in difficult and likely confrontational conversations about how to implement these rules and whether to draw boundaries between different types of neural data.

These interpretive and implementation questions become more confusing when one notes that the purpose of the legislation may have been to regulate “neurotechnology” rather than neural data. When the bill cleared the California Senate in May, its lead sponsor issued a press release claiming the bill as a win on the grounds that “[r]egulating neurotechnology early in its development is essential to ensure ethical use, protect privacy, establish industry standards, and address future implications.”[7] While the text of the bill focuses on data, statements like this one frame data protection as a subsidiary goal within the larger objective of regulating neurotechnologies.

What counts as neurotechnology, then, and why should it be regulated with privacy tools? An older version of the Senate bill defined neural data as “information … that can be processed by, or with the assistance of, neurotechnology.”[6, emphasis added] This version went on to define neurotechnology as devices or instruments that allow for “reading, recording, or modifying a persons [sic] brain activity or the information obtained from a persons [sic] brain activity.” Thus far, no public explanation has been provided for why the definition of neurotechnology was removed from the final bill, or why it would have mattered for data protection purposes that neurotechnologies could modify brain activity. Its inclusion and subsequent removal raise further questions about the policy objectives of the legislation and what regulators should prioritize as they proceed.

Comparing to Colorado

The Colorado legislation may offer a clue, especially since several Colorado lawmakers lobbied for the Californian bill,[6] though it does not definitively settle the matter. Amending the Colorado Privacy Act’s definition of “sensitive data,” the statute has a very similar definition of neural data: “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous systems and that can be processed by or with the assistance of a device.”[8] Yet, Colorado places it within a larger definition of “biological data” that also includes genetic and physiological information.

Moreover, the legislation includes policy declarations before providing these definitions that—while not quite enforceable law—gesture more clearly at its purpose. Lawmakers felt that “[n]eurotechnologies … raise particularly pressing privacy concerns given their ability to monitor, decode, and manipulate brain activity” and that neural data “is extremely sensitive and can reveal intimate information about individuals, including information about health, mental states, emotions, and cognitive functioning.” No formal definition of neurotechnology is provided, though they apparently “includ[e] devices capable of recording, interpreting, or altering the response of an individual’s central or peripheral nervous system.”

These legislative provisions suggest that peripheral nervous system data remain a concern, though the primary regulatory objective in Colorado seems to be protecting privacy around health, emotions, and cognition. Significant technical disagreement will likely ensue about what data from the brain, spine, or periphery can actually reveal, but Colorado lawmakers have provided at least some guidance to regulators. While regulating neurotechnology more broadly motivated the legislation, lawmakers cast these innovations as threats to privacy in particular. This framing could steer regulatory attention more toward privacy harms, rather than the devices or data themselves. While the target of lawmaking here remains underdefined, it has a slightly narrower scope than the California legislation, which may assist regulators, courts, and stakeholders in interpreting and implementing these provisions.

Implementing Regulation for Neural Data Protection

In practice, then, how should regulators proceed with the task of implementing new neural data legislation? The most prudent path forward may be to treat these less-than-clear legislative mandates to protect neural data as flexibly as possible. Rather than rushing toward a clear but narrow definition of what neural data are, privacy regulators can and should consult widely with stakeholder groups about their primary concerns, interests, and values in neural data protection. It will be critical to include consumer advocacy groups and other civil society organizations alongside the developers of devices that can collect and process neural data.

Regulators implementing these provisions could treat them more as a scheme of principles-based regulation, where regulators and stakeholders collectively and continuously discuss, test, and contest the meaning of rules, which types of risks to prioritize and harms to redress, and what enforcement strategies to use.[9] Even if the targets of these new pieces of legislation remain unclear and loosely defined, inclusive regulatory practices—with plenty of third parties—could help shape the meaning, target, and purpose of neural data protection in more democratic and deliberative ways.

References

[1] Lauren Goode, “Meta Missed Out on Smartphones. Can Smart Glasses Make Up for It?” Wired (Sept. 25, 2024), https://www.wired.com/story/meta-orion-glasses-augmented-reality-mark-zuckerberg/.

[2] Kyle Wiggers, “Meta Developed a ‘Neural Interface’ For Its Next-Gen Orion AR Glasses,” TechCrunch (Sept. 25, 2024), https://techcrunch.com/2024/09/25/meta-developed-a-neural-interface-for-its-next-gen-orion-ar-glasses/.

[3] Senate Bill No. 1223, 2023-2024 Reg. Sess., ch. 887, 2024 Cal. Stat.

[4] Patrick Magee, Marcello Ienca, and Nita Farahany, Beyond Neural Data: Cognitive Biometrics and Mental Privacy, 112 Neuron 3017 (2024).

[5] Lyria Bennett Moses, How to Think about Law, Regulation and Technology: Problems with ‘Technology’ as a Regulatory Target, 5 Law, Innovation & Tech. 1 (2013).

[6] “SB 1223: Consumer Privacy: Sensitive Personal Information: Neural Data,” CalMatters, https://digitaldemocracy.calmatters.org/bills/ca_202320240sb1223 (accessed on Oct. 23, 2024).

[7] Office of California Sen. Josh Becker, “Senate Overwhelmingly Approves Nation’s Strongest Neurorights Bill,” (May 21, 2024), https://sd13.senate.ca.gov/news/press-release/may-21-2024/senate-overwhelmingly-approves-nations-strongest-neurorights-bill.

[8] House Bill 24-1058, 2024 Reg. Sess., ch. 68, 2024 Colo. Stat.

[9] Julia Black, Forms and Paradoxes of Principles-Based Regulation, 3 Capital Markets L.J. 425 (2008).