Wearable neurotech devices are becoming more prevalent; is the law behind the curve?
At a time when watches and eyeglasses can track steps, heart rate and even emotions, wearables are entering a new frontier: brain waves.
But as these devices become more sophisticated, the legal questions around privacy and data security grow more complex, leaving attorneys to determine what protections exist and what gaps leave users vulnerable.
Neurotechnology is technology that can monitor, record or influence brain activity directly; it can be implanted in the brain, worn on the wrist or used in other forms.
While the technology is still being tested, it could help with everything from management of chronic pain to monitoring the brain activity of students and criminal suspects. One example of currently used neurotechnology is cochlear implants, which improve hearing. And Elon Musk’s company Neuralink is testing products designed to help paralyzed people control devices with their minds.
But determining how to provide informed consent for complex technology and data with seemingly endless potential will be challenging, says Sara Pullen Guercio, a senior associate in Alston & Bird’s technology and privacy group and a former critical care nurse.
“The legal line between the nervous system and the muscular system is blurry,” Guercio says.
Do physiological responses count as neurodata because they are triggered by neural activity? Or does neurodata have to be collected directly from neurons? What about motor neurons, which control muscular movements? Guercio asks. “These questions of scope will be discussed and possibly litigated in the future,” she adds.
The laws and regulations applying to neurotech wearables and their neurodata will depend on the intended benefits of those devices, the proposed use of the resultant information and how the accessories are marketed, Guercio says.
Never mind the gap
One complication in regulating neurotechnology is that the devices vary widely in how they are used and in the data they collect. If neurotech is involved in the treatment of an individual in health care, then the federal law restricting the release of medical information, the Health Insurance Portability and Accountability Act, could be part of the regulatory requirements, says Jordan Fischer, the founding partner of Fischer Law and visiting research professor in the Center for Law and Transformational Technology at Drexel University’s Thomas Kline School of Law.
State privacy laws, Federal Trade Commission regulations and evolving state consumer health care privacy laws could all be in play, Fischer says.
“The question becomes, ‘What type of data is collected?’ And then, ‘What are the corresponding requirements?’” she says.
Often, neurotech data will fall into the broad definition of “sensitive personal information,” Fischer says. If neurotech data is considered personal data, then it would likely be regulated in states with data privacy laws, she adds. If it’s anonymized data, it falls outside of these privacy laws. And states that don’t have a data privacy law may not have a direct regulatory regime that is applicable to neurotech, Fischer says.
The Food and Drug Administration issued guidance in 2019 classifying wearable neurotech such as watches or eyeglasses as “general wellness products” that are too low-risk to be subject to its regulation. The FDA defines a general wellness product as something “intended to maintain or encourage a general state of health or a health activity,” such as stress management or sleep management.
So far, only California and Colorado have amended their comprehensive data privacy laws to directly cover the neurodata generated by neurotech, adding “neural data” as a category of “sensitive data,” Guercio says.
Neural data refers to digital information generated by brain activity.
Other bodies of law are also relevant. Francis Shen, co-director of the Dana Foundation Neurotech Justice Accelerator at Mass General Brigham, explains that the Fourth Amendment, which protects against unreasonable government searches and seizures, could apply to brain data if the government attempts to access it without proper justification.
“This constitutional protection could apply if the government were unreasonably searching and seizing our brain data,” Shen says.
Mind the gap
But there are other legal gaps that will need to be filled.
“I am of the view that when neurotechnology becomes more prevalent in society, this will have ripples for many areas of law, because unlike other technologies of the past, this represents a merging of human and machine and increasingly a merging of humans with [artificial intelligence] systems,” says Allan McCay, the president of the Centre for Neurotechnology and Law and a co-director of the University of Sydney Law School’s Sydney Institute of Criminology.
For example, McCay says, when a brain-computer interface (a system allowing the brain to communicate with an external device) causes an injury, it’s not clear what constitutes the criminal act. Can someone commit a criminal act by having a guilty mind? McCay wonders.
The law needs to address this possibility and determine how to protect people from unwanted intrusions into their thoughts or mental privacy, says Stephen Embry, the Louisville, Kentucky-based author of the TechLaw Crossroads blog and a 2025 ABA Techshow co-chair.
“This is new and intrusive technology,” Embry says. “For the first time, we are confronted with the possibility that advertisers, the government and even individuals can really be clued into more than what we say or do, but what we are actually thinking.”
This story was originally published in the April-May 2025 issue of the ABA Journal under the headline: “New Wave: Wearable neurotech devices are becoming more prevalent—is the law behind the curve?”