Your brain needs a really good lawyer
Elon Musk and Mark Zuckerberg are building tech to read our minds. Lawyers are thinking about how to protect our mental privacy from Silicon Valley.
If you take it for granted that nobody can listen in on your innermost thoughts, I regret to inform you that your brain may not be private much longer.
You may have heard that Elon Musk’s company Neuralink surgically implanted a brain chip in its first human. Dubbed “Telepathy,” the chip uses neurotechnology in a medical context: It aims to read signals from a paralyzed patient’s brain and transmit them to a computer, enabling the patient to control the computer with thought alone. Because it operates in a medical context, neurotech like this is subject to federal regulations.
But researchers are also creating noninvasive neurotech. Already, there are AI-powered brain decoders that can translate into text the unspoken thoughts swirling through our minds, without the need for surgery — although this tech is not yet on the market. In the meantime, you can buy lots of devices off Amazon right now that record your brain data (like the Muse headband, which uses EEG sensors to read patterns of activity in your brain, then cues you on how to improve your meditation). Since these aren’t marketed as medical devices, they’re not subject to federal regulations; companies can collect — and sell — your data.
With Meta developing a wristband that would read your brainwaves and Apple patenting a future version of AirPods that would scan your brain activity through your ears, we could soon live in a world where companies harvest our neural data just as 23andMe harvests our DNA data. These companies could conceivably build databases with tens of millions of brain scans, which can be used to find out if someone has a disease like epilepsy even when they don’t want that information disclosed — and could one day be used to identify individuals against their will.
Luckily, the brain is lawyering up. Neuroscientists, lawyers, and lawmakers have begun to team up to pass legislation that would protect our mental privacy.
In the US, the action is so far happening on the state level. The Colorado House passed legislation this month that would amend the state’s privacy law to include the privacy of neural data. It’s the first state to take that step. The bill had impressive bipartisan support, though it could still change before it’s enacted.
Minnesota may be next. The state doesn’t have a comprehensive privacy law to amend, but its legislature is considering a standalone bill that would protect mental privacy and slap penalties on companies that violate its prohibitions.
But preventing a company from harvesting brain data in one state or country is not that useful if it can just do that elsewhere. The holy grail would be federal — or even global — legislation. So, how do we protect mental privacy worldwide?
Your brain needs new rights
Rafael Yuste, a Columbia University neuroscientist, started to get freaked out by his own neurotech research a dozen years ago. At his lab, employing a method called optogenetics, he found that he could manipulate the visual perception of mice by using a laser to activate specific neurons in the visual cortex of the brain. When he made certain images artificially appear in their brains, the mice behaved as though the images were real. Yuste discovered he could run them like puppets.
He’d created the mouse version of the movie Inception. And mice are mammals, with brains similar to our own. How long, he wondered, until someone tries to do this to humans?
In 2017, Yuste gathered around 30 experts to meet at Columbia’s Morningside campus, where they spent days discussing the ethics of neurotech. As Yuste’s mouse experiments showed, it’s not just mental privacy that’s at stake; there’s also the risk of someone using neurotechnology to manipulate our minds. While some brain-computer interfaces only aim to “read” what’s happening in your brain, others also aim to “write” to the brain — that is, to directly change what your neurons are up to.
The group of experts, now known as the Morningside Group, published a Nature paper later that year making four policy recommendations, which Yuste later expanded to five. Think of them as new human rights for the age of neurotechnology:
1. Mental privacy: You should have the right to seclude your brain data so that it’s not stored or sold without your consent.
2. Personal identity: You should have the right to be protected from alterations to your sense of self that you did not authorize.
3. Free will: You should retain ultimate control over your decision-making, without unknown manipulation from neurotechnologies.
4. Fair access to mental augmentation: When it comes to mental enhancement, everyone should enjoy equality of access, so that neurotechnology doesn’t only benefit the rich.
5. Protection from bias: Neurotechnology algorithms should be designed in ways that do not perpetuate bias against particular groups.
But Yuste wasn’t content to just write academic papers about how we need new rights. He wanted to get the rights enshrined in law.
“I’m a person of action,” Yuste told me. “It’s not enough to just talk about a problem. You have to do something about it.”
How do we get neurorights enshrined in law?
So Yuste connected with Jared Genser, an international human rights lawyer who has represented clients like the Nobel Peace Prize laureates Desmond Tutu and Aung San Suu Kyi. Together, Yuste and Genser created a nonprofit called the Neurorights Foundation to advocate for the cause.
They soon notched a major win. In 2021, after Yuste helped craft a constitutional amendment with a close friend who happened to be a Chilean senator, Chile became the first nation to enshrine the right to mental privacy and the right to free will in its national constitution. Mexico, Brazil, and Uruguay are already considering something similar.
Even the United Nations has started talking about neurotech: Secretary-General António Guterres gave it a shoutout in his 2021 report, “Our Common Agenda,” after meeting with Yuste.
Ultimately, Yuste wants a new international treaty on neurorights and a new international agency to make sure countries comply with it. He imagines the creation of something like the International Atomic Energy Agency, which monitors the use of nuclear energy. But establishing a new global treaty is probably too ambitious as an opening gambit, so for now, he and Genser are exploring other possibilities.
“We’re not saying that there necessarily need to be new human rights created,” Genser told me, explaining that he sees a lot of promise in simply updating current interpretations of human rights law — for example, extending the right to privacy to include mental privacy.
That’s relevant both on the international level — he’s talking to the UN about updating the provision on privacy that appears in the International Covenant on Civil and Political Rights — and on the national and state levels. While not every nation will amend its constitution, states with a comprehensive privacy law could amend that to cover mental privacy.
That’s the path Colorado is taking. If US federal law were to follow Colorado in recognizing neural data as sensitive health data, that data would fall under the protection of HIPAA, which Yuste said would alleviate much of his concern. Another possibility would be to get all neurotech devices recognized as medical devices so they would have to be approved by the FDA.
When it comes to changing the law, Genser said, “It’s about having options.”
A version of this story originally appeared in the Future Perfect newsletter. Sign up here!