Unlike conventional biological data, neural data reveals the very essence of who we are: our mental, cognitive, and emotional states, our memories, intentions, perceptions, and even our planning processes. If devices can interpret this data, they might also be able to alter our cognitive and emotional processes, potentially influencing our very behavior through electrical impulses or magnetic fields.
"It’s imperative that we speed the development of neurotechnology," states Marcello Ienca, a philosopher and neuroethics expert from the Technical University of Munich, who formerly headed EPFL’s Hybrid Minds Project. With a third of the world's population expected to experience a neurological disorder and half to suffer from a mental health condition, these advancements are crucial for understanding the human brain and treating patients.
However, Ienca warns of a "slippery slope" once neurotechnology moves beyond medical applications. "We live in a world where our brain is our most important asset – it underpins the data-driven business models of social media operators and online retailers," he explains. Companies are eager to gain deeper psychological insights to influence purchasing decisions, exploit vulnerabilities, and "keep us captive." Neural data could make their data-driven applications terrifyingly accurate, allowing them to go "straight to the source" of our preferences.
While brain implants are currently reserved for therapeutic uses (e.g., for depression or Parkinson's), some companies harbor broader ambitions. Elon Musk's Neuralink, for instance, has openly declared its aim to implant brain chips in millions of people for "nonmedical purposes – that is, solely to improve their performance." Though Ienca is skeptical that implants will see widespread adoption any time soon, he finds the prospect of Neuralink persuading a significant number of people to share their neural data "worrying," particularly because incidents such as patient deaths or major data leaks could undermine public trust.
The more immediate concern arises from noninvasive technologies like portable mini EEG machines, which are already being integrated into everyday items such as headphones, fitness bands, and sleep trackers to monitor brain activity. These devices require no surgery and carry no physical risks, making them attractive to consumer-focused businesses pursuing remote device control, neurogaming, entertainment services, and other applications. "These devices aren’t as effective as brain implants but they allow companies to collect data from a larger pool of consumers – and today, more data means better-trained AI programs and more robust predictive algorithms," Ienca points out. It's no coincidence that tech giants like Apple are patenting brain-activity sensing technology for future products, eager to tap directly into consumer preferences and intentions at their "neurobiological source."
Ienca issues a stark warning: "Having a monopoly on the human brain is the riskiest thing that could happen to our species." He foresees a "boom in neurotechnology" this decade, akin to the personal computer revolution of the 1980s. He suggests the gaming industry could act as a "Trojan horse," popularizing neurogames where players control avatars with their thoughts, thereby normalizing the technology for mass adoption.
Fortunately, international and local efforts are underway to address these unprecedented challenges. In 2017, Ienca and his colleagues introduced the concept of "neurorights," advocating for the protection of thoughts and mental processes as fundamental human rights, and proposing updates to existing declarations like the Universal Declaration of Human Rights.
Significant progress is already being made:
- The Council of Europe is discussing an amendment to its 1981 Convention 108 on data protection – which counts 55 signatory countries – to cover neural data, based on guidelines drafted by an ad hoc committee that includes Ienca.
- UNESCO working groups, with Ienca's involvement, have drafted recommendations on the ethics of neurotechnology, currently under review by member states.
- In the US, individual states are taking action: in April 2024, Colorado became the first to adopt an act protecting neural data in the same way as other personal biometric data, and California followed suit in September.
Even though rules and regulations can be circumvented, Ienca stresses the importance of establishing a legal framework. This requires a "top-down regulatory approach" combined with a "bottom-up one based on making consumers aware of the value of their data, the actual risks involved and how they can mitigate those risks."
Crucially, regulators must prevent the formation of monopolies, similar to those seen in the AI and IT industries. Europe, Ienca suggests, is uniquely positioned to lead this charge, promoting "responsible innovation that upholds our fundamental rights and aims to limit the societal impacts of neurotechnology," thereby asserting global leadership in this vital field.
The neurotechnology revolution is here, promising cures and enhanced capabilities. But as our brains become the next frontier for data collection, the fight for mental privacy has just begun. Understanding the risks and advocating for robust ethical and legal frameworks will be paramount to ensuring our minds remain truly our own.
Reference:
Brouet, A. (2025, August 8). Do neurotechnologies threaten our mental privacy? EPFL. https://actu.epfl.ch/news/do-neurotechnologies-threaten-our-mental-privacy/