New neurotechnology is blurring the boundaries around mental privacy – but are new human rights the answer?

A woman tries out neurotechnology equipment during Tech Week in Bucharest, Romania, in May 2023. Cristian Cristel/Xinhua via Getty Images

Neurotechnologies – devices that interact directly with the brain or nervous system – were once dismissed as science fiction. Not anymore. Several companies are developing, and some are even testing, ‘brain-computer interfaces,’ or BCIs, with Elon Musk’s Neuralink probably the most notable. Musk announced on January 29, 2024, that the first human participant in the company’s clinical trials had received a brain implant.

Like other companies in the field, Neuralink has the immediate goal of improving autonomy for patients with severe paralysis or other neurological conditions.

But not all BCIs are intended for medical use: there are EEG headsets that sense electrical activity in the wearer’s brain, with applications ranging from entertainment and wellness to education and the workplace. Yet Musk’s ambitions extend beyond these therapeutic and non-medical applications. Neuralink aims to ultimately help people “surpass able-bodied human performance.”

According to a United Nations report, research and patents on neurotechnology have increased at least twenty-fold in the past two decades, and devices are becoming increasingly powerful. Newer devices have the potential to collect data from the brain and other parts of the nervous system more directly, at higher resolution, in larger quantities and in more pervasive ways.

However, these improvements have also raised concerns about mental privacy and human autonomy – questions I’m thinking about in my research into the ethical and social implications of brain science and neural engineering. Who owns the generated data and who should have access? Could this type of device threaten individuals’ ability to make independent decisions?

In July 2023, UNESCO, the United Nations’ agency for science and culture, held a conference on the ethics of neurotechnology, calling for a framework to protect human rights. Some critics have even argued that societies should recognize a new category of human rights: ‘neurorights.’ In 2021, Chile became the first country whose constitution addresses concerns about neurotechnology.

Advances in neurotechnology raise significant privacy concerns. However, I believe that these debates may overlook more fundamental threats to privacy.

A look inside

Concerns about neurotechnology and privacy center on the idea that an observer can “read” a person’s thoughts and feelings just from recordings of their brain activity.

It is true that some neurotechnologies can record brain activity with great specificity: high-density electrode arrays, for example, allow high-resolution recording from multiple parts of the brain.


Researchers can draw conclusions about mental phenomena and interpret behavior based on this type of information. However, ‘reading’ recorded brain activity is not straightforward. The data have already passed through filters and algorithms before a human sees the output.

Given this complexity, my colleague Daniel Susser and I wrote an article in the American Journal of Bioethics – Neuroscience asking whether some concerns about mental privacy might be misplaced.

While neurotechnologies pose significant privacy concerns, we argue that the risks are similar to those of more familiar data collection technologies, such as everyday online surveillance: the kind most people experience through Internet browsers and advertisements, or wearable devices. Even browsing history on personal computers can reveal highly sensitive information.

It is also worth remembering that inferring the behavior, thoughts and feelings of others has always been an important part of being human. Brain activity alone does not tell the full story; other behavioral or physiological measures, along with social context, are needed to reveal this kind of information. A particular spike in brain activity, for example, might indicate either anxiety or excitement.

However, that doesn’t mean there isn’t cause for concern. Researchers are exploring new directions in which multiple sensors – such as headbands, wrist sensors, and room sensors – can be used to capture multiple types of behavioral and environmental data. Artificial intelligence could be used to combine that data into more powerful interpretations.

Think for yourself?

Another thought-provoking debate surrounding neurotechnology concerns cognitive liberty. According to the Center for Cognitive Liberty & Ethics, founded in 1999, the term refers to “the right of every individual to think independently and autonomously, to use the full power of his or her mind, and to engage in multiple modes of thought.”

More recently, other researchers have revived the idea, such as in legal scholar Nita Farahany’s book “The Battle for Your Brain.” Advocates of cognitive liberty broadly argue for the need to protect individuals from having their mental processes manipulated or monitored without their consent. They argue that greater regulation of neurotechnology may be necessary to protect individuals’ freedom to determine their own inner thoughts and control their own mental functions.


These are important freedoms, and there are certainly specific features – such as those of newer BCIs and non-medical neurotechnology applications – that raise important questions. Yet I would argue that the way cognitive liberty is discussed in these debates treats each individual as an isolated, independent actor, neglecting the relational aspects of who we are and how we think.

Thoughts don’t just arise out of nowhere in someone’s head. For example, part of my mental process as I write this article is remembering and reflecting on colleagues’ research. I also think about my own experiences: the many ways I am today are the combination of my upbringing, the society I grew up in, the schools I attended. Even the advertisements my web browser sends to me can shape my thoughts.

To what extent are our thoughts uniquely ours? To what extent are my mental processes already manipulated by other influences? And keeping that in mind, how should societies protect privacy and freedom?

I believe that recognizing the extent to which our thoughts are already shaped and monitored by many different forces can help set priorities as neurotechnologies and AI become more common. Looking beyond new technology to strengthen current privacy laws can provide a more holistic view of the many threats to privacy and the freedoms that need to be defended.

This is an updated version of an article originally published on August 7, 2023.

This article is republished from The Conversation, an independent nonprofit organization providing facts and trusted analysis to help you understand our complex world. It was written by: Laura Y. Cabrera, Penn State


Laura Y. Cabrera receives funding from National Institutes of Health and the National Network Depression Centers. She is affiliated with IEEE and the International Neuroethics Society.
