In a future with more ‘mind reading’, thanks to neurotechnology, we may have to reconsider freedom of thought

Socrates, the ancient Greek philosopher, never wrote things down. He warned that writing undermines memory – that a written text is nothing more than a reminder of a previous thought. Compared with people who discuss and debate, he argued, readers will be “hearers of many things and will have learned nothing; they will appear omniscient and will generally know nothing.”

These views may seem quaint, but his central fear is timeless: that technology threatens thinking. In the 1950s, Americans panicked about the possibility that advertisers would use subliminal messages hidden in films to trick consumers into buying things they didn’t really want. Today, the US is in the midst of a similar panic over TikTok, with critics concerned about its impact on viewers’ freedom of thought.

For many people, neurotechnologies seem especially threatening, even though they are still in their infancy. In January 2024, Elon Musk announced that his company Neuralink had implanted a brain chip in its first human subject – though the company achieved this feat well behind its competitors. By March, that person could already play chess using his thoughts alone.

Brain-computer interfaces, called BCIs, have rightly sparked debate about the appropriate boundaries of technologies that interact with the nervous system. Looking ahead to the day when wearable and implantable devices may be more widespread, the United Nations has discussed regulations and restrictions on BCIs and related neurotechnology. Chile has even enshrined neurorights – special protections for brain activity – in its constitution, while other countries are considering doing so.

A cornerstone of neurorights is the idea that all people have a fundamental right to determine the state of their brains and who can access that information, just as people normally have the right to determine what happens to their bodies and property. This idea is usually equated with ‘freedom of thought’.

Many ethicists and policymakers believe that this right to mental self-determination is so fundamental that it is never okay to undermine it, and that institutions should impose strict limits on neurotechnology.

But as my research on neurorights shows, protecting minds is not nearly as simple as protecting bodies and property.

Thoughts versus things

Creating rules that protect a person’s ability to control what is done to their body is relatively simple. The body has clear boundaries and things that cross it without permission are not allowed. Normally it is obvious when someone is breaking laws that prohibit assault or battery, for example.

The same applies to regulations that protect someone’s property. Protecting body and property are some of the main reasons why people come together to form governments.

In general, people can enjoy these protections without dramatically limiting the way others choose to live their lives.

The difficulty in establishing neurorights, on the other hand, is that, unlike bodies and property, brains and minds are constantly under the influence of outside forces. It is not possible to shield one’s mind so that nothing enters.


Instead, a person’s thoughts are largely the product of the thoughts and actions of other people. Everything from how people perceive colors and shapes to their most fundamental beliefs is influenced by what others say and do. The human mind is like a sponge, absorbing whatever it is immersed in. Regulations might control what liquids go into the bucket, but they cannot keep the sponge from getting wet.

Even if that were possible—if there were a way to regulate people’s actions so that they did not affect the thoughts of others at all—the rules would be so cumbersome that no one could do much of anything.

If I am not allowed to influence the thoughts of others, I can never leave my house, because by doing so I cause people to think and act in a certain way. And as the Internet continues to expand a person’s reach, not only would I not be able to leave the house, I wouldn’t be able to “like” a post on Facebook, leave a product review, or comment on an article.

In other words, protecting one aspect of freedom of thought – one’s ability to shield oneself from outside influences – may conflict with another aspect of freedom of thought: freedom of speech, or one’s ability to express ideas.

Neurotechnology and control

But there is another problem: privacy. People may not have complete control over what goes into their heads, but they arguably should have significant control over what comes out – and some people believe that societies need neurorights regulations to ensure that. Neurotechnology represents a new threat to people’s ability to control which thoughts they reveal to others.

For example, there are ongoing efforts to develop wearable neurotechnology that can read and adjust the wearer’s brain waves to help them improve their mood or sleep better. Even if such devices are used only with the wearer’s consent, they still extract information from the brain, interpret it, store it and use it for other purposes.

In experiments, it is also becoming easier to use technology to gauge someone’s thoughts. Functional magnetic resonance imaging, or fMRI, can be used to measure changes in blood flow in the brain and produce images of that activity. Artificial intelligence can then analyze those images to interpret what someone is thinking.

Critics of neurotechnology fear that as the field continues to develop, it will be possible to extract information about brain activity whether or not someone wants to release it. Hypothetically, that information could one day be used in a range of contexts, from research into new devices to courts.


Regulation may be necessary to protect people from having information extracted by neurotechnology. For example, countries could ban companies that make commercial neurotech devices, such as those intended to improve the wearer’s sleep, from storing the brainwave data those devices collect.

Still, I would argue that it may not be necessary or even feasible to protect us from neurotechnology putting information into our brains—although it is difficult to predict what capabilities neurotechnology will even have in a few years.

Partly this is because I believe people tend to overestimate the difference between neurotechnology and other forms of external influence. Think of books. Horror writer Stephen King has said that writing is telepathy: when an author writes a sentence—for example, describing a shotgun above the fireplace—it evokes a specific thought in the reader.

Furthermore, there are already strong body and property protections in place, which I believe could be used to prosecute anyone who forces invasive or wearable neurotechnology on someone else.

How different societies will deal with these challenges is an open question. But one thing is certain: with or without neurotechnology, our control over our own minds is already less absolute than many of us like to think.

This article is republished from The Conversation, an independent nonprofit organization providing facts and trusted analysis to help you understand our complex world. It was written by: Parker Crutchfield, Western Michigan University

Parker Crutchfield does not work for, consult with, own shares in, or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.