Originally posted on sciencefocus.
We need to rethink our human rights for the wearable tech of tomorrow, according to neurotechnology ethics expert Prof Nita Farahany.
Through new tech, we’re now able to track our steps, our heart rate and even our vascular age. But as the technology advances, there is a new metric to track – our brain waves.
New ‘brain sensors’ promise much, but as Nita Farahany – an author and professor specialising in the ethics of emerging technologies – explains, we may need to rethink our basic human rights to prepare for them.
Are there really now devices that can access our brain waves?
Yes, but it’s a question of both scale and precision. There are millions of consumer brain wearables sold worldwide. These come in the form of headbands, or sensors that can be embedded into a hard hat or a baseball cap, to track brain activity. Algorithms interpret that activity, but right now they’re somewhat limited in what they can do. They can decode someone’s attention, engagement, whether their mind is wandering, and basic emotions like stress, happiness or sadness.
Major tech companies are investing in integrating brain sensors into everyday devices like earbuds, headphones or even wearable tattoos, in the same way we see heart rate monitors in watches and rings.
Some companies have announced that they plan to launch a neural interface as a way to interact with the rest of our technology in augmented and virtual reality by 2025.
What is it that these brain scans are actually measuring?
These are not mind-reading devices; they can’t understand our detailed thoughts. One common technology used is electroencephalography (EEG), which picks up electrical activity in your brain as you’re thinking or experiencing anything.
Neurones are firing in your brain in a way that sends characteristic patterns, giving off tiny electrical discharges that can be picked up by the EEG. Through powerful algorithms, these patterns are decoded. From this, we can measure attention, mind-wandering, and basic feelings and emotions.
Pair that with a screen someone is looking at, and we can track environmental data, too. Flash up political candidates from different parties on my screen while I have brain sensors attached and you could classify my responses to any particular party.
Researchers have also tried subliminally embedding information such as PINs or addresses in a gaming environment, to see if recognition of that information could be reliably detected from brainwave data.
If we’re able to ‘crack open’ the brain, how will this affect the mental health and wellbeing space?
There are devices approved for the treatment of depression through neurofeedback, but also through electrical stimulation of the brain. People could use the data to detect earlier stages of mental health disorders or neurological disorders in much the same way that people track their heart rate, breathing and the number of steps they’ve taken.
Tracking mental health data will likely be normalised with this objective data from the brain. Do you work best at home or in an office, based on focus and attention levels? Did that glass of wine affect your sleep? It can all be tracked alongside your mental health. A lot of companies are investing to crack open the brain and quantify what they find.
Could this trigger a brain hypochondriac movement?
It’s definitely possible. We know very little about our brains, but what we do know is that there is diversity between brains and brain activity for different people. In the early days of this, the algorithms may misidentify people as having something neuro-atypical happening in their brains. People will also be studying their own data to see if there is anything they should be worried about.
As people begin looking at their brain data, they may become needlessly worried about what they see there, in ways that could be problematic.
Often, technology is created and then the question of ethics is addressed at a later stage. What can we do beforehand to get ahead?
I advocate for recognising a right to cognitive liberty now as an international human right. That means updating our existing rights and our interpretations of them. That’s a good first step: setting both a global legal framework and a norm that recognises self-determination over our brains and mental experiences as fundamental.
It also prioritises personal data, giving consumers the right to control their personal information, over a company’s right to use it. Not a default rule where corporations can collect, commodify, mine and analyse that data for any purpose they wish. The starting place is to flip what has been a system that really favours corporations over individuals, giving people the right to the data from their brains.
What is cognitive liberty and why is it so important?
One of the biggest problems with the technology we’re talking about is the risk to our brains: the sense in which they can be accessed, tracked and hacked by technologies in ways that are contrary to human flourishing. Cognitive liberty is a right to self-determination over our brains, updating the concept of liberty for the digital age.
The human right to privacy should include mental privacy. The right to freedom of thought should be interpreted beyond ideas of religion and belief, to protect our robust thoughts and images in our mind from being accessed. And the right to self-determination should give us a right to access and to change our brains if we should choose to do so.
Is this definitely a technology of the future or could this be a Google Glass or Metaverse situation?
I’d be surprised if it doesn’t work out. After all, it doesn’t make sense for us to know so little about our own brains. Neurological disease and suffering are rising and their toll on an individual is extraordinary.
Our physical health and longevity are improving, but our mental health and wellness are declining. If this kind of technology gives us the tools to take charge of our own brain health, then it could truly be revolutionary. It’s not a novelty we don’t need; it’s fundamental.