The Information Commissioner’s Office is warning that newly emerging neurotechnologies risk discriminating against certain groups of people if those groups are not put at the heart of the technologies’ development.
The regulator predicts that the use of technology to monitor neurodata – the information coming directly from the brain and nervous system – will become widespread over the next decade.
Neurotech is already used in the healthcare sector, where there are strict regulations. It can predict, diagnose, and treat complex physical and mental illnesses, transforming a person’s responses to illnesses such as dementia and Parkinson’s disease.
In May, Gert-Jan Oskam, a 40-year-old Dutch man who was paralysed in a cycling accident 12 years ago, was able to walk again thanks to electronic implants in his brain.
But neurotechnologies are rapidly developing for use in the personal wellbeing, sports and marketing sectors, and even for monitoring people in the workplace. The ICO says that if these technologies are not developed and tested on a wide enough range of people, there is a risk of inherent bias and inaccurate data being embedded in them – negatively affecting people and communities in the UK.
“To many, the idea of neurotechnology conjures up images of science fiction films, but this technology is real and it is developing rapidly,” said Stephen Almond, executive director of regulatory risk.
“Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour. The consequences could be dire if these technologies are developed or deployed inappropriately.
“We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination.”
Discrimination in neurotechnology could occur where models are developed with embedded bias, leading to inaccurate data and assumptions about people and communities. The risk of inaccurate data arises when devices are not trialled and assessed on a wide variety of people to ensure that data collection remains accurate and reliable.
Neurodivergent people may be particularly at risk of discrimination from inaccurate systems and databases that have been trained on neuro-normative patterns.
The use of neurotech in the workplace could also lead to unfair treatment. For example, if specific neuropatterns or information come to be seen as undesirable because of ingrained bias, people exhibiting those patterns may be overlooked for promotions or employment opportunities.
The ICO is developing specific neurodata guidance in the medium term. It will consider the interpretation of core legislative and technical neurotechnology definitions, highlight links to existing ICO guidance, set out the regulator’s views on emergent risks, and provide sector-specific case studies highlighting good practice, with the guidance expected by 2025.