
The dystopian tale has a special place in our shared cultural heritage.

Many of us will have a favourite, or perhaps several. I adored 1984 and The Handmaid’s Tale as a youngster; moved on to J.G. Ballard, then discovered Philip K. Dick thanks to Minority Report; and in recent years was floored by episodes of Black Mirror and videogames such as The Last of Us.

The thrill can be explained by one question: ‘What if this horror was actually happening?’

“People say Black Mirror and The Handmaid’s Tale are conspiracies, science fiction – but as a philosopher, I can see a lot of the elements in these films and books that are actually happening now,” Marie Oldfield, AI ethics consultant to the UK government and other organisations, tells BusinessCloud.

“People are indoctrinated and manipulated by social media… it’s very difficult to see that it’s happening because it’s done at such a low level.

“There are a lot of issues, but it’s easier to ignore them and get on with your life… it’s going to take one little step to tip that balance and then all of a sudden you’re in an episode of Black Mirror.”

Amazon Alexa

Oldfield is the founding director of Oldfield Consultancy, a consultancy specialising in analytical modelling and ethical artificial intelligence. She has led research on areas such as anthropomorphism and dehumanisation in AI and cyberspace, in addition to pedagogy and modelling best practice.

Her aim is to drive greater discussion about the ethical implementation of AI technology in both academia and industry, in order to protect against negative effects on society as a whole.

“Already, in a passive manner, [Amazon’s ‘smart’ assistant] Alexa is having negative consequences,” she claimed, speaking to BusinessCloud at the Digital Transformation Expo (DTX) in Manchester. “It is harvesting data then using it across different platforms to sell to people; and also using techniques to manipulate people into buying things that they may not necessarily need.

“If this then becomes a technology that is increasingly proactive – such as telling you when you should go to bed – my question is: what data is it collecting from your daily life [to arrive at this recommendation] and how is it collecting it? 

“Also, why are you giving up control of your life to a machine? This whole phenomenon of dehumanisation – where we’re not only devaluing ourselves and other people, but letting machines take control of aspects of our life – is worrying. 

“As humans, surely we strive for independence and freedom and decision making… we’re now starting to give that away because of the attachments that we’re building with this technology, which are not necessarily appropriate, but nevertheless happening. 

“It’s starting to become a very blurred line between what is technology and what is human.”

She adds: “If you’re at a point where Alexa is dictating your entire life, you do get to a point, philosophically speaking, where you ask: why are we alive? What are we here to do? How can we fulfil our ambitions and our desires? And how can we have a fulfilling life if actually we’re being controlled by the government, social media or technology?”


Elderly carers

Robots are even being deployed in UK care homes to keep elderly people company. This is completely inappropriate and perhaps dangerous, Oldfield suggests.

“If a robot has human features and talks, people may think that it has a mind of its own, that it can make goals for itself and build a relationship with them,” she says.

“When the technology doesn’t deliver on that, the person can become really angry and upset: they’re just not sure what relationship they’re supposed to have with it. 

“There are enough humans in the world – why are we content to abandon elderly people? How can you build a society where having fruitful interactions with each other as humans is neglected and replaced with a robot which looks a bit like a dog or a human – and people think that’s fine? 

“I fail to see how that’s fine, because that person is obviously lonely or socially isolated.”

She adds: “When you have people that are vulnerable in that way, if you have something like a proactive Alexa, it is very easy for it to start selling to and manipulating that person. That’s been seen in numerous studies.

“Do you want to leave your grandma or grandad in a room with these objects? There’s no assurance or control over that interaction at all.”

Children

She is also concerned that children are not equipped with an understanding of these technologies. “A child can become attached in not necessarily a positive way: they think that the technology is a human – like a parent or teacher – and if it tells them to do something, they’ll do it.

“They’re also not getting what they need from that relationship, so they can start to abuse the technology: there are examples of robots being beaten up or Alexas thrown across rooms. 

“There are a wide range of emotional reactions that a child could have to the technology.”

‘Diversity & inclusion is about progress, not perfection’

Oldfield is a fellow and executive board member of the Institute of Science and Technology, as well as an expert fellow for SPRITE+ and a member of the College of Peer Reviewers for REPHRAIN.

She sees a creeping need to take our phones everywhere – to board planes and access sports stadiums, for example – which she believes is moving us closer to a nightmare Minority Report scenario, with algorithms playing the role of crime-predicting pre-cogs.

“How do you operate in a world where you’re forced to be a digital citizen? If you’re visible, you’re therefore controllable,” she explains. “Where we are at the minute is a very developing situation, and it’s fluctuating quite substantially.

“If governments can get rid of cash, you’re more easily trackable. If you’ve got to have your phone with you at all times, and if you start to connect databases, everybody’s constantly being tracked. Where are you going? How are you getting there? When were you there? 

“All of a sudden, every single part of your life is in a database. What can then happen is – and these are already in development – algorithms to predict whether you will commit crime.”