This article is published in partnership with F-Secure
In 2004, a 30-metre-high wall of water caused by an undersea earthquake hit the coastal communities of South East Asia, killing 230,000 people. Its long-term impact was even greater: it dislocated communities, split families and, as well as wiping out lives, wiped lives clean.
Typical among its victims were Mustafa, an Indonesian businessman, and his 14-year-old daughter Rina. Mustafa lost his wife and daughter (Rina’s elder sister), and the house they lived in was virtually destroyed. Along with their family, their memories and the bric-a-brac that made up their lives and their former identity were washed away.
Mustafa and Rina were not alone in saying that they had "lost everything".
Such experiences, while not on the same dramatic scale, are not uncommon. They happen to nearly all of us because of a common misconception about identity and what goes into creating it.
It is a philosophical concept known as "the extended mind", and one that the technology industry and governments are now acutely focussed on. For most of us, our memories live in our surroundings and in our mementos. And now, more and more, those memories, thoughts, desires and dreams are making their way onto our laptops, tablets and mobile phones.
Put simply, the extended mind is the web of memories, places, objects, actions and things that go to make us up.
“It’s the idea that our minds are not confined to what goes on inside our skulls, but that other information processes like our notebooks, our environments, maybe other people, or our laptops, also contain some parts of our minds or parts of our memories,” said Oxford University’s Professor Nick Bostrom, the Swedish head of the Faculty of Philosophy and Oxford Martin School, who is also Director of the Programme on the Impacts of Future Technology.
Our smartphones have become the lumber-room of our extended minds and have the power to yield huge amounts of information, a store that technology companies like Google and Facebook are eager to exploit. Google already offers identity verification services to several US government departments, and it is using our data to profile us: acquiring more information about us improves the services it offers.
All of us have special memories and places – a waterfall, a theatre, a park bench – places that had some significance in our past and are part of our extended mind. We store our souvenirs, our artefacts, our photo albums, in attics and garages in the belief that we can access and refresh a memory when the need arises.
Now we are being encouraged to preserve those memories by also putting them into a digital vault. Our memories are stored in our photos and our GPS data, but in return for the technological tools to be able to do this, companies and governments now demand unfettered access to our information, as Edward Snowden has revealed.
Poets and philosophers have reflected on the fact that we all have a different memory of events. Now, all that has changed because the technology companies want to make money out of the data of our extended minds. The companies are ensuring that not only can we never forget, but also that we are never allowed to forget.
And the reason is simple. Our identity and our memories are being sold to nearly every company in the world so that they can identify the particular individuals who fit a profile. The companies profile those individuals from the data culled from their extended minds: the things that they like, their opinions and the places that mean most to them.
This big data world even allows technology companies to work out who our friends are. For example, by searching for clusters of credit card numbers and noting the time and place where they are used, companies can find out who knows whom. Ironically, this pursuit of our basic data is robbing us of our misconceptions of the past, ironing out and homogenising differences and, for the first time, generating a shared memory grounded in hard data.
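The kind of co-occurrence analysis described above can be sketched in a few lines. Everything here is hypothetical: the card numbers, merchants and the five-minute window are invented for illustration, not drawn from any real payment system.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical transaction records: (card, timestamp in minutes, merchant).
# All data is synthetic; real systems work at vastly larger scale.
transactions = [
    ("card_A", 600, "cafe_1"),
    ("card_B", 602, "cafe_1"),
    ("card_C", 845, "cinema_2"),
    ("card_A", 915, "restaurant_3"),
    ("card_B", 917, "restaurant_3"),
]

def co_occurrences(records, window=5):
    """Count pairs of cards used at the same place within `window` minutes."""
    pairs = defaultdict(int)
    for (c1, t1, p1), (c2, t2, p2) in combinations(records, 2):
        if c1 != c2 and p1 == p2 and abs(t1 - t2) <= window:
            pairs[tuple(sorted((c1, c2)))] += 1
    return dict(pairs)

print(co_occurrences(transactions))
# Cards A and B turn up together twice - a plausible "knows each other" signal.
```

Repeated co-occurrence, not any single coincidence, is what makes such inferences persuasive to an analyst: two cards at the same café once means little, but the same pair at a restaurant hours later starts to look like a relationship.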
Social media is now a shared mind, a collective memory.
Writers have warned us about this world, from Dickens in Hard Times to the bleak authoritarian surveillance regime of George Orwell in Nineteen Eighty-Four.
In Hard Times, facts are everything, and the world of technology is now creating that world. It is a world where human frailty is progressively being lost – where we ourselves are unable to get lost because of our satnavs, and unable to forget because of the mass of data stored on us by our friends on social media and the companies that harvest those memories. We are not only being stripped of our frailty; this new world of reason turns us into data and keeps tabs on us all of the time to ensure we behave in a way that does not cause concern.
It is a process in which we also lose one of the central themes of our existence. Being lost is an idea that has been a stock-in-trade of books and films since the beginning of the written word. Taken to its logical extreme, we will never ever be lost again, not even to death.
Dr Jonathan Cave of Cambridge University’s Centre for Science and Policy says we lose this human frailty at our peril.
“One of the things that computer scientists believe is that if we can be freed from some of the weaknesses that we have, that we become effectively immortal because death no longer becomes a problem, because we can preserve the mind and refresh the body and the weaknesses of memory or dementia can be made to recede.”
So our past and the events that make us could be stored on a USB stick and then fed into a clone of ourselves. For many, an attractive and even reassuring prospect – except that our data copy is not us; it is us plus. This data copy of our identity is an entity that knows us better than we know ourselves: modern databases now hold some 1,500 data points about every individual’s life. Machines remember the things we forget or choose to forget.
This more "complete" view of our identity – whilst technically our data record – is not how we perceive ourselves, and so is not our identity at all.
Thus, we are losing fundamental rights: the right to be forgotten, the right to lose our way and the right to be forgiven – all key parts of our life and our culture until now.
Viktor Mayer-Schoenberger, the Professor of Internet Governance and Regulation at Oxford University and author of the acclaimed book Big Data, shares this view.
“In essence, too much of a comprehensive digital memory might make it almost impossible for us to see the forest; we might only be able to see the trees, and that makes it hard for us to take decisions in the present because we are always remembering the decisions of the past.
“In that sense, forgetting plays a very important role,” said Mayer-Schoenberger, an Austrian who once lost 10 years of correspondence when two of his hard drives failed.
“For two days I was crying, I was very depressed. And then I just got up and got on with things and it really had no impact on me.”
However, the collection of data on our habits and preferences does have an impact on us. It means that for the first time we have allowed both companies and governments to enter our intimate personal worlds on a systematic and continual basis – despite repeated proof that both types of organisation are untrustworthy.
For example, Google has laid claim to the exterior views of our homes and can now unite them with other information about us. This means Google is effectively populating the street with the sort of information that in the past was only available to our neighbours. It knows where we live, who we are, what our house looks like and indeed who our neighbours are. All information that is part of the identification services that are now being offered by companies like Google, and now Google is laying claim to our bodies too.
According to Professor Fred Cate, a Distinguished Professor at the Indiana University Maurer School of Law and a privacy expert, the amount of data that has been collected on us just via CCTV cameras is alarming because the potential now exists to mine through it using sophisticated algorithms that can track our every move.
“That type of data was collected in a world in which we thought the usefulness of the data was limited. If I didn’t see a crime being committed, the data was otherwise going to have no value at all. Today, with facial recognition technology and gait recognition technology there’s an ability to match data that frankly we didn’t have even two years ago, so those visual images now have all sorts of new life and new uses,” said Cate, who is in a position to know: he is also a member of the US Department of Homeland Security’s Data Privacy and Integrity Committee Cybersecurity sub-committee and the Department of Defense Advanced Research Projects Agency Privacy Oversight Board.
Professor Cate thinks this means that we should delete CCTV and start again: “Either we should start over – or we need some sort of notification system, so that individuals and groups as well aren’t being unfairly discriminated against.”
That risk is all too real: currently politicians and the public erroneously believe that surveillance and data monitoring are a panacea for crime and terrorism. According to Mayer-Schoenberger criminal data is considered by many in society to be fair game for retention.
“In the US, state penitentiary departments sell mug shots of prisoners to whoever is prepared to pay a certain price and a company bought hundreds of thousands of pictures of past prisoners and put them on websites.
“The company that provides this ‘service’ is actually offering the possibility of taking the name off the website for a hefty fee.
“So in that sense it is blackmail – I have no other word for it – of those who are trying to re-socialise themselves, trying to reintegrate themselves into society.”
This effectively means that the technology is now undermining another central tenet of our society: the right to forgiveness.
Now the police in the UK and the US are developing systems that will use criminal profiles to find people who are likely to become criminals.
“The problem is that if we do that then we don’t know for sure whether or not a person would actually have committed the crime because every prediction based on big data is probabilistic, it’s based on probabilities,” Mayer-Schoenberger continues.
“Even if there is a 90 per cent chance that I would commit a murder in the next 48 hours, in one out of ten cases I would be sent to prison even though I might not have committed that murder and would have put away the knife and walked away from the crime scene.
“There would be a terrible temptation to become involved in a system of predictive social control, a system of social control which slaughters human volition at the altar of collective fear.”
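The arithmetic behind Mayer-Schoenberger’s objection is easy to make explicit. The figures below are purely illustrative – a hypothetical thousand people flagged by a system claiming 90 per cent accuracy – not drawn from any real predictive-policing programme.

```python
# The quote's arithmetic, made explicit: acting on a 90%-accurate
# prediction still punishes the wrong person one time in ten.
flagged = 1000      # hypothetical number of people flagged as likely offenders
p_correct = 0.9     # the system's claimed accuracy

wrongly_imprisoned = round(flagged * (1 - p_correct))
print(wrongly_imprisoned)  # -> 100 people jailed for crimes they would never commit
```

And because would-be murderers are vanishingly rare in any general population, the real-world picture is worse than this: even an accurate-sounding test applied to millions of ordinary people will flag far more innocents than offenders.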
Our lives can already be recorded in great detail. Names and identities can be pinned to video of our movements. For example, in UK airports, technology monitors the movements of mobile phones to ensure that their owners are behaving in a way that is normal for someone in an airport.
It’s a world that is a long way away from the tidal wave that swept away Mustafa and Rina’s former lives. They were fortunately reunited and have decided that, should it happen again, they will both go to a meeting place known only to them, so they will never be lost again.
Now, with new technology, they probably will not need to do that. Their extended minds will be returned to them and they will be able to meet virtually. But, in return for the service, they will have sold their lives to people who might not have their best interests at heart, people who will even know where their secret meeting places are.
We are selling ourselves, our birthright, for a mess of pottage.