Algorithms have so far analysed human identity merely for economic reasons. The result is dangerously far removed from our reality.
There are two important ways, among many others, in which technologies identify us: in terms of what we search and in terms of what we share. Take search first. Many years ago, when I visited Google’s headquarters in Mountain View for the first time, I saw countless monitors showing the world’s most popular searches scrolling in real time. It was mesmerizing. And it made vividly clear that what we want distinguishes each of us much more profoundly than what we have. Like sponges, we are full of pores and channels through which we maintain a constant flow of information to sustain our mental lives. Every query tells the story of a need, of a problem, of a difficulty, of a curiosity, of a concern, of a doubt, of a hope or a worry, of a wish or a desire, of an itch that needs to be scratched. And those stories soon add up to a unique individual. The famous German philosopher Ludwig Feuerbach (1804-1872) once wrote that “Der Mensch ist, was er ißt”, “Man is what he eats”. The truth is that, today, we are what we google. In the long run, our queries draw the contours of our identities. This also holds true for any online shopping, from Amazon to Apple, from Expedia to TripAdvisor, from your preferred fashion retailer to your favourite supermarket.
Consider next our sharing. On Facebook, on Twitter and on social media in general, we mostly broadcast; we hardly ever communicate, in the sense that it does not matter whether there is anyone reading or listening on the receiving side. We are like radios sending signals to the far corners of the universe, even if they are not inhabited. Such sharing, too, betrays who we are. We are information channels: what we transmit, and how we transmit it, reveals our identities.
There are many reasons why we give away our identities so easily. As far as searching is concerned, we are not used to seeing ourselves as clusters of missing information, and so we struggle to realise that we may easily be defined negatively, by all our wants. Searches do not paint us; they carve us, to paraphrase Michelangelo. Moreover, search engines easily distract us, making us believe that all we are doing is finding something we need, instead of uncovering ourselves as the kind of individuals who have such needs. As for the sharing, our soliloquising is now deafening and inevitably public. It is not that we no longer care about privacy. It is rather that a broadcasting source treats the universe as its audience, and with respect to such an audience the private vs. public distinction hardly applies. It would be like saying that two actors have no sense of privacy because they make love in a film. Wrong context: of course they may still have very private lives. But also a mistaken assumption: we behave as if we were stars in a film, when in fact our onlife experience (that increasingly common mix of offline and online) is our real life, and so the loss of privacy is very concrete.
Our digital technologies are designed to make us feel relaxed about our lack of privacy. They facilitate, indeed invite, our individual searches and sharing, because they need our data to tailor to our profiles the advertisements that sell us services and products. “Know your customers” is the mantra, and such knowledge is obtained by treating people first as users and then as buyers. In this scenario, there is at least one major advantage, a great risk, and a possible way forward.
The advantage is one of ease and comfort. Customisation means having the world adapt to what we need, want, expect, fear, desire, hope, or wish. It is nice and enticing to receive the right discount, at the right time, for the right goods we were planning to buy anyway. And recommendations based on our interests are better than random ones, based on anyone’s taste. The risk, however, is that our digital technologies may easily become defining technologies rather than merely identifying ones. They may move from being able to spot who we are to actually making sure that we become who they say we are, and do not change. This is because our personal identities are incredibly malleable. We can easily be influenced, nudged, pushed and pulled. If this happens constantly, relentlessly, year after year, the relationship between our digital profiles and our selves becomes one of mutual interaction. The drops end up shaping the stone. So we may become self-fulfilling prophecies, in terms of what our technologies invite us to develop, or facilitate, or eradicate, or suppress or… in our personal identities: a passion for football, an antipathy towards some ethnic groups, a willingness to make a sacrifice to purchase a more expensive and fancier car, or an unwillingness to consider a less popular drink... No wonder we become predictable: we have been made predictable. Nor do our technologies have any interest in our development and transformation: quite the opposite. They would like a customer who likes something to keep liking that something, and anything else similar to that something. Cat lovers turning at most into kitten lovers, not dog lovers. Amazon’s recommendation system can only reinforce choices and tastes, making them more stable and more predictable. So can the “smart” algorithms behind the newsfeeds of Facebook and Instagram. Our malleability is used to give us a permanent shape, not to enable us to change shape.
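To make the mechanism concrete, here is a minimal, purely illustrative sketch of the feedback loop just described: a recommender that always serves whatever a user already prefers, and a user whose tastes are slightly malleable. The category names, the malleability rate and the number of rounds are arbitrary assumptions for the sake of the example, not any real platform’s algorithm.

```python
# A purely illustrative sketch of the feedback loop described above:
# a recommender that exploits existing tastes, and a user whose tastes
# drift towards whatever is recommended. All values are assumptions.

MALLEABILITY = 0.05  # how much a single recommendation nudges the user's tastes

# A mild initial leaning towards "cats".
preferences = {"cats": 0.30, "dogs": 0.25, "football": 0.25, "fashion": 0.20}

def recommend(prefs):
    """Pure exploitation: suggest the category the user already likes most."""
    return max(prefs, key=prefs.get)

def consume(prefs, item):
    """Consuming a recommendation pulls the user's tastes towards it."""
    for category in prefs:
        target = 1.0 if category == item else 0.0
        prefs[category] += MALLEABILITY * (target - prefs[category])

for _ in range(50):
    consume(preferences, recommend(preferences))

print({c: round(p, 2) for c, p in preferences.items()})
# roughly {'cats': 0.95, 'dogs': 0.02, 'football': 0.02, 'fashion': 0.02}
# The mild 30% leaning has become near-total: the drops have shaped the stone.
```

Even a mild initial leaning ends up dominating completely; the system has not so much discovered a preference as manufactured one, which is exactly the sense in which identifying technologies shade into defining ones.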
There are ways of improving all this: more socially aware ways of managing our technologies, and more education and awareness on our side. But I suspect the most powerful incentive would come from breaking the vicious circle whereby our digital technologies tend to create monopolies (The Google, The Facebook, The Snapchat, The… name your company), which undermine competition, decrease choice, and profile people as users of free services in silos of activities. One way to break it would be to make online advertising illegal. The whole industry of social media would have to reinvent itself as an industry based on customer relationships, with customers having plenty of rights, and companies competing to sell the best products. Our money, and not our personal data, would be the currency. I doubt it will ever happen, but steps in this direction may be taken. We took them with the tobacco industry, with the pharmaceutical industry, with the drink and food industry. In other words, we took them whenever our bodily identity was at risk. We should start regulating advertising in the digital industry now that our informational identity is under threat. An advertisement-free infosphere would be a better place, and it would put an end to the strange predicament whereby the technologies that can empower us so much to express ourselves are also the technologies that can so effectively mummify who we are and can be.
About the author:
Luciano Floridi is Professor of Philosophy and Ethics of Information at the University of Oxford, where he is also Director of Research at the Oxford Internet Institute and Governing Body Fellow of St Cross College. His research concerns primarily the Philosophy of Information, Information and Computer Ethics, and the Philosophy of Technology. His most recent book is The Fourth Revolution: How the Infosphere is Reshaping Human Reality (Oxford University Press, paperback 2016).