“Nothing really matters because, really, we’re all just code.” These are the words of wisdom my tin-foil-hat-wearing flatmate uses to console me whenever I am down. Put simply, he means “don’t cry over spilled milk. Trivial things don’t matter because, in the end, all that remains of us is a random arrangement of 0s and 1s floating through space.”
Yet his comment also hints that we are a small part of a bigger system, and there is much we can’t control. While I’m not one to suppress emotion, ignore my problems and pretend everything is fine, my flatmate raises an interesting point about our identities and how we perceive life, especially in the modern technological age.
Of course, our biological material is coded. Our identities are formed by a sequence of molecules that make up genetic data. But we are also binary data. Our lives, now more than ever, are determined by algorithms. Technology is becoming our new reality, and most of our existence is shared through a screen. In many ways, this has been something to celebrate.
The algorithms behind this technology have revolutionised medicine, science, transport, communication, and so much more. Algorithms have made our lives better. So, even if we are just a tiny fragment of code in an infinitely vast coded space, surely being part of something so revolutionary is a good thing, right?
Not necessarily. Being influenced by code and algorithms is also something worthy of concern. Within the last three years alone, there have been dramatic scandals over data privacy breaches that supposedly influenced elections, “mutant algorithms” nearly derailed the exam results of thousands of students, and the introduction of experimental algorithmic technology into the employment sector has raised serious ethical concerns. John Cheney-Lippold, author of We Are Data, says, “our identities are made useful not for us—but for someone else,” and this is the subject of Netflix’s recent documentary, The Social Dilemma (2020).
In the documentary, multiple Silicon Valley insiders reveal that, to any major online company, we are a product. More specifically, they say, “It’s the gradual, slight, imperceptible change in your own behaviour and perception that is the product.” Corporations like Google, Facebook and YouTube want to maximise our attention on screen and our engagement through clicks and continual swiping because this is how they make money. They deliberately use algorithmic patterns of persuasive technology to make us feel a certain way. Tristan Harris, a former design ethicist at Google, says technology is like magic: a hidden kind of deception, a trick of the mind. By deliberately manipulating what we see to pique our interest, they also manipulate what we do, how we think and who we are.
The Social Dilemma closes with the fear that, if this continues, these algorithms will begin to shape public opinion on weighty issues like politics and religion. The result could be existential. If we keep being deceived into attending to the algorithmic illusions displayed online, we will lose sight of ourselves. Now, due to the pandemic, it can’t be helped that we primarily exist online. So it is worth considering how being part of this technological world impacts us daily, perhaps even hourly. While my friend’s heart was in the right place when he suggested I accept what I can’t control, in truth it all matters precisely because we are just code. The question is, should we feel comforted by that or afraid?