The Prison of New Technology

There is a line in Neil Postman’s Amusing Ourselves to Death that has been haunting me for months now. It reads in part: “people will come to love their oppression, to adore the technologies that undo their capacities to think.”

The idea is counter-cultural. We believe, instinctively, that new technologies empower us to think. The grand narrative of the internet, for instance, is that it connects us, empowers us and gives us access to new information.

But what if this popular conception is wrong?


Social Media & Systems of Control:

When we talk about systems of control, we tend to imagine police states: autocratic regimes where people are arrested, rounded up and sent to labour camps for ‘re-education’.

Control in common parlance is synonymous with power, and power is synonymous with violence. A controlling government, we imagine, uses violence to suppress its people. In George Orwell’s 1984, for example, we see the classic picture of a Big Brother police state where every rebellious thought is met with violent retribution by government officials: books are banned, activists are arrested and everyday people are spied upon in their living rooms. It is only through fear, we learn, that people obey.

In 1985, Neil Postman argued that George Orwell was wrong, and that it was in fact Aldous Huxley’s vision of the future which we should be worried about. Yes, he says, we can be controlled by violence and fear. But we can also be controlled by pleasure and addiction. Huxley feared not a giant police state, but our own personal vices leading us towards a ‘trivial culture’. He feared a culture that celebrated the mundane and the ordinary above the profound and the meaningful. He feared a culture so overloaded with information that it ignored poverty. He feared a culture that abandoned the meaningful in favour of entertainment.

It is very much this second vision that we face today: a world where pleasure and entertainment have come to dominate every area of our lives. Since the invention of the television, the computer and the internet, it has become increasingly clear that pleasure, and not only violence, can control what people think, do or say. Apps, software and new devices have built-in control mechanisms designed to foster addiction, passivity and a bland absorption of mass information – in short, paralysis.

Consider the Facebook feed. Like most online media, Facebook’s feed is a deluge of information so substantial in volume that it is almost impossible to absorb. What results is either passivity (the inability to take in all that information) or gut reactions (the instant, often misdirected, emotional response to it). In both cases, there is a lack of deep, critical thinking about the information one is encountering. Whether we are directed by technology towards passivity or towards emotionality, it is clear that we are in fact being directed.

Neil Postman once wrote that television news was psychotic in nature. A presenter will deliver a story about a genocide next to a story about fluffy animals, next to a crime report, next to the weather, all while smiling and acting with the utmost professionalism. The news becomes a place devoid of real emotion and overloaded with fluff – banter, smiles, laughs – regardless of what is being said in the bulletin. Feeds on most social media platforms are similarly psychotic. A Facebook feed might show a wedding, followed by a genocide, followed by a party, followed by a famine in Africa. How are we meant to respond to this? Is it human, or even possible, to respond in the correct way? One might imagine we should be happy about the wedding, terrified by the genocide, joyous about the party and then sad about the famine. To achieve such an emotional connection with the content would require a kind of psychosis seen only on the fringes of society. Instead, we tend to ignore most of what we see. Information overload breeds a psychological detachment from the situation. The world might be burning in hundreds of different ways, but we are only capable of caring about one of those ways at a time.


Getting our Attention:

Notifications:

We have been led to believe, through popular media, that technology has freed up our capacity to think, empowered us to connect and given us access to new information. But what if it has done the opposite? What if, along with giving us information, technology has begun to control what information we see, how we see it and what we do with it, and has therefore begun to control what we think and do?

Consider the notification. The standard way of using a smartphone is to allow notifications from every application. Taken across email, messaging, social media, updates and so on, this results in a deluge of notifications every hour. Most people appear addicted to notifications, in the sense that they feel compelled to look at them and unable to ignore them. We must know exactly what the notification is, which app it came from and what the message says. The sound (that light, inoffensive ping), the screen lighting up and the vibration all trigger a chemical reward structure in our brain, releasing dopamine and making us excited about what we might find.

Studies have shown that people experience mild hallucinations regarding phone notifications. They often imagine that they have received a notification, only to check their phone and find nothing. They feel a vibration, but no, nothing occurred. These kinds of mild hallucinations are associated with addiction: sometimes, if we want something badly enough, our brains make it happen on our behalf.


Changing our Thoughts:

Recommendations:

We might consider the ways in which our thoughts are being shaped by new technology.

Consider the recommendation. YouTube, Amazon, Facebook and other platforms ‘recommend’ content to you based on the content you have already engaged with. If you watch a video on cyclones, you will suddenly see recommendations for dozens of weather videos. If you buy a pair of shoes, you will suddenly see hundreds of recommendations for other shoes. On the surface this appears innocuous, even helpful: it is great to get more specific recommendations based on the things we like. But what happens when these recommendations accumulate over months and years? What happens if, every time you go to a site, you keep being recommended the same things? The answer is obvious: what you think about while on that site begins to narrow.
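To make the mechanism concrete, here is a minimal sketch of that feedback loop in Python. It is a toy, not any platform’s actual algorithm: the catalogue, the topic tags and the weighting scheme are invented for illustration. The point is only that ‘recommend more of whatever was engaged with’ is enough, on its own, to narrow what a user is shown.

```python
import random
from collections import Counter

# A toy catalogue of videos, each tagged with a single topic.
# (Hypothetical data: no real platform works from a list this small.)
CATALOGUE = [
    {"title": "Cyclone Tracker", "topic": "weather"},
    {"title": "Storm Chasers", "topic": "weather"},
    {"title": "Hurricane Season", "topic": "weather"},
    {"title": "Baking Sourdough", "topic": "cooking"},
    {"title": "Knife Skills", "topic": "cooking"},
    {"title": "Guitar Basics", "topic": "music"},
]

def recommend(history, k=3):
    """Recommend k videos, weighted towards topics already watched.

    This is the whole 'algorithm': count which topics appear in the
    watch history and surface more of the same. Every watch makes the
    next batch of recommendations look more like the last one.
    """
    topic_counts = Counter(video["topic"] for video in history)
    weights = [1 + topic_counts[video["topic"]] for video in CATALOGUE]
    return random.choices(CATALOGUE, weights=weights, k=k)

# One watched cyclone video is enough to tilt the feed towards weather,
# and every further weather click tilts it harder: the narrowing loop.
history = [{"title": "Cyclone Tracker", "topic": "weather"}]
for _ in range(5):
    batch = recommend(history)
    history.extend(batch)  # pretend the user watches whatever is served
    print([video["topic"] for video in batch])
```

Run it and the printed topic lists tend to drift towards ‘weather’ within a few rounds. No malice is required; the loop itself does the narrowing.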

It is clear that recommendations have begun to narrow our capacity to think by narrowing what we spend our time thinking about. This is the second evolution of advertising: the capacity to direct our thoughts away from randomness (thinking about whatever happens to occur to us) and towards specific actions (buying more shoes).

Over time, this can lead to a strange siloing effect online. If you like a certain political party, for example, you might start seeing only content from that side of politics. This can leave you more narrow-minded, more ideological and even more extremist in your viewpoints. The narrowing effect sets out a pre-defined path for our thoughts to travel down, and the algorithms keep reinforcing the journey until we reach the other side.

Many people have started accessing YouTube and other services while logged out of their accounts, so that they can still engage in the kind of random discovery that was so quintessential to the early days of the internet. There is something fantastic about stumbling on something new and unexpected, and it is these moments that people chase when they move away from recommendation lists.

Regulation:

The law is always ten years out of date when it comes to new technology. Facebook achieved widespread popularity around 2007, and it is only now, in 2018, that governments around the world are considering regulating it. There have been proposals, for example, to blunt the addictiveness of social media by having a pop-up that tells users: ‘You have been using the app for more than 30 minutes. Do you want to continue?’
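Part of what makes such proposals plausible is how simple the feature would be to build. Below is a hedged sketch, in Python, of what the proposed thirty-minute check might look like; the threshold comes from the proposal above, while the function name and structure are invented for illustration, not drawn from any actual platform or regulation.

```python
import time

SESSION_LIMIT_SECONDS = 30 * 60  # the proposed 30-minute threshold

def check_session(started_at, now=None):
    """Return the prompt text once a session passes the limit, else None."""
    now = time.monotonic() if now is None else now
    if now - started_at >= SESSION_LIMIT_SECONDS:
        return ("You have been using the app for more than 30 minutes. "
                "Do you want to continue?")
    return None

# Example: a session that began 31 minutes ago triggers the prompt.
started_at = time.monotonic() - (31 * 60)
prompt = check_session(started_at)
if prompt:
    print(prompt)
```

A few lines of code, in other words, are all that stand between users and a basic speed bump.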

These and other basic measures can be seen as safety features. Online, we face the threats of addiction, narrow-mindedness and a loss of control over our own thinking, and all of these require regulatory frameworks in response. It is not enough to trust companies to self-regulate on these matters: social media companies benefit most from the customers who are most addicted to their technologies. Law is a necessary component of fixing these kinds of societal problems.