The Human Component Speaks

Photo: Rainier Ehrhardt / Getty Images

A critical article on Facebook’s extensive data collection prompted heated debate on social media. The article was published in the magazine Morgenbladet and written by PhD student Anja Salzmann.

In her article, Salzmann points out that Facebook’s psychological experiments on unwitting users threaten human rights. Referring to Shoshana Zuboff, she argues that Facebook undermines our autonomy and basic democratic principles. Salzmann also cites Facebook representatives who describe the human component as a challenge. Are humans just friction in their technical systems? she asks, addressing Facebook’s business model and its view of humanity. The following text is a translated summary of Anja Salzmann’s article in Morgenbladet.

“Facebook knows so much more about us and our loved ones than we can imagine. It is no surprise that Facebook's communications manager in the Nordic region tries to curb growing public dissatisfaction with the company's ongoing practices. Reports on various Facebook activities, such as the so-called content moderators who have to cleanse the platform of deeply traumatizing content, have travelled around the world. These cases expose the most disturbing material human beings are capable of producing.

A recent report from Amnesty International is just the latest in a series of critical reviews of Facebook. The report shows that Facebook's business model, which involves collecting large amounts of data, threatens human rights.

Facebook makes good money collecting, combining and selling data from billions of people. In 2014, Facebook conducted several psychological experiments on unwitting users. Thanks to Harvard professor Shoshana Zuboff, these experiments are no longer a secret, but sadly, the revelation did nothing to change the bigger picture. You don't even need an account on the platform for the company to tap you for sensitive details.

Anja Salzmann, PhD, Department of Information Science and Media Studies, University of Bergen.

With this in mind, it is not enough to call for cooperation to find regulations. We have to consider far more fundamental issues concerning Facebook's business model, which undermines our autonomy, our self-rule and some of our basic democratic principles.

We need new technology and new mechanisms to orient ourselves in this growing social, political and economic transparency. Facebook presented us with a solution to an underlying human need: the need to be a social being in a world that, thanks to the internet, has shrunk to 'a global village'.

Surely, in the internet age, you can keep in touch with friends and family without Facebook's solutions. Of course, Facebook and other stakeholders will try to convince you that you can't, with phrases like 'We are connecting people' and 'For an open and connected society'. However, research actually points in quite a different direction. Facebook separates people and dissolves the major public arenas. This is especially dangerous in our time, when existential debates on climate, increased energy needs, overpopulation and geopolitical shifts of power are more pressing than ever before.

Where should we have the big discussions and debates if they are drowning in an infinite number of small echo chambers? The term 'The God's Eye View' originated in Silicon Valley, and it symbolises the concentration of power and control in the hands of a few people with access to the most advanced digital technology. In other words, by using algorithms that analyse trends, feeds, habits and private messages, Facebook is able to observe and manipulate 'the global conversation'.

When Facebook's representatives describe 'the human component' as 'a social media challenge', it gives me goose bumps. It suggests a deeply technocratic view of human beings. Are humans just friction in their technical systems?

Facebook knows so much more about us and our loved ones than we can imagine. In my eyes, this raises completely different concerns than merely finding regulations to address issues like fake news, waves of hate or outright streaming of terror or murder. Rather, we should talk about the business model itself and Facebook's view of humanity!”

Sara Pedersen Stene