A Sea of Data: Apophenia and Pattern (Mis-)Recognition. Hito Steyerl.

In her April 2016 article for e-flux, Hito Steyerl discusses how machines recognize objects, humans, and behavior as a process of pattern recognition and data gathering. She starts with the example of the shapes and faces we see when we look at clouds. She then explains that machines read information in a similar way: they observe a signal (one we cannot decipher with our human eyes) and interpret it to deliver some information. The article then turns to the issue of recognition by technology, in which we teach machines to recognize certain data and to interpret it according to signals we have predetermined beforehand, allowing them to ignore the so-called «useless» data, known as dirty data, and to use only certain specific data.

The article then makes us aware that the decisions and data delivered by these machines shape how we act in the real world: preventing some people from crossing borders, rescuing migrants, categorizing humans socially, giving them a role or a degree of importance in society. These machines' interpretations of our behavior become the basis of how we act in society. Steyerl then gives the example of the cosmos, which we have been studying for years but about which we do not necessarily have concrete data. What we know about the cosmos is therefore largely assumption; if we project this reasoning onto our society, what we teach machines is only an assumption of what life must be, of what things must be, which suggests that our societies were built on assumptions that we feed and anchor in our cultures through the authority and power we grant to the machine.

More concretely, the article explains the principle of apophenia: the tendency to perceive meaningful patterns, and thus messages, in random or meaningless data. It is particularly striking to learn that in Ancient Greece the words of men were regarded as signal while the words of women and children were treated as noise. This recalls the idea that machines analyze only the data they are set up to select and erase the dirty data considered noise, revealing a patriarchal society whose foundations are biased by the hierarchy of information and social roles.

This leads me to ask myself the following questions:

  • Is the idea of surveillance only a vague idea dictated by an elite so that we conform to that elite, or is it based on a real will to create a just society?
  • Are discipline, ways of being, and ways of expressing oneself abstract concepts whose limits we must retrace, which would mean seeing the state or society as the misinterpretation of one group of people? And can we change this society by changing the patterns we interpret, or simply by changing the interpretation we give to these patterns?