I was waiting for this moment: the point when, after a whole term discussing the value of authenticity, we see that authenticity is over.

The graphic above should be understood as a provocation, but an interesting one. It traces the meaning of authenticity from the early 2000s, when 'hipster' meant an obscure, non-commercial reference that nobody had heard of. Soon after, the same attributes that signaled anti-capitalist values (small, local, craft, community) were applied to large-scale production and audiences, and authenticity started to take on a very different sense. It was no longer an oxymoron to be authentic, to be yourself, for a large number of anonymous subscribers (but please like), or to sell magazines en masse on the right way to be authentic (like Kinfolk or the Pinterest culture). Suspiciously, authenticity came to mean not difference but similitude: popular YouTube vloggers all performed a very similar version of it.

And then, authenticity was over.

Not that unexpected, though.

If we pay attention to the development of music production, Auto-Tune has been widely incorporated (in every sense of the word) since the late 1990s. And it has not always been incompatible with what we call authentic.

The notion of authenticity, of course, has a longer history and different incarnations across time and cultures. Curiously, though, it is often something that someone claims over another culture/person/group, and not something that one can claim for oneself.

Coming back to our topic, how is this post-authenticity represented? Where can it be found?

We are talking about post-authenticity, in this context, when the idea of the authentic no longer reads as something that is necessarily real. Or, to put it differently, when the value that we associated with authenticity (affective value, importance) can be rendered or found in different ways. That is the case, for example, of people liking Donald Trump because 'he says the truth,' even when he is lying, as we discussed in class. The whole notion of 'fake news' can be read in this way: fake is not something that is not real but something that I do not accept as real.

The documentary embedded below, HyperNormalisation by Adam Curtis, delves into the construction of this parallel reality (for lack of a better term).

Post-authenticity is inextricable from digital technologies. If authentic means genuine, then it carries in its etymological root the idea of genus, of origin. The authentic, then, is something that originates from someone, that is born from someone.

What happens when the cues we use to discern whether something comes from someone are manipulated? A particular voice, for example, becomes something not so particular with new technologies. This is the case of VoCo, an Adobe software that can 'speak' as a person after listening to them for 20 minutes.

[The Montréal-based startup Lyrebird claims to achieve an even better match.]

Post-authenticity, however, does not only describe something that passes for the real thing; it can also be something totally different. That is the case, for example, of Vocaloid, software owned by Yamaha that creates synthesized singing voices. Hatsune Miku is the biggest Vocaloid pop star.

Probably the most well-known example of manipulated reality on video is deepfakes. Deepfakes are digital products (commonly videos) made through machine learning technologies, particularly deep learning. Deep learning, in fewer words than the Wikipedia definition (not bad as a starting point, by the way), is the application of neural networks to massive data sets. By processing an enormous set of examples, machines learn to reproduce the patterns in those examples. That is why deepfakes often use popular stars as their 'training data': because there is plenty of footage to train with.
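The "learning from examples" idea can be made concrete with a deliberately tiny sketch (an illustration of my own, not an actual deepfake pipeline: real systems use deep neural networks and millions of images, while this toy fits a one-neuron model to numbers generated from a hidden rule):

```python
import numpy as np

# Toy illustration of "learning from examples": fit a model y = w*x + b
# to data generated from a hidden rule (w = 2, b = 1). The model starts
# knowing nothing and improves only by measuring its error on examples,
# which is the same basic loop deepfake training runs at vast scale.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)        # "training data": 200 examples
y = 2.0 * x + 1.0                  # the pattern the machine must learn

w, b = 0.0, 0.0                    # initial guess: knows nothing
lr = 0.1                           # learning rate
for _ in range(500):               # repeat: predict, measure error, adjust
    pred = w * x + b
    err = pred - y
    w -= lr * (2 * err * x).mean() # gradient of mean squared error w.r.t. w
    b -= lr * (2 * err).mean()     # ...and w.r.t. b

print(round(w, 2), round(b, 2))    # converges to the hidden rule: 2.0 1.0
```

After 500 passes over the examples the model has recovered the rule it was never told, purely from data; swap the numbers for pixels and the one-neuron model for a deep network and you have the skeleton of how a face is "learned."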

Deepfakes can be put to countless uses, such as having Mark Zuckerberg or Barack Obama express controversial opinions they never voiced.

And porn, of course.

Memes are also an expression of post-authenticity. What matters is no longer how original something is but how it circulates and repeats. In fact, things that are too original can be less relatable and have, paradoxically, narrower circulation (or reduced impact).

But something that is not necessarily original (or whose origin is less traceable) can help people to be, paradoxically again, more authentic.


  • How can we relate a general sense of post-authenticity to new and sophisticated tools for surveillance? How do deepfakes coexist with automated facial recognition?
  • If authenticity is not the source of affect that it once was, what is?
  • What is the difference between the networked image that we discussed before and the operational one?

A Sea of Data: Apophenia and Pattern (Mis-)Recognition. Hito Steyerl.

In her article for e-flux (April 2016), Hito Steyerl discusses how machines recognize objects, humans, and behavior as a process of pattern recognition and data gathering. She starts by giving examples of the shapes and faces we see when we look at clouds. She then explains that machines read information in a comparable way: a machine observes a signal (one we cannot decipher with our human eyes) and interprets it to deliver some information. The article then turns to the issue of machine recognition, where we teach technologies to recognize certain data and interpret it based on signals that we have predetermined beforehand, allowing these machines to set aside the so-called 'useless' data, known as dirty data, and to use only certain specific data.

The article then makes us aware that the decisions and data delivered by these machines make us act in our real world: preventing some people from crossing the border, saving immigrants, categorizing humans socially, giving them a role or a certain importance in society. These machines' interpretation of our behavior becomes the basis of how we act in society. Nevertheless, Steyerl gives us the example of the cosmos, which we have been studying for years but on which we do not necessarily have concrete data. What we know about the cosmos is then only assumptions. If we project this reasoning onto our society, what we teach machines is only an assumption of what life must be, of what things must be, which suggests that our societies were built on assumptions that we feed and anchor in our cultures through the authority and power that we give to the machine.

More concretely, the article explains the principle of apophenia, arguing that we interpret messages through the concrete patterns that we see. It is particularly interesting to learn that in Ancient Greece, the words of men were regarded as signal while the words of women and children were regarded as noise. This recalls the idea that machines analyze only the data they are given, erasing the dirty data considered noise, and thus reveals a patriarchal society whose foundation is biased by the hierarchy of information and social roles.
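That hierarchy of signal and noise can be sketched in a few lines of Python (an illustration of my own, with invented labels and numbers, not code from Steyerl's article): whatever rule decides in advance what "counts" as signal silently shapes the result of the analysis.

```python
# Toy example: readings tagged with who produced them, plus a value.
# Labels and values are invented purely for illustration.
readings = [("man", 7), ("woman", 9), ("child", 4), ("man", 5)]

def average_signal(data, counts_as_signal):
    """Average only the readings pre-labelled as 'signal';
    everything else is discarded as noise before any analysis runs."""
    kept = [value for label, value in data if label in counts_as_signal]
    return sum(kept) / len(kept)

# A rule like the Ancient Greek one: only men's words are signal.
biased = average_signal(readings, {"man"})                       # (7+5)/2 = 6.0
inclusive = average_signal(readings, {"man", "woman", "child"})  # 25/4 = 6.25
```

The two results differ not because the data changed but because the filter did, which is the point: the "dirty data" a system is told to ignore is itself a design decision, made before any machine ever looks at the world.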

Thus I come to ask myself the questions:

  • Is the idea of surveillance only a vague one, dictated by an elite so that we conform to that elite, or is it based on a real will to create a just society?
  • Are discipline, our way of being, and our way of expressing ourselves abstract concepts whose limits we must retrace, aiming to see the state or society as the wrong interpretation of a group of persons? And can we change this society by changing the patterns we interpret, or by changing the interpretation we have of those patterns?


Session 12: Jared

How can we relate a general sense of post-authenticity to new and sophisticated tools for surveillance? How do deepfakes coexist with automated facial recognition?

I don’t think we can relate the two at this point. I believe it’s too soon to truly see the damage being done by signing away all of our privacy to the apps we use on a daily basis. The majority of us say yes to anything the apps ask for, and whatever they are using our information for probably has not affected us just yet. I believe there are probably already banks of information on each of us that could negatively affect our lives in the future in extreme ways. For example, whenever an application asks for permission to use our camera roll or microphone, that is an easy 90% of our daily lives taken from us.

As far as deepfakes are concerned, I believe that in the future it will be a lot easier to steal somebody’s identity. However, because we are talking about privacy so much these days, I don’t think it will take very long to find better measures against it.

If authenticity is not the source of affect that it once was, what is?

These days I think that authenticity is just an emotion or a level of respect. It is no longer expected in society; it is only appreciated as an act. What has replaced authenticity is yet to be truly found. Since the truth has become increasingly hard to find, valued realms of communication have been lost.

2 replies
  1. Long Xi says:

    How can we relate a general sense of post-authenticity to new and sophisticated tools for surveillance? How do deepfakes coexist with automated facial recognition?

    With new and sophisticated tools for surveillance, such as smart cameras, tracking applications, and social credit systems, we have reached a point of post-authenticity where the data being tracked is data we ourselves are not even aware is available, as these surveillance technologies treat us less like people and more like quantifiable beings. We cannot even grasp the data being collected from us, because either we are not aware of its collection (e.g. Google tracking our every move and mode of transportation, identifying from our movement where we live and work) or the data collected is not something that would ever have crossed our minds as important in our everyday lives (e.g. 24/7 heart rate, number of clicks, website cookies, etc.). Data collection is very post-authentic because it does not treat us as real humans but as quantifiable machines, as sources of data. Automated facial recognition, in treating our faces as sources of data that fulfill certain digital criteria, codifies our faces and therefore makes it possible to create digital depictions of humans who are not even real. Whether the data being collected from us could even be considered part of reality is also a question worth pondering, as much of the collected information is based on social constructs (demographic data, tracked facial emotions, etc.).

    If authenticity is not the source of affect that it once was, what is?

    I think that with post-authenticity arriving at the most media-saturated moment in history, affect now comes from both shock value and radical departures from what we are used to. For example, the affective nature of Hatsune Miku is that we are not used to seeing such complex AI performers, and the affective value of Donald Trump is that he is so frank and so different from any other politician in recent memory. These shocking forms become the most affective because, in a cultural context, they border on the absurd and unbelievable, and in a political context, they break through the dominant neoliberalism and moderate politics of the moment. It is only fitting that in a world with deepfakes and AI performers we also have a reality TV show president, because as humans inundated with so much similar media, these bizarre standouts in culture end up attracting our attention. Even if we hate deepfakes or Donald Trump, we can’t help but be fascinated and affected by them, more so than we would be by the straightforward technologies we are used to, or by a boring politician.

  2. Guillaume says:

    How can we relate a general sense of post-authenticity to new and sophisticated tools for surveillance? How do deepfakes coexist with automated facial recognition?

    I wanted to add onto Jared’s response, as I really agree with his point that we have not yet seen the effects of these surveillance technologies. Until something really does happen and it is too late to fix, I don’t think most people will be concerned by this issue. As with climate change, unless something is impacting them directly or is already too late to fix, humans don’t seem to care about preventing things from going bad. This is something I have seen in many different instances, and I think these issues of surveillance, which most people are completely choosing to ignore, will have a very negative outcome in the future. Once again, I am sure that once something does happen, everyone will ask why we didn’t act when we knew it was coming. Unfortunately, however, we are all the reason for these issues, as most of us choose to do nothing to change our habits or to stop ourselves from sharing this information.

    If authenticity is not the source of affect that it once was, what is?

    Similarly to the idea that the most popular memes are the ones that are not original, I think the fact that the media is so flooded with the same things plays a big role in these issues of authenticity. We all see relatively the same content on Instagram each day; even those of us in more niche groups are exposed to the content everyone else sees, multiple times a day. Even within some of the niche things I follow, I am shown the same things over and over. Nothing is really new or diverse, and I think this lack of diversity in what we see plays a role too. Being able to collectively reflect on the same content that we all see has, in a way, replaced authenticity.
