Photo: Jan Slavik
Joana Moll is a Barcelona-based artist and researcher. Her work critically explores the way techno-capitalist narratives affect the alphabetization of machines, humans and nature. Joana’s main research topics include Internet materiality, surveillance, social profiling and interfaces. She has presented her work at renowned institutions, museums, universities, festivals and publications around the world. She is the co-founder of the Critical Interface Politics Research Group at HANGAR, Barcelona, and currently a visiting lecturer at the University of Potsdam.
The interview was conducted on May 12, 2020 by Daniel Irrgang, fellow at the Weizenbaum Institute’s research group “Inequality and Digital Sovereignty”, following up on the preparations for the “almost” symposium and exhibition “Practicing Sovereignty”. There, Joana would have been an invited speaker and exhibiting artist with her work “The Hidden Life of an Amazon User” (2019). Due to the current crisis, however, the event had to be cancelled. For news regarding a new date or the upcoming publication on “Practicing Sovereignty”, please register for our newsletter.
Joana, thanks for taking the time, in these times of video chats, to have yet another one with me.
I would like to use this opportunity to talk with you about your work and your take on, or opinion about, some current “technological contingency measures” in these times of crisis. Since those measures touch questions of individual digital sovereignty, they are of strong interest to our research group at the Weizenbaum Institute. In your recent talk on contact tracing apps during one of the Disruption Network Lab’s “Disruptive Fridays”, you stated that “this could be a precedent that could be normalized”. I had to think of Naomi Klein’s book The Shock Doctrine. That’s pretty basic literature for you, I believe. She states, referring to Milton Friedman’s neo-liberal theses, that in times of crisis reforms or agendas which wouldn’t be implementable under normal circumstances can be brought forward and ultimately normalized. Would you agree with this hypothesis? Or could you elaborate in general on the concern you expressed during your talk?
Naomi Klein’s book was indeed the first that came to my mind. When it was published, it was so mind-blowing, because she put it so clearly. It is also an amazing historical review; she even mentions the “Chicago Boys”, etc.
Since the creation of the panopticon, I think two things have happened: As our devices become smaller and smaller, they become more attached to our bodies. They get closer and closer to us. And finally, the smaller our devices, the objects that execute surveillance, the less control we have over the things these objects are surveilling. I think this is a very poetic contradiction.
Now, in the COVID-19 crisis, we already have devices attached, in a way, to our bodies. And now they are breaking yet another border, which is connected with the control of biological rhythms. So if today someone says “we are going back to the panopticon”, I think we have never left it, and we’ve actually enhanced it. This is why I think the current crisis sets a very strong precedent. A social consensus is developing and measures will be normalized as standard practice without a proper democratic process. And that’s it. I don’t think we are ready yet to understand the short and long term implications of such a development.
So you think there is a real threat that measures will not return to “normal” afterwards, or at least not completely?
I mean, it’s just like everybody says: it’s the new normal. We just move to a new situation, like we did after 9/11 and like we did after every major rupture in history. Then change happens very fast, from one day to the next. As Naomi Klein says, the shock that these situations trigger paralyzes citizens, and some changes are brutally implemented without there being time to democratically assess the new political paradigms that will arise for society as a whole. It blinds us from fully understanding what this will mean for our lives, and what it will actually mean to be human in this “new normal” situation. And I think we have enough historical precedents to understand why we shouldn’t take those decisions so fast.
I agree. I’ll jump to the next question, because it connects to something you just referred to. You were talking about devices getting smaller. They are also getting more proprietary in a way, despite the open source movement, which is strong as well. They are becoming black boxes. And here I want to ask you about your artistic work, which, as I understand it, has a critical engineering impetus, since it is strongly engaged with opening these black boxes. Your work uncovers data exploitation mechanisms and strategies or, more generally, the impact of ICT and data extraction on culture, and also on nature. Could such an artistic approach be a strategy to help shift the discourse away from superficial “techno-solutionism”? And could it actually help to uncover what happens with the data generated by those apps? Or, asking more generally, how could such an artistic approach, your kind of artistic approach, play a role in the critical investigation of technology?
I read this great article by Evgeny Morozov, and he also mentioned something that I’ve been thinking about for many years, which is the crisis of imagination. I also think that we are unable to devise anything other than technical solutions to problems. It feels like a very reactive thing to do, like “okay, we have this problem, but we can fix it through technology”. Because that’s what we’ve been doing for decades. After WW2, and especially during the Cold War, human experience, instinct and even historical events were drastically undermined in favor of fully automated technological solutions when it came to management and control.
I wish I knew what we would need to overcome this crisis of imagination, but for now I’m just jotting down a few thoughts so we can discuss them. Sadly, I haven’t had time to articulate a proper discourse on that yet. However, I think that cynicism plays a major role here. Cynicism, as exposed by Franco Berardi,* is just a way to block possibilities and perpetuate the status quo, in a sense. I believe that cynicism is systemic and affects most of the narratives that articulate our socio-technical arrangements, and we accept it. In a sense we know that our systems do not work, but we tend to accept them as the only possible way to go, as the only possible alternative. In my opinion it’s very hard to imagine other possibilities when you don’t really identify or recognize “wrong” as wrong, but as “good wrong”. In fact, I believe it’s almost impossible to imagine new paradigms or other modes of thinking when we don’t accept or fully embrace the reality we live in. I don’t know how else to explain this.
* “Cynicism is not disruptive. It’s only an internalization of the impotence of truth which is born out of the failure of the 20th-century utopian ideologies and the perception that exploitation of labour, competition and war are inevitable and irreversible (capitalism). […] While irony does not postulate the existence of any reality, cynicism postulates the inescapable reality of power, particularly the power of the economy.” – Franco ‘Bifo’ Berardi, The Uprising: On Poetry and Finance (Semiotext(e)/The MIT Press, 2012).
I think I know where you are heading, and I really like the idea you are proposing about cynicism, about being cynical.
I wrote a little note on that the other day; I can just read it: “Cynicism prevents thinking about other possibilities because the real narratives that articulate our reality are not fully recognized or integrated in the social imagination.” Greenwashing is an example: “Green energies will save everything!” But this is an impossibility, because there is no 100% green energy. Yet we accept this reality, and so we don’t feel the urge to rethink, for example, the extremely polluting global supply chains that allow green energies to exist, and thus to re-imagine them as sustainable systems.
This connects wonderfully to a participatory “exercise” Bruno Latour just launched online in the context of a research and exhibition project on climate change at ZKM Karlsruhe, of which I am also a part. The exercise is a participatory approach to considering which of the changes that came with the pandemic, even the limitations, we would like to persist after the crisis. It is also a means of reflecting on one’s own behavior and personal attitude towards the world we live in; it is ultimately about responsibility, or respons-ibility, as Bruno Latour calls it: being responsive to the world we live in and are an inherent part of.
But the current situation – both climate change and the pandemic – is so difficult to grasp, especially on an individual level. The complex questions related to the pandemic involve technology, biology, epidemiology – just a handful of people can sufficiently engage with all of these fields. Instead, some people are dealing with this complexity by running away from responsibility, for example by buying into cynicism. Also by buying into conspiracy theories, which we see rising at the moment – the need to simplify the complexity and to search for somebody who’s responsible, whoever that might be. Do you think this cynicism is a mechanism for running away from individual responsibility?
I don’t believe that we can think of cynicism as a conscious way to escape responsibility. As I already mentioned, I believe cynicism is embedded in our systems by design, so as individuals operating in such systems we are pretty much functioning according to cynical parameters. Besides, I have a problem with individual responsibility. I don’t think individuals should bear all the responsibility for problems that are fundamentally systemic. As you stated, we’re part of a very, very big ecosystem where a vast number of entities converge and affect each other. So, I feel that attributing responsibility to one individual actually just replicates the logic of techno-solutionism: reducing complexity to fix one thing, and thus failing to tackle the chain reaction that triggered the problem in the first place. I think the COVID-19 crisis exemplifies this quite well. Again, cynicism plays a big role here. All this being said, I believe we have to think collectively, find solutions collectively and take responsibility collectively. In my opinion this is the only way of approaching any systemic problem.
You are right to question my notion of responsibility. Actually, when we see this term from the perspective of digital sovereignty, a focus of our research group, governmental calls for individual responsibility – in this case, people acting responsibly in the digital sphere – can also mean that the government is not acting responsibly itself but passing responsibility on to the people. So you are right to criticize this notion of responsibility. Maybe a better term, connected to responsibility in a way, could be “changing one’s own behavior”. If it has to be a collective solution, then acting collectively starts with changing individual behavior.
I’m not sure. I think it starts with negotiating with your, say, ecosystem, and not just with the person you are living with, your neighbor or the people you know. There are so many different sensitivities and so many different things that should be included in any kind of solution. So, negotiating is key, because anything we could devise individually, or any behavior we could change individually, might only respond to our own view of reality and wouldn’t include others. I believe it’s really about communicating and negotiating with the other – humans and non-humans (including viruses!) – which is quite hard because we don’t really know how to do it. We have no clue, actually! [laughs]
The work we’ve been preparing over the last two years stressed the notion of negotiation: how to negotiate between different groups. Between a group with one world view and another group with a different world view, a view which may not only be different but pretty much opposed to one’s own, and which one really doesn’t like – and yet, one still has to negotiate. How do you think an approach such as your artistic work could help facilitate this negotiation?
What I try to do is provide new connections to think about new sites, and maybe even new paradigms, to imagine other possibilities – by disclosing realities that are part of systems, especially when it comes to technology. In fact, I think it’s especially about disclosing cynicism, in a way, when one is disclosing hidden mechanisms. It is basically about looking at systemic aspects that are not obvious but which are critical to the system.
So, what I do is just that: trying to connect things as much as possible, things which are connected but which are very disconnected from the social imagination. Take, for example, the “Amazon User” project. To understand that all this user tracking happens on our devices. And that we’re not just being exploited by means of free labor – that our activity as users is monetized by multiple parties – but that part of the energy needed to execute all this free labor also falls on us. Because the content of the website is downloaded to your computer, and you have to bear the energy costs that such processes require. And yes, you can say it’s not a lot. But still, that’s how the system works. So, my work is about disclosing all these sorts of hidden mechanisms or processes that happen and which are not obvious, but which are critical to the digital economy – and which have major implications for users and the environment.
I’m not sure if you implied that already, but besides providing the energy and the time, or labor, for feeding your devices, one is basically producing the most valuable resource of the digital economy, which is data. Data is not generated by tech corporations. They rather find means of harvesting data which has been produced, in a strange circular fashion, by the customers using their applications, devices and services.
But this is not embedded in the social imagination. And this is insane. And that’s when cynicism comes into play. One sort of knows it, but one accepts it. But then, how can we imagine alternatives if we accept this? It’s very hard to imagine how we can reach a shift if we don’t embrace reality as it is. Even if it hurts. [laughs]
Maybe there is cynicism. People know that they shouldn’t unfold their whole lives on these platforms – if concerns about privacy play a role for them, that is. And yet they do it.
To connect that to the current crisis: do you think this strange contradiction affects the widespread use of COVID-19 contact tracing apps, which is currently being discussed pretty much everywhere? Do you think this subtle awareness, at least in some parts of our societies, of how personal data is used in ways one doesn’t agree to affects people’s willingness to use contact tracing apps? Because in recent years, parts of the population have become increasingly skeptical about digital services. People are cynical, but they do somehow know that it can be considered problematic that one’s personal interactions or locations could theoretically be tracked at any given time.
It’s not so much about the people being cynical. It is a very cynical infrastructure. Because Google, Facebook and all these big players are the most cynical of all. I didn’t mean to say that it’s the user who is being cynical. It is an embedded systemic cynicism.
I think there are a lot of cynical users as well…
But, you know, it’s just part of a systemic cynicism. It’s cynicism by design!
Do you think that the effects of past political ruptures such as the NSA revelations, the Snowden revelations or the Facebook/Cambridge Analytica revelations would actually stand in the way of people using contact tracing apps today?
A valid question. After the Snowden revelations there was a very strong public debate on privacy. The GDPR (EU General Data Protection Regulation) is a consequence of that, and it started from civil rights initiatives demanding more security, privacy and so on. And this was such a good start – even though I have my concerns about the GDPR, it was a good start. But then, suddenly, the COVID-19 crisis broke out. And it seems to me that many of those objections have now vanished for the sake of security, health protection and securing the economy.
However, it’s a very difficult trade-off between privacy and public health. The father of a good friend of mine was recently in hospital with COVID-19, staying in the intensive care unit for seven weeks – and he recovered, it’s just a miracle. And I was wondering how I’d react if it were my father. So, I understand that people are willing to accept this and need to react to it – i.e. by using a magic app – instead of taking a step back and trying to understand why all this happened in the first place. But even if we implement an app, it’s not certain that it will work. This is an environmental, social, economic and political crisis, and any possible solution should propose a re-balanced relationship between such entities.
Something that really strikes me about the current situation is that everybody is just dying to go back to “normal” instead of questioning this normal, which is what actually led us here. In many ways, I believe that the COVID-19 crisis offers a precious window to re-think our fundamentally broken systems and, for example, to activate drastic measures to reduce social inequality and the impact of climate change at large. But one loses hope when global governmental organizations such as the OECD, with its list of COVID-19 measures, note that countries have been spending tons of public funds but have failed to tap the wealthy or large profitable corporations.
Maybe this is in line with the “cynicism by design” you mentioned. The contact tracing app is not a structural change. It’s, as Jaron Lanier would say, yet another gadget.
Yes, it’s just adding layer after layer to an already dysfunctional system, like a plaster on a wound that is still infected. It’s techno-solutionism at large. It will consume resources urgently needed elsewhere and – in some cases – dramatically cut civil rights, reduce freedom of expression, and basically generate more fear. And an app like that could dramatically change the whole “social tissue” and the way we communicate; it could even bring about a whole new reorganization of society based on who is healthy and who is not. And of course, the COVID-19 crisis could be the beginning of a very profitable business. I think we have to be able to see this – and we need a little bit of time to acknowledge it and to introduce it into the public debate.
Some proponents of contact tracing technology say, “Where’s the problem? It’s voluntary.” Well, in certain countries it’s not, but in Germany it would be voluntary. The question, though, is whether one can really talk about voluntariness given the major social pressure of “I have to use this app too, because people expect me to do it”. What’s your take on the “voluntary” argument?
Well, it’s similar to owning a smartphone. It’s not mandatory to have one, but a lot of necessary bureaucratic activities, such as banking or even public administration, expect citizens to use a smartphone. If you don’t have a smartphone, you’re excluded in some major ways.
And it’s not only about owning a smartphone, it’s also about having the skills to use it in this particular way, which can be quite complex for some people.
Absolutely, but it becomes mandatory, passively mandatory.
As for the public debate on contact tracing apps, there are a lot of “benchmarking” or proof-of-concept arguments stating that such apps already work in countries like South Korea or China. But is this really a comparable situation? In China, for example, a QR-code-based app for location check-in is mandatory. Also, tighter preexisting networks of individual tracing are already in place there, such as the partly implemented social scoring system.
We simply cannot compare this, because of the societal and cultural differences. A friend who lived in China for many years explained to me that many Chinese citizens are actually not so unhappy about social scoring, because for them it’s a tool to verify whether you can trust people. Apparently – I had no idea. Such social arrangements are difficult for us to understand, of course, because it’s not our culture. I think in this case we need to focus on what’s here, and to understand what we want. And even within Europe, there are many, many differences. So I don’t think it’s comparable. In terms of technological efficiency, of course, you can compare that – but the machines are implemented by humans, after all.
A wonderful closing sentence, Joana. Thank you again for your time.
My pleasure, thank you.