The global coronavirus pandemic has prompted states to rush to embrace digital surveillance tools, such as contact-tracing apps, as quick policy fixes for the crisis.
Understandably, given the severity of the pandemic, many sophisticated yet questionable new technological solutions have been hurriedly deployed. However, such technologies raise serious concerns related to mass digital surveillance practices, the outsourcing of expertise or sensitive personal data to private companies, and the potential infringement of citizens’ fundamental rights.
Different approaches to the same problem
For instance, in early April Germany rolled out a coronavirus symptom-tracker app designed to provide the government with a wealth of health-related data for better monitoring the virus’ development patterns. The app soon came under heavy criticism when it was revealed that it routed personal data through Big Tech intermediaries and allowed German health authorities access to users’ data even after the app was deleted. This earlier version, which kept anonymized infection data on centralized servers, was discarded after an outcry from data privacy activists. In June, Germany released the Corona-Warn-App, which instead uses a decentralized framework and Bluetooth short-range radio, meaning that users’ encrypted information is stored locally on their smartphones.
Conversely, in June the French government embraced a centralized architecture for its tracing app, StopCovid, which collects and stores citizens’ data on a central government server and sends alerts to anyone who has potentially been in proximity to an infected person. Civil society groups have raised questions about this approach and its compatibility with privacy laws, as well as concerns that the app might be used as a tool for mass surveillance. The official government line, however, is that use of the app remains voluntary and that it handles only anonymous user codes, thus not infringing any privacy laws. Interestingly, since its launch the StopCovid app has reported only 14 risk cases, with just 2% of the French population having installed it as of this writing, triggering debates about the usefulness of such tools against the pandemic.
On the one hand, opting for a decentralized system, as in the German case, may be better for privacy, but it also means depending more heavily on private companies such as Google and Apple. On the other hand, the French centralized framework reflects the view that decisions about the public use of data should be made by elected officials rather than private companies. Whichever approach and privacy tradeoff is preferred, both frameworks pose potential surveillance risks. They also raise concerns about the normalization of widespread digital surveillance, especially when opaque systems of information collection and predictive analytics are deployed.
Privacy and health: can we have it both ways?
This raises the question of whether it is possible to develop apps that both ensure user privacy and effectively combat the spread of the virus. Tracking apps open the door to the normalization of government-led and corporate digital surveillance, and to the risk of function creep, namely the expansion of a technology beyond its original purposes. The controversy over digital surveillance must consequently be situated within current uncritical visions that glorify technological solutions as silver bullets for deeper socio-political problems, or, in the case of the virus, healthcare crises. Big Tech plays a fundamental role in feeding this hype machine. Digital mass surveillance presents further challenges regarding the intensification of everyday monitoring mediated by tech corporations and their products. Echoing the chilling premise of surveillance capitalism, the business model underpinning the current digital world appears to have benefited from the pandemic, significantly expanding the digital giants’ profits.
The pandemic has also demonstrated that during a public health crisis, governments tend to rush into mass digital surveillance as a quick fix. In the process, trust in both authorities and technologies risks being lost. Before even considering the data protection and privacy implications of surveillance technologies, questions should be asked about the trustworthiness and legitimacy of the authorities and companies deploying them, and the black box of the technologies themselves should be opened: how do they work, what data do they gather, how is that data processed and stored, for how long, and for what purposes?
Exceptional or the new normal?
The logic goes that when everyone feels at risk because of the virus, acceptance of exceptional measures and enhanced digital surveillance is higher. It comes as no surprise that states of emergency such as the coronavirus crisis tend to warrant an extension of discretionary governmental powers, which can also be employed as a rationale or pretext to suspend and undermine democratic principles and rights. This view evokes the work of the Italian philosopher Giorgio Agamben on the state of exception, in which the coronavirus state of emergency becomes the permanent condition of political life through the unusual extension of governmental powers to tackle the pandemic.
However, what this approach potentially misses in the current crisis is that power seems to come not only from governments but also from other agents, including the virus itself. The non-human coronavirus, as a disruptive performative political agent, has demonstrated the powerlessness of authorities in the face of it and drawn sharp lines between the limits of sovereign state power and technologically driven insecurity management. The concepts of liquid surveillance and the post-panopticon, put forward by Zygmunt Bauman and David Lyon, may better describe current modes of surveillance, which are determined not by top-down, direct state monitoring but by indirect observation and self-surveillance. Self-surveillance in particular has become a staple of the pandemic, reflected in self-isolation practices and citizens’ use of COVID-19 symptom-tracking apps.
An analysis based on a centralized, governmental view of power risks ignoring the role of non-governmental actors such as tech giants during states of emergency, coronavirus-induced or not. Germany’s change of course from a centralized contact-tracing scheme to a decentralized one is indicative of how private companies have been dictating the terms of the coronavirus response to national governments. Apple and Google played a significant role in shaping the German government’s approach to building digital tracking tools in line with their operating-system policy requirements. This is just one example of how the internal policy choices of large private tech companies are leveraged against matters of public policy.
Contact-tracing apps could indeed help countries mitigate the spread of the virus before a vaccine is available for mass distribution. But the question remains whether there is sufficient public understanding of these technologies, their efficacy, risks, and consequences in balancing public health concerns against the protection of civil rights. Current debates about first-generation digital surveillance technologies and the tradeoffs they bring in terms of security, privacy, and public-private relations are all the more important given the expectation that pandemics may become an episodic feature of contemporary life, due to climate change and globalization. Such technologies could become backdoors for radical forms of surveillance that undermine the tenets of democratic systems and lead to new forms of emergency politics. They represent a critical experiment for the role technology will play in tackling future pandemics.
This piece is based on the article “New states of emergency: normalizing techno-surveillance in the time of COVID-19”, published in Global Affairs.