Prof. Gloria González Fuster and Prof. Mireille Hildebrandt, co-directors of VUB Law, Science, Technology & Society (LSTS).

The outbreak of Covid-19 has made clear what we already saw coming for years: as our offline life shrank, our digital dependence grew stronger than ever. Suddenly, we all started receiving basic information about the crisis from online platforms that earn their money by processing our personal data on a large scale. We have been doing our best to continue working and studying, but we use tools that are data-hungry and not fully transparent to users about their pervasive data practices. We search for essentials relying on sophisticated algorithms, and we hunt for food through international data empires. We communicate our most intimate hopes and fears to our loved ones through global digital infrastructures, which might seize the opportunity to draw inappropriate inferences about us, and possibly use them against us.

We all need data protection, but some of us need it even more than others. Our children need to be able to play and learn in safe spaces. Now that their classrooms and playgrounds are all digital and connected, it is our collective responsibility to make sure that whatever they do there, in circumstances they cannot escape, will not be (mis)used against them later.

We have gradually put most of our lives, and of our societies, into digital hands. We must make sure that we can still keep an eye on what these hands, whether of private companies or public authorities, might be doing with the data about us. We need digitalisation to effectively and appropriately combat the spread of the virus, but at the same time we must ensure that the data practices developed are both safe and effective. That is why we need the data protection guaranteed by the General Data Protection Regulation (GDPR) today more than ever.

A good example of why data protection matters is the app that the British National Health Service (NHS) developed in collaboration with scientists and companies. Using Bluetooth, the app can warn people that they have come into contact with potentially infectious others. No information is given about who might be infected; an alert simply warns the user to go into isolation.
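To make the privacy logic concrete: in a decentralized design, the matching of Bluetooth identifiers happens on the phone itself, so nobody learns who met whom. The following is a minimal, hypothetical Python sketch of that idea; the names and parameters are ours and do not come from the NHS app.

```python
import secrets

# Hypothetical sketch of decentralized Bluetooth exposure notification.
# Illustrative only: not the NHS app's actual protocol or code.

def new_ephemeral_id() -> bytes:
    """Random rotating identifier a phone broadcasts over Bluetooth."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers received from nearby phones

    def broadcast(self) -> bytes:
        eid = new_ephemeral_id()
        self.own_ids.append(eid)
        return eid

    def receive(self, eid: bytes):
        self.heard_ids.add(eid)

    def check_exposure(self, published_ids: set) -> bool:
        # Matching happens locally: the phone learns only that *some*
        # identifier it heard belongs to a confirmed case, not whose.
        return bool(self.heard_ids & published_ids)

# Two phones near each other exchange random identifiers.
alice, bob = Phone(), Phone()
bob.receive(alice.broadcast())
alice.receive(bob.broadcast())

# Alice tests positive and, with her consent, her identifiers are
# published for every phone to download and check against locally.
published = set(alice.own_ids)

print(bob.check_exposure(published))  # True: Bob is alerted to isolate
print(bob.check_exposure(set()))      # False: no match, no alert
```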

The first question with this app is who has access to the data. For this app, the NHS may have partnered with Palantir, a highly controversial American company that works with the US intelligence services. Second, the GDPR requires that access to data is always necessary for and limited to a specific purpose, thereby excluding all kinds of improper reuse. In addition, the GDPR requires that these safeguards are guaranteed at a technical and organizational level; otherwise they remain empty words.

A third question is whether the warning 'possibly infected' is reliable. Developing an app that tells you whether you were around people with a certain 'label' is not very difficult; preventing people from needlessly isolating themselves (false alarms) or wrongly believing they are safe (missed contacts) is, given the current state of the science, not yet possible. As said, the GDPR only allows infringements of fundamental rights (not just privacy, but also non-discrimination and the right to a fair trial) if this is necessary for the purpose to be achieved. Even emergency legislation cannot ignore this. Therefore, if a measure is not effective, and thus not necessary, it may not be taken. That is a good thing, and not just for privacy: it means that we will only start this kind of 'labeling' if it can actually contribute to the legitimate purpose for which it is used.

Meanwhile, other initiatives with similar apps have been launched, and the same questions arise there. We mention the explicitly privacy-friendly proposal of PEPP-PT, a collaboration of commercial and non-commercial institutes, and the proposal of a large group of privacy and security experts, DP-3T, which is based on decentralized technology (but not blockchain). The aforementioned requirements also apply to those apps. We note that the PEPP-PT framework allows for various implementations, not all of which offer the same level of protection against reuse for undesired forms of crowd control, as the sketch below illustrates.
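To see why the choice of implementation matters, here is a hypothetical contrast with the decentralized sketch above: in a centralized variant, phones upload the identifiers they heard and a server performs the matching, so the server accumulates a record of who was near whom. Again, the code is an illustrative assumption of ours, not actual PEPP-PT code.

```python
# Hypothetical sketch of a *centralized* exposure-notification variant.
# Illustrative only: names and data flows are assumptions, not PEPP-PT.

class CentralServer:
    def __init__(self):
        self.contact_log = {}  # user -> identifiers heard nearby

    def upload(self, user: str, heard_ids: set):
        # The server now holds a social graph of who was near whom:
        # exactly the kind of data that could be reused for crowd control.
        self.contact_log[user] = heard_ids

    def notify_exposed(self, infected_ids: set) -> list:
        # Matching happens on the server, not on the phone.
        return [user for user, heard in self.contact_log.items()
                if heard & infected_ids]

server = CentralServer()
server.upload("bob", {b"id-alice"})
server.upload("carol", {b"id-dave"})

# If Alice tests positive, the server decides who to alert,
# and in doing so it learns that Bob was near Alice.
print(server.notify_exposed({b"id-alice"}))  # ['bob']
```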

Finally, the question of data protection supervision arises with regard to all these types of applications. Claims that an app meets privacy safeguards are nice, but without proper testing and enforcement they are of little use. The GDPR has a careful system of private and public enforcement that makes it possible to intervene both beforehand and afterwards if our fundamental rights are wrongfully violated.

Today, the GDPR is more important than ever to protect our fundamental rights. This might require, however, a commitment to make it play an even more significant role. In this sense, the GDPR offers ample opportunities for data processing for scientific research that respects basic safeguards, but the identification of such safeguards varies considerably from one Member State to another. This is worrying and slows down European cooperation, for example in the field of medical research. There has not yet been a satisfactory response from the EU institutions on these matters. The EU has clear competence and a fundamental responsibility here. Now is the time to take this seriously.