This article is the winning entry in the 'social media, data and privacy' category of the Justis International Law and Technology Writing Competition 2020.
In our data-driven society, every piece of technology that connects us to the internet collects our personal data (any information relating to an identified or identifiable natural person), building elaborate profiles on what we are doing, where we are, and even who we are.
As data subjects (those about whom personal data are collected), we can no longer hide from data controllers (those who collect these data and determine what they are used for). With every data breach and data-sharing revelation, from Cambridge Analytica to Google's Project Nightingale, our personal data are becoming less personal: data attached to our identity are no longer in our control, and it is increasingly difficult to identify who is responsible for them.
The data subject’s struggle
Recognising the need to protect privacy as an individual right, data protection attempts to rebalance power between data subjects and data controllers. The European General Data Protection Regulation (GDPR) [1] grants data subjects rights such as the right of access [2], the right to be forgotten [3], and the right not to be subject to a decision based solely on automated processing [4]. Data controllers must also follow the principles of data protection by design and by default [5]. However, even with the GDPR, data subjects lack the extra hours and cognitive capacity needed to exercise these rights: only 15% of EU citizens feel completely in control of their personal data [6]. Additionally, while there are several lawful bases for processing personal data [7], data controllers have weaponised consent, using privacy policies written in legalese and dark patterns that hide privacy-protecting options, obfuscating how data subjects' data are reused, aggregated, and anonymised to make decisions about them.
Everyone is a data controller
Responsibility for personal data is further complicated by judgments that interpret expansively who can be considered a data controller. A user who administers a Facebook Group or Page [8], a website operator who embeds a Facebook 'like' button or other social plug-in [9], and a religious community whose congregation collected personal data in the course of preaching activities [10] have all been held to be 'joint controllers', each liable if one of them breaches requirements relating to those data. This significantly increases the number of data controllers and people responsible for personal data, since joint controllership can arise even where not every controller has access to the data. While these judgments introduce more responsibility, they also disperse where that responsibility lies, increasing the ambiguity over who can share, reuse, and repurpose data.
From my data to our data to your data
Beyond the individual, initiatives such as Decode encourage public institutions to be more responsible with their citizens' data. However, governments continue to watch over their people through social credit scoring, criminal sentencing, and partnerships with privately-owned, pervasive technologies. In the age of surveillance capitalism, where personal experience is claimed as free raw material to be translated into behavioural data, our personal and derived data are collectively used against us. Although data protection and information rights enable some forms of transparency and accountability, our data are still often used without our knowledge and without legal recourse, as decisions are made using unexplainable black-box algorithms [11].
Reclaiming our personal data
In order to better understand how our personal data are being used and abused, we need to look beyond data protection at the individual level. Instead, privacy should be treated as an ecosystem that requires legal and socio-technical collaboration between lawyers, technologists, policymakers, and, most importantly, us as data subjects.
Firstly, stronger regulation beyond data protection is required to fully realise the responsibility data controllers have for our personal data. While the European Data Protection Board has established guidelines to clarify the GDPR, further regulatory guidance has so far come only from academics and has yet to be codified. Regulators should do more to prevent 'ethics washing', whereby data companies use ethics boards and policies to limit regulation. Competition law in particular may help us escape the grasp of digital behemoths. Looking beyond fines, Margrethe Vestager, the EU's competition commissioner, plans to regulate industries such as artificial intelligence and the gig economy to restore the ethos of 'consumer is king' for data subjects. Other mechanisms include legal data trusts, which empower data subjects by facilitating access to pre-authorised, aggregated data and removing key obstacles to realising the potential of large datasets.
Secondly, although many of the challenges described are driven by the business models of data controllers, technology should be considered part of, not excluded from, solutions that help data subjects better understand how their data are processed and managed. Tools such as Databox, Jumbo Privacy, and DoNotPay are already beginning to challenge the data protection practices of Big Tech companies, providing alternatives to existing services and mechanisms for control.
Finally, in considering how personal data can best be protected, data protection must extend beyond the individual. It should look beyond privacy as control and expand to include the ability to participate and engage with other individuals and groups, crowdsourcing information and solutions to personal data challenges. Philosophical discussions surrounding group privacy can be put into practice. By developing a data protection public sphere and commons, regulators, lawyers, and technologists can support data subjects in minimising the risks involved in the public use of anonymised personal data [12] and in establishing the necessity for collective rights [13] before and after data are collected. The protection of data subjects with regard to the processing of personal data can only be achieved where legal frameworks and technological mechanisms include input from data subjects and respect their data protection requirements.
The responsibility for our personal data should not rest on data subjects alone. As data protection matures, this responsibility should be shared by all stakeholders that benefit from personal data, not only those about whom the data are collected. Only through legal and technical collaboration can data subjects be collectively protected, governing the data protection landscape for the benefit of our current and our future selves.
Janis Wong is a PhD student in computer science at the Centre for Research into Information, Surveillance and Privacy (CRISP), University of St Andrews. She holds an LLB from the London School of Economics and an MSc in computing from the University of St Andrews.
The Justis International Law and Technology Writing Competition is in its third year. This year, the competition attracted entries from students at 98 universities in 30 countries. Judging was conducted by a panel of industry experts and notable names, including The Secret Barrister and Judge Rinder.
Sources:
[1]: Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
[2]: ibid art 15.
[3]: ibid art 17.
[4]: ibid art 22.
[5]: ibid art 25.
[6]: Bart Custers, Alan M. Sears, Francien Dechesne, Ilina Georgieva, Tommaso Tani, and Simone van der Hof, ‘Conclusions’ in Bart Custers, Alan M. Sears, Francien Dechesne, Ilina Georgieva, Tommaso Tani, and Simone van der Hof (eds), EU Personal Data Protection in Policy and Practice (T.M.C. Asser Press 2019).
[7]: General Data Protection Regulation, art 6.
[8]: Case C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH ECLI:EU:C:2018:388.
[9]: Case C-40/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV ECLI:EU:C:2019:629.
[10]: Case C-25/17 Tietosuojavaltuutettu v Jehovan todistajat — uskonnollinen yhdyskunta ECLI:EU:C:2018:551.
[11]: Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Harvard University Press 2015).
[12]: Luciano Floridi, ‘Group Privacy: A Defence and an Interpretation’ in Linnet Taylor, Luciano Floridi, and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technologies (Springer International Publishing 2016).
[13]: Joseph Raz, The Morality of Freedom (Oxford University Press 1986).