Designed to serve mankind?
The politics of the General Data Protection Regulation
The collection and processing of our personal data are changing how we make sense of the world and of ourselves, as are the projects of governance that surround them. In the year since it took effect, the General Data Protection Regulation (GDPR) has come to represent a global standard for privacy and data protection. Yet the idea that the GDPR represents a global standard assumes a uniform cultural and political context across the globe and overlooks broader questions about the interaction between law, technology, and society. Indeed, an alternative view might illuminate how the GDPR’s individual privacy framework derives from a particular political and cultural context, embedding, prioritizing, and stabilizing certain political claims and values over others.
Here the concept of co-production developed by Science and Technology Studies scholar Sheila Jasanoff is particularly useful for examining the relation between the GDPR, personal data processing, and the social. Co-production refers to the idea that technology “both embeds and is embedded in social practices, identities, norms, conventions, discourses, instruments and institutions . . . .”
Through this analysis, it emerges that the GDPR reflects at least three social shifts, each of which is entangled with the others. First, the GDPR marked a practical shift in data collection practices, at the technological and institutional levels. Second, the GDPR represented a stabilization of the public discourse on personal data collection and processing around the issue of privacy, a discourse which became institutionalized through the GDPR, as well as through the adoption of similar measures and language by other jurisdictions and institutions. This discursive stabilization in turn shapes collective understandings and framings of the problems surrounding data processing, also reflecting an epistemological stabilization. Third, the GDPR elevated the privacy rights claims of certain subjects, namely individual “data subjects” in the European Union (EU), reflecting a normative shift.
The technological and institutional shift
Although the GDPR was adopted partly in response to changes in technologies that permitted the collection and analysis of massive datasets, or so-called “big data,” the regulation has also changed technological and institutional practices in the aftermath of its implementation on May 25, 2018. In response, some technology companies moved user data off EU servers and changed the advertising practices on their EU-facing websites, while certain websites were blocked in the EU altogether, reflecting changes in technological practices. More importantly, the regulation has required technology companies to respond directly to EU data subjects’ requests for access to and information about the types of personal data collected on them and the purposes for which they are processed, to requests for deletion of their data (the “right to be forgotten”), to exercises of the right to data portability, and to objections to the processing of their data, among other things. While the longer-term effects of the regulation remain to be seen, it has already had an impact on how data collection and processing are performed, at both the technological and the institutional levels.
The discursive and epistemological shift
Perhaps the broadest social changes attributed to the GDPR’s implementation relate to other jurisdictions adopting regulations in its aftermath that resemble its principles or adopt them outright. The GDPR has attained the status of model legislation to be aspired to or adopted in other jurisdictions, reconfiguring how a variety of cultures and subjects think about, or ought to think about, the issues surrounding the collection of personal data. Indeed, the GDPR has stabilized the public discourse around personal data collection practices as an issue of individual privacy. This can be seen in regulations modeled on the GDPR that are increasingly adopted elsewhere, and in institutional and corporate discourses that highlight privacy concerns, centering discourse and knowledge of the problems associated with data collection on that issue. While the privacy discourse preceded the GDPR, and other developments such as the news of the Cambridge Analytica data breach may have contributed to its prominence, the implementation of the regulation, its extraterritorial reach, and its status as a new “global standard” further solidified that discourse at a transnational level. The GDPR has thus also had an impact on knowledge and sense-making, marking a shift in how the problems associated with data collection and processing are collectively framed in its aftermath. This discursive and epistemological shift is accompanied by the exportation of European values elsewhere, as the GDPR reflects European commitments to fundamental rights, including the rights to privacy and data protection. One might argue that, given the GDPR’s expressly extraterritorial reach, the EU is trying to promote its legal values as universal values – a view consistent with the EU’s institutional aims.
In developing countries operating in the “digital economy,” such as India, the political and social context reflects a different set of values and concerns than those of the EU. In 2018, India developed a draft Bill containing principles modeled on the GDPR. On the one hand, India has an incentive to comply with the GDPR and provide “adequate protection” for data transfers from the EU in order to continue its software and data-related trade with the EU – a large contributor to its economic activity – as the alternatives to meeting that standard would be costly for firms to implement. On the other hand, its status as a developing country means that adopting privacy standards akin to those of the EU might constrain it economically, requiring a balance between protecting privacy rights and bearing their economic costs.
The normative shift
That the GDPR privileges European values of privacy over competing claims reflects political choices. Arguments for global harmonization elide these important political and distributive concerns. We might ask, for example: what is at stake when individual privacy becomes the dominant frame for discussing and regulating the problems associated with data collection and processing? In the process, what other claims and whose values have been backgrounded? And what are the effects, in different contexts, of framing these concerns as individual privacy issues on the social and political order?
While the practices of collecting personal data and of classifying and standardizing subjects are not new, in other contexts they might have been framed as problems of governance or subjectivity rather than of privacy. Even today, the collection of personal data could be framed as a distributional issue, a competition issue, a democratic one, or even an environmental one. Each of these framings prioritizes certain values over others and determines against which competing forces or claims those values ought to be weighed.
The framing of social problems around technology also shows how projects of technological governance, in this case the GDPR, are inextricably tied up with, or co-produce, the technological, the normative, the epistemological, and the institutional. The GDPR’s hopeful claim that “the processing of personal data should be designed to serve mankind” is indicative of this, embedding a particular imaginary of both personal data processing and its governance. At the same time, it universalizes the GDPR’s vision of the social good and affirms the EU’s position as a powerful player in the global production of normative values.
Conclusion
Framing the GDPR as a global standard elides its political and cultural specificities. Globalizing the GDPR ensures that a certain vision of the social good wins out over others through the expert language of law. Indeed, the EU’s political and economic power and its recognized expertise in regulating privacy have a great deal to do with why the GDPR’s vision and extraterritorial reach have had such a strong impact elsewhere – a function of the EU’s willingness to assert its regulatory power beyond its own borders and its ability to make that assertion effective.
Both data processing and its governance raise a number of political and normative questions that require sustained and deep democratic engagement, as well as avenues for diverse publics to imagine and decide on their own visions of the social good. While the GDPR has given ordinary citizens useful mechanisms to directly challenge the practices of data collectors and processors, the exportation of the EU’s regulatory framework and principles to other jurisdictions does little to engage with these deeper political questions or to reflect on its broader social effects; rather, it amounts to an attempt to universalize a particular technical fix for political problems.
Roxana Vatanparast is a Visiting Researcher at the Institute for Global Law & Policy at Harvard Law School and a PhD candidate at the University of Turin. She holds an LLM from the International University College of Turin / University of Turin and a JD from the University of California, Hastings College of the Law.
Cite as: Roxana Vatanparast, “Designed to serve mankind? The politics of the General Data Protection Regulation”, Völkerrechtsblog, 27 May 2019, doi: 10.17176/20190529-121845-0.