
The power to be free from algorithmic governance experimentation


From algorithms used in policing and the judicial system to numerous social welfare fraud detection cases, examples of algorithms implemented in public governance have amassed in recent years. Algorithmic governance can rely on surveillance, for instance in spatial crime forecasting, or on censorship, such as when it is used to repress protests. In this regard, many have invoked the panopticon metaphor to describe algorithmic governance, whereby constant surveillance induces self-censorship. Moreover, the exploitative aspect of crime prevention and detection cannot be neglected, given the amount of data extraction it requires. While all of us are subjected to socio-technical experimentation with algorithmic governance, through the subtle cultural effects that accompany its increased prevalence, some are certainly more directly affected than others. This contribution argues that those primarily affected are the individuals who enjoy the least power, as largely determined by their socio-economic status (SES). It therefore analyses algorithmic governance experimentation through the lens of the interconnectedness of power, defined as the ability to achieve a desired outcome, and freedom, defined as the absence of domination.

Is algorithmic governance a form of experimentation?

Technological experimentation generally occurs in two broad stages: first, scientific experimentation takes place in the lab under controlled conditions; second, socio-technical experimentation involves “beta-testing” technologies under real-life conditions. Socio-technical experimentation is characterized by uncertainty about both its societal and its technological outcomes, which makes the implementation of algorithms in public governance a prime example of socio-technical experimentation.

Firstly, the algorithms in question, especially AI algorithms, develop continuously on the basis of data that is typically gathered through socio-technical experimentation itself; this constitutes the technical aspect of experimentation. Secondly, the policies built on such emerging algorithmic systems are in turn informed by their societal outcomes, which defines the social dimension of socio-technical experimentation. Moreover, algorithmic governance can be considered a type of socio-technical experimentation because it transcends the controlled realm of the laboratory to be tested further in the real world. Besides being conducted in society (rather than in the lab), algorithmic governance experimentation is conducted on society (with individuals, groups, and society as a whole as subjects of experimentation) and by society (by societal actors such as governmental institutions, corporations, or various forms of cooperation between these actors through increasingly prevalent network governance, such as public-private partnerships).

Undoubtedly, the societal consequences of socio-technical experimentation are far-reaching: not only does algorithmic experimentation drive the further algorithmization of governance and produce direct outcomes for the governed citizens, it may also produce unpredictable socio-cultural side-effects, which are often subtle but cumulative, and can therefore become sweeping over time. The key questions, then, are: who is being experimented on, and what are the consequences for these individuals and for society as a whole? Below, we search for answers through considerations of power and freedom.

What is power?

Power is a complex construct, but in its most basic form it can be defined as the ability to achieve a desired outcome. Since this ability is evidently distributed unequally across society, a long-standing question arises: what determines differences in power? Many theoretical approaches, such as Marxist, Weberian, social reproduction and elite theory, as well as feminist, intersectional and post-colonial approaches, in one way or another draw a connection between socio-economic status and power. These theories point to the undeniable correlation between the social status, economic position, and societal power of individuals and groups, while focusing on different components of this interwoven network. While some of these theories emphasize status and economic determinants (e.g., class), others emphasize social identities (e.g., race and gender) that have traditionally informed individuals’ status and economic positions. However, all of these approaches share the observation that higher socio-economic status generates a greater capacity for achieving a desired outcome, i.e. power.

What is freedom?

Like power, freedom is an elusive concept that has been defined by many thinkers and traditions. For the purpose of this contribution, however, we adhere to the republican tradition, within which freedom is defined as the absence of power disparities that would allow for arbitrary interference, control, or coercion, i.e. freedom from domination. Since societal power disparities determine who experiments on the one hand, and who is a potential or actual experiment subject on the other, experimentation can itself be analyzed as an instance of domination. Moreover, if the consequences of experimentation are unknown to the experimenter, then the act of experimentation could, by its very definition, constitute an arbitrary use of power. Therefore, according to republican theory, algorithmic governance experimentation may put pressure on or violate freedom as a public value.

The power to be free?

Following the described interconnectedness of SES, power, and freedom, let us now consider how these constructs relate in the context of algorithmic governance experimentation. The power of actual and potential experiment subjects defines whose freedom is affected by algorithmic governance and to what extent. This occurs in at least three ways. First, power affects who is targeted by experimentation: individuals of lower SES, for example, need to use social welfare systems, with their algorithmic fraud detection, more often than individuals of higher SES. Secondly, among those targeted, power determines who is able to opt out of experimentation. Of course, opting out is often not possible – an observation that inspires the stream of criticism concerning the lack of informed consent in algorithmic systems. However, in some cases, such as MyGovScot, Scotland’s digital services, citizens retain the right to stick to more traditional ways of engaging with public services and can therefore effectively opt out of algorithmic governance experimentation. In such cases, whether one opts out can depend on a multitude of factors, many of which are connected to SES: prior knowledge about algorithmic governance and about one’s rights as a citizen (often facilitated by education, itself a determinant of SES), the amount of free time available to engage with administrative procedures (often connected with economic status), and so on. Thirdly, among those who could not or did not opt out and became experiment subjects, power can determine the ability to resist the effects of socio-technical experimentation on their freedom. This ability, too, is connected to knowledge, for instance of the procedures for appealing algorithmic administrative decisions.

Conversely, individuals at the very bottom of societal power hierarchies are exposed to experimentation without buffers such as the ability to opt out or to question the outcome of administrative procedures based on algorithmic systems. The most striking examples come from the realm of migration, particularly concerning undocumented migrants and asylum seekers, who are most blatantly exposed to algorithmic experimentation. The accuracy of many of the technologies used, such as dialect recognition and facial recognition applied to asylum applicants, is questionable. Moreover, the data captured, including biometrics and communications, is highly personal, and its collection encroaches on the right to privacy. In practice, however, it is unclear to what extent legal protections of privacy and other rights apply to migrants. These examples indicate that the groups and individuals who enjoy the least power in society are the ones most often subjected to intrusive algorithmic governance experimentation. Moreover, the less power experiment subjects have, the more overt the effects of experimentation on their freedom (e.g. capture of highly personal data, intrusive surveillance). At the same time, these are the very groups that are most vulnerable to the potential negative outcomes of algorithmic administrative experimentation, algorithmic mistakes, or side-effects such as the subjective experience of dehumanization.

Conclusion

Understanding experimentation as a form of domination may explain why those with less power are more commonly affected: since power disparity is key to domination as described in the republican tradition (i.e. the arbitrary use of power over someone), individuals with less power make for more convenient experiment subjects. At the same time, as republican theory demonstrates, the effects of experimentation on freedom are grave. Living in a society in which there is the potential for arbitrary use of power through socio-technical experimentation is already an infringement of freedom, let alone the explicit surveillance, censorship, and exploitation through data extraction associated with algorithmic experimentation.

For the sake of simplicity, this contribution has focused mainly on the state as the source of socio-technical experimentation (i.e. algorithmic public governance), but it is important to note that algorithmic experimentation is conducted by both private and public actors. Algorithmic governance experimentation is a wicked problem, with many pros and cons, and will as such not be solved in this blog post. What was addressed here, however, is how the effects of algorithmic governance experimentation on freedom depend on the subject’s power. Public values of the rule of law such as freedom, equality, and trust should be protected throughout society, regardless of members’ power and SES, including in the case of algorithmic governance.


Antonia Stanojević is a postdoctoral researcher in the NWO CHAIN project at the Tilburg Institute for Law, Technology, and Society (TILT).

Jurgen Goossens is a full professor of constitutional law at Utrecht University and Principal Investigator of the NWO CHAIN project.