Intervention Symposium: “Algorithmic Governance”; organised by Jeremy Crampton and Andrea Miller

The following essays first came together at the 2016 AAG Annual Meeting in San Francisco. Jeremy Crampton (Professor of Geography at the University of Kentucky) and Andrea Miller (PhD candidate at University of California, Davis) assembled five panellists to discuss what they call algorithmic governance – “the manifold ways that algorithms and code/space enable practices of governance that ascribes risk, suspicion and positive value in geographic contexts.”

Among other things, panellists explored how we can best pay attention to the spaces of governance where algorithms operate and are contested; the spatial dimensions of the data-driven subject; how modes of algorithmic modulation and control impact understandings of categories such as race and gender; the extent to which algorithms are deterministic, and the spaces of contestation or counter-algorithms; how algorithmic governance inflects and augments practices of policing and militarization; the most productive theoretical tools available for studying algorithmic data; how visualizations such as maps are implicated by or for algorithms; and the genealogy of algorithms and other histories of computation.

Three of the panellists plus Andrea and Jeremy present versions of these discussions below, following an introduction to the Intervention Symposium from its guest editors (whom Andy and Katherine at Antipode would like to thank for all their work!).

Introduction

This Intervention Symposium brings together commentaries presented at the 2016 Annual Meeting of the American Association of Geographers addressing the increasing concern with and interest in what we refer to as “algorithmic governance”. Variously taking up algorithmic governance as an analytic framework, a concept, and a point of departure, we investigate and complicate the manifold ways that algorithms are imagined, enabled, deployed, and utilized in practices of governance. We ask how we might attend to the particularities of algorithmic practices–such as data mining, digital, spatial, and biometric surveillance, notions of “blackboxing”, and the modes of targeting and control generated by and through algorithms–as we engage questions of the political, both historically and in this contemporary moment. What are the attendant anxieties around algorithms: their limits, doubts, and the possibilities of counter-algorithms? What are the possibilities for a politics of the algorithm that these uncertainties open up? How does code (re)produce embodied difference–locating, classifying, and even imagining bodies through and alongside notions of race, gender, and geography? And what is the role of algorithms in producing value?

Algorithms are now embedded in a proliferation of objects, bodies, databases, and software stacks, as well as what Ian Shaw describes as a deterritorialized series of non-human and immaterial vectors of state power. There is a need, therefore, to understand the specific ways algorithms and other forms of code–such as “smart” cities/technologies, the Internet of Things (IoT), and machine learning–have so successfully arisen to challenge, supplement, and, at times, replace human decision-making. Although algorithmic governance is a relatively recent term, its historical trajectory is much longer. As an object of scholarly concern, we can trace its emergence through conversations in science and technology studies, cultural geography, digital media studies, and game studies, as with Alexander Galloway’s Protocol: How Control Exists After Decentralization (2004) and Gaming: Essays on Algorithmic Culture (2006). However, at its most general, an algorithm need not be digital (a recipe is an algorithm). As Tarleton Gillespie (2014) defines them, algorithms are a way of taking an input and performing a calculative task in order to produce a desired output. This definition usefully foregrounds the notion of desire, and thus, implicitly, the human. Algorithmic governance is an assemblage of human and non-human actors, both material and discursive. Any history would therefore have to disentangle algorithms’ instantiations in digital technologies from what Andrea Miller calls their broader protocological capacities. These protocological capacities might emerge materially or immaterially, taking digital, non-digital, and not-only-digital forms in databases, museums, and archives, state and corporate infrastructural apparatuses, and colonial logics of population management and policing.
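
Gillespie’s input/task/output definition can be made concrete with a deliberately trivial sketch (the ranking function, data, and scoring rule below are invented for illustration); the point is that even the choice of scoring rule encodes someone’s “desire”:

```python
# A toy illustration of Gillespie's definition: an algorithm takes an input,
# performs a calculative task, and produces a desired output. The function,
# data, and scoring rule here are invented for illustration only.

def rank_items(items, score):
    """Order items by a score -- the 'desired output' encodes someone's desire."""
    return sorted(items, key=score, reverse=True)

posts = [{"title": "A", "likes": 3}, {"title": "B", "likes": 10}]
ranked = rank_items(posts, score=lambda p: p["likes"])  # B first, then A
```

The same structure describes a recipe: ingredients in, prescribed steps, dish out.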

For Antoinette Rouvroy and Thomas Berns (2013:xix), as the algorithm’s concern with relational data supersedes human decision-making and its attendant forms of knowledge production, it ushers in a form of governance that is not immediately legible as politics as such. Rouvroy and Berns describe algorithmic governmentality as “a certain type of (a)normative or (a)political rationality founded on the automated collection, aggregation and analysis of big data so as to model, anticipate and pre-emptively affect possible behaviours” (2013:x). As a spatializing phenomenon, algorithmic governance produces “a colonization of public space by a hypertrophied private space” (2013:v), a topology of relations that neither generates nor cares for a subject or individual (2013:xvii). Drawing from Deleuze and Guattari, Rouvroy and Berns identify algorithmic governance as rhizomatic. Its object of governance is relations themselves (2013:xx), and its emancipatory, if unfulfilled, potential lies in its ability to produce a “multiple without otherness” (2013:xxvii).

For Rouvroy and Berns, algorithmic governance comprises three “stages”:

1. The generation of the data double and Big Data (2013:vi), e.g. facial movements become statistical data (cf. Deleuze 1992). Here, data are “accumulated by dispossession” (Harvey 2003), an extraction but also a substitution, a standing-in-for or representation.
2. Automated knowledge production. In this stage, databases are acquired and formed, and knowledge is produced through machine learning with claims to “absolute objectivity” (2013:vii).
3. “Action on behaviours” (2013:viii). Individual desires and behaviours are anticipated and pre-empted, and associated with profiles. Reading this through Foucault (2003), we can invoke his notion of “pastoral power”, that which exercises the conduct of conduct over “each and all”.
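
Read purely schematically, and with all names and thresholds invented for illustration, the three stages can be caricatured as a data pipeline:

```python
# Hypothetical sketch of Rouvroy and Berns' three "stages" as a data pipeline.
# All function names, behaviours, and thresholds are invented for illustration.

def collect(events):
    # Stage 1: behaviour becomes a "data double" (e.g. counts of facial movements)
    profile = {}
    for e in events:
        profile[e] = profile.get(e, 0) + 1
    return profile

def learn(profiles):
    # Stage 2: automated knowledge production -- aggregate profiles into a "norm"
    keys = {k for p in profiles for k in p}
    return {k: sum(p.get(k, 0) for p in profiles) / len(profiles) for k in keys}

def act(profile, norm, threshold=2.0):
    # Stage 3: "action on behaviours" -- pre-emptively flag deviation from the model
    return [k for k, v in profile.items() if v > threshold * norm.get(k, 0.0)]

profiles = [collect(["blink", "blink", "glance"]), collect(["blink"])]
norm = learn(profiles)
flagged = act(collect(["blink"] * 5), norm)  # -> ["blink"]
```

Even this caricature suggests where the “multiple without otherness” claim strains: the third stage reintroduces a flagged other the moment behaviour deviates from the aggregated norm.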

However, as Louise Amoore (2013:59) has astutely pointed out, the calculus of risk and threat that confers vitality to the algorithm’s “ontology of associations” is not without otherness. Rather, the other of algorithmic governance is one imagined to inhabit an indeterminate future, whose threat potential is refracted through scales of time and space to generate disciplinary subjects in the present (see also Miller 2017). From everyday practices that calculate consumer financial risk to the overtly violent practices that generate the target of a US drone strike, the algorithmic is deeply entangled in racialized, classed, and gendered logics of the political present (Amoore 2013; Amoore and de Goede 2008; Miller 2017; Shaw and Akhter 2012; Wall 2016).

But in examining algorithmic practices as they emerge through various registers, whether those of the military, government, finance, tech industry, university, or any combinations therein, questions around the “desires” that animate algorithms or the work that algorithms are designed to do in the world are necessarily attended by a series of doubts, uncertainties, anxieties, and limits. For Amoore, this area of slippage is what allows for the possibility of a politics of algorithms. For if algorithms always fall short of creating a fully knowable future and never have sufficient data (or conversely too much data), then we enter a political condition of what Keith Woodward (2014) calls the “affective uncertainties in statist errancies”. This notion of the affective algorithm has been little studied, but, for example, as Emily Kaufman argues, the field of biospatial profiling–surveillance practices increasingly taken up by police departments across the United States–is predicated on biometrically identifying worrying behaviors such as furtive movements and inappropriate encounters (see also Kaufman 2016). This datafication of worry, whether arising from such deviant individual behaviors or from errant state-corporate infrastructures (data breaches, brown-outs), opens an increasing gap of human and calculative uncertainty and anxiety that the algorithm is meant to leap across and secure (Crawford 2014).

In addition to this rendering-algorithmic of affective and psychic realms traditionally associated with the human, this datafication also attributes monetary and speculative value to these forms of experiential qualification–worry, insecurity, anxiety–and to their targets. As affects and subjectivities are increasingly thought of as sites of data to be extracted and redeployed in the service of capital, these valences of algorithmic governance necessarily raise questions about how we might also need to rethink notions of subjectification as a process increasingly inflected by algorithmically driven practices. The datafication of subjectivities–for example, what John Cheney-Lippold (2011) calls the “soft biopolitics” of algorithmic citizenship–is also, then, no doubt a prime target of value extraction. The proliferation of real-time analytics, such as 24/7 monitoring of your heart rate by fitness devices, provides valuable data for health insurance companies, which can micro-tune premiums if your heart rate is too low, too high, or unbalanced across zones. Students can now be assessed every day or even every hour, rather than every semester, in “personalized learning” schemes, their achievements continually measured against a battery of metrics and measurable goals. Or, returning to policing, we see an investment in what Shaw (2016) calls “algorithmic technics” of constant capital (batons, body cameras, police dogs, facial recognition, drones) and a disinvestment in variable capital (the human).
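
To make concrete how continuous monitoring could feed premium “micro-tuning”, here is a deliberately simplified sketch; every number, name, and rule in it is invented for illustration and is not drawn from any actual insurer:

```python
# Hypothetical sketch of "micro-tuned" premiums driven by real-time heart-rate
# data; the zone boundaries, base premium, and surcharge rule are all invented.

BASE_PREMIUM = 100.0

def premium(readings, low=50, high=100):
    """Nudge the premium up for every reading outside the 'healthy' zone."""
    out_of_zone = sum(1 for bpm in readings if bpm < low or bpm > high)
    surcharge_pct = 0.5 * out_of_zone  # each out-of-zone reading adds 0.5%
    return BASE_PREMIUM * (1 + surcharge_pct / 100)
```

The political point survives the toy: once behaviour is datafied, pricing can be adjusted at the cadence of the sensor rather than at the annual renewal.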

Perhaps surprisingly, however, if there was one theme that contributors were themselves anxious to acknowledge, it was that there is nothing special about algorithms that excuses them from critique. Algorithms are not a tool solely of the powerful but are, rather, a proliferating assemblage available across a diversity of spaces. Their very claims to closure and finality expose them to the realities of their actual asymmetries and uncertainties, and thus to the entry of an intriguing politics. For every territorialization, there is a deterritorialization. For example, the consumer drone market may already be worth $5 billion a year (more than the Pentagon spends on military drones) and offer multiple avenues for disruption of state technics through the realm of the algorithmic. The recent use of consumer drones by water protectors at Standing Rock to document their fight against the Dakota Access Pipeline and the actions of law enforcement presents one such example of the insurgent and perhaps unexpected political capacities of the algorithmic. In other words, the very anxieties that give rise to algorithms are those that might also produce the terms through which they can be resituated and reimagined.

Jeremy Crampton
University of Kentucky
[email protected]

Andrea Miller
University of California, Davis
[email protected]

The essays

Introduction – Jeremy Crampton and Andrea Miller

What Does It Mean To Govern With Algorithms? – Louise Amoore

Policing the Future City: Robotic Being-in-the-World – Ian Shaw

Data-Driven or Data-Justified? – Emily Kaufman

Algorithmic Anxieties – Jeremy Crampton

Protocological Violence and the Colonial Database – Andrea Miller

References

Amoore L (2013) The Politics of Possibility: Risk and Security Beyond Probability. Durham: Duke University Press

Amoore L and de Goede M (2008) Transactions after 9/11: The banal face of the preemptive strike. Transactions of the Institute of British Geographers 33(2):173-185

Cheney-Lippold J (2011) A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society 28(6):164-181

Crawford K (2014) The anxieties of big data. The New Inquiry 30 May

Deleuze G (1992) Postscript on the societies of control. October 59:3-7

Foucault M (2003) “Omnes et singulatim”: Toward a critique of political reason. In P Rabinow and N Rose (eds) The Essential Foucault (pp180-201). New York: New Press

Galloway A (2004) Protocol: How Control Exists After Decentralization. Cambridge: MIT Press

Galloway A (2006) Gaming: Essays on Algorithmic Culture. Minneapolis: University of Minnesota Press

Gillespie T (2014) The relevance of algorithms. In T Gillespie, P J Boczkowski and K A Foot (eds) Media Technologies (pp167-194). Cambridge: MIT Press

Harvey D (2003) The New Imperialism. New York: Oxford University Press

Kaufman E (2016) Policing mobilities through bio-spatial profiling in New York City. Political Geography 55:72-81

Miller A (2017) (Im)material terror: Incitement to violence discourse as racializing technology in the war on terror. In C Kaplan and L Parks (eds) Life in the Age of Drone Warfare (forthcoming). Durham: Duke University Press

Rouvroy A and Berns T (2013) Algorithmic governmentality and prospects of emancipation (trans E Libbrecht). Réseaux 177:163-196

Shaw I G R (2016) The urbanization of drone warfare: Policing surplus populations in the dronepolis. Geographica Helvetica 71(1):19-28

Shaw I and Akhter M (2012) The unbearable humanness of drone warfare in FATA, Pakistan. Antipode 44(4):1490-1509

Wall T (2016) Ordinary emergency: Drones, police, and geographies of legal terror. Antipode 48(4):1122-1139

Woodward K (2014) Affect, state theory, and the politics of confusion. Political Geography 41:21-31
