Scaling Up, Scaling Down: Racialism in the Age of Big Data

Figure 1: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.


2015, a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition My Blue Window at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word “FORECASTING” is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on an abandoned street as the words “CRIME DETERRED” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing software.

Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black people in the art world and beyond. Their multimedia works explore forms of cultural critique that critically engage systems of control, blackness, and networked culture. Foregrounding the analytic means through which data processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s 2015 interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.

As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, 2015 ultimately evokes a question of scale. Following Joshua DiCaglio (2021), scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21st century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.

Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black patients (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.

Through analysis of American Artist’s video installation, 2015, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of racial formations as data formations, that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s 2015 as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.

The scales of Euclidean anatomy

The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred on taxonomical hierarchies of human anatomy. The 18th-century naturalist Carl Linnaeus’s major classificatory work, Systema Naturae (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types, but it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19th century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale; what he terms the scale of comparative or Euclidean anatomy (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). By the middle of the nineteenth century, for example, the sciences of physiognomy, phrenology and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20th century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.

In the 21st century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. Its study concluded that there is no scientific evidence supporting the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond Euclidean anatomy, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.

This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have evinced, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasises the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, nineteenth-century ambitions to dissect the body, and thus reveal its hidden truths, “reveal a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualising, securitising and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21st-century successor, is being rendered in new perceptual formats remains an urgent question.

‘Racial formations as data formations’

Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21st century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?

Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy but of whole domains of human experience.

Big data technologies are often claimed to be more truthful, efficient, and objective than the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be demonstrably false – particularly for people of colour. Safiya Noble’s Algorithms of Oppression highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s Weapons of Math Destruction addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspice of objectivity. Such critiques often approach big data through the lens of bias – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.

This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing new racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21st century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term racial formations as data formations. That is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function – which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of the racial category itself.
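
The mechanism Phan and Wark describe can be made concrete in a few lines of code. The following is a minimal sketch, with entirely hypothetical, synthetic data and feature names: a standard classifier is trained without any explicit race column, yet a latent group label remains recoverable from correlated proxies such as postcode and language settings. It illustrates the general logic of proxy-based classification, not any platform’s actual system.

```python
# Minimal sketch: no "race" column exists in the feature matrix, yet a
# latent group label can be recovered from correlated proxies.
# All data are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Latent group membership: never shown to the model as a feature.
group = rng.integers(0, 2, size=n)

# Proxy features correlated with the latent group via segregation-like
# structure; "activity_score" carries no group signal at all.
postcode_cluster = group + rng.normal(0, 0.6, size=n)
language_flag = (rng.random(n) < 0.2 + 0.6 * group).astype(float)
activity_score = rng.normal(0, 1, size=n)

X = np.column_stack([postcode_cluster, language_flag, activity_score])

# Train on the proxies alone and test how well the latent group is
# reconstructed: an 'ethnic affinity'-style inference.
clf = LogisticRegression().fit(X, group)
print(f"latent group recovered from proxies: {clf.score(X, group):.0%}")
```

The point of the sketch is that the racialising classification is an emergent property of correlations in the data, not of any explicitly coded racial category.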

Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they are producing novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21st-century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21st-century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither Blackness nor race “has ever been fixed”; each reconstitutes itself in new ways (6).

The Problem of Prediction: Data-led policing in the U.S.

Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.

Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as hot spot criminology, sketched schematically below. By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.
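
At its core, hot spot identification is a simple aggregation procedure: bin recorded incidents into spatial cells and flag the busiest cells for patrol. The sketch below uses invented coordinates and an arbitrary grid size; it is a schematic of the technique, not CompStat’s or any department’s actual implementation. Note that its output reflects where incidents were recorded, which is itself already a function of where police were deployed.

```python
# Minimal sketch of hot-spot identification: bin recorded incident
# coordinates into grid cells and flag the top cells as patrol zones.
# Incident data and cell size are hypothetical.
from collections import Counter

# (x, y) coordinates of recorded incidents, in arbitrary city units.
incidents = [(1.2, 3.4), (1.3, 3.5), (1.1, 3.6), (7.8, 2.1), (1.4, 3.3)]
CELL_SIZE = 1.0  # side length of one grid cell

counts = Counter((int(x / CELL_SIZE), int(y / CELL_SIZE)) for x, y in incidents)

# Flag the k busiest cells; the ranking reflects *recorded* incidents,
# i.e. where police already looked, not underlying crime.
hot_spots = [cell for cell, _ in counts.most_common(2)]
print(hot_spots)  # e.g. [(1, 3), (7, 2)]
```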

Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).
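
The structure Scannell describes, environmental proxies weighted into a criminogenic score per map cell, can be sketched as follows. The feature names echo those listed above, but the weights and the linear form are illustrative assumptions, not HunchLab’s actual model.

```python
# Schematic sketch of proxy-weighted risk scoring: per-cell environmental
# features combined into a single "risk" score. Weights are illustrative
# assumptions, not any vendor's real coefficients.
from dataclasses import dataclass

@dataclass
class Cell:
    takeout_restaurants: int
    bus_stops: int
    prior_incidents: int
    rainfall_mm: float

# Hypothetical learned weights: none of these inputs names race, yet in a
# segregated city each can act as a spatial proxy for it.
WEIGHTS = {"takeout_restaurants": 0.4, "bus_stops": 0.3,
           "prior_incidents": 1.2, "rainfall_mm": -0.1}

def risk_score(cell: Cell) -> float:
    """Linear combination of proxy features into a criminogenic score."""
    return (WEIGHTS["takeout_restaurants"] * cell.takeout_restaurants
            + WEIGHTS["bus_stops"] * cell.bus_stops
            + WEIGHTS["prior_incidents"] * cell.prior_incidents
            + WEIGHTS["rainfall_mm"] * cell.rainfall_mm)

cells = [Cell(3, 2, 8, 5.0), Cell(1, 1, 0, 5.0)]
patrol_targets = sorted(cells, key=risk_score, reverse=True)
print([round(risk_score(c), 1) for c in patrol_targets])  # [10.9, 0.2]
```

The salient design point is that the model never sees a racial category; the differential valuation of places, and of the people in them, emerges entirely from the weighted proxies.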

Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, the input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).
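
Brayne’s feedback loop is easy to demonstrate with a toy simulation. In the sketch below, two areas have identical underlying incident rates, but one begins with more recorded crime; patrols are allocated in proportion to recorded history, and detection scales with patrol presence. All numbers are invented for illustration.

```python
# Toy simulation of a "self-fulfilling statistical prophecy":
# equal true incidence in both areas, but historical over-recording
# keeps patrols (and thus new detections) locked onto area A.
import random

random.seed(1)
TRUE_RATE = 10                  # identical true incidents per round
recorded = {"A": 30, "B": 10}   # area A starts with more *recorded* crime

for _ in range(20):
    total = recorded["A"] + recorded["B"]
    # Deploy patrols in proportion to recorded history.
    patrol_share = {area: recorded[area] / total for area in recorded}
    for area in recorded:
        # Detection probability scales with patrol presence.
        detected = sum(random.random() < patrol_share[area]
                       for _ in range(TRUE_RATE))
        recorded[area] += detected

print(recorded)  # A stays far ahead of B despite equal TRUE_RATE
```

Because each round’s detections are fed back as the next round’s inputs, the initial disparity in recorded crime persists and the absolute gap widens, even though the underlying rates never differ.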

Beyond this ‘garbage in, garbage out’ critique, Phan and Wark’s conceptualisation of racial formations as data formations provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?

2015

Figure 2: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.

This question returns us to American Artist’s video installation, 2015. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in documentary cinema and visuality more broadly, is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the filmic image.

The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam or a civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing ‘the recorded “seen-and-heard”’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of 2015’s intervention, one that interrogates the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.

Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of traces.

2015’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.

As American Artist’s 2015 so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s 2015 palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.

Figure 3: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.

Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition systems demonstrate bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.
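
The disparity named by the industry’s own term can be made auditable. Below is a minimal sketch that computes per-group failure-to-enrol rates from an enrolment log; the records are entirely invented for illustration and stand in for whatever a real audit would draw from a deployed system.

```python
# Minimal audit sketch: per-group "failure to enrol" (FTE) rates
# computed from an enrolment log. All records are invented.
from collections import Counter

# (group, enrolled_successfully) records -- hypothetical data.
log = ([("darker-skinned", False)] * 12 + [("darker-skinned", True)] * 88
       + [("lighter-skinned", False)] * 2 + [("lighter-skinned", True)] * 98)

attempts, failures = Counter(), Counter()
for group, enrolled in log:
    attempts[group] += 1
    failures[group] += (not enrolled)

for group in attempts:
    print(f"{group}: FTE rate {failures[group] / attempts[group]:.0%}")
# darker-skinned: FTE rate 12%; lighter-skinned: FTE rate 2%
```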

In American Artist’s 2015, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racialising formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data and, on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – that is, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racialising forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s 2015 is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.

Conclusion

This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19th century to the genomic revolution of the 1990s, I argue that race has always been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, conceiving humanity in novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of racial formations as data formations – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s 2015 visualises the ways in which these residual and emergent characteristics of racialism are embedded in the everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.

Works Cited

Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” Political Geography, vol. 25, no. 3, 2006, pp. 336-351.

Berman, Eli, et al. Small Wars, Big Data: The Information Revolution in Modern Conflict. Princeton University Press, 2018.

Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions.” Finance and Economics Discussion Series 2022-067, Board of Governors of the Federal Reserve System, 2022.

Brayne, Sarah. Predict and Surveil: Data, Discretion, and the Future of Policing. Oxford University Press, 2020.

Browne, Simone. Dark Matters: On the Surveillance of Blackness. Duke University Press, 2015.

Chun, Wendy Hui Kyong. Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition. MIT Press, 2021.

DiCaglio, Joshua. Scale Theory: A Nondisciplinary Inquiry. University of Minnesota Press, 2021.

Gilroy, Paul. “Race Ends Here.” Ethnic and Racial Studies, vol. 21, no. 5, 1998, pp. 838-847.

Jefferson, Brian. Digitize and Punish: Racial Criminalization in the Digital Age. University of Minnesota Press, 2020.

Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” Gartner, File No. 949, 6 February 2001, http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf.

Lloyd, David. Under Representation: The Racial Regime of Aesthetics. Fordham University Press, 2018.

Macnish, Kevin, and Jai Galliott, editors. Big Data and Democracy. Edinburgh University Press, 2020.

Mbembe, Achille. Critique of Black Reason. Duke University Press, 2013.

Melamed, Jodi. Represent and Destroy: Rationalizing Violence in the New Racial Capitalism. University of Minnesota Press, 2011.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.

Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” Science, vol. 366, no. 6464, 2019, pp. 447-453.

O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Allen Lane, 2016.

Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” Big Data & Society, vol. 8, no. 2, 2021, pp. 1-5.

Rothstein, Mark A. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” Journal of Law, Medicine & Ethics, vol. 49, no. 4, 2021, pp. 666-676.

Saini, Angela. Superior: The Return of Race Science. Beacon Press, 2020.

Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life, edited by Ruha Benjamin, Duke University Press, 2019, pp. 107-129.

Vogl, Joseph. Le spectre du capital. Diaphanes, 2013.

Winston, Brian. “Surveillance in the Service of Narrative.” A Companion to Contemporary Documentary Film, edited by Alexandra Juhasz and Alisa Lebow, John Wiley & Sons, 2015, pp. 611-628.

Womack, Autumn. The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880-1930. The University of Chicago Press, 2022.

Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” European Journal of Criminology, vol. 18, no. 5, 2021, pp. 623-642.