<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://cc.practices.tools/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Crichlow</id>
	<title>Creative Crowds wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="http://cc.practices.tools/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Crichlow"/>
	<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/Special:Contributions/Crichlow"/>
	<updated>2026-04-07T06:31:54Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.45.1</generator>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2759</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2759"/>
		<updated>2023-06-30T13:41:07Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: /* Scaling Up, Scaling Down: Racialism in the Age of Big Data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
= Scaling Up, Scaling Down: Racialism in the Age of Big Data =&lt;br /&gt;
&#039;&#039;&#039;Camille Crichlow&#039;&#039;&#039;&lt;br /&gt;
[[File:My Blue Window - Composition.00 02 24 00.Still045.jpg|thumb|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This article explores the shifting perceptual scales of racial epistemology and anti-blackness in predictive policing technology. Following Paul Gilroy, I argue that the historical production of racism and anti-blackness has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Where racialisation was once bound to the anatomical scale of the body, Thao Phan and Scott Wark’s conceptualisation of “racial formations as data formations” informs insights into the ways in which “race”, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is increasingly being produced as a cultivation of post-visual, data-driven abstractions. I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that divide civilian from suspect. Beyond a “garbage in, garbage out” critique, I explore the ways in which predictive policing instils racialisation as an epiphenomenon of data-generated proxies. By way of conclusion, I analyse American Artist’s 21-minute video installation &#039;&#039;2015&#039;&#039; (2019), which depicts the point of view of a police patrol car equipped with a predictive policing device, to parse the scales upon which algorithmic regimes of racial domination are produced and resisted.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed on the vehicle’s front windshield, a continuous flow of statistical data compares the frequency of crime in 2015 with that of the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. Below a shifting animation of neon pink clouds, the word “forecasting” appears as the sun rises on the freeway. The vehicle suddenly changes course, veering towards an exit guided by a series of blinking ‘hot spots’ identified on the screen’s navigation grid. Over the deafening din of a police siren, the car races towards its analytically derived patrol zone. The movement of the camera slows to a stop on an abandoned street as the words “Crime Deterred” repetitively pulse across the screen. This narrative arc circuitously structures the filmic point of view of a predictive policing device.&lt;br /&gt;
&lt;br /&gt;
In tandem with American Artist’s broader multimedia oeuvre, &#039;&#039;2015&#039;&#039; similarly operates at historical intersections of race, technology, and knowledge production. Their legal name change to American Artist in 2013 suggests a purposeful play with ambivalence, one that foregrounds the visibility and erasure of black art practice, asserting blackness as descriptive of an American artist while simultaneously signalling anonymity to evade the surveillant logics of virtual spaces. Across their multimedia works, forms of cultural critique stage the relation between blackness and power while addressing histories of network culture. Foregrounding the analytic means through which data-processing and algorithms augment and amplify racial violence against black people in predictive policing technology, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary-like footage to construct a unique experimental means to invite rumination on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; plays with scale as response. Following Joshua DiCaglio, I invoke scale here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Relatedly, scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has &#039;&#039;always&#039;&#039; been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, racialisation finds novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. No doubt, residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience. Here, however, I adopt a different orientation, one that specifically examines the less considered role of data-driven technologies that increasingly inscribe racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne; Chun). Indeed, predictive analytics range across a wide spectrum of sociality. Health care algorithms employed to predict and rank patient care favour white patients over black (Obermeyer), for example, and automated welfare eligibility calculations keep the racialised poor from accessing state-funded resources (Rao; Toos). Relatedly, credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al.). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Informed by the “creeping” role of prediction and subsequent “zones of suspicion,” I consider how racial epistemology is actively reconstructed and reified within the scalar magnitude of ‘big data’. This article will focus on racialisation as it is bound up in the historical production of blackness in the American context, though I will touch on the ways in which big data is reframing the categories upon which former racial classifications rest more broadly. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations,&#039;&#039; that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. To do this, I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that map new categories of human difference through statistical inferences of risk. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
== The scales of Euclidean anatomy ==&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred around taxonomical hierarchies of human anatomy. 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards biological hierarchisation of racial types. Nevertheless, it inaugurated a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, the emergence of a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting, and evaluating” – a trend that could only move further inwards and downwards under the surface of the skin (844).
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
Now twenty years into the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, our perceptual regime has been fundamentally altered by exponential advancements in the capacities of digital technology. Developments across computational, biological, and analytic sciences produce new forms of perceptual scale, and with it, as Gilroy suggests, open consideration for envisioning the end of race as we know it. Writing in the late 1990s, Gilroy observed how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy proposes, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence that supports the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, biological conceptions of race were disproved as a scientifically valid construct. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy discerns, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once overdetermined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
Rehearsing this argument is not meant to suggest that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. Gilroy (“Race and Racism in ‘The Age of Obama’”), along with his critics, makes clear that the “normative potency” of biological racism retains rhetorical and semiotic force within contemporary culture. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently insecure and can be made to yield, politically and culturally, to alternative visions of non-racialism. To combat the emergent racism of the present, this vision suggests, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy contends, we can do “a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839).&lt;br /&gt;
&lt;br /&gt;
Attending to these tasks of intervention requires that we keep in mind the myriad ways in which the residual traces left by older racial regimes subtly insinuate themselves into the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth century ambition to dissect the body, and thus lay bare its hidden truths, also “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the question of how race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains urgent.&lt;br /&gt;
&lt;br /&gt;
== ‘Racial formations as data formations’ ==&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliot), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes – garbage in, garbage out. Demands for inclusion or “unbiased data”, however, often fail to address the racialised dialectic between inside and outside, human and Other. As Ramon Amaro argues, “to merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic catalyses disruption superficially” (53). From this perspective, the racial other is positioned in opposition to the prototypical classification, which is whiteness, and is thus seen as “alienated, fragmented, and lacking in comparison” (Amaro 53). If the end goal is inclusion, Amaro follows, what about a right of refusal to representation? This question is particularly pertinent in a context where inclusion also means exposure to heightened forms of surveillance for racialised communities, particularly in the context of policing (Lee and Chin).&lt;br /&gt;
&lt;br /&gt;
Relatedly, the language of bias, inclusion, and exclusion does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer solely predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;. That is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they are producing novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predication, which have shaped the lives of black people from the onset of the transatlantic slave trade, “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather, the production of “categories that render disposable populations disposable to violence” (Lloyd 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Nonetheless, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”, but rather reconstitutes itself in new ways (6).
In the next section, I turn to predictive policing technology to parse the ways in which data regimes are mapping new terrains upon which racial formations are produced and sustained.&lt;br /&gt;
&lt;br /&gt;
== The Problem of Prediction: Data-led policing in the U.S. ==&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small-scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high-crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before. The analytically derived “impact zone” can thus be understood as a bordering technology – one that sorts and divides civilian populations from those marked by higher probabilities of risk, and thus suspicion.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which black communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out,’ Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively produced not merely through historical data, but in the correlative models themselves (Lloyd 2). While these statistically generated “patrol zones” tend to map onto historically racialised communities, this process of racialisation does not necessarily correspond to the visual, or phenotypic, signifiers of race. What emerges in these correlative models are novel kinds of classifications that arise from probabilistic inferences of suspicion through which subjects – often racial minorities – are exposed to heightened surveillance and violence. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). The question remains, as neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, what becomes of the body in this post-visual shift?&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;2015&#039;&#039; ==&lt;br /&gt;
[[File:Image 2.jpg|thumb|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This provocation returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the work, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of evidentiary image-making, and visuality more broadly, is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the moving image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Brian Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that interrogates the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes that implicitly inform the surveillance context are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but, rather, on a digital record of &#039;&#039;traces&#039;&#039;. The representations of predictive policing software and biometric identification in American Artist’s &#039;&#039;2015&#039;&#039; allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
Here, as &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveillant apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
[[File:Scan.jpg|thumb|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
Yet, at the same time, the work’s aesthetic intervention interrogates the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This visceral reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that the primordial, objectifying force of visual evidence is transcended by neutral-seeming post-visual data apparatuses. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on the one hand, as a product of data and, on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the evidentiary logics of surveillance apparatuses – that is, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-like coded format, American Artist calls into question what it means to document, record, or survey within the frame of moving images. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is a prime example of emergent activist political intervention that interrogates the underlying assumptions of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
This article explores the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I show that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies afford potentially new ways of seeing the body, and subsequently, of conceiving humanity in novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. In the context of the ongoing logics of contemporary race, American Artist’s &#039;&#039;2015&#039;&#039; returns consideration to the ways in which residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, Artist’s video work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. In this instance, American Artist orchestrates one critical means to grasp racialism’s multiple forms, past and present, visual and otherwise, towards future modalities and determinations not yet realised.&lt;br /&gt;
&lt;br /&gt;
== Works cited ==&lt;br /&gt;
Amaro, Ramon. &#039;&#039;The Black Technical Object: On Machine Learning and the Aspiration of Black Being&#039;&#039;. Sternberg Press, 2023.&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions.” Finance and Economics Discussion Series 2022-067, Board of Governors of the Federal Reserve System, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. The MIT Press, 2021. &lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race and Racism In the Age of Obama”. The Tenth Annual Eccles Centre for American Studies Plenary Lecture given at the British Association for American Studies Annual Conference, 2013.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Duke University Press, 2017.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, vol. 366, no. 6464, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, Mark A. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: the Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin, Duke University Press, 2019, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: the Aesthetic Experiment of Racial Data, 1880-1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:Image_2.jpg&amp;diff=2758</id>
		<title>File:Image 2.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:Image_2.jpg&amp;diff=2758"/>
		<updated>2023-06-30T13:38:38Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;American Artist, still from 2015&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:My_Blue_Window_-_Composition.00_02_24_00.Still045.jpg&amp;diff=2757</id>
		<title>File:My Blue Window - Composition.00 02 24 00.Still045.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:My_Blue_Window_-_Composition.00_02_24_00.Still045.jpg&amp;diff=2757"/>
		<updated>2023-06-30T13:37:37Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Figure 1: American Artist, still from 2015&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2193</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2193"/>
		<updated>2023-06-11T14:41:46Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Abstract:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This article explores the shifting perceptual scales of racial epistemology and anti-blackness in predictive policing technology. Following Paul Gilroy, I argue that the historical production of racism and anti-blackness has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Where racialisation was once bound to the anatomical scale of the body, Thao Phan and Scott Wark’s conceptualisation of “racial formations as data formations” informs insights into the ways in which “race”, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is increasingly being produced as a cultivation of post-visual, data-driven abstractions. I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that divide civilian from suspect. Beyond a “garbage in, garbage out” critique, I explore the ways in which predictive policing instils racialisation as an epiphenomenon of data-generated proxies. By way of conclusion, I analyse American Artist’s 21-minute video installation &#039;&#039;2015&#039;&#039; (2019), which depicts the point of view of a police patrol car equipped with a predictive policing device, to parse the scales upon which algorithmic regimes of racial domination are produced and resisted.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Bio:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Camille Crichlow is a PhD Researcher at the Sarah Parker Remond Centre for the Study of Racism and Racialisation (University College London). Her research interrogates how the historical and socio-cultural narrative of race manifests in contemporary algorithmic technologies.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Key words:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Big data, predictive policing, post-visual, race, blackness&lt;br /&gt;
&lt;br /&gt;
=== Introduction ===&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed on the vehicle’s front windshield, a continuous flow of statistical data registers the frequency of crime, comparing 2015 with the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. Below a shifting animation of neon pink clouds, the word “forecasting” appears as the sun rises on the freeway. The vehicle suddenly changes course, veering towards an exit guided by a series of blinking ‘hot spots’ identified on the screen’s navigation grid. Over the deafening din of a police siren, the car races towards its analytically derived patrol zone. The movement of the camera slows to a stop on an abandoned street as the words “Crime Deterred” repetitively pulse across the screen. This narrative arc circuitously structures the filmic point of view of a predictive policing device.&lt;br /&gt;
&lt;br /&gt;
In tandem with American Artist’s broader multimedia oeuvre, &#039;&#039;2015&#039;&#039; similarly operates at historical intersections of race, technology, and knowledge production. Their legal name-change to American Artist in 2013 suggests a purposeful play with ambivalence, one that foregrounds the visibility and erasure of black art practice, asserting blackness as descriptive of an American artist while simultaneously signalling anonymity to evade the surveillant logics of virtual spaces. Across their multimedia works, forms of cultural critique stage the relation between blackness and power while addressing histories of network culture. Foregrounding the analytic means through which data-processing and algorithms augment and amplify racial violence against black people in predictive policing technology, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary-like footage to construct a unique experimental means to invite rumination on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; plays with scale as response. Following Joshua DiCaglio, I invoke scale here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Relatedly, scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has &#039;&#039;always&#039;&#039; been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, racialisation finds novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. No doubt, residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience. Here, however, I adopt a different orientation, one that specifically examines the less considered role of data-driven technologies that increasingly inscribe racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne; Chun). Indeed, predictive analytics range across a wide spectrum of sociality. Health care algorithms employed to predict and rank patient care favour white patients over black (Obermeyer), and automated welfare eligibility calculations keep the racialised poor from accessing state-funded resources (Rao; Toos), for example. Relatedly, credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al.). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Informed by the “creeping” role of prediction and subsequent “zones of suspicion,” I consider how racial epistemology is actively reconstructed and reified within the scalar magnitude of ‘big data’. This article will focus on racialisation as it is bound up in the historical production of blackness in the American context, though I will touch on the ways in which big data is reframing the categories upon which former racial classifications rest more broadly. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations,&#039;&#039; that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. To do this, I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that map new categories of human difference through statistical inferences of risk. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== The scales of Euclidean anatomy ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centered around taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt;-century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types. Nevertheless, it inaugurated a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, the emergence of a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting, and evaluating” – a trend that could only move further inwards and downwards under the surface of the skin (844). By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
Now twenty years into the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, our perceptual regime has been fundamentally altered by exponential advancements in digital technology capacity. Developments across computational, biological, and analytic sciences produce new forms of perceptual scale, and with them, as Gilroy suggests, open consideration for envisioning the end of race as we know it. Writing in the late 1990s, Gilroy observed how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy proposes, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. Their study concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, biological conceptions of race were disproved as a scientifically valid construct. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy discerns, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once overdetermined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
Rehearsing this argument is not meant to suggest that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. Gilroy (“Race and Racism in ‘The Age of Obama’”), along with his critics, makes clear that the “normative potency” of biological racism retains rhetorical and semiotic force within contemporary culture. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently insecure and can be made to yield, politically and culturally, to alternative visions of non-racialism. To combat the emergent racism of the present, this vision suggests, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy directs attention to the task of doing “a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839).&lt;br /&gt;
&lt;br /&gt;
Attending to these tasks of intervention requires that we keep in mind the myriad ways in which the residual traces left by older racial regimes subtly insinuate themselves into the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus lay bare its hidden truths, also “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, how race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== ‘Racial formations as data formations’ ===&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be patently false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes – garbage in, garbage out. Demands for inclusion or “unbiased data”, however, often fail to address the racialised dialectic between inside and outside, human and Other. As Ramon Amaro argues, “to merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic catalyses disruption superficially” (53). From this perspective, the racial other is positioned in opposition to the prototypical classification – whiteness – and is thus seen as “alienated, fragmented, and lacking in comparison” (Amaro 53). If the end goal is inclusion, Amaro asks, what of a right of refusal to representation? This question is particularly pertinent in a context where inclusion also means exposure to heightened forms of surveillance for racialised communities, especially in the context of policing (Lee and Chin).&lt;br /&gt;
&lt;br /&gt;
Relatedly, the language of bias, inclusion, and exclusion does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications that emerge not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer solely predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing on examples such as Facebook’s now-disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators such as language, ‘likes’, and IP address, Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of the racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Nonetheless, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”, but rather reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
In the next section, I turn to predictive policing technology to parse the ways in which data regimes are mapping new terrains upon which racial formations are produced and sustained.&lt;br /&gt;
&lt;br /&gt;
=== The Problem of Prediction: Data-led policing in the U.S. ===&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small-scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for the entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fallout from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before. The analytically derived “impact zone” can thus be understood as a bordering technology – one that sorts and divides civilian populations from those marked by higher probabilities of risk, and thus suspicion.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models”, such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, the input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which black communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out’, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how the differential valuations of risk that “render disposable populations disposable to violence” are actively produced not merely through historical data, but in the correlative models themselves (Lloyd 2). While these statistically generated “patrol zones” tend to map onto historically racialised communities, this process of racialisation does not necessarily correspond to the visual, or phenotypic, signifiers of race. What emerges in these correlative models are novel kinds of classifications arising from probabilistic inferences of suspicion through which subjects – often racial minorities – are exposed to heightened surveillance and violence. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). The question remains: as neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, what becomes of the body in this post-visual shift?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;2015&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This provocation returns us to American Artist’s video installation &#039;&#039;2015&#039;&#039;. From the outset of the work, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black communities by police. The world is captured here from a single axis, a singular ideological vantage point, as an already-known cityscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity – an enduring object of scrutiny in the field of evidentiary image-making, and visuality more broadly – is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the moving image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander’s camera, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Brian Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that interrogates the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, the scopic regimes that implicitly inform the surveillance context are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;. &#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, predictive analytic technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
Here, as &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveillant apparatus. The unadorned message: race is produced and sustained as a product of data.[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work’s aesthetic intervention interrogates the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This visceral reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that the primordial, objectifying force of visual evidence is transcended by neutral-seeming post-visual data apparatuses. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth and nineteenth century colonial and ethnographic regimes of physiological classification that aimed to expose a certain truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body on one hand as a product of data, and on the other as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the evidentiary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, coded, documentary-like format, American Artist calls into question what it means to document, record, or survey within the frame of moving images. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is a prime example of emergent activist political intervention that interrogates the underlying assumptions of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== Conclusion ===&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I have shown that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies afford potentially new ways of seeing the body, and subsequently, of conceiving humanity at novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. In the context of the ongoing logics of contemporary race, American Artist’s &#039;&#039;2015&#039;&#039; returns consideration to the ways in which residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, Artist’s video work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. In this instance, American Artist orchestrates one critical means to grasp racialism’s multiple forms, past and present, visual and otherwise, towards future modalities and determinations not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
Amaro, Ramon. &#039;&#039;The Black Technical Object: On Machine Learning and the Aspiration of Black Being&#039;&#039;. Sternberg Press, 2023.&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race and Racism in the Age of Obama.” The Tenth Annual Eccles Centre for American Studies Plenary Lecture, given at the British Association for American Studies Annual Conference, 2013.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press, 2013.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, vol. 366, no. 6464, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin, Duke University Press, 2019, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: the Aesthetic Experiment of Racial Data, 1880-1930.&#039;&#039;  The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2192</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2192"/>
		<updated>2023-06-11T14:40:02Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Abstract:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This article explores the shifting perceptual scales of racial epistemology and anti-blackness in predictive policing technology. Following Paul Gilroy, I argue that the historical production of racism and anti-blackness has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Where racialisation was once bound to the anatomical scale of the body, Thao Phan and Scott Wark’s conceptualisation of “racial formations as data formations” informs insights into the ways in which “race”, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is increasingly being produced as a cultivation of post-visual, data-driven abstractions. I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that divide civilian from suspect. Beyond a “garbage in, garbage out” critique, I explore the ways in which predictive policing instils racialisation as an epiphenomenon of data-generated proxies. By way of conclusion, I analyse American Artist’s 21-minute video installation &#039;&#039;2015&#039;&#039; (2019), which depicts the point of view of a police patrol car equipped with a predictive policing device, to parse the scales upon which algorithmic regimes of racial domination are produced and resisted.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Bio:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Camille Crichlow is a PhD Researcher at the Sarah Parker Remond Centre for the Study of Racism and Racialisation (University College London). Her research interrogates how the historical and socio-cultural narrative of race manifests in contemporary algorithmic technologies.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Key words:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Big data, predictive policing, post-visual, race, blackness&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed on the vehicle’s front windshield, a continuous flow of statistical data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. Below a shifting animation of neon pink clouds, the word “forecasting” appears as the sun rises on the freeway. The vehicle suddenly changes course, veering towards an exit guided by a series of blinking ‘hot spots’ identified on the screen’s navigation grid. Over the deafening din of a police siren, the car races towards its analytically derived patrol zone. The movement of the camera slows to a stop on an abandoned street as the words “Crime Deterred” repetitively pulse across the screen. This narrative arc circuitously structures the filmic point of view of a predictive policing device.&lt;br /&gt;
&lt;br /&gt;
In tandem with American Artist’s broader multimedia oeuvre, &#039;&#039;2015&#039;&#039; operates at the historical intersections of race, technology, and knowledge production. Their legal name change to American Artist in 2013 suggests a purposeful play with ambivalence: one that foregrounds the visibility and erasure of black art practice, asserting blackness as descriptive of an American artist while simultaneously signalling anonymity to evade the surveillant logics of virtual spaces. Across their multimedia works, forms of cultural critique stage the relation between blackness and power while addressing histories of network culture. Foregrounding the analytic means through which data processing and algorithms augment and amplify racial violence against black people, &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary-like footage into an experimental form that invites rumination on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; plays with scale as response. Following Joshua DiCaglio, I invoke scale here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Relatedly, scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has &#039;&#039;always&#039;&#039; been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, racialisation finds novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. No doubt, residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience. Here, however, I adopt a different orientation, one that specifically examines the less considered role of data-driven technologies that increasingly inscribe racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne; Chun). Indeed, predictive analytics range across a wide spectrum of sociality. Health care algorithms employed to predict and rank patient care favour white patients over black (Obermeyer), and automated welfare eligibility calculations keep the racialised poor from accessing state-funded resources (Rao; Toos). Relatedly, credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al.). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Informed by the “creeping” role of prediction and subsequent “zones of suspicion,” I consider how racial epistemology is actively reconstructed and reified within the scalar magnitude of ‘big data’. This article focuses on racialisation as it is bound up in the historical production of blackness in the American context, though I also touch on the ways in which big data is reframing the categories upon which former racial classifications rest more broadly. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039; – that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1) – I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. To do this, I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that map new categories of human difference through statistical inferences of risk. I conclude by returning to American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== The scales of Euclidean anatomy ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred around taxonomical hierarchies of human anatomy. 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types. Nevertheless, it inaugurated a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus the emergence of a new kind of racial scale: what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting, and evaluating” – a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
Now twenty years into the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, our perceptual regime has been fundamentally altered by exponential advancements in digital technology. Developments across the computational, biological, and analytic sciences produce new forms of perceptual scale and, with them, as Gilroy suggests, open consideration for envisioning the end of race as we know it. Writing in the late 1990s, Gilroy observed how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy proposes, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, biological conceptions of race were disproved as a scientifically valid construct. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy discerns, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once overdetermined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
Rehearsing this argument is not meant to suggest that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. Gilroy (“Race and Racism in ‘The Age of Obama’”), along with his critics, makes clear that the “normative potency” of biological racism retains rhetorical and semiotic force within contemporary culture. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently insecure and can be made to yield, politically and culturally, to alternative visions of non-racialism. To combat the emergent racism of the present, this vision suggests, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy suggests, we can do “a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839).&lt;br /&gt;
&lt;br /&gt;
Attending to these tasks of intervention requires that we keep in mind the myriad ways in which the residual traces left by older racial regimes subtly insinuate themselves into the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, nineteenth-century ambitions to dissect the body, and thus lay bare its hidden truths, also “reveal a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the question of how race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains urgent.&lt;br /&gt;
&lt;br /&gt;
=== ‘Racial formations as data formations’ ===&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik), and politics (Macnish and Galliot), data acquisition and analytics increasingly form the bedrock not only of the global economy but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective than the biased and error-prone tendencies of human decision-making. Their critics, however, have shown this assumption to be outright false – especially for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspice of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes – garbage in, garbage out. Demands for inclusion or “unbiased data”, however, often fail to address the racialised dialectic between inside and outside, human and Other. As Ramon Amaro argues, “to merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic catalyses disruption superficially” (53). From this perspective, the racial other is positioned in opposition to the prototypical classification, which is whiteness, and is thus seen as “alienated, fragmented, and lacking in comparison” (Amaro 53). If the end goal is inclusion, Amaro follows, what about a right to refuse representation? This question is particularly pertinent in a context where inclusion also means exposure to heightened forms of surveillance for racialised communities, particularly in the context of policing (Lee and Chin).&lt;br /&gt;
&lt;br /&gt;
Relatedly, the language of bias, inclusion, and exclusion does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer solely predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function – which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Nonetheless, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”, but rather reconstitutes itself in new ways (6). 
In the next section, I turn to predictive policing technology to parse the ways in which data regimes are mapping new terrains upon which racial formations are produced and sustained.&lt;br /&gt;
&lt;br /&gt;
=== The Problem of Prediction: Data-led policing in the U.S. ===&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small-scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for the entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before. The analytically derived “impact zone” can thus be understood as a bordering technology – one that sorts and divides civilian populations from those marked by higher probabilities of risk, and thus suspicion.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models”, such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world have historically been, and are contemporaneously, subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, the input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which black communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out,’ Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively produced not merely through historical data, but in the correlative models themselves (Lloyd 2). While these statistically generated “patrol zones” tend to map onto historically racialised communities, this process of racialisation does not necessarily correspond to the visual, or phenotypic, signifiers of race. What emerges in these correlative models are novel kinds of classifications that arise from probabilistic inferences of suspicion through which subjects – often racial minorities – are exposed to heightened surveillance and violence. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). The question remains: as neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, what becomes of the body in this post-visual shift?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;2015&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This provocation returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the work, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera, fixed onto the dashboard of the police car, positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity – an enduring object of scrutiny in the field of evidentiary image-making, and visuality more broadly – is always implicated in the politics of what exists beyond its view, thus interrogating the assumed indexicality, or visual truth, of the moving image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander’s recording, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Brian Winston notes, “what was beyond the frame containing the recorded ‘seen-and-heard’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes that implicitly inform the surveillance context are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;. American Artist’s &#039;&#039;2015&#039;&#039; represents predictive policing software and technologies of biometric identification, alluding to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
Here, as &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveillant apparatus. The unadorned message: race is produced and sustained as a product of data.[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work’s aesthetic intervention interrogates the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This visceral reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that the primordial, objectifying force of visual evidence is transcended by neutral-seeming post-visual data apparatuses. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the evidentiary logics of surveillance apparatuses – what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-like coded format, American Artist calls into question what it means to document, record, or survey within the frame of moving images. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is a prime example of emergent activist political intervention that interrogates the underlying assumptions of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== Conclusion ===&lt;br /&gt;
&lt;br /&gt;
This article explores the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I show that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies afford potentially new ways of seeing the body and, subsequently, of conceiving humanity at novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. In the context of the ongoing logics of contemporary race, American Artist’s &#039;&#039;2015&#039;&#039; returns consideration to the ways in which residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, Artist’s video work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. In this instance, American Artist orchestrates one critical means to grasp racialism’s multiple forms, past and present, visual and otherwise, towards future modalities and determinations not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
Amaro, Ramon. &#039;&#039;The Black Technical Object: On Machine Learning and the Aspiration of Black Being&#039;&#039;. Sternberg Press, 2023.&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety”, &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race and Racism in ‘The Age of Obama’.” The Tenth Annual Eccles Centre for American Studies Plenary Lecture, given at the British Association for American Studies Annual Conference, 2013.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press, 2017.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, vol. 366, no. 6464, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin, Duke University Press, 2019, pp. 107–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative”. &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2191</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2191"/>
		<updated>2023-06-11T14:35:58Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed on the vehicle’s front windshield, a continuous flow of statistical data registers the frequency of crime between 2015 and the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. Below a shifting animation of neon pink clouds, the word “forecasting” appears as the sun rises on the freeway. The vehicle suddenly changes course, veering towards an exit guided by a series of blinking ‘hot spots’ identified on the screen’s navigation grid. Over the deafening din of a police siren, the car races towards its analytically derived patrol zone. The movement of the camera slows to a stop on an abandoned street as the words “Crime Deterred” repetitively pulse across the screen. This narrative arc circuitously structures the filmic point of view of a predictive policing device.&lt;br /&gt;
&lt;br /&gt;
In tandem with American Artist’s broader multimedia oeuvre, &#039;&#039;2015&#039;&#039; similarly operates at historical intersections of race, technology, and knowledge production. Their legal name-change to American Artist in 2013 suggests a purposeful play with ambivalence, one that foregrounds the visibility and erasure of black art practice, asserting blackness as descriptive of an American artist while simultaneously signalling anonymity to evade the surveillant logics of virtual spaces. Across their multimedia works, forms of cultural critique stage the relation between blackness and power while addressing histories of network culture. Foregrounding the analytic means through which data-processing and algorithms augment and amplify racial violence against black people in predictive policing technology, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary-like footage to construct a unique experimental means to invite rumination on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; plays with scale as response. Following Joshua DiCaglio, I invoke scale here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Relatedly, scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has &#039;&#039;always&#039;&#039; been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, racialisation finds novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. No doubt, residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience. Here, however, I adopt a different orientation, one that specifically examines the less considered role of data-driven technologies that increasingly inscribe racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne; Chun). Indeed, predictive analytics range across a wide spectrum of sociality. Health care algorithms employed to predict and rank patient care favour white patients over black (Obermeyer), for example, and automated welfare eligibility calculations keep the racialised poor from accessing state-funded resources (Rao; Toos). Relatedly, credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al.). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Informed by the “creeping” role of prediction and subsequent “zones of suspicion,” I consider how racial epistemology is actively reconstructed and reified within the scalar magnitude of ‘big data’. This article will focus on racialisation as it is bound up in the historical production of blackness in the American context, though I will touch on the ways in which big data is reframing the categories upon which former racial classifications rest more broadly. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039;, that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. To do this, I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that map new categories of human difference through statistical inferences of risk. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== The scales of Euclidean anatomy ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centered around taxonomical hierarchies of human anatomy. 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards biological hierarchisation of racial types. Nevertheless, it inaugurated a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, the emergence of a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting, and evaluating” – a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
Now twenty years into the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, our perceptual regime has been fundamentally altered by exponential advancements in digital technology capacity. Developments across computational, biological, and analytic sciences produce new forms of perceptual scale, and with it, as Gilroy suggests, open consideration for envisioning the end of race as we know it. Writing in the late 1990s, Gilroy observed how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy proposes, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence that supports the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, biological conceptions of race were disproved as a scientifically valid construct. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy discerns, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once overdetermined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
Rehearsing this argument is not meant to suggest that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. Gilroy (“Race and Racism in ‘The Age of Obama’”), along with his critics, makes clear that the “normative potency” of biological racism retains rhetorical and semiotic force within contemporary culture. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently insecure and can be made to yield, politically and culturally, to alternative visions of non-racialism. To combat the emergent racism of the present, this vision suggests, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy directs attention to the task of doing “a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839).&lt;br /&gt;
&lt;br /&gt;
Attending to these tasks of intervention requires that we keep in mind the myriad ways in which the residual traces left by older racial regimes subtly insinuate themselves into the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth century ambition to dissect the body, and thus lay bare its hidden truths, also “reveal a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== ‘Racial formations as data formations’ ===&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ insofar as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik), and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of everyday domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be outright false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes – garbage in, garbage out. Demands for inclusion or “unbiased data”, however, often fail to address the racialised dialectic between inside and outside, human and Other. As Ramon Amaro argues, “to merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic catalyses disruption superficially” (53). From this perspective, the racial other is positioned in opposition to the prototypical classification, which is whiteness, and is thus seen as “alienated, fragmented, and lacking in comparison” (Amaro 53). If the end goal is inclusion, Amaro follows, what about a right of refusal to representation? This question is particularly pertinent in a context where inclusion also means exposure to heightened forms of surveillance for racialised communities, particularly in the context of policing (Lee and Chin).&lt;br /&gt;
&lt;br /&gt;
Relatedly, the language of bias, inclusion, and exclusion does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer solely predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;. That is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now-disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address, Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they are producing novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather, the production of “categories that render disposable populations disposable to violence” (Lloyd 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Nonetheless, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”, but rather reconstitutes itself in new ways (6). 
In the next section, I turn to predictive policing technology to parse the ways in which data regimes are mapping new terrains upon which racial formations are produced and sustained.&lt;br /&gt;
&lt;br /&gt;
=== The Problem of Prediction: Data-led Policing in the U.S. ===&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small-scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology&#039;&#039;. By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before. The analytically derived “impact zone” can thus be understood as a bordering technology – one that sorts and divides civilian populations from those marked by higher probabilities of risk, and thus suspicion.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which black communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out,’ Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively produced not merely through historical data, but in the correlative models themselves (Lloyd 2). While these statistically generated “patrol zones” tend to map onto historically racialised communities, this process of racialisation does not necessarily correspond to the visual, or phenotypic, signifiers of race. What emerges in these correlative models are novel kinds of classifications that arise from probabilistic inferences of suspicion through which subjects – often racial minorities – are exposed to heightened surveillance and violence. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). The question remains: as neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, what becomes of the body in this post-visual shift?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;2015&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This provocation returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the work, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of evidentiary image-making and visuality more broadly, is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the moving image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers charged with assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or civilian bystander’s recording, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Brian Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that interrogates the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes that implicitly inform the surveillance context are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;. &#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the scope in which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
Here, as &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveillant apparatus. The unadorned message: race is produced and sustained as a product of data.[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work’s aesthetic intervention interrogates the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This visceral reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that the primordial, objectifying force of visual evidence is transcended by neutral-seeming post-visual data apparatuses. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data and, on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the evidentiary logics of surveillance apparatuses – that is, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of moving images. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is a prime example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== Conclusion ===&lt;br /&gt;
&lt;br /&gt;
This article explores the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I show that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies afford potentially new ways of seeing the body, and subsequently, of conceiving humanity in novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. In the context of the ongoing logics of contemporary race, American Artist’s &#039;&#039;2015&#039;&#039; returns consideration to the ways in which residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, Artist’s video work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. In this instance, American Artist orchestrates one critical means to grasp racialism’s multiple forms, past and present, visual and otherwise, towards future modalities and determinations not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
Amaro, Ramon. &#039;&#039;The Black Technical Object: On Machine Learning and the Aspiration of Black Being&#039;&#039;. Sternberg Press, 2023.&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety”, &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race and Racism in the Age of Obama.” The Tenth Annual Eccles Centre for American Studies Plenary Lecture, British Association for American Studies Annual Conference, 2013.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Translated by Laurent Dubois, Durham: Duke University Press, 2017.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, vol. 366, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin, Duke University Press, 2019, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2190</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=2190"/>
		<updated>2023-06-11T14:34:13Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed on the vehicle’s front windshield, a continuous flow of statistical data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. Below a shifting animation of neon pink clouds, the word “forecasting” appears as the sun rises on the freeway. The vehicle suddenly changes course, veering towards an exit guided by a series of blinking ‘hot spots’ identified on the screen’s navigation grid. Over the deafening din of a police siren, the car races towards its analytically derived patrol zone. The movement of the camera slows to a stop on an abandoned street as the words “Crime Deterred” repetitively pulse across the screen. This narrative arc circuitously structures the filmic point of view of a predictive policing device.&lt;br /&gt;
&lt;br /&gt;
In tandem with American Artist’s broader multimedia oeuvre, &#039;&#039;2015&#039;&#039; operates at historical intersections of race, technology, and knowledge production. Their legal name-change to American Artist in 2013 suggests a purposeful play with ambivalence – one that foregrounds the visibility and erasure of black art practice, asserting blackness as descriptive of an American artist, while simultaneously signalling anonymity to evade the surveillant logics of virtual spaces. Across their multimedia works, forms of cultural critique stage the relation between blackness and power while addressing histories of network culture. Foregrounding analytic means through which data-processing and algorithms augment and amplify racial violence against black people in predictive policing technology, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary-like footage to construct a unique experimental means to invite rumination on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; plays with scale as response. Following Joshua DiCaglio, I invoke scale here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Relatedly, scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has &#039;&#039;always&#039;&#039; been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, racialisation finds novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. No doubt, residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience. Here, however, I adopt a different orientation, one that specifically examines the less considered role of data-driven technologies that increasingly inscribe racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Indeed, predictive analytics range across a wide spectrum of sociality. Health care algorithms employed to predict and rank patient care favour white patients over black (Obermeyer 2019), automated welfare eligibility calculations keep the racialised poor from accessing state-funded resources (Rao 2019; Toos 2021), and credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Informed by the “creeping” role of prediction and subsequent “zones of suspicion,” I consider how racial epistemology is actively reconstructed and reified within the scalar magnitude of ‘big data’. This article focuses on racialisation as it is bound up in the historical production of blackness in the American context, though I will touch more broadly on the ways in which big data is reframing the categories upon which former racial classifications rest. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations,&#039;&#039; that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. To do this, I build upon analysis of this phenomenon in the context of predictive policing, where analytically derived “patrol zones” produce virtual barriers that map new categories of human difference through statistical inferences of risk. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== The Scales of Euclidean Anatomy ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred on taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards biological hierarchisation of racial types. Nevertheless, it inaugurated a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, the emergence of a new kind of racial scale: what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting, and evaluating” – a trend that could only move further inwards and downwards under the surface of the skin (844).&lt;br /&gt;
&lt;br /&gt;
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the science of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
Now twenty years into the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, our perceptual regime has been fundamentally altered by exponential advancements in digital technology. Developments across computational, biological, and analytic sciences produce new forms of perceptual scale, and with them, as Gilroy suggests, open up the possibility of envisioning the end of race as we know it. Writing in the late 1990s, Gilroy observed how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI], and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy proposes, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was discredited as a scientifically valid biological construct. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy discerns, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once overdetermined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
Rehearsing this argument is not meant to suggest that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. Gilroy (“Race and Racism in ‘The Age of Obama’”), along with his critics, makes clear that the “normative potency” of biological racism retains rhetorical and semiotic force within contemporary culture. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently insecure and can be made to yield, politically and culturally, to alternative visions of non-racialism. To combat the emergent racism of the present, this vision suggests, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy directs attention to the task of doing “a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839).&lt;br /&gt;
&lt;br /&gt;
Attending to these tasks of intervention requires that we keep in mind the myriad ways in which the residual traces left by older racial regimes subtly insinuate themselves into the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth century ambition to dissect the body, and thus lay bare its hidden truths, also “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remain an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== ‘Racial formations as data formations’ ===&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but also of entire domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Their critics, however, have shown this assumption to be outright false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; (2018) highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes – garbage in, garbage out. Demands for inclusion or “unbiased data”, however, often fail to address the racialised dialectic between inside and outside, human and Other. As Ramon Amaro argues, “to merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic catalyses disruption superficially” (53). From this perspective, the racial other is positioned in opposition to the prototypical classification, which is whiteness, and is thus seen as “alienated, fragmented, and lacking in comparison” (Amaro 53). If the end goal is inclusion, Amaro follows, what about a right of refusal of representation? This question is especially pertinent in a context where inclusion also means exposure to heightened forms of surveillance for racialised communities, particularly in the context of policing (Lee and Chin).&lt;br /&gt;
&lt;br /&gt;
Relatedly, the language of bias, inclusion, and exclusion does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Than and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Than and Wark claim that racialism is no longer solely predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;. That is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Than and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they are producing novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade, “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather, the production of “categories that render disposable populations disposable to violence” (Lloyd 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Nonetheless, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”, but rather reconstitutes itself in new ways (6).
In the next section, I turn to predictive policing technology to parse the ways in which data regimes are mapping new terrains upon which racial formations are produced and sustained.&lt;br /&gt;
&lt;br /&gt;
=== The Problem of Prediction: Data-led policing in the U.S ===&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small-scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high-crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before. The analytically derived “impact zone” can thus be understood as a bordering technology – one that sorts and divides civilian populations from those marked by higher probabilities of risk, and thus suspicion.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which black communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out,’ Than and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively produced not merely through historical data, but in the correlative models themselves (Lloyd 2). While these statistically generated “patrol zones” tend to map onto historically racialised communities, this process of racialisation does not necessarily correspond to the visual, or phenotypic, signifiers of race. What emerges in these correlative models are novel kinds of classifications that arise from probabilistic inferences of suspicion through which subjects – often racial minorities – are exposed to heightened surveillance and violence. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). The question remains: as neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, what becomes of the body in this post-visual shift?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;2015&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This provocation returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the work, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of evidentiary image-making, and visuality more broadly, is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the moving image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander’s camera, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Brian Winston notes, “what was beyond the frame containing ‘the recorded seen-and-heard’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes that implicitly inform the surveillance context are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;. &#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
Here, as &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveillant apparatus. The unadorned message: race is produced and sustained as a product of data.[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work’s aesthetic intervention interrogates the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This visceral reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that the primordial, objectifying force of visual evidence is transcended by neutral-seeming post-visual data apparatuses. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth and nineteenth century colonial and ethnographic regimes of physiological classification that aimed to expose a certain truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the evidentiary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of moving images. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is a prime example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== Conclusion ===&lt;br /&gt;
&lt;br /&gt;
This article explores the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I show that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies afford potentially new ways of seeing the body, and subsequently, of conceiving humanity at novel scales detached from the visual. Similar insights inform Than and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. In the context of the ongoing logics of contemporary racialism, American Artist’s &#039;&#039;2015&#039;&#039; returns consideration to the ways in which residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, Artist’s video work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. In this instance, American Artist orchestrates one critical means to grasp racialism’s multiple forms, past and present, visual and otherwise, towards future modalities and determinations not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
Amaro, Ramon. &#039;&#039;The Black Technical Object: On Machine Learning and the Aspiration of Black Being&#039;&#039;. Sternberg Press, 2023.&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. MIT Press, 2021.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race and Racism in the Age of Obama.” The Tenth Annual Eccles Centre for American Studies Plenary Lecture, British Association for American Studies Annual Conference, 2013.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Translated by Laurent Dubois, Duke University Press, 2017.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, vol. 366, no. 6464, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, Mark A. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin, Duke University Press, 2019, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1666</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1666"/>
		<updated>2023-04-17T13:48:12Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: /* Scaling Up, Scaling Down: Racialism in the Age of Big Data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data compares crime frequencies in 2015 with those of the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on an abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing software.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages in a practice of ambivalent play with the visibility and erasure of black people in the art world and beyond. Their multimedia works develop forms of cultural critique that engage systems of control, blackness, and networked culture. Foregrounding the analytic means through which data processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and the “truths” assigned to them in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio (2021), scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black patients (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039;, that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of Enlightenment thought – saw the evolution of a thread of natural scientific thinking centred on taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method to shape and inform racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types, but it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have shown, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, and the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasises the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, nineteenth-century ambitions to dissect the body, and thus reveal its hidden truths, “reveal a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualising, securitising and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. 
While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remain an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of everyday domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective than the biased and error-prone tendencies of human decision-making. Their critics, however, have shown this assumption to be outrightly false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: that is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators such as language, ‘likes’, and IP address, Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predication which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither Blackness nor race “has ever been fixed”; rather, each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led policing in the U.S.&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history’. Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fallout from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically, and contemporaneously, subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critical cycle of ‘garbage in, garbage out’, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This question returns us to American Artist’s video installation &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car places the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in documentary cinema and visuality more broadly, is always implicated in the politics of what exists beyond its view, interrogating the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander’s recording, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth” but, rather, on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who and what come into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – that is, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumptions of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, conceiving humanity in novel scales detached from the visual. Similar insights inform Than and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions.” Finance and Economics Discussion Series 2022-067. Washington: Board of Governors of the Federal Reserve System, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory : a Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety”, &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press, 2013.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting racial bias in an algorithm used to manage the health of populations.” &#039;&#039;Science&#039;&#039;, 2019, pp. 447-453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666-676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Viral Justice: How We Grow the World We Want&#039;&#039;, edited by Ruha Benjamin. Princeton University Press, 2022, pp. 106-129.&lt;br /&gt;
&lt;br /&gt;
Than, Thao, and Scott Wark. “Racial formations as data formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, 2021, vol. 8, no. 2, pp. 1-5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative”. &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow. John Wiley &amp;amp; Sons, 2015, pp. 611-628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880-1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1665</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1665"/>
		<updated>2023-04-17T13:42:42Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: /* Scaling Up, Scaling Down: Racialism in the Age of Big Data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a surveillance camera mounted on the dashboard of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on an abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing software.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black people in the art world and beyond. Their multimedia works explore forms of cultural critique that critically engage systems of control, blackness, and networked culture. Foregrounding the analytic means through which data-processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black ones (Obermeyer et al. 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Than and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039; – that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1) – I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of Enlightenment thought – saw the evolution of a thread of natural scientific thinking centred around taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types, but it nevertheless signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale: what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. Their study concluded that there is no scientific evidence that supports the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have evinced, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasizes the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus reveal its hidden truths, “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. 
While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of entire domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be outrightly false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Than and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Than and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: that is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now-disabled ‘ethnic affinity’ function – which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Than and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us”, though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predication which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither Blackness nor race “has ever been fixed”; each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led Policing in the U.S.&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but also uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
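&lt;br /&gt;
Brayne’s feedback loop can be illustrated with a deliberately minimal sketch. This is a hypothetical toy model, not the code of any actual predictive policing product: patrols are dispatched to whichever zone has the highest recorded crime count, and the extra presence inflates that zone’s record, so an initially negligible difference compounds into a stark recorded disparity.&lt;br /&gt;

```python
# Toy model (hypothetical) of Brayne's "self-fulfilling statistical prophecy".
# Assumption: each round, patrols go to the zone with the highest recorded
# count, and added presence yields added recorded detections in that zone.
def simulate(history, rounds=10, detect_rate=0.3):
    counts = dict(history)  # recorded (not actual) crime per zone
    for _ in range(rounds):
        hot = max(counts, key=counts.get)               # deploy to the "hot spot"
        counts[hot] += max(1, int(counts[hot] * detect_rate))
    return counts

# Two zones with near-identical records: the gap compounds round after round.
print(simulate({"zone_a": 10, "zone_b": 9}))
```

Under these assumptions, zone_a’s one-unit head start monopolises every subsequent deployment while zone_b’s record never changes – the recorded data, rather than any difference in underlying behaviour, drives the divergence.&lt;br /&gt;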
&lt;br /&gt;
Beyond this ‘garbage in, garbage out’ critique, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are being actively produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This question returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of documentary cinema, and visuality more broadly, is always implicated in the politics of what exists beyond its view, thus interrogating the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing ‘the recorded seen-and-heard’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that interrogates the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – that is, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, conceiving humanity in novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. 2015. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory : a Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety”. &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. 2018. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. 2013. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. 2011. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism”. &#039;&#039;Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life&#039;&#039;, edited by Ruha Benjamin. Duke University Press, 2019, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative”. &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow. John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1664</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1664"/>
		<updated>2023-04-17T13:37:11Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: /* Scaling Up, Scaling Down: Racialism in the Age of Big Data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on an abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing software.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black art practice. Their multimedia works explore forms of cultural critique that stage histories of systems of control in relation to blackness and networked culture. Foregrounding analytic means through which data-processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As the sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black patients (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039; – that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1) – I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of Enlightenment thought – saw the evolution of a thread of natural scientific thinking centered around taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt;-century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types, but it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the nineteenth century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. Their study concluded that there is no scientific evidence that supports the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have evinced, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasizes the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus reveal its hidden truths, “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices that scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ insofar as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik) and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective than the biased and error-prone tendencies of human decision-making. Their critics, however, have shown this assumption to be outrightly false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures that underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creators, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications, emerging not from data inputs but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing on examples such as Facebook’s now disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators such as language, ‘likes’, and IP address, Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications that often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation that have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”; rather, each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led policing in the U.S&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for the entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out of de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence, from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology&#039;&#039;. By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor to PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, the input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
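The feedback loop Brayne describes can be made concrete in a short, purely illustrative simulation. The numbers and the proportional-deployment rule below are my own assumptions for the sake of the sketch, not any vendor’s actual model:&lt;br /&gt;

```python
# Illustrative sketch (hypothetical numbers, not any vendor's model): a minimal
# simulation of the "self-fulfilling statistical prophecy" Brayne describes.
# Two districts have identical true offence rates, but the historical record
# is skewed toward district 0; patrols are allocated in proportion to recorded
# crime, and detections scale with patrol presence, feeding the next forecast.
true_rate = [10.0, 10.0]      # identical actual offences per period
recorded = [30.0, 10.0]       # historical records skewed toward district 0
for period in range(10):
    total = sum(recorded)
    patrols = [r / total for r in recorded]   # deployment follows the record
    for d in (0, 1):
        detected = true_rate[d] * patrols[d]  # more patrols, more detections
        recorded[d] += detected               # detections become next period's "data"
share = recorded[0] / sum(recorded)
print(f"share of all recorded crime attributed to district 0: {share:.2f}")
# the initial skew in the record never washes out, despite identical true rates
```

Even in this toy model, district 0’s share of the record remains fixed at its historically skewed level no matter how many periods elapse: the data confirms the deployment that the data itself produced.&lt;br /&gt;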
&lt;br /&gt;
Beyond this cycle of ‘garbage in, garbage out’, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This question returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world here is captured from a single axis, a singular ideological vantage point, as an already known cityscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in documentary cinema and the study of visuality more broadly, is always implicated in the politics of what exists beyond its view, and thereby calls into question the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander’s recording, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing ‘the recorded seen-and-heard’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, the other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, predictive analytic technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racialising history of policing data, the racial body is recoded not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
 [[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics as their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racialising formats interlink and reinforce each other. By evidencing the racial body on one hand as a product of data, and on the other as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, American Artist challenges and disrupts the documentary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racialising forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinction between “fact” and “fiction” has produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, for conceiving humanity at novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises the ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press, 2013.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, 2019, pp. 447-453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666-676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Viral Justice: How We Grow the World We Want&#039;&#039;, edited by Ruha Benjamin, Princeton University Press, 2022, pp. 106-129.&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1-5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611-628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880-1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1663</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1663"/>
		<updated>2023-04-17T11:20:19Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data compares crime frequencies in 2015 with those of the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the locations of imminent crimes. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on a seemingly abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc traces the circuitous course of a police patrol car navigated by the machinations of predictive policing technology.&lt;br /&gt;
&lt;br /&gt;
Working at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages in a practice of ambivalent play with the visibility and erasure of black art practice. Their multimedia works explore forms of cultural critique that stage histories of systems of control in relation to blackness and networked culture. Foregrounding the analytic means through which data processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black ones (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039;, that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural-scientific thinking centred around taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt;-century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method to shape and inform racist differentiations well into the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types, but it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantics to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale: what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement had pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across the computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as nuclear magnetic resonance spectroscopy [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene-sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have shown, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism remains obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasises the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus reveal its hidden truths, “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualising, securitising, and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within the extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), warfare (Berman), criminal justice (Završnik), and politics (Macnish and Galliot), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective than the biased, error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; (2018) highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google. Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset’s or algorithm’s creators, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications that emerge not from data inputs, but within correlative models themselves. As Thao Phan and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Phan and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: that is, racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now-disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators such as language, ‘likes’, and IP address, Phan and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
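The claim that classification can racialise without any explicit race field can be sketched computationally. The following is a minimal, hypothetical illustration, not a reconstruction of Facebook’s actual ‘ethnic affinity’ system: all records, proxy fields, and groupings are invented, and the point is only that a sorting procedure operating on proxies alone still partitions users into ‘affinity’ classes.

```python
# A toy sketch of proxy-based classification: no record carries an explicit
# "race" field, yet grouping on a proxy attribute (here, an invented
# "postcode") still yields distinct 'affinity' classes. All data is made up.

from collections import defaultdict

# Synthetic user records containing only behavioural proxies.
users = [
    {"id": 1, "postcode": "11212", "likes": ["radio_A"]},
    {"id": 2, "postcode": "11212", "likes": ["radio_A", "shop_B"]},
    {"id": 3, "postcode": "10583", "likes": ["shop_C"]},
    {"id": 4, "postcode": "10583", "likes": ["radio_D"]},
]

def cluster_by_proxy(records, proxy="postcode"):
    """Group users purely on a proxy attribute: the output classes are
    data formations, not stated identities."""
    clusters = defaultdict(list)
    for record in records:
        clusters[record[proxy]].append(record["id"])
    return dict(clusters)

print(cluster_by_proxy(users))  # two 'affinity' classes emerge from the proxy alone
```

The classifier never "sees" race; the partition it produces may nonetheless track racialised geography wherever the proxy correlates with it.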
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century racialism is circumscribed by differential relations of human value determined by the global capitalist order. Yet these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”; rather, each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led policing in the U.S&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism – old and new, visual and post-visual, large- and small-scale – play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for the entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Spurred by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fallout of de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence, from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology&#039;&#039;. By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor to PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models”, such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
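Scannell’s description of criminogenic weights assigned to environmental proxies can be glossed as a simple weighted scoring of map cells. The sketch below is a toy under stated assumptions, not HunchLab’s actual algorithm: the feature names, the weights, and the linear form are all invented for illustration.

```python
# A hedged sketch of proxy-weighted crime forecasting: each geographic cell
# is scored as a weighted sum of environmental features, and cells are then
# ranked for patrol. Feature names and coefficients below are hypothetical.

def risk_score(cell_features, weights):
    """Linear combination of proxy features for one map cell."""
    return sum(weights.get(name, 0.0) * value
               for name, value in cell_features.items())

weights = {                      # learned coefficients in a real system; assumed here
    "takeout_restaurants": 0.4,
    "bus_stops": 0.2,
    "prior_incidents": 1.5,
    "temperature_anomaly": 0.1,
}

cell = {"takeout_restaurants": 3, "bus_stops": 2,
        "prior_incidents": 4, "temperature_anomaly": 1.0}

print(round(risk_score(cell, weights), 2))  # → 7.7
```

Note that no human subject appears anywhere in the computation: the model scores environments, and people become "incidental" inputs by virtue of where they stand.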
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, the input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
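The ‘self-fulfilling statistical prophecy’ Brayne describes is, at bottom, a feedback loop, and its compounding effect can be shown with a toy simulation. The sketch below assumes an invented two-zone city, a proportional patrol-allocation rule, and a detection rate that scales with patrol presence; none of the numbers come from any real deployment.

```python
# A minimal simulation of the predictive-policing feedback loop: patrols are
# allocated in proportion to *recorded* crime, and recorded crime in turn
# grows with patrol presence -- so an initial disparity in the historical
# data compounds round after round. All parameters are illustrative.

def simulate(recorded, rounds=5, patrols=10, detection=0.1):
    """recorded: initial recorded-crime counts per zone (the 'historical data')."""
    recorded = list(recorded)
    for _ in range(rounds):
        total = sum(recorded)
        # deploy patrols proportionally to each zone's share of recorded crime
        deployed = [patrols * count / total for count in recorded]
        # more officers present -> more incidents detected and logged
        recorded = [count + officers * detection * count
                    for count, officers in zip(recorded, deployed)]
    return recorded

final = simulate([60, 40])   # zone A starts with a modest data disparity
print(final[0] / final[1])   # the gap between the zones widens every round
```

Nothing in the loop refers to who lives in either zone; the disparity is amplified by the allocation rule alone, which is precisely why a ‘neutral’ algorithm can reproduce a racialised geography of policing.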
Beyond this ‘garbage in, garbage out’ critique, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, the distribution of schools or restaurants, weather, and more. Such technologies demonstrate how the differential valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This question returns us to American Artist’s video installation &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world is captured here from a single axis, a singular ideological vantage point, as an already-known cityscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of documentary cinema and visuality more broadly, is always implicated in the politics of what exists beyond its view, calling into question the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing ‘the recorded seen-and-heard’ was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a ‘frame’ – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, the other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere, from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racialising history of policing data, the racial body is recoded not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
[[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialised subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics as their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims” (110). It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects – an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on the one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, American Artist challenges and disrupts the documentary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist documentary format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen and heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, conceiving humanity in novel scales detached from the visual. Similar insights inform Than and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises the ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039;, vol. 25, no. 3, 2006, pp. 336–351.&lt;br /&gt;
&lt;br /&gt;
Berman, Eli et al. &#039;&#039;Small Wars, Big Data: The Information Revolution in Modern Conflict&#039;&#039;. Princeton University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions.” &#039;&#039;Finance and Economics Discussion Series&#039;&#039; 2022-067, Board of Governors of the Federal Reserve System, 2022.&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Browne, Simone. &#039;&#039;Dark Matters: On the Surveillance of Blackness.&#039;&#039; Durham: Duke University Press, 2015.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021.&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
Laney, Doug. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” &#039;&#039;Gartner&#039;&#039;, File No. 949, 6 February 2001, &amp;lt;nowiki&amp;gt;http://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf&amp;lt;/nowiki&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039;, vol. 21, no. 5, 1998, pp. 838–847.&lt;br /&gt;
&lt;br /&gt;
Jefferson, Brian. &#039;&#039;Digitize and Punish: Racial Criminalization in the Digital Age&#039;&#039;. Minneapolis: University of Minnesota Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Lloyd, David. &#039;&#039;Under Representation: The Racial Regime of Aesthetics.&#039;&#039; New York: Fordham University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Macnish, Kevin, and Jai Galliott, editors. &#039;&#039;Big Data and Democracy&#039;&#039;. Edinburgh University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Mbembe, Achille. &#039;&#039;Critique of Black Reason.&#039;&#039; Durham: Duke University Press, 2013.&lt;br /&gt;
&lt;br /&gt;
Melamed, Jodi. &#039;&#039;Represent and Destroy: Rationalizing Violence in the New Racial Capitalism.&#039;&#039; Minneapolis: University of Minnesota Press, 2011.&lt;br /&gt;
&lt;br /&gt;
Noble, Safiya Umoja. &#039;&#039;Algorithms of Oppression: How Search Engines Reinforce Racism&#039;&#039;. New York University Press, 2018.&lt;br /&gt;
&lt;br /&gt;
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” &#039;&#039;Science&#039;&#039;, 2019, pp. 447–453.&lt;br /&gt;
&lt;br /&gt;
O’Neil, Cathy. &#039;&#039;Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.&#039;&#039; Allen Lane, 2016.&lt;br /&gt;
&lt;br /&gt;
Rothstein, M. “Big Data, Surveillance Capitalism, and Precision Medicine: Challenges for Privacy.” &#039;&#039;Journal of Law, Medicine &amp;amp; Ethics&#039;&#039;, vol. 49, no. 4, 2021, pp. 666–676.&lt;br /&gt;
&lt;br /&gt;
Saini, Angela. &#039;&#039;Superior: The Return of Race Science&#039;&#039;. Beacon Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Scannell, R. Joshua. “This Is Not Minority Report: Predictive Policing and Population Racism.” &#039;&#039;Viral Justice: How We Grow the World We Want&#039;&#039;, edited by Ruha Benjamin, Princeton University Press, 2022, pp. 106–129.&lt;br /&gt;
&lt;br /&gt;
Than, Thao, and Scott Wark. “Racial Formations as Data Formations.” &#039;&#039;Big Data &amp;amp; Society&#039;&#039;, vol. 8, no. 2, 2021, pp. 1–5.&lt;br /&gt;
&lt;br /&gt;
Vogl, Joseph. &#039;&#039;Le spectre du capital&#039;&#039;. Diaphanes, 2013.&lt;br /&gt;
&lt;br /&gt;
Winston, Brian. “Surveillance in the Service of Narrative.” &#039;&#039;A Companion to Contemporary Documentary Film&#039;&#039;, edited by Alexandra Juhasz and Alisa Lebow, John Wiley &amp;amp; Sons, 2015, pp. 611–628.&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
Završnik, Aleš. “Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings.” &#039;&#039;European Journal of Criminology&#039;&#039;, vol. 18, no. 5, 2021, pp. 623–642.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1662</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1662"/>
		<updated>2023-04-17T11:11:58Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
[[File:FORECASTING.jpg|thumb|586x586px|Figure 1: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers the frequency of crime in 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on a seemingly abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing technology.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black art practice. Their multimedia works explore forms of cultural critique that stage histories of systems of control in relation to blackness and networked culture. Foregrounding analytic means through which data-processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black patients (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Than and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039; – that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1) – I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred around taxonomical hierarchies of human anatomy. 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards the biological hierarchisation of racial types; nonetheless, it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have shown, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasizes the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus reveal its hidden truths, “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. 
While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remain an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), insurance, criminal justice (Završnik), warfare (Berman), and politics (Macnish and Galliott), data acquisition and analytics increasingly form the bedrock of not only the global economy, but of entire domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Critics, however, have shown this assumption to be outrightly false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspice of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Than and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Than and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address, Than and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they are producing novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predication which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather, the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither Blackness nor race has ever been fixed; each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led policing in the U.S&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
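The feedback loop Brayne describes can be made concrete with a minimal simulation – a toy model with invented numbers, not PredPol’s or any vendor’s actual algorithm. Two zones are given identical underlying crime, but the zone with the larger recorded history receives proportionally more patrols, so detections there are higher, and the recorded disparity confirms itself year after year.

```python
# Toy model of Brayne's "self-fulfilling statistical prophecy".
# Assumptions (invented for illustration): two zones with IDENTICAL
# underlying crime; patrols are allocated in proportion to recorded
# history; detections scale with patrol presence.
INCIDENTS_PER_YEAR = 100            # same true crime volume in each zone
recorded = [300.0, 200.0]           # zone A starts with more recorded crime

for year in range(10):
    total = recorded[0] + recorded[1]
    detected = []
    for z in range(2):
        patrol_share = recorded[z] / total     # zone flagged "high risk" gets more patrols
        detected.append(INCIDENTS_PER_YEAR * patrol_share)
    for z in range(2):
        recorded[z] += detected[z]             # detections feed next year's input data

share_a = recorded[0] / (recorded[0] + recorded[1])
print(round(share_a, 3))   # zone A's recorded share never corrects toward 0.5
```

Despite identical underlying rates, the initial 60/40 split in recorded crime reproduces itself indefinitely: the prediction manufactures its own confirmation.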
&lt;br /&gt;
Beyond this ‘garbage in, garbage out’ critique, Than and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
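The proxy argument can likewise be sketched computationally – again a deliberately artificial toy, with invented features, probabilities, and weights. A risk score computed only from proxies such as a postcode flag and prior police contact reproduces a group disparity even though no racial category ever enters the model.

```python
# Toy proxy model: no "race" feature exists, yet the score racialises.
# Invented assumption: historical over-policing makes the proxies
# (flagged postcode, prior police contact) fire more often for group "b".
import random

random.seed(1)

def proxy(p):
    return 1 if p > random.random() else 0     # Bernoulli(p)

def make_record(group):
    p = 0.8 if group == "b" else 0.2           # correlation, not causation
    return (proxy(p), proxy(p))                # (postcode_flag, prior_contact)

def risk_score(record):
    postcode_flag, prior_contact = record
    return 2 * postcode_flag + prior_contact   # arbitrary illustrative weights

averages = {}
for group in ("b", "w"):
    scores = [risk_score(make_record(group)) for _ in range(1000)]
    averages[group] = sum(scores) / len(scores)

print(averages)   # group "b" scores markedly higher despite no race input
```

The model never sees a racial label; the disparity in scores emerges entirely from the skewed proxy distributions, which is precisely how racialisation persists “in the absence of explicit racial categories”.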
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
[[File:Grid.jpg|thumb|736x736px|Figure 2: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
This question returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of documentary cinema, and visuality more broadly, is always implicated in the politics of what exists beyond its view, calling into question the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dashcam, or a civilian bystander’s recording, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but, rather, on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
 [[File:Scan.jpg|thumb|1208x1208px|Figure 3: American Artist, still from &#039;&#039;2015,&#039;&#039; 2019, Single-channel HD video, 21:38 minutes.]]&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims”. It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, a phenomenon that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data, and on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, conceiving humanity in novel scales detached from the visual. Similar insights inform Than and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:FORECASTING.jpg&amp;diff=1661</id>
		<title>File:FORECASTING.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:FORECASTING.jpg&amp;diff=1661"/>
		<updated>2023-04-17T11:11:22Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Figure 1: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:Scan.jpg&amp;diff=1660</id>
		<title>File:Scan.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:Scan.jpg&amp;diff=1660"/>
		<updated>2023-04-17T11:09:09Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Figure 3: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:Grid.jpg&amp;diff=1659</id>
		<title>File:Grid.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:Grid.jpg&amp;diff=1659"/>
		<updated>2023-04-17T11:07:26Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Figure 2: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1658</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1658"/>
		<updated>2023-04-17T11:03:23Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers crime frequencies for 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on a seemingly abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing technology.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black art practice. Their multimedia works explore forms of cultural critique that stage histories of systems of control in relation to blackness and networked culture. Foregrounding analytic means through which data-processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black ones (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation, &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Than and Scott Wark’s conception of &#039;&#039;racial formations as data formations,&#039;&#039; that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred around taxonomical hierarchies of human anatomy. 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards biological hierarchisation of racial types; nonetheless, it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantic to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale; what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844).
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. Its findings concluded that there is no scientific evidence supporting the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have shown, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasizes the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, nineteenth-century ambitions to dissect the body, and thus reveal its hidden truths, “reveal a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection.
While the anatomical scales of racial discourse have not been wholly untethered from the body, how race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), insurance, criminal justice (Ales), warfare (Berman), and politics (Macnish and Galliot), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of entire domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Their critics, however, have shown this assumption to be outright false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset or algorithm creator, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes.&lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Than and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Than and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now disabled ‘ethnic affinity’ function – which classed users by race simply by analysing their behavioural data and proxy indicators, such as language, ‘likes’, and IP address – Than and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt;-century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, “[n]either Blackness nor race has ever been fixed”; each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The Problem of Prediction: Data-led Policing in the U.S.&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late-nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fall-out from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase on the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122).&lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically and contemporaneously subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this ‘garbage in, garbage out’ critique, Phan and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation?&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This question returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of documentary cinema, and visuality more broadly, is always implicated in the politics of what exists beyond its view, thus interrogating the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to a stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that probes the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded, not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who and what comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data.&lt;br /&gt;
&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims”. It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body, on one hand, as a product of data and, on the other, as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumption of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification.&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;Conclusion&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are opening up new ways of seeing the body, and subsequently, of conceiving humanity in novel scales detached from the visual. Similar insights inform Phan and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;br /&gt;
&lt;br /&gt;
=== Works Cited ===&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=File:2015_My-Blue-Window-Composition.00_00_02_01.Still059-1.jpg&amp;diff=1657</id>
		<title>File:2015 My-Blue-Window-Composition.00 00 02 01.Still059-1.jpg</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=File:2015_My-Blue-Window-Composition.00_00_02_01.Still059-1.jpg&amp;diff=1657"/>
		<updated>2023-04-17T10:59:31Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Figure 1: American Artist, still from 2015, 2019, Single-channel HD video, 21:38 minutes.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1656</id>
		<title>Toward a Minor Tech:CRICHLOW5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:CRICHLOW5000&amp;diff=1656"/>
		<updated>2023-04-17T10:58:16Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Created page with &amp;quot; Category:Toward a Minor Tech Category:5000 words &amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:5000 words]]&lt;br /&gt;
&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow5000&amp;diff=1655</id>
		<title>Toward a Minor Tech:Crichlow5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow5000&amp;diff=1655"/>
		<updated>2023-04-17T10:57:29Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;nowiki&amp;gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CRICHLOW5000&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/nowiki&amp;gt;&amp;lt;nowiki&amp;gt;&amp;lt;/div&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of Big Data ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015,&#039;&#039; a 21-minute video installation shown at American Artist’s 2019 multimedia solo exhibition &#039;&#039;My Blue Window&#039;&#039; at the Queens Museum in New York City, assumes the point of view of a dashboard surveillance camera positioned on the hood of a police car cruising through Brooklyn’s side streets and motorways. Superimposed in the corner of the vehicle’s front windshield, a continuous flow of data registers crime frequencies for 2015 against the preceding year: “Murder, 2015: 5, 2014: 7. Percent change: -28.6%”. As the car hurtles down the freeway, the word &#039;&#039;&#039;“FORECASTING”&#039;&#039;&#039; is projected onto the windshield, followed by the appearance of a grid-like navigation system. The vehicle suddenly changes course, veering onto an exit towards a series of blinking ‘hot spots’ algorithmically identified as the location of an imminent crime. Over the deafening din of a police siren, the car reaches its destination and slows to a stop on a seemingly abandoned street as the words “&#039;&#039;&#039;CRIME DETERRED&#039;&#039;&#039;” repetitively pulse across the screen. This narrative arc circuitously captures the meandering course of a police patrol car navigated by the machinations of predictive policing technology.&lt;br /&gt;
&lt;br /&gt;
Located at the intersection of race, technology, and knowledge production, American Artist—a name they legally adopted in 2013—engages a practice of ambivalent play with the visibility and erasure of black art practice. Their multimedia works explore forms of cultural critique that stage histories of systems of control in relation to blackness and networked culture. Foregrounding analytic means through which data-processing and algorithms augment and amplify racial violence against black and brown bodies, American Artist’s &#039;&#039;2015&#039;&#039; interweaves fictional narrative and coded documentary footage, constructing an experimental documentary form that ruminates on racialised spaces and bodies and their assigned “truths” in our surveillance culture.&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing entrenches racial inequalities through processes indiscernible to the human eye, &#039;&#039;2015&#039;&#039; ultimately evokes a question of scale. Following Joshua DiCaglio, scale is invoked here as a mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. As the sociologist and cultural studies scholar Paul Gilroy observes, race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of the human against an imagined nonhuman ‘other’. In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, racialisation is finding novel lines of emergence in evolving technological formats less constrained by the perceptual and scalar codes of a former racial era. While residual patterns of racialisation at the scale of the individual body remain entrenched in everyday experience, analytic surveillance technologies are increasingly inscribing racialisation as a large-scale function of datafication.&lt;br /&gt;
&lt;br /&gt;
Predictive policing technology, for example, relies on the accumulation of data to construct zones of suspicion through which racialised bodies are disproportionately rendered hyper-visible and subject to violence (Brayne 2020; Chun 2021). Health care algorithms used to predict and rank patient care favour white patients over black patients (Obermeyer 2019). Automated welfare eligibility calculations keep the racialised poor from accessing state resources (Rao 2019; Toos 2021). Credit-market algorithms widen already heightened racial inequalities in home ownership (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing often produces racialising outputs that, at first glance, appear neutral.&lt;br /&gt;
&lt;br /&gt;
Through analysis of American Artist’s video installation &#039;&#039;2015&#039;&#039;, this paper considers how racial epistemology is actively being reconstructed and reified within the scalar magnitude of ‘big data’. Following Paul Gilroy’s historical periodisation of racism as a scalar project of the body that moves simultaneously inwards and downwards towards molecular scales of corporeal visibility, I ask how ‘big data’ now exerts upwards and outwards pressures into a globalised regime of datafication, particularly in the context of predictive policing technology. Drawing from Thao Phan and Scott Wark’s conception of &#039;&#039;racial formations as data formations&#039;&#039;, that is, “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1), I explore the stakes and possibilities for dismantling racialism when the body is no longer its singular referent. I conclude by returning to analysis of American Artist’s &#039;&#039;2015&#039;&#039; as an example of emergent artistic intervention that reframes the scales upon which algorithmic regimes of domination are being produced and resisted.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== &#039;&#039;&#039;The scales of Euclidean anatomy&#039;&#039;&#039; ===&lt;br /&gt;
&lt;br /&gt;
The story of racism, as Paul Gilroy tells it, moves simultaneously inwards and downwards into the contours of the human body. The onset of modernity – defined by early colonial contact with indigenous peoples and the expansion of European empires, the trans-Atlantic slave trade, and the emergence of enlightenment thought – saw the evolution of a thread of natural scientific thinking centred on taxonomical hierarchies of human anatomy. The 18&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt;-century naturalist Carl Linnaeus’s major classificatory work, &#039;&#039;Systema Naturae&#039;&#039; (1735), is widely recognised as the most influential taxonomic method that shaped and informed racist differentiations well into the nineteenth century and beyond. Linnaeus’s framework did not yet mark a turn towards biological hierarchisation of racial types; nonetheless, it signalled a new epoch of race science that would collapse and order human variation into several fixed and rigid phenotypic prototypes. By the onset of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the racialised body took on new meaning as the terminology of race slid from a polysemous semantics to a narrower signification of hereditary, biological determinism. In this shift from natural history to the biological sciences, Gilroy notes a change in the “modes and meanings of the visual and the visible”, and thus, a new kind of racial scale: what he terms &#039;&#039;the scale of comparative or Euclidean anatomy&#039;&#039; (844). This shift in scalar perceptuality is defined by “distinctive ways of looking, enumerating, measuring, dissecting and evaluating”; a trend that could only move further inwards and downwards under the surface of the skin (844). 
By the middle of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, for example, the sciences of physiognomy, phrenology, and comparative anatomy had encoded racial hierarchies within the physiological semiotics of skulls, limbs, and bones. By the early 20&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century, the eugenics movement pushed the science of racial discourse to ever smaller scopic regimes. Even the microscopic secrets of the blood became subject to racial scrutiny through the language of genetics and heredity.&lt;br /&gt;
&lt;br /&gt;
In the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century, however, things are different. Our perceptual regime has been forever altered by the revolution in digital technologies. Developments across computational, biological, and analytic sciences signal a new shift in perceptual scale, and with it, as Gilroy suggests, the end of race as we know it. Writing in the late 1990s, Gilroy observes how technical advancements in imaging technologies, such as the nuclear magnetic resonance spectroscope [NMR/MRI] and positron emission tomography [PET], “have remade the relationship between the seeable and the unseen” (1998, 846). By imaging the body in new ways, Gilroy claims, emergent technologies that allow the body to be viewed on increasingly minute scales “impact upon the ways that embodied humanity is imagined and upon the status of bio-racial differences that vanish at these levels of resolution” (846). This scalar movement ever inwards and downwards became especially evident in the advancements of molecular biology. Between 1990 and 2003, the Human Genome Project mapped the first human genome using new gene sequencing technology. The project concluded that there is no scientific evidence to support the idea that racial difference is encoded in our genetic material. Once and for all, or so we thought, race was disproved as a scientifically valid construct. As biological conceptions of race were belied by these breakthroughs in molecular biology, the perceptual regime to which racialism was once attached was ambivalently undone. In this scalar movement beyond &#039;&#039;Euclidean anatomy&#039;&#039;, as Gilroy claims, the body ceases to delimit “the scale upon which assessments of the unity and variation of the species are to be made” (845). In other words, we have departed from the perceptual regime that once determined who could be deemed ‘human’ at the scale of the body.&lt;br /&gt;
&lt;br /&gt;
This is not to say that racism has been eclipsed by innovations in technology, or that racial classifications do not remain persistently visible. As critics of Gilroy have shown, the language of biological racism is not obsolete. Efforts to resuscitate research into race’s biological basis continue to appear in scientific fields (Saini 2020), while the ongoing deaths of black people at the hands of police, or the increase in violent assaults against East Asian people during the coronavirus pandemic, demonstrate how racism is obstinately fixed within our visual regime. Gilroy suggests, however, that while the perceptual scales of race difference remain entrenched, these expressions of racialism are inherently residual. In order to combat the emergent racism of the present, we must look beyond the perceptual-anatomical scales of race difference that defined the modern episteme. Having “let the old visual signifiers of race go”, Gilroy argues, we can “do a better job of countering the racisms, the injustices, that they brought into being if we make a more consistent effort to de-nature and de-ontologize ‘race’ and thereby to disaggregate raciologies” (839). This logic, however, deemphasizes the myriad ways in which the residual traces of an older racial regime shape the functions of newly emergent ‘post-visual’ technologies. As Alexandra Hall and Louise Amoore observe, the nineteenth-century ambition to dissect the body, and thus reveal its hidden truths, “reveal[s] a set of violences, tensions, and racial categorizations which may be reconfigured within new technological interventions and epistemological frameworks” (451). Referencing contemporary virtual imaging devices which scan and render the body visible in the theatre of airport security, Hall and Amoore suggest that new ways of visualizing, securitizing and mapping the body draw upon the age-old racial fantasy of rendering identity fully transparent and knowable through corporeal dissection. 
While the anatomical scales of racial discourse have not been wholly untethered from the body, the ways in which race, or its 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century successor, is being rendered in new perceptual formats remains an urgent question.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;‘Racial formations as data formations’&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Beyond anatomical scales of race discourse, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. If the inward direction towards the hidden racial truths of the human body defined the logics and aesthetics of our former racial regime, how might we think about the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century avalanche of data and analytic technologies that increasingly determine life chances in an interconnected, yet deeply inequitable world? Can it be said that our current racial regime has reversed racialism’s inward march, now exerting upwards and outwards pressures into a globalised regime of ‘big data’?&lt;br /&gt;
&lt;br /&gt;
Big data, broadly understood, refers to data that is large in volume, high in velocity, and provided in a variety of formats from which patterns can be identified and extracted (Laney). “Big”, of course, evokes a sense of scalar magnitude. For data scholar Wolfgang Pietsch, “a data set is ‘big’ if it is large enough to allow for reliable predictions based on inductive methods in a domain comprising complex phenomena”. Thus, data can be considered ‘big’ in so far as it can generate predictive insights that inform knowledge and decision-making. Growing ever more prevalent across major industries such as medical practice (Rothstein), insurance, criminal justice (Ales), warfare (Berman), and politics (Macnish and Galliot), data acquisition and analytics increasingly form the bedrock not only of the global economy, but of whole domains of human experience.&lt;br /&gt;
&lt;br /&gt;
Big data technologies are often claimed to be more truthful, efficient, and objective compared to the biased and error-prone tendencies of human decision-making. Its critics, however, have shown this assumption to be outrightly false – particularly for people of colour. Safiya Noble’s &#039;&#039;Algorithms of Oppression&#039;&#039; highlights cases of algorithmically driven data failures which underscore the ways in which sexism and racism are fundamental to online corporate platforms like Google (2018). Cathy O’Neil’s &#039;&#039;Weapons of Math Destruction&#039;&#039; addresses the myriad ways in which big data analytics tend to disadvantage the poor and people of colour under the auspices of objectivity. Such critiques often approach big data through the lens of &#039;&#039;bias&#039;&#039; – either bias embedded in the views of the dataset’s or algorithm’s creators, or bias ingrained in the data itself. In other words, biased data will subsequently produce biased outcomes. &lt;br /&gt;
&lt;br /&gt;
This garbage in/garbage out model, however, does not account for the ways in which big data analytics are producing &#039;&#039;new&#039;&#039; racial classifications emerging not from data inputs, but within correlative models themselves. As Thao Than and Scott Wark suggest, “the application of inductive techniques to large data sets produces novel classifications. These classifications conceive us in new ways – ways that we ourselves are unable to see” (3). Following Gilroy’s idea that changes in perceptuality led by the technological revolution of the 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century require a reimagination of race, or a repudiation of it altogether, Than and Wark claim that racialism is no longer explicitly predicated on visual hierarchies of the body, but rather “emerges as an epiphenomenon of automated algorithmic processes of classifying and sorting operating through proxies and abstractions” (2). This phenomenon is what they term &#039;&#039;racial formations as data formations&#039;&#039;: racialisation shaped by the non-visible processing of data-generated proxies. Drawing from examples such as Facebook’s now-disabled ‘ethnic affinity’ function, which classed users by race simply by analysing their behavioural data and proxy indicators such as language, ‘likes’, and IP address, Than and Wark show “that in the absence of explicit racial categories, computational systems are still able to racialize us” – though this may or may not map onto what one looks like (3). While the datafication of racial formations may deepen already-present inequalities for people of colour, these formations have a much more pernicious function: the transformation of racial category itself.&lt;br /&gt;
&lt;br /&gt;
Can these emergent formations culled from the insights of big data be called ‘race’, or do we need a new kind of language to account for technologically induced shifts in racial perception and scale? Further, are processes of computational induction ‘racialising’ if they produce novel classifications which often map onto, but are not constrained by, previous racial categories? As Achille Mbembe notes, these questions must also be considered in the context of 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century globalisation and the encroachment of neoliberal logics into all facets of life, such that “all events and situations in the world of life can be assigned a market value” (Vogl 152). Our contemporary context of globalised inequality is increasingly predicated on what Mbembe describes as the ‘universalisation of the black condition’, whereby the racial logics of capture and predation which have shaped the lives of black people from the onset of the transatlantic slave trade “have now become the norm for, or at least the lot of, all of subaltern humanity” (Mbembe 4). Here, it is not the biological construct of race per se that is activated in the classifying logics of capitalism and emergent technologies, but rather the production of “categories that render disposable populations disposable to violence” (Lloyd 2018, 2). In other words, 21&amp;lt;sup&amp;gt;st&amp;lt;/sup&amp;gt; century racialism is circumscribed by differential relations of human value determined by the global capitalist order. However, these new classifications retain the pervasive logic of difference and division, reconfiguring the category of the disentitled, less-than-human Other in new formations. As Mbembe suggests, neither “Blackness nor race has ever been fixed”; rather, each reconstitutes itself in new ways (6).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;The Problem of Prediction: Data-led policing in the U.S&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Multiple vectors of racialism, both old and new, visual and post-visual, large and small scale, play out in the optics of predictive policing technology. Predictive policing software operates by analysing vast swaths of criminological data to forecast when and where a crime of a certain nature will take place, or who will commit it. The history of data collection is deeply entwined with the project of policing and criminological knowledge, and further, with the production of race itself. As Autumn Womack shows in her analysis of “the racial data revolution” in late nineteenth-century America, “data and black life were co-constituted in the service of producing a racial regime” (15). Statistical attempts to measure and track the movements of black populations during this period went hand in hand with sociological and carceral efforts to regulate and control black life as an object of knowledge. Policing was and continues to be central to this disciplinary project. As R. Joshua Scannell powerfully argues, “Policing does not have a ‘racist history.’ Policing makes race and is inextricable from it. Algorithms cannot ‘code out’ race from American policing because race is a policing technology, just as policing is a bedrock racializing technology” (108). Like data, policing and the production of race difference co-constitute one another. Predictive policing thus cannot be analysed without accounting for entanglements between data, carcerality, and racialism.&lt;br /&gt;
&lt;br /&gt;
Computational methods were integrated into American criminal justice departments beginning in the 1960s. Incited by America’s “War on Crime”, the densification of urban areas following the Great Migration of African Americans to Northern cities, and the economic fallout from de-industrialisation, criminologists began using data analytics to identify areas of high crime incidence from which police patrol zones were constructed. This strategy became known as &#039;&#039;hot spot criminology.&#039;&#039; By 1994, the New York City Police Department (NYPD) had integrated CompStat, the first digitised, fully automated, data-driven performance measurement system, into its everyday operations. CompStat is now employed by nearly every major urban police department in America. Beginning in 2002, the NYPD began using statistical insights digitally generated by CompStat to draw up criminogenic “impact zones” – namely low-income, black neighbourhoods – that would be subject to heightened police surveillance. As Brian Jefferson observes, the NYPD’s statistical strategy “was deeply wound up in dividing urban space according to varying levels of policeability” (116). Moreover, impact zones “provided not only a scientific pretext for inundating negatively racialized communities in patrol units but also a rationale for micromanaging them through hyperactive tactics” such as stop-and-frisk searches (Jefferson 117). Between 2005 and 2006, the NYPD conducted 510,000 stops in impact zones – a 500% increase from the year before.&lt;br /&gt;
&lt;br /&gt;
Policing has only grown more reliant on insights culled from predictive data models. PredPol – a predictive policing company that was developed out of the Los Angeles Police Department in 2009 – forecasts crimes based on crime history, location, and time. HunchLab, the more “holistic” successor of PredPol, not only considers factors like crime history, but uses machine learning approaches to assign criminogenic weights to data “associated with a variety of crime forecasting models” such as the density of “take-out restaurants, schools, bus stops, bars, zoning regulations, temperature, weather, holidays, and more” (Scannell 117). Here, it is not the omniscience of panoptic vision, or the individualising enactment of power, that characterises HunchLab’s surveillance software, but the punitive accumulation of proxies and abstractions in which “humans as such are incidental to the model and its effects” (Scannell 118). Under these conditions, for example, “criminality increasingly becomes a direct consequence of anthropogenic climate change and ecological crisis” (Scannell 122). &lt;br /&gt;
&lt;br /&gt;
Data-driven policing is often presented as the objective antidote to the failures of human-led policing. However, in a context where black and brown people around the world are historically, and contemporaneously, subjected to disproportionate police surveillance, carceral punishment, and state-sponsored violence, input data analysed by predictive algorithms often perpetuates a self-reinforcing cycle through which racialised communities are circuitously subjected to heightened police presence. As sociologist Sarah Brayne explains, “if historical crime data are used as inputs in a location-based predictive policing algorithm, the algorithm will identify areas with historically higher crime rates as high risk for future crime, officers will be deployed to those areas, and will thus be more likely to detect crimes in those areas, creating a ‘self-fulfilling statistical prophecy’” (109).&lt;br /&gt;
&lt;br /&gt;
Beyond this critique of the ‘garbage in, garbage out’ cycle, Than and Wark’s conceptualisation of &#039;&#039;racial formations as data formations&#039;&#039; provides insight into the ways in which predictive policing instils racialisation as a post-visual epiphenomenon of data-generated proxies. While the racist outcomes of data-led policing certainly manifest in the lived realities of poor and negatively racialised communities, predictive policing necessarily relies upon data-generated, non-visual proxies of race – postcode, history of contact with the police, geographic tags, distribution of schools or restaurants, weather, and more. Such technologies demonstrate how different valuations of risk that “render disposable populations disposable to violence” are actively being produced not merely through data, but in the correlative models themselves. As Jefferson suggests, “modernity’s racial taxonomies are not vanishing through computerization; they have just been imported into data arrays” (6). As neighbourhoods and ecologies, and those who dwell within them, are actively transcribed into newly ‘raced’ data formations, does the body merely disappear from the racial equation? &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;2015&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This question returns us to American Artist’s video installation, &#039;&#039;2015&#039;&#039;. From the outset of the film, the camera’s objectivity is consistently brought into question. Gesturing towards the frame as an architectural narrowing of positionality, the constricted, stationary viewpoint of the camera fixed onto the dashboard of the police car positions the viewer within the uncomfortable observatory of the surveillant police apparatus. The window is imaged as an enclosure which frames the disproportionate surveillance of black and brown communities by police. The world view here is captured from a single axis, a singular ideological vantage point, as an already-known world of city landscape passes ominously through the window’s frame of vision. The frame’s hyper-selectivity, an enduring object of scrutiny in the field of documentary cinema, and visuality more broadly, is always implicated in the politics of what exists beyond its view, thus interrogating the assumed indexicality, or visual truth, of the filmic image.&lt;br /&gt;
&lt;br /&gt;
The frame’s ambiguous functionality is made palpable when the car pulls over to stop. Over the din of a loud police siren, we hear a car door open and shut as the disembodied police officer climbs out of the car to survey the scene. Never entering the camera’s line of vision, the imagined, diegetic space outside the frame draws attention to the occlusive nature of the recorded seen-and-heard. As demonstrated in the countless acquittals of police officers guilty of assaulting or killing unarmed black people, even when death occurs within the “frame” of a surveillance camera, dash cam, or a civilian bystander, this visual record remains ambiguous and is rarely deemed conclusive. Consider the cases of Eric Garner, Philando Castile, or Rodney King, a black man whose violent assault by a group of LAPD officers in 1991 was recorded by a bystander and later used as evidence in the prosecution of King’s attackers. Despite the clear visual evidence of what took place, it was the Barthesian concept of the “caption” – the contextual text which rationalises or situates an image within a given ontological framework – that led to the officers’ acquittal. As Winston notes, “what was beyond the frame containing “the recorded ‘seen-and-heard’” was (or could be made to seem) crucial. This is always inevitably the case because the frame is, exactly, a “frame” – it is blinkered, excluding, partial, limited” (614). This interrogation of the fallacies of visual “evidence” is a critical armature of &#039;&#039;2015&#039;&#039;’s intervention, one that challenges the underlying assumptions of visuality and perception in surveillance apparatuses, constructing the frame of the police window not as a source of visible evidence, but as that which obfuscates, conceals, or obstructs.&lt;br /&gt;
&lt;br /&gt;
Beyond the visual, other lives of data further complicate the already troubled notion of the visible as a stable category. As Sharon Lin Tay argues, “Questions of looking, tracking, and spying are now secondary to, and should be seen within the context of, network culture and its enabling of new surveillance forms within a technology of control.” In other words, scopic regimes of surveillance are increasingly subsumed by the datasphere from which multiple stories and scenes may be spun. “Evidence” no longer relies solely on a visual account of “truth”, but rather on a digital record of &#039;&#039;traces&#039;&#039;. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;2015&#039;&#039;’s representations of predictive policing software and technologies of biometric identification allude to the extent to which data is literally superimposed onto our own frame of vision. Predicting our locations, consumption habits, political views, credit scores, and criminal intentions, analytic predictive technologies condense potential futures into singular outputs. As the police car follows the erratic route of its predictive policing software on the open road, we are simultaneously made aware of a future which is already foreclosed.&lt;br /&gt;
&lt;br /&gt;
As American Artist’s &#039;&#039;2015&#039;&#039; so aptly suggests, the life of data exists beyond our frame of view but increasingly determines what occurs within it. Data is the text that literally “captions” our lives and identities. In zones deemed high-risk for crime by analytic algorithms, subjects are no longer considered civilians, but are hailed and interpellated as criminalised suspects through their digital subjectification. As the police car cruises through Brooklyn’s sparsely populated streets and neighbourhoods in the early morning, footage of people going about their daily business morphs into an insidious interrogation of space and mobility. As the work provocatively suggests, predictive policing constructs zones of suspicion and non-humanity through which the body is interrogated and brought into question. In identifying the body as “threat” by virtue of its geo-spatial location in a zone wholly constructed by the racializing history of policing data, the racial body is recoded not as a necessarily phenotypic entity, but as a product of data. American Artist’s &#039;&#039;2015&#039;&#039; palpably conveys race as lived through data, shaping who, and what, comes into the frame of the surveilling apparatus. The unadorned message: race is produced and sustained as a product of data. &lt;br /&gt;
&lt;br /&gt;
Yet, at the same time, the work insists on the enduring physiological nature of visual racialism through the coding of the body. As the police car cruises through the highlighted zones of predicted crime, select passers-by are singled out and scanned by a facial recognition device. This reference to biometric identification – reading the body as the site and sign of identity – complicates the claim that forms of visual evidence are increasingly being subsumed by the post-visual data apparatus. Biometric systems of measurement, such as facial templates or fingerprint identification, are inherently tied to older, eighteenth- and nineteenth-century colonial and ethnographic regimes of physiological classification that aimed to expose a certain narrative truth about the racialized subject through their visual capture. Contemporary biometric technologies, as Simone Browne argues, retain the same systemic logics of their colonial predecessors, “alienating the subject by producing a truth about the racial body and one’s identity (or identities) despite the subject’s claims”. It has been repeatedly shown, for example, that facial recognition software demonstrates bias against subjects belonging to specific racial groups, often failing to detect or misclassifying darker-skinned subjects, an event that the biometric industry terms a “failure to enrol” (FTE). Here, blackness is imaged as outside the scope of human recognition, while at the same time, black people are disproportionately subjected to heightened surveillance by global security apparatuses. This disparity shows that while forms of racialisation are increasingly migrating to the terrain of the digital, race still inheres, even if residually, as an epidermal materialisation in the biometric evidencing of the body.&lt;br /&gt;
&lt;br /&gt;
In American Artist’s &#039;&#039;2015&#039;&#039;, the extant tension between data and the lived, phenotypic, or embodied constitution of racialism suggests that these two racializing formats interlink and reinforce each other. By evidencing the racial body on one hand as a product of data, and on the other as an embodied, physiological construction of cultural and scientific ontologies of the Other, American Artist makes visible the contemporary and historical means through which race is lived and produced. By calling into question the visual and digital ways the racial body is made to evidence its being-in-the-world, Artist challenges and disrupts the documentary logics of surveillance apparatuses – namely, what Catherine Zimmer describes as the “production of knowledge through visibility”. By entangling racializing forms of surveillance within a realist, documentary-coded format, American Artist calls into question what it means to document, record, or survey within the frame of documentary cinema. As data increasingly guides where we go, what we see, and whose bodies come into question, claims on the recorded seen-and-heard, as well as the digitally networked, must continually be interrogated. In the context of our current democratic crisis, where the volatile distinctions between “fact” and “fiction” have produced a plethora of unstable meanings, American Artist’s &#039;&#039;2015&#039;&#039; is an example of emergent activist political intervention that interrogates the underlying assumptions of documentary objectivity in both cinematic and data-driven formats, subverting the racial logics that remain imbricated within visual and post-visual systems of classification. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Conclusion&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This article has explored the shifting terrain of racial discourse in the age and scalar magnitude of big data. Drawing from Paul Gilroy’s periodisation of racialism, from the Euclidean anatomy of the 19&amp;lt;sup&amp;gt;th&amp;lt;/sup&amp;gt; century to the genomic revolution of the 1990s, I argue that race has &#039;&#039;always&#039;&#039; been deeply entwined with questions of scale and perception. Gilroy observed that emergent digital technologies are making way for new ways of seeing the body, and subsequently, for conceiving humanity at novel scales detached from the visual. Similar insights inform Than and Wark’s prescient account of &#039;&#039;racial formations as data formations&#039;&#039; – the idea that race is increasingly being produced as a cultivation of data-driven proxies and abstractions. American Artist’s &#039;&#039;2015&#039;&#039; visualises the ways in which these residual and emergent characteristics of racialism are embedded in everyday systems of predictive policing technology. Through multimedia intervention, the work conveys racialism not as a single, static entity, but as a historical structure that mutates and evolves algorithmically across an ever-shifting geopolitical landscape of capital and power. For the purposes of this analysis, American Artist allows us to grasp the many lives of racialism’s past and present, as well as the future modalities in which its determinations are not yet realised.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow5000&amp;diff=1634</id>
		<title>Toward a Minor Tech:Crichlow5000</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow5000&amp;diff=1634"/>
		<updated>2023-04-10T11:55:16Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Created page with &amp;quot;COMING SOON&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;COMING SOON&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1193</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow 500</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1193"/>
		<updated>2023-01-20T16:05:17Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: /* Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:500 words]]&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’&#039;&#039;&#039; ==&lt;br /&gt;
&#039;&#039;&#039;Camille Crichlow&#039;&#039;&#039; &lt;br /&gt;
&lt;br /&gt;
Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still potential demise: the end of race itself. As older conceptions of race explicitly tied to anatomical scales of the body were belied by a breakthrough consensus – that race has no fundamental basis in human biology – the perceptual regimes to which racialism was attached were, as sociologist Paul Gilroy claims, ambivalently undone (1998). &lt;br /&gt;
&lt;br /&gt;
In the context of 21st century digital processing, another break in racial scale has emerged. There is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that, without critical interrogation, appear neutral. Thao Than and Scott Wark define these algorithmically generated racial formations as ‘data formations’: “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1, 2020). &lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. What if the historical compression of racial scale—a movement of race-craft inwards and downwards into the minute and microscopic signifiers of the body — now exerts upwards and outwards pressures into a globalised regime of datafication? In other words, how is racial epistemology reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’? These questions are not to suggest that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing; or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of seeing, but perceptual regimes of racial scale, whereby race category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been constitutive (Womack 2021; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience. &lt;br /&gt;
&lt;br /&gt;
Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias). This begins with refusing the absolute neutrality that ‘big data’ regimes attempt to guarantee. How might ‘big’ and ‘small’ tech be mobilised towards liberatory practices of refusal that challenge scalar realignments of racialism, and transform domains of experience toward an end of race futurity?&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1191</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow 500</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1191"/>
		<updated>2023-01-20T16:04:12Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:500 words]]&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
&#039;&#039;&#039;Camille Crichlow&#039;&#039;&#039; &lt;br /&gt;
&lt;br /&gt;
Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still potential demise: the end of race itself. As older conceptions of race explicitly tied to anatomical scales of the body were belied by a breakthrough consensus – that race has no fundamental basis in human biology – the perceptual regimes to which racialism was attached were, as sociologist Paul Gilroy claims, ambivalently undone (1998). &lt;br /&gt;
&lt;br /&gt;
In the context of 21st century digital processing, another break in racial scale has emerged. There is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that, without critical interrogation, appear neutral. Thao Than and Scott Wark define these algorithmically generated racial formations as ‘data formations’: “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (1, 2020). &lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. What if the historical compression of racial scale—a movement of race-craft inwards and downwards into the minute and microscopic signifiers of the body — now exerts upwards and outwards pressures into a globalised regime of datafication? In other words, how is racial epistemology reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’? These questions are not to suggest that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing; or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of seeing, but perceptual regimes of racial scale, whereby race category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been constitutive (Womack 2021; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience. &lt;br /&gt;
&lt;br /&gt;
Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute neutrality that ‘big data’ regimes attempt to guarantee. How might ‘big’ and ‘small’ tech be mobilised towards liberatory practices of refusal that challenge scalar realignments of racialism, and transform domains of experience toward an end of race futurity?&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1133</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow 500</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1133"/>
		<updated>2023-01-20T15:37:42Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:500 words]]&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still-potential demise: the end of race itself. As older conceptions of race explicitly tied to anatomical scales of the body were belied by a breakthrough consensus – that race has no fundamental basis in human biology – the perceptual regimes to which racialism was attached were, as sociologist Paul Gilroy claims, ambivalently undone (1998). &lt;br /&gt;
&lt;br /&gt;
In the context of 21st-century digital processing, another break in racial scale has emerged. There is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that, without critical interrogation, appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (2021, p. 1). &lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. What if the historical compression of racial scale—a movement of race-craft inwards and downwards into the minute and microscopic signifiers of the body—now exerts upwards and outwards pressures into a globalised regime of datafication? In other words, how is racial epistemology reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’? These questions are not to suggest that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing; or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of seeing, but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience. &lt;br /&gt;
&lt;br /&gt;
Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute neutrality that ‘big data’ regimes attempt to guarantee. How might ‘big’ and ‘small’ tech be mobilised towards liberatory practices of refusal that challenge scalar realignments of racialism, and transform domains of experience toward an end of race futurity?&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Contributors&amp;diff=1062</id>
		<title>Toward a Minor Tech:Contributors</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Contributors&amp;diff=1062"/>
		<updated>2023-01-20T15:00:35Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;List of contributors here&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Inga Luchs&#039;&#039;&#039; is a PhD candidate at the University of Groningen. In her research, she deals with questions of data classification and discrimination from a cultural and technical perspective.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Søren Bro Pold&#039;&#039;&#039; Digital Aesthetics Research Center, Aarhus University, works with the arts of the interface and interface criticism.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;xenodata co-operative&#039;&#039;&#039; investigates image politics, algorithmic culture and technological conditions of knowledge production and governance through art and media practices. The collective was established by curator Yasemin Keskintepe and artist-researcher Sasha Anikina.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Jack Wilson&#039;&#039;&#039; is a PhD researcher at the University of Warwick’s Centre for Interdisciplinary Methodologies. He is not a conspiracy theorist. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Winnie Soon&#039;&#039;&#039; is a Hong Kong-born artist coder and researcher, engaging with themes such as Free and Open Source Culture, Coding Otherwise, artistic/technical manuals and digital censorship.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Christian Ulrik Andersen&#039;&#039;&#039;, Digital Aesthetics Research Center, Aarhus University, is attempting to bring the knowledge and practices of digital culture and art to the fore.&lt;br /&gt;
&lt;br /&gt;
From a network of &#039;&#039;&#039;Feminist Servers&#039;&#039;&#039; the following authors contributed: mara karagianni - artist, software, sysadmin, ooooo - Transuniversal constellation, nate wessalowski - PhD student at Münster University, vo ezn - sound &amp;amp;&amp;amp; infrastructure artist.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Shusha Niederberger&#039;&#039;&#039; is a PhD researcher based at Zurich University of the Arts and working on user subject positions in datafied environments and aesthetic strategies of using otherwise.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Inte Gloerich&#039;&#039;&#039; (Utrecht University &amp;amp; Institute of Network Cultures) researches sociotechnical imaginaries around blockchain technology as they appear in for instance memes, startup culture, and art.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Gabriel Menotti&#039;&#039;&#039; is Associate Professor in Film &amp;amp; Media at Queen&#039;s University and an independent curator.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Sandy Di Yu&#039;&#039;&#039; is a PhD researcher at the University of Sussex and co-managing editor of DiSCo Journal (www.discojournal.com), using digital artist critique to examine shifting experiences of time.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Magdalena Tyżlik-Carver&#039;&#039;&#039; ferments data and investigates Critical Data and related practices through curating. She is Associate Professor in Digital Design and Information Studies at Aarhus University.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Geoff Cox&#039;&#039;&#039; should probably declare himself to be Professor of Art and Computational Culture at London South Bank University, and co-director of the Centre for the Study of the Networked Image (CSNI), but thinks this sounds a bit pompous.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Susanne Förster&#039;&#039;&#039; is a PhD candidate and research associate at the University of Siegen. Her work deals with imaginaries and infrastructures of conversational artificial agents.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Anna Mladentseva&#039;&#039;&#039; is a PhD researcher at University College London whose project focuses on the conservation of software-based works of art and design from the Victoria &amp;amp; Albert museum.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Jung-Ah Kim&#039;&#039;&#039; is a PhD researcher in Screen Cultures and Curatorial Studies at Queen’s University. She studies the relationship between weaving and computing and traditional Korean textiles.  &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Camille Crichlow&#039;&#039;&#039; is a PhD Researcher at the Sarah Parker Remond Centre for the Study of Racism and Racialisation (University College London). Her research interrogates how the historical and socio-cultural narrative of race manifests in contemporary algorithmic technologies.&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1058</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow 500</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow_500&amp;diff=1058"/>
		<updated>2023-01-20T14:58:50Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Created page with &amp;quot; Category:Toward a Minor Tech Category:500 words  Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990’s gestured toward racialism’s still potential demise: the end of race itself. As older conceptions of race explicitly tied to anatomical scales of the body were belied by a breakthrough consensus – that race has no fundamental basis in human biology – the per...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:500 words]]&lt;br /&gt;
&lt;br /&gt;
Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still-potential demise: the end of race itself. As older conceptions of race explicitly tied to anatomical scales of the body were belied by a breakthrough consensus – that race has no fundamental basis in human biology – the perceptual regimes to which racialism was attached were, as sociologist Paul Gilroy claims, ambivalently undone (1998). &lt;br /&gt;
&lt;br /&gt;
In the context of 21st-century digital processing, another break in racial scale has emerged. There is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that, without critical interrogation, appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: “modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data” (2021, p. 1). &lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. What if the historical compression of racial scale—a movement of race-craft inwards and downwards into the minute and microscopic signifiers of the body—now exerts upwards and outwards pressures into a globalised regime of datafication? In other words, how is racial epistemology reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’? These questions are not to suggest that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing; or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of seeing, but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience. &lt;br /&gt;
&lt;br /&gt;
Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute neutrality that ‘big data’ regimes attempt to guarantee. How might ‘big’ and ‘small’ tech be mobilised towards liberatory practices of refusal that challenge scalar realignments of racialism, and transform domains of experience toward an end of race futurity?&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=631</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=631"/>
		<updated>2023-01-17T17:30:33Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page.  3. The etherpad should appear on the right side of the screen.  NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.  ------------------------------------------------------------------------ -------------------------------------------------------------------- --&amp;gt; ==&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;eplite id=&amp;quot;Scaling up, Scaling Down&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt; ==&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;max-width:80ch;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The story of race, as Paul Gilroy tells it, moves simultaneously inwards and downwards. Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still-potential demise: the end of race itself. As older conceptions of race were belied by breakthroughs in molecular biology, the representational regimes to which racialism was attached were ambivalently undone. In this movement beyond the old visual signifiers of race, Gilroy notes how the human body ceases to “delimit the scale upon which assessments of the unity and variation of the species are to be made” (845). In more contemporary terms, however, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. What if the scalar compression of the microscopic to the molecular—a movement of race-craft inwards and downwards—now exerts upwards and outwards pressures into a globalised regime of datafication? To extend Gilroy differently in the present context of algorithmic culture, I consider how racial epistemology is reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’. In other words, I want to complicate the stakes and possibilities for dismantling racialism when the body is no longer its primary referent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. Following Joshua DiCaglio, I evoke scale here as an integral mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. Race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of a human ideal against an imagined nonhuman ‘other’. Rather than assume the truth of racial identity in an imagined “mute body”, analytic surveillance technologies produce racialisation as a scalar function of vast swathes of processed data (Gilroy 1998, p. 847). Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). ‘Suspicious’ (code word: Muslim) subjects flagged by the theatre of algorithmic security systems are rendered immobile at the border (Amoore 2006). Automated welfare eligibility checks keep struggling people from accessing the resources to which they are entitled (Rao 2019; Toh 2020). Credit-market algorithms widen the racialised gap between the haves and the have-nots (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: that is, “&#039;&#039;modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data&#039;&#039;” (2021, p. 1).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy too predicted a shift that would constitute race as an entity divorced from perceptual regimes of the human eye. But rather than moving inwards, towards the invisible genomic interfaces of the body, algorithmic processes of classification constitute a digital re-coding of race by proxy &#039;&#039;en masse&#039;&#039;. This is not to say that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing; or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of &#039;&#039;seeing,&#039;&#039; but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
So, are we approaching a time when visual, or embodied, conceptions of racialism are ending? Not so fast – I would like to complicate this a bit further. Biometric technologies that produce digitised templates of bodily characteristics for authentication or verification purposes have troubled the notion that we have left behind racialism’s sticky attachment to the minute, perceptual scales of bodily difference in the digital age. Scholars such as Joseph Pugliese have shown how biometric technologies are &#039;&#039;‘infrastructurally calibrated to whiteness’&#039;&#039; in their reduced capacity to recognise dark-skinned faces (2012, p. 57). In this regard, biometric technologies relegate racialised bodies outside the scope of human recognition, while at the same time disproportionately subjecting them to heightened surveillance in service of local and global security apparatuses. What this disparity demonstrates is that while forms of racialisation are increasingly migrating to the terrain of the digital, the epidermal materialisation of race has not yet faded, but is experiencing a resurgence in new digitised forms. Biometric technologies fit under the umbrella of ‘big data’ given that they often process large volumes of data and analytics. And yet, their capacity to racialise and produce difference is directly tied to the body – to the ‘material’ site of race itself. These extant tensions between data and the lived, phenotypic, or embodied constitution of racialism suggest that these two racialising formats interlink and reinforce each other.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The emergence of digital technologies and ‘big data’ may not, as Gilroy imagined, result in the ‘end’ of race. Rather, these technologies have complicated it. As racialism migrates to post-visual registers of datafication, residual modes of racialisation remain intact in biometric modes of imaging the body. Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration and transformation that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute universality and totality that ‘big data’ regimes attempt to guarantee under the pretense of neutrality. Initiatives such as the Distributed Artificial Intelligence Research Institute, for example, use data to examine the effects of discriminatory policies, most recently publishing a case study on spatial apartheid in South Africa (Gebru et al. 2021). This study points to the potential capabilities of large-scale data analysis to redress the historical effects of racialisation. Here, big data analytics do not reconstruct racial category, but may be mobilised towards liberatory practices that reframe ‘big data’ and transform domains of experience toward an end of race futurity.&lt;br /&gt;
&lt;br /&gt;
==== Bibliography ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039; 25.3 (2006): 336–351.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Couldry, Nick, and Ulises A. Mejias. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” &#039;&#039;Television &amp;amp; New Media&#039;&#039; 20.4 (2019): 336–349.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; Minneapolis, Minnesota: University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gebru, Timnit, Luzango Mfupe, Nyalleng Moorosi, and Raesetje Sefala. “Constructing a Visual Dataset to Study the Effects of Spatial Apartheid in South Africa”. &#039;&#039;The Distributed AI Research Institute&#039;&#039;, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039; 21.5 (1998): 838–847.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations”. &#039;&#039;Big Data &amp;amp; Society&#039;&#039; 8.2 (2021): 1–5.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Pugliese, Joseph. “The Biometrics of Infrastructural Whiteness”. &#039;&#039;Biometrics: Bodies, Technologies, Biopolitics&#039;&#039;. Taylor and Francis, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Rao, Ursula. “Re-Spatializing Social Security in India”. &#039;&#039;Spaces of Security: Ethnographies of Securityscapes, Surveillance, and Control&#039;&#039;, eds. Setha Low and Mark Maguire. New York: NYU Press, 2019. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Toh, Amos. “Automated Hardship: How the Tech-Driven Overhaul of the UK&#039;s Social Security System Worsens Poverty”. &#039;&#039;Human Rights Watch&#039;&#039;, 29 September, 2020. Web. &amp;lt;nowiki&amp;gt;https://www.hrw.org/news/2020/09/29/uk-automated-benefits-system-failing-people-need&amp;lt;/nowiki&amp;gt;. Accessed 15 December, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: the Aesthetic Experiment of Racial Data, 1880-1930.&#039;&#039; Chicago, IL: The University of Chicago Press, 2022. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Zuberi, Tukufu. &#039;&#039;Thicker Than Blood: How Racial Statistics Lie.&#039;&#039; Minneapolis: University of Minnesota Press, 2001.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=630</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=630"/>
		<updated>2023-01-17T17:30:14Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page.  3. The etherpad should appear on the right side of the screen.  NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.  ------------------------------------------------------------------------ -------------------------------------------------------------------- --&amp;gt; ==&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;eplite id=&amp;quot;Scaling up, Scaling Down: Racialism in the Age of ‘Big Data’&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt; ==&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;max-width:80ch;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The story of race, as Paul Gilroy tells it, moves simultaneously inwards and downwards. Breaking the surface of skin and enveloping the racial body politic in ever-minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s still-potential demise: the end of race itself. As older conceptions of race were belied by breakthroughs in molecular biology, the representational regimes to which racialism was attached were ambivalently undone. In this movement beyond the old visual signifiers of race, Gilroy notes how the human body ceases to “delimit the scale upon which assessments of the unity and variation of the species are to be made” (845). In more contemporary terms, however, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. What if the scalar compression of the microscopic to the molecular—a movement of race-craft inwards and downwards—now exerts upwards and outwards pressures into a globalised regime of datafication? To extend Gilroy differently in the present context of algorithmic culture, I consider how racial epistemology is reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’. In other words, I want to complicate the stakes and possibilities for dismantling racialism when the body is no longer its primary referent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. Following Joshua DiCaglio, I evoke scale here as an integral mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. Race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of a human ideal against an imagined nonhuman ‘other’. Rather than assume the truth of racial identity in an imagined “mute body”, analytic surveillance technologies produce racialisation as a scalar function of vast swathes of processed data (Gilroy 1998, p. 847). Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). ‘Suspicious’ (code word: Muslim) subjects flagged by the theatre of algorithmic security systems are rendered immobile at the border (Amoore 2006). Automated welfare eligibility checks keep struggling people from accessing the resources to which they are entitled (Rao 2019; Toh 2020). Credit-market algorithms widen the racialised gap between the haves and the have-nots (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: that is, “&#039;&#039;modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data&#039;&#039;” (2021, p. 1).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy too predicted a shift that would constitute race as an entity divorced from perceptual regimes of the human eye. But rather than moving inwards, towards the invisible genomic interfaces of the body, algorithmic processes of classification constitute a digital re-coding of race by proxy &#039;&#039;en masse&#039;&#039;. This is not to say that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing, or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of &#039;&#039;seeing,&#039;&#039; but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
So, are we approaching a time when visual, or embodied, conceptions of racialism are ending? Not so fast – I would like to complicate this a bit further. Biometric technologies that produce digitised templates of bodily characteristics for authentication or verification purposes have troubled the notion that we have left behind racialism’s sticky attachment to the minute, perceptual scales of bodily difference in the digital age. Scholars such as Joseph Pugliese have shown how biometric technologies are ‘&#039;&#039;infrastructurally calibrated to whiteness&#039;&#039;’ in their reduced capacity to recognise dark-skinned faces (2012, p. 57). In this regard, biometric technologies relegate racialised bodies outside the scope of human recognition while, at the same time, disproportionately subjecting them to heightened surveillance in service of local and global security apparatuses. What this disparity demonstrates is that while forms of racialisation are increasingly migrating to the terrain of the digital, the epidermal materialisation of race has not faded, but is experiencing a resurgence in new digitised forms. Biometric technologies fit under the umbrella of ‘big data’ given that they often process large volumes of data and analytics. And yet, their capacity to racialise and produce difference is directly tied to the body – to the ‘material’ site of race itself. These extant tensions between data and the lived, phenotypic, or embodied constitution of racialism suggest that the two racialising formats interlink and reinforce each other.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The emergence of digital technologies and ‘big data’ may not, as Gilroy imagined, result in the ‘end’ of race. Rather, these technologies have complicated it. As racialism migrates to post-visual registers of datafication, residual modes of racialisation remain intact in biometric modes of imaging the body. Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration and transformation that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute universality and totality that ‘big data’ regimes attempt to guarantee under the pretence of neutrality. Initiatives such as the Distributed Artificial Intelligence Research Institute, for example, use data to examine the effects of discriminatory policies, most recently publishing a case study on spatial apartheid in South Africa (Gebru et al. 2021). This study points to the potential of large-scale data analysis to redress the historical effects of racialisation. Here, big data analytics need not reconstruct racial categories, but may be mobilised towards liberatory practices that reframe ‘big data’ and transform domains of experience toward an end-of-race futurity.&lt;br /&gt;
&lt;br /&gt;
==== Bibliography ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039; 25.3 (2006): 336–351.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Couldry, Nick, and Ulises A. Mejias. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” &#039;&#039;Television &amp;amp; New Media&#039;&#039; 20.4 (2019): 336–349.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; Minneapolis: University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gebru, Timnit, Luzango Mfupe, Nyalleng Moorosi, and Raesetje Sefala. “Constructing a Visual Dataset to Study the Effects of Spatial Apartheid in South Africa”. &#039;&#039;The Distributed AI Research Institute&#039;&#039;, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039; 21.5 (1998): 838–847.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations”. &#039;&#039;Big Data &amp;amp; Society&#039;&#039; 8.2 (2021): 1–5.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Pugliese, Joseph. “The Biometrics of Infrastructural Whiteness”. &#039;&#039;Biometrics: Bodies, Technologies, Biopolitics&#039;&#039;. Taylor and Francis, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Rao, Ursula. “Re-Spatializing Social Security in India”. &#039;&#039;Spaces of Security: Ethnographies of Securityscapes, Surveillance, and Control&#039;&#039;, eds. Setha Low and Mark Maguire. New York: NYU Press, 2019. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Toh, Amos. “Automated Hardship: How the Tech-Driven Overhaul of the UK&#039;s Social Security System Worsens Poverty”. &#039;&#039;Human Rights Watch&#039;&#039;, 29 September, 2020. Web. &amp;lt;nowiki&amp;gt;https://www.hrw.org/news/2020/09/29/uk-automated-benefits-system-failing-people-need&amp;lt;/nowiki&amp;gt;. Accessed 15 December, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; Chicago, IL: The University of Chicago Press, 2022. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Zuberi, Tukufu. &#039;&#039;Thicker Than Blood: How Racial Statistics Lie.&#039;&#039; Minneapolis: University of Minnesota Press, 2001.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=522</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=522"/>
		<updated>2023-01-11T16:37:40Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page.  3. The etherpad should appear on the right side of the screen.  NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.  ------------------------------------------------------------------------ -------------------------------------------------------------------- --&amp;gt; ==&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt; ==&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;max-width:80ch;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The story of race, as Paul Gilroy tells it, moves simultaneously inwards and downwards. Breaking the surface of skin and enveloping the racial body politic in ever more minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s potential demise: the end of race itself. As older conceptions of race were belied by breakthroughs in molecular biology, the representational regimes to which racialism was attached were ambivalently undone. In this movement beyond the old visual signifiers of race, Gilroy notes how the human body ceases to “delimit the scale upon which assessments of the unity and variation of the species are to be made” (845). In more contemporary terms, however, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. What if the scalar compression of the microscopic to the molecular – a movement of race-craft inwards and downwards – now exerts upwards and outwards pressures into a globalised regime of datafication? To extend Gilroy differently in the present context of algorithmic culture, I consider how racial epistemology is reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’. In other words, I want to complicate the stakes and possibilities for dismantling racialism when the body is no longer its primary referent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. Following Joshua DiCaglio, I invoke scale here as an integral mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. Race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of a human ideal against an imagined nonhuman ‘other’. Rather than assume the truth of racial identity in an imagined “mute body”, analytic surveillance technologies produce racialisation as a scalar function of vast swathes of processed data (Gilroy 1998, p. 847). Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). ‘Suspicious’ (code word: Muslim) subjects flagged by the theatre of algorithmic security systems are rendered immobile at the border (Amoore 2006). Automated welfare eligibility checks keep struggling people from accessing the resources to which they are entitled (Rao 2019; Toh 2020). Credit-market algorithms widen the racialised gap between the haves and the have-nots (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: that is, “&#039;&#039;modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data&#039;&#039;” (2021, p. 1).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy too predicted a shift that would constitute race as an entity divorced from perceptual regimes of the human eye. But rather than moving inwards, towards the invisible genomic interfaces of the body, algorithmic processes of classification constitute a digital re-coding of race by proxy &#039;&#039;en masse&#039;&#039;. This is not to say that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing, or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of &#039;&#039;seeing,&#039;&#039; but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
So, are we approaching a time when visual, or embodied, conceptions of racialism are ending? Not so fast – I would like to complicate this a bit further. Biometric technologies that produce digitised templates of bodily characteristics for authentication or verification purposes have troubled the notion that we have left behind racialism’s sticky attachment to the minute, perceptual scales of bodily difference in the digital age. Scholars such as Joseph Pugliese have shown how biometric technologies are ‘&#039;&#039;infrastructurally calibrated to whiteness&#039;&#039;’ in their reduced capacity to recognise dark-skinned faces (2012, p. 57). In this regard, biometric technologies relegate racialised bodies outside the scope of human recognition while, at the same time, disproportionately subjecting them to heightened surveillance in service of local and global security apparatuses. What this disparity demonstrates is that while forms of racialisation are increasingly migrating to the terrain of the digital, the epidermal materialisation of race has not faded, but is experiencing a resurgence in new digitised forms. Biometric technologies fit under the umbrella of ‘big data’ given that they often process large volumes of data and analytics. And yet, their capacity to racialise and produce difference is directly tied to the body – to the ‘material’ site of race itself. These extant tensions between data and the lived, phenotypic, or embodied constitution of racialism suggest that the two racialising formats interlink and reinforce each other.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The emergence of digital technologies and ‘big data’ may not, as Gilroy imagined, result in the ‘end’ of race. Rather, these technologies have complicated it. As racialism migrates to post-visual registers of datafication, residual modes of racialisation remain intact in biometric modes of imaging the body. Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration and transformation that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute universality and totality that ‘big data’ regimes attempt to guarantee under the pretence of neutrality. Initiatives such as the Distributed Artificial Intelligence Research Institute, for example, use data to examine the effects of discriminatory policies, most recently publishing a case study on spatial apartheid in South Africa (Gebru et al. 2021). This study points to the potential of large-scale data analysis to redress the historical effects of racialisation. Here, big data analytics need not reconstruct racial categories, but may be mobilised towards liberatory practices that reframe ‘big data’ and transform domains of experience toward an end-of-race futurity.&lt;br /&gt;
&lt;br /&gt;
==== Bibliography ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039; 25.3 (2006): 336–351.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Couldry, Nick, and Ulises A. Mejias. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” &#039;&#039;Television &amp;amp; New Media&#039;&#039; 20.4 (2019): 336–349.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; Minneapolis: University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gebru, Timnit, Luzango Mfupe, Nyalleng Moorosi, and Raesetje Sefala. “Constructing a Visual Dataset to Study the Effects of Spatial Apartheid in South Africa”. &#039;&#039;The Distributed AI Research Institute&#039;&#039;, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039; 21.5 (1998): 838–847.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations”. &#039;&#039;Big Data &amp;amp; Society&#039;&#039; 8.2 (2021): 1–5.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Pugliese, Joseph. “The Biometrics of Infrastructural Whiteness”. &#039;&#039;Biometrics: Bodies, Technologies, Biopolitics&#039;&#039;. Taylor and Francis, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Rao, Ursula. “Re-Spatializing Social Security in India”. &#039;&#039;Spaces of Security: Ethnographies of Securityscapes, Surveillance, and Control&#039;&#039;, eds. Setha Low and Mark Maguire. New York: NYU Press, 2019. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Toh, Amos. “Automated Hardship: How the Tech-Driven Overhaul of the UK&#039;s Social Security System Worsens Poverty”. &#039;&#039;Human Rights Watch&#039;&#039;, 29 September, 2020. Web. &amp;lt;nowiki&amp;gt;https://www.hrw.org/news/2020/09/29/uk-automated-benefits-system-failing-people-need&amp;lt;/nowiki&amp;gt;. Accessed 15 December, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; Chicago, IL: The University of Chicago Press, 2022. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Zuberi, Tukufu. &#039;&#039;Thicker Than Blood: How Racial Statistics Lie.&#039;&#039; Minneapolis: University of Minnesota Press, 2001.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=477</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=477"/>
		<updated>2022-12-21T21:27:54Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page.  3. The etherpad should appear on the right side of the screen.  NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.  ------------------------------------------------------------------------ -------------------------------------------------------------------- --&amp;gt; ==&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&lt;br /&gt;
== &amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt; ==&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Scaling Up, Scaling Down: Racialism in the Age of ‘Big Data’ ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The story of race, as Paul Gilroy tells it, moves simultaneously inwards and downwards. Breaking the surface of skin and enveloping the racial body politic in ever more minute scales of perceptual closeness, the genomic revolution of the 1990s gestured toward racialism’s potential demise: the end of race itself. As older conceptions of race were belied by breakthroughs in molecular biology, the representational regimes to which racialism was attached were ambivalently undone. In this movement beyond the old visual signifiers of race, Gilroy notes how the human body ceases to “delimit the scale upon which assessments of the unity and variation of the species are to be made” (845). In more contemporary terms, however, there is a sense that race is being remade not within extant contours of the body’s visibility, but outside corporeal recognition altogether. What if the scalar compression of the microscopic to the molecular – a movement of race-craft inwards and downwards – now exerts upwards and outwards pressures into a globalised regime of datafication? To extend Gilroy differently in the present context of algorithmic culture, I consider how racial epistemology is reproduced, reconstructed, and reified within the scalar magnitude of ‘big data’. In other words, I want to complicate the stakes and possibilities for dismantling racialism when the body is no longer its primary referent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As large-scale automated data processing reproduces patterns of racialisation indiscernible to the human eye, the question of scale has again become relevant to a post-visual discourse of race. Following Joshua DiCaglio, I invoke scale here as an integral mechanism of observation that establishes “a reference point for domains of experience and interaction” (2021, p. 3). Scale structures the relationship between the body and its abstract signifiers, between identity and its lived outcomes. Race has always been a technology of scale: a tool to define the minute, minuscule, microscopic signifiers of a human ideal against an imagined nonhuman ‘other’. Rather than assume the truth of racial identity in an imagined “mute body”, analytic surveillance technologies produce racialisation as a scalar function of vast swathes of processed data (Gilroy 1998, p. 847). Predictive policing, for example, increasingly relies on an accumulation of data to construct zones of suspicion through which the racial body is interrogated (Brayne 2020; Chun 2021). ‘Suspicious’ (code word: Muslim) subjects flagged by the theatre of algorithmic security systems are rendered immobile at the border (Amoore 2006). Automated welfare eligibility checks keep struggling people from accessing the resources to which they are entitled (Rao 2019; Toh 2020). Credit-market algorithms widen the racialised gap between the haves and the have-nots (Bhutta et al. 2022). While racial categories are not explicitly coded within the classificatory techniques of analytic technologies, large-scale automated data processing condenses and maps racialising outputs that appear neutral. Thao Phan and Scott Wark define these algorithmically generated racial formations as ‘data formations’: that is, “&#039;&#039;modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects, but as shifting clusters of data&#039;&#039;” (2021, p. 1).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy too predicted a shift that would constitute race as an entity divorced from perceptual regimes of the human eye. But rather than moving inwards, towards the invisible genomic interfaces of the body, algorithmic processes of classification constitute a digital re-coding of race by proxy &#039;&#039;en masse&#039;&#039;. This is not to say that racialism as it has been historically constituted is being dismantled by the grand scale of computational processing, or that other modes of racialist discourse are not still firmly rooted within material experience. Rather, I reference the loosening of race from the grips of not only ocular modes of &#039;&#039;seeing,&#039;&#039; but perceptual regimes of racial scale, whereby racial category is not only assigned to the small-scale signifiers of the body, but inferred through large-scale algorithmic correlation, categorisation, and abstraction of data. While racialisation and data have always been mutually constitutive (Womack 2022; Zuberi 2001), the scale of ‘big data’ masks an insidious realignment whereby race seems to disappear, while its effects are more deeply inscribed within lived experience.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
So, are we approaching a time when visual, or embodied, conceptions of racialism are ending? Not so fast – I would like to complicate this a bit further. Biometric technologies that produce digitised templates of bodily characteristics for authentication or verification purposes have troubled the notion that we have left behind racialism’s sticky attachment to the minute, perceptual scales of bodily difference in the digital age. Scholars such as Joseph Pugliese have shown how biometric technologies are ‘&#039;&#039;infrastructurally calibrated to whiteness&#039;&#039;’ in their reduced capacity to recognise dark-skinned faces (2012, p. 57). In this regard, biometric technologies relegate racialised bodies outside the scope of human recognition while, at the same time, disproportionately subjecting them to heightened surveillance in service of local and global security apparatuses. What this disparity demonstrates is that while forms of racialisation are increasingly migrating to the terrain of the digital, the epidermal materialisation of race has not faded, but is experiencing a resurgence in new digitised forms. Biometric technologies fit under the umbrella of ‘big data’ given that they often process large volumes of data and analytics. And yet, their capacity to racialise and produce difference is directly tied to the body – to the ‘material’ site of race itself. These extant tensions between data and the lived, phenotypic, or embodied constitution of racialism suggest that the two racialising formats interlink and reinforce each other.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The emergence of digital technologies and ‘big data’ may not, as Gilroy imagined, result in the ‘end’ of race. Rather, these technologies have complicated it. As racialism migrates to post-visual registers of datafication, residual modes of racialisation remain intact in biometric modes of imaging the body. Yet, racialisation is not overdetermined by large-scale automated data processing. Beyond ‘opting out’ of data regimes or obfuscating oneself from surveillance apparatuses, possibilities of transfiguration and transformation that refuse racialising and colonialist ‘data relations’ remain conceivable (Couldry and Mejias 2019). This begins with refusing the absolute universality and totality that ‘big data’ regimes attempt to guarantee under the pretence of neutrality. Initiatives such as the Distributed Artificial Intelligence Research Institute, for example, use data to examine the effects of discriminatory policies, most recently publishing a case study on spatial apartheid in South Africa (Gebru et al. 2021). This study points to the potential of large-scale data analysis to redress the historical effects of racialisation. Here, big data analytics need not reconstruct racial categories, but may be mobilised towards liberatory practices that reframe ‘big data’ and transform domains of experience toward an end-of-race futurity.&lt;br /&gt;
&lt;br /&gt;
==== Bibliography ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Amoore, Louise. “Biometric Borders: Governing Mobilities in the War on Terror.” &#039;&#039;Political Geography&#039;&#039; 25.3 (2006): 336–351.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Bhutta, Neil, Aurel Hizmo, and Daniel Ringo. “How Much Does Racial Bias Affect Mortgage Lending? Evidence from Human and Algorithmic Credit Decisions,” Finance and Economics Discussion Series 2022-067. Washington: &#039;&#039;Board of Governors of the Federal Reserve System&#039;&#039;, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Brayne, Sarah. &#039;&#039;Predict and Surveil: Data, Discretion, and the Future of Policing.&#039;&#039; New York, NY: Oxford University Press, 2020.&lt;br /&gt;
&lt;br /&gt;
Chun, Wendy Hui Kyong. &#039;&#039;Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition&#039;&#039;. Cambridge: MIT Press, 2021. Print.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Couldry, Nick, and Ulises A. Mejias. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” &#039;&#039;Television &amp;amp; New Media&#039;&#039; 20.4 (2019): 336–349.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
DiCaglio, Joshua. &#039;&#039;Scale Theory: A Nondisciplinary Inquiry.&#039;&#039; Minneapolis: University of Minnesota Press, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gebru, Timnit, Luzango Mfupe, Nyalleng Moorosi, and Raesetje Sefala. “Constructing a Visual Dataset to Study the Effects of Spatial Apartheid in South Africa”. &#039;&#039;The Distributed AI Research Institute&#039;&#039;, 2021.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Gilroy, Paul. “Race Ends Here.” &#039;&#039;Ethnic and Racial Studies&#039;&#039; 21.5 (1998): 838–847.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations”. &#039;&#039;Big Data &amp;amp; Society&#039;&#039; 8.2 (2021): 1–5.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Pugliese, Joseph. “The Biometrics of Infrastructural Whiteness”. &#039;&#039;Biometrics: Bodies, Technologies, Biopolitics&#039;&#039;. Taylor and Francis, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Rao, Ursula. “Re-Spatializing Social Security in India”. &#039;&#039;Spaces of Security: Ethnographies of Securityscapes, Surveillance, and Control&#039;&#039;, eds. Setha Low and Mark Maguire. New York: NYU Press, 2019. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Toh, Amos. “Automated Hardship: How the Tech-Driven Overhaul of the UK&#039;s Social Security System Worsens Poverty”. &#039;&#039;Human Rights Watch&#039;&#039;, 29 September 2020. Web. &amp;lt;nowiki&amp;gt;https://www.hrw.org/news/2020/09/29/uk-automated-benefits-system-failing-people-need&amp;lt;/nowiki&amp;gt;. Accessed 15 December 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Womack, Autumn. &#039;&#039;The Matter of Black Living: The Aesthetic Experiment of Racial Data, 1880–1930.&#039;&#039; Chicago, IL: The University of Chicago Press, 2022.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Zuberi, Tukufu. &#039;&#039;Thicker Than Blood: How Racial Statistics Lie.&#039;&#039; Minneapolis: University of Minnesota Press, 2001.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=465</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=465"/>
		<updated>2022-12-20T22:44:39Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;!-- -------------------------------------------------------------------&lt;br /&gt;
--------------------------peer-annotations------------------------------&lt;br /&gt;
&lt;br /&gt;
To allow others to comment on the 1000 words version of your text, &lt;br /&gt;
we will work with embedded etherpads in the pages here on the wiki.&lt;br /&gt;
&lt;br /&gt;
To embed an etherpad in your page and allow peer-annotations:&lt;br /&gt;
&lt;br /&gt;
1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice. &lt;br /&gt;
2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page. &lt;br /&gt;
3. The etherpad should appear on the right side of the screen.&lt;br /&gt;
&lt;br /&gt;
NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.&lt;br /&gt;
&lt;br /&gt;
------------------------------------------------------------------------&lt;br /&gt;
-------------------------------------------------------------------- --&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Having some technical difficulties - I will upload text tomorrow (21/12)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=439</id>
		<title>Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Crichlow&amp;diff=439"/>
		<updated>2022-12-20T14:10:02Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Created page with &amp;quot;&amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page....&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;!-- -------------------------------------------------------------------&lt;br /&gt;
--------------------------peer-annotations------------------------------&lt;br /&gt;
&lt;br /&gt;
To allow others to comment on the 1000 words version of your text, &lt;br /&gt;
we will work with embedded etherpads in the pages here on the wiki.&lt;br /&gt;
&lt;br /&gt;
To embed an etherpad in your page and allow peer-annotations:&lt;br /&gt;
&lt;br /&gt;
1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice. &lt;br /&gt;
2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page. &lt;br /&gt;
3. The etherpad should appear on the right side of the screen.&lt;br /&gt;
&lt;br /&gt;
NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.&lt;br /&gt;
&lt;br /&gt;
------------------------------------------------------------------------&lt;br /&gt;
-------------------------------------------------------------------- --&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Text coming soon&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow&amp;diff=438</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow&amp;diff=438"/>
		<updated>2022-12-20T14:08:13Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Replaced content with &amp;quot;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/div&amp;gt;  Text coming soon&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Text coming soon&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
	<entry>
		<id>http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow&amp;diff=437</id>
		<title>Toward a Minor Tech:Toward a Minor Tech:Crichlow</title>
		<link rel="alternate" type="text/html" href="http://cc.practices.tools/wiki/index.php?title=Toward_a_Minor_Tech:Toward_a_Minor_Tech:Crichlow&amp;diff=437"/>
		<updated>2022-12-20T13:43:08Z</updated>

		<summary type="html">&lt;p&gt;Crichlow: Created page with &amp;quot;&amp;lt;!-- ------------------------------------------------------------------- --------------------------peer-annotations------------------------------  To allow others to comment on the 1000 words version of your text,  we will work with embedded etherpads in the pages here on the wiki.  To embed an etherpad in your page and allow peer-annotations:  1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice.  2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page....&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;!-- -------------------------------------------------------------------&lt;br /&gt;
--------------------------peer-annotations------------------------------&lt;br /&gt;
&lt;br /&gt;
To allow others to comment on the 1000 words version of your text, &lt;br /&gt;
we will work with embedded etherpads in the pages here on the wiki.&lt;br /&gt;
&lt;br /&gt;
To embed an etherpad in your page and allow peer-annotations:&lt;br /&gt;
&lt;br /&gt;
1. Change the id=&amp;quot;&amp;quot; value from CHANGEME into an etherpad name of choice. &lt;br /&gt;
2. Scroll down and click &amp;quot;Save page&amp;quot; to save the page. &lt;br /&gt;
3. The etherpad should appear on the right side of the screen.&lt;br /&gt;
&lt;br /&gt;
NOTE: You cannot use spaces in the id=&amp;quot;&amp;quot; value.&lt;br /&gt;
&lt;br /&gt;
------------------------------------------------------------------------&lt;br /&gt;
-------------------------------------------------------------------- --&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;pad&amp;quot;&amp;gt;&amp;lt;eplite id=&amp;quot;CHANGEME&amp;quot; show-chat=&amp;quot;false&amp;quot; /&amp;gt;&amp;lt;/div&amp;gt;Text coming soon&lt;br /&gt;
[[Category:Toward a Minor Tech]]&lt;br /&gt;
[[Category:1000 words]]&lt;/div&gt;</summary>
		<author><name>Crichlow</name></author>
	</entry>
</feed>