Claiming distance. User subjectivity in data-driven environments and trans*feminist imaginations of doing otherwise
Abstract
In recent years, critical data studies has brought forward a body of work analysing the ramifications of the data-driven digital environments that structure a large part of our digital practice today. However, the position of the user has been little developed in this field, especially with regard to questions of subjectivity and agency.
In this paper, I discuss the ways digital infrastructures create user subjectivity on different levels of technology. I start from the observation of a crisis in user subjectivity that manifested in the migratory waves from Twitter to Mastodon at the end of 2022 and that highlights the role of infrastructure for user subjectivity. The return of the server with Mastodon stands for a relational subjectivity that foregrounds connections and communities, rather than the abstract, autonomous identity of the user of cloud-based services. From there I examine the broader contemporary neoliberal subjectivity, which is deeply shaped by an understanding of identity construction as part of consumer culture. I then contrast this with the empty subject that is projected by data regimes themselves. I argue that this new governmentality of data radically extends the neoliberal subjectivity of disconnectedness. I reclaim this space of distance as a field for cultural practices of resistance and alternative ways of doing technology. Through a small case study of Feminist Servers, I discuss artistic and activist practices of articulating relationality to technology both conceptually and practically, as lived practice.
Tags
subjectivity, data, infrastructure, practice, Feminist Servers, Trans*Feminist Servers, platforms, surveillance, social media, Twitter, Mastodon, feminism
Introduction
Critical Data Studies is a new field in Media Studies that has developed a substantial body of work on the cultural and political ramifications of data-driven environments (Boyd and Crawford; Iliadis and Russo). It has raised important questions about flaws and bias in data (Eubanks), how data-driven systems enhance inequality (O’Neil), extend colonial modes of exploitation and thingification (Couldry and Mejias), and install new forms of discrimination (Benjamin). But the user position remains underdeveloped in this field, even though the ever-growing body of circulating data is collected from users’ interactions with digital platforms.
The user is conceptualised in another emerging field, which I call “user studies”: a body of work in anthropology concerned especially with how users make sense of algorithms (Siles et al.; Bucher; Rader and Gray; Devendorf and Goodman). These concepts articulate technology not as essentialist, independent artefacts, but as something that is created through shared praxis, as culture (Seaver). These studies are an important contribution to understanding the position users hold in the contemporary data-driven digital world.
However, through their focus on users as individuals, they often fail to address the political dimensions. User subjectivity is a cultural form – a shared imagination of what technological practice we mean when we talk about users. This imagination is deeply political, because it is not only a bottom-up sense-making process, as investigated by user studies; subjectivity is precisely the place where the social – including the power relations in technology analysed by Critical Data Studies – is inscribed in the self-understanding of users.
How, then, is the subjectivity of users structured by their contemporary environment of data-driven platforms? How can we think of subjectivity both as a place of being affected and as a place of claiming agency? And what can we learn from artistic-activist practice about addressing subjectivity and opening space for action, for being a user differently?
The Twitter crisis
After Elon Musk bought Twitter at the end of October 2022, people started discussing alternatives. One of these alternatives was Mastodon, a micro-blogging service like Twitter. In contrast to Twitter, Mastodon is not corporate-owned. It is a network of connected servers, often run by small collectives and non-profit organisations. After the acquisition of Twitter by Musk, and during every wave of policy change that followed it, the Mastodon network saw waves of new registrations. In little more than three months, the Mastodon network grew from 4.5 to 9 million users and, more significantly, from 3,700 to 17,000 servers1. By contrast, Twitter has 238 million users2, so even with the steady growth of Mastodon’s user count, changing from Twitter to Mastodon is still a movement through technological scale.
And from the users’ side, it was often experienced as a crisis of subjectivity:
Fig 1: Screenshot of a Twitter post by a friend of mine, saying that “as long as the alternatives (to put it pointedly) are ‘from nerds for nerds’, this discussion is of little use. […]” (my translation)
It is important to understand that this is not only a personal crisis. My friend is articulating here that he is not a nerd, and that Mastodon is therefore not for him – but this is not only about him. It is also about the cultural form of the user being different from that of the nerd.
Subject positions
Both the general user and the nerd are subject positions of technological practice. Subject positions themselves are cultural imaginations (Goriunova, ‘Uploading Our Libraries: The Subjects of Art and Knowledge Commons’). They are role models or figurations, and they offer a position in the world from which to make sense of it. They are not the same as individual subjectivity; they are shared and work as a background against which individual subjectivity can be developed. Subject positions are articulated in the cultural domain. As Goriunova insists, they are also aesthetic positions in the sense that they formulate a position from which practice is possible. And of course, they are shaped by practice and by the communities that are formed through practices. Goriunova has exemplified this for very specific practices at the intersection of commons and digital activist/artistic practices (Goriunova, ‘Uploading Our Libraries: The Subjects of Art and Knowledge Commons’), but the principle of the interconnectedness of practice and subjectivity also applies to the more general field of everyday use.
The return of the server: infrastructure and subjectivity
One aspect of the crisis of subjectivity is the return of the server. Scale as a numerical quantifier is an important aspect of platforms, because they thrive on network effects: the more numerous their users, the more valuable the platform is for everybody (Srnicek 45). But as already mentioned, the difference between Twitter and Mastodon is not only one of numerical scale in user count, but foremost one of organisation on an infrastructural level. Twitter operates as a centralised platform: it is a unified service accessed through an app, and its data and processes are located in the cloud. Mastodon, however, runs on a decentralised network of federated3 servers that are connected by a shared protocol.
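To make this infrastructural difference concrete, the following sketch is my own illustration rather than something drawn from the sources discussed here: the servers federate via the ActivityPub protocol, and each of them also answers the same public client API. The instance domains below are arbitrary examples; what each server returns depends on the community operating it.

```python
# Illustrative sketch: the same Mastodon client API endpoint is served by
# many independently operated servers. The instance domains are examples only.
import requests

INSTANCES = ["mastodon.social", "fosstodon.org"]  # example servers; any reachable instance would do

for domain in INSTANCES:
    # Every Mastodon server exposes the same public timeline endpoint;
    # the posts it returns reflect the community running that particular server.
    url = f"https://{domain}/api/v1/timelines/public"
    response = requests.get(url, params={"local": "true", "limit": 3}, timeout=10)
    response.raise_for_status()
    for status in response.json():
        print(domain, "->", status["account"]["acct"], status["created_at"])
```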
Of course, technically speaking, big technology and the cloud also operate on servers. Servers are still the main nodes in the infrastructure of the internet: it is on servers that data is stored and user requests are processed. But servers have been abstracted away in order to make technical systems scalable (Monroe). This is another step in the chain of abstractions on which our digital infrastructure is built. Through this abstraction in the service of scalability, servers have disappeared from the view of users. And with them, a contextual and materialist understanding of digital infrastructure – one that includes specific machines, local contexts and a diversity of practices – has given way to immaterial services and apps. Servers have been replaced with the cloud, a metaphor suggesting quite the opposite of the massive, energy-hungry data centers that power large-scale digital infrastructure. Thus, in the age of cloud computing, we simply cannot know how many servers Twitter is running on.
This return of the server happens very prominently at the first step of Mastodon’s signup process. Here, Mastodon asks users to pick a server, and with it a specific context to join. In order to answer this, users need to identify themselves in a different way than on big technology platforms. Signing up to a commercial platform asks you to identify yourself as a classical autonomous (self-contained) liberal individual. In contrast, the signup process of Mastodon asks you to choose a server, which means identifying yourself in relation to a community.
My friend’s interpretation of this request as nerdy points to an understanding of servers as lying outside of user subjectivity, as technological artefacts that belong to the nerd. But it also points to something deeper: as the signup process of Twitter indicates, contemporary user subjectivity is closely aligned with the liberal subject, which refers to an autonomous self. The liberal autonomous self is also a subject position, a more general one, which serves as a blueprint for user subjectivity. And as the Marxist philosopher Louis Althusser explained in the 1960s, the social and political implications of subjectivity are continuously actualised in individuals through a process called ‘interpellation’ by what he called institutions – in our case, infrastructure (Althusser et al.). This means that subjectivity is never only personal or interior; rather, the personal, the psychological and the individual are deeply linked to the world and its social, economic, political and cultural formations. Althusser called this fundamental link between subjectivity and the world ideology. In this understanding, ideology is not a qualitative judgement; it simply states that our conception of the world and our actions are structured by forces outside of individual agency.
Identity construction, validation and agency: the neoliberal subjectivity
The signup process is a specific moment in contemporary user practice, and it provides a privileged moment for analysis. However, it is not the only moment of technological interpellation. After the signup process has been completed, Twitter and Mastodon still maintain some differences, but they work quite similarly. This similarity is what made Mastodon a viable alternative to Twitter in the first place.
The most direct form of user interpellation on social media platforms is how they invite users to contribute. Most platforms use questions for this: Twitter asks on its web interface “What’s new?”, exactly the same words as the Mastodon client Tusky. The Elk frontend for Mastodon puts it slightly differently: “What’s going through your mind?”; the Twitter app asks “What is happening right now?”; and Facebook even adds a personalised identifier when asking “What are you doing right now, Shusha?”4. As the anthropologist Ignacio Siles has observed, these are forms of ‘algorithmic interpellation’, and as such part of a personalisation process between users and platforms (Siles). Users, in turn, personify platforms and algorithms too; for example, they hold mental images of a platform’s character. This call and answer is part of a personal relationship between user and platform. Platforms engineer and maintain these relationships actively and consciously, in various forms and intensities. But as my examples of engagement prompts above show, this relationship management is present on commercial platforms and alternative platforms like Mastodon alike.
This observation indicates that user subjectivity includes more layers than the interpellation through infrastructural topology discussed above. Users are continuously mobilised by their digital infrastructures into specific subject positions. But digital infrastructures do not operate in a void; they are part of a wider social, economic, political and cultural situation that implies a broader notion of subjectivity itself. As the sociologist Sarah Banet-Weiser has outlined in her extensive study of brand culture, this is, broadly speaking, the ‘consumer citizen’, the basic capitalist subjectivity per se (Banet-Weiser).
The consumer citizen enacts fundamental values through consumption. Banet-Weiser traces this back to the post-war era in the US, with the proliferation of mass production and mass consumption. Since then, markets have shifted towards addressing ever smaller niches and, in doing so, have incorporated identity and difference as part of consumer subjectivity. Today, “advanced capitalist dynamics manage, contain, and actually design identities, difference, and diversity as brands” (Banet-Weiser 52). The relationship between brands (and in our case platforms) and consumers is no longer structured by consumption, but by identity construction and validation. This includes new forms of consumer practice that are in fact unrecognised labour.
The neoliberal subject is an ‘entrepreneur of the self’. It inherits autonomy and rationality from the classical liberal subject. But as feminist scholarship has shown through the study of changes in the representation of femininity in marketing (Banet-Weiser), or in new stereotypes (Gill), the neoliberal subject incorporates a focus on agency, free choice and self-regulation that extends to self-surveillance.
We can clearly see how well the neoliberal ‘entrepreneur of the self’ maps onto the user of platforms, where identity is created and sustained through continuous acts of self-narration, embedded in a market of attention (Citton) and organised by metrics that feed back into self-validation, turning into self-surveillance and thus inducing pressure for ever-continuing self-improvement (Grosser).
The focus on individual agency in neoliberal subjectivity casts structural problems as individual failures. We can see this in the public discourse about the ramifications of digital technologies. It is the individual who is asked to train herself to spot the fake, to curate the feed, to master the tools, to block the cookies, to fight addiction, and so on. Neoliberalism “is structured by an individualism that has almost entirely replaced notions of the social or political” (Gill 443). We can also see this neoliberal subjectivity in educational initiatives promoting digital self-defense and digital detox, which hail the user as a self-regulating, autonomous individual. To be clear, I am deeply grateful for these initiatives and use them regularly in my teaching. They are all we have at the moment, and many of them manage to clearly articulate the structural dimensions of the problems they raise. However, they too address the user as the neoliberal subject.
Even if Twitter and Mastodon embody neoliberal subjectivity in the ways they call their users into action, there are still important differences in the terrain on which the two platforms operate. Apart from the infrastructural dimension, the biggest difference between Mastodon and Twitter is the role data plays for them. Twitter is data-driven in the sense that all the data collected about a user, her interests and her patterns of interaction is used to calculate what she gets to see. This includes advertisements, but also the posts displayed in the feed. On Mastodon, by contrast, there are no advertisements, and the feed is composed in chronological order, filtered through an elaborate system of relatedness5, but basically static. This means that the appearance of posts can be explained by the relations and actions of users, and not by a black-box algorithm.
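The difference can be sketched schematically. The following fragment is a deliberately simplified illustration of my own – the fields and weights are hypothetical and do not represent Twitter’s actual ranking system – contrasting a chronological feed, whose order can be read off the posts and follow relations themselves, with an algorithmically ranked feed, whose order depends on opaque, data-derived predictions.

```python
# Schematic illustration: chronological ordering vs. ranking by a
# profiling-derived engagement prediction (all values are invented).
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    created_at: float            # Unix timestamp of posting
    predicted_engagement: float  # hypothetical score derived from collected user data

posts = [
    Post("alice", 1700000300, 0.12),
    Post("bob",   1700000100, 0.87),
    Post("carol", 1700000200, 0.45),
]

# Chronological feed (Mastodon-style): explainable by relations and posting time alone.
chronological = sorted(posts, key=lambda p: p.created_at, reverse=True)

# Algorithmic feed: the ordering depends on opaque, data-derived predictions.
algorithmic = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])  # ['alice', 'carol', 'bob']
print([p.author for p in algorithmic])    # ['bob', 'carol', 'alice']
```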
The empty subject of data
Social media platforms like Twitter, Facebook and Instagram have developed as part of an emerging regime of data. This regime is much more extensive than social media platforms, but social media platforms have played an important role in its formation and in the normalisation of user tracking, profiling and the operationalisation of categorisation into surveillance (Zuboff). As Wendy Chun observes, the production and maintenance of ‘authentic’ relationships between platform and users is fundamental to the data regime. It is through authenticity that we become transparent. To be a user, for her, is to play a specific role “in a drama called ‘big data’” (Chun 145). This play is scripted, and as we have seen in the discussion of prompts, the role of the user is to perform authenticity. Should a user ever forget her role, there are cues clear enough to bring her back into the play.
The data collected through tracking, surveillance or voluntary disclosure, however, is operated on not as a representation of its source, but as a proxy for open questions. A proxy is a stand-in for something that cannot be measured directly (O’Neil). Thus, proxies make the unknown computable. But in doing so, they “introduce uncertainty, even as they serve to reduce it” (Chun 136). The data about your browsing history does not mean your browsing history; it is taken as an indicator of personality, age, gender, interests, economic situation and many more, often secret, categories. This uncertainty is also the space where bias is located. The recorded traces we leave thus take on a life of their own.
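How a proxy operates can be illustrated with a deliberately crude sketch. The keywords, categories and confidence values below are invented for illustration and do not describe any actual profiling system; the point is only that raw traces are mapped onto inferred attributes, and that this mapping is exactly where the uncertainty – and the bias – enters.

```python
# A deliberately crude sketch of proxy logic; all rules and values are hypothetical.
BROWSING_HISTORY = [
    "pregnancy-forum.example",
    "discount-coupons.example",
    "job-board.example",
]

# Hypothetical proxy rules of the kind used in ad profiling:
# a trace stands in for an attribute that was never measured directly.
PROXY_RULES = {
    "pregnancy": ("expecting parent", 0.7),
    "discount":  ("low income", 0.4),
    "job-board": ("job seeker", 0.6),
}

def infer_profile(history):
    """Map raw traces onto inferred categories; the result is plausible, actionable and possibly wrong."""
    profile = {}
    for url in history:
        for keyword, (category, confidence) in PROXY_RULES.items():
            if keyword in url:
                profile[category] = max(profile.get(category, 0.0), confidence)
    return profile

print(infer_profile(BROWSING_HISTORY))
# {'expecting parent': 0.7, 'low income': 0.4, 'job seeker': 0.6}
```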
Goriunova discusses this under the term ‘digital subject’: “neither a human being nor its representation [in data] but a distance between the two” (Goriunova, ‘The Digital Subject: People as Data as Persons’ 128; clarification in brackets mine). For her, the relation between humans and their data is not representational, but organised through a ‘distance’ that does not extend between fixed points but is subject to constant change (Goriunova, ‘The Digital Subject: People as Data as Persons’). Distance can be used as another way of describing the uncertainty space of the proxy, and it is through distance that data projects its own subject.
For the legal philosopher Antoinette Rouvroy, the subject of data is fundamentally empty. This is because data never addresses the subject directly, but only on an infra-personal level of collected data points (fragmentary) and on a supra-individual level of categorisation (categorial). In data, meaning-making is reduced to an automatic process of statistical analysis. Thus, the only subject data knows is a statistical body, a “‘probabilistic subject’ [which] is not the same as the actual, experiential, present and sentient subject” (Rouvroy 151). Because data collection is unspecific towards its application, it is not methodical (that is, not meaningfully oriented towards knowledge), but operates purely on a numerical scale, where more data means better data.
Big data constitutes its own reality, which is the basis for the creation of a new probabilistic subject devoid of all meaningful relations to its point of origin. This replacement of the specific in favour of the statistical is also an important factor in AI applications such as large language models (LLMs) like ChatGPT and automatic image generation systems like DALL-E (Bender et al.; Salvaggio; Salehi).
If we think back to the discussion of the neoliberal subjectivity of the ‘entrepreneur of the self’, we can see how well it connects to the behaviourist nature of the data regime. A behaviourist perspective is not interested in reasoning, sense-making processes, affects or motivation, only in behaviour – observable acts. This radically extends into data the disconnection from context of the autonomous, rational and calculating liberal subject, with new consequences. As Rouvroy points out, data organises the world not by cause and effect, but by controlling contingency. Data controls which posts show up in your Instagram feed, what information is available to you and which relations are sustained through it, but it also decides upon access to, and the conditions of, services and resources. Through this control of contingency, data behaviourism intervenes in people’s faculty to make choices; it prescribes the very conditions of agency (Rouvroy). This is a crucial part of an ongoing transformation of subjectivity, which Rouvroy describes as a process of draining and which has been described through literary references as ‘zombiefication’ (Andersen and Pold). With the emptying of the subject through data comes a destabilisation of the subjectivity of the user: the feeling of being seen but not meant. On a governmental level, this is often the other way around: being meant but not seen.
The behaviourist understanding of humans is the opposite of a perspective that considers subjectivity. Subjectivity is the space where “the social or cultural ‘gets inside’, and transforms and reshapes our relationships to ourselves and others” (Gill 433). Considering subjectivity thus means insisting on context, thick meaning and relations. Asking about subjectivity in data regimes means claiming distance – this twisted and thick relation between persons and data – as political terrain, as Goriunova also highlights (Goriunova, ‘The Digital Subject: People as Data as Persons’). And claiming distance means asking about other options and ways of living with our data subjects; it is a struggle over meaning and agency.
Being user otherwise: Feminist Servers
Insisting on thinking through subjectivity and usership is thus part of the struggle over subjectivity and the political subject, and it raises questions of agency. How can the project of subjectivity become something to actively engage in? In short: is it possible to use otherwise?
One example of thinking differently about how subject positions are invoked through technology is formulated in A Wishlist for Trans*Feminist Servers (A Wishlist for Trans*Feminist Servers). This is an actualisation of an older text, the Feminist Server Manifesto (Constant); both were written by a “community of people interested in digital discomfort”, as the Wishlist puts it.
Both the Manifesto and the Wishlist choose the server as their protagonist, in the form of a self-articulation. A protagonist is what Goriunova calls a figure of thought that offers “a position from which a territory can be mapped and creatively produced” (Goriunova, ‘Uploading Our Libraries: The Subjects of Art and Knowledge Commons’). Through this self-articulation, the Trans*Feminist Servers produce a different imagination of technology, including the role of the user.
At the centre of this articulation are questions of servitude. “Are you being served?” is the title of the publication documenting the worksession of the same name that took place in Brussels in 2014 (Hofmüller et al.). During a three-day event at Constant, an artist-run space in Brussels6, artists and practitioners met to discuss concepts and exchange alternative practices involving servers. Who is being served, and what are the conditions of service?
The question of servitude challenges notions of use-fullness and use-ability, with their focus on functionality, efficiency and scalability, which are markers of big technology’s abstraction. I have argued elsewhere that this constitutes an active refusal of the ideological function of infrastructure, leading to an exchange of functionality for communality and opening up technological practice into a space to be inhabited (Niederberger).
The change of perspective offered by taking the server as a protagonist also sheds light on big technologies and the subjectivities they invoke: while users should be on the receiving end of servitude, they are so only under very specific conditions marked by privilege. The chances of being served are not equally distributed, and vulnerable communities often find themselves and their content unprotected by consumer technology platforms.
And this is one of the reasons why Feminist Servers exist as communities and real, existing infrastructures7: a real need to protect themselves, their content and their communication, and to create safer spaces online for vulnerable communities (spideralex). Trans*Feminist Servers are thus both a thinking tool and a lived technological practice (Snelting and spideralex).
Through their reformulation, Trans*Feminist Servers offer relationships of care instead of extraction. This foregrounds practices of care: administration, maintenance, moderation (meaning the whole work of making a community work), documentation and, last but not least, using the services and giving feedback on functionality. The wiki of one of the feminist servers names the roles included in Feminist Server practice as “Guardians, fire extinguisher, interfaces and scribes” (anarchaserver).
Careful relationality is – as we have seen – something that is denied to the liberal subject. As both texts insist, (Trans*)Feminist Servers exist only because they are cared for by a community, because the need for them is expressed in acts of making them exist. The territory offered by Trans*Feminist Servers is thus structured by affection, not extraction.
Both the narrative work of the Manifesto and the Wishlist and the lived work on communities and infrastructure can thus be seen as a practice of claiming distance. This practice is radical in the sense that it re-articulates the whole territory anew – both conceptually, through the protagonist of the server, and practically. Distance itself is also re-articulated through this practice as a more general relationship to technology, independent of data.
Thus, being part of a Trans*Feminist Server means partaking in an ongoing negotiation of the conditions of serving and service. Use here is not an act of consumption, but one of creation and re-creation that includes the whole terrain of relationality with and through technology.
Conclusion
Data-driven platforms engage users simultaneously as customers (of a service), producers (of data), targets (of surveillance) and raw material (of statistical knowledge). User subjectivity is shaped through all of these aspects. I have discussed the interpellation through infrastructural aspects such as the server using the example of Mastodon in contrast to Twitter. But on the level of the interface, platform users are hailed as subjects of neoliberal capitalism who enact central values like self-determination and free choice as part of user practice. The practice of users is part of what I have called relationship management by platforms, which mobilise authenticity as a way of creating valuable data. This is an aspect of the empty subject put forward by data regimes. Data regimes do not operate through representation, but through proxies and distance. Thus, data regimes create truth not about the world, but out of data. This truth is structured by correlation and statistics. But data regimes are not only detached from the lived reality of people; they intervene in lives through feedback based on categorisation and probability, controlling the conditions of existence indirectly and thus limiting people’s faculty to make choices, their agency.
Because subjectivity can be seen as a hinge between the personal and the structural, the individual and the shared, it opens possibilities of agency that do not reproduce the patterns of exploitation. Insisting on subjectivity, I argue, opens a space for reclaiming the distance between the reality people live in and data. To this end, I have discussed Trans*Feminist Servers as narrative work that is able to re-articulate a territory of technological relationality through the figure of the server as a protagonist. The territory thus articulated is structured by relations of care, not extraction. Because Feminist Servers also exist as real infrastructures and communities, this re-articulation is not only a conceptual tool of critique, but also a shared, lived reality.
1 User Count Bot for all known Mastodon instances @mastodonusercount@mastodon.social
2 https://www.businessofapps.com/data/twitter-statistics/
3 "Federation is a concept derived from political theory in which various actors that constitute a network decide to cooperate collectively. Power and responsibility are distributed as they do so. In the context of social media, federated networks exist as different communities on different servers that can interoperate with each other, rather than existing as a single software or single platform.” (Mansoux and Abbing 125)
4 I collected these examples on my personal devices (Android mobile phone and MacBook), running a German-language operating system. The original wording in German is: “Was gibt’s Neues?” (Twitter and Tusky), “Was geht dir gerade durch den Kopf?” (Elk) and “Was machst du gerade, Shusha?” (Facebook). All translations mine.
5 A short explanation can be found in the section on timelines at https://techcrunch.com/2022/11/08/what-is-mastodon/. For a more in-depth discussion, see (Mansoux and Abbing).
6 https://constantvzw.org/site/?lang=en
7 Several feminist servers exist; a list can be found in the history of one of them (anarchaserver): https://alexandria.anarchaserver.org/index.php/History_of_Anarchaserver_and_Feminists_Servers_visit_this_section
References
A Wishlist for Trans*Feminist Servers. 2022, https://etherpad.mur.at/p/tfs.
Althusser, Louis, et al. On The Reproduction Of Capitalism: Ideology And Ideological State Apparatuses. Verso, 2014.
anarchaserver. Be a Guardian, a Fire Extinguisher, a Scriba, an Interface - Anarchaserver. 2022, https://alexandria.anarchaserver.org/index.php/Be_a_guardian,_a_fire_extinguisher,_a_scriba,_an_interface.
Andersen, Christian Ulrik, and Søren Bro Pold. ‘The User as a Character, Narratives of Datafied Platforms’. Computational Culture, no. 8, July 2021, http://computationalculture.net/the-user-as-a-character-narratives-of-datafied-platforms/.
Banet-Weiser, Sarah. Authentic TM: The Politics of Ambivalence in a Brand Culture. NYU Press, 2012.
Bender, Emily M., et al. ‘On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜’. Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, Association for Computing Machinery, 2021, pp. 610–23. ACM Digital Library, https://doi.org/10.1145/3442188.3445922.
Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019.
Boyd, Danah, and Kate Crawford. ‘CRITICAL QUESTIONS FOR BIG DATA: Provocations for a Cultural, Technological, and Scholarly Phenomenon.’ Information, Communication & Society, vol. 15, no. 5, 2012, pp. 662–79, https://doi.org/10.1080/1369118X.2012.678878.
Bucher, Taina. ‘The Algorithmic Imaginary: Exploring the Ordinary Affects of Facebook Algorithms’. Information, Communication & Society, vol. 20, no. 1, Jan. 2017, pp. 30–44. Taylor and Francis+NEJM, https://doi.org/10.1080/1369118X.2016.1154086.
Chun, Wendy Hui Kyong. Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition. MIT Press, 2021.
Citton, Yves. The Ecology of Attention. Polity Press, 2016.
Constant. [Version 0.1] A Feminist Server. 3 June 2014, https://transhackfeminist.noblogs.org/post/2014/06/03/version-0-1-a-feminist-server-constantvzw/.
Couldry, Nick, and Ulises Mejias. The Cost of Connection. How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press, 2019.
Devendorf, Laura, and Elizabeth Goodman. The Algorithm Multiple, the Algorithm Material: Reconstructing Creative Practice. The Contours of Algorithmic Life, UC Davis. https://www.slideshare.net/egoodman/the-algorithm-multiple-the-algorithm-material-reconstructing-creative-practice
Eubanks, Virginia. Automating Inequality. How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press, 2018.
Gill, Rosalind. ‘Culture and Subjectivity in Neoliberal and Postfeminist Times’. Subjectivity, vol. 25, 2008, pp. 432–45, https://doi.org/10.1057/sub.2008.28.
Goriunova, Olga. ‘The Digital Subject: People as Data as Persons’. Theory, Culture and Society, vol. 36, no. 6, 2019.
---. ‘Uploading Our Libraries: The Subjects of Art and Knowledge Commons’. Aesthetics of The Commons, edited by Felix Stalder et al., Diaphanes, 2021.
Grosser, Benjamin. ‘What Do Metrics Want? How Quantification Prescribes Social Interaction on Facebook’. Computational Culture, no. 4, Nov. 2014. http://computationalculture.net/what-do-metrics-want/.
Hofmüller, Reni, et al. Are You Being Served? (Notebooks). Constant, 2014, https://areyoubeingserved.constantvzw.org/AreYouBeingServed.pdf.
Iliadis, Andrew, and Federica Russo. ‘Critical Data Studies: An Introduction’. Big Data & Society, no. 1–7, 2016, https://doi.org/10.1177/2053951716674238.
Mansoux, Aymeric, and Roel Roscam Abbing. ‘Seven Theses on the Fediverse and the Becoming of FLOSS’. The Eternal Network. The Ends and Becomings of Network Culture, edited by Kristoffer Gansing and Inga Luchs, Institute of Network Cultures / transmediale e.V., 2020, pp. 125–40.
Monroe, Dwayn. ‘Seeding the Cloud’. Logic, no. 16, 2022, pp. 91–102. https://logicmag.io/clouds/seeding-the-cloud/
Niederberger, Shusha. ‘Feminist Server – Visibility and Functionality. Digital Infrastructure as a Common Project’. Springerin | Hefte Für Gegenwartskunst, no. 4, 2019, pp. 8–9. https://springerin.at/en/2019/4/feminist-server-sichtbarkeit-und-funktionalitat/
O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown, 2016.
Rader, Emilee, and Rebecca Gray. ‘Understanding User Beliefs About Algorithmic Curation in the Facebook News Feed’. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Association for Computing Machinery, 2015, pp. 173–82. ACM Digital Library, https://doi.org/10.1145/2702123.2702174.
Rouvroy, Antoinette. ‘The End(s) of Critique: Data-Behaviourism vs. Due-Process’. Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, edited by Katja De Vries and Mireille Hildebrandt, Routledge, 2013, pp. 143–65.
Salehi, Niloufar. ‘I Tried out SyntheticUsers, so You Don’t Have To’. Niloufar’s Substack, 8 Apr. 2023, https://niloufars.substack.com/p/i-tried-out-syntheticusers-so-you.
Salvaggio, Eryk. ‘How to Read an AI Image’. Cybernetic Forests, 2 Oct. 2022, https://cyberneticforests.substack.com/p/how-to-read-an-ai-image.
Seaver, Nick. ‘Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems’. Big Data & Society, vol. 4, no. 2, Dec. 2017, p. 2053951717738104. SAGE Journals, https://doi.org/10.1177/2053951717738104.
Siles, Ignacio. Datafication as Culture. Living with Algorithms in Latin America. https://www.youtube.com/watch?v=YDibmqbGTsU. online.
---. ‘The Mutual Domestication of Users and Algorithmic Recommendations on Netflix’. Communication, Culture & Critique, vol. 12, no. 4, 2019, pp. 499–518.
Snelting, Femke, and spideralex. Forms of Ongoingness. Interview by Cornelia Sollfrank, Video, Transskript, 16 Sept. 2018, http://creatingcommons.zhdk.ch/forms-of-ongoingness/.
spideralex. ‘CREATING NEW WORLDS with Cyberfeminist Ideas and Practices’. Beautiful Warriors. Technofeminist Praxis in the Twenty-First Century, edited by Cornelia Sollfrank, Minor Compositions, 2020, pp. 35–56.
Srnicek, Nick. Platform Capitalism. Polity Press, 2016.
Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.