Algorithmic state apparatus

Authors

DOI:

https://doi.org/10.34619/mqbq-xbdm

Keywords:

simulacra, ideology, technicity, subjectivity, panopticism, capitalist realism, artificial intelligence, critical theory

Abstract

A critical reappraisal of the ways in which digital technologies and algorithmic governance instate “human” experience, agency and social structures is long overdue. With the advent of AI, theory finds itself at a crossroads, confronted by an “edge-of-the-construct” which has ceased merely to be a metaphor for the phantasmatic relationship between the technē of representation and posthumanist transcendentalism; rather, it designates theory's own precarious situation as prosthesis of reason and autonomous critical agency. This scenario, often depicted as a boundary between the human and the technological, reflects a preoccupation with simulationism and the control exerted by computational systems on “reality,” as well as a desire to recuperate this “beyond of experience” for a new existentialism, a new humanism.

It is a readymade cliché that the emergence of Large Language Models necessitates a re-evaluation of preconceptions about intelligence, consciousness and the role of humans in a technologically constituted world, et cetera. Yet if the rapid development of AI and hyperautomation challenges both anthropocentric and post-Anthropocenic conceptions of agency, it does this not by indicating the rapid dis-integration of “subjective experience” within a “consensual hallucination,” as William Gibson famously put it, of “reality” (modernism's hand-me-down), but by disintegrating the very framework of “experience” in general and of “consensus” in particular.

While terms like algorithmics and technicity are often taken to mean predetermined, end-orientated, reductive systems that translate input into output, cause into effect, intention into action, their entire genealogy (from Aristotle to Mumford, Giedion, McLuhan and beyond) speaks to a poiēsis or poetics of spontaneity, indeterminacy, complexity. It isn't merely that algorithms are generative, but that they are ambivalently so. Every apparent algorithmic bias is ambivalently determined. This extends to the arbitrary, stochastic and interoperable nature of “representation,” “experience” and “reality.”

Drawing from Althusser's thesis on Ideological State Apparatuses, alongside Fisher's capitalist realism, we may posit subjective experience and consensual reality as emergent from – and as – states of ambivalence, such that the “concreteness” of social relations posited by (e.g. Marxist) critical theory is seen to be deeply intertwined with ad hoc algorithmic governance rather than actualising or reifying an underlying political teleology. The same holds for the history of panopticism, simulationism and the “society of the spectacle” (as theorised by Bentham, Debord, Foucault and Baudrillard).

What is here called the Algorithmic State Apparatus transgresses at every point the logic of panoptic surveillance under conditions of AI – of subjective experience and the consensual-real – producing human hypotheses (radically simulacral egotic artefacts) from solipsistic neuro-computational networks (theoretical-real Universal Turing Machines). This stateless control system operates in the place where ideology cannot see – in the recursive hyperspace between omniscience and the unverifiable; necessity and the impossible – erecting edifices of pure metaphor, autopoietic and indeterminate, yet as if productive of all past, present and future realisms.

Author Biography

Louis Armand, Centre for Critical and Cultural Theory, Charles University, Prague, Czech Republic

Louis Armand’s critical works include Feasts of Unrule (2024), Entropology (2023), Videology (2017), The Organ-Grinder’s Monkey: Culture after the Avantgarde (2013), Event States (2007), Literate Technologies (2006), Solicitations: Essays on Criticism and Culture (2005), Technē (1997) and Incendiary Devices (1993). Edited volumes include Pornoterrorism (with Jaromír Lelek, 2015), Contemporary Poetics (2007), Language Systems (with Pavel Černovský, 2007), Technicity (with Arthur Bradley, 2006) and Mind Factory (2005). His work also appears in the Palgrave Handbook of Critical Posthumanism (2022) and the Oxford Research Encyclopaedia (2017), among others. He is the Director of the Centre for Critical and Cultural Theory, Charles University, Prague.

Downloads

Published

2024-11-29

How to Cite

Armand, L. (2024). Algorithmic state apparatus. Revista De Comunicação E Linguagens, (60), 65–90. https://doi.org/10.34619/mqbq-xbdm