Words of Welcome
What does the old Nazi-era Tempelhof airport have to do with current machine learning datasets? What does intelligence as common sense have to do with common sensing? In the panel discussion that opens AI Anarchies, Hito Steyerl and Alex Hanna trace histories of forgotten possibilities for anti-fascist technologies, and assess the present conditions of determinism embedded in AI. Working with Saidiya Hartman’s notion of “critical fabulation”, Steyerl and Hanna discuss how we might refuse, reject, and re-imagine the sociotechnical systems we have inherited.
Hito Steyerl is a filmmaker and writer, based in Berlin. Alex Hanna is director of research at the Distributed AI Research Institute (DAIR), based in California.
Rhythmicity & Desire
Allami’s ongoing research interrogates the processes of creating experimental music rooted in one’s culture and individuality. Extending his recent work on tuning and the inherited biases of modern sonic technologies, this unique improvised live performance focuses on rhythm and explores the real-time potential of custom machine learning with audio datasets, inspired by the cross-pollination of Arabic music fundamentals with experimental compositional techniques, instrumentation and technologies.
We open up the Autumn School with a tech check ‒ an acknowledgment of the constraints of AI’s frames, while keeping an eye on feminist philosophies of technology and activism for the digital world. Is it possible to imagine ourselves outside the logics set by tech, from network ontologies to the ‘dirty’ logics of machine learning-based societies, and the spectre of Web3? How does AI’s culture and computation serve as a prompt for thinking about technology, ethics, and anarchies? And what kind of frame do we want to place around anarchies in this moment?
Wildflowers and Broken Machines: A Tech Check for Feminist Philosophies and Politics
Jac sm Kee and Sarah Sharma have been circling around each other for some time through their varied engagements with feminism and technologies: activism, academia, internet governance policy, theories of Luddism and tech-determinism. Having interacted online for a few weeks before arriving in Berlin, they meet in person for the first time to kick off the first day of AI Anarchies with something like a third-date feeling: relaxed, curious, and perhaps restless. Will they find a way to resolve the different, and possibly convergent, terms between feminist philosophies of technology and feminist activism for the digital world? Their conversation explores whether it is possible to imagine ourselves outside the logics set by tech, and what this might mean for both feminist politics and digital life.
Jac sm Kee is a feminist internet activist, researcher, and design archivist from Kuala Lumpur. Sarah Sharma is associate professor and director of the ICCIT at the University of Toronto.
Recording of the talk can be watched here: https://vimeo.com/akademiederkuenste/autumnschool1410.
A Dub Approach to Machine Listening
Dub is a technique, method, and an approach to musical and sonic production that dwells on the porous boundaries between de- and recomposition of a given signal. Dub’s origins are the technological leftovers of ‟post-colonial” Jamaica. It is far from being, as the standard story goes, a ‟happy accident,” a glitch borne of a studio mistake. Instead, it is a bodily approach to sound that (re-)centres listening in and with time, defying the constraints of linear, colonial knowledge. To paraphrase Frantz Fanon, Dub introduces invention to existence.
In this workshop, we will approach Dub as a playful, facetious alternative pathway for a material inquiry of machine listening. Please bring along your ears, a desire to listen, and a couple of quotes from your favourite thinker, storyteller, or songwriter.
In this half-day workshop, participants will be introduced to the practice of installing, maintaining, and running their own websites and services, also known as “self-hosting.” Using the open-source YunoHost platform, we will walk through the process of setting up a web server, complete with webmail, cloud, and workflow services capable of supporting 100+ users. Participants will also practise the basics of the UNIX command line, so that they can securely log into their server and administer it regardless of their physical location. It takes just one person in a community to give the gift of high-quality, low-carbon Internet infrastructure ‒ to free yourself and others from centralised and privacy-eroding services!
The legacy of anti-computing has laid the groundwork for crucial moments of dissent in response to the logic and technique of computational technologies. Anti-computing has given rise to enticing models outside Silicon Valley frames: feminist computing, decolonial computing, concepts of non-binary computing, methods for queering computing. On Day Two, we open with provocations around current models of machine learning ‒ and their relationship to political and social hierarchies, both those that are evident and those that are harder to see. How do we understand the alienation and divisions created by algorithmic practice today as the logical continuance of a long history of actuarial science in service of the state? As not new, but old? Over the course of the day, how can we create environments for thought experiments around alternative paradigms for computing and technologies, using the revision of machine learning as an exemplary case study?
Anti-Computing: Models and Thought Experiments
Ramon Amaro’s and Jackie Wang’s respective practices as thinkers and writers have guided many of us into the break: a break where we can begin to consider the possibility of what Caroline Bassett calls “anti-computing”, the histories of dissent against computational logics, utopias, and imaginaries. In his research, study, and forthcoming book The Black Technical Object: On Machine Learning and the Aspiration of Black Being, Amaro delves into the gaps in the relationship between the black experience, machine learning research, and the racial history of scientific explanation. Wang, across books, poems, performance, and scholarship, traces an understanding of carceral politics and predictive logic, moving towards a politics of disruption, resistance, and liberation. Jackie Wang meets Ramon Amaro in the dream state of co-thinking, to discuss Sylvia Wynter, Fanon, and sociogenic alienation. Together, they offer us ways to understand the condition of being subject to computing. Their themes may include statistical analysis as tied to taxonomic hierarchies and alternative understandings of the relationship between technology and race. The duo provides clues and threads of thought experiments, while working towards alternative computational systems.
Ramon Amaro is an author, scholar, and lecturer of Art and Visual Culture of the Global South at University College London. Jackie Wang is a black studies scholar, poet, and multimedia artist, and assistant professor of American Studies and Ethnicity at USC in California.
Recording of the talk can be watched here: https://vimeo.com/akademiederkuenste/autumnschool1510.
My Grandmother and AI
My grandmother’s name is Emilia. I believe she is in her 70s. She is a first-generation Nigerian immigrant in Great Britain. She doesn’t know how to use her mobile phone; she doesn’t even know how to use her remote control. She is very religious and spends most of her time watching evangelical American pastors preach the gospel to her. She has no clue what AI is. How do we teach her about it? (And why should she want to learn about it?)
This workshop asks its participants to step outside the ongoing intellectual conversations around AI and its impact on society, and calls upon the community to challenge implicit assumptions and biases, and to explore how we integrate people from various global communities into the discourse and development of AI.
It targets people like my grandmother, who have no understanding of the role that AI plays in their lives. This workshop hopes to create a number of AI provocations for people like her, to spur a conversation around the implicit beliefs and biases that may be normalised in AI.
Haunting Echoes: Newly Forgotten Technologies
In this workshop from sound artist and researcher Wesley Goatley, participants create sound sculptures from discarded Amazon Echo devices to tell new stories about the relationships between humans and AI. Participants will first be introduced to the Amazon Echo as a critical object that reflects the hopes and dreams of AI tech companies, while also being a totem for discussions about extractive capitalism, critical futures, and civilisation-scale myth-making. Wesley will then introduce participants to the basic principles behind his work of repurposing broken and discarded Amazon Echoes with new Alexa speech and audio.
Participants will gain a practical introduction to sound editing and composing for the Echo, and learn how to generate custom Alexa voices to create compositions with both music and speech. The outcomes of these experiments will be shown in a work-in-progress installation with Amazon Echoes at the end of the AI Anarchies programme.
Games provide a mode of experimentation with alternative strategies and tactics in a competitive and constantly changing environment. Game worlds gesture toward unrealised potentials; their rules and metaphors can be found in every social or political action. Troll Swamp is a large-scale board game for multiple players, based on the formats and characters from the popular Dungeons & Dragons tabletop game. The game uses role-play and teamwork to act out scenarios about online trolling. Through the analogue nature of the game, a playful situation emerges where the user plays an imaginary character. Guided by the Troll Master (a storyteller navigating the players through the game), each player will face their online habits and relationships with others. Troll Swamp has the potential to enable its players to regain confidence, rethink dynamics of interaction and reclaim a different space of expression online.
Jennifer Walshe and Wobbly perform the latest chapter of MOREOVER, a continually-evolving work made through, with, about, in, for and against machine learning. MOREOVER surfs the detritus of contemporary life, ranging from synthetic datasets to our doomed efforts to get yoga arms or successfully rotate PDFs. It moves through a range of reference points and moods: Jaron Lanier; the ways computers are winning the Turing Test; former Google CEO Eric Schmidt’s concerns about the potential for AI to create racist toy bears; the horror of your partner’s private browsing history; the metabolic compromises we all must make; and the availability of off-the-shelf solutions for governing a population.
How do we understand the AI of the future in relation to the AI of the past? What aspects of the evolution of the machine learning and statistical methods we have today are unchangeable; are there inherent extractive qualities to ML systems that cannot be avoided? What can be unlearned about today’s AI? How do we grapple with the scale and speed of contemporary AI and imagine a leftist, communal, distributive, or transfeminist AI? In sum: what wild alternatives, and anarchic AIs, might we start to imagine together?
Towards AI Anarchies: Wild Imaginings and Alternatives
Mimi Ọnụọha’s work has investigated the political, computational, and aesthetic implications of being both invisible and hypervisible to datasets. Tiara Roxanne’s work engages decolonialism, arguing for its impossibility, and ultimately asks: What other gestures of healing exist? Together, they invite us all to consider: What does it mean to gather and hold space for each other, to think big and dream wild about technology outside contemporary ambitions and frames?
Mimi Ọnụọha is a Nigerian-American artist and researcher, and visiting professor at the Tisch School of the Arts, New York University. Tiara Roxanne is a Tarascan Mestiza scholar and artist based in Berlin, and a postdoctoral fellow at Data & Society in New York City.
KUNCI Study Forum & Collective
KUNCI Study Forum & Collective experiments with methods of producing and sharing knowledge through the acts of studying together at the intersections between affective, manual and intellectual labour. Since its founding in 1999 in Yogyakarta, Indonesia, KUNCI has been continuously transforming its structure, methods and ways of working. Initially formed as a cultural studies study group, KUNCI’s practices currently emphasise collectivising study by way of space-making, discussion, library, research, publishing, press and school organising. The practices and pedagogy of KUNCI have been a source of inspiration for the AI Anarchies Autumn School. In developing space and flow for the school, we draw on a number of lessons in their book The Classroom is Burning, Let’s Dream about a School of Improper Education (Ugly Duckling Presse, 2020), including the need for processing, reflection, and for the restful, easeful aspects of study. Throughout the Autumn School, KUNCI members (including Nuraini Juliastuti and Syafiatudina) will be part of the live processing experiment.
The recording of the talk can be watched here: https://vimeo.com/akademiederkuenste/autumnschool1710.
Feminist Tech Principles – Foresight Workshop
In this workshop, we will use a future-oriented methodology to further develop feminist tech principles. The guiding principles were developed by 25+ individuals from around the world. They are a living collection of approaches that help us take a holistic view of digitisation and look at it in terms of patterns of discrimination that are intersectional in impact. The methodology used in this workshop will help participants draft preferable future scenarios and identify what actions are needed today to make these visions a reality. This will be a fun, interactive workshop in which participants gain insights into existing future-forecasting methods while also getting the chance to draft preferable futures collectively.
Intersectional AI Zine-Making Collaboratory: What’s in Your AI Toolbox?
There is power in knowing one’s tools well. This possibility is currently missing from AI tools, which carry an aura of magic and require massive technical resources to be able to wield. What felt intimidating when you first started in machine learning? What do you wish you knew then? The INTERSECTIONAL AI TOOLKIT argues that anyone should be able to understand what AI is and help shape what AI ought to be. This ZINE-MAKING COLLABORATORY invites you to imagine easily accessible AI tools – as tangible and approachable as a hammer or a garden spade. What wild shifts in infrastructure, language, communities, approaches, or perspectives would need to take place in the field? How does the logic of classification that provides the framework for machine learning, materially and ideologically, also need to shift? What’s the role of engineers, artists, researchers, and activists in facilitating these foundational changes? What would you put in your toolbox?
A performative meditation in the wake of the fall of Roe v. Wade. What does kin mean when rapidly developing reproductive technologies shift our relationships? How much control should we have over the body of a person giving birth and a life before it begins?
Day Four’s provocation pushes for a deep consideration of difference, and a focus on the role of the body in our built technological systems. What would an AI that helps counteract ableism ‒ taken up as the core tenet of many modes of oppression ‒ even look like? With this knowledge and embedded wisdom, what are the methods of enchanting and remystifying AI, and computation, to push beyond the technologies we have now? Should we “tear it all down,” as Legacy Russell writes, to create a blank slate for an AI or system that takes the human body’s needs into account? Where would we begin to build? To this end, how can the binary of logic/reason versus feeling, affect, and lived experience be deconstructed, to allow for a multiplicity of uncapturable reconsiderations of what we want AI to be, and to move towards?
Bodies as Systems; the Body in Systems
Louise Hickman and Laura Forlano explore how the knowledge, activism, and practices of crip-activists, crip-engineers, and scholars might reframe AI, and demand more from it. Both insist that if we look to these communities as inventive hackers of digital technologies, then we might find that we have never been short on imaginaries, or modes of interdependence. Hickman and Forlano will lead this session alongside a human live-captioner to show the different kinds of skills, mutuality, and care possible within sociotechnical infrastructures. But all technological infrastructures are prone to failure. What if we were to take up technology as an affirmative kind of failure? Their provocations push for a deep consideration of difference, and a renewed focus on the role of the body in our built technological systems.
Laura Forlano is an associate professor at the Institute of Design, Illinois Institute of Technology. Louise Hickman is a research associate at the Minderoo Centre for Technology and Democracy at the University of Cambridge (UK).
Recording of the talk can be watched here: https://vimeo.com/akademiederkuenste/autumnschool1810.
An Introduction to Adversarial Acoustics
‟I call my vocal condition experimental: every day, every encounter is an experiment where my voice, once a constant in my self-conception, is now a variable.” 
This lecture-workshop introduces participants to the use of voice in machine learning systems. Through a mix of theoretical discussion and practical computing exercises, we will critically explore the legacy of vocal forensics, tackling the use of the voice as a tool for profiling and measurement as well as exploring the space for an ‟experimental vocal practice” capable of troubling the assumed relationship between speech, subjectivity and the body that AI systems seek to reify. Participants explore how aesthetic practices can be used as part of an investigative method that opens up computational systems to sociotechnical concerns. We will take Jonathan Sterne’s account of a ‟crip vocal technoscience” as a starting point to explore methods that complicate the easy relationship between interiority and exteriority, and build an adversarial approach to vocal acoustics. No programming experience is necessary for this session.
Sterne, J. (2019). ‟Ballad of the dork-o-phone: Towards a crip vocal technoscience”. Journal of Interdisciplinary Voice Studies, 4(2), pp. 179–189.
The Good Robot, Live! Why Technologies of the Future Need Feminism
Join feminist researchers and podcast hosts Eleanor Drage and Kerry Mackereth as they ask the experts: What is good technology? Is it even possible? And how can feminism help us work towards it? In this interactive workshop, we explore the many feminisms critiquing, reshaping and dynamically remaking AI. We’ll then think collectively about our key question ‒ What is good technology? ‒ and join together in some creative play to imagine what good feminist technologies could look like. Next, we’ll look at how some other people ‒ from academics to technologists ‒ have responded to our key provocation of the ‟good robot”. Finally, you’ll have the chance to feature on a Good Robot episode that we’ll record live from the workshop, one that embraces anarchy in its exploration of what it means for a technology to be ‟good” and the role that feminisms need to play in this process.
When we work towards collective futures, we gather momentum from our daily practices. What lessons can we pick up as we create more vulnerable spaces, make more space for (access) needs and take more time to attend to them in our present environments? In this workshop, the Feminist Health Care Research Group offers space to excavate what instructions we can find in our rendered understanding of the spaces, institutions and relations we are already shaping and using today. In a narrative framework, we further connect to fears and hopes, expectations and anxieties that we cross through as we envision futures enmeshed with technology. This workshop by FHCRG offers space for exchange, body exercise and playful prompts.
We close the Autumn School with a recognition of the realities of the present and where we live now. The Russian invasion of Ukraine and its ripples across Europe; the legacies of migration and colonial conquest in Europe; the human costs of being subject to gamified algorithmic modes of governance and work; the challenges to modern digital life presented by ideas of de-growth, particularly in terms of environmental sustainability; the violence of borders that prop up the nation state.
Whose State of Things? Europe, the World, and Technology
In their session, Heba Y. Amin and Nelly Y. Pinkrah will sound out the ways reductive histories of technology tell us something about the current "state of things". They ask: What is at stake at this political moment, when contemporary constructs of technology and digital representation emerge from an already problematic representational logic that was conceived and embedded within a colonial discourse? There is an inquiry to be made about how we relate to and through technology today, and who and where one has to be – on the historical, epistemological, geopolitical, and socio-cultural maps of contemporary technical life – in order to produce adequate knowledge that engages "proper study".
Heba Y. Amin is an artist based in Berlin and professor of art at ABK Stuttgart. Nelly Y. Pinkrah is a writer, activist and scholar commuting between Berlin and Vienna, professionally based at the Technical University in Dresden.
Nothing Is Going To Happen
The Glass Room is a public intervention that aims to demystify technology through immersive, thought-provoking, self-learning exhibitions. Over the past seven years, the project has developed from a large-scale exhibition to a portable, adaptable version that has reached over 300,000 people in more than 60 countries. Shown at festivals, conferences, universities, museums, schools, and libraries, the exhibition has contributed to the public discourse and critical examination of Big Tech. How can the audience begin to engage in the conversation without feeling alienated by barriers ‒ real or imagined? How can actors in this space reach people directly affected by the social, political, and economic consequences of technology? Learn how to surface critical questions about technology wherever you encounter it, and spark interest within your communities. This interactive workshop will cover Tactical Tech’s methodology and innovative approach in the design, development, and production of the exhibition over time.
Resistance against Digital Colonialism
How are digital technologies related to contemporary colonialism? We will collectively explore this question in the workshop, drawing from the participants’ and workshop leaders’ experiences and expertise. The main aim is to make visible the complex and often hidden colonial structures in digitalised societies. Colonialism is a variety of forms of structural domination of the global South by the global North. Digitalised societies are enmeshed with colonial structures through various mechanisms and areas: digital markets, working conditions, extractivism, electronic waste, energy consumption, data protection violations, manipulation, discrimination and oppression, among other issues. The workshop explores this variety and systematises it. It also focuses on different cases of resistance against colonialism in the Digital Age; these are manifold and creative, but struggle against a deeply entrenched colonial, capitalist logic. The pedagogical approach is interactive and case-based, with work in small groups.
Solarpunk and a Fossil-Free Internet
The internet accounts for over 3% of the world’s carbon emissions, more than the aviation industry. It is the largest coal-powered system of machines on the planet. We need a fossil-free internet by 2030. How do we get there? Take a tour of the Museum of the Fossilized Internet. Explore solarpunk art and research. Learn how we can use the web to accelerate a just transition, and better understand the systemic drivers of the climate crisis and its injustices, including how AI amplifies these harms. There is hope, but we have to nurture it.
The Guts of Glut
On the final evening of the school, Johanna Hedva ‒ writer, artist, musician ‒ presents The Guts of Glut, a video screening and a talk about Glut (a superabundance of nothing), their 2021 sound piece, installation, and video game. The work is disorienting as it shows all the ways AI has always been with us, in the forms of mystical voices and divination. Hedva will present the themes and processes behind developing the sound and video game of this multifaceted project. Screams in their rawest form weave in and out of the sound piece. They are made by Arid and Mud, two AI vocal clones created in collaboration with Jessika Khazrik. Hedva and Khazrik deceived vocal-clone software through layers of vocoding in an effort to critique the surveillance embedded in current AI software. In the video game, the viewer follows in the searching wake of a teratoma avatar, listening to the inhuman clones speak words built and gathered by Hedva through divination techniques that predate those of buying and posting algorithms. Hedva will talk about the themes of nothingness and cloning, the mystical uncanny, and divination across fields and eras.
Recording of the talk can be watched here: https://vimeo.com/805466833/42655f7f24.
Group Processing (with KUNCI Study Forum as Guest)
The final day of the school is devoted to processing and decompression, with much space for meetings and delving deeper. Please relax and enjoy the company of collective study. The curators, along with Nada Bakr, will shape a morning session for the study group to reflect on the week’s ideas. Some workshop groups might take the option to extend their collaborations and display their work on this day. Others may choose to take a nap. After a stimulating set of days, we encourage you to find one another and talk, debate, plot, plan, and form connections with new thinkers and co-conspirators toward more anarchic models.
Dreaming Beyond AI
Technological change and automation have been a cause for specific, unusual, and new kinds of trauma across geographies and times, particularly for the most marginalized among us. This violence is often mental and the trauma unseen: it opens up psychic wounds that can harden us and limit our vitality. This heavy knot can become further entangled with unacknowledged pain that we feel in our bodies. In many cases, these wounds can disconnect us from the possibilities of our true selves, individually defined. How do we learn to live with technological change as it shapes our internal life?
The AI Anarchies Autumn School is an opportunity to question the way experiences around AI are transmitted, and to create a social forum through this questioning. During the Dreaming Beyond AI workshop, we will explore our ongoing and emerging perceptions and reflections on AI ecosystem(s), by centering our bodies and our emotions. We use games and roleplaying as methods to engage the body, and develop a practice of questioning of dominant narratives.
The workshop is organised in three parts, following the three-step motion of Dreaming Beyond AI:
Breathe: Feeling our bodies and the presence of each other through a sound walk;
Dream: Collective researching and plotting the dynamics of resistance through role-playing;
Connect: Developing collective ways of liberation from surveillance systems.
Maximum Participants: 30
Requirements: We ask participants to please wear comfortable clothes. Please note that the workshop involves physical touch and contact among participants.