Ghost in the Machine
PMSing on SZA's latest album and thoughts on mainstream AI
I go through hell and rebirth every month. It starts 7-10 days before my period. Beyond the bodily symptoms, the most confusing PMS symptoms involve my moods and emotions. I become more sensitive, more cynical, and more prone to hurting those closest to me. I am more likely to reminisce about my exes - especially the ones who probably don’t think of me at all. This descent into my personal hell feels like every foundation of my life and self-worth is burned to ashes. But it also enables valuable introspection and messy realizations about life and the self. My mom, my therapist, my best friend, and I usually go through these together, unpack them, and finally forget it all to rejoin the cacophony of life.
This time, my PMS gem came with SZA’s latest album, SOS. I love SZA’s vulnerability - it lets everybody say out loud that they want to kill their exes, that they settle for sh*tty relationships, that they don’t know their worth.
I am under the soft covers of my bed, my tears sliding down my cheeks until they reach my noise-canceling headphones. I am processing all the heartbreaks and self-worth issues, proudly crying, until I hear:
I get a sense of excitement every time I hear AI enter the mainstream through the artists I like. I used to complain that nearly all the images of AI in our collective consciousness came from Hollywood sci-fi films, which usually centre human-like machines and apocalyptic visions of the future. It feels like this is evolving?
Nowadays, I see my friends posting Lensa AI avatars on Instagram, telling me about the images they generated with DALL-E, and of course there is the “algorithm” - a.k.a. the recommendation algorithms on social media platforms - that I don’t stop talking about (watch my TEDx talk here, but don’t tell me you watched it). A.I. has been entering the mainstream vocabulary as something that people use.
What A.I. actually is and how it should be understood is a topic of debate. It is complicated even for experts and regulators. For instance, European regulators have been going back and forth on the conceptual and practical boundaries of A.I. in the EU AI Act.
I am interested in how people think about AI, because its content and shape are as cultural as they are scientific. It is alive and moldable; it feeds off fiction, imagination, and our perception of ourselves. Some researchers and mainstream imaginaries think of AI as artificial life-forms that surpass human intelligence. For others, AI is an umbrella term for data-processing technologies, machine learning among them, and human-level AI is a pipe dream. The first view dominates in terms of influence and funding.
It is also the first view that instantly situates humanity as the backdrop. Without an understanding of what it is to be human, one cannot conceive of an AI that is more autonomous, more capable, and more intelligent than humans. AI is therefore a great mirror for us, an endless mirror filled with our projections and self-perceptions. That’s why SZA sings in her song Ghost in the Machine:
Robot got future, I don’t
The lyrics somehow echoed what Sam Altman said on Ezra Klein’s podcast a while ago. According to Altman, the CEO of OpenAI (the company behind DALL-E, GPT-3 etc.), self-intelligent AI is quite inevitable:
EZRA KLEIN: Do you believe in 30 years we’re going to have self-intelligent systems going off and colonizing the universe?
SAM ALTMAN: Look, timelines are really hard. I believe that will happen someday. I think it doesn’t really matter if it’s 10 or 30 or 100 years. The fact that this is going to happen, that we’re going to help engineer, or merge with or something, our own descendants that are going to be capable of things that we literally cannot imagine. That somehow seems way more important than the tax rate or most other things.
I read “The fact that this is going to happen” as “robot got future I don’t”: AGI is destiny, while humanity’s remaining time on earth is estimated to be shorter than a few generations because of the capitalism-engendered climate crisis.
Robot got more heart than I
For some AGI folks, and for Cartesian thought, humanity lies in the brain, the place where intelligence happens. Others locate it in the heart and its various associations - emotions, feeling, consciousness, and so on.
SAM ALTMAN: I mean, look, we’re heading into the deepest philosophical questions that have been on humanity’s mind for a very long time, but maybe have never been as relevant or as decision-relevant as they are now. But I think a lot of these things really come down to, A, do you believe that a sense of self exists at all or is everything just like — there’s this body, and there’s this brain, and there’s energy flowing through a neural network in your head like there could be in a computer. And as that is running, it creates this illusion of a sense of self that is getting tortured but it really is not there at all and that it’s all the same thing.
Conceptualising a human as “energy flowing through a neural network in your head” might not be inherently problematic. But how those with the resources and the power to set the tech agenda understand “humanity” has an impact far beyond their own preferences and beliefs.
Beyond different conceptions of humanity, there is the question of who is considered human. Historically, not every body has been granted humanity. Systematic dehumanisation is still used to justify violence and exploitation. In a human-centric world, being accepted as human is a privilege. In Enslaved Minds: Artificial Intelligence, Slavery and Revolt1, Dr Kanta Dihal cites Joanna Bryson's "Robots should be slaves" paper:
Bryson argues that robots should be slaves because ‘in humanising them, we [. . .] further dehumanise real people’ (p.63). Her argument is convincing in the context of contemporary robotics, in which the humanization of non-intelligent machines leads to an exacerbation of the oppression of marginalized human groups. The robot Sophia, which was granted citizenship of Saudi Arabia in 2017, is a case in point: since Saudi Arabia currently limits citizenship and residency for many of its immigrant inhabitants, the country prioritizes robot rights over human rights (Griffin 2017). An inanimate object has been given rights as a publicity stunt. As Sophia does not have agency or intelligence, it is a safe object to use for the purposes of its owner: it does not have the capacity to consent to or object to these projections.
What is more, the word robot comes from the Czech robota, meaning “forced labour”.
Many AI and AGI enthusiasts argue that AI could create immense wealth by performing an ever-growing number of tasks once performed by humans, thereby decreasing the cost of labour. This is called automation. Previous waves of automation were achieved through the free labour of enslaved people. The so-called “AI Revolution”, or “Fourth Industrial Revolution”, falls within the continuity of “wealth creation for some - exploitation for many”.
New York Times columnist Jamelle Bouie’s “hot take” on TikTok develops a similar line of thought:
A new way of thinking?
My argument is that thinking about AI with humans as the backdrop is a problem. It’s like falling in love with toys that we create in our own image - a sort of God complex that contains and projects well-known tropes of hierarchy and domination, in a paradigm governed by the best and worst of Western humanism.
I think we should be wary of comparing AI tools to human capabilities, and of aspiring to recreate and attribute a kind of humanness to the tools we create. Giving that up would require us to decenter ourselves and our place in the universe.
Thanks for reading Ghost in the Machine! Your support keeps me going <3
1. Cave, Stephen, Kanta Dihal, and Sarah Dillon (eds.), AI Narratives: A History of Imaginative Thinking about Intelligent Machines (Oxford University Press, 2020)