Published: August 1, 2023
Keywords: artificial intelligence, TikTok, visual cultures
This research project was carried out as a collaboration between Amie Galbraith, Gabrielle Aguilar (gabriellekaguilar@gmail.com), and Madeline Brennan as part of the Digital Methods Initiative Data Sprint in July 2023.
This project was recently presented at Storytellers + Machines 2024 at The School of Digital Arts in Manchester, UK.
Like an all-seeing eye, TikTok’s AI-enabled visual filters render the previously invisible visible. The filters are simple in operation: a user takes an image, the AI recognizes the user’s face and features, and it transforms them into a visually appealing artwork. This project delves into the unexpected use of the “AI Style” filter, also referred to on TikTok as the “#detectillness” filter. While initially designed for creative expression and entertainment, its influence extends well beyond artistry. Women, in particular, have taken to the filter as a way to detect an array of hidden or imperceptible mental and physical illnesses. We ask: what visual and cultural aesthetics are at play in such an encounter? Furthermore, what are the broader socio-cultural implications of such novel engagements with AI?
What begins as creative curiosity with an AI filter can quickly turn into self-diagnosis without input from medical professionals. Experiences fall along a spectrum ranging from creative and whimsical to insidious and concerning, as the filter resonates with users with varying degrees of intensity. Through a mixed-methods approach of web scraping, intertextual analysis, and composite imaging, we give form to the affective relations between AI, the sick body, and the imagination. Our work uncovers how visual and cultural elements of fantasy, entangled with a bleak history of women’s healthcare, ignite user imaginations and build representations that validate their suffering. This project ultimately demonstrates how easily AI tools initially designed for play and creativity can gain popularity as tools of knowledge production in marginalised and disadvantaged communities. Such communities are more at risk of ascribing significance and power to AI tools that speak to their feelings of insignificance and powerlessness in the offline world.
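To illustrate the composite-imaging step, the sketch below averages a folder of scraped filter stills into a single composite image, pixel by pixel. It is a minimal illustration only, not the project’s actual pipeline: the folder name, file format, and canvas size are placeholders, and it assumes the TikTok stills have already been collected and saved locally.

# Minimal composite-imaging sketch: average a folder of scraped
# filter stills into one composite image. The folder name, file
# glob, and canvas size below are illustrative placeholders.
from pathlib import Path

import numpy as np
from PIL import Image

CANVAS = (512, 512)  # resize all stills to a shared pixel grid

def make_composite(folder: str, out_path: str = "composite.png") -> None:
    """Average every still in `folder` into a single composite image."""
    frames = []
    for path in sorted(Path(folder).glob("*.jpg")):
        img = Image.open(path).convert("RGB").resize(CANVAS)
        frames.append(np.asarray(img, dtype=np.float64))
    if not frames:
        raise ValueError(f"no .jpg stills found in {folder}")
    mean = np.mean(frames, axis=0).astype(np.uint8)  # per-pixel average
    Image.fromarray(mean).save(out_path)

make_composite("detectillness_stills")  # hypothetical local folder

Averaging of this kind surfaces the recurring visual motifs of the filter, such as palette, framing, and facial placement, while washing out what is idiosyncratic to any single user, which is what makes composites useful for reading a shared aesthetic.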