Sydney, Australia Sat, Oct 26, 2024 at 12:00 midnight AEDT
Hong Kong, Hong Kong Fri, Oct 25, 2024 at 9:00 pm HKT
Beijing, China Fri, Oct 25, 2024 at 9:00 pm CST
Vienna, Austria Fri, Oct 25, 2024 at 3:00 pm CEST
London, United Kingdom Fri, Oct 25, 2024 at 2:00 pm BST
New York, USA Fri, Oct 25, 2024 at 9:00 am EDT
Chicago, USA Fri, Oct 25, 2024 at 8:00 am CDT
Denver, USA Fri, Oct 25, 2024 at 7:00 am MDT
Los Angeles, USA Fri, Oct 25, 2024 at 6:00 am PDT
Animation is media in motion, and the Expanded Animation Symposium has evolved since 2013. Originally a one-day side event at the Ars Electronica Festival, it is now a three-day conference highlighting animation’s role in media arts. For the first time, Expanded 2024 includes two fully peer-reviewed paper tracks focusing on works that explore audio-visual expression at the intersection of art and technology. With contributions from artists, curators, and researchers, and open-access materials such as the conference proceedings, the symposium aims to foster inclusivity. In this session, we are delighted to have award winners and speakers from Expanded 2024 present their work in 5-minute lightning talks.
Juergen Hagler, General Chair, and Martin Kocur, Publication Chair, were part of the organizing team of Expanded 2024. Juergen will outline the vision for the 2024 Expanded Animation conference, focusing on its growth into a platform for exploring animation and interactive art at the intersection of technology, art, and research. He will also talk about the connection to Ars Electronica. Martin will explain the creation of the conference proceedings, detailing the selection and compilation of peer-reviewed papers to ensure high-quality content that reflects the innovative spirit and global collaboration at the heart of Expanded 2024.
As machine learning systems are rapidly changing our world, the role of data science as a discipline is elevated. However, existing approaches in science communication and data visualization are not sufficient to address the complexity of such systems and the profoundly changing relationship between humankind and data. We introduce the “Data Art & Science” (DAS) project, in which the role of art as an integral part of the discourse about data is investigated through a series of commissioned art projects and close monitoring of their creation process and impact. In this paper, we propose a Data Art Research model and use it to discuss one of these art projects, “23nm”. We aim to show the value of creating a system to reason about DAS and to position the latter as a new way of approaching data-reliant projects in academia and industry, increasing the quality of public knowledge and discourse.
This paper refers to an ongoing interdisciplinary, art-based research project that aims to question the disembodied rationale of Western epistemology and related regimes of dominant knowledge by focusing on vegetal agency. Following Michael Marder’s [2013] approach of “Plant Thinking,” sphagnum mosses become the protagonists in this endeavor, as they are considered key regulators of peatland ecosystems, resisting the severe consequences of anthropogenic interference such as desiccation. Informed by feminist posthumanism, the project aims to transgress the distinction between subject and object by tracing the formation of configurations in which nature and culture construct each other, which Haraway [2004] refers to as nature(s)culture(s).
The project invites the audience to participate in the seemingly rational practices of objectification, such as collecting, inspecting, sorting, and categorizing, and juxtaposes them with seemingly affective practices of performance and dance. By activating the entire body to explore the dynamics of vegetal phenomena, the project critically approaches the disembodied rationale of Western epistemology, thereby exposing the dichotomy of human activity and plant passivity as a myth. The result is a distributed choreography involving human and more-than-human actors.
“Sensitive Floral” is an interactive, generative artwork that explores a unique generative system, biomimetically emulating the reactive behaviors of the Mimosa pudica plant. By synthesizing the complexity of fractal tree data structures with the Cellular Automata (CA) mechanisms of grid computations, the artwork mirrors nature’s adaptive and responsive traits. This approach demonstrates the potential of generative design, organizing coding rules and geometric relationships within parameters to shape biomimetic aesthetics. The interaction allows users to initiate a ripple of movement by simply touching the screen, triggering changes across thousands of leaves, akin to the group leaf movements observed in Mimosa pudica. This subtle yet intricate movement of digital flora not only reflects the beauty of natural phenomena but also opens further opportunities for expression in shape and movement behavior, bringing a subtly immersive experience to the field of tech-art and human-computer interaction.
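To illustrate the kind of CA-driven ripple the abstract describes, here is a minimal sketch (not the artwork's actual code) of a grid of "leaves" in which a touch closes one leaf and the closure propagates to neighboring cells on each update step, loosely mimicking Mimosa pudica's group leaf movements:

```python
def step(grid):
    """One CA update: an open leaf (0) closes (1) if any 4-neighbor is closed."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                continue  # already closed
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 1:
                    new[r][c] = 1
                    break
    return new

def touch(grid, r, c):
    """Simulate a screen touch: close the leaf at (r, c)."""
    grid[r][c] = 1

# a 5x5 patch of open leaves; a touch at the center ripples outward
grid = [[0] * 5 for _ in range(5)]
touch(grid, 2, 2)
for _ in range(2):
    grid = step(grid)
# after two steps, every leaf within two cells of the touch has closed
```

In the actual work each cell would drive the animation state of a fractal leaf branch rather than a flat grid value; the sketch only shows the propagation rule.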
“Beyond My Skin” (2022-2024) is an interdisciplinary research project presented in the forms of an interactive real-time installation, hybrid performance, and mixed reality experience. By exploring the hybrid relationship between bodies and their digital representation in a posthuman context, the project creates a physical-digital space where visitors can experience new dimensions of self-perception and interaction with themselves and others, in a constant interchange between physical and virtual worlds.
This paper explores the creative tensions involved in bringing together the rapid ideation of the theatre devising process with the meticulous development typical of 3D and immersive production. With reference to a collaboration between the University of Plymouth and the Wardrobe Ensemble theatre company, the text focuses on the integration of Unreal Engine into devised theatre production. It discusses: the navigation of these differing temporalities; methodologies for creating media in a way that is incorporated into the devising process; and the promotion of media assets as active elements within the narrative. The approach of “t(h)inkering” is introduced—combining the concepts of thinking and tinkering—as a process that eschews the linearity of common 3D, animation, and VFX production pipelines, emphasizing instead reflection and experimentation in media creation. The text explores how low-fidelity assets and fast turnaround times facilitated a dynamic dialogue between theatre makers and creative technologists, and concludes by emphasizing a shift in perspective towards viewing Unreal Engine as a sandbox for experimentation rather than a linear production tool, advocating for a process-oriented approach to media creation.
The rise of diffusion models in AI image synthesis revolutionized computational image generation, challenging the traditional notion of photographic objectivity. This advancement posed a dual challenge for media artists: to harness AI’s potential while preserving artistic uniqueness. In Fencing Hallucination, we explored integrating AI models into a traditional hand-coded computation pipeline, introducing real-time interactivity to AI image synthesis and increasing artistic control over the AI-generated images. The research culminated in an interactive installation exhibited at various venues, attracting over 800 audience visits. The system generated authentic chronophotographs from diverse human-AI interactions while maintaining consistent aesthetic control. This project exemplifies the potential of combining AI models and hand-coded programs, paving the way to better human-AI co-creative systems.
This Work-in-Progress (WiP) paper introduces ITERATIVE BODY SYNTHESIS, a media installation that acts as both a research infrastructure for an experimental online investigation and a neural backend and monitoring interface for a virtual identity on Instagram. The project aims to investigate the impact of algorithmic decision-making on body representations in social media, exploring how these platforms may shape and distort forms of representation and perception such as body aesthetics, choreography, iconography, and authenticity. Given the opacity of this social media platform, ITERATIVE BODY SYNTHESIS is conceptualised as an infrastructure for distributed monitoring. It is implemented through an incremental development process akin to adaptive black-box testing methodologies, enabling us to navigate the research landscape without prior knowledge of the inner workings of the systems under investigation. To this end, we have developed a prototype of a self-operating software architecture based on principles of black-box testing, machine learning, and synthetic data.
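The adaptive black-box loop the abstract alludes to can be pictured as a simple probe-observe-refine cycle. The sketch below is a generic, hypothetical illustration (not the installation's architecture): an opaque system is fed synthetic inputs, its responses are recorded, and the next round of probes is adapted based on which inputs elicited a response:

```python
def probe(system, inputs):
    """Feed synthetic inputs to the opaque system and record its responses."""
    return [(x, system(x)) for x in inputs]

def refine(observations):
    """Adapt the next probes: explore near inputs that produced a response."""
    return [x + 1 for x, response in observations if response]

def black_box_test(system, seed_inputs, rounds):
    """Run an incremental probe/refine cycle with no knowledge of internals."""
    history, inputs = [], seed_inputs
    for _ in range(rounds):
        observations = probe(system, inputs)
        history.extend(observations)
        inputs = refine(observations)
        if not inputs:
            break
    return history

# toy opaque system standing in for a platform algorithm:
# it "responds" only to even-valued inputs
history = black_box_test(lambda x: x % 2 == 0, [1, 2, 3], rounds=2)
```

In the actual project the "system" would be a social media platform's ranking behavior observed via a virtual identity, and the refinement step would be far richer; the sketch only conveys the incremental, internals-agnostic testing pattern.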
From the earliest mechanical water clocks, humans have divided time into calculable units through tools, continuously seeking the most precise frequencies in nature to measure time more accurately. Advances in technology have led to the emergence of the “attosecond,” a unit of time eighteen decimal places below the second. However, even with such minute divisions, this increasingly precise measurement of time has become disconnected from humans’ most direct bodily experiences, rendering the relationship between time and bodily sensation imperceptible. The “Time Organ” project investigates this imperceptibility of human sensory time. Through virtual reality, heart-rate sensing, and custom-built water-dripping mechanical devices, it creates multi-sensory experiential situations. Based on scientifically quantified time and bodily frequencies, the project generates sensory experiences moving from asynchronous to synchronous, encompassing visual, auditory, and tactile sensations. Inspired by the concept of extending organs, the project aims to create a tool for perceiving the passage of bodily time. By drawing on humans’ innate synchronicity and sensory extensions, it seeks to rediscover the perception of bodily time, thereby realigning human temporal sensations with bodily frequencies.
Bonnie Mitchell is a new media artist and Professor of Digital Arts at Bowling Green State University in Bowling Green, Ohio, USA. Mitchell is a member of the ACM SIGGRAPH History and Digital Arts Committees, where she focuses on the development of the SIGGRAPH archives and coordination of the SPARKS lecture series. Mitchell’s artworks explore spatial and experiential relationships to our physical, social, cultural, and psychological environment through interaction, abstraction, and audio. Her current creative practice focuses on the development of physically immersive environments that use interaction via electronics and special FX to reveal change over time. Her work has been exhibited internationally at numerous venues.
Victoria Szabo is a Research Professor of Visual and Media Studies at Duke University, where she directs the PhD in Computational Media, Arts & Cultures and the Certificate in Information Science + Studies. Her work focuses on immersive and interactive media for digital humanities and computational media art. She is co-lead of Psychasthenia Studio, an artists’ games collective. She was Chair of Art Papers at SIGGRAPH Asia 2023 in Sydney and will be Art Papers Chair for SIGGRAPH Asia 2024 in Tokyo. She is also Chair of the Art Advisory Group for ACM SIGGRAPH and a member of the Digital Arts Community Committee.