Artist(s):
· Yuting Xue · Yuxuan Bai · Elke Reinhuber
School Information:
Artwork Information:
Description:
Bend To is an immersive artwork that lets individuals anticipate and interact with plants' phototropic senses in a futuristic environment of increased ultraviolet (UV) radiation. It draws on plants' phototropic sense and responds to the current changing climate, which may allow uncertain amounts of harmful UV radiation to reach the earth's surface in the future, offering an embodied experience for post-humans to perceive environmental changes alongside plants. The project integrates 3D scanning and time-lapse photography to collect phototropic data from plants in the real world and presents the subtle process of how plants respond to light through a virtual reality (VR) experience. With hand-position tracking, users are encouraged to actively sense and explore the position of their own bodies and of the plants in the dynamic natural space.
It immerses individuals in the sensual experience of plants' responses to their surroundings through proprioceptive interaction. Through the embodied engagement of hand tracking and virtual touching, the artwork aims to heighten participants' awareness of their bodily position and strengthen their connection with the spatial senses of plants. More importantly, it encourages individuals to explore the intricate relationship between nature and post-humans in a future light-radiated environment.
Technical Information:
The interactive system of Bend To consists of three components: data collection, data processing, and data visualization. We extracted data from a phototropism experiment, predicted the phototropic response rate of seedlings, and ultimately visualized the results and enabled interaction within the VR experience. To collect empirical data on plants' response to light, we used two time-lapse cameras to document the ongoing changes of 2-week-old soybean seedlings under 302 nm UV-B light and 450 nm blue light. For data processing, the TrackMate plugin of ImageJ was employed for keypoint detection and tracking in the time-lapse images, producing the changes in x-, y-, and z-axis coordinates between the start and end of light irradiation. For data visualization, individual soybean seedlings were scanned with a 3D scanner (EinScan Pro HD) to obtain high-quality FBX models. In Unity HDRP, these models are baked into Visual Effect Graph properties to generate dynamic particle effects of the soybean seedlings. For data mapping, the processed data controls the phototropic response rate and the target-point coordinates. The phototropic bending is fully automated by C# scripts, driven by the three response rates derived from the data processing.
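As an illustration of the data-processing step, the sketch below is a hypothetical Python reconstruction (not the project's actual code) of how a per-seedling phototropic response rate could be computed from tracked keypoint coordinates such as those exported by TrackMate: the 3D displacement of a stem-tip keypoint between the start and end of light irradiation, divided by the elapsed time. The `Keypoint` fields and function names are assumptions made for illustration.

```python
import math
from dataclasses import dataclass


@dataclass
class Keypoint:
    """A tracked stem-tip position at a given time.

    Hypothetical stand-in for one row of a TrackMate track export.
    """
    t: float  # seconds since irradiation began
    x: float  # position coordinates (e.g., millimeters)
    y: float
    z: float


def response_rate(start: Keypoint, end: Keypoint) -> float:
    """Phototropic response rate: 3D tip displacement per second of irradiation."""
    dx = end.x - start.x
    dy = end.y - start.y
    dz = end.z - start.z
    displacement = math.sqrt(dx * dx + dy * dy + dz * dz)
    elapsed = end.t - start.t
    if elapsed <= 0:
        raise ValueError("end keypoint must be later than start keypoint")
    return displacement / elapsed


# Example: a seedling tip that moves 3.0 mm toward the light over 600 s
start = Keypoint(t=0.0, x=0.0, y=0.0, z=0.0)
end = Keypoint(t=600.0, x=2.0, y=2.0, z=1.0)
rate = response_rate(start, end)  # 0.005 mm/s
```

In the actual pipeline, rates like these would then be mapped onto the bending speed of the particle models in Unity, but that mapping is engine-specific and omitted here.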

