“this place was never empty...”
creative tech
term 4
spring 2025
Maxim Safioulline
TouchDesigner
deliverables
interactive installation
poetry zine
Set in Pasadena’s Arroyo Seco Park, this is a sensory reflection on bodies, memory, and the traces we leave—both human and nonhuman. The project consists of a gestural interaction prototype built in TouchDesigner using 3D scans of the park, and a zine of poems exploring nonlinear time.
Through the digital interface, users “speak” to a reconstructed landscape not with words, but with gestures—raising the question: can technology truly understand our embodied intentions? What might it mean for emotion to be expressed in nonverbal ways, and how might a virtual space respond?
In the end, this project became a way to think about how people and places remember each other—and how even the emptiest-looking spaces are full of traces, if you learn how to look (or move) differently.
proposal
context
Through my online research, I found some interesting angles: people never stop planning, imagining, and using this place. Some plans never happened, and some survive only as local stories. The area is also crucial to wildlife whose lives unfold outside of people's view. Yet all of these relate to the name and place Arroyo, giving me a sense of temporal and spatial overlap. None of it can be described by a linear trajectory or physically touched.
To reflect this layered, nonlinear relationship, I first created a zine that translates these fragments into poetry. I printed the pages on transparent and tracing paper to visualize the sense of overlap.
At the same time, most interactive systems treat the body as a tool for optimizing control. But in lived experience, our gestures are often messy, expressive, and full of emotion. So I thought of recreating in digital space some of the corners of the park where traces of nature and humanity mingle, and using our gestures to explore our relationship with that digital space. Through gestures we try to intervene in the digital world, but a screen forever separates us—like reaching for something present yet untouchable, known yet never grasped.
I am trying to build a system where the zine and the gesture-based interaction form a conceptual whole: one medium captures the layering of time and narrative, while the other explores the body's intuitive, emotional relationship to space. Together, they invite the audience to enter this landscape and activate the presences layered within it.
zine design overview
interactive design process
- Imported the MediaPipe .tox files into TouchDesigner.
- Used the interface to visualize my hand gesture.
- Imported the .ply model into TouchDesigner and set up the initial connections for rendering.
- The distance between the thumb and index finger of the first hand controls the scaling of the model.
- The distance between the thumb and index finger of the second hand controls the rotation of the model.
- In order to develop more model-based interactions in TouchDesigner, I decided to use a point cloud model.
- My final goal was to control the density of the particles in the point cloud model with the distance between the thumb and forefinger of the first hand, and to control the scaling of the model with the distance between the thumb and forefinger of the second hand, with the scaling accompanied by vibrations of the particles.
- I entered the interface base, where my MediaPipe loop forms the first layer of connections:
- base1 was created and used as the main point cloud processing container.
- Used hand_data and a series of math + select nodes to map gesture data to model parameter controls.
- geo1, geo2, render1, cam2, and light1 render and output the current point cloud scene.
- At this point, the gestures were successfully connected to the point cloud control network, but the vibration and density control logic had not yet been introduced.
- I entered base1 as my third layer:
- Imported the scanned .ply point cloud model using pointfilein.
- Set the initial pose of the model using pointTransform1.
- Extracted and smoothed the gesture distance data using math4, select2, and filter2 to control subsequent parameters.
- The first hand data is connected to:
- point delete/set: used to control the number of points displayed, indirectly controlling the “density”.
- noise1 → add1: superimposed vibration effect, the amplitude of the noise is controlled by the gesture.
- The left hand data is connected to:
- the scale input of transform1, which scales the overall model.
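Outside of TouchDesigner, the gesture-to-parameter chain described above (hand_data → math/select → filter, then density, scale, and noise amplitude) can be sketched in plain Python. The landmark indices follow MediaPipe's hand model (4 = thumb tip, 8 = index tip); the smoothing constant and the remap ranges are illustrative assumptions, not values from the patch.

```python
import math

THUMB_TIP, INDEX_TIP = 4, 8  # MediaPipe hand landmark indices

def pinch_distance(landmarks):
    """Euclidean thumb-to-index distance; landmarks is a list of (x, y, z)."""
    return math.dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP])

class OnePole:
    """Simple one-pole smoothing, standing in for the filter CHOP (alpha is a guess)."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None
    def __call__(self, x):
        self.value = x if self.value is None else self.value + self.alpha * (x - self.value)
        return self.value

def remap(x, in_lo, in_hi, out_lo, out_hi):
    """Like a math CHOP range remap, clamped to the output range."""
    t = max(0.0, min(1.0, (x - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

smooth = OnePole()

def gesture_to_params(hand1, hand2):
    """First hand pinch -> point density; second hand pinch -> model scale.
    Vibration amplitude rides along with the scaling, as described above."""
    d1 = smooth(pinch_distance(hand1))   # smoothed, like filter2
    d2 = pinch_distance(hand2)
    density = remap(d1, 0.02, 0.25, 0.1, 1.0)   # fraction of points kept
    scale   = remap(d2, 0.02, 0.25, 0.5, 2.0)   # uniform model scale
    noise_amp = 0.05 * scale                    # vibration grows with scaling
    return density, scale, noise_amp
```

Inside TouchDesigner the same mapping would live in the math/select/filter CHOP chain rather than a script, but the arithmetic is identical.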
- reference: supermarket sallad
focusing problem
This project addresses three main issues:
Reduction of Gesture to Function — Most systems treat gestures as commands (e.g., pinch to zoom), ignoring their emotional and expressive potential. This project explores gestures as carriers of feeling and memory.
Disconnection Between Body and Digital Space — Interactions often remain abstract and impersonal, mediated by screens or sensors. This work asks: What does it mean to reach into a space you cannot touch?
Lack of Spatialized Emotional Feedback — Few systems let users feel their presence in a digital environment. This project uses hand gestures to manipulate a point cloud model, allowing users to “shape” memory-like landscapes through embodied, sensory interaction.
research questions for the future:
precedents
Inspired by the “gesture triggering” in the tutorial, I followed it and added my own twist: I changed the trigger from a simple color change to a change in the density of the particle system, while letting the transparency slowly fade. This feels more dynamic and lets the audience experience more visual change than a single response. In the process I realized that a “gesture” is not just a signal, but a process that can create a dynamic feedback rhythm. This resonates with my concept that the particles in the point cloud model symbolize fragmented information, and that our gestural interventions can draw out visual feedback.
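The trigger-plus-fade behavior described above can be sketched as a small per-frame state machine. The class name, the density values, and the fade rate here are hypothetical, chosen only to illustrate the envelope idea, not taken from the tutorial or the patch.

```python
class GestureFeedback:
    """Density jumps when the gesture triggers, then transparency fades
    back each frame -- a sketch of the tutorial variation described above."""

    def __init__(self, base_density=0.3, burst_density=1.0, fade_rate=0.02):
        self.base_density = base_density
        self.burst_density = burst_density
        self.fade_rate = fade_rate
        self.alpha = 0.0          # current transparency envelope, 0..1

    def update(self, pinch_closed):
        """Call once per frame; pinch_closed is True while the gesture holds."""
        if pinch_closed:
            self.alpha = 1.0      # trigger: envelope snaps to full
        else:
            self.alpha = max(0.0, self.alpha - self.fade_rate)  # slow fade
        # density follows the envelope: burst at trigger, easing back down
        density = self.base_density + (self.burst_density - self.base_density) * self.alpha
        return density, self.alpha
```

The gesture is treated not as a one-off signal but as an envelope that keeps shaping the visuals after the hand releases, which is the "dynamic feedback rhythm" the paragraph describes.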
What I found valuable was the idea that gestures can carry meaning beyond functionality, and that emotional depth can emerge when interaction systems support open-ended, bodily expression. This aligns directly with my own project, which uses hand gestures not to command, but to sculpt and distort a point cloud space—a metaphor for memory and presence.
This strongly informed my thinking about point cloud models. Even though they are abstract and untouchable, my use of gesture interaction—and the variability it enables—can help users feel that their presence has weight, and that memory is something they are actively shaping.
future possibilities
01 Therapeutic and healing spaces
direction: Trauma healing, Alzheimer's assistance, or end-of-life care environments
02 Public interactive exhibition
direction: Historical perception reconstruction in urban renewal project, or ecological protection areas, archaeological sites, digital museums, etc.
03 Remote Affective Communication
direction: Long-distance family ties, digital relics, virtual nostalgic space