
VI-SCENT.Lab_Data Dream


Touchdesigner | Stable Diffusion | Adobe After Effects

Interaction Design | VI-Scent Performance | Human-AI Cooperation | Artificial Intelligence Generated Art | Experience Design


Touchdesigner + Stable Diffusion: Olfactory Visualization

In the VI-SCENT.Lab_Data Dream project, I combined Stable Diffusion with Touchdesigner to create a real-time AI visual system driven by scent-memory prompts. Users enter their scent-triggered memories into the system, which uses natural language processing to transform those memories into a virtual visual world. The project explores the interplay of abstraction and rationality in how algorithms and AI represent the senses and emotions, and examines the increasingly blurred boundary between humans and AI in sensory and emotional interaction.
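A minimal sketch of the text-to-image half of this pipeline is shown below, assuming the Hugging Face diffusers library as a stand-in for the project's actual Stable Diffusion setup; the model ID, prompt, and sampling parameters are illustrative placeholders, and in the installation the generation is driven from inside Touchdesigner rather than from a standalone script.

```python
# Hypothetical standalone sketch: turn a scent-memory prompt into an image
# with Stable Diffusion via the diffusers library. Model ID and parameters
# are placeholders, not the project's actual configuration.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed base model
    torch_dtype=torch.float16,
).to("cuda")

# An example prompt built from a visitor's scent-triggered memory keywords
# plus a shared style description (both invented here for illustration).
prompt = ("wet pine forest, grandfather, campfire smoke, "
          "ethereal particles, soft volumetric light, abstract dreamscape")

image = pipe(prompt, num_inference_steps=25, guidance_scale=7.5).images[0]
image.save("scent_memory.png")
```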


As the creative lead, I collaborated closely with programmers to integrate olfactory interaction into this AI-generated art project. By training the AI to associate visual elements with sensory keywords, we developed an immersive, AI-enhanced experience that offers poetic and innovative interactions for brands in the tea, whiskey, and perfume industries. This fusion of technology and sensory experiences not only enriches brand engagement but also opens new possibilities for commercial applications.

Visual direction - Yuqing Liu

Touchdesigner - Yuqing Liu

Video editing - Yuqing Liu

Interaction design - Yuqing Liu

Background

This project builds on my previous work, in which I collected a scent memory archive from exhibition visitors. Using the narratives visitors left for the 24 scent samples I provided, I collated the different memories attached to each scent and extracted keywords from the written accounts. This allowed me to delve deeper into the power of scent as a memory trigger and to explore new ways of visualizing these olfactory memories with cutting-edge AI technology.
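The sketch below shows one minimal way such a keyword-extraction step could be automated in Python; the stopword list, example narrative, and frequency-based scoring are illustrative assumptions rather than the project's actual method.

```python
# Illustrative keyword extraction from a visitor's scent-memory narrative.
# The stopword list and example text are invented; the project's actual
# extraction approach may differ.
import re
from collections import Counter

STOPWORDS = {
    "the", "a", "an", "and", "of", "in", "it", "was", "i", "my", "me",
    "to", "with", "on", "that", "this", "when", "at", "from", "for",
}

def extract_keywords(narrative: str, top_n: int = 5) -> list[str]:
    """Return the most frequent non-stopword tokens in a narrative."""
    tokens = re.findall(r"[a-z']+", narrative.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return [word for word, _ in counts.most_common(top_n)]

# Example narrative a visitor might leave for one of the 24 scent samples.
memory = ("The smell of wet pine reminded me of camping with my grandfather, "
          "rain on the tent and smoke from a small fire.")
print(extract_keywords(memory))  # up to five content words to seed a prompt
```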


Development

After recording the essence of each audience member's scent-triggered memory, the system extracts key phrases and transforms them into prompts the AI can recognize. These prompts are combined with my own artistic style and output format, all within the dynamic environment of Touchdesigner. This fusion allows me to generate real-time visualizations of olfactory experiences, rendering scent memories into captivating visual forms. Each visualization uniquely represents an individual's memory, offering a novel and immersive way to experience and interact with the ephemeral nature of scent.
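As a sketch of how this step might look inside Touchdesigner, the snippet below combines extracted key phrases with a fixed style description into a single prompt whenever a keyword table updates; the operator names, the 'Prompt' parameter, and the style wording are all invented placeholders, not the project's actual network.

```python
# Sketch of a DAT Execute callback inside Touchdesigner's Python environment.
# Operator names ('keywords', 'stable_diffusion') and the 'Prompt' parameter
# are hypothetical placeholders for components in the real network.

# Shared style description appended to every prompt so that different
# visitors' memories are rendered in one consistent visual language.
STYLE_SUFFIX = "ethereal particles, soft volumetric light, abstract dreamscape"

def build_prompt(keywords):
    """Join a visitor's memory keywords with the shared style descriptors."""
    return ", ".join(keywords) + ", " + STYLE_SUFFIX

def onTableChange(dat):
    # Read key phrases from a one-column Table DAT holding the latest memory.
    keywords = [cell.val for cell in op('keywords').col(0)]
    # Push the assembled prompt to the Stable Diffusion component's prompt
    # parameter so the next generated frames reflect this memory.
    op('stable_diffusion').par.Prompt = build_prompt(keywords)
    return
```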

Olfactory Visualization of AI-Generated Artworks

Live Event

Case Study: Tea Brand Experience

Using my olfactory-visualization AI art generation system, I collaborated with Chinese tea, whiskey, and perfume brands to create a unique multi-sensory interactive experience for consumers, allowing them to connect AI-generated visuals directly with their personal sensory memories. I also transformed the real-time AI-generated visuals into 'scent memory stamps': participants could write down their memories and send them along with tea products, closing the loop of the commercial experience.