Interaction Design | Scent Installation | Human-AI Cooperation | Artificial Intelligence Generated Art | Experience Design
18/02/2023 - Present (ongoing project)
Individual project
VI-SCENT.Lab: Memory Archiving System
VI-SCENT.Lab_Memory Archiving System is an interdisciplinary research and artistic project exploring how human olfactory memories can be translated into machine-interpretable forms through artificial intelligence.
Through a participatory archive, participants recall memories triggered by specific scent stimuli and describe their emotional associations. These narratives are processed by generative AI models to produce visual interpretations of the sensory and emotional qualities embedded in the memories.
To date, the project has collected over 1,500 scent-triggered memory narratives across 24 scent categories through workshops, experiments, and exhibitions in the United Kingdom, Italy, and China.
By combining olfactory interaction, participatory data collection, and generative AI systems, the project proposes a new framework for human–AI communication in which smell functions as a bridge between personal memory, collective cultural knowledge, and computational interpretation.
Concept, Research & System Design – Yuqing Liu
Interaction Design & Visual Development – Yuqing Liu
Project Documentation – Meng Li
Background
Olfaction is one of the most powerful triggers of human memory, yet it remains one of the least explored senses within digital culture and human–machine interaction. Unlike visual or textual information, smell evokes memories in a deeply emotional and associative way that is often difficult to articulate or document. My earlier research began with collecting scent-triggered memory narratives from exhibition visitors using 24 categorized scent samples. Through these encounters, I observed how smell could activate highly personal recollections while also revealing patterns of shared cultural memory. Building on this archive-based practice, the project investigates whether olfactory memory can be translated into a perceptual language interpretable by artificial intelligence, allowing subjective sensory experiences to be computationally reinterpreted and transformed into an evolving visual archive.

System Methodology
The project is developed through a participatory research process that translates olfactory memory into computational visual representations. The system operates through four interconnected stages.
01. Olfactory Stimulus
Participants encounter a set of 24 curated scent samples, each designed to evoke associative memories. These scents function as sensory triggers that activate personal recollections, emotions, and narrative fragments.
02. Memory Narrative Collection
Participants are invited to describe the memories or emotions evoked by each scent stimulus. These written narratives form the primary dataset of the project, capturing subjective sensory experiences that are normally difficult to document or share.
03. AI Translation Process
Key semantic elements are extracted from these narratives and converted into structured prompts interpretable by generative AI models. Rather than producing literal illustrations, the system generates visual interpretations that reflect the emotional and sensory qualities embedded in the memories.
04. Visual Archive System
The resulting visual outputs are organised into an evolving olfactory memory archive, where each generated image represents a computational translation of an individual scent-triggered memory. Over time, this archive accumulates patterns of sensory perception and collective cultural memory.
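The translation stage above can be sketched as a minimal, hypothetical pipeline: extracting key semantic elements from a memory narrative and assembling them into a structured prompt for a generative image model. The function names, the tiny emotion lexicon, and the prompt template are illustrative assumptions, not the project's actual implementation (which would likely use a language model rather than keyword matching).

```python
import re

# Toy emotion lexicon; a real system would use a sentiment or language model.
EMOTION_WORDS = {"nostalgic", "warm", "calm", "melancholy", "joyful"}

def extract_elements(narrative: str) -> dict:
    """Pull emotion words and simple keyword candidates from a narrative."""
    tokens = re.findall(r"[a-z]+", narrative.lower())
    emotions = [t for t in tokens if t in EMOTION_WORDS]
    keywords = [t for t in tokens if len(t) > 4 and t not in EMOTION_WORDS]
    return {"emotions": emotions, "keywords": keywords[:5]}

def build_prompt(scent: str, narrative: str) -> str:
    """Convert a scent-triggered memory narrative into a structured prompt."""
    elements = extract_elements(narrative)
    mood = ", ".join(elements["emotions"]) or "neutral"
    scene = ", ".join(elements["keywords"])
    return (f"An abstract visual interpretation of the scent '{scent}': "
            f"{scene}; mood: {mood}")

prompt = build_prompt(
    "fresh-cut grass",
    "A warm summer afternoon in my grandmother's garden, nostalgic and calm",
)
print(prompt)
```

The key design choice this sketch illustrates is that the prompt encodes mood and scene separately, so the generative model interprets the emotional quality of the memory rather than illustrating it literally.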
Archive Dataset
The project is supported by a growing archive of scent-triggered memory narratives collected through participatory workshops, exhibitions, and experimental sessions. To date, the archive contains more than 1,500 individual scent-memory narratives across 24 scent categories, contributed by participants from the United Kingdom, Italy, and China. By aggregating these personal recollections, the archive begins to reveal patterns of shared sensory perception and cultural memory embedded within olfactory experiences. This evolving dataset forms the foundation of the project, enabling artificial intelligence systems to reinterpret human olfactory memory through computational visualisation.
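How such an archive could surface shared patterns across contributions can be sketched as follows; the field names and grouping logic are hypothetical illustrations, not the project's actual data schema.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class MemoryEntry:
    scent_category: str  # one of the 24 curated scent categories
    narrative: str       # the participant's written memory
    country: str         # where the narrative was collected

def group_by_scent(entries):
    """Group narratives by scent category to reveal shared themes."""
    archive = defaultdict(list)
    for entry in entries:
        archive[entry.scent_category].append(entry.narrative)
    return dict(archive)

entries = [
    MemoryEntry("citrus", "Peeling oranges at my desk in winter", "UK"),
    MemoryEntry("citrus", "Lemon trees in my aunt's courtyard", "Italy"),
    MemoryEntry("rain", "The first rain after a dry summer", "China"),
]
archive = group_by_scent(entries)
print(len(archive["citrus"]))  # → 2
```

Grouping by scent category is what lets individually subjective narratives be read collectively: recurring motifs within a category point to the shared cultural memory the archive aims to reveal.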




The project approaches scent as an interface for interaction, inviting participants to engage with olfactory stimuli as triggers for recalling personal memories and emotions. Rather than functioning as passive viewers, visitors become active contributors to the evolving archive by sharing the memories evoked through these sensory encounters. Artificial intelligence acts as a perceptual translation layer that transforms these subjective narratives into visual representations of olfactory memory. Through this process, individual recollections are gradually assembled into a growing archive that reflects how smell connects memory, emotion, and cultural experience. Over time, this archive becomes a shared sensory resource that allows audiences to explore scent-driven memories in new ways and opens possibilities for future scent-based virtual environments.




As a scent artist, nothing brings me more joy than seeing my work resonate with audiences and spark new connections and friendships. The love and support my work has received at exhibitions have been truly inspiring, and I am grateful for every conversation and interaction that has come from them. I am excited to continue developing this project, exploring new ways to capture and visualise scent memories and to push the boundaries of what is possible where scent and art meet. I look forward to the next phase of the project, and to sharing my passion for scent and memory with even more people.