
Emotional Echoes

A Real-time AI-Driven Synesthetic Experience

Interactive Installation, 2025
Tools: Stable Diffusion, TouchDesigner.


"Emotional Echoes" is an experimental project exploring how artificial intelligence perceives, interprets, and visualizes human emotions through music. By using real-time Valence-Arousal (VA) analysis of vocal and musical input, and generating visuals with a customized Stable Diffusion model, the project functions as an "Emotion-to-Visual Translator," fostering a cross-linguistic, cross-cultural emotional dialogue.










CORE IDEAS


How can AI understand and visualize the essence of human emotion?
  • The project positions itself as an emotion-to-visual translator in the era of machine perception.
  • An exploration of how machines interpret musical affect.

THEORETICAL SUPPORT


  • Psychoacoustics and affective computing
  • Synesthesia and sensory fusion
  • New paradigms of emotional expression in the digital age













OPERATIONAL WORKFLOW


1. Audience Interaction

The audience is invited to speak, hum, or vocalize into the microphone.
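
To make the capture step concrete, here is a minimal sketch using Python's sounddevice library; the installation's actual audio chain is not specified, so the sample rate and window length are assumptions.

import sounddevice as sd

SAMPLE_RATE = 16_000   # Hz; a common rate for speech analysis (assumed)
WINDOW_SECONDS = 3     # length of each analysis window (assumed)

def capture_audio():
    """Record one mono window from the default microphone."""
    audio = sd.rec(int(WINDOW_SECONDS * SAMPLE_RATE),
                   samplerate=SAMPLE_RATE, channels=1, dtype="float32")
    sd.wait()  # block until the recording buffer is full
    return audio[:, 0]  # flatten to a 1-D float array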

2. Real-time Emotion Analysis
The captured audio is streamed to a laptop, where an AI model locates its emotional tone on a Valence-Arousal (VA) plane and maps the result to affect labels such as joy, melancholy, excitement, or calmness.
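
The project's trained model is not public, so the sketch below substitutes a toy heuristic built on librosa features: loudness stands in for arousal and spectral brightness for valence, with arbitrary scaling constants. The quadrant labels mirror the examples above.

import numpy as np
import librosa

def estimate_va(audio, sr=16_000):
    """Map an audio window to rough (valence, arousal) values in [-1, 1]."""
    energy = librosa.feature.rms(y=audio).mean()                     # loudness
    brightness = librosa.feature.spectral_centroid(y=audio, sr=sr).mean()
    arousal = float(np.clip(energy * 10.0 - 1.0, -1.0, 1.0))         # louder -> higher arousal
    valence = float(np.clip(brightness / (sr / 4.0) * 2.0 - 1.0, -1.0, 1.0))  # brighter -> more positive
    return valence, arousal

def va_to_label(valence, arousal):
    """Collapse the VA plane onto the four labels named in the text."""
    if valence >= 0:
        if arousal >= 0:
            return "excitement" if arousal > 0.5 else "joy"
        return "calmness"
    return "melancholy"  # the negative-valence half, simplified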

3. Generative Visual Output
Based on the emotional classification, the customized Stable Diffusion model generates a synesthetic visual interpretation of the sound.
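
As one possible realization of this step, the sketch below uses Hugging Face diffusers with the public runwayml/stable-diffusion-v1-5 weights as a stand-in for the project's customized model; the prompt template mapping VA values to imagery is an illustrative assumption, not the artist's actual mapping.

import torch
from diffusers import StableDiffusionPipeline

# Public weights as a stand-in for the project's customized model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def generate_frame(valence, arousal, label):
    """Turn one VA reading into a synesthetic frame."""
    prompt = (f"abstract synesthetic visualization of {label}, "
              f"{'warm' if valence >= 0 else 'cold'} color palette, "
              f"{'turbulent, dynamic' if arousal >= 0 else 'slow, drifting'} forms")
    # Few denoising steps trade fidelity for near-real-time latency.
    return pipe(prompt, num_inference_steps=12, guidance_scale=7.0).images[0]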

4. Visual Rendering & Projection
The generated visuals are rendered in TouchDesigner and sent directly to the projector for real-time display on the screen.
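
One simple hand-off between the generator and TouchDesigner is sketched below: each frame is written to disk for a Movie File In TOP to re-read, and an OSC message (via the python-osc package) tells the TD network a new frame is ready. The file path, OSC address, and port are hypothetical; Spout or NDI streaming would be common alternatives.

from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 7000)    # port of an OSC In DAT in TouchDesigner (assumed)
FRAME_PATH = "latest_frame.png"             # file a Movie File In TOP points at (assumed)

def push_frame(image):
    """image: the PIL image returned by the diffusion pipeline."""
    image.save(FRAME_PATH)                  # the TOP re-reads this file
    osc.send_message("/frame/ready", 1)     # notify the TD network to refresh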

5. Audio Feedback
Simultaneously, the audience’s voice is played back through the speaker, creating a full audiovisual feedback loop that immerses the participant.
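
In code terms, closing the loop is a one-liner with the same sounddevice library used in the capture sketch; a real installation would more likely route playback through a dedicated audio interface.

import sounddevice as sd

def play_back(audio, sample_rate=16_000):
    """Replay the visitor's captured voice through the speakers."""
    sd.play(audio, samplerate=sample_rate)  # starts non-blocking playback
    sd.wait()                               # or omit, to overlap with image generation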