Between Sound and Space

Designing and developing real-time audio-reactive visuals in virtual reality

Overview

Between Sound and Space is an independent research and development project exploring how sound can shape form, motion, and light within immersive space.

Over the course of a month, I developed an interactive experience where dynamic visuals react to live audio input, all within a VR environment.

Using TouchDesigner and the Oculus Rift, I turned live sound into an immersive, real-time visual environment.

With no existing documentation to guide the process, I relied entirely on self-directed experimentation to bring the visuals into VR. 

My Role

Creative Technologist, Visual Programmer, Technical Artist

Timeline

1 month

Tools

TouchDesigner, Python, OpenXR, Oculus Rift

Goals

  • Design, code, and integrate a visually engaging, audio-reactive system using TouchDesigner.

  • Integrate the system into a VR headset for a fully immersive experience.

  • Explore how sound-driven design can create immersion, rhythm, and emotional presence in real time.

Process

1. Exploration & Experimentation
This project began with a simple question: Can TouchDesigner’s real-time visuals exist inside a VR headset?
With no documentation or tutorials available, my process became entirely experimental. Every day in the XR Lab, I tested ways to route TouchDesigner’s output into a headset without sacrificing performance.
This became a form of research through creation.


2. Building the Visual System
Using TouchDesigner’s node-based programming environment, I designed and coded a modular system that processed audio data and translated it into visual behavior.
Each component controlled a different layer of motion, from geometry distortion and lighting to emission intensity and rhythm-based scaling, creating a dynamic, ever-changing 3D landscape.
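
As a rough illustration, the mapping layer can be sketched as a small Python table that routes named audio bands to operator parameters. The operator, channel, and parameter names below are hypothetical stand-ins for the actual network:

    # CHOP Execute DAT callback: route audio band levels to visual layers.
    # All operator and parameter names here are illustrative placeholders.
    MAPPINGS = {
        'bass':  lambda v: setattr(op('geo_terrain').par,  'scale', 1 + v * 2),
        'mids':  lambda v: setattr(op('ramp_palette').par, 'phase', v),
        'highs': lambda v: setattr(op('level_glow').par,   'brightness1', 1 + v),
    }

    def onValueChange(channel, sampleIndex, val, prev):
        # channel.name matches a band channel coming from the analysis CHOP
        handler = MAPPINGS.get(channel.name)
        if handler:
            handler(val)
        return

Keeping each mapping to a single parameter made layers easy to add, mute, or retune independently while testing.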


3. Coding Audio Reactivity
With Python scripting, I refined how sound data influenced visual parameters.
I used FFT analysis to detect frequency ranges, mapping bass to scale and motion, mids to color warmth, and highs to brightness and particle emission. This balance created visuals that felt synchronized to the sound without being overwhelming.
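
The band-splitting logic, rebuilt here in plain Python with numpy outside of TouchDesigner, looked roughly like this; the bin ranges and scaling are illustrative rather than the exact values I shipped:

    import numpy as np

    SAMPLE_RATE = 44100   # audio sample rate in Hz
    BLOCK = 1024          # samples per analysis window

    def band_levels(samples):
        """Split one audio block into bass / mid / high energy."""
        windowed = samples * np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), 1.0 / SAMPLE_RATE)

        def energy(lo, hi):
            band = spectrum[(freqs >= lo) & (freqs < hi)]
            return float(band.mean()) if band.size else 0.0

        return {
            'bass':  energy(20, 250),      # drives scale and motion
            'mids':  energy(250, 2000),    # drives color warmth
            'highs': energy(2000, 16000),  # drives brightness and emission
        }

    # Example: a 440 Hz test tone lands almost entirely in the 'mids' band.
    t = np.arange(BLOCK) / SAMPLE_RATE
    print(band_levels(np.sin(2 * np.pi * 440 * t)))

In practice, smoothing values like these over a few frames before they reach the visuals helps keep the motion legible rather than jittery.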


4. Integration into VR
I initially tested the system on the HTC Vive before finalizing integration with the Oculus Rift.
Because no standard pipeline existed, every connection and configuration was achieved through experimentation. Achieving a stable TouchDesigner → Oculus Rift workflow was the project’s core technical breakthrough.
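
One piece of that workflow can be sketched in Python: a per-frame callback that copies the headset pose onto a pair of eye cameras before the stereo render. The tracking CHOP and camera names are hypothetical, and offsetting the eyes along the world X axis is a simplification of a real stereo rig:

    # Execute DAT callback: drive stereo eye cameras from headset tracking.
    # 'hmd_pose', 'cam_left', and 'cam_right' are illustrative names.
    IPD = 0.064  # interpupillary distance in meters, split across both eyes

    def onFrameStart(frame):
        pose = op('hmd_pose')  # CHOP carrying headset position and rotation
        for cam_name, offset in (('cam_left', -IPD / 2), ('cam_right', IPD / 2)):
            cam = op(cam_name)
            cam.par.tx = pose['tx'].eval() + offset  # simplified: world-axis offset
            cam.par.ty = pose['ty'].eval()
            cam.par.tz = pose['tz'].eval()
            cam.par.rx = pose['rx'].eval()
            cam.par.ry = pose['ry'].eval()
            cam.par.rz = pose['rz'].eval()
        return

In a setup like this, each eye camera feeds its own render pass, and keeping both passes inside a single frame budget is where most of the tuning effort goes.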


5. Refinement & Iteration
I conducted multiple test sessions, refining both the visual system and the VR experience for smooth, real-time interaction. Each iteration revealed new possibilities for using sound as a design material, turning the project into an evolving sensory experiment.
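
The kind of performance guardrail this tuning relied on can be sketched as a frame-time watchdog that trades particle density for frame rate; the thresholds and the particle operator’s birth-rate parameter are assumptions, not the exact setup:

    import time

    TARGET_MS = 1000.0 / 90.0     # Rift-era headsets target 90 fps
    _last = [time.perf_counter()]

    def onFrameEnd(frame):
        now = time.perf_counter()
        frame_ms = (now - _last[0]) * 1000.0
        _last[0] = now

        birth = op('particles1').par.birth   # hypothetical birth-rate parameter
        if frame_ms > TARGET_MS * 1.2:       # running slow: shed visual load
            birth.val = max(100, birth.eval() * 0.9)
        elif frame_ms < TARGET_MS * 0.8:     # headroom: restore detail
            birth.val = min(5000, birth.eval() * 1.05)
        return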


6. Final Implementation
The project culminated in a seamless, immersive environment where users can experience sound as vibrant, evolving visuals in virtual reality.

Result

The final prototype rendered a real-time audio-reactive 3D environment inside the Oculus Rift. The visuals pulsed, expanded, and shifted in rhythm with sound, translating audio into geometry and light in a way that felt tangible. Inside VR, the experience became meditative and spatial; sound transformed into something you could almost touch. What began as an experiment in technical integration evolved into a study of presence, rhythm, and sensory design.

(Please switch on audio!)

User Testing

Challenges

  • No existing workflow for connecting TouchDesigner directly to VR. The main hurdle was the scarcity of resources and documentation for integrating TouchDesigner with VR hardware; much of the process involved hands-on experimentation, troubleshooting compatibility issues, and learning through trial and error in the XR Lab.

  • Maintaining real-time responsiveness while streaming to the Oculus Rift

  • Balancing performance, visual fidelity, and user comfort in 3D space

Reflection


This project taught me how code and sound can converge to create emotion in motion. It reinforced my belief that true immersion isn’t about realism but resonance: when what you hear and what you see move together, presence becomes palpable.

Further Considerations

  • Integrate hand-tracking or gesture-based controls to let users sculpt visuals in real time

  • Open-source a TouchDesigner-to-VR workflow template for artists and designers

  • Create accessibility modes using visual rhythm for non-auditory interaction
