xr audio-vfx

Audio-reactive concept in Unity

overview

This project focuses on the interaction between music and visual effects in extended reality settings, exploring both procedural effects and user input to drive custom interactions.

So far, I've explored these concepts in both augmented and virtual reality. I'm currently focusing on AR, building toward a product that can be used at concerts and music festivals to expand the visual performance with immersive 3D effects.

process

Concept

The production value of music festivals has expanded enormously over the last decade, yet there are obvious constraints to what's possible in a physical environment. In contrast, digital worlds have few limitations, allowing the laws of physics to be ignored.

This product aims to connect both physical and digital settings in order to push the boundaries of what's possible for real-time, live visual effects.

Concept sketch showing AR content above a crowd

VR PROTOTYPING

My first explorations for this concept were purely in VR. My goal was to learn best practices for capturing audio data and translating it into values that I could use to control my environment. To get the most out of the audio, I divided the signal into separate frequency bands so that different variables could react to different elements in the music.
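
As a rough illustration of that approach (not the project's exact code), the sketch below samples Unity's FFT output each frame and averages it into a few coarse bands; the band boundaries and smoothing value are placeholder choices.

```csharp
using UnityEngine;

// Minimal sketch: split AudioSource spectrum data into coarse frequency bands.
// The band boundaries and smoothing factor are illustrative, not the
// project's actual values.
public class AudioBandSampler : MonoBehaviour
{
    public AudioSource source;
    [Range(0f, 1f)] public float smoothing = 0.5f;

    readonly float[] spectrum = new float[512];   // FFT size must be a power of two
    public float[] Bands { get; } = new float[4]; // e.g. bass, low-mid, high-mid, treble

    // Hypothetical bin ranges dividing the 512 bins into the 4 bands.
    static readonly int[] bandEdges = { 0, 8, 64, 256, 512 };

    void Update()
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        for (int b = 0; b < Bands.Length; b++)
        {
            float sum = 0f;
            for (int i = bandEdges[b]; i < bandEdges[b + 1]; i++)
                sum += spectrum[i];

            float value = sum / (bandEdges[b + 1] - bandEdges[b]);
            // Smooth the band value so visuals don't flicker frame to frame.
            Bands[b] = Mathf.Lerp(value, Bands[b], smoothing);
        }
    }
}
```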

I started working with various forms of geometry and experimented with scale, material emissivity, and other custom shader parameters to achieve effects only possible in a digital environment.
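
A minimal sketch of how a single band value might drive scale and emission; it assumes a material that exposes _EmissionColor (as Unity's Standard and URP Lit shaders do) and reuses the hypothetical AudioBandSampler from the sketch above.

```csharp
using UnityEngine;

// Sketch: drive object scale and material emission from one audio band.
// Assumes the material exposes "_EmissionColor"; multipliers are illustrative.
public class AudioReactiveMesh : MonoBehaviour
{
    public AudioBandSampler sampler;
    public int bandIndex = 0;           // which frequency band to react to
    public float scaleAmount = 2f;      // illustrative scale multiplier
    public float emissionAmount = 4f;   // illustrative emission multiplier
    public Color emissionColor = Color.cyan;

    Renderer rend;
    Vector3 baseScale;

    void Start()
    {
        rend = GetComponent<Renderer>();
        rend.material.EnableKeyword("_EMISSION");
        baseScale = transform.localScale;
    }

    void Update()
    {
        float level = sampler.Bands[bandIndex];
        transform.localScale = baseScale * (1f + level * scaleAmount);
        rend.material.SetColor("_EmissionColor", emissionColor * level * emissionAmount);
    }
}
```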

Early geometry study experimenting with scale.
Study showing a combination of scale and emissivity.
Bringing visual elements together to compose a stage.
Adding additional lighting and customizable effects.

One of my favorite effects was a custom shader I built for a canopy that covered the stage area.

I used a frequency band mapped to the lower frequencies to pick up information from the kick and bass. This allowed the canopy to pulse in sync with the backbone of the track.
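
In code, that mapping can be as simple as writing the bass band into an exposed property on the canopy material each frame; the _PulseStrength name and gain below are hypothetical stand-ins for the actual shader parameter.

```csharp
using UnityEngine;

// Sketch: feed the low-frequency band into a custom canopy shader.
// "_PulseStrength" is a hypothetical exposed property on the canopy material.
public class CanopyPulse : MonoBehaviour
{
    public AudioBandSampler sampler;
    public Renderer canopyRenderer;
    public float pulseGain = 10f; // illustrative gain

    static readonly int PulseStrength = Shader.PropertyToID("_PulseStrength");

    void Update()
    {
        float bass = sampler.Bands[0]; // band 0 = lowest frequencies (kick and bass)
        canopyRenderer.material.SetFloat(PulseStrength, bass * pulseGain);
    }
}
```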

All of these effects were great to look at, but they were missing a level of user interaction.

My next step was to build an interface that could accept user input. I was happy with how the shaders already interacted with the audio, so I added an additional layer of lighting for users to interact with.

These controls allow a user to easily change the configuration of the lights, including the color, movement, and strobe effects.

After some testing, I discovered a need to adjust many settings at the same time, so I included some presets to make transitions as quick as possible.
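
As a sketch of how those presets might be structured (the field names and values here are hypothetical, not the project's actual data model), a single preset object can bundle color, movement, and strobe settings so one tap reconfigures the whole rig:

```csharp
using UnityEngine;

// Sketch: a lighting preset that bundles color, movement, and strobe settings
// so they can all be applied in one action. Values are illustrative.
[System.Serializable]
public class LightingPreset
{
    public Color color = Color.magenta;
    public float rotationSpeed = 30f; // degrees per second
    public float strobeRate = 0f;     // flashes per second, 0 = off
    public float intensity = 2f;
}

public class StageLightRig : MonoBehaviour
{
    public Light[] lights;
    public LightingPreset[] presets;

    LightingPreset active;
    float strobeTimer;

    public void ApplyPreset(int index)
    {
        active = presets[index];
        foreach (var l in lights)
        {
            l.color = active.color;
            l.intensity = active.intensity;
        }
    }

    void Update()
    {
        if (active == null) return;

        // Movement: rotate the rig at the preset's speed.
        transform.Rotate(0f, active.rotationSpeed * Time.deltaTime, 0f);

        // Strobe: toggle the lights on a 50% duty cycle when a rate is set.
        if (active.strobeRate > 0f)
        {
            strobeTimer += Time.deltaTime;
            bool on = Mathf.Repeat(strobeTimer * active.strobeRate, 1f) < 0.5f;
            foreach (var l in lights) l.enabled = on;
        }
    }
}
```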

AR PROTOTYPING

To continue my explorations, I shifted my focus to augmented reality, as these types of effects take on a whole new perspective when placed in your real environment.

This early experiment shows how music can interact with a particle system made with Unity's VFX Graph.

This example combines shaders and Unity VFX to achieve a unique effect that responds in real time to audio input.

Signed distance fields allow particle strips to conform to the surface of a cube, while the intensity of the audio controls scaling and rotation speed.
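
The glue between the audio and the graph is just exposed properties. The sketch below pushes an overall intensity value into two hypothetical exposed parameters, Scale and RotationSpeed, on a VisualEffect component; the property names and gains are placeholders.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: drive exposed VFX Graph properties from overall audio intensity.
// "Scale" and "RotationSpeed" are hypothetical parameters on the graph's blackboard.
public class AudioDrivenVFX : MonoBehaviour
{
    public AudioBandSampler sampler;
    public VisualEffect vfx;
    public float scaleGain = 3f;     // illustrative
    public float rotationGain = 90f; // illustrative

    void Update()
    {
        // Overall intensity: average of all frequency bands.
        float intensity = 0f;
        foreach (float b in sampler.Bands) intensity += b;
        intensity /= sampler.Bands.Length;

        vfx.SetFloat("Scale", 1f + intensity * scaleGain);
        vfx.SetFloat("RotationSpeed", intensity * rotationGain);
    }
}
```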

LIVE TESTING

This early test shows how these types of effects can be integrated into live music settings, offering an extension of the visual performance.

Although there are fewer opportunities to test in this environment, it has proven critical to do so in order to dial in normalization settings so that effects can respond to audio at a variety of volumes.
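
A simple way to handle varying volumes is to normalize each band against a slowly decaying running maximum, so the same effect ranges work whether the audio is quiet or loud. The sketch below is one such normalizer; the decay rate and floor are placeholder values.

```csharp
using UnityEngine;

// Sketch: normalize a raw band value against a decaying running maximum
// so effects respond consistently at different playback volumes.
public class BandNormalizer : MonoBehaviour
{
    public float decayPerSecond = 0.2f; // how quickly the running max relaxes
    public float floor = 0.0001f;       // avoids divide-by-zero during silence

    float runningMax;

    // Call once per frame with the raw band value; returns a 0-1 level.
    public float Normalize(float raw)
    {
        runningMax = Mathf.Max(raw, runningMax - decayPerSecond * Time.deltaTime);
        return Mathf.Clamp01(raw / Mathf.Max(runningMax, floor));
    }
}
```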

One of the most successful iterations used a mesh-based particle effect combined with a custom shader that dissolves the mesh instances based on audio input.
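
As a rough sketch of that idea (the _DissolveThreshold property is hypothetical and depends on the dissolve shader), each instance's threshold can be written through a MaterialPropertyBlock so louder audio reveals more of the mesh:

```csharp
using UnityEngine;

// Sketch: dissolve mesh instances based on audio input by writing a
// per-renderer threshold through a MaterialPropertyBlock.
// "_DissolveThreshold" is a hypothetical property on the dissolve shader.
public class AudioDissolve : MonoBehaviour
{
    public AudioBandSampler sampler;
    public BandNormalizer normalizer;
    public Renderer[] instances;

    static readonly int DissolveThreshold = Shader.PropertyToID("_DissolveThreshold");
    readonly MaterialPropertyBlock block = new MaterialPropertyBlock();

    void Update()
    {
        // Louder audio -> lower threshold -> more of each mesh instance visible.
        float level = normalizer.Normalize(sampler.Bands[0]);
        float threshold = 1f - level;

        foreach (var r in instances)
        {
            r.GetPropertyBlock(block);
            block.SetFloat(DissolveThreshold, threshold);
            r.SetPropertyBlock(block);
        }
    }
}
```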

MIXED REALITY

The ultimate vision for this project is to create a fully immersive experience on mixed reality headsets that breaks down the current barriers of mobile AR. With new devices like Apple's Vision Pro on the horizon, I've started prototyping in mixed reality on my Quest 2 to explore the possibilities a device like that might bring.

The passthrough on these devices allows for more complex, physics-based interactions with music and VFX that could bring a whole new type of immersive experience to concerts and music festivals.

looking forward

This project is still very much in an exploratory phase, but as I shift to mixed reality environments and explore increasingly spatial shaders and VFX, I'm feeling more inspired than ever to make this a viable product for music festivals of the future.

I will likely release a build of the mobile AR app on TestFlight sometime soon. I plan to update this page when it's available.