vr synth

Still shot from the demo video below showing physics-based interactions with hand tracking.

overview

This experience explores the idea of natural music creation in XR through the use of hand tracking and physics-based interactions.

The ultimate goal is to create a product that seamlessly integrates musical instruments with little to no learning curve. By doing away with controllers, anyone can instinctively play melodies and explore sound design through various knobs and effects controlled directly with their hands.

process

concept

The initial concept was to create a digital clone of one of my physical synthesizers in XR. Analog synths can be fairly expensive, so I thought it would be great to achieve a similar experience and sound in XR that could be accessible to more people.

One of the best parts of a physical synth is its tactile nature. Because of this, I knew I wanted to explore hand tracking to get as close to the real thing as possible.

early prototyping

To make the synth work with physics-based interactions, the first step was creating a few keys in Unity using rigidbodies, colliders, and joints to get the behavior I wanted.
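
The project's scripts aren't reproduced here, but a single key along these lines might look something like the sketch below. For brevity it stands in Rigidbody constraints for the joints mentioned above, and all the names in it (SynthKey, the Pressed/Released events) are illustrative rather than the project's own:

```csharp
using UnityEngine;

// Sketch of one physics-driven key (illustrative names): a non-kinematic
// Rigidbody locked to vertical travel, a spring force that returns it to
// rest, and a press event fired once it has travelled past a threshold.
[RequireComponent(typeof(Rigidbody), typeof(BoxCollider))]
public class SynthKey : MonoBehaviour
{
    public float pressThreshold = 0.008f; // downward travel (m) that counts as a press
    public float springStrength = 200f;   // how hard the key pushes back toward rest

    public event System.Action Pressed;
    public event System.Action Released;

    Rigidbody body;
    Vector3 restPosition;
    bool isDown;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        body.useGravity = false;
        // Only vertical travel: no rotation, no sideways slip.
        body.constraints = RigidbodyConstraints.FreezeRotation |
                           RigidbodyConstraints.FreezePositionX |
                           RigidbodyConstraints.FreezePositionZ;
        restPosition = transform.position;
    }

    void FixedUpdate()
    {
        // Spring the key back toward its rest height.
        float travel = restPosition.y - transform.position.y;
        body.AddForce(Vector3.up * travel * springStrength, ForceMode.Acceleration);

        // Edge-detect the press with a little hysteresis so it doesn't chatter.
        if (!isDown && travel > pressThreshold) { isDown = true; Pressed?.Invoke(); }
        else if (isDown && travel < pressThreshold * 0.5f) { isDown = false; Released?.Invoke(); }
    }
}
```

With a setup like this, a tracked fingertip carrying its own collider can press the key purely through physics, with no raycasts or gesture recognition involved.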

From there, I worked on creating knobs that could be mapped to essential controls such as the low pass filter, envelopes, and effects through a custom C# script.
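
The actual mapping script isn't shown in this write-up, but the idea can be sketched as follows; the mixer asset and the exposed "LowpassCutoff" parameter are placeholder names, not the project's real setup:

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Sketch of a knob (illustrative names): read the knob's twist angle each
// frame and map it onto a parameter exposed on an AudioMixer. The knob
// itself is assumed to be rotated physically by the user's tracked hand.
public class SynthKnob : MonoBehaviour
{
    public AudioMixer mixer;                      // mixer with an exposed parameter
    public string exposedParam = "LowpassCutoff"; // assumed parameter name
    public float minValue = 200f;    // Hz at full counter-clockwise
    public float maxValue = 20000f;  // Hz at full clockwise
    public float minAngle = -135f;
    public float maxAngle = 135f;

    void Update()
    {
        // Signed twist around the knob's local up axis, mapped into [-180, 180].
        float angle = Mathf.DeltaAngle(0f, transform.localEulerAngles.y);
        float t = Mathf.InverseLerp(minAngle, maxAngle, angle);

        // Pitch perception is roughly logarithmic, so interpolate the cutoff that way.
        float cutoff = minValue * Mathf.Pow(maxValue / minValue, t);
        mixer.SetFloat(exposedParam, cutoff);
    }
}
```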

With the basic building blocks in place, I assembled a whole octave of the keyboard so I could test how well the hand tracking was working to play different notes.
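
A simple layout helper gives the flavor of that step; the names are illustrative, and this ignores the offset geometry of the black keys:

```csharp
using UnityEngine;

// Marker so each key knows which semitone it plays (illustrative).
public class KeySemitone : MonoBehaviour { public int offset; }

// Spawns one key per semitone, root through octave, spaced along local X.
public class KeyboardBuilder : MonoBehaviour
{
    public GameObject keyPrefab;      // a prefab using the SynthKey sketch above
    public float keySpacing = 0.035f; // meters between key centers

    void Start()
    {
        for (int semitone = 0; semitone <= 12; semitone++) // 13 keys total
        {
            var key = Instantiate(keyPrefab, transform);
            key.transform.localPosition = Vector3.right * (semitone * keySpacing);
            key.name = $"Key_{semitone}";
            key.AddComponent<KeySemitone>().offset = semitone;
        }
    }
}
```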

Beyond the keyboard and knobs, the other critical component of my prototype was a signal flow that would route my audio appropriately and give the user the same level of control over the sound design that one would expect from any synthesizer.

This involved combining Unity's built-in audio mixer and effects with more synth-specific functionality, such as envelopes, implemented through my own scripts.
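
Unity's mixer has no per-note envelope of its own, so that part has to live in a script. A heavily simplified linear ADSR applied to an AudioSource's volume might look like this (the names and linear segments are illustrative; a production version would run at audio rate via OnAudioFilterRead rather than per frame):

```csharp
using UnityEngine;

enum EnvStage { Idle, Attack, Decay, Sustain, Release }

// Sketch of a per-voice amplitude envelope (illustrative): a linear ADSR
// evaluated once per frame and written to the AudioSource's volume.
[RequireComponent(typeof(AudioSource))]
public class AmpEnvelope : MonoBehaviour
{
    public float attack = 0.01f;              // seconds to reach full level
    public float decay = 0.15f;               // seconds to fall to sustain
    [Range(0f, 1f)] public float sustain = 0.7f;
    public float release = 0.3f;              // seconds to fade out after release

    AudioSource source;
    EnvStage stage = EnvStage.Idle;
    float level;                              // current envelope level, 0..1

    void Awake() { source = GetComponent<AudioSource>(); source.volume = 0f; }

    public void NoteOn()  { stage = EnvStage.Attack; }
    public void NoteOff() { if (stage != EnvStage.Idle) stage = EnvStage.Release; }

    void Update()
    {
        float dt = Time.deltaTime;
        switch (stage)
        {
            case EnvStage.Attack:
                level += dt / Mathf.Max(attack, 1e-4f);
                if (level >= 1f) { level = 1f; stage = EnvStage.Decay; }
                break;
            case EnvStage.Decay:
                level -= dt * (1f - sustain) / Mathf.Max(decay, 1e-4f);
                if (level <= sustain) { level = sustain; stage = EnvStage.Sustain; }
                break;
            case EnvStage.Release:
                level -= dt / Mathf.Max(release, 1e-4f);
                if (level <= 0f) { level = 0f; stage = EnvStage.Idle; }
                break;
        }
        source.volume = level;
    }
}
```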

To simplify the process, I decided to use looped samples instead of true oscillators, limited the synth to a single oscillator (a saw wave), and kept it in mono.
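
In practice that boils down to a single always-running AudioSource looping the saw sample, retuned for each key by the standard equal-temperament ratio of 2^(n/12) for n semitones. A sketch, with illustrative names and assuming the AmpEnvelope above:

```csharp
using UnityEngine;

// Sketch of the mono looped-sample voice (illustrative names): the saw
// sample, recorded at the root note, loops continuously while the envelope
// gates the volume; pressing a key only retunes the playback pitch.
[RequireComponent(typeof(AudioSource), typeof(AmpEnvelope))]
public class MonoVoice : MonoBehaviour
{
    AudioSource source;
    AmpEnvelope envelope;

    void Awake()
    {
        source = GetComponent<AudioSource>(); // the looped saw sample is its clip
        source.loop = true;
        source.Play();                        // runs forever; the envelope does the gating
        envelope = GetComponent<AmpEnvelope>();
    }

    // Wired to a key's Pressed event along with its semitone offset from the root.
    public void NoteOn(int semitoneOffset)
    {
        // Each semitone up multiplies the playback rate by 2^(1/12).
        source.pitch = Mathf.Pow(2f, semitoneOffset / 12f);
        envelope.NoteOn();
    }

    public void NoteOff() { envelope.NoteOff(); }
}
```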

testing & iterating

Once I was happy with my setup for the signal flow, my focus turned to usability. My main goal was a consistent and reliable physics response that would allow me to naturally hit the desired notes.

Through several rounds of self-testing, as well as reaching out to some friends to do the same, it became clear that the hand tracking just wasn't accurate enough to play the synth reliably. Whether it was accidentally hitting two notes rather than one or missing a note entirely, the experience felt more like a struggle than something that came naturally.

To make the experience more enjoyable, I opted to rescale the keyboard, which greatly reduced the errors being made while playing it. The rescaled synth felt more like hitting a set of drums than a keyboard, but it was much easier to play and, as a result, much more fun.

Although it was now much easier to hit the correct keys, another usability issue involved the overall physics setup: I was getting some strange behaviors and unwanted interactions between my hands, the table, and the keys. Through some additional testing, however, I was able to reconfigure the colliders in Unity and get back to a setup that behaved as expected.
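
Whether done in the editor or in code, the gist of that kind of fix can be illustrated with Unity's layer collision matrix; the layer names here are hypothetical:

```csharp
using UnityEngine;

// Illustrative collision filtering: give hands, keys, and the table their own
// physics layers (defined in the project settings), then disable the pairs
// that should never interact.
public class CollisionMatrixSetup : MonoBehaviour
{
    void Awake()
    {
        int hands = LayerMask.NameToLayer("Hands");
        int table = LayerMask.NameToLayer("Table");
        int keys  = LayerMask.NameToLayer("Keys");

        Physics.IgnoreLayerCollision(hands, table, true); // fingers pass through the table
        Physics.IgnoreLayerCollision(keys, keys, true);   // neighboring keys never jostle each other
    }
}
```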

refinement

With the major usability issues out of the way, I was able to finally build a full setup that I could easily play. For the first time, I could hit all the notes I intended to, and the synth started to sound much more musical as a result.

To refine the prototype further, I focused on creating a more inspiring environment that could respond visually to the audio and the physical interactions with the synth. One of the main parts of this was a shader and particle effect that made the keys glow when hit. I used other shaders to complete the look of the synth and environment in a similar style.
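
As an illustration of the glow feedback (the component and property names are assumed; "_EmissionColor" is the emission property on Unity's Standard shader), a key could pulse its emission and fire a particle burst when pressed:

```csharp
using UnityEngine;

// Sketch of per-key hit feedback (illustrative names): on press, set the glow
// to full and play a particle burst, then fade the emission back down.
[RequireComponent(typeof(Renderer))]
public class KeyGlow : MonoBehaviour
{
    public Color glowColor = Color.cyan;
    public float fadeSpeed = 4f;          // glow units faded per second
    public ParticleSystem burst;          // optional burst played on each hit

    Material material;
    float glow;

    void Awake()
    {
        material = GetComponent<Renderer>().material; // per-key material instance
        material.EnableKeyword("_EMISSION");
    }

    public void OnPressed()               // wire this to the key's Pressed event
    {
        glow = 1f;
        if (burst != null) burst.Play();
    }

    void Update()
    {
        glow = Mathf.MoveTowards(glow, 0f, fadeSpeed * Time.deltaTime);
        material.SetColor("_EmissionColor", glowColor * glow);
    }
}
```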

As a final touch, I added the ability to play a drum loop in the background that creates a rhythm to jam to and serves as the backbone for a song. In the future, I'd love to integrate a sequencer that can make this part of the experience more customizable and interactive.

Front view of the synth setup within Unity's scene view.

looking forward

In testing the build shown above with a few other people, it's clear there are still improvements to be made. For a taller person like me (I'm 6' 3"), all the notes are within reach, but for shorter people it's more difficult. The keyboard is also approaching the limit of the range in which hand tracking works.

On top of that, there's a noticeable amount of latency that affects how well a user can stay in time with the beat, and there are a number of improvements I could make to the sound as well.

Although there are still many improvements that can be made, I've learned a lot throughout the testing process so far, and I'm excited to apply those learnings towards the next iteration.

Top-down view showing one of the smaller-scale iterations of the synth.