This set of virtual reality experiments explores a variety of custom interactions programmed in C# and built with Unity's XR Interaction Toolkit. Each began as a prompt from an 8-week hands-on bootcamp aimed at advancing my C# knowledge and my understanding of real-time physics interactions in Unity.
The prototypes below are intended to demonstrate functionality; the underlying concepts, however, can be applied in many other settings to build more fully formed games and experiences.
Each build was tested on my Oculus Quest 2; however, the XR Interaction Toolkit they are built on is platform-agnostic, so these experiences should work on most other major headsets.
Modular Firearm - This interaction explores the idea of incorporating various attachments to a firearm, each enabling a different ability, that can be mixed and matched to create a completely new weapon. A large magazine enables fully automatic fire, while a plasma cartridge supercharges the bullets. An advanced scope lets you zoom in for pinpoint accuracy and also features a laser that shows each bullet's trajectory without looking through the scope.
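One way to structure this kind of mix-and-match system is to have every attachment modify a shared stats object when it is socketed or removed. The sketch below is a minimal illustration, not the actual project code; the class and field names (`GunStats`, `LargeMagAttachment`, `fullAuto`) are hypothetical.

```csharp
using UnityEngine;

// Shared mutable stats that the gun reads from when firing.
[System.Serializable]
public class GunStats
{
    public bool fullAuto;            // large magazine toggles this
    public float damageMultiplier = 1f; // plasma cartridge raises this
    public bool laserSight;          // advanced scope toggles this
}

// Base class: each attachment applies/reverts its effect on socket/unsocket.
// In practice these methods would be wired to an XRSocketInteractor's
// selectEntered / selectExited events.
public abstract class GunAttachment : MonoBehaviour
{
    public abstract void Apply(GunStats stats);
    public abstract void Remove(GunStats stats);
}

// Example attachment: the large magazine switches the gun to full-auto.
public class LargeMagAttachment : GunAttachment
{
    public override void Apply(GunStats stats) => stats.fullAuto = true;
    public override void Remove(GunStats stats) => stats.fullAuto = false;
}
```

Because each attachment only touches its own stats, any combination of sockets composes naturally into a "new" weapon without per-combination code.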
Level Editor - This mini-project is a fully functional app that allows users to create a completely customized environment from a set of prefabs that can be spawned in different colors or deleted by throwing them away. A user can switch between edit mode and play mode to test their design seamlessly. Creations can be saved to a JSON file and loaded in a new session when the user wants to continue working on their level.
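A common Unity pattern for this kind of save/load flow is to record each spawned prefab's index, pose, and color in serializable classes and round-trip them with `JsonUtility`. This is a hedged sketch of that idea, not the project's actual data layout; `PlacedObject`, `LevelData`, and `LevelIO` are hypothetical names.

```csharp
using System.Collections.Generic;
using UnityEngine;

// One saved object: which prefab it was, where it sits, and its chosen color.
[System.Serializable]
public class PlacedObject
{
    public int prefabIndex;
    public Vector3 position;
    public Quaternion rotation;
    public Color color;
}

[System.Serializable]
public class LevelData
{
    public List<PlacedObject> objects = new List<PlacedObject>();
}

public static class LevelIO
{
    // JsonUtility serializes Vector3/Quaternion/Color fields out of the box.
    public static void Save(LevelData data, string path) =>
        System.IO.File.WriteAllText(path, JsonUtility.ToJson(data, true));

    public static LevelData Load(string path) =>
        JsonUtility.FromJson<LevelData>(System.IO.File.ReadAllText(path));
}
```

On load, the editor would walk `objects`, instantiate each prefab by index, and restore its pose and color, which is what makes resuming a session in a fresh scene possible.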
Mobile Event Horizon - This interaction explores the idea of shooting a projectile that turns into a black hole when it explodes, pulling in nearby objects with its massive gravity. A trajectory curve also lets the user see where the projectile will land.
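The pull effect can be sketched as a physics query each fixed timestep: find every rigidbody in range and accelerate it toward the black hole's center. This is an illustrative approximation under assumed tuning values (`pullRadius`, `pullStrength`), not the project's exact implementation; the trajectory preview is omitted here.

```csharp
using UnityEngine;

// Spawned at the point where the projectile explodes.
public class BlackHole : MonoBehaviour
{
    public float pullRadius = 5f;
    public float pullStrength = 30f;

    void FixedUpdate()
    {
        // Grab every collider in range and pull its rigidbody toward the center.
        foreach (Collider col in Physics.OverlapSphere(transform.position, pullRadius))
        {
            Rigidbody rb = col.attachedRigidbody;
            if (rb == null) continue;

            Vector3 toCenter = transform.position - rb.position;
            // Inverse-square falloff, clamped so objects near the center
            // don't receive an enormous force and get flung back out.
            float falloff = 1f / Mathf.Max(toCenter.sqrMagnitude, 0.5f);
            rb.AddForce(toCenter.normalized * pullStrength * falloff,
                        ForceMode.Acceleration);
        }
    }
}
```

Using `ForceMode.Acceleration` makes light and heavy objects fall inward at the same rate, which reads as "gravity" rather than a magnet.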
Control Volume - This interaction is one of my favorites, because it really takes advantage of VR's strengths by leveraging spatial 3D controls that would not be practical otherwise. Once inside the control volume, the user can click a button on the controller to assume control of the spaceship, steering it with the hand's position and rotation within the volume.
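One plausible mapping, sketched below, drives the ship's velocity from the hand's offset inside the volume and eases the ship's rotation toward the hand's rotation. The field names and gains are assumptions for illustration; the real project may map the inputs differently.

```csharp
using UnityEngine;

public class ControlVolume : MonoBehaviour
{
    public Transform hand;           // the controller's transform
    public Rigidbody ship;
    public float speedPerMeter = 10f;
    public float turnSpeed = 2f;
    public bool engaged;             // toggled by the controller button callback

    void FixedUpdate()
    {
        if (!engaged) return;

        // Hand offset from the volume's center, in the volume's local space,
        // so moving the hand forward always means "ship forward".
        Vector3 offset = transform.InverseTransformPoint(hand.position);
        ship.velocity = ship.transform.TransformDirection(offset) * speedPerMeter;

        // Smoothly steer the ship's attitude toward the hand's rotation.
        ship.MoveRotation(Quaternion.Slerp(ship.rotation, hand.rotation,
                                           turnSpeed * Time.fixedDeltaTime));
    }
}
```

Because the offset is taken in the volume's local space, the control scheme keeps working even if the user repositions or rotates the volume itself.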
Safe Cracking - Although it looks like a simple interaction, this experiment is fairly complex, combining custom interactables, physics joints, and 3D math to achieve the final result. It also uses custom shaders to give visual feedback depending on which part of the safe is being interacted with.
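The core of the dial math can be sketched as projecting the grabbing hand's position onto the dial's rotation plane and tracking the signed angle frame to frame. This is a simplified illustration under the assumption that the dial rotates around its local forward axis; the hand reference and combination logic are hypothetical.

```csharp
using UnityEngine;

public class SafeDial : MonoBehaviour
{
    public Transform hand;   // the interactor currently grabbing the dial
    float lastAngle;         // initialize this in the grab callback

    float AngleToHand()
    {
        // Flatten the hand direction onto the plane the dial spins in.
        Vector3 toHand = Vector3.ProjectOnPlane(
            hand.position - transform.position, transform.forward);
        // Signed angle against a fixed world reference around the dial axis.
        return Vector3.SignedAngle(Vector3.up, toHand, transform.forward);
    }

    void Update()
    {
        float angle = AngleToHand();
        // DeltaAngle handles the wrap-around at +/-180 degrees.
        float delta = Mathf.DeltaAngle(lastAngle, angle);
        transform.Rotate(transform.forward, delta, Space.World);
        lastAngle = angle;
        // delta's sign distinguishes clockwise from counter-clockwise turns,
        // which is what a combination-lock check would consume.
    }
}
```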
Gravity Manipulation - This experience explores the idea of two types of gravity-manipulating interactables. One pulls objects towards the user, while the other pushes them away. Used together, they can hold objects in mid-air and control their position. A number of custom particle effects bring these devices to life and indicate active versus inactive states.
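Since the two devices differ only in force direction, one script with a flipped sign can cover both. The sketch below assumes a spherecast along the device's forward axis to pick a target; the field names and the spherecast approach are illustrative assumptions.

```csharp
using UnityEngine;

public class GravityDevice : MonoBehaviour
{
    public bool pull = true;    // false = the repulsor variant
    public float strength = 20f;
    public float range = 10f;
    public bool triggerHeld;    // driven by the controller's activate events

    void FixedUpdate()
    {
        if (!triggerHeld) return;

        // Thick ray so the user doesn't need pixel-perfect aim in VR.
        if (Physics.SphereCast(transform.position, 0.5f, transform.forward,
                               out RaycastHit hit, range))
        {
            Rigidbody rb = hit.rigidbody;
            if (rb == null) return;

            Vector3 dir = (transform.position - rb.position).normalized;
            if (!pull) dir = -dir;   // repulsor pushes away instead
            rb.AddForce(dir * strength, ForceMode.Acceleration);
        }
    }
}
```

Opposing a puller in one hand with a pusher in the other lets the two forces balance, which is what makes holding an object in mid-air possible.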
Drone Agent - This experiment explores a custom AI drone that navigates through a set of defined positions. Obstacles are placed in its way, and the AI is able to navigate around them to stay on course. If a user tries to throw something at the drone to knock it down, it is able to stabilize itself and steer back towards its path. All drone controls are handled with real-time physics.
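Physics-driven waypoint following of this sort is often a PD controller: a proportional force toward the current waypoint plus a damping force against velocity, which is also what lets the drone settle back after being hit. The sketch below is a minimal version under assumed gains (`kp`, `kd`); obstacle avoidance is omitted.

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class DroneAgent : MonoBehaviour
{
    public Transform[] waypoints;
    public float kp = 8f;            // proportional gain: pull toward target
    public float kd = 4f;            // derivative gain: damp overshoot
    public float arriveRadius = 0.5f;

    int index;
    Rigidbody rb;

    void Awake() => rb = GetComponent<Rigidbody>();

    void FixedUpdate()
    {
        Vector3 error = waypoints[index].position - rb.position;
        if (error.magnitude < arriveRadius)
            index = (index + 1) % waypoints.Length;  // advance along the route

        // A thrown object adds a sudden velocity; the damping term bleeds it
        // off while the proportional term steers the drone back on course.
        Vector3 force = error * kp - rb.velocity * kd;
        rb.AddForce(force, ForceMode.Acceleration);
    }
}
```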
The process of working through these VR interactions has deepened my understanding of C# and given me a huge confidence boost to pursue more advanced XR projects.
I'm currently finishing up a full-length VR Sneaking Game as part of an 8-week bootcamp, and I'm excited to apply this knowledge in other settings such as mixed and augmented reality as well.