DietitianVR

A VR simulation for the Meta Quest built for dietitians. There are four modules, each showcasing a patient with a different condition and a very different attitude. The goal is to prepare students for communicating with patients in the real world, including those who may be non-compliant or outright aggressive. The simulations also help reinforce proper procedures and treatment options for various conditions.

My Role

I had worked in Unity and C# for years before beginning this project, but had only dabbled in VR. This was my first real virtual reality project, and oh boy did I have my work cut out for me.

The project was initially going to focus on procedure, including a segment on how to properly place an NG (nasogastric) tube. I even built a prototype that included a somewhat crude NG tube placement segment. However, the team members at Rutgers and UConn eventually decided that those kinds of interactions were better handled by physical training dummies, and that VR tools demonstrating NG tube placement already existed. After much deliberation, we agreed that the focus should shift to the actual dialogue between dietitian and patient.

With the educational goal decided, it was time to get to work. Our artists at NMSU did a great job on the character models, textures, and animations. I handled everything on the Unity and programming side. I also lent my voice to the character “Angry Craig,” and made some of the hospital room objects in Blender. I used a dialogue engine called “Ink” to set up the dialogue interactions based on the script provided by the UConn team. The rest of the programming was done in C#. I wasn’t given many guidelines or specifications, and took some liberties with the features in order to make the simulation more immersive and interactive.
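To give a sense of how the Ink side fits together: Ink scripts compile to JSON, and a Unity script drives the story through Ink's C# runtime. Below is a minimal sketch of that pattern, assuming the ink-unity-integration package is installed; the class and field names here are illustrative, not the project's actual code.

```csharp
using System.Collections.Generic;
using Ink.Runtime;   // Ink's C# runtime, from the ink-unity-integration package
using UnityEngine;

// Hypothetical driver: advances an Ink story and exposes the current
// patient line plus the dietitian's response choices to the VR UI.
public class DialogueRunner : MonoBehaviour
{
    [SerializeField] private TextAsset inkJson; // the compiled .ink.json asset

    private Story story;

    private void Start()
    {
        story = new Story(inkJson.text);
    }

    // Returns the next line of patient dialogue, or null when the
    // story is waiting on a player choice (or has ended).
    public string NextLine()
    {
        return story.canContinue ? story.Continue().Trim() : null;
    }

    // The responses available to the player at the current story point.
    public List<string> CurrentChoices()
    {
        var texts = new List<string>();
        foreach (Choice choice in story.currentChoices)
            texts.Add(choice.text);
        return texts;
    }

    // Called when the player selects a response in VR.
    public void Choose(int index)
    {
        story.ChooseChoiceIndex(index);
    }
}
```

A gameplay loop on top of this simply drains `NextLine()` until it returns null, then presents `CurrentChoices()` and feeds the selection back through `Choose()`.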

What I Learned

Put simply: VR is hard. Working on a MacBook for most of the project complicated matters, as I was unable to preview the application on my headset in real time, a workflow Unity supports only on Windows. Furthermore, the dialogue scripts were long, and all of the lip-sync work and animations were time-consuming. We were also hindered by complications importing blend shapes into Unity, which forced the animators to re-work some of the assets. For our first VR project, we likely set the scope a little too large.

Despite the difficulties, the finished modules were effective and immersive. I gained valuable experience not just with Unity, C#, and virtual reality, but with managing scope and communicating with clients about technical specifications to ensure the final product satisfies everyone involved.

Credits: Game Design, Lead Programmer
