Virtual Reality Diver Interface
Note: Much of the code for this project comes from a class sample that is still in use by the professor and is not mine to share publicly. If you would like to see the code my team and I wrote for this project, please contact me at email@example.com.
What is it?
The Virtual Reality Diver Interface is a project I completed as part of Professor Lopes' Intro to Human-Computer Interaction class (detailed below). This project stands out among my coursework because it left plenty of room for creativity, which my team of three other students and I took full advantage of. Although this was a group project, I was the only member with prior Unity and VR experience, so I led the team and did much of the implementation work.
Our submission was a resounding success: we wowed audiences during the live demo and stood out among our peers for our unique approaches to the challenges of a VR interface. The project earned the class's Reach For The Stars award and received a perfect score.
The goal of this project was to design the user interface for an augmented-reality diving mask. We were given a simple VR diving-simulation Unity scene, which we had to extend to provide the following functionality:
View battery life
View oxygen level
Get notified when oxygen levels are low
Receive phone calls, including the ability to answer or reject calls
Dial crewmates on the phone
Dial specific frequencies on the phone
Call the emergency line
Locate crewmates in the world
Support all of this functionality even when swimming through mud, which obscures eyesight and makes hands invisible (though still functional)
Along with these tasks, we were also given the following limitations:
Audio cannot be used as an input mechanism
No aspect of the interface can be anchored to the world
After completing our designs, we gave a live demo to our classmates and to anyone else who was interested during our class's Diver Conference.
When approaching the design for this task, we made a point not to include any HUD elements.
Though HUD elements are great tools for interfaces on flat displays, in VR/AR, pinning vital information to the corners of the user's view makes it difficult to read: the user must glance at it with eye movement alone, which quickly becomes uncomfortable. Information that permanently occupies part of the view is also distracting and can block the user's surroundings.
Thus, we sought VR/AR-specific alternatives to the elements that would typically live in a HUD.
Our design revolves around two components:
Quick menus on the wrists that allow fast access to core information and functionality
A full menu, accessed with a "pull-down" gesture, that contains all information and functionality
The quick menus are attached to the wrists and appear only when a wrist is turned face-up, so they do not bother the diver while their hands are busy and the information is not needed.
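The actual implementation is Unity C# (not shareable, per the note above), but the wrist-up check can be expressed language-agnostically: show the menu when the angle between the wrist's palm-up vector and world-up falls below a threshold. A hypothetical Python sketch, with the threshold value chosen purely for illustration:

```python
import math

def wrist_menu_visible(wrist_up, world_up=(0.0, 1.0, 0.0), threshold_deg=40.0):
    """Return True when the wrist's palm-up vector points close enough to
    world-up for the quick menu to appear."""
    dot = sum(a * b for a, b in zip(wrist_up, world_up))
    norm = math.sqrt(sum(a * a for a in wrist_up))
    # Clamp before acos to guard against floating-point drift.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= threshold_deg

wrist_menu_visible((0.0, 1.0, 0.0))  # wrist facing straight up -> True
wrist_menu_visible((1.0, 0.0, 0.0))  # wrist facing sideways -> False
```

In Unity this would typically be a `Vector3.Angle` comparison in the menu's `Update` loop.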
The left-hand menu contains the status information the user needs to know: the battery level, the oxygen level, and the current depth. Each value is conveyed both numerically and through a dynamic graphic.
The right-hand menu handles communication with crewmates. It contains options to call crewmates, answer and reject calls, dial the emergency line, and enable/disable crewmate tracking.
Full Menu - Motion
The full menu is unique in how it is accessed and how it moves once it is enabled.
To access the full menu, the user performs the "pull-down" gesture: raise an arm in the air, make a fist, and pull the fist directly downwards. To dismiss the full menu, the user performs the "pull-up" gesture, which is the same motion in reverse.
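The project's gesture recognition was written in Unity C#, but the core idea of the pull gesture can be sketched as: track hand height while a fist is held, and fire when the fist has traveled far enough vertically. A simplified, hypothetical Python sketch (the minimum travel distance is an assumed value, and real tracking data would be noisier):

```python
def detect_pull(samples, min_travel=0.3):
    """Given time-ordered (is_fist, hand_height_m) samples, return 'down'
    if a fist dropped at least min_travel meters, 'up' if it rose that
    far, else None. Non-fist samples are ignored for simplicity."""
    heights = [height for is_fist, height in samples if is_fist]
    if len(heights) < 2:
        return None
    delta = heights[-1] - heights[0]
    if delta <= -min_travel:
        return "down"   # open the full menu
    if delta >= min_travel:
        return "up"     # dismiss the full menu
    return None
```

A production version would also require the fist to stay closed continuously and the motion to stay mostly vertical.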
Once the full menu is enabled, it moves dynamically to avoid the pitfalls of a HUD. The menu remains in one place during small motions, but if the user moves or looks too far from it, it jumps back in front of the user's vision. Keeping the menu still during small motions lets the user read it with natural head movement, avoiding the eye-only scanning that makes HUD menus disorienting. Letting it snap back after large motions ensures the menu is always near the player and locked to the player rather than the world.
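This "lazy-follow" behavior reduces to a cone test: leave the menu alone while it sits within some angle of the gaze direction, and re-anchor it in front of the head once it drifts outside. A hypothetical Python sketch of the logic (the angle and distance values are illustrative, not the project's actual tuning; `head_forward` is assumed to be a unit vector):

```python
import math

def update_menu_position(head_pos, head_forward, menu_pos,
                         max_angle_deg=30.0, distance=1.5):
    """Lazy-follow menu: keep menu_pos for small head motions; once the
    menu falls outside a cone around the gaze, snap it back in front."""
    to_menu = [m - h for m, h in zip(menu_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_menu)) or 1e-9
    cos_angle = sum(f * c for f, c in zip(head_forward, to_menu)) / norm
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    if angle > max_angle_deg:
        # Re-anchor the menu `distance` meters along the gaze direction.
        return [h + f * distance for h, f in zip(head_pos, head_forward)]
    return list(menu_pos)
```

A smoother variant would interpolate toward the new anchor each frame instead of snapping instantly.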
Full Menu - Features
The full menu supports all of the features of the quick menus and more! The user interacts with it by pointing at elements to make a cursor appear and using a pinch gesture (touching thumb and index finger together) to select.
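In Unity this pairing is typically a ray cast from the hand for the cursor plus a fingertip-distance check for the pinch. The pinch half is simple enough to sketch in hypothetical, language-agnostic Python (the ~2 cm threshold is an assumption, not the project's actual value):

```python
def is_pinching(thumb_tip, index_tip, threshold_m=0.02):
    """Treat the hand as pinching (i.e., selecting) when the thumb and
    index fingertips are within threshold_m of each other."""
    dist_sq = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip))
    # Compare squared distances to avoid an unnecessary square root.
    return dist_sq <= threshold_m ** 2
```

A robust version would add hysteresis (a slightly larger release threshold) so the selection does not flicker at the boundary.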
At the top of the full menu, there is a dock showing the current battery level, oxygen level, depth, and time underwater.
On the left side of the full menu, there is a list of contacts. Clicking a contact opens a contact-specific menu showing their frequency and world position, along with options to enable/disable tracking of that crewmate and to call them.
At the center of the full menu, there is a keypad the user can use to dial crewmates. The keypad includes numbers, a backspace key, and a clear key that deletes the entire input. Call and hang-up buttons at the bottom appear at the proper times, letting users accept/reject phone calls, call the frequency entered on the keypad, and hang up during a call.
To track crewmates, we implemented live 3D arrows that float a set distance from the user and point directly at the crewmate's location. As a crewmate moves, their arrow rotates around the user to keep pointing in the correct direction.
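The arrow placement described above amounts to normalizing the user-to-crewmate vector, then positioning the arrow a fixed radius along it and orienting it the same way. The project did this in Unity C#; a hypothetical Python sketch of the math (the radius is an assumed value):

```python
import math

def arrow_transform(user_pos, crewmate_pos, radius=0.8):
    """Place a tracking arrow `radius` meters from the user along the
    direction toward the crewmate. Returns (arrow_position, unit_dir);
    the arrow is oriented along unit_dir."""
    direction = [c - u for c, u in zip(crewmate_pos, user_pos)]
    norm = math.sqrt(sum(d * d for d in direction)) or 1e-9
    unit_dir = [d / norm for d in direction]
    arrow_pos = [u + d * radius for u, d in zip(user_pos, unit_dir)]
    return arrow_pos, unit_dir
```

Recomputing this each frame is what makes the arrows rotate around the user as crewmates move; in Unity the orientation step would be `Quaternion.LookRotation(unit_dir)`.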