Most VR experiences still struggle to find good ways to move beyond the footprint of your physical room. Many experiences that have tried virtual flight or walking induce simulation sickness because they inadvertently rotate the player’s camera along with the virtual room.
However, I learned that if you stabilize the player’s camera by preventing the virtual room from rotating, you might be able to eliminate simulation sickness entirely. This opens up the possibility of free virtual movement in all directions, while leaving rotation up to the player in the real world.
Have you ever watched a sci-fi or superhero movie and wondered what it would be like to fly? I mean really fly — in any direction and free from the confines of a cockpit or vehicle.
I recently launched a demo of HVR, a VR jetpack shooter designed to do just that. It was originally created as a week-long experiment to explore VR locomotion and has since evolved into a fast-paced game to test the limits of the flight system. I wanted to share some of the challenges with the locomotion mechanic that I had to overcome to make it happen.
What’s so difficult about flying in VR?
VR is incredible at creating massive experiences, but first-person movement in VR is typically limited to teleportation, movement on a track, or forward movement in a vehicle. This is meant to reduce what is called simulator sickness. In a nutshell, simulator sickness is like getting seasick or carsick, except that instead of physical motion making you sick, the cause is the visual illusions and motion that you see in VR.
As a general best practice to minimize simulator sickness, Oculus, Google, and Unity all recommend moving players at a constant speed, avoiding rapid acceleration, and trying to keep movement in the forward direction.
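Those guidelines can be sketched as a simple velocity update that ramps toward a constant cruise speed with a hard cap on acceleration. This is a minimal illustration, not code from HVR; the function name and numeric values are hypothetical.

```python
MAX_SPEED = 3.0   # m/s, constant cruise speed (hypothetical value)
MAX_ACCEL = 1.5   # m/s^2, cap on acceleration (hypothetical value)

def update_speed(current, target, dt):
    """Ramp the player's speed toward `target` without ever exceeding
    MAX_SPEED or changing faster than MAX_ACCEL, so velocity changes
    stay gradual rather than abrupt."""
    target = min(target, MAX_SPEED)
    delta = target - current
    max_step = MAX_ACCEL * dt        # largest change allowed this frame
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return current + delta
```

Called once per frame, this keeps both speed and acceleration inside the recommended envelope regardless of how hard the player pushes the input.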
Another interesting trick that helps reduce this is to add fixed reference points around the player, such as a cockpit or a vehicle. Unfortunately, linear forward motion can get repetitive, vehicles shield you from the environment, and joysticks put another interface between you and the virtual world.
With great immersion comes great responsibility
To fix VR flight, I first had to create a prototype to figure out what didn’t feel right and find ways to correct it without diminishing the experience. This is where having a background in UX design and research came in handy.
Isolating the variables
To narrow in on a solution that wouldn’t give anyone motion sickness, I isolated all of the possible variables related to VR movement and tested several permutations of them with over 20 people who had never tried VR before.
Here were the variables that I identified:
- Real-world player (rotation and movement of the head in the real world)
- Real room (rotation and movement of the real world environment)
- Virtual camera (rotation and movement of the “head camera” in the virtual world)
- Virtual room (rotation and movement of the virtual environment)
- Virtual physics (how the virtual world responds to actions such as thrust and gravity)
This experiment was targeted towards people playing in their own home, so physically moving the room wasn’t an option.
I couldn’t control the real-world movement and rotation of the player, but it was important to pay attention to them because their actions still had an effect on things like gravity and body orientation.
I could control the virtual camera, but wanted to allow the player to look freely, so I opted not to change that.
So the remaining two variables I chose to play with were the virtual physics and the virtual room.
Stabilizing the Virtual Physics
My first prototype of Iron Man-style flight wasn’t spectacular. Applying physics forces to the hands to simulate rockets was cool in theory, but it was unbalanced and sent me into mind-numbing spins.
I stabilized this by faking the perception of control. The controllers would still emit sound and fire, but instead of applying forces to the hands and creating an unbalanced thrust, I applied them to the room’s center of gravity. Boom — no more spinning and it still feels like you’re controlling the action.
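The core of the trick can be sketched in a few lines: sum both controllers’ thrust vectors and apply the total at the rig’s center of mass, where a force produces translation but no torque. This is an illustrative Python sketch with hypothetical names (`PlayerRig`, `apply_thrust`), not HVR’s actual code; in Unity terms it is roughly the difference between `Rigidbody.AddForce` on the rig and `AddForceAtPosition` at each hand.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __add__(self, o):
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def scale(self, s):
        return Vec3(self.x * s, self.y * s, self.z * s)

@dataclass
class PlayerRig:
    velocity: Vec3
    mass: float

    def apply_thrust(self, left: Vec3, right: Vec3, dt: float):
        # Sum both controllers' thrust and apply it at the rig's center
        # of mass. A force through the center of mass produces no torque,
        # so the rig translates but never spins, no matter how unevenly
        # the player aims the two controllers.
        total = left + right
        accel = total.scale(1.0 / self.mass)
        self.velocity = self.velocity + accel.scale(dt)
```

The controllers still look and sound like they are doing the work, so the perception of direct control survives even though the physics is applied elsewhere.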
Dealing with motion sickness
After getting basic stabilized flight working, I started putting people who were unfamiliar with VR into the prototype and observed something interesting. While they were able to fly around stably, once they hit a wall and the physics rotated the virtual room even slightly, they immediately felt disoriented.
Through lots of informal A/B testing, I discovered that preventing ANY rotation on the virtual room eliminated motion sickness. I’ve even tested this with people prone to motion sickness and aside from the “jelly legs” or “butterflies” during take-off or landing, they never got sick.
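The fix amounts to one extra constraint in the physics step: whatever forces or collisions act on the rig, any angular velocity they impart gets zeroed out before it can rotate the virtual room. This is a hedged sketch with hypothetical names, not the game’s actual code; the Unity equivalents I’d expect are `RigidbodyConstraints.FreezeRotation` or clearing `Rigidbody.angularVelocity`.

```python
class Rig:
    def __init__(self, mass=2.0):
        self.velocity = [0.0, 0.0, 0.0]
        self.angular_velocity = [0.0, 0.0, 0.0]
        self.mass = mass

def physics_step(rig, force, dt):
    # Integrate linear motion normally -- translation at any speed or
    # direction is fine.
    for i in range(3):
        rig.velocity[i] += force[i] / rig.mass * dt
    # A wall hit may have imparted spin this frame; cancel it so the
    # virtual room never rotates. Turning is left entirely to the
    # player's real body.
    rig.angular_velocity = [0.0, 0.0, 0.0]
```

Because rotation only ever comes from the player physically turning, the visuals never contradict the vestibular system.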
I broke a bunch of the rules—this shouldn’t work, but why does it?
Maintaining a link between the real world and virtual one
My theory is that while VR allows us to augment senses such as sight, sound, and even touch, it can’t augment the sensation of gravity and firm footing. These literally anchor us to the real world. Instead of fighting that link, this method aims to create a healthy fusion between the two in order to maintain a sense of stability and presence.
Most VR experiences that induce sickness rotate your virtual world to follow the action and do not take into account that because the virtual camera is linked to the virtual room, any rotation to the room translates back to the player.
This ends up upsetting your sense of balance — or your vestibular system. At this point, you have a difficult choice: move your head to match the visuals or move your head to match the sensation of gravity.
This is why it’s best to fixate on a distant object when you’re motion sick during a bumpy ride, and why dancers spot their turns. Your head and eyes will automatically adjust your neck and body to accommodate the movement, which smooths out the motion and allows for better balance. Unfortunately, you can’t do that if the environment is constantly rotating around you. By leaving in-game rotation untouched, I was able to move the player at any speed and in any direction and let the human body take care of the rest.
Where this could be applied
While I can only speak to how well it worked for HVR, I think it also could solve simulation sickness with other “open-air” experiences such as VR video, roller coasters, and games. Flight simulators and other vehicle experiences seem to have already dealt with this by putting players inside of an immovable cage to provide the illusion that their environment is locked in place.
Limitations of this method
Unfortunately, you can’t move players upside down or bank at extreme 90° angles, because they can’t easily put themselves in that position. Players can physically turn around to correct for yaw, but roll and pitch are limited to their ability to tilt their head.
For situations that require extreme angles, the “cockpit” method may be a better approach.
This is just the beginning
VR locomotion is a key factor in creating large-scale immersive experiences that don’t alienate newcomers to VR. This is only one perspective on VR locomotion that requires much more testing and doesn’t consider other possible factors such as sense of control, acceleration, and level design. Despite that, I hope my findings are able to empower developers and designers to create better experiences.
Room for improvement
The true test of this experiment was how it would fare in the wild. People have been so excited to play HVR that some of them have posted their own unsolicited experiences to YouTube. This turns out to be an excellent informal research tool to see what can be improved! I've compiled a public playlist of all of the videos I've found on HVR so far: Watch HVR reactions