The locomotion problem in Virtual Reality
(Seriously, I hesitated for some time between this version and the original, but that’s not the point of this article, and I kinda like the 80’s vibe anyway…)
I think we can all agree here: Virtual Reality (VR) is here now, and not science fiction anymore. “Accessible” (though not cheap by any stretch of the imagination) hardware is available for consumers to buy and enjoy. You can now experience being immersed in virtual worlds generated in real time by a gaming computer, and feel presence in them.
The subject I’m about to address doesn’t really apply to mobile (smartphone-powered) VR, since these experiences tend to be static ones. Mobile VR will need reliable positional tracking of the user’s head before hitting this issue… We will limit the discussion to actual computer-based VR.
One problem still bothers me, and the whole VR community as well: in order to explore a virtual world, you have to, well, walk inside the virtual world. And doing this comfortably for the user is, interestingly, more complex than you might think.
You will always have a limited space for your VR play room. You can’t physically walk from one town to another in Skyrim inside your living room; the open world of that game is a bit bigger than a few square meters.
The case of cockpit games like Elite:Dangerous aside, simulating locomotion is tricky: any situation where you’re moving can induce nausea.
Cockpit-based games ground you in the fact that you’re seated somewhere and “not moving”, because most of the objects around you don’t move (the inside of the spaceship/car/plane). This makes it mostly a non-problem: you can do barrel rolls and loops all day long and keep your meal inside your stomach. And you have less chance of killing yourself than inside an actual fighter jet 😉
Simulator (VR) sickness is induced by a disparity between the cues of acceleration you get from your visual system and what your vestibular system senses. The vestibular system is your equilibrium center; it’s a bit like a natural accelerometer located inside your inner ears.
The Annwvyn Game Engine, and how I started doing VR
If you know me, you probably also know that I’m developing a small C++ game engine called Annwvyn, aimed at simplifying the creation of VR games and experiences for “consumer grade” VR systems (mainly the Oculus Rift, and more recently the Vive too).
The funny question is: with the existence of tools like Unreal Engine 4 or Unity 5, which are free (or almost free) to use, why bother?
There are multiple reasons, but to understand why, I should add some context. This story started in 2013, at a time when you actually had to pay to use Unity with the first Oculus Rift Development Kit (aka DK1), and when UDK (the version of Unreal Engine 3 you were able to use) was such a mess I wouldn’t want to touch it…
Using Ogre3D’s OpenGL renderer with the Oculus Rift SDK
Hello there!
Getting a scene rendered by Ogre to the Oculus Rift is a bit of an involved process. With basic knowledge of Ogre, and some trial and error while browsing the Ogre wiki, documentation and source code itself, I got the thing running again each time Oculus changed the way the SDK worked.
Since we are now at version 0.8 of the SDK, and 1.0 will probably not bring much change on this front, I think I can write some sort of guide, browsing through my Ogre-powered VR game engine and telling you the story of how it works, step by step.
I’ll paste some code here with explanations. It’s not structured into classes, because I don’t know how you want to organize your own project. I also don’t use the Ogre application framework, because I want to choose the order in which things happen myself.
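To give you an idea of where this guide starts, here’s a minimal sketch of the very first step: initializing libOVR and querying the headset, before any Ogre object exists. This is written against the 0.8 C API (the ovr_* calls from OVR_CAPI.h), error handling is kept to the bare minimum, and it’s a standalone illustration rather than code lifted from Annwvyn:

#include <iostream>
#include <OVR_CAPI.h> // libOVR 0.8 C API

int main()
{
    // Start up LibOVR. Passing nullptr uses the default initialization parameters.
    if (OVR_FAILURE(ovr_Initialize(nullptr)))
    {
        std::cerr << "Failed to initialize LibOVR\n";
        return -1;
    }

    // Get a handle to the HMD, plus the LUID of the GPU it is attached to
    // (which matters on multi-GPU systems).
    ovrHmd hmd;
    ovrGraphicsLuid luid;
    if (OVR_FAILURE(ovr_Create(&hmd, &luid)))
    {
        std::cerr << "No Rift detected\n";
        ovr_Shutdown();
        return -1;
    }

    // The HMD description gives us the default FOV of each eye, which in turn
    // tells us how big the render textures Ogre will draw into need to be.
    const ovrHmdDesc hmdDesc = ovr_GetHmdDesc(hmd);
    const ovrSizei leftEyeSize =
        ovr_GetFovTextureSize(hmd, ovrEye_Left, hmdDesc.DefaultEyeFov[ovrEye_Left], 1.0f);

    std::cout << "Headset: " << hmdDesc.ProductName << '\n';
    std::cout << "Left eye buffer: " << leftEyeSize.w << 'x' << leftEyeSize.h << '\n';

    // ...Ogre initialization, texture set creation and the render loop go here...

    ovr_Destroy(hmd);
    ovr_Shutdown();
    return 0;
}

The sizes returned by ovr_GetFovTextureSize are what we’ll use to create the textures Ogre renders each eye into, and whose OpenGL texture handles ultimately get handed to the Oculus compositor through ovr_SubmitFrame. That’s the part we’ll build up in the following steps.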