Lately, I’ve been dabbling in some “closer to the metal” kind of programming.
On most compilers (Visual Studio’s, for instance) it used to be rather easy to mix assembly code and C++ code using a feature called inline assembly: the ASM code is put in a block decorated with a special keyword (like _asm, for instance), and when the compiler sees that, it emits the content of the block “as is” into the compiled code.
My first real introduction to programming was with the C programming language, on a French website that was known at the time as “Le Site Du Zéro” (think “The Newbie’s Website”).
At the time, I was in middle school, and I was pretty bored by it. It’s around then that I first got real access to the internet at home, and started spending a lot of time on it.
If there’s one open source library that I really like, and that I think is genuinely useful both to me and to a whole community, it’s Ogre.
Before getting into the story of why I felt loading glTF files into Ogre was a necessary thing to do, and why I decided to actually write a loader myself, I need to tell you a bit about Ogre:
Good old Ogre3D. I prefer to write it “Ogre”, but really, its name is OGRE.
Sometimes I wonder why some things are inside the C and C++ standard libraries, and some aren’t.
As far as I can be bothered to read the actual “standards” documents (which are mostly written in legalese, not in English understandable to you and me), these languages are defined against an “abstract machine”, and real-world implementations of them, on computers that actually exist, should follow the behavior described for that machine, modulo some implementation details.
There’s one single thing that I find truly frustrating when dealing with multiple 3D-related programs: making them exchange 3D assets.
You don’t have much guarantee that what comes out of one program will look the same in another (e.g. a game engine). You may work in meters, and find out that Unreal works in centimeters. They could use different conventions for texturing, material definitions may just “not work”…
So, I recently had the chance to try out an HTC-Vive on a Linux machine. Naturally, I installed Arch on it 😉
The installation is pretty straightforward, but there are some little catches if you want to do OpenGL development on Linux with OpenVR. (OpenVR is the API you use to talk to the SteamVR runtime.)
SteamVR has a Linux beta since February 2017. They also announced that the SteamVR runtime itself is implemented with Vulkan only.
I don’t post regularly on this blog, but I really should post more… ^^”
If you have ever read me here before, you probably know that one of my pet projects is a game engine called Annwvyn.
Where I started from
Annwvyn was just “a few classes acting as glue code around a few free software libraries”. I really thought that, in two months, I had a piece of software worthy of bearing the name “game engine”. Obviously, I was just a foolish little nerd playing around with an Oculus DK1 in his room, but still, I did actually manage to render something in real time on the Rift, with some physics and sound! That was cool!
Everything started as just a test project; then I decided to remove the int main(void) function I had and stash everything else inside a DLL. That was quickly done (after banging my head against the MSDN website and Visual Studio 2010’s project settings, and writing a macro to insert __declspec(dllexport) or __declspec(dllimport) everywhere).
The need for testability and the difficulties of retrofitting tests
So let’s be clear: I know about good development practices, about automated testing, about TDD, about software architecture, about UML class diagrams and all that jazz. Heck, I’m a student in those things. But this little hobby project wasn’t intended to grow into 17,000 lines of C++, with a lot of modules, bindings to a scripting language, an event dispatch system, and a lot of interconnected components that abstract writing data to the file system (well, it’s for video game save files), render to multiple different kinds of VR hardware, and extend Ogre’s Resource Manager. Hell, I didn’t even know Ogre had such a complex resource management system. I thought Ogre was a C++ thing that drew polygons on the screen without me having to learn OpenGL. (I still had to learn quite a lot about OpenGL because I needed to hack into its guts, but I’ve blogged about that already.)
Let’s just say that things were really getting out of hand, and that I seriously needed to start making the code saner, and to be able to detect when I break stuff.
So, the other day I was working on some Ogre + Qt5 code.
I haven’t really worked with Qt much since Qt4 was the hot new thing, so I was a bit rusty, but I definitely like the new things I’ve seen in version 5. But I’m not here to discuss Qt 5 today. ^^
There are a few weird things Qt does that I can’t really wrap my head around.
(Seriously, I hesitated for some time between this version and the original, but that’s not the point of this article, and I kinda like the 80s vibe anyway…)
I think we can all agree: Virtual Reality (VR) is now, and not science fiction anymore. “Accessible” (not cheap by any stretch of the imagination) hardware is available for consumers to buy and enjoy. You can now experience being immersed in virtual worlds generated in real time by a gaming computer, and feel presence in them.
The subject that I’m about to address doesn’t really apply to mobile (smartphone-powered) VR, since those experiences tend to be static ones. Mobile VR will need reliable positional tracking of the user’s head before hitting this issue… We will limit the discussion to actual computer-based VR.
One problem still bothers me, and the whole VR community as well: in order to explore a virtual world, you have to, well, walk inside the virtual world. And doing this comfortably for the user is, interestingly, more complex than you might think.
You will always have a limited space for your VR play room. You can’t physically walk from one town to another in Skyrim inside your living room; the open world of that game is a bit bigger than a few square meters.
The case of cockpit games like Elite:Dangerous aside, simulating locomotion is tricky. Any situation where you’re moving can induce nausea.
Cockpit-based games ground you in the fact that you’re seated somewhere and “not moving”, because most of the objects around you (the inside of the spaceship/car/plane) don’t move. This makes it mostly a non-issue: you can do barrel rolls and loops all day long and keep your meal inside your stomach. And you have less of a chance of killing yourself than in an actual fighter jet 😉
Simulator (VR) sickness is induced by a disparity between the visual cues of acceleration you get from your visual system and what your vestibular system senses. The vestibular system is your equilibrium center; it’s a bit like a natural accelerometer located inside your inner ears.
If you know me, you also probably know that I’m developing a small C++ game engine, aimed at simplifying the creation of VR games and experiences for “consumer grade” VR systems (mainly the Oculus Rift, more recently the Vive too), called Annwvyn.
The funny question is: with the existence of tools like Unreal Engine 4 or Unity 5, which are free (or almost free) to use, why bother?
There are multiple reasons, but to understand why, I should add some context. This story started in 2013, at a time when you had to actually pay to use Unity with the first Oculus Rift Development Kit (aka DK1), and when UDK (the version of Unreal Engine 3 you were able to use) was such a mess I wouldn’t want to touch it…
Hello there! Getting a scene rendered by Ogre onto the Oculus Rift is a somewhat involved process. With a basic knowledge of Ogre, and with trial and error while browsing the Ogre wiki, documentation, and source code itself, I got the thing running again each time Oculus changed the way it worked. Since we are now at version 0.8 of the SDK, and 1.0 will probably not bring much change on this front, I think I can write some sort of guide: I’ll browse through my Ogre-powered VR game engine and tell you the story of how it works, step by step.
I’ll paste some code here, with explanations. It’s not structured into classes because I don’t know how you want to organize yours. I don’t use the Ogre application framework because I want to choose for myself the order in which things happen.