Why glTF 2.0 is awesome!

There’s one thing I find truly frustrating when dealing with multiple pieces of 3D-related software: making them exchange 3D assets.

You have no real guarantee that what comes out of one program will look the same in another (e.g. a game engine). You may work with meters, and find out that Unreal works in centimeters. Programs may use different conventions for texturing, material definitions may simply “not work”…

There are scale issues, animation issues, texture binding issues, and material problems in general.

All of this is generally dealt with through a huge, horrible mess of import/export scripts. And the more of these you need to run in your toolchain, the worse it gets.

Still, there are a number of file formats that are more or less standardized in this industry, but none of them really fits all the use cases of real-time rendering/video games (and, by extension, VR).

Most of them are way better suited as “authoring” formats: files for 3D modeling programs, like Autodesk’s FBX or COLLADA from the Khronos Group. They are complex, generally too flexible in how they can be implemented and how they can represent a given thing, and they just contain “too much data” for what you would want to use from a programming standpoint.

FBX seems to be a de facto standard, especially in the video game industry, because of the predominance of Autodesk software as modeling and animation tools. When people aren’t using 3ds Max, they are using Maya… 😉

Generally, what you want to do is transform these assets into serialized binary files that can be loaded quickly by a game engine. This is usually done on a per-engine basis. From what I can tell, the general workflow of “putting things inside Unity” involves dropping (for example) .FBX files into the project directory; when Unity builds your game, it converts every resource into formats optimized for the target platform.

Still, this approach of reinventing the wheel, with one “transmission format” for each and every target application of 3D assets, doesn’t seem to be the right thing.

In the 2D world, this problem was fixed when the whole industry basically standardized on the JPEG file format for 2D pictures. Every single camera out there (special cases and advanced users aside) outputs JPEG-encoded, compressed image files.

Obviously, a JPEG file doesn’t have all the details that a Photoshop “project” (a PSD file) would contain, but JPEG is not intended for authoring images, it’s intended for sharing them.

What we need is a “JPEG for 3D”, and that’s exactly what the Khronos Group, through an open community effort, developed and released as the “GL Transmission Format” (glTF); version 2.0 of the specification was released in June 2017.

This format can describe 3D objects (and even whole scenes, with lights, cameras, and animations) in a standard way. Unlike COLLADA, for example, there is no room for interpretation: there are not two ways of describing the same thing in glTF.

glTF represents model data as simple binary buffers. “Accessors” permit interpreting that data contextually. This is particularly well adapted to loading these files in OpenGL, where you can load each buffer into the GPU with glBufferData, then walk each accessor and call glVertexAttribPointer to bind it to the location of each vertex element you have in a buffer.
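To make that concrete, here is a minimal sketch of that mapping. Everything passed as a parameter (buffer contents, stride, offsets, the attribute location) is assumed to have been read from the glTF JSON and from your shader program; this is an illustration, not a full loader:

```cpp
#include <GL/glew.h> // or your OpenGL loader of choice
#include <cstdint>

// Sketch: turn one glTF buffer + one POSITION accessor into OpenGL state.
void uploadAndBindPosition(const void* bufferData, GLsizeiptr bufferByteLength,
                           GLuint positionLocation, GLsizei byteStride,
                           std::uintptr_t byteOffset)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    // The raw glTF binary buffer is uploaded as-is:
    glBufferData(GL_ARRAY_BUFFER, bufferByteLength, bufferData, GL_STATIC_DRAW);

    // An accessor of type "VEC3" with componentType 5126 (GL_FLOAT) maps
    // almost one-to-one onto the glVertexAttribPointer parameters:
    glEnableVertexAttribArray(positionLocation);
    glVertexAttribPointer(positionLocation, 3, GL_FLOAT, GL_FALSE,
                          byteStride, reinterpret_cast<const void*>(byteOffset));
}
```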

glTF is based on a simple scene-graph description, written in JSON, plus a number of raw resources: the binary buffers and the texture data as images.
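To give an idea of the shape of that JSON, here is a minimal, hand-written glTF 2.0 asset describing a single triangle (the triangle.bin file name is made up; 36 bytes = 3 vertices × 3 floats × 4 bytes):

```json
{
  "asset": { "version": "2.0" },
  "scene": 0,
  "scenes": [ { "nodes": [ 0 ] } ],
  "nodes": [ { "mesh": 0 } ],
  "meshes": [ { "primitives": [ { "attributes": { "POSITION": 0 } } ] } ],
  "accessors": [
    {
      "bufferView": 0,
      "componentType": 5126,
      "count": 3,
      "type": "VEC3",
      "min": [ 0.0, 0.0, 0.0 ],
      "max": [ 1.0, 1.0, 0.0 ]
    }
  ],
  "bufferViews": [ { "buffer": 0, "byteOffset": 0, "byteLength": 36 } ],
  "buffers": [ { "uri": "triangle.bin", "byteLength": 36 } ]
}
```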

The standard is generally easy to understand in its plain-text form, but some of the notions used are easier to grasp with a few drawings. Thankfully, there’s this awesome “What the Duck is glTF?” poster on the official repository:

glTF overview poster

This poster describes most of what you need to understand about the glTF file format.

There is already a ton of software, tools and libraries that support glTF, and I hope there will be even more in the future.

This format is developed by the Khronos Group, alongside an official exporter implementation for Blender, and it already has a lot of support from the industry.

The glTF NASCAR jacket has some really important sponsors!

I don’t think I really need to elaborate further on why every effort towards broader glTF support will benefit the whole industry/community: supporting glTF today should be your go-to option.

Juan Linietsky, the main Godot developer, actually wrote a lengthy article comparing multiple formats, concluding that we should all support glTF 2.0, and I cannot agree more!

In my next post, I will relate what it took to write a glTF import plugin in C++ for the popular open-source Ogre rendering engine. It is currently a work in progress, but it is starting to become usable, and it can be found here.

Install and run SteamVR on Arch Linux (for using an HTC Vive) and do OpenGL/OpenVR development

So, I recently had the chance to try out an HTC Vive on a Linux machine. Naturally, I installed Arch on it 😉

The installation is pretty straightforward, but there are some little catches if you want to do OpenGL development on Linux with OpenVR (OpenVR is the API you use to talk to the SteamVR runtime).

SteamVR has had a Linux beta since February 2017. Valve also announced that the SteamVR runtime itself is implemented with Vulkan only.

First things first: I don’t know whether it’s currently possible to update the firmware of the base stations on Linux. The setup I’m using also runs on Windows, and the firmware was already updated on that platform. That’s one thing I can’t tell you about.

So, to get started, you need Steam running on your machine. Arch Linux is now a 64-bit-only distribution, and Steam is a 32-bit-only program. So, if it’s not already the case, you need to activate the [multilib] repository for pacman (just uncomment the lines for it in /etc/pacman.conf).

Of course, to do VR, you need a good GPU. Here I’m running an Nvidia GTX 1070 with the proprietary drivers. Apparently you can use an AMD card with the latest version of the Mesa drivers, but I don’t have access to any modern AMD graphics card, so I can’t tell.

Then, you will need to install the following packages (a one-liner for all of this follows the list):

  • steam
  • lsb-release
  • your graphics driver packages in 32-bit (lib32-nvidia-utils, lib32-libvdpau)
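On Arch, that boils down to something like this (package names as of the time of writing; adapt the lib32 driver packages to your GPU):

```sh
# Enable [multilib] first by uncommenting these two lines in /etc/pacman.conf:
#   [multilib]
#   Include = /etc/pacman.d/mirrorlist
sudo pacman -Syu steam lsb-release lib32-nvidia-utils lib32-libvdpau
```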

Once you have installed Steam, launch it, log in or create an account, and install the SteamVR package; then opt into the SteamVR beta from its properties.

Once you have SteamVR on your machine, and before you plug the Vive into the computer, you will need to install some udev rules to permit the app to access the device directly.

Create a file /lib/udev/rules.d/60-HTC-Vive-perms.rules and write the following content in it:
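The rules below are the gist of what Valve’s SteamVR-for-Linux document (linked at the end of this article) specifies; double-check the vendor/product IDs against the current upstream version:

```
# HTC Vive HID sensor naming and permissioning
KERNEL=="hidraw*", ATTRS{idVendor}=="0bb4", ATTRS{idProduct}=="2c87", TAG+="uaccess"
# Valve HID devices over USB hidraw
KERNEL=="hidraw*", ATTRS{idVendor}=="28de", MODE="0666"
# HTC camera USB node
SUBSYSTEM=="usb", ATTRS{idVendor}=="114d", ATTRS{idProduct}=="8328", TAG+="uaccess"
# HTC mass storage node
SUBSYSTEM=="usb", ATTRS{idVendor}=="114d", ATTRS{idProduct}=="8200", TAG+="uaccess"
```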

You can now open SteamVR, and you should be prompted to run the room setup program to configure your “play space”.

Then, if you want to use OpenVR to render to the Vive with OpenGL, you will need a Vulkan runtime and development packages. For this, install the vulkan-devel package group.

You may also need to launch your programs under the Steam runtime; this is done using this script:

~/.steam/steam/ubuntu12_32/steam-runtime/run.sh ./my_steamvr_app

And that should be about it. Programs that link against the OpenVR API will “just work” now.
The info in this article comes from my own experience, and from this document from Valve: https://github.com/ValveSoftware/SteamVR-for-Linux

Annwvyn, my game engine, is now “officially” compatible with Linux through OpenVR. 😉

I would be really happy if it were possible to use the Oculus Rift on Linux, yet there isn’t any good solution right now. Oculus froze its Linux development effort years ago, so we aren’t going to see an official SDK any time soon. The OpenHMD project has made “some” progress, but nothing really usable for now.

“Scenario Testing” a game engine by misusing a unit test framework

I don’t post regularly on this blog, but I really should post more… ^^”

If you have ever read me here before, you probably know that one of my pet projects is a game engine called Annwvyn.

Where I’m coming from

Annwvyn started as just “a few classes to act as glue code around a few free software libraries”. I really thought that in 2 months I had a piece of software worthy of bearing the name “game engine”. Obviously, I was just a foolish little nerd playing around with an Oculus DK1 in his room, but still, I did actually manage to get something rendering in real time on the Rift, with some physics and sound! That was cool!

Everything started as just a test project; then I decided to remove the int main(void) function I had and stash everything else inside a DLL file. That was quickly done (after banging my head against the MSDN website and Visual Studio 2010’s project settings, and writing a macro to insert __declspec(dllexport) or __declspec(dllimport) everywhere).
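For the curious, that macro is the classic export/import switch that every Windows DLL does; the names in this sketch are illustrative, not the ones actually used in Annwvyn:

```cpp
// Classic Windows DLL export/import switch (illustrative names).
#ifdef BUILDING_MY_ENGINE_DLL           // defined only when building the DLL itself
#define MY_ENGINE_API __declspec(dllexport)
#else                                   // consumers of the DLL see this branch
#define MY_ENGINE_API __declspec(dllimport)
#endif

// Every public class gets stamped with the macro:
class MY_ENGINE_API GameObject
{
public:
    void update();
};
```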

The need for testability and the difficulties of retrofitting tests

So let’s be clear: I know about good development practices, automated testing, TDD, software architecture, UML class diagrams and all that jazz. Heck, I’m a student in those things. But this little hobby project wasn’t intended to grow into 17000 lines of C++ with a lot of modules, bindings to a scripting language, an event dispatch system, and a lot of interconnected components that abstract writing data to the file system (well, it’s for video game save files) or rendering to multiple different kinds of VR hardware, going as far as extending Ogre’s resource manager. Hell, I did not even know that Ogre had such a complex resource management system. I thought that Ogre was a C++ thing that drew polygons on the screen without me having to learn OpenGL. (I still had to learn quite a lot about OpenGL because I needed to hack into its guts, but I’ve blogged about that already.)

Let’s just say that things were really getting out of hand, and that I seriously needed to start thinking about making the code saner, and about being able to detect when I break stuff.


Shoehorning anything (with `operator<<()`) into `qDebug()` the quick and dirty templated way

So, the other day I was working on some Ogre + Qt5 code.

I haven’t really worked with Qt much since Qt 4 was the hot new thing, so I was a bit rusty, but I definitely like the new things I’ve seen in version 5. But I’m not here to discuss Qt 5 today. ^^

There are a few weird things Qt does that I can’t really wrap my head around. One is the incompatibility between QString and std::string (there’s probably a nasty problem called “Unicode” behind this), but another is that QDebug is not an std::ostream-derived object.

If you don’t know, in the Qt world a QApplication is expected to write its debugging output into a QDebug object, with an output stream operator (“<<”). A QDebug object is easily accessible by calling qDebug(), which makes this kind of code fairly common:
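Something like this (the variable is made up, just to have something to print):

```cpp
#include <QDebug>

int main()
{
    int frameCount = 42; // made-up value for the example
    qDebug() << "Rendered" << frameCount << "frames";
    return 0;
}
```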

This is a pretty standard thing to do in C++; for instance, the standard library itself makes heavy use of the stream operators for I/O (hence the main header being called iostream), and on a personal note: they are the cleanest way to represent, in code, how to push stuff in and out of a program, IMO.

Qt chose not to use the standard output stream objects as the base for their streams, but to rebuild them from scratch. That’s fine, except when you are trying to interact with something non-Qt.

Every object in Ogre that contains relevant data (the vectors, matrices, colors and other such things) that can be logged has an operator<<() defined to write to standard streams, but obviously, that will not work with Qt.

If you are lazy like me, and consider that it’s code “for development” that you intend to remove/switch out in production, here’s a snippet you can paste into a header somewhere to redirect these streams to QDebug’s output:
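It boils down to a catch-all template along these lines:

```cpp
#pragma once

#include <QDebug>
#include <sstream>

// Catch-all bridge: any type that already has an operator<< for std::ostream
// (like Ogre's vectors and matrices) becomes streamable into a QDebug.
// For types QDebug already handles, its own non-template overloads still win.
template <typename T>
QDebug operator<<(QDebug debug, const T& object)
{
    std::stringstream stream;      // instantiated at every call, see below
    stream << object;              // use the type's std::ostream operator<<
    debug << stream.str().c_str(); // QDebug has an overload for C-style strings
    return debug;
}
```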

This is absolutely not ideal: for example, that stringstream object will be instantiated at each call of this templated function. If you attempt to, for example, stream an Ogre::Vector3 into a QDebug, the compiler will stamp out an operator<< that writes the text into a stream, extracts the string, and calls the operator<< of QDebug that takes C-style char* strings.

Also, I have no idea why QDebug objects are passed by value in this kind of function. I did not take the time to dig much under the hood, but it seems to be the way Qt deals with this.

It works well enough for me to check the contents of some Ogre::Vector3 objects in the debug panel of Qt Creator, or in the terminal output on Linux. ^^”

The locomotion problem in Virtual Reality

(Seriously, I hesitated for some time between this version and the original, but that’s not the point of this article, and I kinda like the 80’s vibe anyway…)

I think we can all agree here: Virtual Reality (VR) is now, and not science fiction anymore. “Accessible” (though not cheap by any stretch of the imagination) hardware is available for consumers to buy and enjoy. You can now experience being immersed in virtual worlds generated in real time by a gaming computer, and feel presence in them.

The subject I’m about to address doesn’t really apply to mobile (smartphone-powered) VR, since those experiences tend to be static ones. Mobile VR will need reliable positional tracking of the user’s head before hitting this issue… We will limit the discussion to actual computer-based VR.

One problem still bothers me, and the whole VR community as well: in order to explore a virtual world, you have to, well, walk inside the virtual world. And doing this comfortably for the user is, interestingly, more complex than you might think.

You will always have a limited space for your VR play room. You can’t physically walk from one town to another in Skyrim inside your living room; the open world of that game is a bit bigger than a few square meters.

The case of cockpit games like Elite:Dangerous aside, simulating locomotion is tricky. Any situation where you’re moving can induce nausea.

Cockpit-based games ground you in the fact that you’re seated somewhere and “not moving”, because most of the objects around you don’t move (the inside of the spaceship/car/plane). This makes it mostly a non-problem: you can do barrel rolls and loops all day long and keep your meal inside your stomach. And you have less chance of killing yourself than inside an actual fighter jet 😉

Simulator (VR) sickness is induced by a disparity between the visual cues of acceleration you get from your visual system and what your vestibular system senses. The vestibular system is your equilibrium center; it’s a bit like a natural accelerometer located inside your inner ears.


The Annwvyn Game Engine, and how I started doing VR

If you know me, you also probably know that I’m developing a small C++ game engine called Annwvyn, aimed at simplifying the creation of VR games and experiences for “consumer-grade” VR systems (mainly the Oculus Rift, and more recently the Vive too).

The funny question is: with the existence of tools like Unreal Engine 4 or Unity 5, which are free (or almost free) to use, why bother?

There are multiple reasons, but to understand why, I should add some context. This story started in 2013, at a time when you actually had to pay to use Unity with the first Oculus Rift Development Kit (aka the DK1), and when UDK (the version of Unreal Engine 3 you were able to use) was such a mess I wouldn’t want to touch it…


My Linux handheld game console. Part 1

A custom-made, Linux-powered handheld. This sounds interesting, don’t you think?

I’ve had this project in mind for quite some time: I have some hardware collecting dust, and I want to make use of it. I just recently had the idea to blog about it, and to release everything on GitHub afterwards, once it is actually something useful.
Also, this thing will need a name… But that’s not important right now ^^”

I’ve been a fan of the Ben Heck Show for quite some time. In multiple episodes of the show, Ben makes portable consoles from scratch as electronics projects. With the existence of small single-board computers like the Raspberry Pi, this is actually surprisingly easy to build.


Using Ogre’s OpenGL renderer with the OpenVR API (SteamVR, the HTC Vive SDK)

Note to the reader: if some things in this article are unclear and/or omitted, it’s probably because I’ve already explained them in the previous article, about the Oculus Rift SDK, here.

The OpenVR API, and the whole “SteamVR” software stack, is really interesting to target because it’s compatible with many VR systems from the get-go: you write the code once, and it runs on all of them.

In practice, the OpenVR API is a bit simpler to code with than the Oculus SDK. Its naming conventions are from the 90’s (as with all of Valve’s SDKs, but at least they are consistent with themselves!).

Also, the resulting code is less verbose than with the Oculus SDK. It’s almost as if Oculus wanted to make their code fit the Windows/DirectX style (setting up structures with a lot of parameters and passing pointers to them to functions) while Valve took a more OpenGL-like approach (functions that take fixed types and flags telling them what to do); they even named their library OpenVR.
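To illustrate that OpenGL-ish feel, here’s a minimal sketch of talking to OpenVR. The glTextureId parameter is assumed to be an OpenGL texture you already rendered a left-eye view into; real code would initialize once and submit every frame:

```cpp
#include <openvr.h>
#include <cstdint>

void initAndSubmitLeftEye(std::uint32_t glTextureId)
{
    // Initialization: one call, one error enum out.
    vr::EVRInitError error = vr::VRInitError_None;
    vr::IVRSystem* hmd = vr::VR_Init(&error, vr::VRApplication_Scene);
    if (error != vr::VRInitError_None) return; // runtime not installed/running

    // The runtime tells you how big your eye render targets should be:
    std::uint32_t width = 0, height = 0;
    hmd->GetRecommendedRenderTargetSize(&width, &height);

    // Submitting a frame is just handing over a texture handle plus two flags:
    vr::Texture_t eyeTexture{};
    eyeTexture.handle = reinterpret_cast<void*>(static_cast<std::uintptr_t>(glTextureId));
    eyeTexture.eType = vr::TextureType_OpenGL;
    eyeTexture.eColorSpace = vr::ColorSpace_Gamma;
    vr::VRCompositor()->Submit(vr::Eye_Left, &eyeTexture);
}
```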


Using Ogre3D’s OpenGL renderer with the Oculus Rift SDK

Hello there!
Getting a scene rendered by Ogre onto the Oculus Rift is a somewhat involved process. With a basic knowledge of Ogre, and some trial and error while browsing the Ogre wiki, documentation and source code itself, I got the thing running each time Oculus changed the way it worked.
Since we are now at version 0.8 of the SDK, and 1.0 will probably not change much on this front, I think I can write some sort of guide while browsing my Ogre-powered VR game engine, and tell you the story of how it works, step by step.

I’ll paste some code here, with explanations. It’s not structured into classes because I don’t know how you want to organize things. I don’t use the Ogre application framework because I want to choose the order in which things happen myself.
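As a taste of what’s coming, the initialization dance with the 0.8-era C API looks roughly like this. This is a sketch reconstructed from memory of the 0.x API; double-check every name against the exact SDK headers you build with:

```cpp
#include <OVR_CAPI.h>

bool initOculus()
{
    // Start the Oculus runtime:
    if (OVR_FAILURE(ovr_Initialize(nullptr))) return false; // no runtime installed

    // Create a session bound to the connected headset:
    ovrHmd session;
    ovrGraphicsLuid luid;
    if (OVR_FAILURE(ovr_Create(&session, &luid))) return false; // no headset found

    // Headset description: resolution, default per-eye FOV, and so on.
    const ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);

    // Ask the SDK how big the left eye's render texture should be for that FOV:
    const ovrSizei leftEyeSize =
        ovr_GetFovTextureSize(session, ovrEye_Left, hmdDesc.DefaultEyeFov[0], 1.0f);
    (void)leftEyeSize; // used later, when creating the Ogre render texture

    return true;
}
```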