The locomotion problem in Virtual Reality

(Seriously, I hesitated for some time between this version and the original, but that’s not the point of this article, and I kinda like the 80s vibe anyway…)

I think we can all agree here: Virtual Reality (VR) is now, and not science fiction anymore. “Accessible” (not cheap by any stretch of the imagination) hardware is available for customers to buy and enjoy. You can now experience being immersed in virtual worlds generated in real time by a gaming computer, and feel present in them.

The subject I’m about to address doesn’t really apply to mobile (smartphone-powered) VR, since these experiences tend to be static ones. Mobile VR will need reliable positional tracking of the user’s head before hitting this issue… We will limit the discussion to actual computer-based VR.

One problem still bothers me, and the whole VR community as well: in order to explore a virtual world, you have to, well, walk inside the virtual world. And doing this comfortably for the user is, interestingly, more complex than you might think.

You will always have a limited space for your VR play room. You can’t physically walk from one town to another in Skyrim inside your living room; the open world of that game is a bit bigger than a few square meters.

Cockpit games like Elite: Dangerous aside, simulating locomotion is tricky. Any situation where you’re moving can induce nausea.

Cockpit-based games ground you in the fact that you’re seated somewhere and “not moving”, because most of the objects around you (the inside of the spaceship/car/plane) don’t move. This makes it mostly a non-issue: you can do barrel rolls and loops all day long and keep your meal inside your stomach. And you have less chance of killing yourself than inside an actual fighter jet 😉

Simulator (VR) sickness is induced by a disparity between the cues of acceleration you get from your visual system and what your vestibular system senses. The vestibular system is your equilibrium center; it’s a bit like a natural accelerometer located inside your inner ear.

(more…)

The Annwvyn Game Engine, and how I started doing VR

If you know me, you probably also know that I’m developing a small C++ game engine called Annwvyn, aimed at simplifying the creation of VR games and experiences for “consumer grade” VR systems (mainly the Oculus Rift, and more recently the Vive too).

The obvious question is: with tools like Unreal Engine 4 or Unity 5 being free (or almost free) to use, why bother?

There are multiple reasons, but to understand why, I should add some context. This story started in 2013, at a time when you had to actually pay to use Unity with the first Oculus Rift Development Kit (aka DK1), and when UDK (the publicly usable version of Unreal Engine 3) was such a mess I didn’t want to touch it…

The current VR “revolution” was started by John Carmack’s presentation at E3 2012, showing DOOM 3: BFG running on a hacked-together VR headset.

An LCD screen, some lenses, and an IMU sensor, held together with a bit of tape on ski goggles.

That prototype hardware was made by some guy called Palmer Luckey. Carmack found out about the kit and asked him to send a prototype to Texas (I think they contacted each other via the MTBS3D forum), patched DOOM 3: BFG to use it, and showed that at E3. (He explained the story many times to visitors of the id Software booth; search YouTube if you want to see the interviews.)

That Palmer guy found some partners in crime and started up “Oculus VR” to develop a VR headset called “the Rift”. They ran a very successful Kickstarter, then later offered pre-orders of the first “development kit” for it.

The kit was more or less $300, and I jumped on it after some friends showed me the project (I wasn’t aware of it, and the Kickstarter was already finished). That was at the very start of 2013.

They started shipping the kits out in March, but I finally got mine in June, on my very last day of school that year.

UPS dropped this on my doorstep, June 14th 2013

The development kit itself. We’ve come a long way from that!

Luckily it was early in the morning, and I took the box directly to KPS, ESIEA‘s computer science club, without even having time to properly unbox it or try it out.

I set the thing up at school. I already had some demo software on my laptop, so it was working right away, and I tried it with a bunch of friends. Their experiences were more or less comfortable, but everybody was blown away.
Good news: I’m totally immune to VR’s “simulator sickness”. (I don’t get motion sickness, and I have no trouble doing things like reading books in cars.) The DK1’s tracking was orientation-only, with a head/neck model to compensate in translation. The screen inside was genuinely crappy and the image was blurry when you moved your head, but it did the job for the time, and it was amazing!

So, I spent a few hours with this. Valve had just made Team Fortress 2 and Half-Life 2 playable “in VR”, and I spent a few multi-hour sessions playing as Gordon Freeman in City 17 without problems. (Well, the game at the time broke at the part where you had to drive a hovercraft :-P)

After having used this exciting new toy, I thought: well, I know a bit about 3D modeling (I used to do 3D CAD in “engineering sciences” classes in high school, and I had messed around with Blender since middle school, so I knew how to make stuff out of polygons). And, being a programmer, I had this in mind:

“I know nothing about computer graphics programming. But I can learn. I want to write code for that thing.”

So, I evaluated the “software development” landscape for the Rift prototype at the time:

  • The Unity game engine needed a plugin, and the plugin system was for the professional version only at the time. Oculus gave a few months of free Unity Pro trial, but I did not want to rely on proprietary software that I had to pay for
  • UDK was free, but the integration was messy and I couldn’t get it to work with a reasonable amount of effort

Being pretty comfortable with C++, I decided that I would not use one of these ready-made game engines.

I laid out a simple outline of what I would need to simulate a simple VR world with 3D objects around me:

  • I need to make a 3D scene with objects inside, and render it properly for the Rift, including head tracking
  • Each object should at least have rigid body physics and be subject to gravity, to feel realistic
  • Sound should be spatialized, and objects should be able to emit sounds
  • I just want to be able to create a 3D scene with a few lines in the main() function (see the sketch below)
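
To give an idea of the goal, here is a minimal sketch of the kind of API I had in mind. The names (AnnEngine, createGameObject, refresh…) are illustrative of the intent, not necessarily the exact API the engine ended up with:

```cpp
// Hypothetical sketch: a whole VR scene from a few lines of main().
// Class and method names here are illustrative, not the engine's exact API.
#include <Annwvyn.h>

int main()
{
    AnnEngine engine("MyVRExperience");     // init rendering, tracking, audio, physics

    // Populate the world: each object gets a mesh, a rigid body, and can emit sound
    auto barrel = engine.createGameObject("barrel.mesh");
    barrel->setPosition(0, 1, -2);
    barrel->setUpPhysics(10.0f);            // 10 kg rigid body, subject to gravity
    barrel->playSound("hum.wav", true);     // looping, spatialized at the object

    // One call per frame: update tracking, physics, audio, and render both eyes
    while (engine.refresh());
    return 0;
}
```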

The first thing was to get 3D rendering to the headset. At the time, the Rift was simply seen as an external screen by the system, so all you needed to do was put a window in the right place, with the right content, to make it work.
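
In practice (with Ogre, which I ended up using, more on that below), that boils down to a few window creation parameters. A rough sketch, assuming the Rift is the system’s second monitor:

```cpp
// Sketch: a borderless 1280x800 window on the Rift's display ("extended mode" era).
// "border", "monitorIndex" and "vsync" are standard Ogre window creation parameters.
// root is assumed to be an already-initialised Ogre::Root*.
Ogre::NameValuePairList params;
params["border"]       = "none";
params["monitorIndex"] = "1";    // assume the Rift is the second screen
params["vsync"]        = "true";

Ogre::RenderWindow* window =
    root->createRenderWindow("VR Window", 1280, 800, false /*windowed*/, &params);
```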

Without any previous experience in OpenGL or Direct3D programming, I was looking for a library that would give me a nice abstraction layer between me and the graphics card.

I had come across an “HMD Irrlicht” project on GitHub and tried to make something work with it. However, I ran into problems with the camera tracking, and Irrlicht looked a bit light for what I wanted to do.

VR render made with Irrlicht, a scene graph based 3D rendering library.

Then I tried out the other “well known open source 3D engine”, Ogre (Object-Oriented Graphics Rendering Engine), and I was pretty pleased with what I saw.

Somebody called Kojack on the Oculus developers forum (apparently a pretty active member of the Ogre community) had made a simple integration of the Rift with Ogre. It worked really well, and the code was easy for me to understand:

Kojack’s demo: a dancing ogre in a cathedral, in VR

So, I refactored this in a way I was happy with, and then started writing a few classes to instantiate and run, side by side, audio with OpenAL, physics with Bullet, and the 3D rendering. I managed to get something that more or less worked.

The way VR is rendered to HMDs has since completely changed, and there’s nothing left of this old code. Also, in three years I’ve learned a lot about 3D rendering and about the inner workings of the Ogre engine.

The main idea was that I wanted to manipulate just one object and have everything updated. So I wrote a few classes that did just that.
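
A minimal sketch of that “one call updates everything” idea, assuming already-created Bullet, OpenAL, and Ogre objects (the names here are illustrative, not the engine’s real members):

```cpp
// Sketch: one refresh() that steps every subsystem in order.
void AnnEngine::refresh()
{
    const double delta = timer.getElapsedSeconds();

    dynamicsWorld->stepSimulation(delta);    // Bullet: advance the rigid bodies

    // Sync each engine object: graphics node follows physics body, sound follows node
    for (auto object : gameObjects)
        object->syncFromPhysics();

    // OpenAL: keep the listener at the tracked head position
    alListener3f(AL_POSITION, headPosition.x, headPosition.y, headPosition.z);

    root->renderOneFrame();                  // Ogre: draw the frame for both eyes
}
```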

Then I thought that putting all of this together was hard enough to do once. So I removed the main function from my program and started looking into how to build a dynamic library with Visual Studio. After that, I was thinking:

“Hey, I can totally make a game engine out of this”
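
(For the curious, building a Windows DLL mostly means decorating your public classes with the usual export/import macro dance. The macro names here are illustrative:)

```cpp
// When building the engine DLL, classes are exported; when a game includes the
// same header, they are imported. DLLDIR_EX is defined only by the engine's
// own project settings.
#ifdef DLLDIR_EX
#define DLL __declspec(dllexport)
#else
#define DLL __declspec(dllimport)
#endif

class DLL AnnEngine { /* ... */ };
```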

So, after that, I needed a name.

I wanted something that could represent “imaginary worlds”, rooted in history or mythology. I’m a bit into Celtic stuff, and I came across the word “Annwvyn” (the spelling varies, but this is the coolest-looking form, in my humble opinion :-P)

Annwvyn is basically the “otherworld” of the ancient Celts of Wales. It’s also somewhat tied to Arthurian legend, so it has everything to please me 😉


I spent most of my summer making this thing work, and at the end of August, I published the source code on GitHub here.

Through the “technical and scientific projects” of my engineering grad school, I found a way to make Annwvyn the backbone of my projects, so I could work on it during school hours. (But I must admit, even when I’m supposed to be doing other things, I tend to work on it… ^^”)

Picture of the engine running in its first actual release on GitHub, mid 2015

Below are some videos (playlist) demonstrating various things running (or not working, but funny) in Annwvyn.

The engine is still in the early phases of development. It now runs on the Oculus CV1, and theoretically on the Vive too (it runs through OpenVR/SteamVR, but I haven’t had the chance to try it on a Vive).

The default way it represents the player’s position is focused on an FPS-like control scheme. This will be expanded in the near future. For now, the engine is not feature-rich enough to power “real games”. I’m actually working on a few samples/demos that will show what the engine is capable of. I’m also planning to develop some mini-games and experiences using Annwvyn that will be kept compatible with the engine itself.

The engine lacks proper documentation. I’m making some effort on this on a wiki at wiki.annwvyn.org, and the code is commented in a way that lets me generate explicit API documentation via Doxygen at api.annwvyn.org.
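
For instance, Doxygen picks up triple-slash comments straight from the headers. A hypothetical example of the style (this exact method is illustrative, not necessarily the real API):

```cpp
///Create a game object from a 3D model
///\param meshName Name of the Ogre .mesh resource to load
///\return Pointer to the newly created object, owned by the engine
AnnGameObject* createGameObject(const std::string& meshName);
```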

Anyway, this is a student project, not a real, usable piece of software. It’s not properly documented. It’s not properly tested. It’s in an ongoing development state, so the API is not stable. It still lacks fundamental features. But it has the advantage of being the lightest VR development framework I can run on my laptop or on the school’s computers without real issues.

And, well, it’s just my pet project… ^^”

My Linux handheld game console. Part 1

A custom-made, Linux-powered handheld. This sounds interesting, don’t you think?

I’ve had this project in mind for quite some time: I have some hardware collecting dust, and I want to make use of it. I just recently had the idea to blog about it, and to release everything on GitHub afterwards, once it’s actually something useful.
Also, this thing will need a name… But that’s not important right now ^^”

I’ve been a fan of the Ben Heck Show for quite some time. In multiple episodes of the show, Ben makes portable consoles from scratch as electronics projects. With the existence of small single-board computers like the Raspberry Pi, this is actually surprisingly easy to do.

(more…)

Using Ogre’s OpenGL renderer with the OpenVR API (SteamVR, HTC-Vive SDK)

Note to the reader: if some things in this article are unclear and/or omitted, it’s probably because I’ve already explained them in the previous article, about the Oculus Rift SDK, here.

The OpenVR API, and the whole “SteamVR” software stack, is really interesting to target because it’s compatible with many VR systems from the get-go: you code something once and run it on all of them.

In practice, the OpenVR API is a bit simpler to code with than the Oculus SDK. Its naming conventions are from the 90s (as with all of Valve’s SDKs, but at least they are consistent with themselves!).

Also, the resulting code is less verbose than with the Oculus SDK. It’s almost as if Oculus wanted to make their code fit the Windows/DirectX style (setting up structures with a lot of parameters and passing pointers to them to functions), while Valve took a more OpenGL-like approach (functions that take plain types and flags telling them what to do). They even named their library OpenVR.
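
To illustrate that style, here is roughly what talking to OpenVR looks like (a sketch; enum names have shifted a bit between openvr.h versions, so check your header):

```cpp
#include <openvr.h>

// Free functions, plain arguments, and flags, very much in the OpenGL spirit
vr::EVRInitError error = vr::VRInitError_None;
vr::IVRSystem* hmd = vr::VR_Init(&error, vr::VRApplication_Scene);

uint32_t width, height;
hmd->GetRecommendedRenderTargetSize(&width, &height);

// Hand a plain OpenGL texture to the compositor, one call per eye.
// glTextureId is assumed: an OpenGL texture the left eye was rendered into.
vr::Texture_t eyeTexture{ reinterpret_cast<void*>(uintptr_t(glTextureId)),
                          vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
vr::VRCompositor()->Submit(vr::Eye_Left, &eyeTexture);
```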

(more…)

Using Ogre3D’s OpenGL renderer with the Oculus Rift SDK

Hello there!
Getting a scene rendered by Ogre onto the Oculus Rift is a somewhat involved process. With basic knowledge of Ogre, and some trial and error while browsing the Ogre wiki, documentation, and source code itself, I’ve gotten the thing running again each time Oculus changed the way it worked.
Since we are now at version 0.8 of the SDK, and 1.0 will probably not change much on this front, I think I can write some sort of guide: browsing through my Ogre-powered VR game engine, I’ll tell you the story of how it works, step by step.

I’ll paste some code here with explanations. It’s not structured into classes because I don’t know how you want to organize yours, and I don’t use the Ogre application framework because I want to choose the order in which things happen myself.
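
As a taste of what follows, bringing up the Rift with the 0.8 SDK looks roughly like this (a sketch from the 0.8-era headers; the full setup is in the article body):

```cpp
#include <OVR_CAPI.h>
#include <OVR_CAPI_GL.h>

// Bring up the Rift before Ogre gets involved
ovr_Initialize(nullptr);

ovrHmd hmd;
ovrGraphicsLuid luid;
ovr_Create(&hmd, &luid);

ovrHmdDesc hmdDesc = ovr_GetHmdDesc(hmd);

// Ask the SDK for the ideal render target size of one eye
ovrSizei eyeSize = ovr_GetFovTextureSize(hmd, ovrEye_Left,
                                         hmdDesc.DefaultEyeFov[ovrEye_Left], 1.0f);
```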

(more…)