1 month in the Vive - Part 2/4

This is the second part in Michela's four-part series on her first month's experience with the HTC Vive. Read part 1.

The Vive software setup is rather slick. Love it or hate it, Valve's Steam distribution platform is a marvel of software design and engineering. Steam VR is no exception.

The room-based calibration wizard requires you to trace an outline of your space with a controller and perform a few other tasks, but it's all done in a few minutes. That speed is handy, because if anyone knocks your lighthouses out of position (the advice is never to move them while powered on), you'll have to redo the process.


Once we'd finished configuring, it was time for the Room Tutorial, a rather cute vignette set in Aperture Labs, hosted by Wheatley from Portal 2. It is hard to overemphasise just how immersive the Vive is once you start moving around in your real space and manipulating virtual objects with the controllers. The tracking is spot-on. You do need to ensure that nothing blocks the line of sight between your headset and the lighthouses (otherwise the view whites out momentarily), but in general the tech feels rock solid - quite remarkable for a first release.

Most experiences we've come across utilise both Vive controllers in tandem but for our first development outing, I wanted to focus on using just one. I had a very specific concept in mind.


The Steam OpenVR stack gives developers a lot of room to move. Geddit? The hardware support for the two engines we use most at Mod (Unity and Touch Designer) was superb. Thanks, all you beta testers!

After a short Unity test run with guidance from VR Dev School, I was ready to jump into my weapon of choice, Touch Designer. For those of you who haven't heard of it, Touch is a fork of the blockbuster VFX tool Houdini, engineered specifically as an authoring tool for real-time experiences. We built ACO Virtual in Touch several years ago, and I was hankering to see how easy or otherwise it would be to port this “live” immersive experience to VR.

This was a surprisingly pleasant experience. Touch as a procedural engine is overkill for many simple applications, but media artists love it because it places few restrictions on how you work with real-time video. If your hardware can handle it, Touch will probably enable it. Within a few days I had a playable VR demo up and running, and on 7 June we gave our first public exhibition of ACO Virtual VR, incorporating the complete 35-minute show.

There was still a way to go. We didn't have the beautiful audio mixing in place - we literally couldn't fit our sound card into our new VR PC - so the experience wasn't quite up to the level of the ACO Virtual touring show. But what we did get was something that runs in our studio and makes you feel like you're inside the experience, without a van's worth of kit. Very satisfying.

For those of you who haven't visited ACO Virtual, the experience revolves around the ability to swipe across images of the musicians on a tablet and have the immersive sound and visuals respond as if you are conducting. I wanted to see if the Vive controller would make a decent substitute for a touchscreen. We’ll share what we've learned in upcoming posts.
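The heart of that question is gesture mapping: turning a stream of tracked controller positions into the same swipe events a touchscreen would produce. As a minimal sketch of the idea - the function name, coordinates, and thresholds here are our own illustration, not what ACO Virtual actually uses:

```python
# Hypothetical sketch: classify a stream of controller positions as a
# horizontal "conducting" swipe. Positions are (x, y) pairs in metres,
# sampled while the trigger is held.

def detect_swipe(positions, min_distance=0.3, max_vertical_drift=0.1):
    """Return 'left', 'right', or None for a list of (x, y) positions."""
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]          # net horizontal travel
    dy = abs(positions[-1][1] - positions[0][1])     # net vertical drift
    # A swipe must travel far enough sideways without wandering vertically.
    if abs(dx) >= min_distance and dy <= max_vertical_drift:
        return "right" if dx > 0 else "left"
    return None
```

In practice you'd sample the controller pose every frame from the VR runtime and reset the buffer when the trigger is released; the thresholds would need tuning so a conductor-style arm sweep registers but incidental hand movement doesn't.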

Continue to part 3 of this article


More about Mod