Today we’re following on from last week’s post introducing this project, in which we want to convert the Google Cardboard A360 samples to work in a more integrated manner with the Samsung Gear VR.
The main purpose of the project is to see how we can hook up the existing, web-based samples to take advantage of the Gear VR’s hardware. We definitely don’t want to re-implement any of the visualization stack natively; if we can use hardware input events to control the web-based samples in a meaningful way, that’ll be just fine for this proof of concept.
It took some work, but I was able to get the Oculus software running on the Note 4 inside Gear VR. I had to downgrade the software for it to work, though – and to a version that has some quirks – but it’s definitely usable. And it’s impressive: the Oculus team have done a great job implementing a nausea-free experience. You can view 360-degree videos and panoramic photos, and even watch a movie while sitting in a virtual cinema. Very neat!
The next step, for this project to work, was to build an Android app using the Oculus Mobile SDK. As mentioned last time, until the Note 4 device I’m using runs Android Lollipop, I won’t actually be able to load a page containing the A360 viewer – which is WebGL-based – inside a WebView control, but I can at least work on the UI plumbing in the meantime.
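Just to illustrate the constraint – this is a hypothetical sketch rather than the project’s actual code, and both URLs are placeholders – one option is a simple runtime guard that only loads the WebGL viewer on Lollipop and later, where the system WebView is Chromium-based:

```java
import android.os.Build;
import android.webkit.WebView;

// Hypothetical guard: the WebGL-based A360 viewer needs the Chromium
// system WebView that arrives with Lollipop (API 21), so older devices
// get a placeholder page while the UI plumbing gets worked on.
static void loadViewer(WebView webView) {
    if (Build.VERSION.SDK_INT >= 21) { // Build.VERSION_CODES.LOLLIPOP
        webView.loadUrl("https://example.com/a360-gearvr/viewer.html");
    } else {
        webView.loadUrl("https://example.com/a360-gearvr/menu-only.html");
    }
}
```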
It took me about a day of work to get the initial project building properly. I had to install a lot of software, including…
- JDK – the Java SE Development Kit 8
- Eclipse Juno (I first tried Indigo, but that didn’t work, and it seems Android Studio isn’t yet supported)
- ADT Plugin for Eclipse (this is no longer easy to get hold of, now that Android Studio is the supported Android development IDE)
- Android NDK (we’re not really using native C/C++ code in our sample, but we’re basing our project on a native – rather than Unity-based – sample)
- Android 4.4.2 SDK (API 19)
I then had to create a basic project using the VrTemplate project: there’s a handy script to stamp out copies of this template with your own company and app names. Again, while the template has a native component to it, we’re going to avoid writing C/C++ code unless we really have to.
I followed quite a few dead ends to get to this point. Getting an .apk built ended up being quite an achievement: getting it to install and load was another one…
I eventually managed to get the app to launch on the Note 4 via USB debugging, but clearly relying on a USB cable to deploy the app meant the phone couldn’t be mounted in the Gear VR headset at the same time. I could at least use this approach to work out how to get input from the Bluetooth gamepad, but it was only going to get me so far.
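As a for-instance, here’s roughly what listening for the gamepad’s analog stick looks like using Android’s standard input APIs – a sketch of the approach rather than the project’s actual code, with an arbitrary deflection threshold:

```java
import android.view.InputDevice;
import android.view.MotionEvent;

// Inside the main activity: pick up analog-stick movement from the
// Bluetooth gamepad while the phone is tethered over USB debugging.
@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    boolean fromJoystick =
            (event.getSource() & InputDevice.SOURCE_JOYSTICK)
                    == InputDevice.SOURCE_JOYSTICK;
    if (fromJoystick && event.getAction() == MotionEvent.ACTION_MOVE) {
        float x = event.getAxisValue(MotionEvent.AXIS_X);
        if (x > 0.5f) {
            // e.g. step the menu selection to the right
        } else if (x < -0.5f) {
            // e.g. step the menu selection to the left
        }
        return true;
    }
    return super.onGenericMotionEvent(event);
}
```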
One important step was to generate a device-specific signature file, but in the end it was this forum thread that got me to the point where I could deploy and run the app inside the Gear VR headset. I needed to enable wifi debugging for adb – which freed me from the USB cable – but also to make some changes to the application’s manifest to make sure it had the “vr_only” flag set (the docs stated that “vr_dual” was needed for testing as both a normal Android app and a VR app, but in reality “dual” ended up meaning “vanilla Android only”). Setting “vr_only” meant that launching the app brought it up directly inside the VR environment. Perfect!
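For reference, this is my understanding of the standard Gear VR meta-data entry in AndroidManifest.xml – check it against the Mobile SDK docs for your version before relying on it:

```xml
<!-- Declares the Gear VR application mode; "vr_only" launches the app
     directly inside the VR environment -->
<application>
    <meta-data
        android:name="com.samsung.android.vr.application.mode"
        android:value="vr_only" />
</application>
```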
Now for a look at what’s working so far, and where the app needs to go…
I modified the main activity to include a WebView control, which then gets pointed at an updated set of samples for Gear VR. This version of the samples has some cosmetic differences: rather than a single menu of models to pick from, the menu is duplicated, once for each eye. I’ve also adjusted the colours to make the page less visually jarring when viewed in an immersive environment.
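The WebView setup itself boils down to a few lines like these – a minimal sketch assuming a plain Activity rather than the SDK’s VR activity class, with a placeholder URL:

```java
import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebSettings;
import android.webkit.WebView;

public class MainActivity extends Activity {
    private WebView webView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Host the Gear VR-flavoured samples in a full-screen WebView;
        // the URL is a stand-in, not the samples' real location
        webView = new WebView(this);
        WebSettings settings = webView.getSettings();
        settings.setJavaScriptEnabled(true); // the samples are JS-driven
        setContentView(webView);
        webView.loadUrl("https://example.com/a360-gearvr/index.html");
    }
}
```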
Now for the main point of the post… I was able to hook up the UI events provided by the Oculus Mobile and Android SDKs – whether from the Bluetooth gamepad or from the touchpad on the side of the headset – and wire these into the HTML samples. Essentially, the Java code calls JavaScript functions in the WebView whenever an event arrives, and these functions drive the HTML page.
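To give a feel for the approach – again a hedged sketch, not the project’s actual code, and the JavaScript function names are invented for illustration – the bridge amounts to mapping key events onto JavaScript calls into the page:

```java
import android.view.KeyEvent;

// Forward a snippet of JavaScript into the hosted page. The
// loadUrl("javascript:...") form works on all WebView versions;
// evaluateJavascript() is an alternative from API 19 onwards.
private void sendToPage(String js) {
    webView.loadUrl("javascript:" + js);
}

// Map hardware key events to page-level actions; the function names
// called here are placeholders for whatever the samples expose
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    switch (keyCode) {
        case KeyEvent.KEYCODE_DPAD_LEFT:
            sendToPage("selectPreviousModel();");
            return true;
        case KeyEvent.KEYCODE_DPAD_RIGHT:
            sendToPage("selectNextModel();");
            return true;
        case KeyEvent.KEYCODE_BUTTON_A:
            sendToPage("loadSelectedModel();");
            return true;
    }
    return super.onKeyDown(keyCode, event);
}
```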
Here’s a quick video of the gamepad controlling the UI, as a “for instance”.
Here’s the current state of the main Java file in the sample, showing how the WebView is created and how the Gear VR hardware is able to communicate with it.
The next step is to enable the controls when viewing a particular model – especially zoom and explode. At this stage that’s a relatively simple problem to solve: the plumbing should now all be in place for this to happen.
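Hypothetically – with invented function names, since the real samples will expose their own – those commands become just more JavaScript calls through the same bridge:

```java
// Model-level commands ride the same Java-to-JavaScript bridge;
// zoom() and explode() are placeholders for the samples' real API
private void zoomModel(double factor) {
    webView.loadUrl("javascript:zoom(" + factor + ");");
}

private void explodeModel(double scale) {
    webView.loadUrl("javascript:explode(" + scale + ");");
}
```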
At some point I’d like to look at moving the model-selection page away from HTML, integrating it more tightly with the Oculus menuing system (which is much easier on the eyes: a menu that stays fixed in 3D space – at least as far as your brain is concerned – is far more comfortable than one that follows your gaze). But that would involve some duplication: right now, having the models enumerated only in HTML keeps them much simpler to update.
photo credit: TechStage via photopin cc