It’s Friday – and a holiday in the US – so it’s time for some fun…
Some weeks ago I received an interesting device through the post. It’s called the 3D Rudder and is made by a startup in Marseille. They’ve basically created a foot-controlled joystick with three degrees of freedom, so you can control roll, pitch and yaw, all while keeping your hands free.
The makers of the 3D Rudder originally intended it for navigating in 3D design applications – one of the founders is an architect by background – so their interest in me playing with it is mainly to provide a basic integration with AutoCAD. They have a .NET SDK, so it shouldn’t be too hard to put something together and share it on GitHub… expect to see more on this in a future post.
Here’s the device in question. You place it on the ground, connect it via USB to your PC – where, all being well, it should be recognised as a joystick – and away you go: you place your feet on either side and can then not only tilt forwards/backwards and sideways (for pitch and roll, respectively) but rotate left and right, too (for yaw).
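Since the device presents itself as a standard joystick, each of those movements presumably arrives as a raw integer axis value that needs normalising before it's useful. Here's a minimal sketch in Python of how that mapping might look – the 16-bit axis range (0–65535), the dead zone, and the axis-to-attitude assignments are all assumptions on my part, not details from the 3D Rudder's actual driver or SDK:

```python
# Hypothetical sketch: normalise raw joystick axes into pitch/roll/yaw.
# Assumes a 16-bit axis range (0-65535) centred at 32767 with a small
# dead zone; the real 3D Rudder driver may report values differently.

def normalise_axis(raw, lo=0, hi=65535, dead_zone=0.05):
    """Map a raw axis value to [-1.0, 1.0], zeroing small deflections."""
    centre = (lo + hi) / 2.0
    value = (raw - centre) / ((hi - lo) / 2.0)
    if abs(value) < dead_zone:
        return 0.0
    return max(-1.0, min(1.0, value))

def axes_to_attitude(x_raw, y_raw, z_raw):
    """Tilt sideways -> roll, tilt forward/back -> pitch, rotate -> yaw."""
    return {
        "roll":  normalise_axis(x_raw),
        "pitch": normalise_axis(y_raw),
        "yaw":   normalise_axis(z_raw),
    }

# Feet level and centred: everything falls inside the dead zone.
print(axes_to_attitude(32767, 32767, 32767))
# A full forward tilt maps to maximum pitch.
print(axes_to_attitude(32767, 65535, 32767)["pitch"])
```

The dead zone is the interesting design choice here: with your full body weight resting on the device, you're never perfectly still, so small deflections around the centre need to be ignored or you'd drift constantly.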
It takes a little while to “find your feet” (haha) but after that it’s fairly intuitive. I gave it a try, a few weeks ago, with the desktop version of Google Earth: I entered “flight simulator” mode and used the 3D Rudder as a joystick to fly around, as shown (by someone else) in the video below:
It was an interesting experience. I’ve always been pretty bad at flight simulators – this didn’t do much to change that – but I could certainly see the potential for this style of user input device.
Interestingly, while it was initially intended as a peripheral for design applications, the 3D Rudder also has the potential to help navigate in VR. It’s only really usable when seated, rather than at room-scale, of course.
So when Google Earth VR was launched on Steam last week, my first thought was that it might be interesting to combine 3D Rudder and Google Earth VR. That didn’t end up being the case – at least I haven’t found a way to do it, as yet – but I did have an incredible time playing with Google Earth VR using the standard HTC Vive controllers.
Google Earth VR is an amazing application. They’ve created a really compelling interface that allows you to navigate the planet in VR, and have really pushed the envelope when it comes to reducing cybersickness.
Here’s a quick set of captures of me zooming into my home (everyone does this when they load Google Earth, right?).
You’ll notice a grid appearing at the periphery of your vision when performing certain operations. This is completely deliberate: it’s not about performance, it’s about reducing your nausea. If you were fully immersed as you zoomed around at unrealistic speeds, your brain would soon believe something very unnatural was happening. I did end up feeling a little sick, myself, but I believe that was because I was regularly using Steam to snap screenshots for this post and every time I did so the view froze for a second.
Another interesting aspect to the experience is that rather than flying way above the Earth – with a severe risk of vertigo – you are essentially scaled up and stomping around a smaller version of the planet. Which is great for those with a God(zilla) complex. :-)
There is an option that allows you to zoom down to human scale, but it has to be enabled manually, and even then it only kicks in once you reach a certain zoom factor – you’re never left floating just above the planet’s surface. This was (once again) very well thought out.
The last UX piece that I simply loved was the ability to grip the Sun using a controller and drag it across the sky to change the time of day. You can just see the arc of the Sun – which it won’t deviate from – as well as the blue outline of my VR area’s boundaries in the screenshot below. What an amazing way to perform sun studies!
From beginning to end Google Earth VR was a sheer delight. Once again Google has broken new ground in addressing a number of important VR usability questions. The VR industry really owes them a debt of gratitude.