I’ve been sick in bed since returning from AU2013 – the fever I experienced on Thursday morning, before travelling home in the evening, was apparently due to bacterial bronchitis, and I’ve been coughing really badly over the last few days. At least the fever has passed, but I did have to cancel my plans to travel to the Munich DevDay. It would have just been too much for me to be on a series of trains for a total of 7 hours each way. Next year. Anyway, here’s a quick post I’ve had sitting around that I’ve just augmented with some additional info post-AU…
I mentioned a few weeks ago that – partly in preparation for my sessions at AU 2013, but also to share some knowledge within the local organisation – I presented a condensed 1-hour session on both Kinect Fusion and Leap Motion in our two Swiss offices, in Neuchâtel and Gümligen.
Both sessions were rewarding: the audience in Neuchâtel was on average less technical – understandable given the mix of roles people have in that office – while the more technical focus of the session in Gümligen allowed me to dive a little deeper into the programming side of things.
Here’s a quick image of me presenting in Neuchâtel… (I don’t often get shots like this, as I find it hard to talk and take pictures at the same time, but thankfully my friend and colleague Simon Freihart was there to take snaps in both locations. :-)
Here’s one of the presentation in Gümligen, where you can see the chair we partially captured using Kinect Fusion… (the key learning from that session was to use extension cables for both the USB and power cables – something I’ve since remedied. :-)
It was great to connect more with the developers in the Gümligen office, most of whom are now working on the InfraWorks product.
I jumped at the chance to get a few tech demos from the team there, too (thanks in particular to Halil Bolukbashi for that). They have a prototype integration of Leap Motion with InfraWorks as well as something comparable with Oculus Rift. While the latter integration unfortunately wasn’t up & running while I was there, I did get to play around with the device itself for the first time:
Since then I’ve also had the chance to try the HD edition of Oculus Rift in the Exhibit Hall at AU2013 (thanks to Stephane Bersot from our CTO’s Office for snapping the pics):
I find the concept behind castAR to be incredibly interesting: rather than having two separate screens mounted in a headset, as with Oculus Rift, you wear glasses with individual pico projectors mounted over each eye. These project the image outwards into the room – the system is certainly dependent on head-tracking to work properly, but they seem to have addressed that – and the image gets reflected back to the eye by commodity retroreflective safety material… really cool, especially the story of how it got started accidentally.
I particularly like the concept because it’s less immersive – I personally find fully immersive VR to be a little sickness-inducing – and more likely to be used interactively and collaboratively: you can have multiple people with castAR glasses looking at the same reflective surface, each getting their own unique view reflected back… huge, huge potential for collaborative design review.
It’s great to see that they not only got funded but also met their #5 stretch goal (at $1M – wow), which includes a 3D visualization plug-in for Maya. So I guess they’ll be spending some of their well-deserved crowdfunding on an ADN membership… ;-)