Last week I spent a morning at SINDEX 2014, which is apparently Switzerland’s largest technology exhibition. Its main focus is on industrial electronics and automation: not exactly fields I know a lot about, but ones I find extremely interesting. There were 400 or so exhibitors, many focused on providing sensors and electrical equipment, others providing complete automation solutions.
There was a serious focus on robotics, for instance, something in which regular readers will know I have a strong interest. This post contains a few examples of the robots that were on show.
This robot, from Stäubli, is moving boxes of Tic Tacs around at high speed. Yes, the video is in no way sped up – it really moved that quickly.
Enabling robots to function more autonomously typically involves computer vision and image processing techniques, a very cool – and increasingly important – part of the technology landscape. Some of the most interesting exhibits, for me, were the ones using integrated depth cameras (e.g. Kinect), laser scanners or just basic image processing to enhance a robot’s ability to deal with varying situations.
Take this system from Isra Vision…
… where the robot is picking parts out of a bin based on input from a laser scanner:
The system scans the top layer of the bin and matches portions of the resulting point cloud against a known 3D model of the part. That way the robot can position itself effectively to pick up the “best” part from the bin and then go ahead and do something with it.
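To give a flavour of what’s involved – and this is a minimal, hypothetical sketch in Python, not anything to do with Isra Vision’s actual implementation – the core of this kind of bin-picking is usually a rigid registration step: find the rotation and translation that best aligns points sampled from the part’s 3D model with the scanned point cloud. A basic ICP (Iterative Closest Point) loop looks something like this:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    # Kabsch: least-squares rotation R and translation t mapping src onto dst
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(model_pts, scan_pts, iterations=30):
    # Iteratively align points sampled from the part's 3D model (model_pts)
    # with the laser-scanned point cloud of the bin's contents (scan_pts)
    R_total, t_total = np.eye(3), np.zeros(3)
    src = model_pts.copy()
    tree = cKDTree(scan_pts)
    for _ in range(iterations):
        _, idx = tree.query(src)        # closest scan point for each model point
        R, t = best_rigid_transform(src, scan_pts[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total             # estimated pose of the part in the scanner frame
```

A real system clearly does a great deal more than this – segmenting individual parts out of the pile, seeding the alignment with a global search, scoring candidates for reachability – but the recovered pose is ultimately what tells the robot where to put its gripper.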
If you look at the above video – at around the 45 second mark – you can see a physical debugging console showing the code the robot’s controller is stepping through. Apparently most manufacturers have their own (often quite arcane) languages for robotic control (again – not a field I have experience with… I’d be very happy to get comments from people who do). It was certainly very interesting to see the code in action.
Of course there were a number of fun technology demonstrations that have no serious, practical purpose – and yet show the technology’s potential – something I admit I like a great deal (check my next post for one more example of this ;-).
For instance, here’s the Xylobot – created at HEIG-VD – playing the Dance of the Sugar Plum Fairy:
It was programmed to play a number of other tunes, too. Fun stuff!
Update:
Martin Müller made a very valid comment that the Xylobot is in some ways quite ugly: not necessarily in terms of its physical appearance but in terms of its perfect precision. I expect it’d be possible to program in a little imperfection, of course – if the goal was to make the performance more “human” – but that clearly wasn’t the point of this particular technology demonstration.
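Just for fun, here’s roughly how that imperfection might be programmed in – a minimal Python sketch, assuming the tune is stored as a list of (note, time, velocity) strike events, which is purely my assumption and not how the Xylobot actually works:

```python
import random

def humanize(strikes, timing_sd=0.015, velocity_sd=0.05):
    """Nudge each strike's timing and force with a little Gaussian noise,
    so a perfectly precise performance sounds slightly more human."""
    result = []
    for note, t, vel in strikes:
        t = max(0.0, t + random.gauss(0.0, timing_sd))                  # ~15 ms of timing drift
        vel = min(1.0, max(0.1, vel + random.gauss(0.0, velocity_sd)))  # vary how hard we hit
        result.append((note, t, vel))
    return sorted(result, key=lambda s: s[1])                           # keep strikes in playing order

# e.g. humanize([("E5", 0.0, 0.8), ("G5", 0.5, 0.8), ("C6", 1.0, 0.8)])
```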
Martin’s comment also reminded me of another video I took of some more “sensitive” robots, this time made by KUKA. These are elegant not only in their design but in the “care” with which they can perform intricate tasks. Perhaps not beautiful in the sense of replicating human imperfection, but impressive nonetheless…