As promised way back at the start of this series of posts, in which we've used various Windows Holographic platform capabilities to build an app that displays an animated ABB industrial robot inside HoloLens, here's the part where we make it dance. :-)
During Autodesk Switzerland’s 25th anniversary party in late October, people will be able to give the app a try and hopefully see the robot dancing along to whatever tunes the DJ plays. Let’s talk a bit about how the system works…
Quite early on I decided against doing the additional audio processing directly on the HoloLens device itself. Instead I wanted to set the HoloLens up as a TCP server that a client could connect to, sending it a message whenever a beat was detected.
Let’s take a look at what was needed on the HoloLens for this…
The big challenge here is that Unity's C# implementation is based on Mono, which doesn't yet have async/await implemented. And as Universal Windows Platform (UWP) apps live in a sandbox, it's not always easy to get things working.
What I really wanted was a System.Net.Sockets.TcpListener, which is apparently available in a new version of the System.Net.Sockets component, but I couldn’t work out how to make use of this from inside a Unity app.
My next attempt was influenced heavily by a StackOverflow thread. I'd hoped to use a StreamSocketListener, but due to the lack of async/await support I was out of luck, once again.
It was time to go *way* back to basics… it was time to crack open some VS 2008 (perhaps even older) code on MSDN.
I took the Server implementation from example code for System.Net.Sockets.SocketAsyncEventArgs. I took the BufferManager implementation from the example code for its SetBuffer method. I took the SocketAsyncEventArgsPool implementation from the example code for its constructor. And finally, I added the missing AsyncUserToken implementation, as per StackOverflow.
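For reference, the AsyncUserToken doesn't need to be anything fancy: for this example something as simple as a per-connection holder for the client's socket will do (this is a minimal sketch, not necessarily the exact class from that thread):

using System.Net.Sockets;

// Minimal per-connection state attached to each SocketAsyncEventArgs:
// the server uses it to know which socket to keep receiving from and
// which to close when the client disconnects.
public class AsyncUserToken
{
  public Socket Socket { get; set; }
}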
With these pieces in my Unity project, it was trivial to start a TCP server from my MonoBehaviour:
using UnityEngine;
using System.Net;

public class Dance : MonoBehaviour
{
  // Set from the socket thread when a message arrives, consumed on the
  // main thread in Update()
  private bool toggle = false;
  private int port = 4444;
  private TcpServer s;

  void Start()
  {
    // 10 simultaneous connections, 100-byte receive buffer per socket
    s = new TcpServer(10, 100);
    s.Init(ReceiveCompleted);
    s.Start(new IPEndPoint(IPAddress.Any, port));
  }

  internal void ReceiveCompleted()
  {
    // Called on a worker thread, so just flag the event rather than
    // touching any Unity objects directly
    toggle = true;
  }

  void Update()
  {
    if (toggle)
    {
      // Tell all the robot's components to reverse their rotation
      this.BroadcastMessage("OnReverse");
      toggle = false;
    }
  }
}
Whenever a TCP message is received, we simply reverse the direction of the robot's motion: all its moving parts change their rotation direction. We could make it more sophisticated, but for now it at least clearly responds to the messages it receives.
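To give an idea of what's happening on the receiving end, here's a simplified sketch of the kind of script that might sit on each rotating part of the robot (the names and rotation logic are placeholders rather than the actual robot code):

using UnityEngine;

// Hypothetical script attached to each rotating part of the robot.
// BroadcastMessage("OnReverse") from the Dance script calls OnReverse()
// on every component in the hierarchy that implements it.
public class JointRotator : MonoBehaviour
{
  public Vector3 axis = Vector3.up;
  public float speed = 45f; // degrees per second
  private float direction = 1f;

  void Update()
  {
    // Keep the joint spinning in the current direction
    transform.Rotate(axis, speed * direction * Time.deltaTime);
  }

  // Called via BroadcastMessage() whenever a beat message arrives
  void OnReverse()
  {
    direction = -direction;
  }
}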
So how do we detect beats in the music?
I found some Processing code on the web that performs beat detection on an audio stream and displays the results in a graphical window on my Mac (it uses Soundflower, an OS X system extension). As Processing has some limited support for HTTP & TCP, I went ahead and added a few lines of code to send a message via a TCP connection to the HoloLens device. The client code is really trivial: I just create a Processing.Net.Client object – pointing to the IP address and port of the HoloLens’s TCP server – and use the write() method to send a single character (a “b”, but it could be anything).
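If you'd rather test the HoloLens server without Processing, an equivalent client is only a few lines of C# (the address below is just a placeholder for your HoloLens's IP):

using System.Net.Sockets;
using System.Text;

// A quick console-app client that does the same job as the Processing
// sketch: open a TCP connection to the HoloLens and send a single byte.
class BeatSender
{
  static void Main()
  {
    // Placeholder address: use the IP reported by your HoloLens
    using (var client = new TcpClient("192.168.0.10", 4444))
    using (var stream = client.GetStream())
    {
      // The content doesn't matter to the server: any received message
      // toggles the robot's direction
      byte[] payload = Encoding.ASCII.GetBytes("b");
      stream.Write(payload, 0, payload.Length);
    }
  }
}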
Here it is in action:
It doesn’t dance very well, but then I can say that it dances significantly better than I do. So perhaps I’m not in a position to criticise. ;-)