Ever since we loaned Thalmic Labs our first Grizzly prototype to make some early demos for their launch video, we’ve been eagerly following their progress. There are obviously a lot of interesting applications for using the Myo for robot control, and our team has been looking forward to getting our hands on one of the first Alpha dev units to make it out into the world. Well, the day finally arrived! I picked the Myo up on Tuesday evening and by noon the following day I was driving a Husky around with it. Needless to say, my dinner wasn’t as elaborate as I’d planned that Tuesday night, but I rarely get to play with robots myself these days with all of the hiring we’re doing, so I figured this was a pretty great excuse to get sidetracked…

Husky controlled with Myo

As you’ll see in the video below, it’s a pretty simple interface. Point forward to drive forward, pull your hand back to reverse, and gesture left and right to turn. We rely on the built-in Myo gesture recognition libraries to enable and disable control of the robot. After all, you don’t want to drive robots all of the time, just most of the time…
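To make the enable/disable idea above concrete, here's a minimal sketch of that arming logic. The pose name (`double_tap`) and the callback shape are assumptions for illustration; the real Myo SDK delivers recognized poses through its own event callbacks, and the actual gesture used may differ.

```python
class DriveGate:
    """Tracks whether Myo input should currently drive the robot.

    A recognized arming gesture toggles control on and off, so orientation
    data only turns into drive commands while the gate is enabled.
    """

    TOGGLE_POSE = "double_tap"  # assumed name of the arming gesture

    def __init__(self):
        self.enabled = False  # start disarmed: no gestures drive the robot

    def on_pose(self, pose):
        """Handle a recognized pose; returns whether driving is enabled."""
        if pose == self.TOGGLE_POSE:
            self.enabled = not self.enabled
        return self.enabled
```

In use, the Myo pose callback would feed `on_pose()`, and the velocity-publishing loop would check `gate.enabled` before sending anything to the robot.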
So, how’d we do it? First, it’s mostly ROS. Our Husky package exposes a standard Twist interface, so we needed to get the Myo data into that format. This was slightly slowed down by the fact that the Myo SDK doesn’t yet support Linux, and ROS on Windows is a bit of a challenge. Fortunately, we’ve faced this before: we turned to our experimental cross-platform rosserial server and ran it in socket mode. If you’ve already got ROS Hydro installed:

sudo apt-get install ros-hydro-rosserial-server ros-hydro-rosserial-python
roslaunch rosserial_server socket.launch

That takes care of the robot side of things. Next, the Myo side. Here, we unfortunately can’t be as detailed as we’d like, since our Windows rosserial client isn’t yet ready for community release. In short, it’s a matter of adding a bit of standard Windows socket code to the example code Thalmic provides, and then figuring out the right mapping from the Myo data to the desired robot velocity. As always, we recommend using timeouts and velocity limits.
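The mapping step, with the timeouts and velocity limits mentioned above, can be sketched roughly like this. The gains, limits, and the choice of pitch for speed and yaw for steering are illustrative guesses, not our actual tuning; in the real client the resulting pair would be packed into a `geometry_msgs/Twist` and sent over the rosserial socket.

```python
import time

MAX_LINEAR = 0.5   # m/s  -- assumed safe cap for a Husky indoors
MAX_ANGULAR = 1.0  # rad/s
TIMEOUT = 0.5      # seconds without fresh Myo data before stopping

def clamp(value, limit):
    """Limit a value to the symmetric range [-limit, limit]."""
    return max(-limit, min(limit, value))

def myo_to_twist(pitch, yaw, last_update, now=None):
    """Map hand pitch/yaw (radians) to a (linear, angular) velocity pair.

    Tilting the hand forward drives forward, pulling it back reverses,
    and yaw steers left/right. If the Myo data is stale (older than
    TIMEOUT), command zero velocity so the robot stops rather than
    running away on the last gesture it saw.
    """
    now = time.time() if now is None else now
    if now - last_update > TIMEOUT:
        return 0.0, 0.0  # timeout: stop the robot
    linear = clamp(-pitch, MAX_LINEAR)   # illustrative unit gain
    angular = clamp(-yaw, MAX_ANGULAR)
    return linear, angular
```

Clamping and the staleness check are the important parts: whatever the gains end up being, the robot should never exceed its configured limits or keep driving after the data stream drops out.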

What’s next for this? We’re not sure yet, but we’ve got a dozen different robots here we could do some more tests with. If you’ve got ideas, feel free to pass them on and we’ll see what we can do!