Making music with my arms

The brilliant Imogen Heap performed in New York a few weeks ago, and I got to experience live how she crafts sounds with her arms and hands.

It was a great night of both beautiful music and technology.

One mystery I couldn't solve from the audience was how her computer detected the position of her arms. Unlike in her early videos, I didn't see anything akin to a Kinect on stage.

Now I think maybe I know.

That's because this week I took a workshop from Hannah Davis on using the ml5.js coding library, which touts itself as "friendly machine learning for the web," letting me use machine learning models in a browser. The class was part of the art+tech Eyeo Festival in Minneapolis.

One of the models Davis demonstrated was PoseNet (also here), which estimates the position of various body parts — elbows, wrists, knees, etc. — in an image or video. I'd never seen PoseNet work before, let alone in JavaScript and in a browser.
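
To give a flavor of what that looks like, here's a rough sketch of PoseNet keypoint detection in the browser (not Davis's exact demo). It assumes the page already loads ml5.js with a script tag and has a <video id="video" width="640" height="480"> element (names of my choosing), and it uses the ml5 PoseNet API as I remember it from the 0.x releases:

```javascript
// script.js: a rough sketch of PoseNet keypoint detection in the browser.
// Assumes ml5.js is already loaded on the page via a <script> tag and that
// the page has a <video id="video" width="640" height="480"> element.

const video = document.getElementById('video');

// Pipe the laptop camera into the video element.
navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  video.srcObject = stream;
  video.play();

  // Hand the video to PoseNet; the callback fires once the model has loaded.
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));

  // PoseNet emits a 'pose' event for each new estimate. Each detection has
  // a keypoints array plus named shortcuts (nose, leftElbow, rightWrist...),
  // each with x/y pixel coordinates and a confidence score.
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) {
      const { rightWrist } = poses[0].pose;
      console.log('right wrist at', rightWrist.x, rightWrist.y, rightWrist.confidence);
    }
  });
});
```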

Inspired by Heap, I set out to quickly code a music controller based on my arm movements, as seen by PoseNet through my laptop camera.

Try it yourself

It's pretty rough, but you can try it here. Just let the site use your camera, toggle the sound on, and try controlling the pitch by moving your right hand up and down in the camera frame!
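
Under the hood, the core idea is just a mapping: the higher my right wrist sits in the frame (a smaller y in pixel coordinates), the higher the pitch. Here's a rough sketch of that mapping with a plain Web Audio oscillator rather than the demo's exact code; the frequency range, the confidence cutoff, and the "sound-toggle" button id are illustrative choices of mine, and it assumes a poseNet instance like the one in the earlier sketch:

```javascript
// Rough sketch of the pitch control: right wrist height in the frame
// drives an oscillator's frequency. Assumes the `poseNet` instance from the
// sketch above and a hypothetical <button id="sound-toggle"> on the page.

const audioCtx = new AudioContext();
const osc = audioCtx.createOscillator();
const gain = audioCtx.createGain();
gain.gain.value = 0;                    // silent until the sound is toggled on
osc.connect(gain).connect(audioCtx.destination);
osc.start();

// Browsers only start audio after a user gesture, hence the toggle button.
document.getElementById('sound-toggle').addEventListener('click', () => {
  audioCtx.resume();
  gain.gain.value = 0.2;
});

poseNet.on('pose', (poses) => {
  if (poses.length === 0) return;
  const wrist = poses[0].pose.rightWrist;
  if (wrist.confidence < 0.2) return;   // skip low-confidence detections

  // Video y grows downward, so invert it: hand up means a higher pitch.
  const videoHeight = 480;              // match the <video> element's height
  const t = 1 - Math.min(Math.max(wrist.y / videoHeight, 0), 1);

  // Map 0..1 onto roughly two octaves, 220 Hz to 880 Hz, and glide smoothly.
  osc.frequency.setTargetAtTime(220 + t * 660, audioCtx.currentTime, 0.05);
});
```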

I put the project on Glitch, which means you can remix it. Or take a peek at the code on GitHub.

There are lots more ml5.js examples you can try. Just put the index.html, script.js, and models folder (if the example has one) somewhere on the web where they can be served, or put them on your local machine and run a simple "localhost" server.
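
If "run a simple localhost server" sounds like a chore, any one-line static file server will do. For example, from the folder that holds index.html (assuming you have Python 3 or Node installed):

```sh
# Serve the current folder at http://localhost:8000 with Python 3's built-in server
python3 -m http.server 8000

# or, with Node installed, a zero-config static server
npx http-server -p 8000
```

Then point a browser at http://localhost:8000 and let the page use your camera.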