Virtual Puppeteer

→ Dynamically controlling a digital marionette through physical hand gestures

200w.gif

Abstract

This project leverages p5.js and ml5.js HandPose to create an interactive marionette simulation: a puppet controlled from above by strings attached to its body. Using real-time hand and finger tracking, specific fingertips are connected via virtual strings to the marionette’s joints, so particular finger movements and hand configurations control particular parts of the arms and legs. The marionette’s motion is lifelike, responding naturally to the forces and dynamics of the strings for an immersive and satisfying experience. This piece encourages users to create their own movements, mimicking a dance or telling a story through the marionette’s gestures.

Inspiration

One of my favorite scenes in The Sound of Music is the marionette performance to “The Lonely Goatherd.” I’ve always been impressed by how well puppeteers bring inanimate objects to life through precise movements. This inspired me to think about the different types of art and performances we create with our hands, and how those gestures could be reimagined in a digital space. I wanted to replicate a satisfying tangible experience while exploring gesture-based interaction and how physical input can shape our interactions with digital environments.

tumblr_e680cdf0861f1a217c60d8a74b2bfd30_3667c948_540.webp

image.png

puppet-hand.gif

Audience:

Process: How did you make this? What did you struggle with? What were you able to implement easily and what was difficult?

Screen Recording 2024-11-30 at 3.54.56 PM.mov

https://editor.p5js.org/khz2004/sketches/nWkNsiGVw

I began by attempting to simulate a rectangle suspended by two strings, dynamically adjusting as the strings were pulled. Along the way, I discovered the p5.js function atan2() (https://p5js.org/reference/p5/atan2/), which calculates the angle between the positive x-axis and the line from the origin to a given point, perfect for orienting geometry.
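As a minimal sketch of that orientation math (the helper name and string-endpoint setup here are illustrative, not the project’s actual code), the angle of the suspended rectangle can be computed from its two string endpoints. p5.js’s atan2(y, x) behaves like plain JavaScript’s Math.atan2:

```javascript
// Illustrative helper (not the project's actual code): angle of a bar
// hanging between two string endpoints, measured from the positive x-axis.
function barAngle(leftX, leftY, rightX, rightY) {
  return Math.atan2(rightY - leftY, rightX - leftX);
}

// In a p5 draw() loop, this angle would drive the transform, e.g.:
//   translate((x1 + x2) / 2, (y1 + y2) / 2);
//   rotate(barAngle(x1, y1, x2, y2));
//   rect(-barWidth / 2, 0, barWidth, barHeight);
```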

Screen Recording 2024-12-02 at 5.31.48 PM.mov

I started adding more body parts to create a half skeleton. In this sketch, the shoulder (red ellipse) and elbow (blue ellipse) are draggable. Initially, I planned to make the wrist dynamic as well; however, atan2 only calculates the angle defined by two points, which made it difficult to coordinate the positions and angles of three connected joints.
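One way around the three-point issue (sketched here with my own naming, not the project’s code) is to apply atan2 once per segment, so each angle only ever involves two points: the shoulder-to-elbow angle orients the upper arm, and the wrist is then placed along a second angle from the elbow. The straight-hang assumption below is a simplification:

```javascript
// Hypothetical per-segment approach (not the project's actual code):
// chain atan2 calls so each joint angle involves only two points.
function segmentAngle(a, b) {
  return Math.atan2(b.y - a.y, b.x - a.x);
}

function placeWrist(shoulder, elbow, forearmLength) {
  // Simplifying assumption: the forearm hangs along the same direction
  // as the upper arm; a real marionette would offset this angle.
  const angle = segmentAngle(shoulder, elbow);
  return {
    x: elbow.x + forearmLength * Math.cos(angle),
    y: elbow.y + forearmLength * Math.sin(angle),
  };
}
```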

Screenshot 2024-12-02 at 5.28.37 PM.png

I then decided to test and implement HandPose to get a sense of how everything would eventually connect. It was interesting to experiment with translating a very physical and tangible art form into a digital space. In physical puppeteering, your hands are typically palm down, manipulating the puppet from above. Here, HandPose achieves the most accurate detection when your palm is flat and facing the camera. I had to decide what aspects of physical puppeteering I wanted to keep, with technical constraints in mind. This also helped me start thinking through the instructions I would provide to users.

Screen Recording 2024-12-03 at 2.57.56 PM.mov

https://editor.p5js.org/khz2004/sketches/j11y2loSU

It was pretty easy to connect! The rectangle is suspended by strings that follow the x and y positions of the index and ring fingertips.
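The fingertip lookup can be sketched as a small helper (illustrative, not the project’s actual code; the keypoint names assume ml5.js HandPose’s MediaPipe-style labels such as "index_finger_tip"):

```javascript
// Illustrative helper (not the project's actual code): given one detected
// hand from ml5.js HandPose, pick out the fingertips the strings follow.
// Keypoint names assume the model's MediaPipe-style labels.
function stringAnchors(hand) {
  const byName = {};
  for (const kp of hand.keypoints) byName[kp.name] = kp;
  return {
    left: byName["index_finger_tip"],  // one string endpoint
    right: byName["ring_finger_tip"],  // the other string endpoint
  };
}
```

Each returned anchor carries x and y in video coordinates, ready to feed into the suspended-rectangle math.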

At this point, I was able to take the sketch to user testing and get some feedback.

User testing:

I had users interact with the sketch with minimal instructions to see how they naturally engaged with it. I wanted to see whether they used one hand or two, and which fingers they instinctively used to control the system.

I also asked for feedback on the hand controls and the visual design/interface, and received some good suggestions: