Watch

Machine Learning for Human Creative Practice by Dr. Rebecca Fiebrink at Eyeo 2018.

I really enjoyed watching Rebecca’s talk about how machine learning can augment and encourage creativity. Her ideas offer a refreshing perspective on the role of ML as a tool for play, expression, and exploration rather than just for efficiency or automation.

A lot of what we’ve been doing with HandPose, BodyPose, and FaceMesh relates to her point about ML enabling new forms of interaction and expression. Using sensors to track gesture and movement opens up a whole new medium for creative expression, where our bodies become interactive controllers for visuals, sound, and more.

Another big point Rebecca made is that ML is a great teaching and learning tool, making it easier and more inviting for people of all ages to engage in creative projects. When I have an idea for a project, ML tools are great at giving me a broad, high-level overview of the steps to get started. I often use chatbots to narrow my research and guide me to the right resources, which makes the creative process feel less daunting and more approachable. As I move through the making process, having help on the technical side lets me focus on the bigger creative vision. Knowing that I have technical assistance allows me to push the idea further.

Proposal

For my project, I want to build off last week’s sketch, where I played around with applying filters to regions bounded by my hands. The idea is a glass/mirror/window effect: users can blow on the screen to fog it up, then use their hand to wipe it away or draw shapes.
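Here’s a rough p5.js sketch of the core fog-and-wipe interaction, with the mouse standing in for a fingertip and a key press standing in for the blow (the ML pieces come next):

```js
// Rough sketch of the fog + wipe interaction.
// Mouse = fingertip stand-in; any key press = "blow" stand-in.
let fog;

function setup() {
  createCanvas(640, 480);
  // Offscreen layer that holds the condensation
  fog = createGraphics(640, 480);
  fog.background(200, 200, 210, 230); // start fully fogged
}

function draw() {
  background(30); // placeholder for the webcam/mirror image underneath
  // Wipe: erase a soft circle from the fog layer wherever the "finger" is
  if (mouseIsPressed) {
    fog.erase();
    fog.circle(mouseX, mouseY, 60);
    fog.noErase();
  }
  image(fog, 0, 0);
}

function keyPressed() {
  // "Blow": re-fog the whole pane
  fog.background(200, 200, 210, 230);
}
```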

I’ll use FaceMesh to detect when the user makes a blowing gesture with their mouth, and HandPose to track the user’s fingers.
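A rough setup sketch, assuming the current ml5.js API (ml5.faceMesh / ml5.handPose with detectStart). The “blowing” check is just a placeholder heuristic based on the inner-lip keypoints, and the thresholds are guesses I’d need to tune:

```js
// Detection setup sketch (ml5.js faceMesh + handPose).
let video, faceMesh, handPose;
let faces = [];
let hands = [];

function preload() {
  faceMesh = ml5.faceMesh({ maxFaces: 1 });
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  faceMesh.detectStart(video, results => (faces = results));
  handPose.detectStart(video, results => (hands = results));
}

function isBlowing() {
  if (faces.length === 0) return false;
  const k = faces[0].keypoints;
  // 13 / 14 should be the upper/lower inner-lip points in the MediaPipe
  // mesh (to be verified); a small-but-open mouth stands in for blowing.
  const mouthGap = dist(k[13].x, k[13].y, k[14].x, k[14].y);
  return mouthGap > 5 && mouthGap < 25; // thresholds to tune
}

function fingertip() {
  if (hands.length === 0) return null;
  // Keypoint 8 is the index fingertip in the 21-point hand model
  return hands[0].keypoints[8];
}

function draw() {
  image(video, 0, 0, width, height);
  const tip = fingertip();
  if (tip) circle(tip.x, tip.y, 20);
  if (isBlowing()) text('blowing?', 10, 20);
}
```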

To make it more complex and realistic, I want to play around with particle systems for the fog and water droplets, and see if I can create a reflective glass/mirror effect.
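As a starting point for the droplets, something like this toy particle class could slide down the fog layer and carve thin clear trails out of it with erase() (very much a sketch, not a final implementation):

```js
// Toy particle for condensation droplets on the fog layer.
class Droplet {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.r = random(2, 5);
    this.speed = random(0.5, 2);
  }

  update() {
    this.y += this.speed;        // gravity pulls it down the glass
    this.x += random(-0.3, 0.3); // slight wobble
  }

  // Carve the droplet's trail out of the fog graphics layer
  wipe(fog) {
    fog.erase();
    fog.circle(this.x, this.y, this.r * 2);
    fog.noErase();
  }

  offscreen() {
    return this.y > height + this.r;
  }
}

// In draw(): spawn a few droplets while isBlowing() is true, then
// update + wipe each one and remove the ones that run off the pane.
```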