PoseNet Art is an interactive app where the user’s motion activates real-time animations and sounds. It is meant to be used for creative and entertainment purposes.
PoseNet is a machine learning model for real-time pose estimation. We used the PoseNet implementation from ml5.js, p5.js to create the animations, and Tone.js to create the sound effects.
PoseNet Art was created by three MIT students in the fall of 2020 as part of the inaugural cohort of AI@MIT Labs:
- Daniel Dangond
- Kathleen Esfahany
- Sanja Simonovikj
Live demo: https://posenet-art.netlify.app/

In the app, you can:
- clap to make fire and smoke
- clap to make lightning
- raise hands to sparkle
- choose from 6 different background sounds
- change master volume
- toggle sounds on/off
- enjoy a set of sound effects paired with each visual effect
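To illustrate how a gesture like "raise hands" can be recognized from PoseNet output, here is a minimal sketch (not the app's actual code) of a pure function that checks whether both wrist keypoints sit above the nose keypoint. The keypoint shape follows the PoseNet convention of `{ part, position: { x, y }, score }` objects, with y growing downward:

```javascript
// Sketch only: `keypoints` is assumed to be a PoseNet-style array
// of { part, position: { x, y }, score } objects (y grows downward).
function findPart(keypoints, part) {
  return keypoints.find((k) => k.part === part);
}

// Returns true when both wrists are above the nose and all three
// keypoints were detected with at least `minScore` confidence.
function handsRaised(keypoints, minScore = 0.5) {
  const nose = findPart(keypoints, "nose");
  const leftWrist = findPart(keypoints, "leftWrist");
  const rightWrist = findPart(keypoints, "rightWrist");
  if (![nose, leftWrist, rightWrist].every((k) => k && k.score >= minScore)) {
    return false; // missing or low-confidence keypoints: no gesture
  }
  return (
    leftWrist.position.y < nose.position.y &&
    rightWrist.position.y < nose.position.y
  );
}
```

A predicate like this can gate an effect, e.g. firing the sparkle animation only while it returns true.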
To test the app locally:
- Run a local Python server:

  ```
  python3 -m http.server
  ```

- Navigate to `localhost:8000` in your browser.
- `index.html`: the homepage / entry point
- `poseEngine.js`: PoseNet model initialization and progression; triggers event listeners for effects
- `main.js`: defines event listeners for effects
- `drawingEngine.js`: functions needed for drawing visual effects
- `effects.js`: implementation of the visual effects
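The split between the pose engine (which triggers events) and the effect listeners suggests a simple event-driven design. Here is a hypothetical sketch of that pattern using the standard `EventTarget` API; the names `poseEvents`, `handsRaised`, and `onGestureDetected` are illustrative, not the app's actual identifiers:

```javascript
// Illustrative only: a shared event bus decouples pose detection
// from the effects that react to it.
const poseEvents = new EventTarget();

// Listener side (cf. main.js): register an effect for a gesture event.
poseEvents.addEventListener("handsRaised", () => {
  // e.g. start the sparkle visual and its paired sound effect here
  console.log("sparkle!");
});

// Engine side (cf. poseEngine.js): dispatch the event when the
// pose model reports the corresponding gesture.
function onGestureDetected(gesture) {
  poseEvents.dispatchEvent(new Event(gesture));
}
```

Keeping detection and effects on opposite sides of an event bus means new effects can be added without touching the pose-estimation code.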