EmbodyPaint
An interactive shoe and glove that generate real-time visualizations on the web or in an AR environment.
By Rosalie Lin
Advisors: Neil Gershenfeld (founder of the MIT Center for Bits and Atoms),
Nathan Melenbrink (Harvard GSD), Iulian Radu (Harvard GSE)
EmbodyPaint aims to turn choreography into calligraphy: people can visualize their steps on a virtual canvas and create a unique art piece! It maps body movement data, such as rotation and movement speed, from an interactive sensing shoe to real-time visualizations in the web browser and in AR.
The technical core of the project is reading data from an accelerometer, sending it to the browser client over a WebSocket, and building the wireless networking that makes the wearable completely cable-free!
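As a rough sketch of that pipeline, the firmware below reads an inertial sensor and broadcasts each sample to connected clients over a WebSocket. It assumes an ESP32-class board, an MPU6050 accelerometer/gyroscope, and the Adafruit_MPU6050 and arduinoWebSockets libraries; the board, sensor, and libraries actually used in the project may differ, so treat this only as an illustration of the read-then-broadcast loop.

```cpp
// Minimal sketch of the sensing shoe's data pipeline (assumed parts: ESP32,
// MPU6050, Adafruit_MPU6050 + arduinoWebSockets libraries).
#include <Wire.h>
#include <WiFi.h>
#include <WebSocketsServer.h>
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>

const char* WIFI_SSID = "your-network";    // placeholder credentials
const char* WIFI_PASS = "your-password";

Adafruit_MPU6050 mpu;                      // accelerometer + gyroscope over I2C
WebSocketsServer webSocket(81);            // WebSocket server on port 81

void setup() {
  Serial.begin(115200);

  // Bring up the IMU.
  if (!mpu.begin()) {
    Serial.println("MPU6050 not found");
    while (true) delay(10);
  }

  // Join the local Wi-Fi network so the browser can reach the shoe without cables.
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  Serial.println(WiFi.localIP());

  webSocket.begin();
}

void loop() {
  webSocket.loop();                        // service connected WebSocket clients

  // Read acceleration (m/s^2) and rotation rate (rad/s).
  sensors_event_t accel, gyro, temp;
  mpu.getEvent(&accel, &gyro, &temp);

  // Pack the sample into a small JSON message for the web/AR client.
  char payload[128];
  snprintf(payload, sizeof(payload),
           "{\"ax\":%.2f,\"ay\":%.2f,\"az\":%.2f,"
           "\"gx\":%.2f,\"gy\":%.2f,\"gz\":%.2f}",
           accel.acceleration.x, accel.acceleration.y, accel.acceleration.z,
           gyro.gyro.x, gyro.gyro.y, gyro.gyro.z);
  webSocket.broadcastTXT(payload);

  delay(20);                               // ~50 Hz update rate
}
```

On the web/AR side, the client then only needs to open a WebSocket connection to the shoe's IP address and map each incoming JSON sample onto the canvas.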
How To Make Almost Anything
In Fall 2021, I took the course How to Make Almost Anything at MIT, where I gained the superpower of integrating CAD, fabrication, electronics, and programming in one project. Coming from a design background, I'm now comfortable with electronics (circuit design, PCB soldering, validation) and programming (inputs, outputs, interfaces, networking). This class was a catalyst that transformed me from a designer into a design-engineer!
Some project highlights:
Week 2: Computer-Aided Cutting
Week 3: 3D Scanning and Printing
Week 7: Embedded Programming
Week 8: Molding and Casting
Week 9: Input Devices
Week 14: Final Project
Idea
The initial idea came from turning choreography into calligraphy with wireless wearables: people can visualize their dancing steps on a virtual canvas and create a unique art piece! To add more layers to the visualization of each step, the position, the height between the shoe and the ground, the pressure with which the foot hits the ground, the orientation, and so on can all become parameters of the 'painter'. The final goal is to let multiple users visualize their dance/art pieces live, so that the visualization loops back and influences how they dance.
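As one concrete (and purely hypothetical) way to picture that 'painter', the sketch below maps a few movement parameters to brush properties; the names, units, and ranges are assumptions for illustration, not the project's actual design.

```cpp
// Illustrative only: a hypothetical mapping from movement parameters to
// brush properties for the virtual 'painter'.
#include <algorithm>
#include <cmath>

struct BrushState {
  float width;    // stroke width in pixels
  float hue;      // color hue, 0-360 degrees
  float opacity;  // 0 (transparent) to 1 (opaque)
};

// speed: foot speed in m/s, height: shoe-to-ground distance in cm,
// pressure: normalized ground impact (0-1), yaw: orientation in degrees,
// assumed to lie in [-180, 180].
BrushState mapMovementToBrush(float speed, float height, float pressure, float yaw) {
  BrushState b;
  b.width   = 2.0f + 20.0f * std::min(pressure, 1.0f)          // harder steps paint thicker...
            + 4.0f * std::min(speed, 2.0f);                     // ...and faster movement broadens them
  b.hue     = std::fmod(yaw + 360.0f, 360.0f);                  // facing direction picks the color
  b.opacity = std::clamp(1.0f - height / 30.0f, 0.2f, 1.0f);    // strokes fade as the foot lifts
  return b;
}
```

In the real system a mapping like this would live in the web/AR client, with each connected dancer feeding it their own sensor stream.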