For our midterm assignment in PComp we had to take some type of interaction and give it a physical interface. Edson (my team partner) and I were both interested in dance and decided voguing could be something fun to try to give a physical interface. This meant trying to quantify the movement of voguing, where one is judged on fierceness, attitude, and other subjective qualities. How do you create an algorithm based on parameters like these?
First we discussed the many ways to capture the movement. We decided to zero in on the arm, since most of the movement is initiated by this limb. We ordered a bunch of flex sensors and some screw terminals, started prototyping our design, and dove into the world of voguing. Here is a short video documenting this journey:
To give voguing a physical interface we attached three flex sensors to each dancer's wrist, elbow, and armpit. Processing read the values of these sensors and drove three on-screen sliders per dancer, one for each joint, indicating how much that joint was flexed. Each slider moves along its bar in proportion to the sensor value, which is mapped over the length of the bar.
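The mapping behind each slider can be sketched in plain Java. This is just an illustration of the idea, not our actual sketch: the 0–1023 sensor range and the 300-pixel bar width here are assumptions, not values from our project.

```java
public class SliderDemo {
    // The same linear interpolation as Processing's map() function.
    static float map(float value, float inMin, float inMax,
                     float outMin, float outMax) {
        return outMin + (value - inMin) * (outMax - outMin) / (inMax - inMin);
    }

    // Position of a slider along its bar for a given flex-sensor reading.
    // Assumes a 10-bit analog reading (0-1023); real flex sensors rarely
    // sweep the full range, so a calibrated min/max would replace 0 and 1023.
    static float sliderX(int sensorValue, float barWidth) {
        return map(sensorValue, 0, 1023, 0, barWidth);
    }

    public static void main(String[] args) {
        // A roughly half-flexed joint lands the slider near the middle
        // of a 300-pixel bar.
        System.out.println(sliderX(512, 300f));
    }
}
```

In the real sketch this runs once per sensor per frame, so each of the three sliders tracks its joint continuously as the dancer moves.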
The code base went through as many iterations as the prototype did. For the final project we scaled everything back, because we wanted what appeared on screen to correspond exactly to what the dancer was doing. I would be happy to share the entire code base if there are any requests! Here is the final, slimmed down version of the code: