Beyond your Reflection with Elizabeth C. White

Beyond your Reflection

Beyond your Reflection is an interactive mirror installation that uses a Kinect, a two-way reflective acrylic mirror, and Processing. It acts as a stand-alone mirror, but when you get close enough (within a specified depth range from the Kinect), the screen behind the mirror lights up to show a sketch of you.
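The trigger is basically a depth-window check in the Processing sketch. Here's a minimal sketch of that idea, assuming the Open Kinect for Processing library and a first-generation Kinect; the thresholds and pixel count are placeholder numbers you'd tune by standing in front of it:

```
// Minimal depth-trigger sketch (Processing + Open Kinect for Processing).
// Thresholds and the pixel count are assumptions to tune by hand.
import org.openkinect.processing.*;

Kinect kinect;
int nearThreshold = 500;   // raw depth units (0-2047 on a Kinect v1)
int farThreshold  = 900;   // past this, the piece stays a plain mirror

void setup() {
  fullScreen();
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  background(0);                      // dark screen = the mirror wins
  int[] depth = kinect.getRawDepth();
  int count = 0;
  for (int d : depth) {
    if (d > nearThreshold && d < farThreshold) count++;
  }
  // Enough pixels inside the depth window means someone is close,
  // so light the screen up behind the two-way mirror.
  if (count > 5000) {
    image(kinect.getDepthImage(), 0, 0, width, height);
  }
}
```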

This is far from our original concept of controlling light and audio outputs based on the proximity of dancers in a live performance, but the core idea holds the same thought. Instead of ultrasonic range-finding sensors we are using a Kinect, although it could also be done with a webcam, with 2D pixelation sketches in place of the depth-sensitive ones.

Requirements - Kinect, Monitor/Display/Screen, 2-way reflective acrylic, Arduino

Tools  - Processing, Arduino IDE

Making - We laser-cut the acrylic mirror to fit the screen exactly, leaving the frame of the Acer monitor. We then used black acrylic to make a frame to cover the Kinect and monitor, and we mounted the Kinect to the top of the monitor with industrial-strength adhesive.

frame.jpg

Neopixels and proximity

Elizabeth's blog has the proximity sensor testing videos!

The NeoPixels consume a lot of power, so they need to be powered separately; our next attempt is controlling multiple NeoPixels with one Arduino.

We ran into a lot of errors, like avrdude upload failures and the Arduino crashing because of the lack of power. We then #included the <avr/power.h> lib. I don't completely understand what it does, but the NeoPixels seem to behave better with it included.
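For reference, in Adafruit's strandtest example <avr/power.h> is only actually used to reset the clock prescaler on a 16 MHz Trinket; on an Uno the include is harmless. A minimal single-strip sketch along those lines, with the data pin, pixel count, and color as assumptions:

```
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
  #include <avr/power.h>   // clock/peripheral power control (the lib in question)
#endif

#define PIN        6       // data pin (assumption)
#define NUM_PIXELS 30      // strip length (assumption)

Adafruit_NeoPixel strip(NUM_PIXELS, PIN, NEO_GRB + NEO_KHZ800);

void setup() {
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);   // only needed on a 16 MHz Trinket
#endif
  strip.begin();
  strip.show();                      // start with all pixels off
}

void loop() {
  // Strip on its own 5V supply (grounds tied together), so full
  // brightness doesn't brown out the Arduino.
  for (int i = 0; i < NUM_PIXELS; i++) {
    strip.setPixelColor(i, strip.Color(255, 0, 128));
  }
  strip.show();
}
```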

Our to-do list for this weekend is:

  1. Check other micro-controllers besides Uno for multiple neopixels
  2. Attach multiple neopixels and control them with the proximity sensor
  3. Fabricate the lenticular surface and mirror and light enclosure
  4. Test the projection mapping system in case that is a good addition
  5. Test Skanect and check whether the 3D character can be controlled by the proximity sensor

Basically, A LOT.

Let's see what works and what doesn't so we can have a good user-testing session next week!

Project Description & Bill of Materials

PROJECT DESCRIPTION

The interactive dance piece is a solo performance choreographically depicting desire and fear, metaphorically pointing at the need to be close yet fearing the uncertainty of closeness through movements of push and pull. The closer the dancer is to the sensor, the brighter the display becomes; the farther away they are, the dimmer it becomes.

A belt of ultrasonic sensors around the LED display will sense the dancer's proximity and use that information to brighten or dim the LEDs.

(Rephrased from Elizabeth's description)
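In code, that mapping is just distance in, PWM out. A one-sensor sketch of the idea, assuming an HC-SR04-style ultrasonic sensor and a single PWM-dimmed LED standing in for the display (pins and distance range are placeholders):

```
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int LED_PIN  = 5;    // must be a PWM-capable pin

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // Fire a 10 us trigger pulse and time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // us; 0 = timeout
  long cm = (duration == 0) ? 300 : duration / 58;  // rough us-to-cm

  // Closer dancer -> brighter; farther -> dimmer.
  int brightness = map(constrain(cm, 10, 300), 10, 300, 255, 0);
  analogWrite(LED_PIN, brightness);
  delay(50);
}
```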

SYSTEM DIAGRAM

fullsizeoutput_5.jpeg
fullsizeoutput_7.jpeg

PLAYTESTING 2.0

After our first in-class play test, we realized we needed to do another, more accurate one, so we picked two LED lights, one with a red filter, to depict two dancers and their distance from the center piece, since our initial idea involved two dancers.

BILL OF MATERIALS

Bill-of-materials.png

Play testing 1.0

playtesting 1.jpg

We are working on an interactive live dance performance piece with two dancers. The choreography revolves around the difficulties and eases of getting close to someone. The central piece we plan to make will have ultrasonic sensors that detect whether a dancer is close to the piece or far away, and it will brighten or dim accordingly. We play tested this in class and got some important feedback.

We do, however, need to play test several scenarios, adjusting different parameters to see what works best. Ideally the primary users of this interaction would be the dancers, but the audience is an integral part of it too. So we have layers of users and need to be cognizant of that fact.

We plan on using the sensors and LEDs on the piece for the basic version, and we can then work on bringing in the Kinect and projection mapping in a later phase.

-- A project-in-the-making by Elizabeth White and Rushali

Midterm Project: Trial Round

For midterms I was partnered with Jasmine, and together we brainstormed and settled on making a paper towel counter! So every time a towel gets pulled, a display shows the count and maybe a GIF of a tree falling or something rolling, or maybe plays the sound of thunder! Maybe this is about sustainability, or maybe our aim is just to make people feel awkward about using paper towels! Why? you ask. Why not? we say!

So the Paper Towel Dispenser looks like this -->

And on careful observation thou shalt notice springs! THAT STRETCH! Thou shalt also notice dust but that does not concern us at this moment.

The motion of the spring made it the perfect spot for a stretch sensor, and there is ample room to keep an Arduino in the dispenser as well. In case of a space crunch we could also consider an Arduino Mini. We also thought it would be ideal to hook it up to Bluetooth so our display could be outside the restroom instead of inside. We then found another paper towel dispenser outside the restroom, so I guess we shall use that.

 

pgr.JPG
pgr2.JPG

STEP TWO!!

Circuits and Testing

We first checked our sensor to see the values it gave, then messed around with a few resistors so that the values were spread far apart. We got a good range of 500-800 with a 2k ohm resistor!
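A sketch of how we'd turn that range into a pull counter, assuming the stretch sensor is one half of a voltage divider into A0; the thresholds lean on our ~500 resting / ~800 stretched readings, and the two-threshold hysteresis keeps one pull from counting twice:

```
const int SENSOR_PIN      = A0;
const int PULL_THRESHOLD  = 700;  // above this = spring stretched
const int RESET_THRESHOLD = 600;  // back below this = pull finished

int towelCount = 0;
bool pulling = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);
  if (!pulling && reading > PULL_THRESHOLD) {
    pulling = true;                // rising edge: a pull just started
    towelCount++;
    Serial.println(towelCount);   // whatever drives the display reads this
  } else if (pulling && reading < RESET_THRESHOLD) {
    pulling = false;               // spring relaxed; ready for the next one
  }
  delay(10);
}
```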

We then checked the Bluetooth module to see how it functions, connected its TX to its own RX, and saw that it was sending data! But we shall get back to the Bluetooth module later.
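For when we do get back to it: a simple passthrough sketch makes poking at the module easy. Everything here beyond the TX/RX idea is an assumption (SoftwareSerial pins, an HC-05/06-style module, 9600 baud):

```
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);  // RX, TX on the Arduino side (assumption)

void setup() {
  Serial.begin(9600);   // USB serial monitor
  bt.begin(9600);       // many modules default to 9600 baud
}

void loop() {
  // Relay bytes both ways, so you can type at the module from the
  // serial monitor and watch what it sends back.
  if (bt.available())     Serial.write(bt.read());
  if (Serial.available()) bt.write(Serial.read());
}
```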

STEP THREE !!!

Our p5 sketch shows GIFs on a trigger - it fetches them from Giphy using its API -

click on it -->

Serial communication - please speak my baud!

The asynchronous serial communication lab had us printing values in hex, decimal, ASCII, and binary.

Then we sent the values of the x and y axes of the accelerometer and the high and low values of the switch.
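The formatting half of the lab boils down to the different Serial print calls. A minimal version, with A0 as an assumed input:

```
void setup() {
  Serial.begin(9600);
}

void loop() {
  int val = analogRead(A0);    // 0-1023
  Serial.println(val, DEC);    // decimal, e.g. "512"
  Serial.println(val, HEX);    // hex, e.g. "200"
  Serial.println(val, BIN);    // binary, e.g. "1000000000"
  Serial.write(val / 4);       // one raw byte; the monitor shows it as ASCII
  delay(500);
}
```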

The next lab was connecting the pot to a p5 sketch, which we had already done in Synthesis.

When I did it again for the lab, I put a delay() in the Arduino sketch and saw what a terrible lag it caused!!
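The lag happens because delay() blocks while new readings pile up in the serial buffer, so the p5 sketch is always reading stale values. One common fix is the call-and-response (handshaking) pattern: only send when asked. A minimal Arduino half:

```
void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    Serial.read();                    // consume the request byte from p5
    Serial.println(analogRead(A0));   // reply with one fresh pot reading
  }
}
```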

Computational Media Synthesis

This week we controlled our ICM animations with a physical input. We first used the potentiometer, trying analog input to make a ball move along the x-axis and digital input to make it disappear.

Daniella, who I was partnered with, and I decided to try out most of our sensors, and we realized the values we get are very unstable and noisy. The pot seemed to be the most stable one.

We finally decided to use our accelerometer and make the ball move along with our gesture, the way it does with mouseX and mouseY. We succeeded in translating the X value to the x-axis by mapping the values, and we even stabilized our output. But we weren't able to do the same for the y-axis, because the Arduino was sending the X and Y values together.
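One common fix is to send both readings on a single comma-separated line ending in a newline, so the receiving sketch can split the string on the comma and get X and Y back as separate values. A sketch of the Arduino half, with the analog pins as assumptions:

```
const int X_PIN = A0;
const int Y_PIN = A1;

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print(analogRead(X_PIN));
  Serial.print(",");
  Serial.println(analogRead(Y_PIN));  // the newline marks the end of a pair
  delay(10);
}
```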