Beyond your Reflection with Elizabeth C. White

Beyond Your Reflection is an interactive mirror installation that uses a Kinect, a two-way reflective acrylic mirror, and Processing. It acts as a standalone mirror, but when you get close enough (within a specified depth range from the Kinect) the screen behind the mirror lights up to show a sketch of you.
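The reveal behavior can be sketched as a small decision function. This is plain JavaScript for readability, and the names, the 500-1500 mm range, and the averaging step are illustrative assumptions, not our exact Processing code:

```javascript
// Hypothetical helper: decide whether the hidden sketch should be shown,
// given depth readings (in mm) from the Kinect for pixels near the
// center of the frame. Threshold values are placeholders.
function shouldRevealSketch(depths, minDepth = 500, maxDepth = 1500) {
  if (depths.length === 0) return false; // nothing in front of the mirror
  const avg = depths.reduce((sum, d) => sum + d, 0) / depths.length;
  return avg >= minDepth && avg <= maxDepth; // viewer is in range
}
```

In the actual sketch, the same kind of check would gate whether the drawing routine runs or the screen stays dark behind the mirror.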

This is far from our original concept of controlling light and audio outputs based on the proximity of dancers in a live performance, but the core idea holds. Instead of ultrasonic range-finding sensors we are using a Kinect (though it could also be done with a webcam), and instead of depth-sensitive sketches we could use 2D pixelation sketches.

Requirements - Kinect, Monitor/Display/Screen, 2-way reflective acrylic, Arduino

Tools - Processing, Arduino IDE

Making - We laser-cut the acrylic mirror to fit the screen exactly, leaving the frame of the Acer monitor exposed. We then used black acrylic to make a frame that covers the Kinect and the monitor, and mounted the Kinect to the top of the monitor with industrial-strength adhesive.



MemoryBox.Space is meant to be a social networking site. It captures an image using your webcam or phone camera and pastes it onto a box within a 3D world, where people can navigate around and look through these boxes.

The project was inspired by , which was built on three.js. This project used the p5.js WEBGL library.

My code is uploaded on GitHub here.
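To give a feel for the layout step, here is a hypothetical helper (not the actual MemoryBox code) that spreads captured-image boxes over a grid centered on the origin; a p5 WEBGL sketch could translate() to each returned position and draw a textured box there:

```javascript
// Illustrative only: lay `count` boxes out on a square grid in 3D space,
// returning the (x, y, z) position each box should be drawn at.
function boxPositions(count, spacing = 200) {
  const cols = Math.ceil(Math.sqrt(count)); // boxes per row
  const positions = [];
  for (let i = 0; i < count; i++) {
    const col = i % cols;
    const row = Math.floor(i / cols);
    positions.push({
      x: (col - (cols - 1) / 2) * spacing, // center the grid on the origin
      y: 0,                                // keep all boxes on one "floor"
      z: (row - (cols - 1) / 2) * spacing,
    });
  }
  return positions;
}
```

The spacing value is arbitrary; in a real sketch it would depend on the box size and how far apart you want viewers to walk between memories.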

Computational media final project ideas

The following are my ideas for the ICM finals -

  1. Lyrics Visualization - Based on Robert Hodgin's Solar project, I wanted to make an interface that visualizes the lyrics of various artists based on the most recurring word in their albums. I found a lyrics API that fetches albums, artists, and their lyrics.
  2. Ketto Data Analysis - I wanted to visualize the data collected by Ketto, a crowdfunding start-up I worked for, and make predictions based on patterns. The data would include donors and campaigners and would revolve around where the users are based, how much they tend to donate, etc.
  3. Visualizations for pcomp project - Ideally, I'd love to combine physical computing and ICM, so we could make a few sketches and visualizations controlled by the proximity sensor, giving the live performance more detailed interactivity.
  4. 3D social media space - I wanted to explore more of WEBGL and how we could exploit it. Our current social media sites are very text- and image-based, and recently more video is being incorporated, but I envisioned Facebook in a 3D world and wanted to make a version of it. A website that this idea was inspired by is
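The lyrics-visualization idea (1) ultimately reduces to counting word frequencies across an album. A minimal sketch of that counting step, assuming the lyrics have already been fetched from the API as plain strings (the function name is made up):

```javascript
// Illustrative word counter: given lyric strings, return the n most
// frequently used words across all of them.
function topWords(lyrics, n = 3) {
  const counts = {};
  for (const song of lyrics) {
    // lowercase, then pull out runs of letters/apostrophes as words
    for (const word of song.toLowerCase().match(/[a-z']+/g) || []) {
      counts[word] = (counts[word] || 0) + 1;
    }
  }
  return Object.entries(counts)
    .sort((a, b) => b[1] - a[1]) // most frequent first
    .slice(0, n)
    .map(([word]) => word);
}
```

The visualization layer would then size or orbit shapes based on these counts, in the spirit of Solar.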

Final project ideas: draft

For physical computing finals, Elizabeth and I have teamed up to make an interactive live dance performance piece. We are still in the 'figuring-out' stage, so we need to finalize a lot of parameters, but we would like to incorporate computation into the projection mapping and the proximity-sensitive light piece we plan to make.

In case my physical computing project does not require a lot of computation, I plan to make a data visualization app that uses the lyrics API, which will go through all the songs of an artist and return their most used words/phrases - maybe even a list of their most recurring words. I still need to go through a few projects and research this more!

Making DOM things

This was my playground to try out some of the DOM elements available. I have yet to add other elements and functionality, like recording sound!

The div that I called in the sketch seems to be wrong, because it was meant to change the color of the background and not that of the button!!

The capture button works only if the page is opened over https://. I don't know how iframes work, so maybe it won't pop up on this blog! Also, the video capture was running throughout the sketch instead of only when the button was clicked; something about an if statement in the draw() function did not resonate with my logic :(

Hopefully, by the time I explore p5 sound, the change-music button will work!!

Computational Media Synthesis

This week we controlled our ICM animations with a physical input. We first used the potentiometer, trying analog and digital input to make a ball move along the x-axis and to make it disappear, respectively.

Daniella, who I was partnered with, and I decided to try out most of our sensors, and we realized the values we get are very unstable and jumpy. The pot seemed to be the most stable one.

We finally decided to use our accelerometer to make the ball move along with our gestures, the way it does with mouseX and mouseY. We succeeded in translating the X reading to the x-axis by mapping the values, and we even stabilized our output. But we weren't able to do the same for the y-axis, because the Arduino was sending the X and Y values together.
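One common fix for readings arriving together, sketched here as an assumption rather than what we actually got working: have the Arduino print both values on one comma-separated line (e.g. "512,387") and split the line in the sketch, then rescale each value with a map()-style function:

```javascript
// Hypothetical parser for a serial line like "512,387\n": split on the
// comma and convert each half to a number.
function parseReading(line) {
  const [x, y] = line.trim().split(",").map(Number);
  return { x, y };
}

// Same idea as p5's map(): rescale a raw 0-1023 analog reading into a
// pixel range for the ball's position.
function mapValue(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) * (outMax - outMin)) / (inMax - inMin);
}
```

With this split, the X half drives the ball's x position and the Y half its y position, instead of both values fighting over one axis.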

We like to move it, move it...

Last week's conclusion was that bezier() is not the most intuitive way of figuring out curves - which can be worked around using mouseX and mouseY. Display the position of the mouse, move it to where your curve needs to go, and then use those coordinates in your function!

The problem is that there is one mouse and two control points, so I tried doing things with the keyboard using keyPressed(). It's difficult, and I have not reached a solution yet, but it seems like it could work.
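One direction that might work (a sketch of an idea, not a finished solution): keep both control points in the sketch and use a key press to toggle which one currently follows the mouse, so a single mouse can position two points in turn:

```javascript
// Hypothetical state holder for two bezier control points, where only
// the "active" point follows the mouse at any given time.
function makeControlPoints() {
  const points = [{ x: 0, y: 0 }, { x: 0, y: 0 }];
  let active = 0; // index of the point currently following the mouse
  return {
    // call from p5's keyPressed() to switch which point the mouse moves
    toggle() { active = 1 - active; },
    // call from draw() with the current mouseX / mouseY
    follow(mx, my) { points[active].x = mx; points[active].y = my; },
    points,
  };
}
```

In a p5 sketch, draw() would call follow(mouseX, mouseY) every frame and pass both points into bezier(), while keyPressed() calls toggle().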

For this week we have to use map(), random(), and mouseX, mouseY.

I made this abstract sketch, which messes with shapes, colors, and backgrounds -


What confused me was the last condition -

One element that is different every time you run the sketch.

What does this element have to be? A physical one, like a shape, or something like a delay?
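My current guess at an answer (just my interpretation, not the assignment's official one): anything chosen with random() once at startup qualifies, because it stays fixed for the whole run but differs between runs. For example, a background shade picked in setup():

```javascript
// Pick a grayscale background shade once, at setup time; the value is
// constant within a run but different each time the sketch starts.
function pickBackgroundShade() {
  return Math.floor(Math.random() * 256); // 0-255
}
```

A shape's size, count, or starting position chosen the same way would satisfy the condition just as well.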

Let's find out...