Ishan Sain

pixel pond

 
 
 

Overview

Pixel Pond is an installation that prototypes digital spatial interactions. The project explores the possibility of a digital fish pond that people can interact with using their smartphones. It uses a camera-projection system to render an environment of digital fish that react to human intervention.

This project was done in collaboration with Aaron McKenzie.

Objective

To build a prototype for spatial interactions around the theme of Enchanted Environments: an experience that is inviting, playful, and joyous to interact with.

Skills

Prototyping, Programming 

Tools

Processing, Tramontana

 
 
 

Experience it through video

 
 
 
 

How it works

The rig is composed of five main components. The first is a custom-built wooden table with a spandex fabric top. The other four are digital components, connected to each other indirectly, that together drive the whole experience.

 
 
 

Components

The computer works in the background. It receives input from the camera and from the human (via the phone) and translates it into reactions projected on the tabletop.

The camera sits above the table and looks down to locate the phone relative to the table's dimensions.

The phone acts as the primary touchpoint for humans and allows them to intervene in the digital world of the fish. It connects to the computer wirelessly.

The projector rests under the table and projects the visuals up onto the tabletop.

The tabletop holds the white, stretchy spandex neoprene fabric, which is translucent enough for the projection to be viewed clearly from the other side.

 
 
 
 

Detection through camera

The first core aspect of the interaction is the camera locating the phone over the tabletop. The camera identifies a marker displayed on the phone's screen, and that location is then used to manipulate the digital surface of the table.
For example, the projected square follows the phone in the animation on the right.

Similarly, in ‘pixel pond’ the phone is used to drop fish food on any point of the pond.
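At its core, this step boils down to mapping the marker's position in the camera frame to a position on the projected surface. Below is a minimal sketch of that mapping in Python (the project itself was built in Processing); the function and parameter names are illustrative, and it assumes the camera frame lines up with the tabletop edges.

```python
def camera_to_table(mx, my, cam_w, cam_h, table_w, table_h):
    """Map a marker position in camera pixels to tabletop coordinates.

    (mx, my):          marker centre reported by the camera, in pixels
    cam_w, cam_h:      camera frame size in pixels
    table_w, table_h:  projected tabletop size
    Assumes the camera sees the whole tabletop roughly head-on; a real
    rig would also need a perspective correction.
    """
    return (mx / cam_w * table_w, my / cam_h * table_h)
```

With a 640x480 camera frame and a 1280x960 projection, a marker detected at (320, 240) maps to the centre of the table.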

 

[Animation: the projected square following the phone over the table]
 
 
 
 

Interaction through phone

The second core aspect of the interaction is leveraging the phone’s screen to trigger actions. With the help of Tramontana, the phone wirelessly sends events such as screen taps to the computer. Combined with the previous aspect, this lets us act on different locations on the digital surface of the tabletop.
For example, tapping the phone’s screen places the square, as shown in the animation on the right.

Similarly, we use the phone to perform multiple actions on any point of the pond.
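Pairing the two aspects is simple on the computer's side: the camera supplies the phone's latest location, and the phone supplies the tap. A minimal Python sketch of that pairing follows; the class and method names are made up for illustration, and the actual wireless plumbing (handled by Tramontana in the project) is not shown.

```python
class Pond:
    """Holds the state that the projector renders on the tabletop."""

    def __init__(self):
        self.phone_pos = None  # last phone location seen by the camera
        self.food = []         # fish food dropped onto the pond

    def on_marker(self, x, y):
        # Called each frame the camera detects the phone's marker.
        self.phone_pos = (x, y)

    def on_tap(self):
        # Called when the phone reports a screen tap: drop food
        # wherever the phone currently hovers over the table.
        if self.phone_pos is not None:
            self.food.append(self.phone_pos)
```

Calling `on_marker(0.3, 0.7)` and then `on_tap()` drops food at (0.3, 0.7); a tap with no marker in view does nothing.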

 

[Animation: tapping the phone’s screen placing the square]
 
 

Process

 
 
 

MATERIALS

After getting a sense on paper of how each component connects, we started testing each of them individually. Developing the first version of the digital fish in code allowed us to try out different materials to project on. After that, we moved on to testing the dimensions and placement of all the components of the system.

 

DIMENSIONS

After settling on spandex neoprene fabric as our tabletop material, we brought the camera into the scene. A standard webcam on a tall tripod made it convenient to try the camera at different heights. Similarly, we worked out the height at which the spandex would sit as the tabletop, since the projector rests on the ground and cannot be moved vertically.

 

PUTTING IT ALL TOGETHER

We built the table from scrap wood to the dimensions we had worked out. After joining all the physical pieces of the system together, we focused on refining its digital structure. From there, we iterated on the implementation of the interactions and explored the possibilities that would serve our goal.

 
 
 
 

Reflection

Calibrating to real-world variables

When working with spatial interactions (especially ones that involve a camera), real-world variables such as lighting can change what the camera perceives. I learned that it is vital to build a mechanism for calibrating to the real world early on.
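One simple form of such calibration is to re-measure a couple of known reference points whenever the rig or the lighting changes, and derive the camera-to-table mapping from those measurements instead of hard-coding it. The Python sketch below illustrates the idea under the assumption of no perspective skew; the project's own Processing setup differed, and a production rig would fit a four-point homography instead.

```python
def calibrate(cam_pts, table_pts):
    """Build a camera-to-table mapping from two measured reference points.

    cam_pts, table_pts: [(x0, y0), (x1, y1)] pairs describing where two
    known tabletop landmarks appear in the camera frame and where they
    actually sit on the table. Fits a per-axis linear map, so it assumes
    the camera has no perspective skew relative to the table.
    """
    (cx0, cy0), (cx1, cy1) = cam_pts
    (tx0, ty0), (tx1, ty1) = table_pts
    sx = (tx1 - tx0) / (cx1 - cx0)  # horizontal scale
    sy = (ty1 - ty0) / (cy1 - cy0)  # vertical scale
    return lambda x, y: (tx0 + (x - cx0) * sx, ty0 + (y - cy0) * sy)
```

Re-running `calibrate` after moving the camera or changing the lights takes seconds, whereas chasing a silently drifted hard-coded mapping does not.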

Measuring direct impact

Since iteration is an essential aspect of prototyping, being able to test the prototype with people lends much more value to improving the interactions. Measuring direct impact let me evaluate how dropping the fish food, or any technical glitch, affected the reactions of the people using it, and then iterate toward a better version.