Axolotl was conceived and developed as an academic project at NYU's ITP program. The name 'Axolotl' was inspired by a short story by Julio Cortázar about a man who develops a deep emotional obsession with the passive creatures he observes in an aquarium. Axolotl is an interactive installation built around a life-like algorithmic animation of a creature that responds to the presence of observers. The project explores the possibilities of portraying life through motion and form; it aims to make people reflect on the way we perceive life, and to examine what emotional response the perception of a life-form can evoke when the observer knows it is artificial. The creature of Axolotl is projected onto an aquarium, an object associated with watching animals, be it fish or others, which serves the purpose of the piece.

The choice to design an underwater creature allowed for greater freedom of motion in all dimensions and made it possible to come up with an abstract shape that could still be perceived as a living being.
The entire animation and interaction pipeline of Axolotl was built in the Processing environment.
Motion: Oscillating sine waves were the main driving force in animating the creature, complemented by many small features that added the desired life-like, unpredictable nuances of motion. The animation code incorporates calculations of physical forces such as gravity and inertia, which allowed the creature to move in a smooth, organic way and made the tentacles respond to spatial motion in three dimensions.
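The combination of a sine-wave driver with gravity, inertia, and damping can be sketched roughly as follows. This is an illustrative example, not the original Axolotl source: a single tentacle segment carries velocity between frames (inertia) and is pulled both toward an oscillating target and downward by gravity, so it lags and settles organically rather than tracking the wave exactly.

```java
// Illustrative sketch of sine-driven motion with simple physics
// (hypothetical names and constants, not from the installation code).
public class TentacleSegment {
    double position = 0.0;   // current displacement of the segment
    double velocity = 0.0;   // carried between frames -> inertia
    final double stiffness = 0.02; // pull toward the oscillating target
    final double damping = 0.95;   // energy lost per frame
    final double gravity = 0.001;  // constant downward pull

    // Advance one animation frame; t is elapsed time, amp/freq shape the wave.
    void update(double t, double amp, double freq) {
        double target = amp * Math.sin(freq * t);       // oscillating rest point
        double force = (target - position) * stiffness  // spring toward target
                     - gravity;                         // plus gravity
        velocity = (velocity + force) * damping;        // inertia with damping
        position += velocity;
    }

    public static void main(String[] args) {
        TentacleSegment seg = new TentacleSegment();
        for (int frame = 0; frame < 300; frame++) {
            seg.update(frame * 0.1, 50.0, 0.5);
        }
        System.out.printf("position after 300 frames: %.2f%n", seg.position);
    }
}
```

Because the damping never fully cancels the velocity, each segment overshoots and eases back, which is what reads as organic rather than mechanical motion.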

Installation and Sensing: To make the animation responsive, face detection is used as an indication that observers are present and looking at the creature. This aspect was also built in Processing, using the OpenCV framework. It took a lot of experimentation to fine-tune the threshold values in order to achieve the desired sensitivity. In addition, motion-tracking code was implemented to keep the face detection smooth and consistent.
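One common way to achieve that kind of smoothing, sketched here as an assumption rather than the project's actual code, is to run an exponential moving average over the detector's jittery face positions and to hold the last known position for a few frames when detection briefly drops out:

```java
// Hypothetical smoothing/tracking layer over raw face-detection output.
public class FaceTracker {
    double x, y;                 // smoothed face centre
    boolean initialized = false;
    int framesSinceSeen = 9999;
    final double alpha = 0.2;    // smoothing factor: lower = smoother, laggier
    final int holdFrames = 15;   // frames to trust the last detection

    // Call once per frame; detected=false means the detector found no face.
    void update(boolean detected, double rawX, double rawY) {
        if (detected) {
            if (!initialized) {       // jump straight to the first sighting
                x = rawX;
                y = rawY;
                initialized = true;
            } else {                  // ease toward the new measurement
                x += alpha * (rawX - x);
                y += alpha * (rawY - y);
            }
            framesSinceSeen = 0;
        } else {
            framesSinceSeen++;        // bridge short detection dropouts
        }
    }

    boolean facePresent() {
        return initialized && framesSinceSeen < holdFrames;
    }
}
```

The hold window matters for an installation: raw detectors flicker from frame to frame, and without it the creature would react to every momentary dropout rather than to the actual arrival and departure of observers.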
The two aspects - animation and sensing - communicate via the OSC (Open Sound Control) protocol.
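For reference, the wire format of such a message is simple. The sketch below hand-encodes a minimal OSC 1.0 message for illustration; in practice a Processing project would typically use an existing OSC library, and the address "/axolotl/faces" is a hypothetical name, not taken from the installation.

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal hand-rolled OSC 1.0 message encoder (illustrative only).
public class OscEncode {
    // OSC strings are null-terminated and padded to a 4-byte boundary.
    static void writePaddedString(ByteArrayOutputStream out, String s) {
        byte[] b = s.getBytes(StandardCharsets.US_ASCII);
        out.write(b, 0, b.length);
        int pad = 4 - (b.length % 4); // always at least one null terminator
        for (int i = 0; i < pad; i++) out.write(0);
    }

    // Encode a message with one int32 argument (e.g. number of faces seen).
    static byte[] encode(String address, int value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writePaddedString(out, address);
        writePaddedString(out, ",i"); // type tag string: one int32 argument
        // OSC numeric arguments are big-endian, ByteBuffer's default order.
        out.writeBytes(ByteBuffer.allocate(4).putInt(value).array());
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] msg = encode("/axolotl/faces", 2);
        System.out.println(msg.length); // total size is a multiple of 4
    }
}
```

The resulting byte array would be sent as a UDP datagram; the animation side matches on the address pattern and reads the argument.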
The whole piece is an integrated audio-visual experience, with sound elements tied to the different states and behaviors of the creature.
At this point the creature has two main behavior modes, one being 'shy but curious' and the other 'totally terrified'. More modes are being explored as of this writing and will be implemented soon.

Presented at the ITP Spring Show 2009

Created by Eyal Ohana and Filippo Vanucci
May 2009