FaceTime is an interactive timepiece that incorporates the viewer's face into the time display. It is built with OpenFrameworks using real-time face-tracking video processing.
FaceTime has two scenes. The first shows a black screen with the current time drawn as a series of straight lines connected to a bright red mouth shape; the mouth tracks the viewer's, and opening it very wide makes the lines thicker and brighter. The second scene tracks the viewer's face, crops an image around it, and redraws that crop on screen multiple times with a delay, as well as behind each numeral of the clock display.
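The mapping from mouth openness to line weight in the first scene could be sketched as a clamped linear map. This is a minimal, self-contained version of that idea; in the app the openness value would come from the face tracker (ofxFaceTracker exposes mouth gestures), and the thresholds here are illustrative guesses, not the values used in the piece.

```cpp
#include <algorithm>

// Map a mouth-openness measurement to a stroke weight, clamped to a range.
// In the app this input would come from the face tracker; the thresholds
// (1.0 = closed, 5.0 = wide open) are hypothetical.
float mouthToLineWidth(float mouthHeight,
                       float closedHeight = 1.0f, float openHeight = 5.0f,
                       float minWidth = 1.0f, float maxWidth = 8.0f) {
    float t = (mouthHeight - closedHeight) / (openHeight - closedHeight);
    t = std::clamp(t, 0.0f, 1.0f);           // keep the blend factor in [0, 1]
    return minWidth + t * (maxWidth - minWidth);
}
```

The same factor `t` could also scale the brightness of the lines, so both effects track the mouth together.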
Concept and background research
The work of Zach Lieberman inspired me greatly. He has done a fantastic amount of work around face tracking, in which the viewer's face, or parts of it, is remixed and altered in real time. Some of these pieces have recently been added to Instagram as filters, which are now very popular. I am also really interested in time as a concept, in how we relate to it, and in timepieces that show the passing of time as an integral part of the artwork. I wanted this work to incorporate those ideas. The focus of my idea was that as people view the timepiece, they become integrated with it. Ideally, the time spent looking at the work would be reflected in how long their face continues to appear in the work after they have left.
There's a lot more development I could add to this work. I started on the ability to switch scenes, but only had time to implement two. Ideally, I would add many more scenes, as well as more dynamic variability within each scene. I'd also like the scene parameters to change with the time, so that the piece reflects when the person is viewing it: at night it would show a different style of interaction than during the day, and perhaps a different style in winter than in summer.
Overall I was pleased with the work. At the exhibition, it was really enjoyable to watch people interacting with the piece without their knowing I was the author, and to see how they understood and worked with it. The mouth clock scene seemed more popular than the webcam delay, so I left it mostly playing that scene. People understood the mode of interaction quickly, but the face tracking itself wasn't that reliable. There is another library, OpenFace, which I think might be more reliable, but it wasn't clear how easily I could use it with my existing OpenFrameworks code, so I didn't try it. My code was quite well structured, but I'm still struggling to understand how to pass references around in C++, which limited the modularity of the code and made it harder to reuse code between scenes.
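One common pattern for the reference-passing problem mentioned above is to keep shared state (such as the tracker results) in one object owned by the main app, and have each scene hold a reference to it, bound in the constructor's initializer list. The names below are hypothetical; this is a sketch of the pattern, not the project's code.

```cpp
// Hypothetical shared state; in the real app this would wrap the face
// tracker and webcam grabber owned by the main app.
struct TrackerState {
    float mouthHeight = 0.0f;
    bool faceFound = false;
};

// The scene stores a reference, not a copy, so it always sees the
// tracker's latest values. The referred-to object must outlive the scene
// (e.g. both are members of the main app class), and the reference must
// be bound in the constructor's initializer list.
class ClockScene {
public:
    explicit ClockScene(TrackerState& state) : tracker(state) {}
    float lineWidth() const {
        return tracker.faceFound ? 1.0f + tracker.mouthHeight : 1.0f;
    }
private:
    TrackerState& tracker;  // reference member: shared, never copied
};
```

Because every scene refers to the same TrackerState, the tracking code runs once per frame and all scenes read the results, which is exactly the kind of reuse between scenes that copying the state would prevent.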
- ofxFaceTracker - https://github.com/kylemcdonald/ofxFaceTracker
- ofxFaceTracker2 - https://github.com/HalfdanJ/ofxFaceTracker2
- ofxDatGui - http://braitsch.github.io/ofxDatGui/
- CLAHE implementation code adapted from https://gist.github.com/gu-ma/eae2f72e740631a31b20eb8b2810c370