Eye Catcher
Using a combination of industrial robotics and high-power magnets, a seemingly inconspicuous frame on a wall magically comes to life. Through a series of experimental films, photography and physical prototypes, the project has investigated the primitive effects of eye (and eye-like) stimuli. In its conclusion, the Eye Catcher project has developed a novel expressive interface in which emotion-recognition algorithms read audience faces and in turn trigger the animation of a face formed of ferrofluid.
As people walk by, unaware of the interactive installation, they catch an unexpected movement out of the corner of their eye. Turning their head, they find an empty frame on the wall appearing to move towards them, and as they stop in disbelief it positions itself to look straight at them. Suddenly, from the murky black liquid sitting in the bottom of the frame, two primordial pupils rise up and seem to stare back at the viewer. A hidden pinhole camera in the frame captures the facial expressions of the onlooker, and the installation responds with a range of emotions crafted out of the subtle manipulation of motion cues. An uncanny and playful interaction is formed as expressions are exchanged.
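The sensing-to-actuation loop described above (hidden camera, emotion recognition, motion cues sent to the ferrofluid display) could be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the installation's documented implementation: OpenCV face detection, the placeholder classify_emotion() function and the serial link to the electromagnet/robotics controller are all assumed for the sake of the example.

```python
# Minimal sketch of a camera -> emotion -> actuation loop.
# Assumptions (not from the project): OpenCV face detection, a
# placeholder emotion classifier, and a hypothetical serial-connected
# controller driving the electromagnets behind the ferrofluid.
import cv2
import serial

camera = cv2.VideoCapture(0)                      # hidden pinhole camera
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
controller = serial.Serial("/dev/ttyUSB0", 9600)  # hypothetical magnet/robot controller


def classify_emotion(face_img):
    """Placeholder for the emotion-recognition model (assumed, not specified)."""
    return "neutral"


while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        emotion = classify_emotion(gray[y:y + h, x:x + w])
        # Map the detected expression to a motion cue for the ferrofluid "pupils".
        controller.write((emotion + "\n").encode())
```

In practice the mapping from a recognised expression to a choreography of magnet and frame movements is where the expressive character of the piece lies; the sketch only shows where that mapping would sit in the loop.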
Principal Researchers: Lin Zhang, Ran Xie
Supervisors: Ruairi Glynn and Dr Christopher Leung with William Bondin
KEY REFERENCES
Brooks, R. (1991). New approaches to robotics. Science, 253(5025), 1227-1232.
Farroni, T., Csibra, G., Simion, F., & Johnson, M. H. (2002). Eye contact detection in humans from birth. Proceedings of the National Academy of Sciences, 99(14), 9602-9605.
Frijda, N. H., & Swagerman, J. (1987). Can computers feel? Theory and design of an emotional system. Cognition and Emotion, 1(3), 235-257.
Gage, S. A. (2006). The wonder of trivial machines. Systems Research and Behavioral Science, 23(6), 771-778.
Glynn, R. (2008). Conversational environments revisited. In 19th Meeting of Cybernetics & Systems Research, Graz, Austria.
Glynn, R. (2014). Animating Architecture: Coupling High-Definition Sensing with High-Definition Actuation. Architectural Design, 84(1), 100-105.
Gregory, R. L. (1997). Eye and brain: The psychology of seeing. Princeton University Press.
Hess, E. H., & Polt, J. M. (1960). Pupil size as related to interest value of visual stimuli. Science.
Kowler, E. (2011). Eye movements: The past 25 years. Vision Research, 51(13), 1457-1483.
Langton, S. R., Watt, R. J., & Bruce, V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in Cognitive Sciences, 4(2), 50-59.
Loke, L., Larssen, A. T., Robertson, T., & Edwards, J. (2007). Understanding movement for interaction design: frameworks and approaches. Personal and Ubiquitous Computing, 11(8), 691-701.
Nettle, D., Nott, K., & Bateson, M. (2012). 'Cycle Thieves, We Are Watching You': Impact of a simple signage intervention against bicycle theft. PLoS ONE, 7(12), e51738.
Nicholson, N. (1998). How hardwired is human behavior? Harvard Business Review, 76, 134-147.
RELATED PROJECTS
Richard The and Willy Sengewald, 'Omnivisu' at the Warschauer Straße S/U-Bahn station in Berlin, Germany (2004)
Opto-Isolator by Golan Levin (2007)
All Eyes on You by Britzpetermann (2012)