



An invisible interface, controlled by subconscious behavior
In collaboration with Ka Hei Suen, Yuri Klebanov, and Y. Sato Lab
April 2017
-
Using computer vision, we developed a set of interfaces that trace an evolution from interfaces controlled by conscious, learned behaviors toward a premonitory one driven by subconscious human instinct. Three objects (a light, a speaker, and a fan) are controlled differently at each of three stations. At the first station, each object is controlled by a direct physical action performed with it on the table. At the second station, a combination of eye gaze and hand gesture controls the objects. At the final station, the objects observe a person's behavior and adjust themselves accordingly.
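The second station's gaze-plus-gesture scheme can be sketched as a simple fusion rule: gaze dwell selects which object to address, and a hand gesture selects what it should do. This is a minimal, hypothetical sketch; the names, gestures, and dwell threshold are illustrative assumptions, not details of the installation's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical gaze/gesture fusion. All names and thresholds here are
# illustrative assumptions, not the installation's real code.

@dataclass
class GazeSample:
    target: str        # object the gaze ray currently intersects
    dwell_ms: float    # how long the gaze has rested on that object

def select_action(gaze: GazeSample, gesture: str,
                  dwell_threshold_ms: float = 500.0):
    """Gaze picks WHICH object; gesture picks WHAT it should do.
    Returns (object, command), or None if no stable selection yet."""
    if gaze.dwell_ms < dwell_threshold_ms:
        return None                      # gaze has not settled on a target
    commands = {"open_palm": "on", "closed_fist": "off"}
    cmd = commands.get(gesture)
    return (gaze.target, cmd) if cmd else None

# Looking at the fan for 0.8 s with an open palm turns it on.
print(select_action(GazeSample("fan", 800.0), "open_palm"))    # ('fan', 'on')
print(select_action(GazeSample("light", 200.0), "open_palm"))  # None
```

Separating selection (gaze) from actuation (gesture) avoids accidental triggers: neither signal alone changes an object's state.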
Photos by Gottingham
-