The initial list of potential inputs we set out to test. From there, we determined which combination of IMU sensors could best detect each gesture.
We created a configuration app for Jet that surfaced live IMU sensor values, which let us test triggers against different angle thresholds. We then assigned an action to each trigger (e.g., opening a message on look up). A rough sketch of this trigger model follows below.
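To make the threshold-trigger idea concrete, here is a minimal sketch of how angle thresholds can map to actions. The threshold values, action names, and re-arming logic are all illustrative assumptions, not the configuration app's actual implementation; the synthetic pitch stream stands in for the live IMU feed from Jet.

```python
from dataclasses import dataclass
from typing import Callable, Iterable
import math


@dataclass
class Trigger:
    """A gesture trigger: fires its action when the pitch condition becomes true."""
    name: str
    condition: Callable[[float], bool]  # pitch angle in degrees -> fire?
    action: Callable[[], None]


def open_message() -> None:
    print("ACTION: open message")


def dismiss_notification() -> None:
    print("ACTION: dismiss notification")


# Hypothetical thresholds; in the configuration app these were tuned per gesture.
TRIGGERS = [
    Trigger("look_up", lambda pitch: pitch > 20.0, open_message),
    Trigger("look_down", lambda pitch: pitch < -20.0, dismiss_notification),
]


def run_triggers(pitch_stream: Iterable[float]) -> None:
    """Evaluate each trigger per sample, firing once per gesture and
    re-arming only after the head returns past the threshold."""
    armed = {t.name for t in TRIGGERS}
    for pitch in pitch_stream:
        for t in TRIGGERS:
            if t.condition(pitch):
                if t.name in armed:
                    t.action()
                    armed.discard(t.name)
            else:
                armed.add(t.name)


def demo_pitch_stream() -> Iterable[float]:
    """Synthetic pitch samples standing in for the live IMU feed."""
    return (30.0 * math.sin(i / 15.0) for i in range(200))


if __name__ == "__main__":
    run_triggers(demo_pitch_stream())
```

The one-shot arming step matters in practice: without it, a trigger would fire on every sensor sample while the head is held past the threshold, rather than once per gesture.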
Early Project Cirrus demo captures from Jet.
Screencasting Jet to a laptop to demonstrate subtle head gestures from Project Cirrus.