From video description:
In this video we demonstrate a custom eye tracker for Glass that can be built for $25, and the use of a Makey Makey as an input device for Glass.
Glass supports touch gestures (e.g., swipe, tap, scroll), head gestures (e.g., tilting your head up turns the display on, tilting it down turns it off), and voice controls (e.g., "ok glass", voice input). By reading the IMU sensors directly (as we show here) it's simple to extend the range of head gestures; a sketch follows below. Glass also has a proximity sensor that it uses to determine whether the device is being worn. That sensor can recognize wink/blink gestures, but it cannot actually track the user's gaze.
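To make the IMU approach concrete, here is a minimal WearScript sketch: it polls the gyroscope and maps a fast head rotation to a spoken response. The overall structure and the WS.sensorOn / WS.say calls follow WearScript's published examples, but the axis choice, threshold, and "head flick" gesture are illustrative assumptions, not the exact code from the video.

    function server() {
      var last = 0;
      // Poll the gyroscope every 100 ms; data.values holds angular velocity.
      WS.sensorOn('gyroscope', .1, function (data) {
        var now = Date.now();
        if (now - last < 1000) return;  // debounce repeated triggers
        // Treat a fast rotation about one axis as a "head flick".
        // The axis index (1) and threshold (3 rad/s) are guesses to tune.
        if (Math.abs(data.values[1]) > 3) {
          last = now;
          WS.say(data.values[1] > 0 ? 'flick one way' : 'flick the other');
        }
      });
    }
    function main() {
      if (WS.scriptVersion(1)) return;
      WS.serverConnect('{{WSUrl}}', server);
    }
    window.onload = main;

The WS object is injected by the WearScript runtime on Glass, so no imports are needed; the real examples and gesture code live in the repository linked below.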
We hope that these new input methods can expand when Glass is useful (e.g., with your hands full) and who can use it (e.g., users with disabilities). They are intended for developers and researchers; we don't intend for people to use our eye tracker while walking around. It's essentially a very cheap and easy way for all of us to have these features before they are integrated into the device directly (eventually some manufacturer will do it), and if we find interesting use cases it may even advance the timeline for their inclusion. All of the code and 3D models are available at http://www.WearScript.com.
http://blog.brandynwhite.com/new-glass-input-methods_eye-tracking_touch-sensitive-clothing