Role: Researcher and Designer
Team: Worked closely with Dave Merkoski, former Creative Director at Frog Design
Duration: 6 months
I worked on a new interaction language for the eyes, helping define the equivalent of select, swipe, and zoom through eye movements for Eyefluence, a startup creating eye-control technology for augmented and virtual reality devices. I explored the applications of their technology in head-mounted devices. Eyefluence was acquired by Google in 2016.
What applications exist for eye-controlled head-mounted devices?
I explored this question over six months with a former creative director at frog design, Dave Merkoski. Dave helped structure and advise the research while I conducted it.
I reviewed all official and unofficial Google Glass applications to evaluate what already existed, and surveyed mobile augmented reality and location-based networking apps. I also explored the interaction model and design patterns of Tobii, one of the largest developers of eye-tracking technology for laptops and tablets, and reviewed academic research on eye movements to inform my exploration of interaction models.
Story Writing and Personal Interviews
I wrote 300 stories, each a sentence long, exploring how a person might use an eye-controlled head-mounted device. I organized the stories into clusters to identify opportunity areas, then interviewed 15 individuals to explore these areas from different perspectives. I had each interviewee try on Google Glass to understand what a head-mounted device was; I used Glass because it was the most accessible device at the time. After a few minutes, I removed Glass because I didn't want participants getting too familiar with it. I replaced it with a sheet of plastic that I could easily write on, and then we role-played five scenarios. Because the interviews were exploratory, my goals were simple: What topics did interviewees respond to? When did they seem most excited and engaged? These observations led me to deeper insights.
Survey and Use Cases
Next, I surveyed 50 individuals to explore how they'd respond to the prompt: "You are Superman. You can do and control anything with your eyes. What do you want to do at work? At home? Outside? And with other people?" I documented their responses and organized them into insights. The team then brainstormed to translate the insights into use cases.
Dave and I worked closely with an engineer to prototype the most compelling use cases. The prototypes aren't shown due to confidentiality. Eyefluence was acquired by Google in October 2016.