Interaction Design for Challenging Environments

How should a digital assistant help you reserve a restaurant or send a text while you’re dashing around campus or town? Can legible text and large buttons add efficiency to an existing voice interface without overly distracting a driver?

These are the kinds of questions I’ve been grappling with for around 20 years.

I started at speech recognition pioneer Dragon Systems, working on grammar design and localization for the voice-activated navigation system in the 2001 Jaguar X-Type. At Mitsubishi Electric Research Labs a few years later, I contributed to the navigation system for the 2006+ Mercedes C-Class, and led soup-to-nuts design, engineering, and user research for a series of voice and multi-modal application prototypes (see papers/patents here).

Later I managed a 5-person engineering team working on the GUI for the Dragon Drive demonstration app at Nuance. We developed a new interaction technique that allowed “no look” operation of touchscreens while driving. We called it Bullseye, because the whole screen acted as one big touch target.

"Bullseye" low-attention touch UI schematic
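To give a flavor of the idea, here is a minimal sketch of a Bullseye-style gesture classifier. It is not the actual Dragon Drive code; the command names and thresholds are assumptions for illustration. The key property is that a gesture is interpreted by its direction alone, never by where on the screen the finger lands:

```typescript
// Hypothetical sketch of the "Bullseye" principle: the whole screen is
// one big touch target, so commands are chosen by swipe direction,
// not touch position. Names and thresholds are illustrative only.

type Command = "next" | "previous" | "volumeUp" | "volumeDown" | "select";

// Classify a completed gesture by its displacement (dx, dy) alone.
function classifyGesture(dx: number, dy: number, tapRadius = 10): Command {
  // A short movement anywhere on screen counts as a tap, i.e. "select".
  if (Math.hypot(dx, dy) < tapRadius) return "select";
  // Otherwise the dominant axis of the swipe picks the command.
  return Math.abs(dx) > Math.abs(dy)
    ? (dx > 0 ? "next" : "previous")        // horizontal swipe: navigate
    : (dy > 0 ? "volumeDown" : "volumeUp"); // vertical swipe: adjust volume
}
```

Because the classifier ignores where the gesture started, a driver can operate the screen by feel, without glancing down to aim at a specific button.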

Upon moving to Apple in 2013, I became the Siri design lead for CarPlay. Among other things, I explored, user-tested, and spec’d a variety of improvements to Siri’s business search and mapping features. One small but important contribution was the inclusion of a “more with Siri” button at the end of length-limited lists. (The CarPlay protocol gives manufacturers the option of forcing the Music app to show at most 10 items in lists.)