Dr. Dan Bohus (Microsoft Research)


Title: Attention and Gaze in Situated Language Interaction


The ability to engage in natural language interaction in physically situated settings hinges on a set of competencies such as managing conversational engagement, turn taking, understanding, language and behavior generation, and interaction planning. In human-human interaction these are mixed-initiative, collaborative processes that often involve a wide array of finely coordinated verbal and non-verbal actions. Among the many channels involved, eye gaze, and more generally attention, plays a fundamental role.

In this talk, I will discuss a sample of the research we have conducted over the last couple of years on developing models to support physically situated dialog in relatively unconstrained environments. Throughout, I will highlight the role that gaze and attention play in these models, both in interpretation and in generation. I will discuss and showcase several prototype systems we have developed, and describe opportunities for reasoning about, interpreting, and producing gaze signals in support of fluid, seamless spoken language interaction.