This is a collaborative project we’ve been working on in the lab, along with Dr. Marilyn Walker, Dr. Michael Neff, and their teams, that explores the production and perception of nonverbal behaviors in autonomous, interactive virtual agents (IVAs for short). We’re particularly interested in two aspects of an agent’s nonverbal expressive behavior: 1) whether an agent’s gestures are perceived as cues to its personality, so that we can script agents with particular styles of expression and particular personality types, and 2) whether interacting with an agent will lead a human user to adapt their communicative style to match the agent’s, just as two humans adapt to each other over the course of a conversation.
These research goals have led to the construction of a number of expressive agents, which participants in our lab either observe and interpret or interact with directly. The interactive experiment has been particularly challenging and fun, involving the creation of a sort of Wizard-of-Oz setup in our lab so that the research assistants and I can control the agent’s behavior behind the scenes while the participant interacts with it.
This work is still ongoing, but we hope to share some interesting developments soon!
Liu, K., Tolins, J., Fox Tree, J. E., Walker, M., & Neff, M. (2013). Judging IVA personality using an open-ended question. In R. Aylett et al. (Eds.), Springer’s Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (LNCS/LNAI), LNAI 8108 (pp. 396–405). Heidelberg: Springer-Verlag.
Tolins, J., Liu, K., Wang, Y.-Y., Fox Tree, J. E., Neff, M., & Walker, M. (2013). Gestural adaptation in extravert-introvert pairs and implications for IVAs. In R. Aylett et al. (Eds.), Springer’s Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (LNCS/LNAI), LNAI 8108 (pp. 481–482). Heidelberg: Springer-Verlag.