Of all the experiences we have in life, face-to-face interaction fills many of our most meaningful moments. The complex interplay of facial expressions, eye gaze, head movements, and vocalizations in quickly evolving "social interaction loops" has enormous influence on how a situation will unfold. From birth, these interactions are a fundamental element of learning and lay the foundation for successful social and emotional functioning throughout life.
What are the underlying processes from which this most human form of interaction emerges? Will we be able to interact with computers in a face-to-face way that feels natural? This article discusses the unique challenges of realistically simulating the appearance and behavior of the face to create interactive autonomous virtual human models that support naturalistic learning and have the "illusion of life." We describe our recent progress toward this goal with "BabyX," an autonomously animated psycho-biological model of a virtual infant. While our immediate aim is to explore the drivers of facial behavior, we expect this foundational approach also has the potential to yield more "human" computer interfaces. Finally, we describe the "Auckland Face Simulator," which we are developing to broaden this work beyond infants and to give adult conversational agents a more realistic face and a greater biological basis.