Educational technology can be used not only to automatically monitor and tailor students’ cognitive progress through lessons, but also to reduce threats to students’ public self-esteem and need for autonomy. People respond to computers and technology as if they were social beings, suggesting that we can design the features of our intelligent tutoring systems to elicit social responses beneficial to help-seeking and learning.
This proposal builds on my previous work exploring the social presence of human and robotic pedagogical agents, their perceived social roles (teacher or helper), and their impact on learning in one-on-one tutoring situations.
In this talk, I propose additional experiments examining the practical implications of the social presence and role of pedagogical dialogue agents, as well as the causal mechanism of public threat to self-esteem. My work attempts to answer the following questions: (1) Can we increase learner help-seeking by reducing the social presence of our educational technology? (2) Is public threat to self-esteem the mechanism behind this change in behavior? and (3) How does it relate to threats to student autonomy? These findings will contribute recommendations for automated tutor design as well as further our understanding of how threats to self-esteem and autonomy influence student help-seeking behavior in interactive learning environments.
Carolyn Penstein Rosé (Chair)
Marsha Lovett (Psychology)
Stuart Karabenick (University of Michigan)