How can we use students’ expressions of engagement, based on non-verbal signs such as facial expressions, body and eye movements, physiological reactions, and posture, to enhance learning? The purpose of this project is to improve student learning through the automated capture of non-verbal cues of engagement.
Fig. 1 shows data collection for behavioral and emotional engagement:
Videos captured by webcams are used to extract facial information and eye gaze. Video captured by a wall-mounted camera is used to track hand and body movement. A 3D reconstruction shows the students’ heads and gaze directions relative to the teacher’s board.
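The 3D reconstruction step above can be sketched as a ray–plane intersection: given an estimated head position and gaze direction for a student, project the gaze onto the plane of the teacher's board. This is a minimal illustrative sketch, assuming head pose and gaze have already been estimated upstream; the function name and coordinate conventions are hypothetical, not from the project itself.

```python
import numpy as np

def gaze_on_board(head_pos, gaze_dir, board_point, board_normal):
    """Intersect a student's gaze ray with the plane of the teacher's board.

    head_pos: 3D head position (meters); gaze_dir: gaze direction vector;
    board_point: any point on the board plane; board_normal: plane normal.
    Returns the 3D intersection point, or None if the gaze is parallel
    to the board plane or the board lies behind the student.
    """
    head_pos = np.asarray(head_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)  # normalize direction
    denom = np.dot(gaze_dir, board_normal)
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the board plane
    # Distance along the gaze ray to the plane
    t = np.dot(np.asarray(board_point, dtype=float) - head_pos, board_normal) / denom
    if t < 0:
        return None  # board is behind the student
    return head_pos + t * gaze_dir
```

For example, a student seated at (0, 1.2, 3) looking along −z toward a board in the z = 0 plane would have a gaze point at (0, 1.2, 0) on the board.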
This research has been supported by NSF Award #1900456.