Measuring Student Engagement in STEM Classes

Introduction:

How can we use students' expressions of engagement, conveyed through non-verbal signs such as facial expressions, eye and body movements, posture, and physiological reactions, to enhance learning? The purpose of this project is to improve student learning through the automated capture of these non-verbal cues of engagement.

Goals:

  1. Establishing a robust network of non-obtrusive, non-invasive sensors in mid-size classes to enable real-time extraction of facial information and vital signs, integrated and displayed on an instructor's dashboard (a sketch of this data flow appears after this list).
  2. Identifying robust descriptors for modeling the emotional and behavioral components of engagement using data collected by the sensor network.
  3. Gathering meaningful data for subsequent work on emotional, behavioral, and cognitive metrics of engagement.
  4. Exploring the effectiveness of artificial intelligence and machine learning for delivering education and training in STEM subjects.
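
To make Goal 1 concrete, here is a minimal sketch in Python, with hypothetical names (StudentRecord, class_snapshot; none come from the project's code), of how per-student records from the sensor network might be aggregated into the class-level summary shown on an instructor's dashboard.

    # Hypothetical sketch of the sensor-to-dashboard data flow in Goal 1:
    # each sensor node emits per-student feature records, which are
    # aggregated into a class-level snapshot for the instructor's dashboard.
    from dataclasses import dataclass
    from statistics import mean
    from typing import List

    @dataclass
    class StudentRecord:
        """One time-stamped observation of one student from the sensor network."""
        student_id: int
        timestamp: float           # seconds since the start of class
        gaze_on_board: bool        # is the student's gaze directed at the board?
        head_movement: float       # magnitude of head motion, normalized to 0..1
        expression_valence: float  # facial-expression valence, -1 (negative) .. +1 (positive)

    def class_snapshot(records: List[StudentRecord]) -> dict:
        """Aggregate recent records into the summary a dashboard might display."""
        if not records:
            return {"attention": 0.0, "valence": 0.0, "students": 0}
        return {
            "attention": mean(1.0 if r.gaze_on_board else 0.0 for r in records),
            "valence": mean(r.expression_valence for r in records),
            "students": len({r.student_id for r in records}),
        }

    if __name__ == "__main__":
        demo = [
            StudentRecord(1, 60.0, True, 0.1, 0.4),
            StudentRecord(2, 60.5, False, 0.6, -0.2),
        ]
        print(class_snapshot(demo))  # {'attention': 0.5, 'valence': 0.1..., 'students': 2}

In a deployed system the records would arrive as a live stream from the classroom sensors rather than as a static list.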


Methods:

Fig. 1 shows data collection for the behavioral and emotional components of engagement:

  • eye-gaze
  • head movement
  • hand movement
  • facial expressions.

Videos captured by webcams are used to extract facial information and eye-gaze. Video captured by a wall-mounted camera is used to track hand and body movement. A 3D reconstruction shows the students' heads and gazes relative to the teacher's board; a sketch of one step in this reconstruction follows.
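
As an illustration of that reconstruction step, the sketch below estimates head pose from 2D facial landmarks with OpenCV's solvePnP, a standard technique for this task. The six-point landmark layout, the generic 3D face model, and the focal-length guess are assumptions made for the example, not the project's calibrated pipeline.

    # Minimal head-pose estimation sketch (assumed setup, not the project's
    # actual pipeline): map six pre-detected 2D facial landmarks onto a
    # generic 3D face model to recover the head's orientation in camera space.
    import numpy as np
    import cv2

    # Generic 3D face model points, in millimeters.
    MODEL_POINTS = np.array([
        (0.0, 0.0, 0.0),           # nose tip
        (0.0, -330.0, -65.0),      # chin
        (-225.0, 170.0, -135.0),   # left eye outer corner
        (225.0, 170.0, -135.0),    # right eye outer corner
        (-150.0, -150.0, -125.0),  # left mouth corner
        (150.0, -150.0, -125.0),   # right mouth corner
    ], dtype=np.float64)

    def head_pose(landmarks_2d: np.ndarray, frame_w: int, frame_h: int):
        """Return rotation and translation vectors of the head in camera coordinates.

        landmarks_2d: (6, 2) array of pixel coordinates matching MODEL_POINTS.
        """
        focal = frame_w  # crude focal-length guess for an uncalibrated webcam
        camera_matrix = np.array([
            [focal, 0, frame_w / 2],
            [0, focal, frame_h / 2],
            [0, 0, 1],
        ], dtype=np.float64)
        dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
        ok, rvec, tvec = cv2.solvePnP(
            MODEL_POINTS, landmarks_2d.astype(np.float64),
            camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
        return (rvec, tvec) if ok else (None, None)

The returned rotation and translation vectors place the head in camera coordinates; composing them with the known camera-to-board geometry gives the head direction relative to the board.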

Results:

Fig. 1. Sample of measuring student engagement.

Publications:

  1. I. Alkabbany, A. Ali, A. Farag, I. Bennett, M. Ghanoum and A. Farag, “Measuring Student Engagement Level Using Facial Information,” 2019 IEEE International Conference on Image Processing (ICIP), 2019, pp. 3337-3341, doi: 10.1109/ICIP.2019.8803590.

Acknowledgement:

This research has been supported by NSF Award #1900456.
