Automatic detection of gaze and body orientation in elementary school classrooms
Article
Access note
Open access
Publication date
2021
Author
Araya Schulz, Roberto
Abstract
Detecting the direction of the gaze and the orientation of the body of both the teacher and the students is essential to estimate who is paying attention to whom. It also provides vital clues for understanding their unconscious, non-verbal behavior. These are called "honest signals", since they are subtle, unconscious patterns in our interaction with other people that help reveal the focus of our attention. Inside the classroom, they provide important clues about teaching practices and students' responses to different conscious and unconscious teaching strategies. Scanning this non-verbal behavior in the classroom can give the teacher important feedback for improving their teaching practices. This type of analysis usually requires sophisticated eye-tracking equipment, motion sensors, or multiple cameras. However, for this to be a useful tool in the teacher's daily practice, an alternative must be found that uses only a smartphone: the one instrument a teacher always has at their disposal and that is nowadays considered truly ubiquitous. Our study looks at data from a group of first-grade classrooms. We show how video recordings made on a teacher's smartphone can be used to estimate the direction of the teacher's and students' gaze, as well as their body orientation. Using the output of the OpenPose software, we run Machine Learning (ML) algorithms to train an estimator that recognizes the direction of the students' gaze and their body orientation. We found that the accuracy achieved is comparable to that of human observers watching frames from the videos. The root mean square errors (RMSE) of the predicted pitch and yaw angles for head and body directions are on average 11% lower than the RMSE between human annotators. However, our solution is much faster, avoids the tedium of manual annotation, and makes it possible to design solutions that give the teacher feedback as soon as they finish the class.
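The pipeline described in the abstract (OpenPose keypoints fed to an ML estimator of head and body angles, evaluated by RMSE against human annotations) can be illustrated with a minimal sketch. The sketch below is not the authors' code: it assumes a hypothetical CSV of per-person OpenPose keypoints with human-annotated pitch and yaw angles, and it uses scikit-learn's RandomForestRegressor, whereas the paper only states that ML algorithms were used; the file name, column names, and model choice are illustrative assumptions.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Hypothetical dataset: one row per detected person per frame, with normalized
# OpenPose keypoint coordinates (columns x_nose, y_nose, x_neck, ...) and the
# head pitch/yaw angles (in degrees) annotated by a human observer.
df = pd.read_csv("keypoints_with_annotations.csv")  # placeholder file name
feature_cols = [c for c in df.columns if c.startswith(("x_", "y_"))]
X = df[feature_cols].to_numpy()
y = df[["head_pitch", "head_yaw"]].to_numpy()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Multi-output regressor mapping 2D keypoints to head angles; the model choice
# is an assumption, since the abstract only mentions "ML algorithms".
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# RMSE of the predictions, the same metric the abstract compares against the
# RMSE between human annotators.
pred = model.predict(X_test)
rmse = np.sqrt(mean_squared_error(y_test, pred))
print(f"Head-angle RMSE on held-out frames: {rmse:.1f} degrees")

The same scheme extends to body orientation by adding torso keypoints (shoulders, hips) as features and the annotated body pitch/yaw as additional targets.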
Sponsor
ANID/PIA/Basal Funds for Centers of Excellence FB0003
Indexation
Article published in Web of Science (WoS)
Citation
Frontiers in Robotics and AI, Volume 8, Article 729832, published September 1, 2021.