Show simple item record

Author (dc.contributor.author): Araya Schulz, Roberto
Author (dc.contributor.author): Sossa Rivera, Jorge Felipe
Admission date (dc.date.accessioned): 2021-12-16T20:20:14Z
Available date (dc.date.available): 2021-12-16T20:20:14Z
Publication date (dc.date.issued): 2021
Item citation (dc.identifier.citation): Frontiers in Robotics and AI, Volume 8, Article 729832. Published Sep 1, 2021
Identifier (dc.identifier.other): 10.3389/frobt.2021.729832
Identifier (dc.identifier.uri): https://repositorio.uchile.cl/handle/2250/183276
Abstract (dc.description.abstract): Detecting the direction of the gaze and the orientation of the body of both teacher and students is essential for estimating who is paying attention to whom. It also provides vital clues for understanding their unconscious, non-verbal behavior. These are called "honest signals," since they are unconscious, subtle patterns in our interactions with other people that help reveal the focus of our attention. Inside the classroom, they provide important clues about teaching practices and students' responses to different conscious and unconscious teaching strategies. Scanning this non-verbal behavior in the classroom can give the teacher important feedback to help them improve their teaching practices. This type of analysis usually requires sophisticated eye-tracking equipment, motion sensors, or multiple cameras. However, for this to be a useful tool in the teacher's daily practice, an alternative must be found that uses only a smartphone: the one instrument a teacher always has at their disposal and that is nowadays considered truly ubiquitous. Our study looks at data from a group of first-grade classrooms. We show how video recordings made on a teacher's smartphone can be used to estimate the direction of the teacher's and students' gaze, as well as their body orientation. Using the output of the OpenPose software, we run Machine Learning (ML) algorithms to train an estimator to recognize the direction of the students' gaze and their body orientation. We found that the level of accuracy achieved is comparable to that of human observers watching frames from the videos. The root mean square errors (RMSE) of the predicted pitch and yaw angles for head and body directions are on average 11% lower than the RMSE between human annotators. However, our solution is much faster, avoids the tedium of manual annotation, and makes it possible to design solutions that give the teacher feedback as soon as they finish the class.
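The abstract describes a pipeline in which OpenPose keypoints are fed to an ML regressor that predicts head and body pitch/yaw angles, evaluated by RMSE against human annotations. The sketch below is a minimal, hypothetical illustration of that kind of estimator, not the authors' code: it assumes OpenPose BODY_25 keypoints flattened into a 75-value feature vector per person, and it substitutes a scikit-learn RandomForestRegressor and synthetic placeholder data for the paper's unspecified model and dataset.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Placeholder data standing in for OpenPose output: 500 detected persons,
# 25 BODY_25 keypoints, each contributing (x, y, confidence) -> 75 features.
X = rng.random((500, 75))
# Targets standing in for human-annotated head pitch and yaw angles (degrees).
y = rng.uniform(-90, 90, size=(500, 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Multi-output regression: one model predicts both pitch and yaw.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
# Per-angle RMSE, the metric the abstract compares against human annotators.
rmse = np.sqrt(mean_squared_error(y_test, pred, multioutput="raw_values"))
print(f"RMSE pitch: {rmse[0]:.2f} deg, yaw: {rmse[1]:.2f} deg")

With real data, X would come from OpenPose's per-frame JSON keypoints and y from the annotated angles; the RMSE printed here would then be directly comparable to the inter-annotator RMSE mentioned in the abstract.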
Sponsor (dc.description.sponsorship): ANID/PIA/Basal Funds for Centers of Excellence FB0003
Language (dc.language.iso): en
Publisher (dc.publisher): Frontiers Media
Type of license (dc.rights): Attribution-NonCommercial-NoDerivs 3.0 United States
Link to license (dc.rights.uri): http://creativecommons.org/licenses/by-nc-nd/3.0/us/
Source (dc.source): Frontiers in Robotics and AI
Keywords (dc.subject): Gaze detection
Keywords (dc.subject): Body orientation detection
Keywords (dc.subject): Non-verbal behavior
Keywords (dc.subject): Teaching practices
Keywords (dc.subject): Student attention
Title (dc.title): Automatic detection of gaze and body orientation in elementary school classrooms
Document type (dc.type): Journal article
Version (dc.description.version): Published version - publisher's final version
Access rights (dcterms.accessRights): Open access
Cataloguer (uchile.catalogador): crb
Indexation (uchile.index): WoS publication article


