Author (dc.contributor.author): Correa Pérez, Mauricio
Author (dc.contributor.author): Ruiz del Solar, Javier
Author (dc.contributor.author): Verschae Tannenbaum, Rodrigo
Admission date (dc.date.accessioned): 2016-05-22T04:07:55Z
Available date (dc.date.available): 2016-05-22T04:07:55Z
Publication date (dc.date.issued): 2016
Item citation (dc.identifier.citation): Pattern Recognition 52 (2016) 160–173
Identifier (dc.identifier.other): DOI: 10.1016/j.patcog.2015.11.008
Identifier (dc.identifier.uri): https://repositorio.uchile.cl/handle/2250/138414
General note (dc.description): ISI publication article
Abstract (dc.description.abstract): This paper proposes a new tool for the evaluation of face analysis systems under dynamic experimental conditions. The tool primarily consists of a virtual environment where a virtual agent (e.g., a simulated robot) carries out a face analysis process (e.g., face detection and recognition). This virtual agent can navigate in the virtual environment, where one or more subjects are present, and it can observe the subjects' faces from different distances and angles (yaw, pitch, and roll), and under different illumination conditions (indoor or outdoor). The current view of the agent, i.e., the image that the agent observes, is generated by composing real face and background images acquired prior to their usage in the virtual environment. In the virtual environment, different kinds of agents and agents' trajectories can be simulated, such as an agent navigating in a scene with people looking in different directions (mimicking a home-like environment), an agent performing a circular scanning (such as in a security checkpoint), or a camera-based surveillance system observing a person. In addition, during the recognition process the agent can actively change its viewpoint seeking to improve the recognition results. The proposed tool provides the developer with all the functionalities needed to build the evaluation scenario: a set of real face images with real background information, a virtual agent with navigation capabilities, a scenario configuration (number, position, and pose of the subjects to be observed), an agent trajectory definition, the generation of the simulated agent's view-dependent images, some basic active vision mechanisms, and the ground truth data (e.g., face id and pose for every observation), allowing the evaluation of face analysis methods under realistic conditions. Three usage examples are presented: the study of the robustness of face detection and face recognition methods under pose variations, and the evaluation of an integrated face analysis system to be used by a service robot. The proposed methodology may be of interest for researchers and developers of face analysis methods, in particular in the robotics and biometrics communities.
Sponsor (dc.description.sponsorship): FONDECYT-Chile 3120218, 1130153
Language (dc.language.iso): en
Publisher (dc.publisher): Elsevier
Type of license (dc.rights): Atribución-NoComercial-SinDerivadas 3.0 Chile
Link to license (dc.rights.uri): http://creativecommons.org/licenses/by-nc-nd/3.0/cl/
Keywords (dc.subject): Face analysis
Keywords (dc.subject): Face recognition
Keywords (dc.subject): Face recognition benchmark
Keywords (dc.subject): Evaluation methodologies
Keywords (dc.subject): Virtual simulation environment
Keywords (dc.subject): Simulator
Title (dc.title): A realistic virtual environment for evaluating face analysis systems under dynamic conditions
Document type (dc.type): Journal article
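
Illustrative note: the following is a minimal Python sketch, not the authors' tool, of the kind of evaluation loop the abstract describes: a virtual agent performs a circular scan around one subject, a view is composed for each pose, a face detector is applied to each view, and the detection rate is tallied against the ground-truth poses. All names in the sketch (Observation, circular_scan, render_view, stub_face_detector) are hypothetical placeholders.

# Minimal sketch (hypothetical, not the authors' implementation): a simulated
# circular scan in which a virtual agent observes one subject's face from
# different yaw angles, and the detection rate of a stub face detector is
# measured against the known (ground-truth) pose of each observation.

from dataclasses import dataclass
import math
import random

@dataclass
class Observation:
    subject_id: int
    yaw_deg: float        # horizontal rotation of the face relative to the camera
    distance_m: float     # agent-to-subject distance
    illumination: str     # "indoor" or "outdoor"

def circular_scan(subject_id: int, radius_m: float, steps: int, illumination: str):
    """Yield observations along a circular trajectory around one subject."""
    for k in range(steps):
        yaw = -90.0 + 180.0 * k / (steps - 1)   # sweep from -90 deg (profile) to +90 deg
        yield Observation(subject_id, yaw, radius_m, illumination)

def render_view(obs: Observation):
    """Placeholder for the view-dependent image composition (real faces + background)."""
    return {"yaw": obs.yaw_deg, "distance": obs.distance_m, "light": obs.illumination}

def stub_face_detector(view) -> bool:
    """Toy detector: succeeds more often for near-frontal, close-range views."""
    p = max(0.0, 1.0 - abs(view["yaw"]) / 90.0) * math.exp(-0.3 * (view["distance"] - 1.0))
    return random.random() < p

if __name__ == "__main__":
    random.seed(0)
    hits, total = 0, 0
    for obs in circular_scan(subject_id=1, radius_m=1.5, steps=37, illumination="indoor"):
        view = render_view(obs)                 # the image the agent would observe
        hits += stub_face_detector(view)        # face analysis step under evaluation
        total += 1
    print(f"Detection rate over the scan: {hits / total:.2f}")

In the same spirit, the stub detector could be replaced by a real face detection or recognition method, and the scan parameters (radius, angular range, illumination) varied to reproduce the pose- and distance-dependent evaluations mentioned in the abstract.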

